Essential Experience / Criteria:
1. Bachelor’s degree in Analytics, Statistics, Computing, Engineering, or a related field.
2. 1-3 years of experience or multiple internships in Analytics, Business Intelligence, or Data Science roles, preferably in an Internet or 'Direct to Consumer' company with large, complex, high-velocity data.
3. Good foundation in ETL (Extract, Transform, Load) processes.
4. Proficiency in querying and manipulating data with SQL, and in data visualization using tools such as PowerBI.
5. Good foundation in a programming language, preferably Python.
6. Results-oriented and detail-focused, with strong problem-solving skills and the ability to think creatively and quickly.
7. Effective communicator, capable of conveying technical information to both technical and business audiences.
Desirable Experience / Criteria:
1. Experience using Airflow is a strong plus.
2. Experience using Big Data technologies (Spark, PySpark, Databricks, BigQuery, Redshift) is a strong plus.
3. Experience applying LLMs (e.g., ChatGPT) and related tools to solve diverse problems and improve efficiency, particularly in data migration projects, is a strong plus.