The Must-Haves
At least 5 years of relevant experience developing scalable, secure, distributed, fault-tolerant, resilient, and mission-critical Big Data platforms.
Proficiency in at least one of the following programming languages: Python, Scala, or Java.
Strong understanding of big data and related technologies such as Flink, Spark, Airflow, and Kafka.
Experience with different database types: NoSQL, columnar, and relational.
You have a hunger for consuming data, exploring new data technologies, and discovering innovative solutions to the company's data needs.
You are organized and insightful, and you communicate your observations well to stakeholders, both in writing and verbally, to share updates and coordinate the development of data pipelines.
The Nice-to-Haves
You have a degree or higher qualification in Computer Science, Electronics or Electrical Engineering, Software Engineering, Information Technology, or another related technical discipline.
You have a good understanding of data structures, algorithms, or machine learning models.