Participate in the design and development of Big Data analytical applications
Design, support, and continuously enhance the project code base, continuous-integration pipelines, and related tooling
Write complex ETL processes and frameworks for analytics and data management
Work within a team of industry experts on cutting-edge Big Data technologies to develop solutions for deployment at massive scale
Requirements & Skills:
Proficiency in the design and development of ETL pipelines using big data frameworks such as Spark, Flink, and Beam. Experience with workflow management tools like Airflow is a plus.
Proficiency in Java and in Scala or Python.
Experience with modern data architectures such as data lakes and data lakehouses, and with cloud computing platforms
Experience with version control systems, CI/CD practices, testing, and database and software migration tools
Hands-on experience with Kubernetes.
Understanding of the performance and scalability aspects of distributed data systems.
Ability to persevere, learn, and adapt to new tools and technologies in a fast-paced environment.