Design, develop, and maintain scalable and reliable ETL (Extract, Transform, Load) pipelines to integrate structured and unstructured data from various sources
Automate data workflows and ensure efficient data movement across platforms
Collaboration and Team Management
Work closely with data analysts, data scientists, and business stakeholders to understand data needs and provide tailored data solutions
Translate business requirements into technical specifications for data pipelines, storage, and processing solutions
Data Governance and Compliance
Maintain clear and comprehensive documentation for data pipelines, data warehouse structures, and data governance practices
Ensure compliance with data governance policies and standards for data security and privacy, working closely with security and governance teams
Requirements & Skills:
Bachelor’s or Master’s degree in Computer Science, Data Engineering, Information Technology, or a related field
2-3+ years of data engineering experience, including strong hands-on work with big data warehouse environments
Team player with strong collaboration skills, able to work effectively with both internal and external teams