Design, build, and maintain robust, scalable data pipelines using Google Cloud Platform (GCP) services such as Google Cloud Storage, BigQuery, Dataflow, Dataform, and Pub/Sub.
Develop and implement data models to support analytics and reporting needs using BigQuery. Automate data workflows and orchestration.
Ensure data quality and consistency by implementing data governance and monitoring best practices. Optimise data processes for performance and cost-efficiency.
Collaborate with a team of analysts to share your technical expertise and help them adopt new tools and standards that streamline data consumption.
Collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs.
Work with analysts and software engineers to iterate on our telemetry design.
Requirements & Skills:
5+ years of experience in data engineering and ETL processes within the gaming and/or entertainment industry.
Significant expertise in Google Cloud Platform services.
Proven experience with SQL and Python for data processing and transformation.
Proven experience in data modelling, schema design, and data warehousing, including preparing datasets for consumption by BI/reporting tools.
Proficiency in implementing CI/CD pipelines for data workflows and job orchestration. Working experience with GitLab.
Knowledge of software engineering best practices, including version control, code reviews, and automated testing.
Excellent problem-solving skills and attention to detail.
Strong communication skills with the ability to collaborate effectively with technical and non-technical stakeholders.