Responsibilities:
Designing, developing, and maintaining data services that process large volumes of raw, structured, and unstructured (spatial) data via ETL pipelines.
Structuring, scaling, and further automating existing ETL processes.
Setting up new ETL processes within our MS Azure cloud platform.
Performing quality checks through pair programming/code reviews.
Contributing to the development of the discipline within the development team, for example by giving colleagues substantive guidance and by mapping out the risks and opportunities of proposed solutions.
Requirements & Skills:
Experience in designing ETL pipelines on Azure with Databricks, Azure Data Factory, and Azure Data Lake Storage.
Experience working with PostgreSQL/PostGIS.
Experience in Python and SQL.
Knowledge of conceptual data modeling (ERD and UML).
Good command of the English language, both spoken and written.
You do not hesitate to experiment and test hypotheses, even when the outcome is uncertain.
You are able to explain complex issues within your field to a broad audience.
You proactively share your ideas and enjoy it when your colleagues do the same.
You are energized by working in a team and can articulate the concrete added value of teamwork.
A completed HBO/WO degree (Dutch higher professional or university education) in computer science, informatics, geoinformation, geomatics, or a comparable field.
At least 5 years of relevant work experience in a similar role.