Responsibilities:
Provide technical expertise for MI Data-lake components and implement MLOps frameworks.
Lead and define end-to-end architecture for data pipeline solutions in collaboration with colleagues.
Act as a developer, tester, and deployment specialist for data pipelines.
Work closely with stakeholders to translate business needs into sustainable solutions.
Collect, process, and validate data from various sources to ensure data integrity.
Automate repetitive processes and identify value-creation opportunities for API manufacturing.
Build, deploy, and maintain digital products, focusing on continuous delivery and next-gen architecture.
Requirements & Skills:
Master’s degree in Computer Science, Software Engineering, or a related field.
2–4 years of experience in a similar role, whether in large organizations or as a consultant or intern; experience in regulated industries is an advantage.
Proficiency in Python, R, and SAS, with experience in Go, Java, C++, C#, or Rust.
Experience with Linux, CI/CD tools, Docker (or similar), and software development best practices.
Strong SQL skills and experience with modern data warehouses (preferably Redshift).
Familiarity with network design and management, and with relational databases (Oracle, SQL Server).
Excellent communication skills in English; Danish is a plus.