Design and implement optimal and scalable data pipeline architecture.
Define and deploy data engineering best practices & standards.
Provide guidance to assemble large, complex data sets that meet functional / non-functional business requirements.
Provide guidance to build complex infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources and ‘big data’ technologies.
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability.
Technical leadership: lead a team of 5-7 data engineers who build analytics tools that use the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
Keep data separated and secure across national boundaries through multiple data centers.
Work with stakeholders including the Executive, CX, Design, and Dev teams to assist with data-related technical issues and support their data infrastructure needs.
Proactively propose new ideas and improvements to Data & AI products and delivery processes.
Requirements & Skills:
Willingness to participate in all levels of project work when necessary.
Excellent English and French written and verbal communication skills.
Join the engine that is changing CAE, pointing toward the next horizon of growth through digital innovation to support our customers in their success.
Bachelor’s degree in Computer Science, Engineering, or related field. A Master’s is a plus.
A minimum of 8 years of industry experience working with data, including coding and scripting (Python/Java/Scala/SQL/JS/Bash), design, and testing.
A minimum of 4 years of experience leading a development team of data engineers.
Solid knowledge of CS fundamentals in algorithms and data structures.
Experience supporting and working with cross-functional teams in a dynamic environment.
Experience with big data tools: Hadoop, Spark, Kafka.
Experience with relational SQL and NoSQL databases, including SQL Server and Cosmos DB.
Experience with automated data pipeline and workflow management tools: Azure DevOps, ARM templates, Azure Data Factory, Airflow.
Experience with Microsoft cloud services: Azure, Azure Databricks.
Experience with stream-processing systems: Storm, Spark Streaming.