Adhering to Company Policies and Procedures with respect to Security, Quality, and Health & Safety.
Writing application code and tests that conform to standards.
Developing infrastructure automation and scheduling scripts for reliable data processing.
Continually evaluating cutting-edge tools and technologies and contributing to their adoption to improve the design, architecture, and performance of the data platform.
Supporting the production systems that run the deployed data software.
Regularly reviewing colleagues’ work and providing helpful feedback.
Working with stakeholders to fully understand requirements.
Being the subject matter expert for the data platform and its supporting processes, and presenting to others to share knowledge.
Requirements & Skills:
Strong problem-solving skills.
Knowledge of AWS or equivalent cloud technologies.
Knowledge of serverless technologies, frameworks, and best practices.
Experience using AWS CloudFormation or Terraform for infrastructure automation.
Knowledge of Scala or an object-oriented language such as Java or C#.
SQL or Python development experience.
High-quality coding and testing practices.
Willingness to learn new technologies and methodologies.
Knowledge of agile software development practices including continuous integration, automated testing, and working with software engineering requirements and specifications.
Good interpersonal skills, positive attitude, and willingness to help other members of the team.
Experience debugging and dealing with failures on business-critical systems.
Exposure to Apache Spark, Trino, or another big data processing system.
Knowledge of streaming data principles and best practices.
Understanding of database technologies and standards.
Experience working on large and complex datasets.
Exposure to data engineering practices used in machine learning training and inference.
Experience using Git, Jenkins, and other CI/CD tools.