Principal Big Data Engineer, Mastercard

Company: Mastercard
Job title: Principal Big Data Engineer
Job location: Fallon, United States of America
Type: Full Time

Responsibilities:

  • Drive the development of products and platforms with a focus on Machine Learning and Data Engineering.
  • Design secure, reliable, and scalable solutions that incorporate both functional and non-functional requirements from product teams, security teams, and technology stakeholders. Present these solutions for review within Loyalty Products and to the wider Mastercard Architecture & Technology Architecture Review Team.
  • Define the technical strategy for Loyalty Data and Predictive Analytics, contributing to long-term goals, roadmaps (12-18 months), and detailed planning at quarterly and sprint levels.
  • Lead modernization efforts and data transformation solutions for Loyalty Data products, ensuring compliance with payment standards and regulations such as PCI-DSS, GDPR, and local on-soil data-residency requirements.
  • Collaborate with technical leads, data stewards, DBAs, and database developers to support large-scale, like-for-like data migrations.
  • Stay informed about emerging trends and best practices in big data and cloud platforms, providing technical leadership to enhance the quality of data platforms continuously.
  • Work closely with teams like Data Privacy, Information Security, Cryptographic Architecture, and Business Operations to gather requirements and ensure compliance with Mastercard policies, industry standards, and regulations.
  • Translate business requirements into technical specifications involving data streams, integrations, transformations, databases, and data warehouses.
  • Define the framework, standards, and principles for data architecture, covering areas such as modeling, metadata, security, and reference data.
  • Evaluate and recommend the best analytics solutions, considering usability, technical feasibility, timelines, and stakeholder needs.
  • Break down large solutions into smaller, achievable milestones to gather feedback and data from product managers and stakeholders.

Requirements & Skills:

  • Expertise in data platform engineering concepts, architecture design patterns, and industry best practices.
  • Subject matter expertise in Databricks and experience in cloud infrastructure, particularly AWS and/or Azure.
  • Advanced skills in statistical analysis, coding, and data engineering.
  • Experience applying open-source tools, predictive analytics, machine learning, and advanced statistics to analytical tasks.
  • Proficiency in Python, Scala, Spark, Hadoop platforms and tools (Hive, Impala, Airflow, NiFi, Sqoop), and SQL to build Big Data products and platforms.
  • Experience developing and deploying production-level data applications and workflows/pipelines, including machine learning systems in Java, Scala, or Python, with full lifecycle involvement—from data ingestion to feature engineering, modeling, and evaluation.
  • Experience architecting scalable data migration solutions using cloud and data lake technologies, with tools such as Airflow and NiFi for data pipeline orchestration and SQL Server SSIS for ETL.
  • Knowledge of cloud security models, encryption strategies, and network layers for hybrid on-prem and cloud environments.
  • Curiosity, creativity, and enthusiasm for technology and innovation.
  • Strong quantitative and problem-solving skills.
  • Ability to multitask while maintaining attention to detail.
  • Self-motivation, flexibility, and independence in managing your workload.
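Purely as an illustration of the pipeline work these bullets describe — ingesting raw records, validating them, and normalizing them for downstream use — a minimal Python sketch might look like the following. It is not part of the posting, and every name in it (`Transaction`, `clean_record`, `run_pipeline`) is hypothetical, not a Mastercard API.

```python
# Illustrative sketch of a tiny ingest -> validate -> normalize step,
# the kind of transformation a production pipeline would perform at scale
# with Spark or Airflow. All names here are hypothetical.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Transaction:
    account_id: str
    amount_cents: int
    currency: str

def clean_record(raw: dict) -> Optional[Transaction]:
    """Validate and normalize one raw record; return None for malformed rows."""
    try:
        return Transaction(
            account_id=str(raw["account_id"]).strip(),
            amount_cents=int(raw["amount_cents"]),
            currency=str(raw["currency"]).upper(),
        )
    except (KeyError, ValueError, TypeError):
        return None

def run_pipeline(raw_records: list) -> list:
    """Ingest raw dicts and keep only valid, normalized transactions."""
    cleaned = []
    for raw in raw_records:
        record = clean_record(raw)
        if record is not None:
            cleaned.append(record)
    return cleaned

raw = [
    {"account_id": " a1 ", "amount_cents": "250", "currency": "usd"},
    {"account_id": "a2"},  # malformed: missing fields, dropped
]
cleaned = run_pipeline(raw)
```

In a real deployment, the same validate-and-normalize logic would typically run inside a Spark job or an Airflow task rather than a plain loop; the sketch only shows the shape of the transformation.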
