Data Engineering Architect, Mastercard

Company: Mastercard
Job title: Data Engineering Architect
Job location: Prague, Czechia
Type: Full Time

Responsibilities:

  • Perform proofs of concept to assess and identify technologies suited to the needs of the organization. Provide recommendations to senior management on technology selection and the strategic technical direction of the department. Lead the creation of roadmaps that align with the department strategy and long-term business vision and objectives. Coach and mentor team members, and enable knowledge sharing via brown-bag sessions on new and upcoming technologies, methodologies, and application-appropriate frameworks.
  • Lead the implementation of processes and supporting tools per the evolving needs of the organization. Identify process gaps, provide estimates for business solutions, define each project's initial scope and requirements, and manage projects through their lifecycle. Plan project scope and define the scope of each phase and iteration to meet business and time-to-market needs.
  • Instruct and guide the team to ensure that appropriate processes and documentation are used, and ensure compliance with the defined standards. Actively look for opportunities to modify and enhance standards per the department's needs. Develop documentation templates, and lead and enforce the development of artifacts throughout the solution development lifecycle.
  • Ensure compliance with audit requirements by proactively educating the team on compliance requirements and integrating them into the SDLC. Represent the team during compliance audits.
  • Be hands-on, working closely with and guiding a team of Data Managers, Leads, and Front-Line Engineers, as well as cross-functional teams, in the organization's data maintenance, integration, enhancement, load, and transformation processes.
  • Own and drive the evaluation of data management technologies and lead the implementation with a high bar of performance, scalability, and availability standards.
  • Define and design the data warehouse, data lakes, analytic data stores, and AI/ML sandboxes, and establish the data integration strategy.
  • Define data architecture standards, principles, and guidelines.
  • Establish data governance policies and procedures and implement data quality assurance processes.
  • Design and optimize systems for scalability, supportability, speed, efficiency, cost and resource usage, and reliability.
  • Ensure data security and privacy compliance with relevant regulations and standards to protect sensitive data from breaches and unauthorized access.
  • Provide technical guidance and mentorship to junior data engineers and developers, leadership, and other cross-functional teams.

Requirements & Skills:

  • Bachelor’s degree in Information Systems, Information Technology, Computer Science, or Engineering, or equivalent experience.
  • Extensive understanding of and experience with software engineering concepts and methodologies; eight-plus years of experience with software engineering methodologies is highly desirable.
  • Proven track record in selecting and implementing methodologies, with a focus on Lean, and in delivering high-efficiency solutions that lead to productivity gains.
  • Experience in vendor relationship management, vendor selection, RFI/RFP process, and SOW is highly desired.
  • Proven capacity to lead and influence matrix-based project team members and work as a member of a diverse and geographically distributed team.
  • Experience in an operations or SaaS (software-as-a-service) environment is highly desirable.
  • Must be politically savvy, high-energy, detail-oriented, proactive, and able to function under pressure with minimal supervision, with a high degree of initiative and self-motivation to drive results.
  • Demonstrate ability to organize, manage, plan, and control several concurrent initiatives with conflicting needs.
  • Proven ability to sell solutions, persuade cross-functional groups, garner support, and bring a group to consensus; strong ability to collect and formulate data, develop metrics, and build dashboard reporting.
  • Experience and/or certification with AWS Cloud-based services or equivalent.
  • Experience working with enterprise data stores (Athena, Redshift, SQL Server, Cassandra, Hadoop, Hive, etc.) while following industry best practices around data privacy.
  • Expertise in using Python or Scala, Spark, Hadoop platforms and tools (Hive, Impala, Airflow, NiFi, Sqoop), and SQL to build Big Data products and platforms.
  • Experience with Java/.NET, Scala, or Python, and with delivering analytics across all phases: data ingestion, feature engineering, modelling, tuning, evaluation, monitoring, and presentation.
  • Experience in data anonymization, data product development, analytical models, and AI (artificial intelligence) governance.
  • Experience with SQL, multi-threading, message queuing, and distributed systems (Kafka, RabbitMQ, Kinesis, SQS, SNS).
  • Flexible to work with global offices across several time zones.
  • Outstanding problem-solving skills and the ability to navigate complex data challenges.
  • Strong data engineering experience in an agile production environment.
