Enterprise Data Design, Qualcomm India

Company: Qualcomm India Private Limited
Job title: Enterprise Data Design (Data Engineer, Staff)
Job location: Hyderabad, Telangana, India
Type: Full Time

Responsibilities:

  • Design and develop data pipelines in AWS, leveraging services such as EC2, IAM, S3, Glue, Athena, and Lambda.
  • Schedule data flows with Apache Airflow and integrate data using Informatica Cloud (IICS).
  • Demonstrate advanced data modeling and architecture skills in Data Lake and/or Data Warehouse environments.
  • Program in Python or equivalent languages, with emphasis on object-oriented programming (OOP), data structures, and algorithms.
  • Work with SQL and/or NoSQL databases, apply CI/CD practices using Jenkins, and develop data APIs.
  • Engage in event/message-based integrations and support Machine Learning projects within a SAFe Agile Scrum methodology.
  • Utilize containerization technologies like Docker or Kubernetes and infrastructure automation tools such as CloudFormation or Terraform.
  • Exhibit a proactive learning attitude in a fast-paced environment, with excellent communication, interpersonal, and problem-solving skills.

Requirements & Skills:

  • Bachelor’s degree in Computer Engineering, Computer Science, Information Systems, or a related field, with 5+ years of IT-related work experience; or 7+ years of IT-related work experience without a Bachelor’s degree.
  • At least 10 years of programming experience, preferably in Python or Java, applying OOP, data structures, and algorithms.
  • At least 5 years of AWS experience, including EC2, IAM, S3, CDK, Glue, Athena, Lambda, Redshift, DynamoDB, Snowflake, and RDS.
  • At least 5 years of Informatica Cloud (IICS) experience.
  • At least 10 years of experience with SQL and/or NoSQL databases.
  • At least 3 years of experience with data structures and algorithms.
  • 5-7 years of expertise in scheduling data flows using Apache Airflow.
  • 10+ years of strong data modeling (Functional, Logical, and Physical) and data architecture experience in Data Lake and/or Data Warehouse environments.
  • At least 3 years of experience with CI/CD and DevSecOps using Jenkins.
  • At least 2 years of experience with the design and development of data APIs (Python, Flask/FastAPI, etc.) to expose data in the platform, and with supporting document and graph stores using polyglot persistence.
  • At least 2 years of experience with event/message-based integrations (inbound and outbound) on the platform.
  • AWS Certified Data Engineer or AWS Certified Solutions Architect designation.
  • Experience or exposure to other cloud platforms such as Azure and GCP.
  • Knowledge of Big Data handling with PySpark.
