Principal Data Engineer, GSK

Company GSK
Job title Principal Data Engineer
Job location Bangalore, Karnataka, India
Type Full Time

Responsibilities:

  • Design, develop, and implement high-quality software solutions.
  • Troubleshoot and address complex technical challenges and improve existing software to fit the organization’s unique needs and ecosystem.
  • Provide leadership and consultation on technical matters to a variety of development functions.
  • Analyze business requirements and translate them into technical requirements, developing proposals that outline how the organization’s products and services can meet these needs and be integrated and implemented within the organization’s technical infrastructure.
  • Design end-to-end technical solutions that are robust, secure, resilient, scalable, and maintainable, aligned with business requirements and architectural best practices and standards, to facilitate seamless data flow and communication between systems.
  • Bring an innovation mindset: look for opportunities to enhance and improve GCO products and tools with new technologies such as AI/ML, LLM/NLP, and automation, and help embed them in products.
  • Lead the definition of scope, delivery approach, resourcing plan, and cost models to deliver new technology (system, data, and services as a product) or amend existing technologies to deliver business needs and objectives.
  • Define technical roadmap and own delivery oversight for all in-scope systems working closely with Technology Product Owners and leadership teams.
  • Identify technical risks, vulnerabilities, and dependencies and develop mitigation strategies to address them. Proactively assess and manage risks throughout the software development lifecycle.
  • Lead the integration of disparate systems, platforms, and data sources to create unified and interoperable solutions.
  • Provide technical leadership, guidance, and mentorship to Technology development teams sourced from GSK’s technology delivery partners including architects, engineers, testers and analysts. Collaborate with cross-functional teams to foster a culture of innovation, continuous improvement, and knowledge sharing.
  • Define and evolve the data architecture strategy, including data modeling, data storage, and data processing frameworks.
  • Optimize the performance of systems and applications through architectural design, performance tuning, and capacity planning. Identify bottlenecks, inefficiencies, and areas for improvement, and implement solutions to enhance system performance and responsiveness.
  • Apply software design principles to complex work in the research, design, and development of new or existing products, tools, and processes required for operation, maintenance, and testing.
  • Liaise with hardware, software, and systems design engineers to ensure that products and services are modified, configured and installed.
  • Maintain familiarity with architectural standards and vision, R&D Technology Roadmap, technical landscape, validation requirements, central service request processes, GSK policies, and other aspects of GSK environment.
  • Experiment with new technologies and support the teams’ continuous experimentation and learning to drive innovation in solving complex problems, overcome limitations of incumbent technology, and assess new ways of working that deliver efficiency, risk mitigation, and optimization.

Requirements & Skills:

  • Bachelor’s degree in Computer Science or Engineering.
  • 12+ years of experience, with a minimum of 5 years in software development of complex solutions in a large-scale organization.
  • Experience creating production-ready technical solutions using innovative technologies.
  • Experience with agile/scrum techniques.
  • Strong experience architecting ‘fit-for-purpose’ and ‘fit-for-use’ complex solutions on the Microsoft Azure cloud stack (IaaS, PaaS, and SaaS).
  • Strong hands-on experience with Azure services and components such as ADF, ADLS Gen2, Azure Event Hubs, Azure Functions, Databricks, Spark and Python, SQL/NoSQL databases, Data Fabric, Denodo, and data warehousing solutions such as Snowflake.
  • Knowledge of and experience with AI/ML tool stacks, frameworks, NLP/LLM, and MLOps.
  • Data science (e.g., AI/ML) and data analytics skills, including Power BI and data quality/integrity.
  • Ability to stitch together various components into a scalable, secure, reliable, and sustainable enterprise solution.
  • Experience with accountability for end-to-end (E2E) data architecture and lifecycle management, including non-functional requirements and operations.
  • Deep knowledge of architectural patterns across code and infrastructure development.
  • Excellent understanding of architecture patterns and best practices with proven innovative and leading-edge thinking to build cloud-ready systems.
  • Strong experience with technical solution design and delivery oversight.
  • Ability to work on complex projects and in a distributed environment.
  • Able to weigh short-term vs. long-term goals and make both tactical and strategic decisions.
  • Great communication skills, including the ability to communicate complex technical concepts to a non-technical audience.
  • Strong organizational skills; the ideal candidate can work in a fast-paced environment and quickly adapt to changing priorities.
  • Stays on top of the latest trends and develops expertise in emerging cloud technologies.
  • Works well as a technical leader and individual contributor.
  • Ability to build processes supporting data transformation, data structures, metadata, dependency, and workload management.
  • Able to assist the project team in planning and execution to achieve release level goals.
  • Proactive and vigilant in detecting, effectively communicating, and managing risks to Product and Project leadership teams.
