Analyze the different source systems, profile data, and understand, document, and fix Data Quality issues
Gather requirements and business process knowledge in order to transform the data to meet the needs of end users
Write complex SQL to extract & format source data for ETL/data pipeline
Create design documents, Source to Target Mapping documents, and any supporting documents needed for deployment/migration
Design, develop, and test ETL/data pipelines
Design & build metadata-based frameworks needed for data pipelines
Write Unit Test cases, execute Unit Testing, and document Unit Test results
Deploy ETL/data pipelines
Use DevOps tools to version code, push/pull code, and deploy across environments
Support the team with troubleshooting and debugging defects, bug fixes, business requests, environment migrations, and other ad-hoc requests
Provide production support, enhancements, and bug fixes
Work with business and technology stakeholders to communicate EDW incidents/problems and manage their expectations
Leverage ITIL concepts to prevent incidents, manage problems, and document knowledge
Maintain and improve existing processes
Ensure that the data pipelines are stable, secure, maintainable, and highly available.
Requirements & Skills:
15+ years with key skillset: ADF, SSIS, Azure, Synapse, SQL. Note: we need someone with good communication skills
15+ years of experience in ETL & Data Warehousing
Should have excellent leadership & communication skills
Should have strong working experience in Data Lakehouse architecture
Should have in-depth knowledge of SSIS ETL Tool and good working knowledge of Power BI
Should have worked on data sources such as SAP and Salesforce
Should have very good knowledge of SSIS (ETL Tool), Azure Cloud, ADF, Azure Synapse Analytics & Azure Event Hubs
Should have built solution automation in any of the above ETL tools
Should have executed at least 2 Azure Cloud Data Warehousing projects
Should have worked on at least 2 projects using Agile/SAFe methodology
Should have demonstrated working knowledge of ITIL V4 concepts such as Incident Management, Problem Management, Change Management & Knowledge Management
Should have working experience with DevOps tools such as GitHub, Jenkins, etc.
Should have working experience with semi-structured data formats such as JSON, Parquet, and/or XML files
Should have written complex SQL queries for data analysis and extraction
Should have an in-depth understanding of Data Warehousing, Data Analysis, Data Profiling, Data Quality & Data Mapping
Should have experience working across global locations and have been part of a team of at least 20 members in a global delivery model
Should have experience working daily with product managers, project managers, business users, application development team members, DBA teams, and Data Governance teams to analyze requirements and to design, develop, and deploy technical solutions