Implement data engineering concepts on Google Cloud Platform (GCP) using the Data Mesh architecture framework
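As one illustration of the data mesh principle of domain-owned data products on GCP, here is a minimal sketch that registers a domain dataset in BigQuery with labels describing the product and its owning team; the project ID, dataset name, and label values are all hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project ID

# A data mesh treats each domain's output as a discoverable data product.
# Labels make ownership and product metadata searchable in the catalogue.
dataset = bigquery.Dataset("example-project.sales_orders_product")
dataset.location = "US"
dataset.description = "Sales domain data product: curated order facts."
dataset.labels = {
    "domain": "sales",           # owning domain in the mesh
    "data_product": "orders",    # product name
    "owner_team": "sales-data",  # accountable team
}
client.create_dataset(dataset, exists_ok=True)
```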
Design and implement the Data Warehouse using ETL/ELT frameworks
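A minimal ELT sketch using the BigQuery Python client: load raw files from Cloud Storage into a staging table, then transform them with SQL into the curated warehouse table. The bucket, dataset, and table names are placeholders.

```python
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project

# Extract/Load: ingest raw CSV files from Cloud Storage into staging.
load_job = client.load_table_from_uri(
    "gs://example-bucket/raw/orders/*.csv",  # placeholder source path
    "example-project.staging.orders_raw",
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    ),
)
load_job.result()  # block until the load completes

# Transform: reshape staged rows into the curated warehouse table with SQL.
transform_sql = """
CREATE OR REPLACE TABLE `example-project.warehouse.orders` AS
SELECT order_id, customer_id, CAST(order_ts AS TIMESTAMP) AS order_ts, amount
FROM `example-project.staging.orders_raw`
WHERE order_id IS NOT NULL
"""
client.query(transform_sql).result()
```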
Build and refine job automation and orchestration for the pipeline, covering exception handling, job reruns, fault tolerance, retrospective (backfill) runs, logging, alerts, notifications, etc.
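One common way to get retry, alerting, and rerun semantics on GCP is an orchestrator such as Cloud Composer (managed Apache Airflow). Below is a minimal DAG sketch with retry settings and a failure-notification hook; the callback body, task logic, and DAG name are placeholders.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def notify_on_failure(context):
    # Placeholder: push to Slack, PagerDuty, or Cloud Monitoring here.
    print(f"Task {context['task_instance'].task_id} failed")


def run_pipeline_step():
    # Placeholder for the actual extract/transform work.
    print("running pipeline step")


default_args = {
    "retries": 3,                              # automatic reruns on failure
    "retry_delay": timedelta(minutes=5),       # back off between attempts
    "on_failure_callback": notify_on_failure,  # alerting hook
}

with DAG(
    dag_id="orders_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args=default_args,
) as dag:
    PythonOperator(task_id="transform_orders", python_callable=run_pipeline_step)
```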
Develop processes for handling batch and streaming data
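For the streaming half, a minimal Apache Beam sketch that reads JSON events from a Pub/Sub subscription and writes them to BigQuery; the same SDK handles batch by swapping the source and dropping the streaming flag. Subscription and table names are placeholders.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)  # batch jobs omit this flag

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadEvents" >> beam.io.ReadFromPubSub(
            subscription="projects/example-project/subscriptions/orders-sub"
        )
        | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "WriteToBQ" >> beam.io.WriteToBigQuery(
            "example-project:warehouse.order_events",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
        )
    )
```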
Build a compatible target data state, including schema design, a data catalogue, and data modelling suitable for consumption via APIs, client applications, and analytics
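A minimal sketch of an explicit target schema in BigQuery, with time partitioning for efficient consumption and a label for the catalogue; the field names and label values are illustrative.

```python
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project

# Explicit schema: typed, documented fields rather than autodetected ones.
schema = [
    bigquery.SchemaField("order_id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("customer_id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("order_ts", "TIMESTAMP", mode="REQUIRED"),
    bigquery.SchemaField("amount", "NUMERIC", description="Order total in USD"),
]

table = bigquery.Table("example-project.warehouse.orders", schema=schema)
# Partition on event time so consumers can prune scans by date.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY, field="order_ts"
)
table.labels = {"domain": "sales"}  # catalogue metadata
client.create_table(table, exists_ok=True)
```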
Provide data usage patterns for analytics, API, and other forms of consumption from the target data store
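A sketch of one such consumption pattern: a parameterized query helper that analytics code or an API layer could call against the target store. The function name, column names, and region filter are illustrative.

```python
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project


def daily_revenue(region: str):
    """Return daily revenue rows for one region from the target store."""
    job_config = bigquery.QueryJobConfig(
        query_parameters=[
            # Parameterized queries keep callers safe from SQL injection.
            bigquery.ScalarQueryParameter("region", "STRING", region)
        ]
    )
    sql = """
        SELECT DATE(order_ts) AS day, SUM(amount) AS revenue
        FROM `example-project.warehouse.orders`
        WHERE region = @region
        GROUP BY day
        ORDER BY day
    """
    return list(client.query(sql, job_config=job_config).result())
```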
Build target-state data publishing, visualization, and integration
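Publishing can be as simple as exposing curated views that BI tools such as Looker Studio connect to. A minimal sketch creating a reporting view over the warehouse table, with placeholder names:

```python
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project

# A view decouples consumers from the physical table and hides raw columns.
view = bigquery.Table("example-project.reporting.daily_revenue")
view.view_query = """
    SELECT DATE(order_ts) AS day, SUM(amount) AS revenue
    FROM `example-project.warehouse.orders`
    GROUP BY day
"""
client.create_table(view, exists_ok=True)
```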
Build or use an AI/machine learning prototype to implement and support existing algorithms
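One low-friction way to prototype in this stack is BigQuery ML, which trains models with SQL next to the data; a sketch assuming hypothetical feature and label columns in a placeholder table.

```python
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project

# Train a logistic regression entirely inside the warehouse.
train_sql = """
CREATE OR REPLACE MODEL `example-project.ml.churn_model`
OPTIONS (model_type = 'logistic_reg', input_label_cols = ['churned']) AS
SELECT tenure_days, order_count, total_spend, churned
FROM `example-project.warehouse.customer_features`
"""
client.query(train_sql).result()

# Score rows with the trained model.
predict_sql = """
SELECT customer_id, predicted_churned
FROM ML.PREDICT(
    MODEL `example-project.ml.churn_model`,
    TABLE `example-project.warehouse.customer_features`)
"""
for row in client.query(predict_sql).result():
    print(row.customer_id, row.predicted_churned)
```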
Implement Identity and Access Management (IAM) roles across the data assets, and define and execute delegate roles as needed
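A minimal sketch of dataset-level IAM with the BigQuery client: granting a reader role to a delegate analyst group rather than to individuals. The group address and dataset name are placeholders.

```python
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project

dataset = client.get_dataset("example-project.warehouse")
entries = list(dataset.access_entries)
# Grant read-only access to a delegate group rather than to individuals.
entries.append(
    bigquery.AccessEntry(
        role="READER",
        entity_type="groupByEmail",
        entity_id="analysts@example.com",  # placeholder group
    )
)
dataset.access_entries = entries
client.update_dataset(dataset, ["access_entries"])
```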
Implement IAM and Data Loss Prevention (DLP) dataset-level access controls in the target data store
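On the DLP side, a minimal Cloud DLP inspection sketch that scans sample text for PII before data lands in the target store; the project ID, sample text, and chosen info types are illustrative.

```python
from google.cloud import dlp_v2

dlp = dlp_v2.DlpServiceClient()
parent = "projects/example-project"  # hypothetical project

response = dlp.inspect_content(
    request={
        "parent": parent,
        "inspect_config": {
            # Detectors to run; extend with SSNs, credit cards, etc.
            "info_types": [{"name": "EMAIL_ADDRESS"}, {"name": "PHONE_NUMBER"}],
            "min_likelihood": dlp_v2.Likelihood.POSSIBLE,
        },
        "item": {"value": "Contact jane@example.com or 555-0100."},
    }
)
for finding in response.result.findings:
    print(finding.info_type.name, finding.likelihood)
```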
Build a CI/CD automation pipeline that facilitates automated deployment and testing
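A sketch of the testing half of such a pipeline: a pytest unit test for a hypothetical transform function, which a Cloud Build or similar CI step would run on every commit before deployment.

```python
# test_transforms.py -- executed by the CI pipeline via `pytest`.


def normalize_order(raw: dict) -> dict:
    """Hypothetical transform under test: trims IDs and coerces amounts."""
    return {
        "order_id": raw["order_id"].strip(),
        "amount": float(raw["amount"]),
    }


def test_normalize_order_strips_and_casts():
    raw = {"order_id": "  A-123 ", "amount": "19.99"}
    assert normalize_order(raw) == {"order_id": "A-123", "amount": 19.99}
```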
Deliver comprehensive end-to-end documentation along with code samples
Support data analytics teams in building further analytical data models, analyses, or insights using different tool sets
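For analyst ease of use, query results can land directly in a pandas DataFrame; a minimal sketch, assuming the placeholder warehouse table from the earlier examples.

```python
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project

# to_dataframe() hands analysts a pandas DataFrame for any SQL they write
# (requires the pandas and db-dtypes packages alongside the BigQuery client).
df = client.query(
    "SELECT DATE(order_ts) AS day, SUM(amount) AS revenue "
    "FROM `example-project.warehouse.orders` GROUP BY day"
).to_dataframe()
print(df.head())
```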