Data Architecture Development: Design, develop, and maintain the reporting data architecture within the Snowflake data warehouse using advanced data modeling techniques, ensuring scalability, efficiency, and data integrity.
DBT Implementation: Utilize DBT (Data Build Tool) to build and automate transformation pipelines, streamlining data cleansing and enrichment (an illustrative model sketch follows this list).
Advanced SQL Expertise: Write complex SQL queries and scripts to extract, transform, and analyze data from various sources, optimizing query performance and ensuring adherence to best practices.
Python Data Exploration: Leverage Python programming skills to explore and manipulate data, develop custom analytics solutions, and extract actionable insights from large and complex datasets.
Data Mining and Insights: Collaborate with business stakeholders to understand reporting requirements and use data mining techniques to uncover trends, patterns, and insights that drive strategic decision-making and business growth.
Performance Optimization: Continuously monitor and optimize the performance of data pipelines, queries, and processes to ensure timely and accurate delivery of data to reporting and analytics applications.
Documentation and Knowledge Sharing: Document data architecture, processes, and solutions, and provide training and knowledge-sharing sessions to empower other team members with the skills necessary to leverage data effectively.
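As a purely illustrative sketch of the DBT work described above (the model, table, and column names such as stg_orders, stg_customers, and fct_customer_orders are invented for the example, not taken from an existing project), a DBT transformation model is typically a SQL file that references upstream models and is materialized in Snowflake:

-- models/marts/fct_customer_orders.sql  (hypothetical model name)
{{ config(materialized='table') }}

with orders as (
    -- ref() resolves the upstream model and lets DBT build the pipeline dependency graph
    select * from {{ ref('stg_orders') }}
),

customers as (
    select * from {{ ref('stg_customers') }}
)

select
    customers.customer_id,
    customers.customer_name,
    count(orders.order_id)   as order_count,
    sum(orders.order_total)  as lifetime_value,
    min(orders.ordered_at)   as first_order_at,
    max(orders.ordered_at)   as latest_order_at
from customers
left join orders
    on orders.customer_id = customers.customer_id
group by customers.customer_id, customers.customer_name

In practice a model like this would be paired with schema tests and column descriptions in a DBT .yml file, which also supports the documentation and knowledge-sharing responsibility above.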
Requirements & Skills:
5+ years of experience in data engineering, analytics engineering, or a related role, with a focus on building reporting data architecture.
5+ years of experience in advanced SQL, with the ability to write complex queries, optimize query performance, and ensure data quality and integrity (a representative query sketch follows this list).
2+ years of hands-on experience with DBT (Data Build Tool) for data transformation and pipeline orchestration.
Strong proficiency with the Snowflake data warehouse, including data modeling, performance optimization, and administration.
Proficiency in Python programming for data exploration, manipulation, and analysis.
Experience working with complex and disparate data sources, including structured and unstructured data.
Strong problem-solving skills and the ability to thrive in a fast-paced, dynamic environment.
Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams and stakeholders.
Demonstrated ability to independently uncover insights and relationships across numerous datasets.
Bachelor’s degree in Computer Science, Statistics, or another quantitative field.
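For context on the level of SQL expected, here is a hypothetical example (the analytics.fct_orders table and its columns are invented for the sketch) of the kind of analytical query the role involves: CTEs, aggregation, and window functions run against Snowflake.

-- Top 10 customers by revenue per month (illustrative table and column names)
with monthly as (
    select
        date_trunc('month', ordered_at) as order_month,
        customer_id,
        sum(order_total)                as monthly_revenue
    from analytics.fct_orders
    group by date_trunc('month', ordered_at), customer_id
),

ranked as (
    select
        order_month,
        customer_id,
        monthly_revenue,
        -- window function ranks customers within each month by revenue
        rank() over (partition by order_month order by monthly_revenue desc) as revenue_rank
    from monthly
)

select order_month, customer_id, monthly_revenue, revenue_rank
from ranked
where revenue_rank <= 10
order by order_month, revenue_rank;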