Translate business requirements into data models that are easy to understand and usable by different disciplines across the company. Design and build pipelines that deliver data with measurable quality within SLAs
Partner with business domain experts, data analysts, and engineering teams to build foundational data sets that are trusted, well understood, aligned with business strategy, and enable self-service
Be a champion of the overall strategy for data governance, security, privacy, quality, and retention that will satisfy business policies and requirements
Own and document foundational company metrics with clear definitions and data lineage
Identify, document, and promote best practices
Requirements & Skills:
Bachelor’s degree in Computer Science, Engineering, or related field, or equivalent training, fellowship, or work experience
3+ years of experience in data integration, pipelines, and data modeling
Experience designing and deploying code on data platforms in a cloud-based, Agile environment
Fluent English is required
Availability to work 2 days per month in our Santiago office
You will design, implement, and scale data pipelines that transform billions of records into actionable data models that enable data insights.
You will help lead initiatives to formalize data governance and management practices and rationalize our information lifecycle and key company metrics.
You will provide mentorship and hands-on technical support to build trusted and reliable domain-specific datasets and metrics.
You have deep technical skills and are comfortable contributing to a nascent data ecosystem and building a strong data foundation for the company.
You are a self-starter, detail- and quality-oriented, and passionate about having a huge impact at PagerDuty.