Design, develop, and implement scalable geospatial data architectures for analytics and machine learning using GCP services.
Evaluate and recommend tools and technologies for data ingestion, storage, processing, and visualization.
Data Governance:
Establish and enforce data governance frameworks to ensure data integrity, quality, and security across the geospatial data landscape.
Collaborate with data governance and data management teams to define and implement data policies, standards, and practices.
Knowledge of privacy regulations such as the CCPA and GDPR is a plus.
Cross-Functional Leadership:
Lead and mentor cross-functional teams, including data scientists, ML and data engineers, and data/BI analysts, ensuring alignment with project goals and timelines.
Facilitate communication between technical and non-technical stakeholders to build understanding of and support for initiatives.
Geospatial Expertise:
Provide subject matter expertise in geospatial data, technologies, and methodologies, staying current with industry trends and best practices.
Develop and conduct training sessions and workshops to enhance the team’s geospatial capabilities.
Project Management:
Oversee project lifecycles from conception through deployment, ensuring successful delivery of solutions.
Manage timelines, budgets, and resources effectively to meet project objectives.
Requirements & Skills:
Bachelor’s degree in Geospatial Science, Computer Science, Geography, or a related field. A master’s degree is preferred.
5+ years of experience in geospatial data architecture or related roles, with a strong focus on GCP or AWS platforms.
Proven track record in data governance and modern data management platforms.
Expertise in geospatial data processing tools (e.g., ArcGIS, QGIS, CARTO, Apache Sedona).
Proficiency in cloud platforms and related services (e.g., BigQuery, Redshift, Dataplex, DataZone).
Familiarity with database technologies (SQL, NoSQL) and diverse data formats (Avro, Iceberg, Parquet, Delta Lake), including spatial data formats (raster, vector, GeoJSON, GeoParquet).
Experience with data integration tools, ETL and ELT workflows, data modeling, data lineage and cataloging tools, and batch and streaming pipeline design.