Job Description

Position Overview
Client's Delivery Insights initiative treats engineering effectiveness as a product.
They empower product teams with actionable insights—spanning DORA metrics, advanced measures of engineering practices and principles, and emerging areas such as GenAI-driven productivity signals.
As a Lead Data Engineer, you will architect, engineer, scale, and evangelize data insights and analytic solutions that illuminate how teams deliver, collaborate, and innovate.
Your work will directly shape how engineers continuously improve delivery speed, quality, and value.

What You'll Do
- Implement critical slices of the Delivery Insights roadmap: build robust pipelines, dashboards, and services that provide real-time (or near-real-time) visibility into engineering practices and outcomes.
- Partner with engineering leaders to operationalize DORA metrics (e.g., deployment frequency, lead time, change failure rate, MTTR) and design next-generation measures (e.g., code quality signals, collaboration patterns, adoption of engineering principles); a short sketch of how such metrics can be derived follows this list.
- Explore and deliver GenAI-driven insights, including AI-assisted developer productivity metrics and meaningful usage signals that help teams adopt AI responsibly.
- Design and maintain data models, pipelines, and APIs that transform raw engineering telemetry (GitHub, CI/CD, incident data, collaboration tools) into trusted insights.
- Embed with product teams through workshops, consultations, and value-stream mapping to surface friction, co-create metrics, and harden reporting practices.
- Partner with Platform Engineering, SRE, Security, and Developer Experience teams to ensure alignment on governance, observability, and automation.
- Drive inner-source contributions and community knowledge sharing; publish dashboards, documentation, and ADRs to scale best practices across engineering.
- Continuously evaluate and pilot emerging tools and data approaches (e.g., OpenTelemetry, engineering analytics platforms, AI-based workflow integrations).
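For illustration, here is a minimal Python sketch of how two of the DORA metrics named above, deployment frequency and mean time to restore (MTTR), might be derived from raw deployment and incident telemetry. The record shapes and field names (Deployment, Incident, deployed_at, resolved_at) are hypothetical assumptions made for this sketch, not the client's actual schema.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from statistics import mean

# Hypothetical, simplified telemetry records; a real pipeline would pull these
# from CI/CD and incident-management sources rather than in-memory lists.

@dataclass
class Deployment:
    service: str
    deployed_at: datetime

@dataclass
class Incident:
    service: str
    opened_at: datetime
    resolved_at: datetime

def deployment_frequency(deployments: list[Deployment], window_days: int = 7) -> float:
    """Average deployments per day over a trailing window."""
    if not deployments:
        return 0.0
    cutoff = max(d.deployed_at for d in deployments) - timedelta(days=window_days)
    recent = [d for d in deployments if d.deployed_at >= cutoff]
    return len(recent) / window_days

def mttr_hours(incidents: list[Incident]) -> float:
    """Mean time to restore, in hours, across resolved incidents."""
    if not incidents:
        return 0.0
    durations = [(i.resolved_at - i.opened_at).total_seconds() / 3600 for i in incidents]
    return mean(durations)

if __name__ == "__main__":
    deploys = [
        Deployment("checkout", datetime(2024, 5, 1, 10)),
        Deployment("checkout", datetime(2024, 5, 3, 15)),
        Deployment("checkout", datetime(2024, 5, 6, 9)),
    ]
    incidents = [
        Incident("checkout", datetime(2024, 5, 3, 16), datetime(2024, 5, 3, 18)),
    ]
    print(f"Deployments/day (7-day window): {deployment_frequency(deploys):.2f}")
    print(f"MTTR (hours): {mttr_hours(incidents):.1f}")
```

In practice, aggregations like these would typically run inside an orchestrated pipeline (for example, an Airflow DAG over dbt models) and be surfaced through the dashboards described above, rather than computed in-process as shown here.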
What You'll Bring
- 8–12 years of professional data engineering or software engineering experience with increasing scope and impact.
- Demonstrated success building data platforms for engineering or operational analytics (preferably in developer productivity, reliability, or delivery domains).
- Hands-on expertise with data pipelines and orchestration (e.g., Airflow, dbt, Spark, Kafka) and modern cloud infrastructure (AWS, Kubernetes, Terraform).
- Strong coding proficiency in Python and SQL; familiarity with Java/TypeScript is desirable.
- Experience working with CI/CD, observability, and collaboration systems (e.g., GitHub, Argo, Datadog, Slack, Jira) as data sources.
- Proven ability to define and measure engineering productivity metrics; adept at distinguishing signal from noise in data.
- Strong data visualization and storytelling skills using tools like Looker, Tableau, Grafana, or custom dashboards.
- Passion for product thinking and customer empathy: you seek feedback early and iterate quickly to make data truly actionable.
- Excellent facilitation, mentoring, and communication skills; you thrive in cross-functional, highly collaborative environments.
- Bachelor's degree in Computer Science, Data Science, or a related field, or equivalent practical experience.

Tech Stack & Tooling You'll Touch
- Data Pipelines & Orchestration: Airflow, dbt, Spark, Kafka
- Cloud & Infrastructure: AWS (EKS, Lambda, S3, Glue), Docker, Kubernetes, Terraform
- Programming Languages: Python, SQL, TypeScript/Java
- Observability & Governance: OpenTelemetry, Datadog, Splunk, SonarQube
- AI-Assisted Dev Tools: GitHub Copilot, Windsurf, LLM-based analytics and workflow integrations
- Visualization & Reporting: Looker, Grafana, Tableau, custom dashboards

Salary / Rate: $20–$25 / hour (depending on experience level).
This is a contract-to-hire position; candidates are expected to work 40 hours per week.
Full-Time Conversion Salary: MXN 75,000–MXN 80,000 per month.
Data Engineer • Monterrey, Nuevo León, México