Role Summary
Own the design and operation of reliable, secure, and cost‑efficient data pipelines built
with Apache Airflow (2.x) and Python. You’ll deliver batch and streaming ingestion,
transformations, and curated datasets that power connected and infotainment
experiences. You’ll lead Python-based ETL / ELT, DAG orchestration, and data platform
reliability, security, and observability across our cloud environments.
We are an Android app development team looking for someone to own and lead our cloud
data engineering in close partnership with mobile, backend, and product teams. This
ownership includes taking abstract requirements, refining and defining them, breaking them
into development tasks, and implementing those tasks.
Responsibilities
Design, build, and operate Airflow DAGs that use deferrable operators, providers, and the
Secrets backend; manage cross‑DAG dependencies and SLAs (see the sketch after this list).
Develop Python integrations with external services and databases; package code as reusable libraries.
Own CI / CD for pipeline code (tests, builds, and deployments); implement blue / green or canary DAG releases.
Enforce automated validation in CI.
Manage schema evolution and backward compatibility.
Optimize warehouse tables with partitioning, clustering, and retention.
Run post‑incident reviews and drive reliability improvements.
Define standards for secure data handling and data contracts; enforce RBAC in Airflow and warehouses.
Provision infrastructure with Terraform / Helm; containerize workloads with Docker.
Right‑size compute footprints to optimize cost / performance.
Partner with stakeholder teams on data contracts; document decisions and operational runbooks.
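For a flavor of the Airflow work described above, here is a minimal sketch of a TaskFlow‑style DAG with a deferrable cross‑DAG dependency and a task‑level SLA. It assumes Airflow 2.7+ (where ExternalTaskSensor supports deferrable mode); the "raw_ingest" upstream DAG and the transform step are hypothetical placeholders, not part of this posting.

from datetime import datetime, timedelta
from airflow.decorators import dag, task
from airflow.sensors.external_task import ExternalTaskSensor

@dag(
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
)
def curated_daily():
    # Cross-DAG dependency: wait for the upstream ingestion DAG to finish.
    # deferrable=True frees the worker slot while the sensor waits.
    wait_for_ingest = ExternalTaskSensor(
        task_id="wait_for_raw_ingest",
        external_dag_id="raw_ingest",  # hypothetical upstream DAG
        deferrable=True,
        timeout=60 * 60,
    )

    @task(sla=timedelta(hours=2))  # flag the run if this task exceeds its SLA
    def transform() -> str:
        # Placeholder for the real transformation / warehouse load.
        return "ok"

    wait_for_ingest >> transform()

curated_daily()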
Skills and Qualifications
Strong Python and Apache Airflow 2.x skills (including DAG design, scheduling, and tuning).
Solid software engineering practices, with robust testing and monitoring (a CI‑level DAG check is sketched after this list).
Experience with cloud data warehouses (BigQuery, Redshift, Snowflake) and relational systems (PostgreSQL / MySQL).
Familiarity with cloud service integrations, API gateways, and secrets management (Vault / AWS Secrets
Manager / GCP Secret Manager).
Strong SQL, including performance tuning.
Able to work independently: autonomous execution with well‑documented decisions.
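To illustrate the kind of testing mentioned above, here is a minimal sketch of DAG validation in CI using pytest and Airflow's DagBag. The "dags/" path and the owner / retry policy are hypothetical examples, not requirements of this role.

from airflow.models import DagBag

def test_dags_import_cleanly():
    # Any import error (syntax error, missing provider, bad default_args) fails CI.
    dag_bag = DagBag(dag_folder="dags/", include_examples=False)
    assert dag_bag.import_errors == {}

def test_tasks_declare_owner_and_retries():
    # Example policy check: every task names an owner and retries at least once.
    dag_bag = DagBag(dag_folder="dags/", include_examples=False)
    for dag_id, dag in dag_bag.dags.items():
        for task in dag.tasks:
            assert task.owner, f"{dag_id}.{task.task_id} has no owner"
            assert task.retries >= 1, f"{dag_id}.{task.task_id} has no retries"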
Nice to Have
Containerization with Docker and orchestration on Kubernetes is a plus.
Familiarity with CDC (change data capture) patterns (see the sketch below).
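As a rough illustration of what is meant by CDC patterns, here is a dependency‑free sketch that applies an ordered batch of change events (insert / update / delete, keyed by primary key) to an in‑memory target; a production pipeline would merge the same events into a warehouse table. All names are hypothetical.

from typing import Any, Dict, Iterable

def apply_cdc_batch(target: Dict[Any, dict], events: Iterable[dict]) -> Dict[Any, dict]:
    # Apply ordered CDC events to `target`, a {primary_key: row} mapping.
    for event in events:
        key = event["pk"]
        if event["op"] in ("insert", "update"):
            target[key] = event["row"]  # upsert the latest row image
        elif event["op"] == "delete":
            target.pop(key, None)       # deletes are idempotent
    return target

# Two upserts followed by a delete leave a single row behind.
state = apply_cdc_batch({}, [
    {"op": "insert", "pk": 1, "row": {"id": 1, "name": "a"}},
    {"op": "insert", "pk": 2, "row": {"id": 2, "name": "b"}},
    {"op": "delete", "pk": 2, "row": None},
])
assert state == {1: {"id": 1, "name": "a"}}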
Logistics
Comfortable working in a target‑oriented environment.
A pragmatic approach to problem solving.
Senior Data Engineer • Guadalajara Metropolitan Area, Mexico