Job Title : Data Pipeline Engineer (Python / Airflow)
Location : Guadalajara, MX
Job Type : Full-time
Client : Cognizant
Job Description :
We are seeking an experienced Senior Data Pipeline Engineer to design, build, and operate secure, reliable, and cost-efficient data pipelines supporting our Android-connected and infotainment experiences. You will lead Python-based ETL / ELT development, Airflow orchestration, and data platform operations across cloud environments, working closely with Android, backend, and product teams.
Responsibilities
Build and maintain Airflow 2.x DAGs (TaskFlow, dynamic DAGs, deferrable operators, providers).
Develop robust Python ETL / ELT ingesting from APIs, storage, message buses, and databases.
Operate Airflow on Azure / Kubernetes; support blue / green and canary DAG releases.
Implement data quality testing, monitoring, observability, and lineage.
Design scalable batch and streaming pipelines with strong schema management.
Manage SQL and blob data stores (partitioning, clustering, retention).
Enforce security best practices, IAM, RBAC, secrets management, and data contracts.
Build CI / CD pipelines and IaC (Terraform, Docker, Helm).
Optimize cost / performance and document runbooks and decisions.
Qualifications
~ 8+ years of data or backend engineering experience with strong Python skills.
~ 2+ years of Airflow 2.x expertise.
~ Proven track record building reliable ETL / ELT pipelines (batch + streaming).
~ Strong SQL and experience with major warehouses (BigQuery, Redshift, Snowflake).
~ Familiarity with IAM, OAuth / OIDC, secrets management, and monitoring.
~ Excellent communication and ability to work autonomously.
Nice to Have
~ Terraform, Kubernetes, Docker, Spark / Beam, Kafka / Event Hubs, dbt, Delta Lake, feature stores, automotive / IoT experience.