Cloud Solutions Architect - Data Orchestration
We are seeking a skilled professional to serve as a trusted advisor, delivering innovative technical solutions centered on data orchestration.
As a Cloud Solutions Architect, you will design, develop, and optimize data ingestion and extraction pipelines orchestrated by Apache Airflow. Your work will enable the rapid delivery of pipelines and orchestrations critical to our data ecosystem, supporting diverse use cases across the organization.
Key Responsibilities:
- Collaborate with a small team of technologists to design, develop, and optimize data ingestion and extraction pipelines orchestrated by Apache Airflow.
- Identify new use cases and oversee onboarding of new domain teams into Airflow-based workflows.
- Review, tune, and enhance existing data pipelines for performance and scalability.
- Develop reusable frameworks and tools for managing a large number of pipelines across the organization.
- Create and maintain detailed architecture, data flow diagrams, and operational documentation covering physical and logical layers.
- Provide reference implementations, including composable data pipelines, integrations with third-party solutions, and new Airflow features.
- Stay current with the latest Astro and Apache Airflow features and recommend impactful enhancements.
- Build automation assets, best practices, and technical documentation to foster team efficiency.
- Collaborate with Domain and Engineering teams to gather product feedback and translate requirements into scalable solutions.
- Work closely with technology teams to maximize value from Airflow and Astronomer platform adoption.
Requirements:
- Proven experience with Apache Airflow in production environments, including designing, deploying, and maintaining workflows.
- Experience with the Astronomer platform.
- Skilled in developing ETL, data warehousing, and ML/AI use cases within cloud environments.
- Strong proficiency in Python programming.
- Solid understanding of Azure cloud-native data architecture and tools (e.g., Data Factory, Synapse).
- Demonstrated technical leadership in complex projects.
- Excellent communication skills, both oral and written.
- Eagerness to learn new technologies and develop reference implementations.
- Experience migrating workflows from legacy schedulers (e.g., Tidal) to Apache Airflow.
- Familiarity with integrating Airflow with Azure Data Factory and other data services.
- Hands-on experience with Snowflake and Databricks platforms.
- Knowledge of containerized environments and experience with Kubernetes, whether on-premises or in the cloud.
- SQL expertise and experience working within regulated or enterprise data environments.
What We Offer:
- A highly competitive compensation package.
- A multinational organization with opportunities to work abroad.
- Laptop and equipment.
- 12 days of paid annual leave, sick leave, and national holidays.
- Maternity and paternity leave plans.
- A comprehensive insurance plan.
- Retirement savings plans.
- A higher education certification policy.
- Extensive training opportunities.
- On-demand Udemy courses for all employees.
- Cutting-edge projects at leading financial institutions.
- A flat and approachable organization.
- A truly diverse and global work culture.
- A savings fund plan.