Tech Lead:
- Minimum 10+ years of experience in technology project implementation
- Demonstrates up-to-date expertise in Data Engineering and complex data pipeline development
- Experience with agile delivery models
- Design, develop, implement, and tune large-scale distributed systems and pipelines that process large volumes of data, focusing on scalability, low latency, and fault tolerance in every system built
- Experience with Python and Java for writing data pipelines and data processing layers
- Experience building advanced pipelines with Airflow
- Experience with Continuous Integration, DevOps, and GitHub
- Performance-tuning experience with systems that work with large data sets
- Proven, working expertise with Big Data technologies: Hadoop, Hive, Kafka, Presto, Spark
- Highly proficient in SQL
- Experience with cloud technologies:
- GCP – Dataproc, BigQuery, Cloud Functions
- Azure – ADLS, Data Factory
- Experience with relational databases and in-memory data stores (SQL Server, Oracle, Cassandra, Druid) desirable
- Knowledge of implementing advanced analytics models using ML/AI (desirable)
- Knowledge of BI tools (Power BI, Tableau, Looker, etc.) (desirable)
- Provides and supports the implementation and operation of data pipelines and analytical solutions
- Experience with REST API data services (data consumption)
- Experience managing work teams
- Advanced project management skills
- Conversational English (advanced)
- Retail experience is a huge plus.