Be a part of Stefanini!
At Stefanini we are more than 30,000 geniuses, connected from more than 40 countries, co-creating a better future.
Apply now as a Big Data Engineer!
Requirements:
- 3 years of Big Data development experience.
- Experience designing, developing, and operating large-scale data systems running at petabyte scale.
- Experience building real-time data pipelines, enabling streaming analytics, supporting distributed big data, and maintaining machine learning infrastructure.
- Able to interact with engineers, product managers, BI developers, and architects, providing scalable and robust technical solutions.
- Intermediate English.
Essential Duties and Responsibilities:
- Design, develop, implement, and tune large-scale distributed systems and pipelines that process large volumes of data, focusing on scalability, low latency, and fault tolerance in every system built.
- Experience with Java and Python to write data pipelines and data processing layers.
- Experience with Airflow and GitHub.
- Experience writing map-reduce jobs.
- Demonstrated expertise in writing complex, highly optimized queries across large data sets.
- Proven, working expertise with Big Data technologies: Hadoop, Hive, Kafka, Presto, Spark, HBase.
- Highly proficient in SQL.
- Experience with cloud technologies (GCP, Azure).
- Experience with the relational model; in-memory data stores desirable (Oracle, Cassandra, Druid).
- Provide and support the implementation and operation of data pipelines and analytical solutions.
- Performance-tuning experience with systems working on large data sets.
- Experience with REST API data services (data consumption).
- Retail experience is a huge plus.
What's in it for you?
- Fully remote
- Training path
- Life insurance
- Punctuality bonus
- Grocery vouchers
- Restaurant vouchers
- Legal benefits + profit sharing (PTU)
- Learning and mentoring platforms
- Discounts at language schools
- Gym discount