Overview
Join us in the Procurement Execution Center (PEC) as a Data Engineer, part of a diverse team of data and procurement professionals. In this role, you will be responsible for deploying and supporting the E2E management of our data, including ETL / ELT, DW / DL, data staging, and data governance, and for managing the different layers of data required to ensure successful BI & Reporting for the PEC. This role will work with multiple types of data spanning multiple functional areas of expertise, including Fleet, MRO & Energy, Travel, and Professional Services, among others.
Responsibilities
- Serve as the main technical resource for any data-related requirements
- Demonstrate an ability to communicate technical knowledge through project management and contributions to product strategy
- Deploy data ingestion processes through Azure Data Factory to load data models as required into Azure Synapse
- Design and build robust, modular and scalable ETL / ELT pipelines with Azure Data Factory, Python and / or dbt
- Assemble large, complex, robust and modular data sets that meet functional / nonfunctional business requirements
- Build the infrastructure required for optimal ETL / ELT of data from a wide variety of data sources using Data Lakehouse technologies and ADF
- Develop data models that enable DataViz, Reporting and Advanced Data Analytics, striving for optimal performance across all data models
- Maintain conceptual, logical, and physical data models along with corresponding metadata
- Manage the DevOps pipeline deployment model, including automated testing procedures
- Deploy data stewardship and data governance across our data warehouse to cleanse and enhance our data, using knowledge bases and business rules
- Ensure compliance with system architecture, methods, standards and practices, and participate in their creation
- Clearly articulate and effectively influence both business and technical teams
- Perform the necessary data ingestion, cleansing, transformation, and coding of business rules to support annual Procurement bidding activities
- Support the deployment of a global data standard
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader
- Support Rate Repository management as required (including Rate Card uploads to our DW)
- Other Procurement duties as assigned
Qualifications
- Bachelor’s degree in a related field (Engineering, Computer Science, Data Science or similar)
- 4+ years of relevant professional experience in BI Engineering, data modeling, data engineering, software engineering or other relevant roles
- Strong SQL knowledge and experience working with relational databases
- Knowledge of DW / DL concepts, data marts, data modeling, ETL / ELT, data quality / stewardship, distributed systems and metadata management
- Experience building and optimizing data pipelines, architectures, and data sets
- Azure Data Engineering certification (DP-203) preferred
- ETL / ELT development experience (4+ years); ADF, dbt and Snowflake preferred
- Ability to resolve ETL / ELT problems by proposing and implementing tactical / strategic solutions
- Strong project management and organizational skills
- Experience with object-oriented scripting languages: Python, Scala, R, etc.
- Experience with NoSQL databases is a plus, to support the transition from On-Prem to Cloud
- Excellent problem solving, critical thinking, and communication skills
- Relevant experience with Azure DevOps (CI / CD, git / repo management) is a plus
- Due to the global nature of the role, proficiency in the English language is a must