Overview

Join us in the Procurement Execution Center (PEC) as a Data Engineer, working as part of a diverse team of data and procurement professionals.
In this role, you will be responsible for deploying and supporting the end-to-end (E2E) management of our data, including ETL/ELT, data warehouse / data lake (DW/DL), data staging, and data governance, and for managing the different data layers required to deliver successful BI & Reporting for the PEC.
This role works with multiple types of data spanning several functional areas of expertise, including Fleet, MRO & Energy, Travel, and Professional Services, among others.

Responsibilities
- Serve as the main technical resource for any data-related requirement
- Demonstrate the ability to communicate technical knowledge through project management and contributions to product strategy
- Deploy data ingestion processes through Azure Data Factory to load data models into Azure Synapse as required (a minimal pipeline-trigger sketch follows this list)
- Build and design robust, modular, and scalable ETL/ELT pipelines with Azure Data Factory, Python, and/or dbt
- Assemble large, complex, robust, and modular data sets that meet functional and non-functional business requirements
- Build the infrastructure required for optimal ETL/ELT of data from a wide variety of data sources using Data Lakehouse technologies and ADF
- Develop data models that enable DataViz, Reporting, and Advanced Data Analytics, striving for optimal performance across all data models
- Maintain conceptual, logical, and physical data models along with their corresponding metadata
- Manage the DevOps pipeline deployment model, including automated testing procedures
- Deploy data stewardship and data governance across our data warehouse to cleanse and enhance our data, using knowledge bases and business rules
- Ensure compliance with system architecture, methods, standards, and practices, and participate in their creation
- Clearly articulate positions and effectively influence both business and technical teams
- Perform the necessary data ingestion, cleansing, transformation, and coding of business rules to support annual Procurement bidding activities
- Support the deployment of a global data standard
- Create data tools for analytics and data science team members that help them build and optimize our product into an innovative industry leader
- Support Rate Repository management as required (including Rate Card uploads to our DW)
- Other Procurement duties as assigned
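For illustration only, the sketch below shows the kind of Azure Data Factory ingestion work described above: triggering an ADF pipeline run from Python and polling it to completion with the azure-identity and azure-mgmt-datafactory packages. The subscription, resource group, factory, pipeline, and parameter names are placeholders and not details of the PEC environment.

```python
# Minimal sketch: trigger an Azure Data Factory pipeline run and poll its status.
# Requires: pip install azure-identity azure-mgmt-datafactory
# All names below are placeholders, not the PEC's actual resources.
import time

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUBSCRIPTION_ID = "<subscription-id>"   # placeholder
RESOURCE_GROUP = "rg-pec-data"          # placeholder
FACTORY_NAME = "adf-pec"                # placeholder
PIPELINE_NAME = "pl_ingest_to_synapse"  # placeholder


def run_pipeline() -> None:
    # DefaultAzureCredential resolves a service principal, managed identity,
    # or Azure CLI login depending on the environment.
    credential = DefaultAzureCredential()
    adf_client = DataFactoryManagementClient(credential, SUBSCRIPTION_ID)

    # Kick off the pipeline; parameters would normally carry the load date, source system, etc.
    run = adf_client.pipelines.create_run(
        RESOURCE_GROUP, FACTORY_NAME, PIPELINE_NAME,
        parameters={"load_date": "2024-01-01"},
    )

    # Poll the run until it reaches a terminal state.
    while True:
        status = adf_client.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run.run_id).status
        if status in ("Succeeded", "Failed", "Cancelled"):
            print(f"Pipeline run {run.run_id} finished with status: {status}")
            break
        time.sleep(30)


if __name__ == "__main__":
    run_pipeline()
```

In practice, a load like this would usually be started by a scheduled or event-based ADF trigger rather than an ad-hoc script; the SDK call is shown only to make the ingest-to-Synapse flow concrete.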
Qualifications
- Bachelor's degree in a related field (Engineering, Computer Science, Data Science, or similar)
- 4+ years of relevant professional experience in BI engineering, data modeling, data engineering, software engineering, or other relevant roles
- Strong SQL knowledge and experience working with relational databases
- Knowledge of DW/DL concepts, data marts, data modeling, ETL/ELT, data quality/stewardship, distributed systems, and metadata management
- Experience building and optimizing data pipelines, architectures, and data sets
- Azure Data Engineering certification (DP-203) preferred
- ETL/ELT development experience (4+ years); ADF, dbt, and Snowflake are preferred
- Ability to resolve ETL/ELT problems by proposing and implementing tactical and strategic solutions
- Strong project management and organizational skills
- Experience with object-oriented scripting languages: Python, Scala, R, etc.
- Experience with NoSQL databases is a plus, to support the transition from on-premises to cloud
- Excellent problem-solving, critical-thinking, and communication skills
- Relevant experience with Azure DevOps (CI/CD, Git/repo management) is a plus
- Due to the global nature of the role, proficiency in English is a must
Data Engineer • Nuevo León, México