We are seeking a highly skilled DBT Developer with hands-on experience in Snowflake and Sigma Computing to join our data team.
The ideal candidate will be responsible for designing, building, and maintaining robust data transformation pipelines using DBT, while leveraging Sigma for data visualization and reporting.
You will collaborate with cross-functional teams, stakeholders, and data analysts to analyze requirements and deliver scalable and high-performing data solutions.
This role requires a strong understanding of modern data stack principles, proficiency in SQL, and a passion for transforming raw data into actionable insights.
RESPONSIBILITIES
Develop and maintain scalable data models and transformation pipelines using DBT.
Collaborate with data analysts, engineers, and business stakeholders to understand data requirements.
Create and manage Sigma dashboards and reports to support business intelligence needs.
Ensure data quality, consistency, and reliability across all layers of the data stack.
Optimize SQL queries and DBT models for performance and maintainability.
Implement version control and CI/CD practices for DBT projects.
Monitor and troubleshoot data pipeline issues and ensure timely resolution.
Document data models, transformations, and dashboard logic for cross-functional understanding.
Design, develop, automate, monitor, maintain, and performance-tune ELT/ETL processes to manage high-volume data transfer to and from internal and external systems.
Collaborate cross-functionally with project managers and agile teams to estimate development efforts and ensure complete delivery of solutions and fulfillment of requirements.
Configure and manage monitoring/alerting around replication latency and performance (cluster and query).
Deploy and enable CI/CD pipelines, adopting best practices using tools such as GitHub.
Implement data models and schemas to support business requirements and ensure data consistency and integrity.
Provide technical expertise, troubleshooting, and recommendations to enhance existing systems and processes.
Create and maintain technical documentation, including data flow diagrams, data dictionaries, and process documentation.
Write and optimize SQL queries, stored procedures, and views for data extraction, transformation, and loading.
QUALIFICATIONS
Bachelor's degree in Computer Science, Information Systems, or a related field.
Experience with ELT/ETL tools; dbt is mandatory and Fivetran is highly preferred.
5+ years of experience in data engineering, analytics, or a related field.
Strong proficiency in SQL and experience with DBT (Cloud or Core).
Hands-on experience with Sigma Computing for data visualization and reporting.
Familiarity with cloud data warehouses (e.g., Snowflake, BigQuery, Redshift).
Experience with Git and version control workflows.
Understanding of data modeling concepts (e.g., star schema, normalization).
Excellent problem-solving and communication skills.
Strong customer-facing communication and presentation skills are a must.
Ability to work across multiple teams simultaneously.
Ability to work with teams across geographical regions.
Deep understanding of SQL; ability to write multi-layer statements that handle complex transformation needs.
Hands-on experience developing data pipelines in a cloud environment.
Demonstrated ability to understand and discover business needs and craft data merges, transformations, and aggregations to deliver data that can be used in reports, analytic repositories, and applications.
LATAM Candidates
100% remote
Hired as contractors (freelancers)
Payments through the Deel platform
Salary between $3800 and $4300 USD per month
Developer • Querétaro, México