DBT Cloud Engineer
Tredence Inc.
2 - 5 years
Kolkata
Posted: 20/02/2026
Job Description
Tredence is a leading data science and analytics consulting company known for solving complex business problems with a strong focus on Last-Mile Adoption. We combine deep domain expertise with advanced engineering to deliver scalable and business-impactful data solutions.
Role Overview
We are looking for an experienced DBT Engineer to join our Data Engineering team. The ideal candidate has a strong data engineering background, hands-on experience building scalable data transformation pipelines using DBT (Data Build Tool), and a solid understanding of cloud data platforms. You will work closely with analytics, product, and consulting teams to design and operationalize high-quality data models that power analytics and AI solutions for global clients.
Key Responsibilities
- Design, build, and optimize DBT-based data transformation pipelines following modular, testable, and scalable development patterns.
- Develop and maintain semantic data models, ensuring data quality, governance, and version control.
- Collaborate with cross-functional teams to understand data needs and convert business requirements into technical implementations.
- Integrate DBT workflows with orchestration tools such as Airflow, Astronomer, Dataform, or cloud-native schedulers.
- Implement data quality tests, documentation, and lineage using DBT features.
- Optimize performance of transformations and queries on cloud data warehouses like Snowflake, BigQuery, Redshift, or Azure Synapse.
- Work with CI/CD pipelines for automated DBT deployments using Git, GitHub Actions, Azure DevOps, or similar tools.
- Participate in code reviews, enforce engineering best practices, and contribute to improving internal frameworks and accelerators.
- Troubleshoot data issues and provide ongoing support for production data pipelines.
Required Skills & Experience
- 5-9 years of experience in Data Engineering, with at least 2 years of hands-on DBT development.
- Strong SQL skills and experience modeling data using methodologies like Kimball, Data Vault, or star/snowflake schema design.
- Expertise in at least one major cloud platform: Azure (preferred), AWS, or GCP.
- Hands-on experience with cloud data warehouses: Snowflake, BigQuery, Redshift, Databricks SQL, Synapse, etc.
- Solid understanding of software engineering practices such as version control, CI/CD, modular code design, and testing.
- Experience with orchestration tools (Airflow, Prefect, Dagster, Azure Data Factory, etc.).
- Familiarity with Python for scripting and automation.
- Strong problem-solving skills with the ability to work in a fast-paced consulting environment.
