Data Engineer

RandomTrees

2 - 5 years

Hyderabad

Posted: 22/02/2026

Job Description

Data Engineering Specialist

Experience: 7+ Years

Location: Chennai/Hyderabad

Work Mode: Hybrid (3 days per week in office)

Overview of the requirement:

RandomTrees is looking for a skilled Data Engineering Specialist to design and implement data solutions. The ideal candidate will have experience with Snowflake, SQL, DBT (or comparable data modelling tools), Python/PySpark, and Azure/AWS/GCP, along with a strong foundation in cloud platforms. You will be responsible for developing scalable, efficient data architectures that enable personalized customer experiences and advanced analytics.

Roles and Responsibilities:

  • Implement and maintain data warehousing solutions in Snowflake to handle large-scale data processing and analytics needs.
  • Optimize workflows using DBT or other data modelling tools to streamline data transformation and modelling processes.
  • Strong expertise in SQL/PL-SQL with hands-on experience in querying, transforming, and analysing large datasets.
  • Solid experience in Python programming.
  • Expertise with cloud data platforms for large-scale data processing.
  • Solid understanding of data profiling, validation, and cleansing techniques.
  • Support both real-time and batch data integration, ensuring data is accessible for actionable insights and decision-making.
  • Strong understanding of data modelling, ETL/ELT processes, and modern data architecture frameworks.
  • Hands-on experience with Python for data engineering tasks and scripting.
  • Collaborate with cross-functional teams to identify and prioritize project requirements.
  • Develop and maintain large-scale data warehouses on Snowflake.
  • Optimize database performance and ensure data quality.
  • Troubleshoot and resolve technical issues related to data processing and analysis.
  • Participate in code reviews and contribute to improving overall code quality.

Job Requirements:

  • Strong understanding of data modelling and ETL concepts.
  • Experience with Snowflake and any data modelling tool is highly desirable.
  • Experience using DBT to streamline data transformation and modelling processes.
  • Hands-on experience with Python for data engineering tasks and scripting.
  • Strong expertise in SQL with hands-on experience in querying, transforming, and analysing large datasets.
  • Expertise with cloud data platforms (Azure preferred) and Big Data technologies for large-scale data processing.
  • Excellent problem-solving skills and attention to detail.
  • Ability to work collaboratively in a team environment.
  • Strong communication and interpersonal skills.
  • Familiarity with agile development methodologies.
