Responsibilities
- Develop and maintain data models, transformations, and workflows using dbt (Data Build Tool) to support analytics and reporting requirements (see the model sketch after this list).
- Collaborate with data analysts, data engineers, and stakeholders to understand data needs and translate them into dbt models and transformations.
- Optimize and enhance existing dbt processes for improved performance, reliability, and maintainability.
- Ensure data quality and consistency by implementing data validation checks and monitoring data pipelines.
- Document dbt models, transformations, and processes for knowledge sharing and future reference.
- Stay current with industry best practices and trends in data modelling, data warehousing, and analytics.
- Write efficient and well-organized software to ship products in an iterative, continual release environment.
- Contribute to and promote good software engineering practices across the team.
- Mentor and educate team members to adopt best practices in writing and maintaining production code.
- Communicate clearly and effectively to technical and non-technical audiences.
- Actively contribute to and reuse community best practices.
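For illustration only: dbt models are most commonly SQL `SELECT` statements, and dbt (1.3+) also supports Python models on warehouses such as Snowflake. The sketch below shows the Python form; the model name, the upstream `stg_orders` model, and its columns are hypothetical.

```python
# models/daily_orders.py -- a dbt Python model (dbt-core 1.3+; Snowflake shown).
# Hypothetical example: "stg_orders" and its columns are illustrative names.
def model(dbt, session):
    dbt.config(materialized="table")

    # Pull the upstream staging model; on Snowflake this is a Snowpark
    # DataFrame, converted here to pandas for the aggregation.
    orders = dbt.ref("stg_orders").to_pandas()

    # Aggregate to one row per day -- the "transformation" in the model.
    daily = orders.groupby("order_date", as_index=False).agg(
        order_count=("order_id", "count"),
        total_revenue=("amount", "sum"),
    )
    return daily  # dbt materializes the returned DataFrame as a table
```

Validation checks of the kind mentioned above are typically declared alongside the model in a `schema.yml` (e.g. `not_null` and `unique` tests) and executed with `dbt test`.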
Requirements
- University or advanced degree in engineering, computer science, mathematics, or a related field.
- Proficiency in SQL and experience working with both relational and NoSQL databases.
- Hands-on experience with dbt for data modelling and transformation.
- Strong understanding of data warehousing concepts and best practices.
- Experience working with cloud-based data platforms such as Snowflake, BigQuery, or Redshift.
- Strong experience with the big data tech stack (Hadoop, Spark, Kafka, etc.).
- Experience with at least one cloud provider (AWS, GCP, or Azure; GCP preferred).
- Strong experience with object-oriented or functional programming languages: Python, Java, C++, Scala, etc.
- Strong knowledge of data pipeline and workflow management tools.
- Expertise in standard software engineering methodology, e.g., unit testing, code reviews, and design documentation.
- Experience creating data pipelines that prepare data appropriately for ingestion and consumption.
- Experience setting up, maintaining, and optimizing databases/filesystems for production use in reporting and analytics.
- Experience with workflow orchestration tools (Airflow, Tivoli, etc.); see the DAG sketch after this list.
- Working knowledge of Git and GitHub.
- Familiarity with analytics and visualization tools such as Looker, Tableau, or Power BI.
- Experience working in an Agile development environment.
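As a rough sketch of the orchestration requirement above, the DAG below builds and tests a dbt project on a daily schedule. It assumes Airflow 2.4+ with dbt installed on the worker; the DAG id and the project path `/opt/dbt/analytics` are hypothetical.

```python
# A minimal Airflow DAG that builds and tests a dbt project daily.
# Assumes Airflow 2.4+ and dbt on the worker; ids and paths are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_daily_build",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/analytics && dbt run --target prod",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt/analytics && dbt test --target prod",
    )

    # Only run data validation tests after the models build successfully.
    dbt_run >> dbt_test
```

Splitting `dbt run` and `dbt test` into separate tasks makes failures visible per stage; a single `dbt build` task is a common alternative.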