
Senior Data Engineer [T500-25696]

Ferguson GCC

5 - 10 years

Bengaluru

Posted: 07/05/2026


Job Description

Company Overview:

Since 1953, Ferguson has been a source of quality supplies for a wide range of industries. Together, we build better infrastructure, better homes, & better businesses. We exist to make our customers' complex projects simple, successful, & sustainable by proactively solving problems, adapting to change, & continuously improving how we serve our customers, communities, & each other.

Ferguson is a Fortune 500 company providing best-in-class products, services, & capabilities across multiple industries including Commercial/Mechanical, Facilities Supply, Fire & Fabrication, HVAC, Industrial, Residential Trade, Residential Building & Remodel, Waterworks, & Residential Digital Commerce. With approximately 36,000 associates across 1,700 locations, Ferguson is a community of people working toward a shared purpose of building something meaningful.

Within Ferguson Enterprise, the Reporting & Analytics organization supports the business by developing scalable data & reporting solutions that help teams better understand performance & make informed decisions. Our teams focus on building practical, high-quality analytics tools in a collaborative environment where technical excellence, ownership, & continuous improvement are valued. At Ferguson, you will have the opportunity to build a career you are proud of at a company you can believe in.


LEVEL 2 Senior (Senior Data Engineer)

Job Summary:

The Senior Data Engineer is an advanced individual contributor responsible for designing & developing complex semantic models & scalable reporting solutions. This role owns technical solution design within assigned workstreams & ensures delivered solutions meet performance, quality, & maintainability standards. Senior Engineers provide technical guidance to other developers, contribute to advanced analytics & predictive modeling initiatives, & play a key role in maintaining consistency & best practices across the reporting platform.


Essential Duties & Responsibilities:

  • Design and develop complex Power BI semantic models and scalable reporting solutions leveraging curated Databricks Lakehouse layers (Silver/Gold) and enterprise data sources.
  • Write advanced SQL (including Databricks SQL) and DAX to implement complex business logic, standardized calculations, and reusable metrics.
  • Architect and maintain shared semantic models and datasets that enable consistent, scalable, and reusable analytics across reporting solutions.
  • Apply advanced modeling techniques including calculation groups and complex dimensional structures aligned to Lakehouse-based data design.
  • Diagnose and resolve performance issues across Databricks and Power BI, including query optimization, model efficiency, refresh performance, and data volume management.
  • Collaborate with data engineering teams to define and consume curated Gold-layer datasets, ensuring alignment with reporting and analytics requirements.
  • Refactor existing reports and datasets to transition from isolated imports to governed semantic models built on Databricks-backed data products.
  • Implement and enforce dataset governance practices including certification, documentation, lineage awareness, and metric standardization.
  • Develop and validate data quality checks across Silver and Gold layers, identifying and addressing upstream data issues.
  • Design and implement automated analytical workflows integrating Power BI, Python, Databricks, and the Power Platform.
  • Build forecasting, trend analysis, and statistical models supporting advanced and predictive analytics use cases.
  • Perform code reviews and provide technical guidance to Associate developers, ensuring adherence to modeling, DAX, and reporting standards.
  • Design semantic models optimized for AI-driven querying, ensuring datasets include standardized metrics, well-defined relationships, & rich metadata.

Skills & Qualifications:

  • Bachelor's degree in Computer Science, Information Systems, Data Analytics, or equivalent experience.
  • Advanced expertise in SQL, DAX, & Power BI.
  • 3–6 years of Power BI development experience.
  • Experience working with modern data platforms such as Databricks and querying data using Databricks SQL.
  • Understanding of Lakehouse architecture concepts, including bronze, silver, and gold data layers.
  • Experience integrating Databricks data with Power BI semantic models (Import and DirectQuery).
  • Familiarity with distributed data processing concepts and performance considerations for large-scale datasets.
  • Experience using Python or R for predictive analytics & statistical modeling.
  • Proven ability to design performant & scalable datasets.
  • Experience tuning DAX, model relationships, & refresh performance.
  • Strong ownership mindset for solution quality & stability.
