Staff, Data Engineer - Data Architecture [T500-20225]

Costco IT

2 - 5 years

Hyderabad

Posted: 10/12/2025

Job Description

About Costco Wholesale

Costco Wholesale is a multi-billion-dollar global retailer with warehouse club operations in eleven countries. They provide a wide selection of quality merchandise, plus the convenience of specialty departments and exclusive member services, all designed to make shopping a pleasurable experience for their members.

About Costco Wholesale India

At Costco Wholesale India, we foster a collaborative space, working to support Costco Wholesale in developing innovative solutions that improve members' experiences and make employees' jobs easier. Our employees play a key role in driving and delivering innovation to establish IT as a core competitive advantage for Costco Wholesale.


Position Title: Staff, Data Engineer

Job Description:

Roles & Responsibilities:

  • Shape and drive enterprise-wide data architecture strategy: Define and evolve the long-term technical vision for scalable, resilient data infrastructure across multiple business units and domains.
  • Lead large-scale, cross-functional initiatives: Architect and guide the implementation of data platforms and pipelines that enable analytics, AI/ML, and BI at an organizational scale.
  • Pioneer advanced and forward-looking solutions: Introduce novel approaches in real-time processing, hybrid/multi-cloud, and AI/ML integration to transform how data is processed and leveraged across the enterprise.
  • Mentor and develop senior technical leaders: Influence Principal Engineers, Engineering Managers, and other Staff Engineers; create a culture of deep technical excellence and innovation.
  • Establish cross-org technical standards: Define and enforce best practices for data modeling, pipeline architecture, governance, and compliance at scale.
  • Solve the most complex, ambiguous challenges: Tackle systemic issues in data scalability, interoperability, and performance that impact multiple teams or the enterprise as a whole.
  • Serve as a strategic advisor to executive leadership: Provide technical insights to senior executives on data strategy, emerging technologies, and long-term investments.
  • Represent the organization as a thought leader: Speak at industry events/conferences, publish thought leadership, contribute to open source and standards bodies, and lead partnerships with external research or academic institutions.


Technical Skills:

  • 15+ years of experience
  • Mastery of data architecture and distributed systems at enterprise scale: Deep experience in GCP.
  • Advanced programming and infrastructure capabilities: Expertise in writing database queries and in Python or Java, along with infrastructure-as-code tools such as Terraform or Cloud Deployment Manager.
  • Leadership in streaming and big data systems: Authority in tools such as BigQuery, Dataflow, Dataproc, and Pub/Sub for both batch and streaming workloads.
  • Enterprise-grade governance and compliance expertise: Design and implement standards for data quality, lineage, security, privacy (e.g., GDPR, HIPAA), and auditability across the organization.
  • Strategic integration with AI/ML ecosystems: Architect platforms that serve advanced analytics and AI workloads (Vertex AI, TFX, MLflow).
  • Exceptional ability to influence across all levels: Communicate technical vision to engineers, influence strategic direction with executives, and drive alignment across diverse stakeholders.
  • Recognized industry leader: Demonstrated track record through conference presentations, publications, open-source contributions, or standards development.


Must Have Skills:

  • Deep expertise in data architecture, distributed systems, and GCP.
  • Python or Java, infrastructure-as-code (e.g. Terraform)
  • Big data tools: BigQuery (expert level, with experience in performance tuning and UDFs), Dataflow, Dataproc, Pub/Sub (batch + streaming)
  • Data governance, privacy, and compliance (e.g. GDPR, HIPAA)
  • Expert-level data modeling and architecture, with experience in hybrid architectures
  • Expert-level SQL skills
  • Deep understanding of BigQuery, including experience with partitioning, clustering, and performance optimization
  • Hands-on experience with Cloud Functions, Composer, Cloud Run, and Dataflow Flex Templates (able to write them)
  • Thorough understanding of cloud architecture concepts
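As a hedged illustration of the BigQuery partitioning and clustering experience the posting asks for, the sketch below generates the DDL for a date-partitioned, clustered table. All dataset, table, and column names are hypothetical and not taken from the posting; the DDL syntax follows standard BigQuery SQL.

```python
# Minimal sketch: BigQuery DDL for a partitioned and clustered table.
# Dataset/table/column names are hypothetical examples.

def build_partitioned_table_ddl(dataset: str, table: str) -> str:
    """Return a CREATE TABLE statement that partitions by the event date
    and clusters on common filter columns to reduce bytes scanned."""
    return (
        f"CREATE TABLE `{dataset}.{table}` (\n"
        "  order_id STRING,\n"
        "  member_id STRING,\n"
        "  order_ts TIMESTAMP,\n"
        "  amount NUMERIC\n"
        ")\n"
        "PARTITION BY DATE(order_ts)\n"   # queries filtering on date prune partitions
        "CLUSTER BY member_id, order_id\n"  # co-locates rows for common filters
        "OPTIONS (partition_expiration_days = 365)"
    )

print(build_partitioned_table_ddl("sales", "orders"))
```

Partitioning by `DATE(order_ts)` lets the engine skip whole partitions when queries filter on the timestamp, and clustering further sorts data within each partition, which are the two levers the posting's BigQuery optimization requirement refers to.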
