Senior Data Architect (Snowflake / Databricks / GCP)

ThreatXIntel

5 - 10 years

Chennai

Posted: 27/04/2026

Job Description

Company Description

ThreatXIntel is a growing Cybersecurity, IT Staffing, and Consulting company delivering end-to-end technology and security solutions.

We are hiring for our corporate client. ThreatXIntel is the official hiring partner for this requirement.


Job Overview

We are looking for a highly experienced Data Architect with strong expertise across Snowflake, Databricks, and GCP to design and lead enterprise-scale data platforms.

The ideal candidate should have deep experience in data architecture, lakehouse/data warehouse design, and cloud-based data solutions, along with the ability to drive end-to-end data strategy and implementation.

Immediate joiners are strongly preferred.


Key Responsibilities

Data Architecture & Strategy

  • Design and implement enterprise data architecture across Snowflake, Databricks, and GCP
  • Define data platform strategy, governance, and best practices
  • Build scalable data lakehouse and data warehouse architectures

Data Platform Engineering

  • Architect and optimize Snowflake data warehouse solutions
  • Design Databricks-based Lakehouse architectures (Delta Lake, Spark)
  • Build and manage GCP-based data platforms (BigQuery, Dataproc, etc.)

Data Pipelines & Integration

  • Design ETL/ELT pipelines for batch and real-time processing
  • Integrate data from multiple enterprise systems and external sources
  • Ensure high performance, scalability, and reliability

Governance & Security

  • Implement data governance, security, and access controls
  • Define data quality frameworks and compliance standards

Leadership & Stakeholder Management

  • Provide technical leadership and architectural guidance
  • Collaborate with business, analytics, and engineering teams
  • Translate business requirements into scalable technical solutions

Required Skills (Mandatory)

  • Strong experience in Snowflake Data Warehouse
  • Hands-on expertise in Databricks (Spark, Delta Lake)
  • Strong experience with GCP data services (BigQuery, Dataproc, etc.)
  • Expertise in data architecture, data modeling, and data warehousing
  • Strong knowledge of ETL/ELT and large-scale data pipelines
  • Experience with distributed data processing systems
  • Strong programming skills in Python / PySpark / SQL
  • Experience in cloud data platform design and optimization
