
GCP Data Architect

NR Consulting

2 - 5 years

Bengaluru

Posted: 12/02/2026


Job Description

Primary Roles and Responsibilities:

Select and integrate the Big Data tools and frameworks required to deliver requested capabilities

Develop and maintain data pipelines implementing ETL processes, monitor their performance, and advise on any necessary infrastructure changes

Translate complex technical and functional requirements into detailed designs

Investigate and analyze alternative solutions for data storage and processing to ensure the most streamlined approaches are implemented

Serve as a mentor to junior staff by conducting technical training sessions and reviewing project outputs
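The ETL pipeline work described above can be sketched as a minimal batch job. In a Cloud Composer deployment each step would typically be an Airflow task; here the flow is shown as plain Python functions, and the record fields, amounts, and in-memory "warehouse" target are illustrative assumptions, not part of the role.

```python
# Minimal ETL sketch: extract raw rows, transform them, load into a target store.
# The source rows and the dict-based target are illustrative assumptions.

def extract():
    """Simulate pulling raw rows from a source system."""
    return [
        {"user_id": "u1", "amount": "19.75"},
        {"user_id": "u2", "amount": "5.00"},
        {"user_id": "u1", "amount": "3.25"},
    ]

def transform(rows):
    """Cast string amounts to floats and aggregate spend per user."""
    totals = {}
    for row in rows:
        totals[row["user_id"]] = totals.get(row["user_id"], 0.0) + float(row["amount"])
    return totals

def load(totals, target):
    """Write the aggregated view into the target store."""
    target.update(totals)

warehouse = {}
load(transform(extract()), warehouse)
print(warehouse)  # {'u1': 23.0, 'u2': 5.0}
```

In an orchestrated pipeline, each function would become a separately scheduled and monitored task, which is what makes performance monitoring and retries per stage possible.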

Skills and Qualifications:

Strong understanding of data warehousing and data modeling techniques.

Proficient understanding of distributed computing principles - Hadoop v2, MapReduce, HDFS

Strong data engineering skills on GCP, including Airflow, Cloud Composer, Data Fusion, Dataflow, Dataproc, and BigQuery

Experience with building stream-processing systems, using solutions such as Storm or Spark Streaming

Good knowledge of Big Data querying tools, such as Pig, Hive, and Impala

Experience with Spark, SQL, and Linux.

Knowledge of various ETL techniques and frameworks, such as Flume, Apache NiFi, or dbt.

Experience with various messaging systems, such as Kafka or RabbitMQ.

Good understanding of Lambda Architecture, along with its advantages and drawbacks
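The Lambda Architecture mentioned above can be illustrated with a toy example: the batch layer periodically recomputes a view from the complete event log, the speed layer covers events not yet absorbed by a batch run, and the serving layer merges both at query time. The page-view counting use case and event shapes here are illustrative assumptions.

```python
# Lambda Architecture sketch: batch view + speed view, merged at query time.
# Page-view counting and event shapes are illustrative assumptions.

from collections import Counter

def batch_view(all_events):
    """Batch layer: recompute counts from the complete (immutable) event log."""
    return Counter(e["page"] for e in all_events)

def speed_view(recent_events):
    """Speed layer: incrementally count events not yet in the batch view."""
    return Counter(e["page"] for e in recent_events)

def query(batch, speed, page):
    """Serving layer: merge batch and real-time views for a complete answer."""
    return batch.get(page, 0) + speed.get(page, 0)

history = [{"page": "/home"}, {"page": "/home"}, {"page": "/pricing"}]
recent = [{"page": "/home"}]

bv = batch_view(history)
sv = speed_view(recent)
print(query(bv, sv, "/home"))  # 3
```

The advantage this sketch shows is accuracy plus low latency; the drawback is maintaining the same counting logic in two layers, which is exactly the operational cost the posting alludes to.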
