
GCP Data Architect (Kafka, MongoDB)

Awign Expert

2 - 5 years

Pune

Posted: 17/12/2025


Job Description

Job Title: GCP Data Architect

Location: Pune, Bangalore, Hyderabad, Chennai, Vadodara

Role Type: Permanent role

Compensation: Up to 35 LPA


Required Skills & Qualifications

  • 8+ years of experience in software engineering with at least 4 years in solution architecture.
  • Strong hands-on expertise in GCP, MongoDB, and Kafka.
  • Proficient in one or more programming languages (Java, Python, Go, Node.js).
  • Understanding of microservices, distributed systems, caching, messaging, and API best practices.
  • Experience in designing high-throughput, low-latency systems.
  • Familiarity with containerization (Docker) and orchestration (Kubernetes).
  • Excellent analytical, technical documentation, and communication skills.


Key Responsibilities

Architecture & Design

  • Architect highly scalable, reliable, and secure cloud-native applications on GCP.
  • Define end-to-end solution architectures involving microservices, event-driven patterns, API ecosystems, and data pipelines.
  • Design MongoDB data models, indexing strategies, partitioning/sharding approaches, and performance optimization.
  • Architect Kafka-based streaming solutions including topic design, consumer group strategy, message serialization, and delivery semantics.
  • Ensure solutions follow best practices in scalability, resiliency, observability, security, and cost optimization.
  • Create architecture documents, sequence diagrams, logical/physical data models, and integration patterns.

Hands-on Development

  • Develop and review code in languages such as Python, Java, Node.js, or Go.
  • Build microservices, APIs, data ingestion jobs, Kafka producers/consumers, and GCP-native applications.
  • Perform POCs and build reusable frameworks, utilities, and accelerators.
  • Support CI/CD pipeline development and automated test practices.
  • Conduct code reviews, troubleshoot performance issues, and mentor developers.

Cloud Platform Expertise (GCP)

  • Strong knowledge of compute services (GKE, Cloud Run, GCE), storage (GCS), networking (VPC, Cloud Load Balancing), and IAM.
  • Experience with GCP data services such as BigQuery, Pub/Sub, Dataflow, Cloud Composer, and Cloud Functions.
  • Optimize cloud workloads for performance and cost.
  • Design and implement observability using Cloud Logging, Cloud Monitoring, and APM tooling.
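As an illustrative sketch of the observability point above: on GKE or Cloud Run, one common pattern is emitting JSON lines to stdout, which the Cloud Logging agent parses into structured log entries (the `severity` field is recognized specially). The function and field names below are invented for illustration.

```python
import json

def log_structured(severity: str, message: str, **fields):
    """Emit one JSON line to stdout; on GKE/Cloud Run the logging agent
    turns it into a structured Cloud Logging entry. 'severity' is a
    special field the agent maps to the entry's log level."""
    entry = {"severity": severity, "message": message, **fields}
    print(json.dumps(entry))
    return entry

# Example: a structured application log with custom fields.
log_structured("INFO", "order processed", order_id="ORD-1001", latency_ms=42)
```

Custom fields like `order_id` become queryable `jsonPayload` attributes in Cloud Logging, which is what makes this preferable to plain-text log lines.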

MongoDB Expertise

  • Schema design for document databases aligned with access patterns and performance goals.
  • Experience with MongoDB Atlas, replica sets, sharding, backup/restore, and cluster tuning.
  • Strong understanding of query optimization, aggregations, and schema evolution.
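To make the MongoDB bullets concrete, here is a hypothetical sketch (collection, field names, and values are all invented) of a document model, a compound index following the equality-sort-range ordering, and an aggregation pipeline, expressed as plain Python structures in pymongo style:

```python
# Hypothetical "orders" document: line items are embedded to match the
# dominant access pattern (read an order with its items in one round trip).
order_doc = {
    "order_id": "ORD-1001",
    "customer_id": "CUST-42",
    "status": "SHIPPED",
    "created_at": "2025-01-15T10:00:00Z",
    "items": [{"sku": "SKU-1", "qty": 2, "price": 499.0}],
}

# Compound index for "recent orders for a customer filtered by status":
# equality fields first (customer_id, status), then the sort field.
order_index = [("customer_id", 1), ("status", 1), ("created_at", -1)]

# Aggregation: total revenue per customer across shipped orders.
revenue_pipeline = [
    {"$match": {"status": "SHIPPED"}},
    {"$unwind": "$items"},
    {"$group": {
        "_id": "$customer_id",
        "revenue": {"$sum": {"$multiply": ["$items.qty", "$items.price"]}},
    }},
    {"$sort": {"revenue": -1}},
]
```

The index field order matters: putting the equality predicates before the sort field lets a single index serve both the filter and the sort without an in-memory sort stage.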

Kafka Expertise

  • Experience with Kafka clusters (self-managed or Confluent Cloud).
  • Design and implement Kafka producers, consumers, stream processors.
  • Experience with Schema Registry, Connectors (Kafka Connect), KStreams/KSQL is a plus.
  • Tuning throughput, consumer lag handling, partition strategies, and ensuring high availability.
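As a rough sketch of the delivery-semantics and partitioning points above: a producer configuration aimed at durable, idempotent delivery (keys follow librdkafka/confluent-kafka naming; the values are one reasonable starting point, not a universal recommendation), plus a deterministic key-based partitioner that preserves per-key ordering. The broker address is a placeholder.

```python
import hashlib

# Hypothetical producer settings for at-least-once delivery without duplicates.
producer_config = {
    "bootstrap.servers": "broker-1:9092",  # placeholder address
    "acks": "all",                 # wait for all in-sync replicas
    "enable.idempotence": True,    # broker de-duplicates producer retries
    "compression.type": "lz4",     # trade CPU for network throughput
    "linger.ms": 5,                # small batching window for throughput
}

def pick_partition(key: str, num_partitions: int) -> int:
    """Deterministic key-based partitioning: every event with the same key
    lands on the same partition, so per-key ordering is preserved."""
    digest = hashlib.md5(key.encode("utf-8")).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions
```

Because partition assignment depends only on the key, resizing `num_partitions` reshuffles keys, which is why partition counts are usually chosen generously up front.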

Cross-functional Collaboration

  • Work with product managers, business analysts, developers, and QA engineers to translate requirements into technical solutions.
  • Provide technical leadership, mentoring, and architectural governance.
  • Drive architecture reviews, performance reviews, and security reviews.

DevOps & Security

  • Experience in CI/CD tools (GitHub Actions, Jenkins, GitLab).
  • Knowledge of Terraform/Infrastructure-as-Code on GCP.
  • Implement secure coding practices, encryption, IAM, VPC design, secret management.
  • Perform threat modeling and ensure compliance with organizational standards.
