Big Data Architect
Minfy
8 - 10 years
Hyderabad
Posted: 04/04/2026
Job Description
Architect, design, and optimize large-scale data platforms using Databricks, Snowflake, and modern lakehouse patterns. Provide deep hands-on technical leadership while partnering with presales teams to shape scalable, cost-efficient solutions. Build proofs-of-concept as an individual contributor to validate architectures and demonstrate value.
Responsibilities
- Define end-to-end architecture for enterprise data platforms across Databricks, Snowflake, and cloud ecosystems, ensuring scalability, performance, governance, and security.
- Design and implement ingestion, transformation, and compute optimization frameworks using Spark, SQL, Delta Lake, and platform-native capabilities, balancing cost and performance.
- Develop standards for data modeling, data quality, medallion architecture, and workload orchestration, collaborating closely with data engineering, analytics, and ML teams.
- Partner with presales and solutioning teams to qualify use cases, shape solution architecture, estimate workload sizing, and build compelling POCs showcasing technical feasibility.
- Evaluate new capabilities in Databricks, Snowflake, and cloud services; drive platform modernization, automation, observability, and best-practice adoption across teams.
Knowledge & Skills
- Expert-level knowledge of Databricks (Spark, Delta Live Tables, Unity Catalog) and/or Snowflake (Warehouses, Query Optimization, Snowpark), with strong architectural depth.
- Strong grounding in data modeling, data warehousing, data governance, lakehouse patterns, and distributed systems concepts.
- Understanding of cloud ecosystems (AWS/Azure/GCP), including storage systems, IAM, networking, orchestration, and CI/CD for data pipelines.
- Ability to articulate complex architectural concepts to technical and business stakeholders, connecting platform capabilities to business outcomes.
- Advanced SQL, Python, and Spark programming skills with ability to optimize distributed workloads, troubleshoot cluster performance, and tune compute resources.
Mandatory Experience
- 5-8 years of experience designing and delivering large-scale data platform architectures, including at least 3 years hands-on in Databricks and/or Snowflake.
- Demonstrated experience building POCs, reference implementations, and end-to-end solutions as an individual contributor with deep technical ownership.
- Strong background in performance tuning, cost optimization, cluster sizing, query optimization, and workload management on modern data platforms.
- Experience working with cross-functional teams including data engineering, analytics, ML, and presales/sales solutioning functions.
- Hands-on experience with cloud data services (e.g., S3/ADLS/GCS, Lambda/Functions, Glue/Synapse/Data Factory, Kubernetes, CI/CD).