Data Architect

ValueMomentum

2 - 5 years

Hyderabad

Posted: 23/12/2025

Job Description

Data / ETL Architect

Experience: 14+ years

Skills: Databricks, ETL, PySpark, Python, Presales


Requirements

  • Minimum eight years of relevant experience as a data architect or data engineer building large-scale data solutions.
  • P&C (Property & Casualty) domain experience is a must.
  • Bachelor's degree in Engineering, Information Technology, Computer Science, or a related field.
  • Experience architecting large-scale data modernization, data migration, and data warehousing solutions on cloud-based data platforms (such as Snowflake).
  • Experience defining and operationalizing data strategy, data governance, data lineage, and quality standards.
  • Extensive knowledge of data engineering, data integration, and data management concepts (e.g., APIs, ETL, MDM, CRUD, Pub/Sub).
  • Experience with data modelling.
  • Experience with structured and hierarchical datasets (e.g., JSON, XML).
  • Engineering experience with large-scale system integration and analytics projects.
  • Consulting mindset: a highly collaborative, highly communicative approach with an eye toward influence rather than control.
  • Ability to work on high-level strategy and low-level tactical integration along with stakeholders at all levels of the organization.
  • Ability to communicate complex systems and concepts through diagrams and visuals.
  • Clear and concise communication skills, both written and oral.
  • Remains unbiased toward any specific technology or vendor; more interested in results.
  • Should have 15+ years of experience, with the last 4 years spent implementing cloud-native data solutions for a variety of data consumption needs such as modern data warehousing, BI, insights, and analytics.
  • Should have experience architecting and implementing end-to-end modern data solutions using AWS and advanced data processing frameworks such as Databricks.
  • Strong knowledge of cloud-native data platform architectures, data engineering, and data management.
  • Good knowledge of popular database and data warehouse technologies from Snowflake and AWS.
  • Demonstrated knowledge of data warehouse concepts; strong understanding of cloud-native databases and columnar database architectures.
  • Ability to work with Data Engineering, Data Management, BI, and Analytics teams in a complex IT development environment.
  • Good appreciation of, and at least one implementation experience with, data engineering processing substrates such as ETL tools, Confluent Kafka, and ELT techniques.
  • Exposure to a variety of NoSQL databases (at minimum key-value stores and/or document stores) and appliances; able to cite implementation experience, constraints, and performance challenges encountered in practice.
  • Preferable (nice to have): experience implementing analytic models using AWS SageMaker for production workloads.
  • Knowledge of designing and implementing Data Mesh and Data Products will be an added advantage.
