Associate Vice President - Senior Lead Data Engineer [T500-23254]
Deutsche Börse
5 - 10 years
Hyderabad
Posted: 17/02/2026
Job Description
About Deutsche Börse Group:
Headquartered in Frankfurt, Germany, Deutsche Börse Group is a leading international exchange organization and market infrastructure provider. They empower investors, financial institutions, and companies by facilitating access to global capital markets.
Their India centre, located in Hyderabad, serves as a key strategic hub and comprises India's top-tier tech talent. They focus on crafting advanced IT solutions that elevate market infrastructure and services. Deutsche Börse Group in India is composed of a team of capital market engineers forming the backbone of financial markets worldwide.
Corporate IT of Deutsche Börse Group is in charge of end-user workplace experience, voice & communication, and application development and operations for all group processes such as Financial Core, Customer Care, Control & Corporate Processes, as well as Deutsche Börse Group's Reference Data Platform. We also develop and operate the group's Enterprise Analytics solutions, which form the core of sharing and measuring our group's success. Our mission is simple: Make IT Run!
As a member of the Enterprise Analytics Team, you must be an experienced and inspiring specialist in the dimensions of technology and analytics, with a proven record in strategic or operational execution. You are a role model in a team of junior and senior talents with diversity in experience, background, and locations. Translating business processes into numbers and KPIs, fully digital and automated, must be in your DNA. You must embrace transparency and simple access to data from diverse ecosystems (SAP & non-SAP) via the Data Mesh, enabling faster decisions based on reliable data. The perfect candidate for this role will have a can-do, positive attitude. If you strive to take ownership and develop creative solutions, are fascinated by technology, and like to work in a challenging and fast-paced environment, then you are exactly the person we are looking for.
Tasks / Responsibilities:
- Conception and implementation of innovative data analytics solutions and projects for our business partners on the Enterprise Analytics & Reporting Platform.
- Design and Development: Create and manage data models & data pipelines using GCP services such as BigQuery, Dataflow, Pub/Sub, Cloud Composer, Cloud Data Fusion, and Cloud Storage.
- Data Integration: Integrate data from various sources, ensuring data quality and consistency.
- Optimization: Optimize data processing workflows for performance and cost-efficiency.
- Security and Compliance: Implement data security measures and ensure compliance with relevant regulations and best practices.
- Monitoring and Maintenance: Monitor data pipelines and troubleshoot issues to ensure smooth operation.
- Collaborate with various stakeholders to analyse, define, and prioritize business requirements and translate them into technical specifications
- Build, strengthen, and maintain close relationships with our main stakeholders
- Foster and drive the analytics and reporting culture across the whole organization
Qualifications / Required skills:
- Technical or University Degree in Business Informatics, Business Management, Information Technology or a similar field paired with a passion for data
- Professional experience in Data Engineering (5+ years)
- Strong capabilities in ETL / ELT development and deployment of data processing pipelines using CI/CD.
- Background in cloud technologies, preferably Google Cloud and associated cloud services (BigQuery, BigQuery ML, Dataflow, Data Fusion, Cloud Composer, Cloud Run); Azure, AWS, and others would be an advantage
- Experience with Infrastructure as Code (IaC) tools like Terraform for deploying and managing GCP resources with basic understanding of Linux operating system.
- Very good knowledge of programming languages such as SQL, Python, and Java, with experience in Java Spring Boot.
- Experience with setting up VMs, VPCs
- Experience with Cloud Workstation and Cloud Shell
- Experience with Cortex Framework
- Experience with GitHub Repo Setup
- Experience in containerized applications such as Docker and Kubernetes
- Ability to effectively explain relevant analytics concepts and technologies, with a strong passion for analyzing business needs together with various stakeholders
- Understanding of modern analytics tools (e.g. SAP Analytics Cloud, Power BI, Looker Studio)
- Proven expertise in using an agile project methodology (Scrum)
- Good communication with peers, technical teams, and business representatives
- Fluent English is a must