Data Engineer – Informatica BDM | Data Pipelines | Orchestration | DWH | AWS & Cloud
Derisk360
2 - 5 years
Bengaluru
Posted: 10/01/2026
Job Description
Location: Bangalore, Chennai, Gurgaon
Experience: 8+ Years (hands-on Informatica BDM mandatory; Banking domain experience required)
Organization: Derisk360
About the Role
We are looking for a hands-on Data Engineer with strong expertise in Informatica BDM, data orchestration, data pipeline design, and cloud/on-prem Data Warehousing platforms. The candidate should be able to work proactively in Agile Sprint delivery, collaborate with product, architecture, and testing teams, and take ownership of data ingestion, transformation, performance, and reliability across large-scale data ecosystems.
This is a high-impact engineering role suited to professionals who thrive in a fast-growing startup environment and bring rich Banking-domain experience.
Key Responsibilities
- Design, develop, and maintain scalable data pipelines & orchestration workflows across on-prem and cloud platforms
- Build & enhance ETL / ELT solutions using Informatica BDM for ingestion, transformation, and data integration
- Implement data pipeline automation, scheduling, monitoring, and recovery handling
- Develop data ingestion frameworks for structured, semi-structured, and file-based sources
- Optimize pipeline performance, parallel processing, partitioning, and workload tuning
- Work closely with Snowflake / Oracle / Teradata / AWS environments for:
  - data movement
  - staging and transformations
  - reconciliation and performance validation
- Support Oracle-to-Snowflake / cloud migration initiatives, including:
  - data movement strategy
  - transformation logic redesign
  - security, access, and audit controls
- Collaborate with Data Architects, Data Analysts, and Product Owners in Agile Sprint teams
- Contribute to design reviews, solution documentation, reusable frameworks, and best practices
- Provide technical production support for data pipelines and batch processing
- Actively contribute to startup culture, ownership mindset, and continuous improvement
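As a flavour of the "automation, scheduling, monitoring, and recovery handling" responsibility above, here is a minimal sketch in Python of a retry wrapper for a pipeline step. The function name and parameters are hypothetical illustrations, not part of this role's actual stack:

```python
import time

def run_with_retry(step, max_attempts=3, base_delay=1.0):
    """Run a pipeline step callable, retrying with exponential backoff.

    `step` is any zero-argument callable (e.g. a function that triggers
    an ingestion job). Re-raises the last exception if all attempts fail.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return step()
        except Exception as exc:
            if attempt == max_attempts:
                raise
            delay = base_delay * 2 ** (attempt - 1)
            print(f"attempt {attempt} failed ({exc}); retrying in {delay:.0f}s")
            time.sleep(delay)
```

In practice this kind of recovery logic is usually delegated to the scheduler (e.g. Autosys job restart conditions) rather than hand-rolled, but the idea is the same.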
Required Skills & Experience
Core Data Engineering (Mandatory)
- Strong hands-on experience with Informatica BDM and AWS services
- Experience in Data Warehousing (DWH) environments
- Strong SQL skills across Snowflake, Oracle, and Teradata
- Experience in database design, query optimization, and performance tuning
Data Pipelines & Orchestration
- Strong experience in data pipeline architecture and orchestration techniques
- Hands-on with workflow scheduling and automation tools
- Experience working with UNIX, Autosys, and shell scripting
Cloud & Platform Exposure
- Experience working on AWS and cloud data ecosystems
- Exposure to services such as S3, Glue, Redshift, IAM, Lambda, and Athena
- Experience in Oracle-to-Snowflake / cloud migration programs
- Awareness of data security, masking, encryption, and role-based access
- Experience with CI/CD for data pipelines
- Experience in data quality frameworks and metadata-driven design
Ways of Working
- Proven experience working in Agile Sprint delivery models
- Strong collaboration and communication skills
- Ability to operate in a fast-growing startup environment with proactive contribution
Good to Have (Added Advantage)
- Knowledge of Python for data validation / automation
- Exposure to PySpark / Spark-based processing
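To illustrate the kind of data-validation work mentioned above (reconciliation after data movement), a minimal sketch in Python. Table names and the tolerance parameter are hypothetical; real checks would typically query the source and target databases for these counts:

```python
def validate_row_counts(source_counts, target_counts, tolerance=0):
    """Compare per-table row counts between source and target systems.

    Returns a list of (table, source_count, target_count) tuples for
    any table whose counts differ by more than `tolerance` rows.
    """
    mismatches = []
    for table, src in source_counts.items():
        tgt = target_counts.get(table, 0)
        if abs(src - tgt) > tolerance:
            mismatches.append((table, src, tgt))
    return mismatches

# Example: flag tables where the migrated copy lost rows
src = {"customers": 100, "accounts": 50}
tgt = {"customers": 100, "accounts": 49}
print(validate_row_counts(src, tgt))  # [('accounts', 50, 49)]
```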
What We Value
- High ownership, learning mindset, and accountability
- Ability to work across multiple data platforms & evolving architectures
- Passion for scalable engineering, reliability, and performance
Why Join Us
- Opportunity to build enterprise-grade data engineering solutions
- Direct contribution to Derisk360's growth and client delivery
- Freedom to innovate and influence data engineering strategy