Data Engineer
Commonwealth Bank of Australia
2 - 5 years
Bengaluru
Posted: 24/05/2025
Job Description
Organization: At CommBank, we never lose sight of the role we play in other people’s financial wellbeing. Our focus is to help people and businesses move forward: to make the right financial decisions and achieve their dreams, targets, and aspirations. Regardless of where you work within our organisation, your initiative, talent, ideas, and energy all contribute to the impact we can make with our work. Together we can achieve great things.
Job Title: Data Engineer-Big Data
Location: Bengaluru
Business & Team: RM & FS Data Engineering
Impact & contribution:
As a Senior Data Engineer with expertise in software development and a passion for building data-driven solutions, you’re ahead of trends and work at the forefront of Big Data and Data Warehouse technologies.
That’s why we’re the perfect fit for you. Here, you’ll be part of a team of engineers going above and beyond to improve the standard of digital banking, using the latest tech to solve our customers’ most complex data-centric problems.
To us, data is everything. It is what powers our cutting-edge features and it’s the reason we can provide seamless experiences for millions of customers from app to branch.
We’re responsible for CommBank’s key analytics capabilities and work to create world-leading capabilities for analytics, information management and decisioning. We work across the Cloudera Hadoop Big Data, Teradata Group Data Warehouse and Ab Initio platforms.
Roles & Responsibilities:
- Passionate about building next-generation data platforms and data pipeline solutions across the bank.
- Enthusiastic, and able to contribute to and learn from the wider engineering talent in the team.
- Ready to execute state-of-the-art coding practices, driving high quality outcomes to solve core business objectives and minimise risks.
- Capable of creating both technology blueprints and engineering roadmaps for a multi-year data transformation journey.
- Can lead and drive a culture where quality, excellence and openness are championed.
- Constantly thinking outside the box and breaking boundaries to solve complex data problems.
- Have experience providing data-driven solutions that source data from various enterprise data platforms into a Cloudera Hadoop Big Data environment using technologies such as Spark, MapReduce, Hive, Sqoop, and Kafka; transform and process the source data to produce data assets; and egress those assets to other platforms such as Teradata or RDBMS systems.
- Are experienced in building effective and efficient Big Data and Data Warehouse frameworks, capabilities, and features using a common programming language (Scala, Java, or Python), with proper data quality assurance and security controls.
- Are experienced in providing cloud-based, data-driven solutions, building enterprise data platforms on AWS using technologies such as S3, EMR, Glue, Iceberg, Kinesis, and MSK/Kafka, and transforming and processing data to produce data assets for Redshift and DocumentDB.
- Are confident in building group data products or data assets from scratch, by integrating large sets of data derived from hundreds of internal and external sources.
- Can collaborate, co-create and contribute to existing Data Engineering practices in the team.
- Have experience with, and take responsibility for, data security and data management.
- Have a natural drive to educate, communicate and coordinate with different internal stakeholders.
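At its core, the ingest–transform–egress flow described in the responsibilities above can be sketched in plain Python. This is a hypothetical illustration only (record fields like `account_id` and `updated` are invented for the example); in practice such a step would run on Spark against Hive tables rather than in-memory lists:

```python
# Minimal sketch of one transform step in an ingest -> transform -> egress
# pipeline: deduplicate source records and apply a basic data-quality check
# before publishing a clean data asset. Field names are hypothetical.

def transform(records):
    """Keep the latest version of each account record (by 'updated')
    and drop records that fail a simple quality check."""
    latest = {}
    for rec in records:
        if rec.get("balance") is None:
            continue  # basic data-quality control: reject incomplete records
        key = rec["account_id"]
        if key not in latest or rec["updated"] > latest[key]["updated"]:
            latest[key] = rec  # newer record wins
    # A deterministic ordering makes the asset easier to diff and validate.
    return sorted(latest.values(), key=lambda r: r["account_id"])

source = [
    {"account_id": "A1", "balance": 100.0, "updated": 1},
    {"account_id": "A1", "balance": 120.0, "updated": 2},  # newer duplicate
    {"account_id": "A2", "balance": None, "updated": 1},   # fails quality check
    {"account_id": "A3", "balance": 50.0, "updated": 1},
]

asset = transform(source)  # two clean records: A1 (latest) and A3
```

The same dedup-and-validate pattern is what a Spark job would express with window functions or `dropDuplicates`, just distributed across the cluster.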
Essential Skills:
- Preferably 5+ years of hands-on experience in a Data Engineering role.
- Experience in designing, building, and delivering enterprise-wide data ingestion, data integration and data pipeline solutions using common programming language (Scala, Java, or Python) in a Big Data and Data Warehouse platform.
- Experience in building data solutions on the Hadoop platform using Spark, Hive, MapReduce, Sqoop, Kafka, and various ETL frameworks for distributed data storage and processing; preferably 5+ years of hands-on experience.
- Experience in building data solutions using AWS Cloud technologies (EMR, Glue, Iceberg, Kinesis, MSK/Kafka, Redshift, DocumentDB, S3, etc.); preferably 2+ years of hands-on experience and AWS Data Engineer certification.
- Strong Unix/Linux Shell scripting and programming skills in Scala, Java, or Python.
- Proficient in SQL scripting, writing complex SQLs for building data pipelines.
- Experience in working in Agile teams, including working closely with internal business stakeholders.
- Familiarity with data warehousing and/or data mart build experience in Teradata, Oracle, or other RDBMS systems is a plus.
- Certification in Cloudera CDP, Hadoop, Spark, Teradata, AWS, or Ab Initio is a plus.
- Experience in Ab Initio software products (GDE, Co>Operating System, Express>It, etc.) is a plus.
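As one illustration of the SQL-pipeline skill listed above, a windowed deduplication query of the kind common in ingestion pipelines can be sketched with Python's built-in sqlite3 module. Table and column names here are hypothetical; the same SQL pattern applies in Hive, Teradata, or Redshift:

```python
import sqlite3

# Hypothetical staging table of raw transactions, where late-arriving
# corrections can produce multiple versions of the same txn_id.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_txn (txn_id TEXT, account_id TEXT, amount REAL, load_ts INTEGER);
    INSERT INTO stg_txn VALUES
        ('T1', 'A1', 10.0, 1),
        ('T1', 'A1', 10.5, 2),
        ('T2', 'A2', 99.0, 1);
""")

# Keep only the latest version of each transaction using a window function --
# a typical dedup step before publishing a curated data asset downstream.
rows = conn.execute("""
    SELECT txn_id, amount FROM (
        SELECT txn_id, amount,
               ROW_NUMBER() OVER (PARTITION BY txn_id ORDER BY load_ts DESC) AS rn
        FROM stg_txn
    ) WHERE rn = 1
    ORDER BY txn_id
""").fetchall()
# rows now holds the latest amount per txn_id: T1 -> 10.5, T2 -> 99.0
```

The `ROW_NUMBER() OVER (PARTITION BY ... ORDER BY ...)` idiom is portable across most warehouse SQL dialects, which is why it shows up so often in pipeline code.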
Educational Qualifications: B.Tech and above
If you're already part of the Commonwealth Bank Group (including Bankwest, x15ventures), you'll need to apply through Sidekick to submit a valid application. We’re keen to support you with the next step in your career.
We're aware of some accessibility issues on this site, particularly for screen reader users. We want to make finding your dream job as easy as possible, so if you require additional support please contact HR Direct on 1800 989 696.
About Company
The Commonwealth Bank of Australia (CBA) is one of Australia's leading financial institutions, offering a range of banking, investment, insurance, and financial services. Founded in 1911, it operates in Australia and internationally, focusing on retail banking, business banking, wealth management, and financial markets. Renowned for its digital innovation, CBA is a major player in advancing technology-driven banking solutions.