Skill: AWS Services
Location: Kochi
Drive Date: 14th December 2024 (in-person in Kochi)
Job Summary
We are seeking a highly skilled Sr. Developer with 6 to 10 years of experience to join our team. The ideal candidate will have extensive experience in MWAA (Airflow), AWS Glue DataBrew, Spark optimization, AWS CloudFormation, Amazon RDS, AWS CloudWatch, AWS DevOps, AWS IAM, Amazon DynamoDB, Python, AWS Glue Studio, AWS Glue ETL, AWS Glue Catalog, Amazon S3, and Apache Spark. The candidate should also have domain expertise in Property & Casualty Insurance. This is a hybrid work model with day shifts.
Responsibilities
- Develop and maintain data pipelines using MWAA (Airflow) to ensure efficient data flow (a minimal DAG sketch follows this list).
- Utilize AWS Glue DataBrew for data preparation and transformation tasks.
- Optimize Spark jobs to improve performance and resource utilization.
- Implement infrastructure as code using AWS CloudFormation.
- Manage and maintain Amazon RDS databases for application data storage.
- Monitor application performance and system health using AWS CloudWatch.
- Implement and manage CI/CD pipelines using AWS DevOps tools.
- Configure and manage AWS IAM roles and policies for secure access control.
- Develop and maintain NoSQL databases using Amazon DynamoDB.
- Write and maintain Python scripts for data processing and automation tasks.
- Use AWS Glue Studio to create and manage ETL jobs.
- Develop ETL processes using AWS Glue ETL and AWS Glue Catalog.
- Store and manage data in Amazon S3 for scalable storage solutions.
- Leverage Apache Spark for large-scale data processing and analytics.
- Collaborate with cross-functional teams to understand business requirements and deliver solutions.
- Ensure data quality and integrity throughout the data lifecycle.
- Provide technical guidance and mentorship to junior developers.
- Stay updated with the latest industry trends and best practices in AWS and data engineering.
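To give a flavour of the orchestration work described above, the sketch below shows a minimal MWAA (Airflow) DAG that triggers an AWS Glue ETL job. The DAG id, schedule, and Glue job name are illustrative assumptions only, not details of this role.

```python
# Minimal illustrative MWAA (Airflow) DAG that triggers an AWS Glue ETL job.
# The dag_id, schedule, and Glue job name ("pc_claims_etl") are assumed for
# illustration; replace them with project-specific values.
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.operators.glue import GlueJobOperator

with DAG(
    dag_id="pc_claims_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Start an existing Glue job and wait until it finishes before
    # downstream tasks run.
    run_claims_etl = GlueJobOperator(
        task_id="run_claims_etl",
        job_name="pc_claims_etl",
        wait_for_completion=True,
    )
```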
Qualifications
- Possess strong experience in MWAA (Airflow) for orchestrating data workflows.
- Demonstrate expertise in AWS Glue DataBrew for data preparation.
- Have a deep understanding of Spark optimization techniques (see the PySpark sketch after this list).
- Show proficiency in AWS CloudFormation for infrastructure management.
- Exhibit experience with Amazon RDS for relational database management.
- Be skilled in using AWS CloudWatch for monitoring and logging.
- Have hands-on experience with AWS DevOps tools for CI/CD.
- Be knowledgeable in AWS IAM for access control and security.
- Have experience with Amazon DynamoDB for NoSQL database management.
- Be proficient in Python for scripting and automation.
- Demonstrate experience with AWS Glue Studio for ETL job creation.
- Have expertise in AWS Glue ETL and AWS Glue Catalog for data integration.
- Be skilled in using Amazon S3 for scalable storage solutions.
- Have experience with Apache Spark for data processing and analytics.
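As a flavour of the Spark optimization expertise listed above, the sketch below shows one common technique: broadcasting a small dimension table so a large join stays map-side. All dataset names and S3 paths are hypothetical placeholders, not details from this posting.

```python
# Illustrative Spark optimization: broadcast a small dimension table so the
# join with a large fact table avoids a full shuffle.
# All dataset names and S3 paths are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("policy_claims_join").getOrCreate()

claims = spark.read.parquet("s3://example-bucket/curated/claims/")      # large fact table
policies = spark.read.parquet("s3://example-bucket/curated/policies/")  # small dimension table

# broadcast() ships the small policies table to every executor, keeping the
# join map-side and cutting shuffle volume on the large claims table.
joined = claims.join(broadcast(policies), on="policy_id", how="left")

joined.write.mode("overwrite").parquet("s3://example-bucket/analytics/claims_by_policy/")
```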
Certifications Required
AWS Certified Solutions Architect, AWS Certified DevOps Engineer, AWS Certified Big Data Specialty