
EY - GDS Consulting - AI and DATA - DBT - Snowflake - Senior

Location:  Kochi
Other locations:  Primary Location Only
Salary: Competitive
Date:  Oct 21, 2025

Job description

Requisition ID:  1653338

At EY, we’re all in to shape your future with confidence. 

We’ll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. 

Join EY and help to build a better working world. 

 

Job Description – Data Engineer (Snowflake, dbt, Airflow, Insurance Domain)

Position: Data Engineer

Rank: Senior

Experience: 5–8 years

Employment Type: Full-time

 

Overview

We are seeking an experienced Data Engineer with strong expertise in Snowflake, dbt, and Apache Airflow (Astronomer), along with a solid understanding of insurance business processes. This role focuses on building and optimizing scalable data pipelines, ensuring data quality, and enabling analytics capabilities for insurance datasets.

 

Key Responsibilities

  • Design, develop, and maintain ELT data pipelines using Snowflake, dbt, and Airflow.
  • Build and optimize data models in Snowflake to support analytics and reporting requirements.
  • Implement data transformation logic in dbt for business rules and semantic layer creation.
  • Automate workflows and data ingestion processes with Airflow DAGs.
  • Collaborate with domain experts to translate insurance data concepts (policies, claims, underwriting) into technical pipelines.
  • Monitor system performance, troubleshoot data processing issues, and optimize queries.
  • Prepare documentation and conduct knowledge-sharing sessions for team members.

 

Required Skills

  • Hands-on expertise in Snowflake for data warehousing.
  • Strong proficiency with dbt for modular, testable SQL transformations.
  • Solid experience with Astronomer/Apache Airflow for orchestration and scheduling.
  • Strong SQL and data modelling skills.
  • Knowledge of ELT best practices and performance optimization.
  • Understanding of insurance data domains including policies, claims, actuarial data, and compliance.
  • Familiarity with version control (Git) and CI/CD practices for data projects.
  • Knowledge of Python is a plus.
  • Knowledge of Data Vault 2.0 and experience building Data Vault models with dbt is a plus.

 

Preferred Qualifications

  • Experience with cloud environments such as AWS, Azure, or GCP.
  • Knowledge of Python for scripting in Airflow workflows.

 

Education

  • Bachelor’s or Master’s degree in Computer Science, Information Systems, Data Engineering, or a related field.

EY | Building a better working world

EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets.

Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow.

EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.

Apply now »