EY - GDS Consulting - AI and DATA - Data Engineer Lead - Manager

Location:  Bengaluru
Other locations:  Primary Location Only
Salary: Competitive
Date:  May 8, 2025

Job description

Requisition ID:  1595324

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Job Description for Lead Data Engineer

Objectives and Purpose

  • The Lead Data Engineer leads large-scale solution architecture design and optimization to provide streamlined insights to partners throughout the business. This individual leads a team of mid-level and senior data engineers and partners with visualization teams on data quality and troubleshooting needs.
  • The Lead Data Engineer will:
    • Implement data processes for the data warehouse and internal systems
    • Lead a team of Junior and Senior Data Engineers in executing data processes and providing quality, timely data management
    • Manage data architecture and design ETL processes
    • Clean, aggregate and organize data from disparate sources and transfer it to data warehouses.
    • Lead development, testing, and maintenance of data pipelines and platforms to enable quality data to be utilized within business dashboards and tools.
    • Support team members and direct reports in refining and validating data sets.
    • Create, maintain, and support the data platform and infrastructure that enable the analytics front end; this includes the construction, development, testing, and maintenance of architectures such as high-volume, large-scale data processing systems and databases, with proper verification and validation processes.

 

Your key responsibilities

  • Lead the design, development, optimization, and maintenance of data architecture and pipelines adhering to ETL principles and business goals.
  • Develop and maintain scalable data pipelines and build out new integrations using AWS-native technologies and Databricks to support increases in data sources, volume, and complexity.
  • Define data requirements, gather and mine large-scale structured and unstructured data, and validate data by running various data tools in the big data environment.
  • Lead ad hoc data analysis, support standardization and customization, and develop mechanisms to ingest, analyze, validate, normalize, and clean data.
  • Write unit, integration, and performance test scripts and perform the data analysis required to troubleshoot data-related issues (a brief illustrative sketch follows this list).
  • Implement processes and systems to drive data reconciliation and monitor data quality, ensuring production data is always accurate and available for key stakeholders, downstream systems, and business processes.
  • Lead the evaluation, implementation and deployment of emerging tools and processes for analytic data engineering to improve productivity.
  • Develop and deliver communication and education plans on analytic data engineering capabilities, standards, and processes.
  • Solve complex data problems to deliver insights that help achieve business objectives.
  • Partner with Business Analysts and Enterprise Architects to develop technical architectures for strategic enterprise projects and initiatives.
  • Coordinate with Data Scientists, visualization developers and other data consumers to understand data requirements, and design solutions that enable advanced analytics, machine learning, and predictive modelling.
  • Collaborate with AI/ML engineers to create data products that improve the productivity of analytics and data science team members.
  • Advise, consult, mentor and coach other data and analytic professionals on data standards and practices, promoting the values of learning and growth.
  • Foster a culture of sharing, re-use, and design for the scale, stability, and operational efficiency of data and analytical solutions.
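
For illustration only, a minimal sketch of the kind of unit test referenced above, assuming pytest and a simple PySpark transform; the function, column names, and cleaning rules are hypothetical and not part of the role description:

    # Hypothetical PySpark transform with a pytest unit test (illustrative sketch only).
    from pyspark.sql import SparkSession, DataFrame
    from pyspark.sql import functions as F

    def clean_orders(df: DataFrame) -> DataFrame:
        # Drop duplicate orders and remove non-positive amounts (assumed cleaning rules).
        return df.dropDuplicates(["order_id"]).filter(F.col("amount") > 0)

    def test_clean_orders():
        spark = SparkSession.builder.master("local[1]").appName("unit-test").getOrCreate()
        df = spark.createDataFrame(
            [("o1", 10.0), ("o1", 10.0), ("o2", -5.0)],
            ["order_id", "amount"],
        )
        result = clean_orders(df).collect()
        assert len(result) == 1
        assert result[0]["order_id"] == "o1"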

 

To qualify for the role, you must have the following:

 

Essential skillsets

  • Bachelor's degree in Engineering, Computer Science, Data Science, or related field
  • 10+ years of experience in software development, data engineering, ETL, and analytics reporting development.
  • Expert in building and maintaining data and system integrations using dimensional data modelling and optimized ETL pipelines.
  • Experience in designing and developing ETL pipelines using tools such as IICS, DataStage, Ab Initio, Talend, etc.
  • Advanced experience utilizing modern data architectures and frameworks such as data mesh, data fabric, and data product design
  • Experience with designing data integration frameworks capable of supporting multiple data sources, consisting of both structured and unstructured data
  • Proven track record of designing and implementing complex data solutions
  • Demonstrated understanding and experience using:
    • Data Engineering Programming Languages (e.g., Python, SQL)
    • Distributed Data Framework (e.g., Spark) 
    • Cloud platform services (AWS preferred)
    • Relational Databases
    • DevOps knowledge with continuous integration
    • AWS services such as Lambda, DMS, Step Functions, S3, EventBridge, CloudWatch, Aurora RDS, or related AWS ETL services
    • Knowledge of Data lakes, Data warehouses
    • Databricks/Delta Lakehouse architecture
    • Code management platforms such as GitHub, GitLab, etc.
  • Deep understanding of database architecture, data modelling concepts, and administration.
  • Hands-on experience with Spark Structured Streaming for building real-time ETL pipelines (a minimal sketch follows this list).
  • Proficient in programming languages (e.g., SQL, Python, PySpark) to design, develop, maintain, and optimize data architectures/pipelines that fit business goals.
  • Extract, transform, and load data from multiple external and internal sources into a single, consistent source using Databricks Lakehouse/Data Lake concepts, to serve business users and data visualization needs.
  • Leverage continuous integration and delivery principles to automate code deployment to elevated environments using GitHub Actions.
  • Excellent written and verbal communication skills, including storytelling and interacting effectively with multifunctional teams and other strategic partners.
  • Strong organizational skills with the ability to manage multiple projects simultaneously, operating as a leading member of globally distributed teams.
  • Strong problem solving and troubleshooting skills.
  • Lead and oversee the code review process within the data engineering team to ensure high-quality, efficient, and maintainable code, while optimizing for performance and scalability.
  • Ability to work in a fast-paced environment and adapt to changing business priorities.
  • Identify and implement strategies to optimize AWS/Databricks cloud costs, ensuring efficient and cost-effective use of cloud resources.
  • An understanding of Databricks Unity Catalog for effective data governance and for implementing robust access control mechanisms is highly advantageous.
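
As a hedged illustration of the Spark Structured Streaming experience listed above, a minimal sketch of a real-time ETL pipeline, assuming JSON events landing in S3 and a Delta Lake target on a Databricks-style runtime; all paths, schema fields, and names are hypothetical:

    # Minimal Spark Structured Streaming sketch (illustrative only; assumes Delta libraries
    # and S3 access are available, e.g., on a Databricks cluster).
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F
    from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

    spark = SparkSession.builder.appName("orders-stream-etl").getOrCreate()

    # Hypothetical event schema; a real pipeline would derive this from the source contract.
    schema = StructType([
        StructField("order_id", StringType()),
        StructField("customer_id", StringType()),
        StructField("amount", DoubleType()),
        StructField("event_time", TimestampType()),
    ])

    raw = (
        spark.readStream
        .schema(schema)
        .json("s3://example-bucket/raw/orders/")      # hypothetical source path
    )

    cleaned = (
        raw.dropDuplicates(["order_id"])              # basic de-duplication
        .filter(F.col("amount") > 0)                  # basic validation
        .withColumn("ingest_date", F.to_date("event_time"))
    )

    query = (
        cleaned.writeStream
        .format("delta")
        .outputMode("append")
        .option("checkpointLocation", "s3://example-bucket/checkpoints/orders/")  # hypothetical
        .partitionBy("ingest_date")
        .start("s3://example-bucket/curated/orders/")  # hypothetical Delta target
    )
    query.awaitTermination()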

 

Desired skillsets

  • Master’s degree in Engineering, Computer Science, Data Science, or a related field
  • Experience in a global working environment
  • Demonstrated understanding of and experience with:
    • CDK
    • The IICS data integration tool
    • Job orchestration tools such as Tidal, Airflow, or similar
    • NoSQL databases
  • Databricks Certified Data Engineer Professional
  • AWS Certified Data Engineer - Associate

 

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets.

Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate.

Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.