DET-RCE-Risk Data Engineer-GDSF02
Job description
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
RCE_Risk Data Engineer/Leads
Description – External
Job Description:
Our Technology team builds innovative digital solutions rapidly and at scale to deliver the next generation of Financial and Non-Financial services across the globe.
The position is a senior technical, hands-on delivery role requiring knowledge of data engineering, cloud infrastructure and platform engineering, platform operations, and production support using ground-breaking cloud and big data technologies.
The ideal candidate, with 6-8 years of experience, will possess strong technical skills, an eagerness to learn, a keen interest in the three key pillars our team supports (Financial Crime, Financial Risk, and Compliance technology transformation), the ability to work collaboratively in a fast-paced environment, and an aptitude for picking up new tools and techniques on the job, building on existing skillsets as a foundation.
In this role you will:
- Ingesting and provisioning raw datasets, enriched tables, and/or curated, re-usable data assets to enable a variety of use cases.
- Driving improvements in the reliability and frequency of data ingestion, including increasing real-time coverage.
- Supporting and enhancing data ingestion infrastructure and pipelines.
- Designing and implementing data pipelines that collect data from disparate sources across the enterprise, and from external sources, and deliver it to our data platform.
- Building Extract, Transform and Load (ETL) workflows, using both advanced data manipulation tools and programmatic data manipulation throughout our data flows, ensuring data is available at each stage of the flow, and in the form needed by each system, service and customer along it.
- Identifying and onboarding data sources using existing schemas and, where required, conducting exploratory data analysis to investigate and provide solutions.
- Evaluate modern technologies, frameworks, and tools in the data engineering space to drive innovation and improve data processing capabilities.
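The ingestion and ETL responsibilities above follow a familiar extract-transform-load shape. As a minimal sketch only (the source records, table name `curated_txn`, and the type-cast transformation are illustrative assumptions, not part of the role; sqlite3 stands in for the Oracle/Redshift targets named below):

```python
# Minimal ETL sketch: extract rows from a source, transform them into the
# shape the target needs, and load them into a relational store.
import sqlite3

def extract():
    # In practice this would read from disparate enterprise or external sources.
    return [{"id": 1, "amount": "120.50"}, {"id": 2, "amount": "75.00"}]

def transform(rows):
    # Put the data in the form needed by the downstream system (cast amounts).
    return [(r["id"], float(r["amount"])) for r in rows]

def load(conn, rows):
    conn.execute(
        "CREATE TABLE IF NOT EXISTS curated_txn (id INTEGER PRIMARY KEY, amount REAL)"
    )
    conn.executemany("INSERT INTO curated_txn (id, amount) VALUES (?, ?)", rows)
    conn.commit()

def run_pipeline(conn):
    load(conn, transform(extract()))

conn = sqlite3.connect(":memory:")
run_pipeline(conn)
```

A real pipeline in this stack would swap the in-memory extract for source connectors (ODI mappings, files, APIs) and the sqlite target for the data platform, but the stage boundaries stay the same.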
Core/Must-have skills:
- 3-8 years of expertise in designing and implementing data warehouses and data lakes using the Oracle tech stack (ETL: ODI, SSIS; DB: PLSQL and AWS Redshift).
- At least 4 years of experience managing data extraction, transformation and loading from various sources using Oracle Data Integrator, with exposure to other tools such as SSIS.
- At least 4 years of experience in database design and dimensional modelling using Oracle PLSQL and Microsoft SQL Server.
- Experience in developing ETL processes: ETL control tables, error logging, auditing, data quality, etc.; able to implement reusability, parameterization, workflow design, and so on.
- Advanced working SQL knowledge and experience working with relational and NoSQL databases, as well as working familiarity with a variety of databases (Oracle, SQL Server, Neo4j).
- Strong analytical and critical thinking skills, with ability to identify and resolve issues in data pipelines and systems.
- Expertise in data modelling and DB design, with skills in performance tuning.
- Experience with OLAP, OLTP databases, and data structuring/modelling with understanding of key data points.
- Experience building and optimizing data pipelines on Azure Databricks, AWS Glue or Oracle Cloud.
- Creating and supporting ETL pipelines and table schemas to accommodate new and existing data sources for the Lakehouse.
- Experience with data visualization tools (Power BI/Tableau) and SSRS.
Good to have:
- Experience of working in Financial Crime, Financial Risk and Compliance technology transformation domains.
- Certification in any cloud tech stack, preferably Microsoft Azure.
- In-depth knowledge and hands-on experience with data engineering, data warehousing and Delta Lake, both on-prem (Oracle RDBMS, Microsoft SQL Server) and in the cloud (Azure, AWS or Oracle Cloud).
- Ability to script (Bash, Azure CLI), code (Python, C#) and query (SQL, PLSQL, T-SQL), coupled with software version control systems (e.g. GitHub) and CI/CD systems.
- Design and development of systems for the maintenance of the Azure/AWS Lakehouse, ETL processes, business intelligence, and data ingestion pipelines for AI/ML use cases.
EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets.
Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate.
Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.