Senior Data Engineer - EY wavespace AI & Data Hub
Job description
About Us
At EY wavespace Madrid - Data & AI Hub, we are a diverse, multicultural team at the forefront of technological innovation, working with cutting-edge technologies such as generative AI, data analytics, and robotics. Our center is dedicated to exploring the future of AI and data.
What We Offer
Join our Data & AI Hub, where you will have the chance to work in a vibrant and collaborative environment. You will work hands-on in advanced data engineering, leveraging modern technologies to deliver innovative data solutions and meaningful business insights. Our team supports your growth and development, providing access to the latest tools and resources.
As a Data Engineer at EY wavespace, you will take your data engineering career to the next level. You will have opportunities to work with cutting-edge technologies and innovative architectures that drive AI and data-driven solutions. Collaborating closely with the AI domain at our AI & Data Hub, you will tackle complex challenges that push the limits of what is possible in data engineering, ensuring that your work is not only technically stimulating but also impactful in shaping the future of our clients' businesses.
Key Responsibilities:
- Design and implement scalable data pipelines using Databricks (PySpark, Delta Lake), Snowflake and/or Microsoft Fabric.
- Build robust ELT/ETL workflows that support batch and streaming data ingestion.
- Implement best practices for data quality, observability, lineage and metadata management.
- Implement and manage data integration processes, ensuring seamless data flow between various systems and platforms.
- Contribute to data modernization initiatives by ensuring that data is AI-ready.
- Conduct data profiling and cleansing activities to ensure high data quality and integrity across all data systems.
- Implement and manage streaming systems to enable real-time data processing and analytics.
Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, Data Science, or another relevant field.
- Proven hands-on experience with Databricks, Snowflake, and Azure in a professional setting.
- Experience delivering projects in Microsoft and/or Databricks environments.
- Familiarity with data governance, Unity Catalog, and/or Microsoft Purview.
- Solid programming skills in Python and SQL.
- Industry experience in consulting or financial services is valued.
- Proficiency in English.
- Deep understanding of distributed data processing, cloud data architectures and data warehouse principles.
- 3-5 years of relevant experience.
- Strong communication, teamwork, and problem-solving skills.
#LI-HYBRID