Data Architect Manager

at EY

Budapest, Közép-Magyarország, Hungary

Start Date: Immediate
Expiry Date: 25 Jul, 2024
Salary: Not Specified
Posted On: 30 Apr, 2024
Experience: 8 year(s) or above
Skills: Data Processing, Data Models, Information Technology, Transformation, Oracle ERP, Data Profiling, Computer Science, Data Extraction, SAP, Collaboration
Telecommute: No
Sponsor Visa: No
Required Visa Status:
Citizen, GC (Green Card), US Citizen, Student Visa, H1B, CPT, OPT, H4 (Spouse of H1B)

Employment Type:
Full Time, Part Time, Permanent, Independent - 1099, Contract – W2, C2H Independent, C2H W2, Contract – Corp 2 Corp, Contract to Hire – Corp 2 Corp

Description:

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
Let us introduce you to this job offer from EY GDS Hungary – a member of EY's global integrated Service Delivery Center network, established in Hungary in 2021.

SUMMARY:

We are looking for a highly skilled and experienced Senior Data Architect with extensive knowledge of Oracle ERP, SAP, and Databricks to join our team and play a crucial role in designing and implementing data quality solutions. You will lead the development and implementation of strategies and processes that ensure the accuracy, completeness, and consistency of our data across the organization, focusing on data within Oracle ERP and SAP systems while leveraging Databricks for data processing and transformation, including building and maintaining ETL pipelines using PySpark.
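
For illustration only (not part of the posting's requirements): a minimal sketch of the kind of PySpark ETL step the role refers to, assuming a Databricks or Spark environment. The table names and columns below are hypothetical.

# Illustrative only: table names, columns, and cleansing rules are hypothetical assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("erp_customer_etl").getOrCreate()

# Extract: raw customer records previously landed from an ERP export (hypothetical table).
raw = spark.read.table("raw_erp.customers")

# Transform: basic cleansing - trim identifiers, normalize country codes, drop duplicates.
cleaned = (
    raw
    .withColumn("customer_id", F.trim(F.col("customer_id")))
    .withColumn("country_code", F.upper(F.col("country_code")))
    .dropDuplicates(["customer_id"])
    .filter(F.col("customer_id").isNotNull())
)

# Load: write the curated result as a Delta table for downstream analytics.
cleaned.write.format("delta").mode("overwrite").saveAsTable("curated_erp.customers")

In practice the source would be an extract landed from Oracle ERP or SAP; Delta is assumed here simply because it is the default table format on Databricks.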

QUALIFICATIONS:

  • Bachelor’s degree in Computer Science, Information Technology, or a related field (Master’s degree preferred).
  • 8+ years of experience as a Data Architect or similar role.
  • Extensive knowledge of data quality principles and methodologies as applied to Oracle ERP, SAP systems, and big data processing in Databricks.
  • Experience in designing and implementing data quality solutions using various tools and technologies compatible with Oracle ERP, SAP, and Databricks (e.g., data profiling, data cleansing tools, data quality monitoring tools within Databricks).
  • Strong understanding of Oracle ERP and SAP data models, functionalities, and integrations.
  • Experience working with Oracle and SAP data extraction, transformation, and loading (ETL) processes.
  • Experience and expertise in utilizing Databricks for data processing, transformation, and analytics workflows, including building and maintaining ETL pipelines using PySpark.
  • Familiarity with data quality best practices within the context of enterprise resource planning (ERP) systems and big data processing.
  • Strong analytical and problem-solving skills.
  • Excellent communication and collaboration skills.

Responsibilities:

  • Partner with business stakeholders and data engineers to understand data quality requirements and challenges specific to Oracle ERP, SAP systems, and within the context of big data processing using Databricks, including ETL development needs.
  • Design and implement data quality solutions, including data cleansing, transformation, and validation processes for data originating from Oracle ERP, SAP, and processed within Databricks, utilizing PySpark for building and maintaining ETL pipelines.
  • Develop and maintain data quality standards, policies, and procedures specifically tailored to Oracle ERP, SAP, and Databricks data integration, processing, and ETL development.
  • Select and implement data quality tools and technologies compatible with Oracle ERP, SAP, and Databricks, including PySpark libraries and functionalities for ETL development within Databricks.
  • Monitor and assess data quality metrics within Oracle ERP, SAP, and Databricks pipelines, including metrics related to ETL performance and the quality of data produced by ETL processes (a minimal illustrative sketch of such checks follows this list).
  • Develop and implement data governance frameworks to ensure data quality and compliance, focusing on Oracle ERP, SAP, and Databricks data integration, processing, and ETL development.
  • Collaborate with data analysts and scientists to ensure data used in analysis from Oracle ERP, SAP, and processed through Databricks, including data transformed through ETL pipelines, is high-quality and reliable.
  • Document data quality processes and solutions for future reference, specifically regarding Oracle ERP, SAP, and Databricks data integration, processing, and ETL development.
  • Stay up to date on the latest trends and innovations in data quality technologies and best practices, particularly those relevant to Oracle ERP, SAP, Databricks, and PySpark for ETL development.
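
As a purely illustrative companion to the monitoring responsibility above, here is a minimal sketch of simple data quality metrics in PySpark; the table, columns, validity rule, and quarantine target are assumptions, not part of the role description.

# Illustrative only: the table, columns, rule, and quarantine target are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("erp_dq_metrics").getOrCreate()

df = spark.read.table("curated_erp.customers")  # hypothetical curated table

# Completeness: row count plus the share of rows missing key fields.
metrics = df.agg(
    F.count("*").alias("row_count"),
    F.avg(F.col("customer_id").isNull().cast("double")).alias("null_rate_customer_id"),
    F.avg(F.col("country_code").isNull().cast("double")).alias("null_rate_country_code"),
)

# Validity: rows whose country_code is not a two-letter code are quarantined for review.
invalid = df.filter(~F.col("country_code").rlike("^[A-Z]{2}$"))
invalid.write.format("delta").mode("append").saveAsTable("dq.quarantine_customers")

metrics.show(truncate=False)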


REQUIREMENT SUMMARY

Min: 8.0, Max: 13.0 year(s)

Information Technology/IT

IT Software - DBA / Datawarehousing

Information Technology

Graduate

Computer Science, Information Technology, or a related field (Master’s degree preferred)

Proficient

1

Budapest, Hungary