Patient Collect & Global Insights, Data Engineer

at  Sanofi US

Budapest, Közép-Magyarország, Hungary

Start Date: Immediate
Expiry Date: 16 Sep, 2024
Salary: Not Specified
Posted On: 18 Jun, 2024
Experience: 7 year(s) or above
Skills: Data Engineering, Functional Requirements, NoSQL, Data Warehousing, AWS, Technical Requirements, Data Processing, Snowflake, Relational Databases, Orchestration, Business Analytics, Technical Analysis, Algorithms, Integration, Google Cloud, Cloud Services
Telecommute: No
Sponsor Visa: No
Required Visa Status:
Citizen, GC (Green Card), US Citizen, Student Visa, H1B, CPT, OPT, H4 Spouse of H1B

Employment Type:
Full Time, Part Time, Permanent, Independent – 1099, Contract – W2, C2H Independent, C2H W2, Contract – Corp 2 Corp, Contract to Hire – Corp 2 Corp

Description:

KEY FUNCTIONAL REQUIREMENTS & QUALIFICATIONS:

  • Experience working with cross-functional teams to solve complex data architecture and engineering problems
  • Demonstrated ability to learn new data and software engineering technologies in a short amount of time
  • Good understanding of agile/scrum development processes and concepts
  • Strong technical analysis and problem-solving skills related to data and technology solutions
  • Excellent written, verbal, and interpersonal skills with ability to communicate ideas, concepts, and solutions to peers and leaders
  • Pragmatic and capable of solving complex issues, with technical intuition and attention to detail

KEY TECHNICAL REQUIREMENTS & QUALIFICATIONS:

  • Bachelor’s Degree or equivalent in Computer Science, Engineering, or relevant field
  • 7+ years of experience in data engineering, integration, data warehousing, business intelligence, business analytics, or comparable role with relevant technologies (ETL/ELT) and tools (Informatica/IICS, AWS, Snowflake a plus)
  • Strong proficiency in Snowflake for data warehousing and analytics
  • Hands-on experience with Informatica Intelligent Cloud Services (IICS) for data integration and ETL
  • Understanding of data structures and algorithms
  • Working knowledge of scripting languages (SQL, Python, Shell scripting)
  • Solid understanding of cloud computing concepts and experience with cloud platforms such as AWS, Azure, or Google Cloud Platform
  • Experience with job scheduling and orchestration tools (Control-M; Airflow is a plus)
  • Familiarity with Splunk/Kafka for real-time data processing and streaming
  • Good knowledge of SQL, NoSQL, and relational database technologies/concepts
  • Familiarity with data governance and security best practices
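
The ETL/ELT skills listed above can be illustrated with a minimal sketch. This example uses Python's built-in sqlite3 purely as a stand-in for a warehouse such as Snowflake; the table, column names, and sample records are hypothetical:

```python
import sqlite3

def run_etl(rows):
    """Minimal ETL sketch: extract raw rows, transform them, load into a table."""
    conn = sqlite3.connect(":memory:")  # stand-in for a real warehouse connection
    conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
    # Transform: normalize region names and drop records with missing amounts
    cleaned = [(r["region"].strip().upper(), float(r["amount"]))
               for r in rows if r.get("amount") is not None]
    conn.executemany("INSERT INTO sales VALUES (?, ?)", cleaned)
    # Verify the load with a simple aggregate (a basic data-quality check)
    total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
    conn.close()
    return total

raw = [{"region": " emea ", "amount": "100.5"},
       {"region": "amer", "amount": None},
       {"region": "APAC", "amount": "49.5"}]
print(run_etl(raw))  # one record dropped during transform; the rest are totaled
```

In a production pipeline the in-memory database would be replaced by a connection to the actual warehouse, and the transform step would typically run in a managed tool such as IICS rather than inline Python.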

Responsibilities:

  • Design, implement, and maintain data pipelines and infrastructure to support our Digital Insights & Analytics team's data needs
  • Design, develop, and maintain scalable and efficient data pipelines to process, transform, and store large volumes of structured and unstructured data
  • Develop and maintain data models, schemas, and metadata to support analytics and reporting needs
  • Propose and establish technical designs to meet business and technical requirements
  • Work with DevOps teams to deploy and manage data infrastructure in cloud environments such as AWS, Azure, Snowflake and IICS Cloud Platform
  • Configure and optimize data ingestion workflows, ensuring data quality, reliability, and performance
  • Test and validate developed solutions to ensure they meet requirements
  • Create design and development documentation based on standards for knowledge transfer, training, and maintenance
  • Collaborate with stakeholders to understand business requirements and translate them into technical specifications for data solutions
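
The pipeline-design responsibilities above revolve around dependency-ordered task execution, the core idea behind schedulers like Airflow or Control-M. A minimal sketch using Python's standard library (the task names and dependency graph are hypothetical):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline tasks mapped to their upstream dependencies,
# in the spirit of an Airflow/Control-M style DAG
deps = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

def execution_order(dependencies):
    """Return one valid run order for the task graph."""
    return list(TopologicalSorter(dependencies).static_order())

print(execution_order(deps))
```

A real orchestrator adds scheduling, retries, and monitoring on top of this ordering, but the dependency graph is the same underlying structure.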


REQUIREMENT SUMMARY

Min: 7.0, Max: 12.0 year(s)

Information Technology/IT

IT Software - DBA / Datawarehousing

Software Engineering

Graduate

Computer Science, Engineering, or relevant field

Proficient

1

Budapest, Hungary