Data Engineering Analyst

at Sanofi US

Budapest, Közép-Magyarország, Hungary

Start Date: Immediate
Expiry Date: 29 Dec, 2024
Salary: Not Specified
Posted On: 04 Oct, 2024
Experience: 3 year(s) or above
Skills: Maintenance, Languages, Algorithms, Computer Science, Project Management Skills, Automation, Logging, Containerization, Python, GitHub, Data Structures, Argo, Integration, Performance Improvement, Infrastructure, Snowflake, Finance, Scripting, Functional Requirements, Code
Telecommute: No
Sponsor Visa: No
Required Visa Status: US Citizen, Citizen, Green Card (GC), Student Visa, H1B, CPT, OPT, H4 (Spouse of H1B)
Employment Type: Full Time, Part Time, Permanent, Independent – 1099, Contract – W2, C2H Independent, C2H W2, Contract – Corp 2 Corp, Contract to Hire – Corp 2 Corp

Description:

ABOUT THE JOB

Sanofi has recently embarked on a vast and ambitious digital transformation program. A cornerstone of this roadmap is accelerating its data transformation and its adoption of artificial intelligence (AI) and machine learning (ML) solutions to speed up R&D, manufacturing, and commercial performance, and to bring better drugs and vaccines to patients faster, improving health and saving lives.
We are seeking a Data Engineering Analyst interested in challenging the status quo to ensure the seamless creation and operation of the data pipelines needed by Sanofi’s advanced analytics, AI, and ML initiatives, for the betterment of our global patients and customers.

KEY FUNCTIONAL REQUIREMENTS & QUALIFICATIONS:

  • 3+ years of relevant experience developing backend services, integrations, data pipelines, and infrastructure
  • Bachelor’s degree in computer science, engineering, or similar quantitative field of study
  • Expertise in database optimization and performance improvement
  • Expertise in Python, PySpark, and Snowpark
  • Experience with data warehouses and object-relational databases (Snowflake and PostgreSQL) and with writing efficient SQL queries
  • Experience in cloud-based data platforms (Snowflake, AWS)
  • Proficiency in developing robust, reliable APIs using Python and the FastAPI framework (a minimal sketch follows this list)
  • Understanding of data structures and algorithms
  • Experience with modern testing frameworks (SonarQube; K6 is a plus)
  • Strong collaboration skills and willingness to work with others to ensure seamless integration of server-side and client-side components
  • Knowledge of DevOps best practices (a plus), especially the setup, configuration, maintenance, and troubleshooting of the associated tools:
  • Containers and containerization technologies (Kubernetes, Argo, Red Hat OpenShift)
  • Infrastructure as code (Terraform)
  • Monitoring and Logging (CloudWatch, Grafana)
  • CI/CD Pipelines
  • Scripting and automation (Python, GitHub, GitHub Actions)
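
As a minimal sketch of the API work listed above, here is one way a small FastAPI service might look. FastAPI and Pydantic are real, widely used libraries; the endpoint paths, model fields, and in-memory store are hypothetical placeholders, not details from the posting.

    # Illustrative FastAPI service; run with: uvicorn app:app
    from fastapi import FastAPI, HTTPException
    from pydantic import BaseModel

    app = FastAPI(title="Pipeline status API (illustrative)")

    class PipelineStatus(BaseModel):
        name: str
        state: str  # e.g. "running", "succeeded", "failed"

    # In-memory store standing in for a real database such as PostgreSQL.
    _pipelines: dict[str, PipelineStatus] = {}

    @app.post("/pipelines", response_model=PipelineStatus, status_code=201)
    def register_pipeline(status: PipelineStatus) -> PipelineStatus:
        # Record (or overwrite) the status of a named pipeline.
        _pipelines[status.name] = status
        return status

    @app.get("/pipelines/{name}", response_model=PipelineStatus)
    def get_pipeline(name: str) -> PipelineStatus:
        # Return the recorded status, or a 404 if the pipeline is unknown.
        status = _pipelines.get(name)
        if status is None:
            raise HTTPException(status_code=404, detail="pipeline not found")
        return status

Once running, it can be exercised with, e.g., curl http://localhost:8000/pipelines/orders after a status has been POSTed. This is a sketch under the stated assumptions, not a description of the team's actual stack.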

Responsibilities:

MAIN RESPONSIBILITIES:

  • Ownership of the entire back end of the application, including the design, implementation, testing, and troubleshooting of the core application logic, databases, data ingestion and transformation, data processing and orchestration of pipelines, APIs, CI/CD integration, and other processes
  • Fine-tune and optimize queries using Snowflake platform and database techniques (see the Snowpark sketch after this list)
  • Optimize ETL/data pipelines to balance performance, functionality, and other operational requirements
  • Assess and resolve data pipeline issues to ensure performance and timeliness of execution
  • Assist with solution discovery to ensure technical feasibility
  • Assist in setting up and managing CI/CD pipelines and development of automated tests
  • Develop and manage microservices using Python
  • Conduct peer reviews for quality, consistency, and rigor of production-level solutions
  • Design application architecture for efficient concurrent user handling, ensuring optimal performance during high usage periods
  • Promote best practices and standards for code management, automated testing, and deployments
  • Own all areas of the product lifecycle: design, development, test, deployment, operation, and support
  • Create detailed documentation on Confluence to support and maintain the codebase and its functionality
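
As an illustrative sketch of the Snowflake query-tuning responsibility above, the following Snowpark (Python) snippet pushes filtering and column pruning down into Snowflake instead of pulling a whole table into the client. The snowflake-snowpark-python package is assumed installed; the connection parameters, table, and column names are placeholders, not details from the posting.

    # Illustrative Snowpark sketch: let Snowflake do the filtering/projection.
    from snowflake.snowpark import Session
    from snowflake.snowpark.functions import col

    connection_parameters = {        # placeholders; fill from your environment
        "account": "<account>",
        "user": "<user>",
        "password": "<password>",
        "warehouse": "<warehouse>",
        "database": "<database>",
        "schema": "<schema>",
    }
    session = Session.builder.configs(connection_parameters).create()

    # Filtering and selecting before collect() means Snowflake scans and
    # returns only the rows and columns needed, not the whole table.
    recent_orders = (
        session.table("ORDERS")                        # hypothetical table
        .filter(col("ORDER_DATE") >= "2024-01-01")     # predicate pushdown
        .select("ORDER_ID", "CUSTOMER_ID", "AMOUNT")   # column pruning
    )
    print(recent_orders.queries)   # inspect the generated SQL before running
    rows = recent_orders.collect()

Inspecting the generated SQL (and Snowflake's query profile) before collecting is one common way to verify that pushdown actually happened; this is a sketch under the stated assumptions, not a prescribed workflow.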

ABOUT THE TEAM

  • This position is part of the Global Account to Report (A2R) team, which handles financial activities from recording data to producing reports. A2R includes tasks like managing fixed assets, intercompany transactions, closing accounts, reporting, inventory management, cash accounting, tax activities, audit support, and master data management.
  • The role aims to design and support the implementation of the global A2R Core Model and Target Operating Model for Intercompany to improve process efficiency, promote best practices, enhance user experience, and ensure compliance. The Core Model includes processes, technology, responsibilities, business rules, and key performance indicators (KPIs).
  • The A2R Global Process Lead – Intercompany helps project and operational teams implement the core model and manage changes. This includes consolidating A2R activities into regional hubs, increasing automation, and using new technologies like RPA and digital tools. The Lead collaborates closely with A2R Regional Heads and teams.
  • The role is crucial in improving A2R processes and tools, including those managed by other corporate functions such as Finance. To achieve this, the Process Lead builds strong relationships and communicates effectively with those functions.
  • The Global Process Lead – Intercompany reports to the A2R Global Process Owner and works closely with various departments, including Finance (Controlling, Treasury, Tax, Internal Control), Customer Service, and A2R teams in different regions and countries.


REQUIREMENT SUMMARY

Experience: Min 3.0 – Max 8.0 year(s)
Industry: Information Technology/IT
Category: IT Software - Other
Functional Area: Software Engineering
Education: Graduate
Field of Study: Computer science, engineering, or similar quantitative field of study
Proficiency: Proficient
Openings: 1
Location: Budapest, Hungary