Data Engineer

at  Amgen

Washington, DC 20004, USA

Start Date: Immediate
Expiry Date: 08 Jul, 2024
Salary: USD 97,300 Annual
Posted On: 09 Apr, 2024
Experience: 2 year(s) or above
Skills: Python, PostgreSQL, AWS, Software Engineers, SQL, Data Architects, Databases, Design, Software Development, Testing, Apache Spark, Jenkins
Telecommute: No
Sponsor Visa: No
Required Visa Status:
US Citizen, Green Card (GC), H1B, H4 (Spouse of H1B), OPT, CPT, Student Visa
Employment Type:
Full Time, Part Time, Permanent, Independent - 1099, Contract - W2, C2H Independent, C2H W2, Contract - Corp 2 Corp, Contract to Hire - Corp 2 Corp

Description:

HOW MIGHT YOU DEFY IMAGINATION?

You’ve worked hard to become the professional you are today and are now ready to take the next step in your career. How will you put your skills, experience and passion to work toward your goals? At Amgen, our shared mission—to serve patients—drives all that we do. It is key to our becoming one of the world’s leading biotechnology companies, reaching over 10 million patients worldwide. Come do your best work alongside other innovative, driven professionals in this meaningful role.

BASIC QUALIFICATIONS:

Master’s degree
OR
Bachelor’s degree and 2 years of relevant experience
OR
Associate’s degree and 6 years of relevant experience
OR
High school diploma / GED and 8 years of relevant experience

MINIMUM QUALIFICATIONS:

  • Minimum 4 years of experience with design and development of data pipelines.
  • Experience with software development (Java or Python preferred) and end-to-end system design.
  • Experience with data modeling for both OLAP and OLTP databases; hands-on SQL experience, preferably with Oracle, PostgreSQL, and Hive SQL.
  • Experience with an ETL tool such as Databricks (a minimal pipeline sketch follows this list).
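
As a rough illustration of the pipeline and ETL work these qualifications describe, here is a minimal batch-transformation sketch in PySpark. It is not taken from Amgen's codebase; the bucket, table, and connection details are hypothetical placeholders.

```python
# Minimal batch ETL sketch (PySpark). All paths, table names, and connection
# settings are hypothetical placeholders, not details from this posting.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("complaints_daily_load").getOrCreate()

# Extract: read raw complaint events landed as Parquet (hypothetical S3 path).
raw = spark.read.parquet("s3://example-bucket/raw/complaints/")

# Transform: derive an event date and aggregate complaint counts per product/day.
daily_counts = (
    raw.withColumn("event_date", F.to_date("event_ts"))
       .groupBy("product_id", "event_date")
       .agg(F.count("*").alias("complaint_count"))
)

# Load: append the aggregate to a PostgreSQL reporting table over JDBC
# (requires the PostgreSQL JDBC driver on the Spark classpath).
(daily_counts.write
    .format("jdbc")
    .option("url", "jdbc:postgresql://example-host:5432/analytics")
    .option("dbtable", "reporting.daily_complaint_counts")
    .option("user", "etl_user")
    .option("password", "<secret>")
    .mode("append")
    .save())

spark.stop()
```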

PREFERRED QUALIFICATIONS:

  • Ability to learn quickly; organized and detail-oriented.
  • Hands-on development experience with Databricks.
  • Experience with software DevOps CI/CD tools such as Git and Jenkins.
  • Experience with AWS and familiarity with EC2, S3, Redshift/Spectrum, Glue, Athena, RDS, Lambda, and API Gateway.
  • Experience with Apache Airflow and Apache Spark (see the orchestration sketch after this list).
  • Experience working in a team of data scientists, business experts, and software engineers to deliver insights and solutions.
  • Experience working in biopharma/life sciences industry.
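
For the Airflow and AWS items above, the sketch below shows one common way such a pipeline could be orchestrated: an Airflow 2.x DAG that submits the Spark job and then triggers a Glue crawler. The DAG id, schedule, script path, and crawler name are hypothetical, not requirements from this posting.

```python
# Hypothetical orchestration sketch (Airflow 2.x). Task names, schedule, and
# commands are illustrative only.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "owner": "data-engineering",
    "retries": 2,
    "retry_delay": timedelta(minutes=10),
}

with DAG(
    dag_id="complaints_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 6 * * *",  # daily at 06:00 UTC
    catchup=False,
    default_args=default_args,
) as dag:
    # Run the PySpark transform (e.g. the script sketched earlier).
    transform = BashOperator(
        task_id="spark_transform",
        bash_command="spark-submit /opt/jobs/complaints_daily_load.py",
    )

    # Refresh the Glue catalog so Athena/Redshift Spectrum see new partitions.
    refresh_catalog = BashOperator(
        task_id="refresh_glue_crawler",
        bash_command="aws glue start-crawler --name complaints_crawler",
    )

    transform >> refresh_catalog
```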

Responsibilities:

WHAT YOU WILL DO

Let’s do this. Let’s change the world. In this vital role you will join our team to develop a data and analytics capability focused on integrated product surveillance. You will design and develop robust data models, data pipelines, and data products as part of a product team of data scientists, business analysts, and software engineers. The team will rely on your expertise in automating the transformation and manipulation of data to generate insights on product complaints and adverse events for Amgen.

RESPONSIBILITIES:

With general direction, apply specialized knowledge and understanding of principles, concepts and standards to moderately complex assignments as follows:

  • Be a key team member assisting in the design and development of data pipelines.
  • Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines that meet fast-paced business needs.
  • Serve as system administrator for the AWS and Databricks platforms.
  • Adhere to best practices for coding, testing, and designing reusable code/components (see the testing sketch after this list).
  • Explore new tools and technologies that can improve ETL platform performance.
  • Participate in sprint planning meetings and provide estimations on technical implementation; collaborate and communicate effectively with the product teams.
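
One way to read the expectation around reusable, testable components is to keep transformations as small pure functions that can be unit-tested with a local SparkSession, without a live cluster. The function and test below are an illustrative sketch with hypothetical names, not Amgen code.

```python
# Illustrative sketch of a reusable, unit-testable transformation (PySpark + pytest).
from pyspark.sql import DataFrame, SparkSession, functions as F


def add_event_date(df: DataFrame, ts_col: str = "event_ts") -> DataFrame:
    """Derive an event_date column from a timestamp column."""
    return df.withColumn("event_date", F.to_date(F.col(ts_col)))


def test_add_event_date():
    # A local SparkSession is enough to exercise the transformation in CI.
    spark = SparkSession.builder.master("local[1]").appName("unit-test").getOrCreate()
    df = spark.createDataFrame([("2024-04-09 12:30:00",)], ["event_ts"])
    result = add_event_date(df.withColumn("event_ts", F.to_timestamp("event_ts")))
    assert result.select("event_date").first()[0].isoformat() == "2024-04-09"
    spark.stop()
```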


REQUIREMENT SUMMARY

Min: 2.0 / Max: 8.0 year(s)

Information Technology/IT

IT Software - System Programming

Software Engineering

Diploma

Proficient

1

Washington, DC 20004, USA