Data Engineer - GCP at HCA Healthcare
Nashville, TN 37203, USA
Full Time


Start Date

Immediate

Expiry Date

06 Sep, 25

Salary

0.0

Posted On

07 Jun, 25

Experience

1 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Thinking Skills, IT Governance, JSON, Oracle, Avro, Operations, Cloud Storage, GCS, ServiceNow, Communication Skills, SQL Server, Data Engineering, Airflow, Jira

Industry

Information Technology/IT

Description

INTRODUCTION

Experience the HCA Healthcare difference where colleagues are trusted, valued members of our healthcare team. Grow your career with an organization committed to delivering respectful, compassionate care, and where the unique and intrinsic worth of each individual is recognized. Submit your application for the opportunity below: Data Engineer, HCA Healthcare.

NOTE: ELIGIBILITY FOR BENEFITS MAY VARY BY LOCATION.

We are seeking a Data Engineer for our team to ensure that we continue to provide all patients with high quality, efficient care. Did you get into our industry for these reasons? We are an amazing team that works hard to support each other, and we are seeking a phenomenal addition like you who finds patient care as meaningful as we do. We want you to apply!

JOB SUMMARY

The Data Engineer is responsible for analysis, design, development, and support of Google Cloud Platform (GCP) data pipelines. The candidate will spend time writing, testing, and reviewing GCP data ingestion, encryption, and transformation pipelines. Other tasks include researching, architecting, requirements gathering, designing, documenting, and modifying data pipelines throughout the production life cycle.
In addition, this position requires a candidate who can analyze business requirements and design, construct, test, and implement solutions with minimal supervision. The candidate will have a track record of participation in successful projects in a fast-paced, mixed-team environment.

WHAT QUALIFICATIONS YOU WILL NEED:

  • Bachelor’s degree preferred
  • 1+ year(s) of experience in Data Warehouse ETL/ELT Data Engineering required
  • 1+ year(s) of experience in Google Cloud Platform (GCP) Data Engineering required
  • 1+ year(s) of experience in Healthcare IT preferred
  • Or equivalent combination of education and/or experience
  • Experience with GCP services such as Google Cloud Storage (GCS), BigQuery, Cloud Composer (Airflow), Cloud Functions, Dataflow, and Dataproc.
  • Experience writing optimized BigQuery SQL transformation queries and scripts.
  • Experience with raw data formats such as JSON, Avro, and Parquet.
  • Experience with Oracle, SQL Server, and other database platforms.
  • Experience writing and maintaining Unix/Linux and Python scripts.
  • Experience with GitHub source control and CI/CD workflows.
  • Knowledge of issue tracking tools such as Jira and ServiceNow.
  • Ability to troubleshoot, maintain, reverse engineer, and optimize existing data pipelines.
  • Ability to analyze and interpret complex data and offer solutions to complex problems.
  • Ability to work independently on assigned tasks.
  • Strong written and verbal communication skills, including the ability to explain complex technical issues in a way that non-technical people may understand.
  • Excellent problem-solving and critical thinking skills.
  • Knowledge of IT governance and operations.
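The JSON and SQL transformation work described in the qualifications above might look like the following in practice. This is a minimal sketch using only the Python standard library; the record fields and flattening rules are invented for illustration and are not from the posting:

```python
import json

def flatten_event(raw: str) -> dict:
    """Flatten a hypothetical raw JSON event into a row-like dict
    suitable for loading into a warehouse table."""
    rec = json.loads(raw)
    return {
        "patient_id": rec["patient"]["id"],
        # Default applied when the optional field is absent upstream.
        "facility": rec["patient"].get("facility", "UNKNOWN"),
        # Normalize the event type to a canonical uppercase code.
        "event_type": rec["event"]["type"].upper(),
        "event_ts": rec["event"]["timestamp"],
    }

raw_line = (
    '{"patient": {"id": "P001"}, '
    '"event": {"type": "admit", "timestamp": "2025-06-07T12:00:00Z"}}'
)
row = flatten_event(raw_line)
```

In a real pipeline, the same per-record logic would typically run inside a Dataflow transform or a BigQuery SQL statement rather than a standalone script.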

Work Location/Schedule:

  • Nashville, TN area (near Centennial Park)
  • Hybrid - 2-3 days onsite

“There is so much good to do in the world and so many different ways to do it.” - Dr. Thomas Frist, Sr.
HCA Healthcare Co-Founder
If you find this opportunity compelling, we encourage you to apply for our Data Engineer opening. We promptly review all applications. Highly qualified candidates will be contacted directly by a member of our team. We are actively interviewing, so apply today!
We are an equal opportunity employer. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.

Responsibilities
  • Develop, maintain, and optimize streaming, near real-time, and batch GCP data pipelines for enterprise-wide analysis of structured, semi-structured, and unstructured data.
  • Collaborate closely with the Senior Engineers, Lead Architect, and Product Owner to define, design, and build new features and improve existing data products.
  • Contribute to and participate in peer code reviews.
  • Participate in on-call support rotation.
  • Adhere to established development guidelines.
  • Translate business requirements into technical design specifications.
  • Work with Project Managers to estimate, establish, and meet target dates.
  • Work independently, and complete tasks on-schedule by exercising strong judgment and critical thinking skills.
  • Create and maintain technical documentation, including source-to-target mappings, job scheduling and dependency details, and business-driven transformation rules.
  • Participate in the deployment, change, configuration, management, administration, and maintenance of deployment processes and systems.
  • Participate in technical group discussions and adopt innovative technologies to improve development and operations.
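The source-to-target mapping documentation mentioned in the responsibilities above can be kept machine-readable, so one structure drives both documentation and simple checks. A minimal sketch; the table and column names below are hypothetical, not from the posting:

```python
# Hypothetical source-to-target mapping kept as data:
# (source column, target column, business-driven transformation rule)
MAPPING = [
    ("ora_adm.PAT_ID", "warehouse.patients.patient_id", "cast to STRING"),
    ("ora_adm.ADM_DT", "warehouse.patients.admitted_at", "parse as TIMESTAMP"),
    ("ora_adm.FAC_CD", "warehouse.patients.facility", "lookup in facility dim"),
]

def targets_for_source(source_prefix: str) -> list[str]:
    """List the target columns fed by a given source table prefix,
    e.g. to answer 'what downstream columns does this table affect?'"""
    return [tgt for src, tgt, _rule in MAPPING if src.startswith(source_prefix)]

cols = targets_for_source("ora_adm.")
```

Keeping the mapping as data also makes it easy to render into documentation or to cross-check against the deployed pipeline during code review.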