GCP Data Engineer

at Cloud MR

Dallas, Scotland, United Kingdom

Start Date: Immediate
Expiry Date: 29 Jul, 2024
Salary: Not Specified
Posted On: 01 May, 2024
Experience: N/A
Skills: Languages, SQL, Looker, Data Governance, Computer Science, Data Modeling, Cloud Storage, Scala, Java, Apache Spark, Python
Telecommute: No
Sponsor Visa: No
Required Visa Status:

  • Citizen
  • GC (Green Card)
  • US Citizen
  • Student Visa
  • H1B
  • CPT
  • OPT
  • H4 (Spouse of H1B)
Employment Type:

  • Full Time
  • Part Time
  • Permanent
  • Independent - 1099
  • Contract – W2
  • C2H Independent
  • C2H W2
  • Contract – Corp 2 Corp
  • Contract to Hire – Corp 2 Corp

Description:

As a GCP Data Engineer, you will play a crucial role in designing, building, and optimizing data pipelines and systems on the Google Cloud Platform. You will collaborate with cross-functional teams to ensure efficient data extraction, transformation, and loading processes, enabling data-driven insights and decision-making.

Responsibilities:

  • Design and develop scalable data pipelines on GCP, utilizing services such as BigQuery, Dataflow, Pub/Sub, and Cloud Storage (a pipeline sketch follows this list).
  • Implement data integration and ETL (Extract, Transform, Load) processes to collect and process data from diverse sources.
  • Collaborate with data scientists and analysts to understand their requirements and ensure data availability for analysis and reporting.
  • Optimize data processing workflows and storage strategies to maximize performance and cost-efficiency.
  • Troubleshoot and resolve data quality issues, ensuring data integrity throughout the pipeline.
  • Stay up to date with the latest advancements in GCP data engineering technologies and best practices.
  • Collaborate with stakeholders to understand business needs and translate them into scalable data solutions.
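
To give a concrete picture of the pipeline work described above, here is a minimal sketch of a streaming Apache Beam job (runnable on Dataflow) that reads events from Pub/Sub and appends them to a BigQuery table. The project, topic, table, and schema names are hypothetical placeholders, not details from this posting:

    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    def run():
        # Streaming mode is required for an unbounded Pub/Sub source.
        options = PipelineOptions(streaming=True)

        with beam.Pipeline(options=options) as pipeline:
            (
                pipeline
                # Hypothetical topic; Pub/Sub delivers raw message bytes.
                | "ReadEvents" >> beam.io.ReadFromPubSub(
                    topic="projects/my-project/topics/events")
                # Decode each message into a dict matching the table schema.
                | "ParseJson" >> beam.Map(json.loads)
                # Hypothetical table; rows are appended as they arrive.
                | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                    table="my-project:analytics.events",
                    schema="user_id:STRING,event:STRING,ts:TIMESTAMP",
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
            )

    if __name__ == "__main__":
        run()

The same code runs on Dataflow unchanged by passing --runner=DataflowRunner (plus project and region options) when launching the pipeline.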

Requirements:

  • Bachelor’s degree in Computer Science, Engineering, or a related field.
  • Strong programming skills in languages such as Python, Java, or Scala.
  • Extensive experience working with GCP services, including BigQuery, Dataflow, Pub/Sub, Cloud Storage, and related technologies.
  • Proficiency in SQL and data querying languages (a brief query example follows this list).
  • Familiarity with data warehousing concepts and technologies, such as BigQuery ML or Looker.
  • Strong understanding of data modeling, data governance, and data security principles.
  • Experience with large-scale data processing frameworks such as Apache Beam or Apache Spark.
  • Excellent problem-solving skills and ability to work with complex datasets.
  • Strong communication and collaboration skills to work effectively with cross-functional teams.
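
As a brief illustration of the SQL and BigQuery proficiency called for above, the snippet below runs a query through the google-cloud-bigquery Python client. The dataset, table, and column names are invented for the example:

    from google.cloud import bigquery

    # Uses Application Default Credentials and the active GCP project.
    client = bigquery.Client()

    # Hypothetical table: count events per user over the last 7 days.
    query = """
        SELECT user_id, COUNT(*) AS event_count
        FROM `my-project.analytics.events`
        WHERE ts >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
        GROUP BY user_id
        ORDER BY event_count DESC
        LIMIT 10
    """

    # query() starts a job; result() blocks until rows are ready.
    for row in client.query(query).result():
        print(row.user_id, row.event_count)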

If you are passionate about GCP and data engineering and thrive in a collaborative, fast-paced environment, apply now! Join us in shaping the future of data-driven solutions on the Google Cloud Platform.

REQUIREMENT SUMMARY

Experience: N/A (min) to 5.0 years (max)
Industry: Information Technology/IT
Category: IT Software - DBA / Data Warehousing
Role: Software Engineering
Education: Graduate in Computer Science, Engineering, or a related field
Proficiency: Proficient
Openings: 1
Location: Dallas, United Kingdom