Data / Cloud Engineer

at Juvo bvba

Wallonie, Belgium

Start Date: Immediate
Expiry Date: 24 Oct, 2024
Salary: Not Specified
Posted On: 27 Jul, 2024
Experience: N/A
Skills: SQL, Python, Spark, Code, Information Technology, Cloud Storage, Data Quality, Computer Science, Data Engineering, Infrastructure, Data Modeling
Telecommute: No
Sponsor Visa: No

Description:

We are seeking a highly skilled and motivated GCP Data/Cloud Engineer to join our team. The ideal candidate will have a strong understanding of cloud computing concepts, extensive experience with Google Cloud Platform (GCP), and proficiency in Python and SQL. This role involves designing and implementing data pipelines, maintaining data quality and governance, and leveraging various GCP services to deliver robust data solutions. The candidate should also possess experience in Infrastructure as Code (IaC) using Terraform and CI/CD for data pipelines.

FUNDAMENTAL SKILLS:

  • Strong understanding of cloud computing concepts.
  • Extensive experience with GCP.
  • Proficiency in Python and SQL.
  • Strong analytical mindset and problem-solving skills.
  • Knowledge of data modeling and experience with BigQuery.

GCP SPECIFIC SKILLS:

  • BigQuery
  • Cloud Pub/Sub
  • Cloud Storage
  • Cloud Dataflow
  • GCP Databricks

DATA ENGINEERING & DEVOPS SKILLS:

  • Experience in designing and implementing data pipelines on Spark.
  • Proficiency in maintaining data quality and governance.
  • Experience with Infrastructure as Code (IaC) using Terraform.
  • Proficiency in CI/CD for data pipelines.

QUALIFICATIONS:

  • Bachelor’s degree in Computer Science, Information Technology, or a related field.
  • Proven experience as a Data Engineer or Cloud Engineer, specifically with GCP.
  • Demonstrated ability to manage and analyze large datasets.
  • Excellent communication and collaboration skills.
  • Certification in GCP is a plus.
Location: Eindhoven – 2 days onsite / 3 days remote

Responsibilities:

  • Design, develop, and maintain data pipelines on GCP using Spark.
  • Implement and manage data models in BigQuery.
  • Develop and manage streaming data workflows using Cloud Pub/Sub and Cloud Dataflow.
  • Store and manage data efficiently in Cloud Storage.
  • Utilize GCP Databricks for data processing and analytics.
  • Ensure data quality and governance across all data solutions.
  • Apply Infrastructure as Code (IaC) principles using Terraform to manage GCP resources.
  • Implement CI/CD practices for seamless deployment of data pipelines.
  • Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver effective solutions.
  • Monitor and troubleshoot performance issues in data pipelines and GCP services.
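The data-quality responsibility above is often implemented as a validation gate inside the pipeline itself: records are checked before loading, and failures are routed to a quarantine path for inspection. A minimal Python sketch of that pattern follows; all names (`REQUIRED_FIELDS`, `validate_record`, `split_clean_and_quarantine`) and the field schema are illustrative assumptions, not part of this role's actual stack.

```python
# Hypothetical data-quality gate for a pipeline step.
# Field names and helpers are illustrative, not from the posting.
from datetime import datetime

REQUIRED_FIELDS = {"user_id", "event_type", "event_ts"}  # assumed schema

def validate_record(record: dict) -> list[str]:
    """Return a list of data-quality violations for one record (empty = clean)."""
    errors = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        errors.append(f"missing fields: {sorted(missing)}")
    ts = record.get("event_ts")
    if ts is not None:
        try:
            datetime.fromisoformat(ts)  # expect ISO-8601 timestamps
        except (TypeError, ValueError):
            errors.append(f"bad timestamp: {ts!r}")
    return errors

def split_clean_and_quarantine(records):
    """Route valid rows onward and quarantine the rest, as a pipeline step might."""
    clean, quarantine = [], []
    for rec in records:
        errs = validate_record(rec)
        if errs:
            quarantine.append((rec, errs))
        else:
            clean.append(rec)
    return clean, quarantine
```

In a real GCP deployment the same check would typically run inside a Dataflow or Spark transform, with quarantined rows written to a dead-letter table rather than kept in memory.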


REQUIREMENT SUMMARY

Experience: Min N/A – Max 5.0 year(s)

Industry: Information Technology/IT

Category: IT Software - DBA / Datawarehousing

Role: Software Engineering

Education: Graduate

Specialization: Computer Science, Information Technology, or a related field

Proficiency: Proficient

Vacancies: 1

Location: Belgium