Data Engineer (Google Cloud Platform) - Remote model (based in Portugal)
at BoostIT
Remote, Portugal
| Start Date | Expiry Date | Salary | Posted On | Experience | Skills | Telecommute | Sponsor Visa |
|---|---|---|---|---|---|---|---|
| Immediate | 30 Nov, 2024 | Not Specified | 01 Sep, 2024 | 7 year(s) or above | Kubernetes, Docker, DBT, Cloud Storage, Collaborative Environment, Apache Kafka, Data Studio, Azure, Scala, AWS, Git, Programming Languages, Computer Science, Google Cloud Platform, Java, Database Design, Query Optimization, Python, Data Modeling | No | No |
Description:
Boost IT is a Portuguese technology consultancy company, integrated into one of the most entrepreneurial groups in Portugal, which has invested in more than 30 companies.
We want to be known as the most dynamic, energetic, and reliable company operating in the market, and for that we want to count on you.
If you’re passionate about technology and want to work on the most relevant technology projects, then this ad could be for you!
Boost IT. Doing IT. Better.
Tasks
As a Data Engineer specializing in Google Cloud Platform (GCP), you will play a key role in designing, building, and maintaining our cloud-based data solutions. You will collaborate closely with our cross-functional teams to develop data pipelines, implement data ingestion processes, and ensure the reliability and scalability of our data infrastructure on GCP. The ideal candidate will have a strong background in data engineering, experience with GCP services, and a passion for leveraging cloud technologies to solve complex data challenges.
PREFERRED SKILLS:
- Experience with other cloud platforms such as AWS or Azure.
- Knowledge of containerization technologies such as Docker and Kubernetes.
- Familiarity with streaming data processing frameworks such as Apache Kafka or Apache Flink.
- Understanding of machine learning concepts and frameworks.
Requirements
- Bachelor’s degree in Computer Science, Engineering, or related field.
- At least 7 years of experience working as a Data Engineer.
- Proven experience with a focus on Google Cloud Platform (GCP).
- In-depth knowledge of GCP services and products, including BigQuery, DBT (Data Build Tool), Cloud Storage, Dataflow, Pub/Sub, and Data Studio.
- Proficiency in programming languages such as Python, Java, or Scala for developing data pipelines and ETL processes.
- Strong understanding of data modeling, database design, and SQL query optimization (see the sketch after this list).
- Experience with version control systems such as Git and CI/CD pipelines.
- Excellent problem-solving skills and attention to detail.
- Ability to work effectively in a fast-paced, collaborative environment with cross-functional teams.
- Google Cloud certification (e.g., Professional Data Engineer) is a plus.
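For illustration only, a minimal sketch of the data modeling and query optimization point above, using the google-cloud-bigquery Python client to create a date-partitioned, clustered table; the project, dataset, table, and schema names are hypothetical and not part of the role description.

```python
from google.cloud import bigquery

# Hypothetical project, dataset, and table names for illustration only.
client = bigquery.Client(project="my-gcp-project")

schema = [
    bigquery.SchemaField("user_id", "STRING"),
    bigquery.SchemaField("action", "STRING"),
    bigquery.SchemaField("event_date", "DATE"),
]

# Partitioning by event_date and clustering by user_id lets queries that
# filter on those columns scan less data -- a common BigQuery optimization.
table = bigquery.Table("my-gcp-project.analytics.events", schema=schema)
table.time_partitioning = bigquery.TimePartitioning(field="event_date")
table.clustering_fields = ["user_id"]
client.create_table(table, exists_ok=True)
```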
Responsibilities:
- Design, develop, and maintain data pipelines and ETL processes on Google Cloud Platform (GCP) using services such as Cloud Dataflow, Apache Beam, or Google Cloud Composer (an illustrative sketch follows this list).
- Implement scalable and reliable data ingestion mechanisms to collect, process, and store large volumes of structured and unstructured data.
- Optimize data storage and retrieval processes using GCP storage solutions such as BigQuery, Cloud Storage, and Cloud Bigtable.
- Collaborate with data scientists and analysts to support their data needs and ensure data accessibility, accuracy, and integrity.
- Monitor and troubleshoot data pipelines, identifying and resolving performance bottlenecks, data quality issues, and system failures.
- Implement data security and compliance measures to protect sensitive information and ensure regulatory compliance.
- Stay up-to-date with the latest developments in GCP services and cloud computing technologies, evaluating new tools and techniques to improve our data infrastructure.
- Provide technical guidance and mentorship to junior data engineers, fostering a culture of continuous learning and growth.
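As a rough sketch of the kind of streaming pipeline described above, assuming the Apache Beam Python SDK with placeholder Pub/Sub topic, BigQuery table, and schema names (the actual stack and schema would depend on the project):

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(message: bytes) -> dict:
    """Decode a Pub/Sub message payload into a BigQuery-ready row."""
    event = json.loads(message.decode("utf-8"))
    return {
        "user_id": event.get("user_id"),
        "action": event.get("action"),
        "ts": event.get("ts"),
    }


def run() -> None:
    # streaming=True because Pub/Sub is an unbounded source; on GCP this
    # would typically run with the DataflowRunner.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/my-gcp-project/topics/events")
            | "ParseJson" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-gcp-project:analytics.events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```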
REQUIREMENT SUMMARY
Experience: 7.0 to 12.0 year(s)
Industry: Information Technology/IT
Area: IT Software - DBA / Datawarehousing
Role: Software Engineering
Education: Graduate in Computer Science, Engineering, or a related field
Proficiency: Proficient
Openings: 1
Location: Remote, Portugal