Data Engineer

at Devoteam

Jakarta, JKT, Indonesia

Start Date: Immediate
Expiry Date: 11 Aug, 2024
Salary: Not Specified
Posted On: 11 May, 2024
Experience: 2 year(s) or above
Skills: Discrimination, Disabilities
Telecommute: No
Sponsor Visa: No

Description:

  • Devoteam is a leading consulting firm focused on digital strategy, tech platforms, and cybersecurity. By combining creativity, tech, and data insights, we empower our customers to transform their business and unlock the future. With 25 years of experience and 8,000 employees across Europe and the Middle East, Devoteam promotes responsible tech for people and works to create better change. #Creative Tech for Better Change
  • In January 2021, Devoteam launched its new strategic plan, Infinite 2024, with the ambition to become the #1 EMEA partner of the leading cloud-based platform companies (AWS, Google Cloud, Microsoft, Salesforce, ServiceNow), further reinforced by deep expertise in digital strategy, cybersecurity, and data.

The Data Engineer will be responsible for the following activities:

  • Work closely with data architects and other stakeholders to design scalable and robust data architectures that meet the organization’s requirements
  • Develop and maintain data pipelines, which involve the extraction of data from various sources, data transformation to ensure quality and consistency, and loading the processed data into data warehouses or other storage systems
  • Responsible for managing data warehouses and data lakes, ensuring their performance, scalability, and security
  • Integrate data from different sources, such as databases, APIs, and external systems, to create unified and comprehensive datasets.
  • Perform data transformations and implement Extract, Transform, Load (ETL) processes to convert raw data into formats suitable for analysis and reporting
  • Collaborate with data scientists, analysts, and other stakeholders to establish data quality standards and implement data governance practices
  • Optimise data processing and storage systems for performance and scalability
  • Collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand data requirements and deliver solutions

Required skills:

  • Programming Skills: Proficiency in programming languages such as Python, Java, Scala, or SQL is essential for data engineering roles. Data engineers should have experience in writing efficient and optimized code for data processing, transformation, and integration.
  • Database Knowledge: Strong knowledge of relational databases (e.g., SQL) and experience with database management systems (DBMS) is crucial. Familiarity with data modeling, schema design, and query optimization is important for building efficient data storage and retrieval systems.
  • Big Data Technologies: Understanding and experience with big data technologies such as Apache Hadoop, Apache Spark, or Apache Kafka is highly beneficial. Knowledge of distributed computing and parallel processing frameworks is valuable for handling large-scale data processing.
  • ETL and Data Integration: Proficiency in Extract, Transform, Load (ETL) processes and experience with data integration tools like Apache NiFi, Talend, or Informatica is desirable. Knowledge of data transformation techniques and data quality principles is important for ensuring accurate and reliable data (a minimal ETL sketch in Python follows this list).
  • Data Warehousing: Familiarity with data warehousing concepts and experience with popular data warehousing platforms like Amazon Redshift, Google BigQuery, or Snowflake is advantageous. Understanding dimensional modeling and experience in designing and optimizing data warehouses is beneficial (a BigQuery query example also follows this list).
  • Cloud Platforms: Knowledge of cloud computing platforms such as Amazon Web Services (AWS), Microsoft Azure, or Google Cloud Platform (GCP) is increasingly important. Experience in deploying data engineering solutions in the cloud and utilizing cloud-based data services is valuable.
  • Data Pipelines and Workflow Tools: Experience with data pipeline and workflow management tools such as Apache Airflow, Luigi, or Apache Oozie is beneficial. Understanding how to design, schedule, and monitor data workflows is essential for efficient data processing (see the example Airflow DAG after this list).
  • Problem-Solving and Analytical Skills: Data engineers should have strong problem-solving abilities and analytical thinking to identify data-related issues, troubleshoot problems, and optimize data processing workflows.
  • Communication and Collaboration: Effective communication and collaboration skills are crucial for working with cross-functional teams, including data scientists, analysts, and business stakeholders. Data engineers should be able to translate technical concepts into clear and understandable terms.
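
For illustration only, here is a minimal extract-transform-load sketch in Python of the kind of pipeline described above. It assumes a CSV export as the source and a local SQLite database standing in for a warehouse; the file, column, and table names (orders.csv, fact_orders) are hypothetical, and a production pipeline would target a platform such as Redshift, BigQuery, or Snowflake instead.

```python
# Minimal, illustrative extract-transform-load pipeline.
# Hypothetical names: orders.csv (source export), warehouse.db / fact_orders
# (SQLite stand-in for a warehouse such as Redshift, BigQuery, or Snowflake).
import sqlite3

import pandas as pd


def extract(path: str) -> pd.DataFrame:
    """Pull raw records from the source system (here, a CSV export)."""
    return pd.read_csv(path)


def transform(raw: pd.DataFrame) -> pd.DataFrame:
    """Apply basic quality and consistency rules before loading."""
    cleaned = raw.dropna(subset=["order_id", "amount"]).copy()
    cleaned["order_date"] = pd.to_datetime(cleaned["order_date"])
    cleaned["amount"] = cleaned["amount"].astype(float)
    return cleaned


def load(df: pd.DataFrame, db_path: str, table: str) -> None:
    """Append the processed rows to the target table."""
    with sqlite3.connect(db_path) as conn:
        df.to_sql(table, conn, if_exists="append", index=False)


if __name__ == "__main__":
    load(transform(extract("orders.csv")), "warehouse.db", "fact_orders")
```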
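For the workflow-orchestration skills listed above, the following is a small illustrative Apache Airflow 2.x DAG that schedules the same extract, transform, and load steps daily and retries failed tasks. The DAG id, schedule, and task bodies are placeholders, not part of the role description.

```python
# Hypothetical daily ETL workflow scheduled and monitored by Apache Airflow 2.x.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull raw data from the source system")


def transform():
    print("clean and reshape the extracted data")


def load():
    print("write the processed data to the warehouse")


with DAG(
    dag_id="daily_sales_etl",          # hypothetical pipeline name
    start_date=datetime(2024, 6, 1),
    schedule_interval="@daily",        # run once per day
    default_args={"retries": 1, "retry_delay": timedelta(minutes=5)},
    catchup=False,
):
    # Chain the tasks so each step runs only after the previous one succeeds.
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```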
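Given the emphasis on GCP and data warehousing, here is a brief, hypothetical example of running an aggregation query against Google BigQuery with the official google-cloud-bigquery Python client. It assumes Application Default Credentials are configured; the project id, dataset, and table names are invented for illustration.

```python
# Hypothetical aggregation query against Google BigQuery using the official
# google-cloud-bigquery client; assumes Application Default Credentials.
from google.cloud import bigquery


def daily_revenue(project_id: str) -> None:
    client = bigquery.Client(project=project_id)

    # Dataset and table names are invented for illustration only.
    sql = """
        SELECT DATE(order_date) AS order_day, SUM(amount) AS revenue
        FROM `analytics.fact_orders`
        GROUP BY order_day
        ORDER BY order_day
    """
    for row in client.query(sql).result():
        print(row.order_day, row.revenue)


if __name__ == "__main__":
    daily_revenue("my-gcp-project")  # hypothetical project id
```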

EDUCATION AND EXPERIENCE

  • Bachelor’s degree in Engineering required.
  • A minimum of two years of related experience is highly preferred.
  • Two certifications in GCP (within 3 months after joining).

Status: Full-Time
Duration: -
Beginning date: June 2024

The Devoteam Group is committed to equal opportunities, promoting its employees on the basis of merit and actively fighting against all forms of discrimination. We believe that diversity contributes to the creativity, dynamism, and excellence of our organization. All our positions are open to people with disabilities.

REQUIREMENT SUMMARY

Experience: 2.0 to 7.0 year(s)
Industry: Information Technology/IT
Functional area: IT Software - DBA / Datawarehousing
Role: Software Engineering
Education: Graduate (Engineering required)
Proficiency: Proficient
Openings: 1
Location: Jakarta, Indonesia