Data Engineer at Qode
Special Capital Region of Jakarta, Java, Indonesia
Full Time


Start Date

Immediate

Expiry Date

07 Jun, 26

Salary

0.0

Posted On

09 Mar, 26

Experience

2 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

SQL, Python, BigQuery, ETL/ELT, Airflow, Argo, dbt, Git, CI/CD, Terraform

Industry

Software Development

Description
We are looking for a Data Engineer at Flip. You will be responsible for building and maintaining scalable data pipelines and core datasets to support data discovery and insights generation. In this role, you will partner closely with data stakeholders to enhance reliability, performance, and extensibility of the data platform, ensuring it meets both current requirements and anticipated future use cases. About FlipRafi, Luqman, and Anjar, who were college friends in Universitas Indonesia, started Flip as a project in 2015 to transfer payments to each other at a fraction of what banks would charge them. They are pioneers in the Indonesian market, with their technology now helping millions of Indonesians, both individuals and businesses, carry out bank-to-bank money transfers through a reliable and seamless app. After five years of operations, Flip has helped Indonesians transfer money worth several trillions of rupiah and has received double-digit funding from respectable investors such as Sequoia India, Insight Partner, and Insignia. Flip’s ultimate mission is to give Indonesians access to one of the most progressive and fairest financial services in the world. 
What You'll Do
- Build and maintain batch data pipelines for ingestion into BigQuery from multiple sources
- Support the creation of MVP data transformations and analytics datasets
- Schedule and automate reporting pipelines using orchestration tools
- Collaborate with data analysts to enable reporting and dashboards
- Ensure data quality, documentation, and reproducibility of workflows

What We're Looking For
- 2+ years of experience in data engineering or analytics engineering
- Solid SQL and Python skills for ETL/ELT workflows, with the ability to implement them in production
- Experience with and knowledge of good practices for cloud data warehouses (BigQuery, Athena, etc.)
- Familiarity with orchestration tools (Airflow, Argo, dbt) to deliver SLA-driven pipeline design
- Comfortable working in Git-based workflows and supporting CI/CD processes
- Familiarity with Terraform is a plus
- Experience building a data warehouse is a plus
- Experience working with a variety of stakeholders outside the Data team is a plus
- Experience with data quality/governance best practices is a plus
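To illustrate the core SQL-and-Python ETL/ELT skill the posting asks for, here is a minimal batch-pipeline sketch. It is purely illustrative and not part of the job description: it uses Python's standard library with an in-memory SQLite database standing in for a warehouse like BigQuery, and all names (table, columns, sample data) are hypothetical.

```python
import csv
import io
import sqlite3

# Sketch of a tiny extract-transform-load (ETL) batch job:
# extract raw CSV records, normalize them in Python, and load
# them into a SQL table for downstream analytics queries.
RAW = """user_id,amount_idr,status
1,150000,SUCCESS
2,75000,failed
3,20000,SUCCESS
"""

def extract(text):
    # Extract: parse the raw CSV into a list of dicts.
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # Transform: cast numeric fields and normalize status casing.
    return [(int(r["user_id"]), int(r["amount_idr"]), r["status"].upper())
            for r in rows]

def load(conn, records):
    # Load: write the cleaned records into the target table.
    conn.execute("CREATE TABLE IF NOT EXISTS transfers "
                 "(user_id INTEGER, amount_idr INTEGER, status TEXT)")
    conn.executemany("INSERT INTO transfers VALUES (?, ?, ?)", records)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(conn, transform(extract(RAW)))

# A downstream SQL query over the loaded data.
total = conn.execute(
    "SELECT SUM(amount_idr) FROM transfers WHERE status = 'SUCCESS'"
).fetchone()[0]
print(total)  # 170000
```

In production these three steps would typically run as separate tasks under an orchestrator such as Airflow, with the transform layer managed in dbt, but the extract/transform/load shape stays the same.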
Responsibilities
The role involves building and maintaining scalable batch data pipelines for ingestion into BigQuery from various sources, alongside supporting the creation of MVP data transformations and analytics datasets. Responsibilities also include scheduling and automating reporting pipelines, collaborating with analysts, and ensuring data quality and workflow reproducibility.