Data Engineer (Data Platform) at Cartrack
Vietnam - Full Time


Start Date

Immediate

Expiry Date

26 Jan, 26

Salary

Not specified

Posted On

28 Oct, 25

Experience

2 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Data Engineering, ETL Workflows, Python, Airflow, Database Optimization, Docker, Linux, Version Control, C, C++, C#, Go, Rust, Spark, Trino, Data Quality

Industry

Software Development

Description
About Us
We're a world-leading smart mobility SaaS tech company with almost 2,300,000 active users. Our teams are collaborative, vibrant, and fast-growing, and all team members are empowered with the freedom to influence our products and technology. Are you curious, innovative, and passionate? Do you take ownership, embrace challenges, and love problem-solving? We are looking for a Senior Data Engineer who will help us build robust pipelines and infrastructure to process and analyze audio and video data, revolutionizing the way our customers use connected technology.

Your Role
- Design, build, and maintain data pipelines and ETL workflows using tools like Airflow (a rough sketch follows this description).
- Develop monitoring systems to detect upstream data changes and maintain data freshness.
- Optimize relational and analytical databases for performance, including partitioning, indexing, and table design.
- Implement data quality, data lineage, and source documentation standards.
- Manage and tune analytics databases such as ClickHouse, Druid, StarRocks, or Doris.
- Work with relational databases such as PostgreSQL, MySQL, SQL Server, or Oracle.
- Leverage distributed query tools (e.g., Spark, Trino) to support scalable data analysis.
- Build, deploy, and monitor services in Linux environments, including shell scripting and debugging.
- Create and maintain Docker images, ensuring consistent environments for development and deployment.
- Integrate data workflows with CI/CD pipelines using tools such as GitLab, GitHub, or Bitbucket.

Your Qualifications

Core Technical Skills
- Strong proficiency in one or more of the following: C, C++, C#, Go, or Rust.
- Proficiency in Python for scripting and automation.
- Experience with Airflow and complex ETL processes involving large datasets.
- Solid understanding of data architecture, database optimization, and monitoring best practices.
- Hands-on experience with Docker, Linux environments, and version control systems.

Preferred / Nice-to-Have Skills
- Experience or interest in Kubernetes and Helm charts.
- Familiarity with AI/LLM data pipelines, including generating tables and views optimized for low-latency queries.
- Experience with LLM agent development, RAG workflows, or related frameworks such as LlamaIndex, LangGraph, or FastAPI.
- Knowledge of stream processing systems to accelerate data ingestion into analytics databases.
- Exposure to feature engineering and table design in collaboration with data scientists.

Qualifications & Experience
- Bachelor's Degree or Advanced Diploma in Computer Science, Information Technology, or Engineering, with 3-5 years of experience in a software or technology environment focused on data systems architecture, deployment, and monitoring.
- Candidates without a degree must have 6-10 years of equivalent experience in data engineering, infrastructure management, or backend systems development.
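To give a feel for the Airflow-based ETL work described above, here is a minimal DAG sketch. It assumes Airflow 2.x with the TaskFlow API; the DAG name, schedule, and task bodies are hypothetical placeholders, not Cartrack's actual pipelines.

    # Minimal extract-and-load DAG sketch (Airflow 2.x TaskFlow API assumed).
    from datetime import datetime

    from airflow.decorators import dag, task


    @dag(schedule="@hourly", start_date=datetime(2025, 1, 1), catchup=False)
    def raw_events_pipeline():
        @task
        def extract() -> list[dict]:
            # Pull the latest batch from a hypothetical upstream source.
            return [{"event_id": 1, "payload": "..."}]

        @task
        def load(rows: list[dict]) -> None:
            # Write to the analytics database; a real pipeline would use a
            # provider hook (e.g. PostgresHook) against a configured connection.
            print(f"loaded {len(rows)} rows")

        load(extract())


    raw_events_pipeline()

In practice, such a DAG would sit alongside the data-quality checks, lineage metadata, and CI/CD integration listed in the role.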
Responsibilities
The Senior Data Engineer will design, build, and maintain data pipelines and ETL workflows, develop monitoring systems to keep data fresh, optimize databases for performance, and manage analytics databases.
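The data-freshness monitoring mentioned here could look roughly like the following sketch. It assumes a PostgreSQL source (one of the relational databases listed) with a hypothetical raw_events table carrying a timestamptz updated_at column; the connection string and 30-minute threshold are placeholders.

    # Rough data-freshness check sketch; names and thresholds are hypothetical.
    from datetime import datetime, timedelta, timezone

    import psycopg2  # assumes a PostgreSQL source


    def is_stale(dsn: str, table: str, max_lag: timedelta = timedelta(minutes=30)) -> bool:
        """Return True when the newest row in `table` is older than `max_lag`."""
        with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
            # Assumes updated_at is a timestamptz column; table name is a
            # trusted placeholder, not user input.
            cur.execute(f"SELECT max(updated_at) FROM {table}")
            (latest,) = cur.fetchone()
        if latest is None:
            return True  # an empty table counts as stale
        return datetime.now(timezone.utc) - latest > max_lag


    if __name__ == "__main__":
        if is_stale("postgresql://analytics@db-host/warehouse", "raw_events"):
            print("ALERT: raw_events has not been updated recently")

A production version would typically run as a scheduled Airflow task and push alerts to the team's monitoring channel rather than printing.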