Data Engineer - A26098 at Activate Interactive Pte Ltd
Singapore, Singapore
Full Time


Start Date

Immediate

Expiry Date

07 Jun, 26

Salary

0.0

Posted On

09 Mar, 26

Experience

2 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

SQL, Python, SSIS, Snowflake, AWS S3, RDS, ETL, ELT, Data Pipelines, CI/CD, DevOps, Data Modelling, Data Lake, Data Warehouse, REST API, Linux

Industry

IT Services and IT Consulting

Description
Activate Interactive Pte Ltd (“Activate”) is a leading technology consultancy headquartered in Singapore with a presence in Malaysia and Indonesia. We empower our clients with quality, cost-effective, and impactful end-to-end application development, such as mobile and web applications, and cloud technology that removes technology roadblocks and increases their business efficiency. We believe in positively impacting the lives of people around us and the environment we live in through the use of technology. We are therefore committed to providing a conducive environment in which all employees can realize their full potential and continuously drive innovation.

We are searching for our next team members to join our growing team. If you love the idea of being part of a growing company with exciting prospects in mobile and web technologies that create a positive impact on people’s lives, we would love to hear from you.

What will you do?

Design, develop, and maintain data pipelines that extract data from various sources and formats, transform it according to business requirements, and load it into target systems.
Perform data extraction, cleaning, transformation, and loading.
Design, build, launch, and maintain efficient and reliable large-scale batch and real-time data pipelines with data processing frameworks.
Integrate and consolidate data silos in a manner that is both scalable and compliant.
Collaborate with the Project Manager, Data Architect, Business Analysts, Frontend Developers, Designers, and Data Analysts to build scalable data-driven products.
Work in an Agile environment that practises Continuous Integration and Delivery.
Work closely with fellow developers through pair programming and code reviews.

What are we looking for?

Bachelor's degree in Computer Science, Software Engineering, or a related field.
At least 3–5 years’ experience in ETL/data integration projects.
Proficient in general data cleaning and transformation using scripting languages (mandatory: SQL, Python; knowledge of R is an added advantage) to ensure data accuracy and consistency.
Proficient in building ETL pipelines (mandatory: SQL Server Integration Services (SSIS), Python, Snowflake; added advantages: AWS Lambda, ECS container tasks, EventBridge, AWS Glue, Spring, etc.).
Proven hands-on experience with Microsoft SSIS and Snowflake.
Proficient in database design and a range of databases (mandatory: SQL, AWS S3, RDS; added advantages: PostgreSQL, Athena, MongoDB, PostGIS, MySQL, SQLite, VoltDB, Apache Cassandra, etc.).
Experience in, and a passion for, data engineering in big data environments on cloud platforms such as GCC and GCC+ (i.e. AWS, Azure).
Experience building production-grade data pipelines and ETL/ELT data integration.
Experience with CI/CD pipelines and DevOps tools (e.g. GitLab).
Experience with automated provisioning tools (Ansible, Terraform, Puppet, Vagrant) will be an advantage.
Familiar with data modelling, data access, and data storage infrastructure such as Data Mart, Data Lake, Data Virtualisation, and Data Warehouse for efficient storage and retrieval.
Familiar with REST APIs and web requests/protocols in general.
Familiar with data governance policies, access control, and security best practices.
Knowledge of system design, data structures, and algorithms.
Knowledge of AI/ML concepts such as RAG (Retrieval-Augmented Generation) and MCP (Model Context Protocol).
Comfortable in both Windows and Linux development environments.
Interest in being the bridge between engineering and analytics.

What do we offer in return?

Fun working environment
Employee Wellness Program
Opportunity to work on Singapore Government agency projects
Structured development framework and growth opportunities
(We are a “SHRI 2025 Gold winner” in “Learning & Development; Coaching & Mentoring”.)

Why you'll love working with us

If you are looking for opportunities to collaborate with leading industry experts and be surrounded by highly motivated and talented peers, we welcome you to join us. We provide all employees with equal opportunities to grow and develop with us. We believe your success is our success.

Does this sound like something you are interested in exploring further? Please get in touch with our team for an initial chat at careers@activate.sg.

Activate Interactive Singapore is an equal opportunity employer. Employment decisions are based on merit, qualifications, and abilities. Activate Interactive Pte Ltd does not discriminate in employment opportunities or practices on the basis of race, color, religion, gender, sexuality, national origin, age, disability, marital status, or any other characteristic protected by law.

Protecting your privacy and the security of your data are longstanding top priorities for Activate Interactive Pte Ltd. Your personal data will be processed for the purposes of managing Activate Interactive Pte Ltd’s recruitment-related activities, which include setting up and conducting interviews and tests for applicants, evaluating and assessing the results, and as is otherwise needed in the recruitment and hiring processes. Please consult our Privacy Notice (https://www.activate.sg/privacy-policy) to learn more about how we collect, use, and transfer the personal data of our candidates, including how you can request access to, correction of, and/or withdrawal of your personal data.
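The SQL-based cleaning and transformation skills the posting asks for can be illustrated with a minimal, hypothetical sketch. Here sqlite3 stands in for the warehouse platforms named above (Snowflake, RDS), and the `staging_events` table, its columns, and the sample rows are all invented for illustration:

```python
# Hypothetical cleaning step: deduplicate a staging table (latest load wins)
# and normalise an email column, using a window function in plain SQL.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE staging_events (event_id INTEGER, user_email TEXT, loaded_at TEXT);
INSERT INTO staging_events VALUES
  (1, 'A@Example.com', '2026-01-01'),
  (1, 'a@example.com', '2026-01-02'),  -- duplicate event_id; later load wins
  (2, ' b@example.com', '2026-01-01');

-- Keep only the most recently loaded row per event_id, then normalise emails.
CREATE TABLE events AS
SELECT event_id, LOWER(TRIM(user_email)) AS user_email
FROM (
  SELECT *,
         ROW_NUMBER() OVER (PARTITION BY event_id
                            ORDER BY loaded_at DESC) AS rn
  FROM staging_events
)
WHERE rn = 1;
""")
```

The same pattern (`ROW_NUMBER()` over a partition, filtered to `rn = 1`) carries over to warehouse SQL dialects such as Snowflake's.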
Responsibilities
The role involves designing, developing, and maintaining data pipelines to extract, transform, and load data from various sources into target systems according to business requirements. This includes building scalable batch and real-time data pipelines and integrating data silos while collaborating with cross-functional teams.
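The extract-transform-load cycle described above can be sketched in miniature. This is an illustrative toy only: the posting's actual stack is SSIS/Snowflake, which are swapped here for `csv` and `sqlite3` so the example is self-contained, and the `customers` table and column names are invented:

```python
# Minimal ETL sketch: extract CSV rows, clean/transform them, load into a target table.
import csv
import io
import sqlite3

def extract(csv_text):
    """Extract: parse raw CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform: drop rows missing an id, coerce types, normalise names."""
    cleaned = []
    for row in rows:
        if not row.get("id"):
            continue  # business rule: records without an id are rejected
        cleaned.append({"id": int(row["id"]), "name": row["name"].strip().title()})
    return cleaned

def load(rows, conn):
    """Load: upsert cleaned rows into the target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS customers (id INTEGER PRIMARY KEY, name TEXT)")
    conn.executemany(
        "INSERT INTO customers (id, name) VALUES (:id, :name) "
        "ON CONFLICT(id) DO UPDATE SET name = excluded.name",
        rows,
    )
    conn.commit()

raw = "id,name\n1, alice \n,ghost\n2,BOB\n"
conn = sqlite3.connect(":memory:")
load(transform(extract(raw)), conn)
```

In production the same three stages would pull from the real sources (files, APIs, databases), apply the business rules, and load into the warehouse, typically orchestrated and monitored rather than run inline like this.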