Data Engineer at ASSURITY GLOBAL PTE LTD
Singapore, Singapore
Full Time


Start Date

Immediate

Expiry Date

20 Jan 2026

Salary

Not disclosed

Posted On

22 Oct 2025

Experience

2 years or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Data Pipelines, ETL Processes, APIs, Data Modelling, Data Governance, AWS, Python, SQL, Big Data, Data Lakes, Data Warehouses, Data Catalog, Analytical Skills, Problem-Solving, Collaboration, Mentoring

Industry

IT Services and IT Consulting

Description
Key Responsibilities:

- Design, implement, and maintain scalable data pipelines and ETL processes to ingest, catalogue, and normalise data, ensuring high data quality and availability.
- Create and manage APIs for data access, integration, and delivery to various stakeholders.
- Collaborate with cross-functional teams to gather requirements and translate them into technical solutions.
- Structure and document data transformation processes to enhance efficiency and maintainability.
- Lead data modelling efforts to ensure that the data architecture supports business goals and analytical initiatives.
- Configure, curate, and maintain the central data catalogue or metadata repository (e.g., DataZone or similar data catalogue and governance platforms).
- Implement best practices for data governance and maintain compliance with relevant regulations.
- Collaborate with Project Managers, Frontend Developers, UX Designers, and Data Analysts to build scalable data-driven products.
- Support data consumers in subscribing to and using approved data products across domains.
- Develop backend APIs and work on databases to support the applications.
- Work in an Agile environment that practises Continuous Integration and Delivery.

Requirements:

- At least 3 years of experience as a Data Engineer, with a proven track record of designing and deploying large-scale data solutions.
- Strong proficiency in programming languages, particularly Python, PySpark, and SQL derivatives (MySQL, NoSQL, etc.).
- Experience working with structured, semi-structured, and unstructured data.
- Experience with AWS ETL and orchestration tools such as AWS MSK (Kafka), Firehose, SNS, Airflow, or equivalent.
- Extensive knowledge of data modelling, data access, and data storage infrastructure, including data lakes (both SQL and NoSQL) and data warehouses (e.g., AWS S3, AWS Redshift, AWS Athena, BigQuery, relational databases, NoSQL databases).
- Knowledge of open table formats such as Apache Iceberg, Parquet, etc.
- Familiarity with data mesh principles, domain ownership, data product thinking, and federated governance.
- Familiarity with Big Data technologies (e.g., Hadoop, Spark) and cloud services (e.g., AWS, GCP).
- Solid understanding of data architecture concepts, including data lakes, data marts, data catalogues, data governance, metadata management, and RBAC/ABAC models.
- Exceptional analytical and problem-solving skills, with the ability to work in a fast-paced, collaborative environment.
- Excellent communication skills and experience working with stakeholders at various levels.
- Ability to mentor and guide team members in best practices and new technologies.

Join us and discover a meaningful and exciting career with Assurity Trusted Solutions! The remuneration package will be commensurate with your qualifications and experience. Interested applicants, please click "Apply Now". We thank you for your interest; please note that only shortlisted candidates will be notified.

By submitting your application, you agree that your personal data may be collected, used, and disclosed by Assurity Trusted Solutions Pte. Ltd. (ATS), GovTech, and their service providers and agents in accordance with ATS's privacy statement, which can be found at https://www.assurity.sg/privacy.html or such other successor site.

We promote a learning culture and encourage you to grow and learn. You will enjoy annual leave benefits with additional perks such as Family Care and Birthday Leave, working in a collaborative environment with helpful team members.
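As an illustration of the ingest-normalise-deliver pipeline work described above (a minimal sketch, not code from the actual role; all bucket names, paths, and column names are hypothetical), a small PySpark batch job might look like this:

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ingest-normalise-example").getOrCreate()

# Ingest: read semi-structured raw events from a hypothetical landing zone.
raw = spark.read.json("s3://example-landing-zone/events/")

# Normalise: trim identifiers, standardise timestamps, and drop incomplete
# or duplicate records to keep data quality high.
clean = (
    raw.withColumn("user_id", F.trim(F.col("user_id")))
       .withColumn("event_ts", F.to_timestamp("event_ts"))
       .withColumn("event_date", F.to_date("event_ts"))
       .dropna(subset=["user_id", "event_ts"])
       .dropDuplicates(["event_id"])
)

# Deliver: write partitioned Parquet to a hypothetical curated zone, ready to be
# registered in a data catalogue (e.g., Glue or DataZone) and queried via Athena.
(clean.write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("s3://example-curated-zone/events/"))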

How To Apply:

If you would like to apply to this job directly from the source, please click here

Responsibilities
The Data Engineer will design, implement, and maintain scalable data pipelines and ETL processes while collaborating with cross-functional teams to gather requirements. They will also lead data modelling efforts and ensure compliance with data governance regulations.