Senior Data Engineer at Great Eastern Life Assurance Co Ltd
Cuenca, Azuay, Ecuador
Full Time


Start Date

Immediate

Expiry Date

07 Jun, 26

Salary

0.0

Posted On

09 Mar, 26

Experience

5 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Data Pipelines, ETL, Data Quality, Scalability, Data Modeling, Code Reviews, Testing, Hadoop, Spark, Hive, Cloud Data Services, SQL, Big Data, Stakeholder Management, Data Governance, Architecture

Industry

Insurance

Description
We are seeking a skilled and detail-oriented Data Engineer to design, develop, and maintain robust data pipelines and ETL solutions. This role involves working closely with cross-functional teams to ensure data quality, scalability, and alignment with business and technical requirements.

Key Responsibilities

- Design, develop, test, and maintain scalable ETL pipelines to meet business, technical, and user requirements.
- Collect, refine, and integrate new datasets.
- Maintain comprehensive documentation and data mappings across multiple systems.
- Create optimized and scalable data models that align with organizational data architecture standards and best practices.
- Conduct code reviews and perform rigorous testing to ensure high-quality deliverables.
- Drive continuous improvement in data quality through optimization, testing, and solution design reviews.
- Ensure all solutions conform to big data architecture guidelines and the long-term roadmap.
- Implement robust monitoring, logging, and alerting systems to ensure pipeline reliability and data accuracy.
- Apply best practices in data engineering to design and build reliable data marts within the Hadoop ecosystem for planning, reporting, and analytics.
- Maintain and optimize data pipelines to ensure data accuracy, integrity, and timeliness.
- Manage code in a centralized repository with clear branching strategies and well-documented commit messages.
- Coordinate with stakeholders to ensure smooth production deployment and adherence to data governance policies.
- Proactively identify and implement improvements to data engineering processes and workflows.
- Architect end-to-end solutions for business analytics products (dashboards or statistical models), from acquiring data and contextualizing it for business analytics to integrating the product with business processes.
- Act as a business process owner for onboarding users and data products onto the data platform and pipelines supporting dashboards and statistical models.
- Ensure adherence to development standards and perform periodic reviews to maintain pipeline performance and sustainability.
- Coordinate and conduct testing with stakeholders to ensure effective deployment of data pipelines and dashboards.
- Monitor data pipelines continuously and collaborate with stakeholders to troubleshoot and optimize performance.

Requirements

- Bachelor's degree in Computer Engineering, Computer Science, Mathematics, Software Engineering, or an equivalent field, or proven experience in data engineering.
- A minimum of 5 years of experience in the Business Intelligence / Data Analytics field.
- An analytics practitioner with proven experience delivering data-driven business solutions and data-driven process augmentation.
- Proficiency in tools and platforms such as Hadoop, Spark, Hive, and cloud data services (e.g., AWS, Azure, GCP).
- Hands-on experience with large volumes of data using SQL, Spark, Hadoop, or other big data ecosystems is preferred.
- Proven experience in data engineering, ETL development, and big data technologies.
- Stakeholder management: conversant in business terms, with the ability to resolve and explain data analytics issues with business users and other stakeholders.
- A strong team player who is meticulous, detail-oriented, and capable of performing under pressure.
- Strong problem-solving and interpersonal skills.
- Strong communication skills, with the ability to bridge technical and business domains, and a proactive approach to problem-solving.
- Committed, dependable, and adaptable, with the flexibility to provide support during peak periods and tight deadlines.
- Demonstrates high integrity, accountability, and a collaborative mindset.
- Takes initiative to improve the current state of things and embraces change.
- An understanding of banking, insurance, and financial services is preferred.

How To Apply:

If you would like to apply to this job directly from the source, please click here.

Responsibilities
The role involves designing, developing, and maintaining robust data pipelines and ETL solutions, ensuring data quality, scalability, and alignment with business requirements. Responsibilities include creating optimized data models, conducting code reviews, implementing monitoring systems, and architecting end-to-end solutions for business analytics products.