Start Date: Immediate
Expiry Date: 27 Nov, 25
Salary: 50.0
Posted On: 28 Aug, 25
Experience: 3 year(s) or above
Remote Job: Yes
Telecommute: Yes
Sponsor Visa: No
Skills: Server Management, Linux System Administration, Kubernetes, Database Security, Data Classification, Teams, Kafka, Computer Science, Normalization, Collaboration, Data Engineering, Airflow, Database Administration
Industry: Information Technology/IT
JOB DESCRIPTION:
As a data engineer, you’ll be part of a diverse team of technologists. Our goal is to deliver flexible, reliable platform solutions that enable our feature development teams to build exceptional services for our customers. We continually enhance our capabilities for developing, testing, and deploying features that improve reliability and resilience. Achieving this requires robust database solutions and thoughtful design to optimize data ingestion and extraction, while upholding data protection, integration, and availability.
QUALIFICATIONS AND SKILLS
Degree or diploma in Computer Science, Software Engineering, or a related field.
At least 3 years of relevant professional experience.
Background in one or more of the following: database administration, data engineering, ETL loading and ingestion, data classification and normalization, and SQL and schema optimization.
Skills in Linux system administration, web server management, and networking are beneficial.
Experience with Kubernetes is an advantage.
Knowledge of database security best practices, including collaborating across teams to ensure safe, zero-trust access.
Proficiency with AWS services such as Lambda, Redshift, and DMS.
Experience with Apache tools such as Airflow, Superset, and Kafka.
Experience with similar technologies from other cloud vendors is a plus.
Strong curiosity and motivation to stay abreast of technical trends and recommend new tools and approaches.
Ability to empathize and communicate effectively with all areas of the business.
Job Type: Fixed-term contract
Pay: $50.00-$55.00 per hour
Experience:
Extensive experience with MySQL to support platform growth and scalability.
Expertise in both tactical and strategic schema design enhancements, ensuring optimal data ingestion and extraction for our warehouse database to support BI, external reporting, and operations teams.
Experience collaborating with development teams to fine-tune schemas and queries for new features prior to production release.
Skilled in ETL processes and Data Lake management.
Familiarity with MySQL-to-PostgreSQL migrations is a strong asset.
Experience with Snowflake.
Experience with Python or Go is desirable for maintaining the current ETL ingestion scripts.
Proactive in seeking opportunities to enhance platform and data resilience.
Ability to diagnose and resolve system issues, determine root causes, and manage recovery processes to restore services rapidly with no data loss.
Precise attention to detail in all tasks.