DATA ENGINEER at SVITLA SYSTEMS
Costa Rica
Full Time


Start Date

Immediate

Expiry Date

11 May, 2025

Salary

0.0

Posted On

12 Feb, 2025

Experience

3 year(s) or above

Remote Job

No

Telecommute

No

Sponsor Visa

No

Skills

Scala, Data Engineering, Git, Kubernetes, ECR, Docker, Hive, Python, ETL, Jenkins, DevOps, Unit Testing, Relational Databases, Glue, Spark

Industry

Information Technology/IT

Description


February 11, 2025
Svitla Systems Inc. is looking for a Data Engineer for a full-time position (40 hours per week) in Costa Rica/Mexico.
Our client is a company that specializes in custom software development, technology consulting, and digital transformation initiatives. Headquartered in Chicago with a satellite office in Lisbon, it drives digital transformation through innovative software solutions. Founded by three hands-on product and software development leaders with deep roots in the fintech space, the team blends strategic vision with technical expertise.
They thrive on problem-solving and innovation, leveraging their extensive knowledge to design, build, and test new technologies.
The project is a chance to get in early with a rapidly growing Silicon Valley company and to help develop the next generation of genuinely game-changing products for the insurance industry. Its analytics solution leads the industry in technologically advanced software, dedicated to expanding the horizons of what is possible.
The mission is to drive the best claims outcomes for insurers and the insured using innovative machine learning, including deep learning and natural language processing. The models provide key insights and predictions that guide the claims adjuster in making optimal decisions at every step of the claim process.

REQUIREMENTS

  • 5+ years of experience with Python and Scala for data engineering and ETL.
  • 5+ years of experience with data pipeline tools (Informatica, Spark, Spark SQL, etc. preferred), DAG orchestration, and workflow management tools (Airflow, AWS Step Functions, etc.); see the orchestration sketch after this list.
  • 5+ years of experience working in the AWS ecosystem or GCP.
  • 3+ years of experience using cloud-provider AI services.
  • 3+ years of experience with Kubernetes and developing applications at scale.
  • 3+ years of hands-on experience developing ETL solutions using RDS and warehouse solutions using AWS services such as S3, IAM, Lambda, RDS, Redshift, Glue, SQS, EKS, and ECR.
  • Strong understanding of SQL programming with relational databases, with experience writing complex SQL queries.
  • Understanding of distributed computing tools (Spark, Hive, etc.).
  • Knowledge of software engineering best practices, including but not limited to version control (Git), CI/CD (Jenkins, GitLab CI/CD, GitHub Actions), automated unit testing, and DevOps.
  • Knowledge of containers/orchestration (Docker, Kubernetes, Helm).
  • Ability to contribute in an agile, collaborative, and fast-paced environment.
  • An excellent troubleshooter and problem-solver who likes to think outside the box.
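As a rough illustration of the orchestration stack named above, here is a minimal Airflow DAG sketch in Python (assuming Airflow 2.4+). The DAG id, task logic, and sample data are illustrative placeholders, not details of the client's pipelines.

  from datetime import datetime

  from airflow import DAG
  from airflow.operators.python import PythonOperator

  def extract(**context):
      # A real task might pull raw claim records from S3 or RDS;
      # here we return a tiny in-memory sample.
      return [{"claim_id": 1, "amount": "125.50"}, {"claim_id": 2, "amount": "980.00"}]

  def transform(**context):
      # Cast amounts to floats; production jobs would validate schemas
      # and route bad rows to a dead-letter location.
      rows = context["ti"].xcom_pull(task_ids="extract")
      return [{**row, "amount": float(row["amount"])} for row in rows]

  def load(**context):
      # A real task would write to Redshift or S3; here we just log the count.
      rows = context["ti"].xcom_pull(task_ids="transform")
      print(f"loaded {len(rows)} rows")

  with DAG(
      dag_id="claims_etl_example",  # hypothetical name
      start_date=datetime(2025, 1, 1),
      schedule="@daily",  # use schedule_interval on Airflow versions before 2.4
      catchup=False,
  ) as dag:
      extract_task = PythonOperator(task_id="extract", python_callable=extract)
      transform_task = PythonOperator(task_id="transform", python_callable=transform)
      load_task = PythonOperator(task_id="load", python_callable=load)

      # Linear dependency chain: extract -> transform -> load.
      extract_task >> transform_task >> load_task

Passing small payloads through XCom keeps the sketch self-contained; real pipelines would move data through S3 and pass only references between tasks.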
RESPONSIBILITIES
  • Design, develop, maintain, and enhance highly scalable data engineering solutions leveraging AWS services.
  • Design, build, document, and implement scalable pipelines with a clear focus on data quality and reliability.
  • Ingest and transform structured, semi-structured, and unstructured data from multiple sources. Build an enterprise-level ETL/ELT solution (see the PySpark sketch after this list).
  • Innovate and build proprietary algorithms to tackle complex problems involving interesting data challenges.
  • Execute and continually optimize new customer data ingestion and model implementation processes.
  • Integrate business knowledge with technical functionalities.
  • Collaborate regularly with application engineers, data scientists, product managers, and product delivery teams.
  • Develop solutions at the intersection of data and ML.
  • Monitor workflow performance and reliability and ensure SLA targets are met.
  • Automate existing code and processes using scripting, CI/CD, infrastructure-as-code, and configuration management tools.
  • AI is at the heart of everything the client does; if you want to be closer to the AI scene, there are opportunities to work on problems like NLP, image analysis and featurization, and OCR labeling.
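As a rough sketch of the ingest-and-transform responsibility above, here is a minimal PySpark job that reads semi-structured JSON from S3, normalizes types, and writes partitioned Parquet for downstream warehouse loads. The bucket paths and column names are hypothetical assumptions, not client specifics.

  from pyspark.sql import SparkSession
  from pyspark.sql import functions as F

  spark = SparkSession.builder.appName("claims_ingest_example").getOrCreate()

  # Read semi-structured JSON claims from a hypothetical S3 prefix.
  raw = spark.read.json("s3a://example-bucket/raw/claims/")

  # Normalize types, stamp the ingest date, and drop rows missing a key;
  # production jobs would enforce an explicit schema instead of inferring one.
  clean = (
      raw.withColumn("amount", F.col("amount").cast("double"))
         .withColumn("ingest_date", F.current_date())
         .dropna(subset=["claim_id"])
  )

  # Write partitioned Parquet for downstream loads (e.g., Redshift via Glue).
  clean.write.mode("overwrite").partitionBy("ingest_date").parquet(
      "s3a://example-bucket/curated/claims/"
  )

Partitioning by ingest date keeps downstream incremental loads cheap, since each day's data lands in its own prefix.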