SENIOR DATA ENGINEER at SVITLA SYSTEMS
Romania
Full Time


Start Date

Immediate

Expiry Date

07 May, 25

Salary

0.0

Posted On

08 Feb, 25

Experience

3 year(s) or above

Remote Job

No

Telecommute

No

Sponsor Visa

No

Skills

Storage, Snowflake, Scalability, ETL, AWS, Data Models, Spark, Dashboards

Industry

Information Technology/IT

Description

February 4, 2025
Svitla Systems Inc. is looking for a Senior Data Engineer for a full-time position (40 hours per week) in Romania. Our client is a leading provider of video analysis solutions for loss prevention and security. It offers a cloud-based platform that acts as a hub for analyzing video streams, extracting key data points, and generating alerts and reports. It detects unusual motion: users define regions of the camera views and receive alerts when activity occurs within them. It offers a wide range of tools to track important events and history and to identify outlying patterns and incidents. The subscription-based software connects camera footage with POS data to review all POS transactions, pairing each one with the corresponding real-time video to create a dashboard of searchable moments. These moments let users filter by specific incidents such as movement in a room, particular purchases, or unusual staff behavior. The company is headquartered in Ottawa, Ontario, with regional representation worldwide, and serves the retail, banking, and restaurant industries.
The client is transforming conventional data into ‘smart’ data. The solution connects and synchronizes brick-and-mortar business systems, such as video and point-of-sale (POS) data, to create insight into loss prevention, security, and operations issues. They find new and powerful ways for businesses to get real value from the data and footage their standard systems create every day.

REQUIREMENTS

  • 10+ years of experience designing and implementing large-scale data architectures, with at least 3 years in a senior or principal role.
  • A degree in Computer Science, Engineering, or a related technical field, with a solid grounding in data engineering.
  • Experience in designing efficient data models and schemas that optimize performance and scalability.
  • A strong background in developing and managing ETL and ELT pipelines.
  • Experience with data visualization tools, including building dashboards using BI tools such as ThoughtSpot (or similar).
  • Hands-on experience with AWS, Snowflake, and S3 for storage; with data pipeline tools (AWS Glue, Apache Airflow); and with other big data technologies (Spark, Kafka).
  • Experience setting up observability solutions to ensure data pipeline health (e.g., Prometheus, Grafana).

RESPONSIBILITIES

  • Design and develop a high-volume, scalable data architecture that can securely handle ingestion of varying data types from thousands of data sources.
  • Continuously optimize data pipelines and ETL processes to help the team accelerate data integration.
  • Set up dashboards, reporting tools, and processes to monitor data pipelines, provide insight into data operations, and ensure data quality.
  • Contribute suggestions and new ideas to enhance performance and usability, having the opportunity to make a significant positive impact.
  • Research and develop new solutions, adopting new technologies to enhance the existing software applications.