Data Infrastructure Engineer ID35383 at AgileEngine
Remote, Silesian Voivodeship, Poland
Full Time


Start Date

Immediate

Expiry Date

25 Jul, 2025

Salary

Not specified

Posted On

26 Apr, 2025

Experience

5+ years

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Good communication skills

Industry

Information Technology/IT

Description

MUST HAVES

  • 5+ years of engineering and data analytics experience;
  • Strong SQL and Python/Scala skills for complex data analysis;
  • Hands-on experience building automation tooling and pipelines using Python, Scala, Go, or TypeScript;
  • Experience with modern data pipeline and warehouse tools (e.g., Snowflake, Databricks, Spark, AWS Glue);
  • Proficiency with declarative data modeling and transformation tools (e.g., dbt, SQLMesh);
  • Familiarity with real-time data streaming (e.g., Kafka, Spark);
  • Experience configuring and maintaining data orchestration platforms (e.g., Airflow; a minimal sketch follows this list);
  • Background working with cloud-based data lakes and secure data practices;
  • Ability to work autonomously and drive projects end-to-end;
  • Strong bias for simplicity, speed, and avoiding overengineering;
  • Upper-intermediate English level.
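
For a sense of the orchestration work referenced above, here is a minimal sketch of a daily batch pipeline, assuming Airflow 2.x and Python; the DAG id, task names, and callables are hypothetical:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract_orders():
        # Pull the day's records from a source system (hypothetical step).
        ...


    def load_to_warehouse():
        # Load the extracted batch into the warehouse (hypothetical step).
        ...


    # A daily batch pipeline: extraction runs first, then the warehouse load.
    with DAG(
        dag_id="daily_orders_pipeline",  # hypothetical name
        start_date=datetime(2025, 1, 1),
        schedule="@daily",  # Airflow 2.4+ keyword; older versions use schedule_interval
        catchup=False,
    ) as dag:
        extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
        load = PythonOperator(task_id="load_to_warehouse", python_callable=load_to_warehouse)
        extract >> load  # declare ordering: load runs only after extract succeeds
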
Responsibilities
  • Architect, build, and maintain modern and robust real-time and batch data analytics pipelines;
  • Develop and maintain declarative data models and transformations;
  • Implement data ingestion integrations for streaming and traditional sources such as Postgres, Kafka, and DynamoDB (see the consumer sketch after this list);
  • Deploy and configure BI tooling for data analysis;
  • Work closely with product, finance, legal, and compliance teams to build dashboards and reports that support business operations, regulatory obligations, and customer needs;
  • Establish, communicate, and enforce data governance policies;
  • Document and share best practices regarding schema management, data integrity, availability, and security;
  • Protect and limit access to sensitive data by implementing a secure permissioning model and establishing data masking and tokenization processes;
  • Identify and communicate data platform needs, including additional tooling and staffing;
  • Work with cross-functional teams to define requirements, plan projects, and execute on the plan.
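
As referenced in the ingestion item above, a minimal sketch of a streaming consumer, assuming the confluent-kafka Python client; the broker address, topic, and consumer group are hypothetical:

    import json

    from confluent_kafka import Consumer

    # Hypothetical connection settings; real values would come from configuration.
    consumer = Consumer({
        "bootstrap.servers": "localhost:9092",
        "group.id": "analytics-ingest",
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe(["orders"])  # hypothetical topic

    try:
        while True:
            msg = consumer.poll(timeout=1.0)  # wait up to 1s for the next record
            if msg is None:
                continue
            if msg.error():
                print(f"consumer error: {msg.error()}")
                continue
            event = json.loads(msg.value())
            print(event)  # hand off to the pipeline here (transform, load, etc.)
    finally:
        consumer.close()

A production consumer would add schema validation, batching, and dead-letter handling rather than printing records.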