Senior Data Engineer at Stravito
Amsterdam, North Holland, Netherlands - Full Time


Start Date

Immediate

Expiry Date

21 Jul, 26

Salary

Not specified

Posted On

22 Apr, 26

Experience

5 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Python, Kotlin, Data Engineering, Cloud Infrastructure, AWS, Azure, ClickHouse, Snowflake, API Design, RBAC, OAuth, SAML, CI/CD, Terraform, Data Privacy, Vector Stores

Industry

Software Development

Description
We are Stravito, and this is the problem we solve. Stravito transforms how Consumer Insights professionals and Brand Managers work by building AI that automates their core workflows. We help world-leading organizations across industries accelerate strategic decision-making by turning millions of market research documents into intelligent systems that generate reports, discover insights proactively, and synthesize knowledge across vast content libraries. The data platform underneath makes all of this possible: ingesting millions of documents, enforcing tenant-level security, and feeding the AI systems our customers rely on daily.

What Makes This Role Unique

- Build the foundation: Own the data platform that powers AI products used daily by thousands of professionals at Fortune 500 companies
- Work with data that matters: Transform event streams, usage telemetry, and millions of proprietary market research documents into reliable, queryable intelligence
- Solve hard infrastructure problems: Design multi-tenant, secure-by-default pipelines and APIs that meet the compliance bar of the world's most demanding enterprise customers
- Own it end-to-end: This isn't a role where someone else defines the work. You'll scope work independently, dig into data quality issues, field requests from stakeholders, and drive things to completion.

What You'll Do

As part of the platform team, you'll own a broad area: from building infrastructure to supporting the teams that depend on it.
In any given week you might:

- Build and operate data pipelines that move event streams, document metadata, and usage data into our cloud data warehouse (ClickHouse Cloud, Snowflake, Azure)
- Design and maintain APIs for analytics and event extraction, with multi-tenant security baked in (RBAC, OAuth, SSO/SAML)
- Make data usable: whether that's modeling schemas for BI consumers, investigating a data quality issue, or helping a stakeholder understand what's possible with the data we have
- Keep things reliable and secure through automated tests, monitoring, and handling of sensitive data within SOC 2 and ISO 27001 environments
- Power our AI experiences by working with vector stores, indexing, and retrieval systems
- Drive engineering best practices across the platform: CI/CD, peer reviews, infrastructure as code, API versioning, and clear documentation

What We Need

Must-haves

- A track record of building data platforms in SaaS or cloud-native analytics environments
- Strong programming skills, with depth in at least one of Python or Kotlin and willingness to work across both. Experience with Rust or TypeScript is a plus
- Hands-on experience with MPP/cloud data warehouses (e.g., ClickHouse, Redshift, BigQuery, Snowflake, Azure Synapse) and cloud infrastructure on AWS or Azure
- Practical experience designing, consuming, and maintaining APIs
- Familiarity with multi-tenant security patterns: RBAC, row-level security, and identity standards such as OAuth and SAML/SSO
- Solid engineering fundamentals: CI/CD, automated testing, observability, and infrastructure as code (Terraform a plus)
- Working knowledge of data privacy requirements (PII handling, GDPR) and experience operating within compliance frameworks like SOC 2 or ISO 27001

Nice-to-haves

- Experience integrating with BI tools (Power BI, Tableau, Looker)
- Familiarity with semantic search, embeddings, or vector stores (e.g., Pinecone, pgvector)
- Exposure to event-driven or streaming architectures (Kafka, Kinesis, SQS/SNS)
- Experience with containerisation (Docker, ECS/Fargate)
- Interest in leveraging LLMs and AI tooling to accelerate data engineering work
- Experience with dbt, SQLMesh, or a similar transformation framework

This role is fully remote, but you will need to be a current resident for tax purposes in one of the chosen locations.

What's in it for you?

Join a remote-first, globally distributed team of 100+ professionals from 30+ nationalities, united by our core values: simplicity first, an 'own it, do it' mentality, embracing different perspectives, and enjoying the journey together. We bring everyone together for company events throughout the year to strengthen our global connections. You'll grow alongside experienced colleagues with deep expertise across AI, market research, and enterprise systems. We offer exceptional career development opportunities in our fast-evolving market, competitive compensation, and a collaborative culture where everyone actively supports each other's success.
Most importantly, you'll have the satisfaction of simplifying the professional lives of thousands of Brand Managers and Consumer Insights professionals worldwide, making their strategic decisions more data-driven and impactful.

Ready to build the data platform behind Fortune 500 strategy? Join us.
Responsibilities
You will own the data platform infrastructure, building and operating the pipelines that ingest millions of documents and event streams. You will also design secure, multi-tenant APIs and ensure data quality and reliability for AI-driven products.