Sr. Data Engineer at Oscilar
United States, USA
Full Time


Start Date

Immediate

Expiry Date

04 Dec, 25

Salary

0.0

Posted On

04 Sep, 25

Experience

5 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Python, Java, Data Engineering, Data Systems, Kafka, Business Alignment, Athena, Pipeline Design, SQL, Security

Industry

Information Technology/IT

Description

SHAPE THE FUTURE OF TRUST IN THE AGE OF AI

At Oscilar, we’re building the most advanced AI Risk Decisioning Platform. Banks, fintechs, and digitally native organizations rely on us to manage their fraud, credit, and compliance risk with the power of AI. If you’re passionate about solving complex problems and making the internet safer for everyone, this is your place.

JOB DESCRIPTION

As a Senior Data Engineer at Oscilar, you will be responsible for designing, building, and maintaining the data infrastructure that powers our AI-driven decisioning and risk management platform. You will collaborate closely with cross-functional teams, ensuring the delivery of highly reliable, low-latency, and scalable data pipelines and storage solutions that support real-time analytics and mission-critical ML/AI models.

QUALIFICATIONS

  • 5+ years in data engineering (or equivalent), including architecting and operating production ETL/ELT pipelines for real-time, high-volume analytic platforms.
  • Deep proficiency with ClickHouse, Postgres, Athena, and distributed data systems (Kafka, cloud-native stores); proven experience with both batch and streaming pipeline design.
  • Advanced programming in Python and SQL, with bonus points for Java; expertise in workflow orchestration (Airflow, Step Functions), CI/CD, and automated testing for data.
  • Experience in high-scale, low-latency environments; understanding of security, privacy, and compliance requirements for financial-grade platforms.
  • Strong communication, business alignment, and documentation abilities—capable of translating complex tech into actionable value for customers and stakeholders.
  • Alignment with Oscilar’s values: customer obsession, radical ownership, bold vision, efficient growth, and unified teamwork with a culture of trust and excellence.


Responsibilities
  • Architect and implement scalable ETL and data pipelines spanning ClickHouse, Postgres, Athena, and diverse cloud-native sources to support real-time risk management and advanced analytics for AI-driven decisioning.
  • Design, develop, and optimize distributed data storage solutions to ensure both high performance (low latency, high throughput) and reliability at scale—serving mission-critical models for fraud detection and compliance.
  • Drive schema evolution, data modeling, and advanced optimizations for analytical and operational databases, including sharding, partitioning, and pipeline orchestration (batch, streaming, CDC frameworks).
  • Own the end-to-end data flow: integrate multiple internal and external data sources, enforce data validation and lineage, automate and monitor workflow reliability (CI/CD for data, anomaly detection, etc.).
  • Collaborate cross-functionally with engineers, product managers, and data scientists to deliver secure, scalable solutions that enable fast experimentation and robust operationalization of new ML/AI models.
  • Champion radical ownership—identify opportunities, propose improvements, and implement innovative technical and process solutions within a fast-moving, remote-first culture.
  • Mentor and upskill team members, cultivate a learning environment, and contribute to a collaborative, mission-oriented culture.