Senior Data / Platform Engineer (Embedded - Data & Analytics Engineering) at Decision Foundry
Bengaluru, Karnataka, India
Full Time


Start Date

Immediate

Expiry Date

24 Apr, 26

Salary

Not disclosed

Posted On

24 Jan, 26

Experience

5 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Python, Data Engineering, Platform Engineering, AWS, Terraform, Snowflake, dbt, Data Pipeline Design, Observability, Monitoring, Logging, Alerting, Testing, Workflow Orchestration, Web Scraping, Email Scraping

Industry

Business Consulting and Services

Description
Welcome to Decision Foundry - Data Analytics Division! We are proud to introduce ourselves as a certified "Great Place to Work," where we prioritize creating an exceptional work environment. As a global company, we embrace a diverse culture, fostering inclusivity across all levels. Originating from a well-established, 19-year-old web analytics company, we remain dedicated to our employee-centric approach. By valuing our team members, we aim to enhance engagement and drive collective success.

We are passionate about harnessing the power of data analytics to transform decision-making processes. Our mission is to empower data-driven decisions that contribute to a better world. In our workplace, you will enjoy the freedom to experiment and explore innovative ideas, leading to outstanding client service and value creation.

We win as an organization through our core tenets:

· One Team. One Theme.
· We sign it. We deliver it.
· Be Accountable and Expect Accountability.
· Raise Your Hand or Be Willing to Extend it.

About the Role

We are looking for a Senior Data / Platform Engineer to embed directly into our Data & Analytics Engineering team and help accelerate delivery across a highly customized, API-driven data platform. This role is focused on augmenting and hardening the existing platform, building and expanding pipelines, and developing reusable infrastructure and library components to support scalable ingestion and transformation workflows.

This is a hands-on engineering role best suited for someone who thrives in software-engineering-style data work: building modular Python libraries, deploying pipeline infrastructure, and improving reliability, observability, and test coverage across a production data ecosystem.

Location: Remote – EST hours
Type: Contract
Team: Data Platform / Analytics Engineering

Key Responsibilities: What You'll Work On

You will integrate into our team to accelerate well-scoped execution work, including:

· Data pipeline and ingestion expansion across multiple sources and delivery patterns
· Platform hardening and refactoring initiatives to improve scalability and maintainability
· Observability, testing, and reliability improvements across orchestration and batch workloads
· Deployment and modularization of pipeline components to support repeatable onboarding of net-new data capabilities
· Supporting dbt model and mart development (a big plus) and maintaining analytics transformations in Snowflake

Core Responsibilities

· Build and maintain serverless, containerized batch pipelines orchestrated via Prefect (similar to Airflow); see the sketch after this list
· Expand ingestion and connectivity patterns across APIs, S3-based sources, SFTP infrastructure, email scraping, and web scraping
· Develop and enhance internal Python libraries used to standardize ingestion, transformation, and pipeline deployment patterns
· Implement and improve data observability practices, including monitoring, alerting, and failure diagnostics
· Contribute to infrastructure-as-code using Terraform to support repeatable deployments and environment consistency
· Support and improve the data warehouse ecosystem: Snowflake as the primary data warehouse, with dbt on Snowflake for modeling and analytics transformations
· Collaborate closely with internal engineers through PR reviews, sprint workflows, and team standards
· Operate within existing repos, processes, and CI/CD workflows to increase throughput while maintaining quality
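To make the orchestration pattern above concrete, here is a minimal sketch of a modular, retry-aware Prefect batch flow of the kind this role would build, assuming Prefect 2.x, requests, and boto3. All names here (fetch_records, load_to_s3, and the example URL, bucket, and key) are hypothetical and purely illustrative, not part of Decision Foundry's actual codebase.

```python
# A minimal sketch, assuming Prefect 2.x, requests, and boto3 are installed.
# All names (fetch_records, load_to_s3, URL/bucket/key) are hypothetical.
import json

import boto3
import requests
from prefect import flow, task


@task(retries=3, retry_delay_seconds=60)
def fetch_records(api_url: str) -> list[dict]:
    """Pull one batch of records from an upstream API; Prefect retries on failure."""
    resp = requests.get(api_url, timeout=30)
    resp.raise_for_status()
    return resp.json()


@task
def load_to_s3(records: list[dict], bucket: str, key: str) -> None:
    """Land the batch in S3 as one JSON object; a fixed key keeps reruns idempotent."""
    boto3.client("s3").put_object(Bucket=bucket, Key=key, Body=json.dumps(records))


@flow(log_prints=True)
def api_to_s3_ingest(api_url: str, bucket: str, key: str) -> None:
    """Modular flow: each task is independently retryable and reusable."""
    records = fetch_records(api_url)
    load_to_s3(records, bucket, key)
    print(f"Loaded {len(records)} records to s3://{bucket}/{key}")


if __name__ == "__main__":
    api_to_s3_ingest("https://api.example.com/v1/records", "raw-zone", "ingest/records.json")
```

In production, a flow like this would typically be packaged as a containerized or serverless batch deployment and parameterized per source, which is the repeatable-onboarding pattern the responsibilities above describe.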
Technical Environment

· Python (expert level required)
· Prefect (workflow orchestration)
· AWS (cloud-native compute, containerized/serverless batch workloads)
· Terraform (infrastructure-as-code)
· Snowflake (data warehouse)
· dbt (transformations and marts)
· Highly integrated, customized platform with heavy API-based data flows

What Success Looks Like

· Net-new ingestion capabilities are delivered faster without sacrificing reliability
· Pipelines are more modular, reusable, and deployable through standardized patterns
· Failures are easier to detect and debug through improved observability and testing
· The platform becomes easier to maintain as codebases are refactored and hardened
· Internal senior engineers retain architectural ownership while execution throughput increases

Required Qualifications

· 6+ years of experience in Data Engineering, Platform Engineering, or Software Engineering with strong data systems exposure
· Expert-level Python skills with a track record of building production-grade libraries and services
· Strong experience building and operating batch pipeline infrastructure in cloud environments (AWS preferred)
· Experience with workflow orchestration tools such as Prefect, Airflow, or Dagster
· Strong understanding of data pipeline design: modularity, idempotency, retries, deployment patterns, and maintainability
· Experience implementing data observability, monitoring, logging, alerting, and testing frameworks
· Hands-on experience with Terraform or similar infrastructure-as-code tooling
· Comfortable working in an embedded model: collaborating inside existing repos, PR workflows, and delivery processes

Preferred / Nice-to-Have

· Strong experience with dbt (models, marts, testing, documentation)
· Experience with Snowflake performance optimization and warehouse best practices
· Experience with web scraping and/or email scraping pipelines
· Familiarity with containerized workloads and serverless compute patterns
· Strong instincts for platform refactoring, system hardening, and reliability engineering

Working Model / Team Approach

Our internal team retains ownership of architecture, modeling standards, and technical direction. This role operates as an embedded senior engineer within our workflows to accelerate delivery, increase throughput, and protect senior internal capacity, without compromising quality.

Equal Opportunity Statement

We are committed to building a diverse and inclusive team. We welcome applications from candidates of all backgrounds and are an equal opportunity employer. We provide equal employment opportunities to all employees and applicants for employment without regard to race, color, religion, sex, national origin, age, disability, gender identity, sexual orientation, marital status, or veteran status.
Responsibilities
The Senior Data / Platform Engineer will focus on expanding data pipelines and hardening the existing platform while developing reusable infrastructure components. They will also improve observability and reliability across the data ecosystem.