Data Engineer at Better Collision Centers Inc
Charleston, South Carolina, United States
Full Time


Start Date

Immediate

Expiry Date

24 Apr, 26

Salary

0.0

Posted On

24 Jan, 26

Experience

2 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Python, SQL, Data Pipelines, XML, JSON, AWS, GCP, Azure, Snowflake, Data Modeling, Data Quality, Error Handling, API Integration, Data Validation, Analytics, Data Architecture

Industry

Vehicle Repair and Maintenance

Description

About BetterX

BetterX is the technology subsidiary of Better Collision Centers, created to bring modern, practical, AI-powered solutions to industries that have historically been underserved by technology. We build systems that simplify complex workflows, remove manual bottlenecks, and deliver clean, reliable data that teams can trust. Our tools support real-world operations in collision repair and other service-based businesses, with a strong focus on usability, performance, and reliability. We are a small, fast-moving team that values ownership, strong engineering fundamentals, and shipping production-ready systems that make a real impact.

About the Role

We are seeking a Data Engineer to design, build, and deliver a secure, scalable data pipeline connecting external webhooks, cloud data warehouses, and CRM systems. This is a temporary, project-based role with an expected duration of three to six months, with a strong possibility of extension or conversion to a full-time position based on performance and business needs. You will work closely with API, platform, AI, and analytics teams to ensure downstream systems receive clean, well-structured, and trustworthy data.

What You'll Do

- Design and implement event-driven, incremental data ingestion pipelines using webhooks and cloud data warehouse tools
- Ingest and process high-volume XML and JSON data using idempotent, retry-safe logic (see the sketch after this section)
- Build and maintain raw, parsed, and curated data layers in Snowflake or similar cloud warehouses
- Implement data validation, reconciliation checks, and error handling for critical pipelines
- Monitor pipeline health, including latency, throughput, and error rates
- Design and enforce secure, multi-tenant data isolation and role-based access control
- Partner with API, AI, analytics, and business stakeholders to support CRM integrations and data delivery
- Document pipeline architecture, schemas, data flows, and operational runbooks

What You Bring

- 2-3+ years of professional experience with Python and SQL
- Hands-on experience building modern data pipelines (batch and/or streaming)
- Experience working with semi-structured data, including XML and JSON
- Familiarity with cloud platforms such as AWS, GCP, or Azure
- Experience with data warehouse platforms such as Snowflake
- Strong understanding of data modeling and layered data architectures
- Experience implementing data quality checks, retry logic, and reconciliation processes
- Ability to work independently and deliver production-ready systems within a defined timeline
- Strong communication skills and comfort working cross-functionally

Preferred Experience

- Experience with collision repair, insurance, or automotive data
- CRM data integration or API-based data delivery
- Multi-tenant cloud architecture design
- Exposure to data observability tools (Great Expectations, Datadog, Monte Carlo)
- Familiarity with analytics tools such as Tableau or Power BI

What Success Looks Like

- A production-ready pipeline ingesting external webhook data into the warehouse
- Accurate mapping of repair orders, documents, and related entities
- Strong automated data quality and validation processes
- Clear visibility into pipeline health and performance
- Secure, tenant-isolated data access for internal and downstream consumers
- Well-documented architecture and runbooks that internal teams can support

Important Note

This is a temporary position and is not benefits-eligible. There is a strong possibility of extension or conversion to a full-time role based on performance and business needs.
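The ingestion duties above hinge on idempotent, retry-safe webhook handling. The sketch below shows one common way to get that property: every delivery is written to a raw landing table keyed by a deduplication key, so a redelivered payload is a harmless no-op. This is a minimal illustration, not the actual BetterX stack; Flask, SQLite (as a stand-in for Snowflake or another warehouse), the endpoint path, and the X-Event-Id header are all assumptions.

```python
"""Hypothetical sketch of an idempotent webhook ingestion endpoint.
All names (endpoint, table, header) are illustrative assumptions."""
import hashlib
import json
import sqlite3  # stand-in for a warehouse connection (e.g., Snowflake)

from flask import Flask, jsonify, request

app = Flask(__name__)
db = sqlite3.connect("raw_events.db", check_same_thread=False)
db.execute(
    """
    CREATE TABLE IF NOT EXISTS raw_webhook_events (
        event_key   TEXT PRIMARY KEY,  -- dedupe key makes retries safe
        payload     TEXT NOT NULL,
        received_at TEXT DEFAULT CURRENT_TIMESTAMP
    )
    """
)

@app.route("/webhooks/repair-orders", methods=["POST"])
def ingest():
    payload = request.get_data()
    # Prefer the sender's event id; fall back to a content hash so a
    # redelivered payload maps to the same key either way.
    event_key = request.headers.get("X-Event-Id") or hashlib.sha256(payload).hexdigest()
    try:
        json.loads(payload)  # validate early; reject malformed bodies
    except ValueError:
        return jsonify({"error": "invalid JSON"}), 400
    # INSERT OR IGNORE is the idempotent write: a retried delivery hits
    # the primary key and becomes a no-op, so the sender's retry loop
    # can always be answered with success.
    db.execute(
        "INSERT OR IGNORE INTO raw_webhook_events (event_key, payload) VALUES (?, ?)",
        (event_key, payload.decode("utf-8")),
    )
    db.commit()
    return jsonify({"status": "accepted", "event_key": event_key}), 200

if __name__ == "__main__":
    app.run()
```

In a warehouse such as Snowflake, the same effect is typically achieved with a MERGE (or INSERT ... SELECT with an anti-join) on the event key rather than a primary-key constraint, but the pattern is the same: dedupe on write so retries never create duplicates downstream.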
Why BetterX

You'll have real ownership, real impact, and the opportunity to build systems that matter. This role is ideal for someone who enjoys shipping production-quality work, solving real problems, and contributing to a fast-growing technology organization.
Responsibilities
The Data Engineer will design, build, and deliver a secure, scalable data pipeline connecting external webhooks, cloud data warehouses, and CRM systems, working closely with API, platform, AI, and analytics teams to ensure downstream systems receive clean, well-structured, and trustworthy data.
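As a hedged illustration of the validation and reconciliation work described above, the sketch below compares row counts between a raw landing table and a curated table for one load date and flags any mismatch. The table and column names are hypothetical, and SQLite again stands in for the warehouse.

```python
"""Hypothetical reconciliation check between raw and curated layers.
Table and column names are illustrative assumptions."""
import sqlite3

def reconcile(conn: sqlite3.Connection, load_date: str) -> bool:
    # Rows landed in the raw layer on the given date...
    raw = conn.execute(
        "SELECT COUNT(*) FROM raw_webhook_events WHERE DATE(received_at) = ?",
        (load_date,),
    ).fetchone()[0]
    # ...should all have made it into the curated layer.
    curated = conn.execute(
        "SELECT COUNT(*) FROM curated_repair_orders WHERE load_date = ?",
        (load_date,),
    ).fetchone()[0]
    if raw != curated:
        # In production this would page or open a ticket rather than print.
        print(f"RECONCILIATION FAILED {load_date}: raw={raw} curated={curated}")
        return False
    return True

# Example usage:
# ok = reconcile(sqlite3.connect("raw_events.db"), "2026-01-24")
```

Count-based checks like this are deliberately cheap; richer assertions (schema, nullability, value ranges) are usually layered on with a tool such as Great Expectations, which the posting lists as preferred experience.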