Data Engineer at Attention Arc
Durham, North Carolina, United States
Full Time


Start Date

Immediate

Expiry Date

21 May 2026

Salary

$120,000

Posted On

20 Feb 2026

Experience

2 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

SQL, ETL/ELT, Snowflake, dbt, Data Modeling, APIs, Data Quality, Pipeline Orchestration, Data Warehousing, Data Integration, Database Design, Testing, Version Control, Airflow, Data Transformation, Analytics Engineering

Industry

Advertising Services

Description
At Attention Arc, we exist to make media matter. As part of our Technology & Analytics team — The Builders and Translators — you build the connective tissue that powers everything we do. Your work ensures data flows cleanly, reliably, and intelligently across systems — turning complexity into clarity and infrastructure into impact.

This Data Engineer role centers on architecting and scaling robust data pipelines for complex, well-documented datasets. You will design integrations that unify multiple data sources, create analysis-ready environments, and ensure our data warehouse is optimized for performance and growth. This is a hands-on role for someone who thrives on building elegant solutions to messy problems — and who understands that clean data is the foundation of credible insight.

WHAT YOU'LL DO

Build & Scale Data Integrations
- Develop and maintain scalable ETL/ELT pipelines integrating complex datasets from multiple internal and external sources
- Build and maintain API endpoints to support secure, efficient data access and delivery
- Design efficient SQL queries and optimized database structures that support analytics workflows
- Ensure data transformations are documented, modular, and built for reuse

Architect Modern Data Infrastructure
- Implement and optimize Snowflake data warehouse solutions for performance, scalability, and reliability
- Design and maintain dbt (Data Build Tool) models to manage transformation logic and business rules
- Create structured, well-documented data models that support clarity and cross-team adoption
- Continuously improve pipeline performance, orchestration, and monitoring

Deliver Trusted, Analysis-Ready Data
- Partner closely with Analytics, Planning, and Activation teams to deliver clean, validated datasets
- Implement data quality checks, validation frameworks, and monitoring systems
- Ensure consistency and reliability across all data integrations
- Troubleshoot integration issues with urgency and precision

Improve, Automate & Evolve
- Identify inefficiencies in workflows and proactively recommend scalable improvements
- Standardize integration patterns to reduce complexity and onboarding friction
- Evaluate new tools, frameworks, and approaches to enhance infrastructure capabilities
- Contribute to documentation and best practices that elevate agency-wide data maturity

WHAT YOU'LL BRING
You bring both technical depth and disciplined craft. You believe strong infrastructure builds trust — and that trust enables smarter decisions.

Experience & Technical Expertise
- 3–6+ years of experience in data engineering, data integration, or analytics engineering
- Background in media, advertising, or marketing analytics is preferred
- Advanced proficiency in SQL and database design
- Proven experience building and maintaining ETL/ELT pipelines across multiple data sources
- Hands-on experience with Snowflake data warehouse architecture and optimization
- Experience developing and maintaining dbt models for transformation workflows
- Experience working with Nielsen or advanced measurement datasets is a plus
- Experience building and maintaining APIs for data access and distribution
- Strong understanding of data modeling principles (relational and dimensional)
- Experience implementing data quality validation and monitoring practices
- Strong documentation habits and structured version control practices
- Ability to translate business requirements into scalable technical solutions
- Clear communication skills to explain complex systems in actionable terms
- A disciplined approach to testing, validation, and long-term maintainability
- Familiarity with orchestration tools (e.g., Airflow)
- Experience working in cross-functional, agency-style environments

WHO YOU ARE
You are energized by building systems that make other people faster and smarter. You see data integration not as plumbing, but as possibility.

All In - You take ownership of the pipelines and models you build. You design for scale, anticipate edge cases, and follow through until systems are stable and trusted.
No Sidelines - You collaborate across analytics, planning, and activation teams without ego. You share context openly and treat feedback as fuel for better architecture.
Move With Curiosity - You ask why before how. You challenge assumptions in data definitions, explore smarter integration patterns, and experiment with intention to improve performance.
Truth With Heart - You communicate clearly about trade-offs, risks, and limitations. You value transparency in documentation, honesty in debugging, and empathy in cross-functional collaboration.

You believe data engineering is more than infrastructure. It's about credibility, clarity, connection, and care. It's about building systems that empower people to make confident decisions — and turning attention into measurable impact.

SALARY RANGE
Our estimated range for this role is $90k–$120k. Compensation packages are based on the skill level and experience each candidate brings to the role. There may also be a more senior or junior position available that could be a better fit for your expertise. Each level has its own compensation range.

RIGHT TO WORK IN THE US
You must be authorized to work in the US for any employer. At this time, we are not sponsoring or providing assistance with obtaining work authorization.
Responsibilities
The role involves building and maintaining scalable ETL/ELT pipelines to integrate complex datasets from multiple sources, designing efficient SQL queries, and architecting modern data infrastructure including Snowflake warehouses and dbt models. Key duties include ensuring data transformations are documented, modular, and optimized for performance and growth.