Start Date
Immediate
Expiry Date
10 Oct, 25
Salary
235,000
Posted On
10 Jul, 25
Experience
7 year(s) or above
Remote Job
Yes
Telecommute
Yes
Sponsor Visa
No
Skills
Tableau, Azure, AWS, Kubernetes, Looker, DV, IT, Research, Communication Skills, Java, DevOps, Computer Science, Object-Oriented Languages, Python, Optimization, Power BI
Industry
Information Technology/IT
WHO WE ARE
DoubleVerify is the leading independent provider of marketing measurement software, data, and analytics that authenticates the quality and effectiveness of digital media for the world’s largest brands and media platforms. DV provides media transparency and accountability to deliver the highest level of impression quality for maximum advertising performance. Since 2008, DV has helped hundreds of Fortune 500 companies get the most from their media spend by delivering best-in-class solutions across the digital ecosystem, helping to build a better industry. Learn more at www.doubleverify.com.
REQUIREMENTS
ROLE DESCRIPTION
Help us build our analytics system, used by both internal teams and external clients, from the ground up. You will be a leading part of a high-performing platform data engineering team that builds an online analytics platform providing insights and data for the world’s largest brands and media platforms. The role involves working on high-scale distributed architecture and petabyte-scale data processing, working with numerous advertising data sources from major platforms such as Google, Meta, and TikTok, developing systems that export massive volumes of reports with thousands of data points daily, and building robust APIs in Python and Java. You will integrate complex tools such as data catalogs (e.g., Atlan, OpenMetadata) and semantic layers (e.g., Looker, Cube.dev), and work with multiple data lakes such as Snowflake, BigQuery, and Databricks. Additionally, you will leverage native tables and newly supported open formats like Iceberg and Delta to ensure maximum flexibility and minimize go-to-market time for new data products.
WHAT YOU WILL DO