Data Engineer II - (Remote - US)
at Mediavine
Atlanta, Georgia, USA
Start Date | Expiry Date | Salary | Posted On | Experience | Skills | Telecommute | Sponsor Visa |
---|---|---|---|---|---|---|---|
Immediate | 21 Dec, 2024 | USD 130000 Annual | 25 Sep, 2024 | 3 year(s) or above | Optimization, User Behavior, Snowflake, Data Infrastructure, Path Analysis, DevOps, Tracking Systems, dbt, Looker, Ad Tech, IT, AWS, Azure | No | No |
Required Visa Status:
US Citizen | GC (Green Card) |
H1B | CPT |
OPT | H4 Spouse of H1B |
Student Visa |
Employment Type:
Full Time | Part Time |
Permanent | Independent - 1099 |
Contract – W2 | C2H Independent |
C2H W2 | Contract – Corp 2 Corp |
Contract to Hire – Corp 2 Corp |
Description:
Mediavine is seeking an experienced Data Engineer to join our engineering team. We are looking for someone who enjoys solving interesting problems and wants to work with a small team of talented engineers on a product used by thousands of publishers. Applicants must be based in the United States.
REQUIREMENTS
Location:
- Applicants must be based in the United States
You Have:
- 3+ years of experience in a data engineering role
- Strong Python skills (understands tradeoffs, optimization, etc.)
- Strong SQL skills (CTEs, window functions, optimization)
- Experience working in cloud environments (AWS preferred, GCS, Azure)
- An understanding of how best to structure data to enable internal- and external-facing analytics
- Familiarity with calling APIs to retrieve data (authentication flows, filters, limits, pagination)
- Experience working with DevOps to deploy, scale and monitor data infrastructure
- Scheduler experience either traditional or DAG based
- Comfortable working with multi-TB cloud data warehouses (Snowflake preferred, Redshift, Big Query)
- Experience with other DBMS systems (Postgres in particular)
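For illustration, the API-retrieval skills above (auth flows, limits, pagination) often look roughly like this minimal sketch against a hypothetical cursor-paginated JSON API — the endpoint shape, `limit`/`cursor` parameter names, and `next_cursor` field are assumptions for the example, not Mediavine specifics:

```python
import json
from urllib import parse, request

def fetch_page(base_url, cursor=None, limit=100, token=None):
    """Fetch one page from a hypothetical cursor-paginated JSON API."""
    params = {"limit": limit}
    if cursor:
        params["cursor"] = cursor
    req = request.Request(base_url + "?" + parse.urlencode(params))
    if token:  # bearer-token authentication flow (assumed auth scheme)
        req.add_header("Authorization", f"Bearer {token}")
    with request.urlopen(req) as resp:
        return json.load(resp)

def fetch_all(base_url, page_fetcher=fetch_page, **kw):
    """Follow next_cursor links until the API reports no more pages."""
    records, cursor = [], None
    while True:
        page = page_fetcher(base_url, cursor=cursor, **kw)
        records.extend(page["data"])
        cursor = page.get("next_cursor")  # absent/None on the final page
        if not cursor:
            break
    return records
```

Injecting `page_fetcher` keeps the pagination loop testable without hitting a real network endpoint.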
Nice to haves:
- Experience with web analysis, such as creating data structures that support product funnels, user behavior, and decision path analysis
- Understanding of Snowflake external stages, file formats and Snowpipe
- Experience with orchestration tools particularly across different technologies and stacks
- Experience with dbt
- Knowledge of Ad Tech, Google Ad Manager and all of its fun quirks (so fun)
- The ability to make your teammates laugh (it wouldn’t hurt if you were fun to work with, is what I’m saying)
- Familiarity with event tracking systems (NewRelic, Snowplow, etc)
- Experience with one or more major BI tools (Domo, Looker, PowerBI, etc.)
Responsibilities:
- Create data pipelines that make data available for analytic and application use cases
- Develop self-healing, resilient processes that do not require constant care and feeding to run smoothly
- Create meaningful data quality notifications with clear actions for interested parties including other internal teams and other members of the data and analytics team
- Lead projects from a technical standpoint, creating project Technical Design Documents
- Support data analysts’ and analytics engineers’ ability to meet the needs of the organization
- Participate in code reviews, understanding coding standards, ensuring test coverage and being aware of best practices
- Build or implement tooling around data quality, governance and lineage, in the dbt framework and Snowflake but external to that as needed
- Provide next-level support when data issues are discovered and communicated by the data analysts
- Work with data analysts and analytics engineers to standardize transformation logic in the dbt layer for consistency and ease of exploration by end users
- Enable analytics engineers and data analysts by providing data modeling guidance, query optimization and aggregation advice
REQUIREMENT SUMMARY
Min: 3.0 | Max: 8.0 year(s)
Information Technology/IT
Analytics & Business Intelligence
Software Engineering
Graduate
Proficient
1
Atlanta, GA, USA