Data Engineer at Tribute Technology
North Carolina, USA
Full Time


Start Date

Immediate

Expiry Date

25 Jul, 2025

Salary

0.0

Posted On

26 Apr, 2025

Experience

3 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

SQL, Python, Google Cloud Platform, Computer Science, JIRA, Analytical Skills, Communication Skills, Vendors, Information Systems

Industry

Information Technology/IT

Description

Tribute Technology is an established best-in-class Software as a Service technology company and solutions provider. Our customers include some of the largest and most prominent media brands in the world, spanning 4 continents and reaching millions of users every day. Our mission is to make meaningful connections between our customers and their users through innovation and a commitment to excellent user experience.

ABOUT YOU:

We are seeking a talented Data Engineer who can help implement robust data architecture solutions and provide critical input on architecture design. You will primarily work with AWS platforms and play a pivotal role in integrating and managing large-scale data systems. Your initial focus will involve seamlessly connecting our Google Analytics traffic data from Google Cloud / BigQuery into AWS, enabling comprehensive data analytics and business insights.

EDUCATION AND/OR EXPERIENCE:

  • Bachelor’s degree in Computer Science, Information Systems, Engineering, or related field.
  • 3+ years' experience as a Data Engineer or in a similar role.
  • Strong experience working with AWS technologies (e.g., Redshift, Glue, Lambda, S3, EMR).
  • Proficiency in Google Cloud Platform (GCP), specifically BigQuery.
  • Solid programming skills in SQL and Python or similar languages.
  • Demonstrable experience designing and implementing ETL/ELT processes.
  • Excellent problem-solving skills and the ability to work independently as well as part of a collaborative team.

SKILLS, KNOWLEDGE, AND ABILITIES:

  • Employ excellent interpersonal and verbal communication skills to communicate and collaborate with team members, organizational leaders, clients, and vendors effectively and tactfully
  • Work autonomously and proactively with minimal supervision, balancing multiple assignments and priorities
  • Robust analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy
  • Experience with JIRA, Azure DevOps, or another agile project tracking system
  • Passion to learn and work with multiple data sources to perform complex analyses
  • Natural curiosity and motivation to dig into the details

LANGUAGE SKILLS:

  • English proficiency (reading, writing, verbal)

RESPONSIBILITIES:

  • Implement and maintain scalable, reliable data pipelines to move and transform data between Google Cloud Platform (BigQuery) and AWS.
  • Manage and improve existing ETL processes feeding an AWS data lake, including establishing novel data pipelines from a variety of data sources.
  • Collaborate with architects and stakeholders to design and refine data architecture.
  • Proactively identify opportunities for improving data reliability, efficiency, and quality.
  • Ensure data compliance, security, and governance best practices are met.
  • Provide technical expertise and recommendations based on your previous experiences to continuously enhance our data infrastructure.
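The first responsibility above, moving data between BigQuery and AWS, often comes down to staging query results in S3 as newline-delimited JSON before a Redshift COPY or Glue crawl. The sketch below shows only the transform step; the field names are hypothetical (not the real Google Analytics schema), and the BigQuery and S3 clients are deliberately left out so the snippet stands alone:

```python
import json
from datetime import date

def rows_to_ndjson(rows, export_date):
    """Serialize BigQuery result rows (dicts) to newline-delimited JSON,
    the format commonly staged in S3 for a Redshift COPY or Glue crawler."""
    lines = []
    for row in rows:
        record = dict(row)
        # Stamp each record with a partition key for the data lake layout.
        record["export_date"] = export_date.isoformat()
        lines.append(json.dumps(record, sort_keys=True))
    return ("\n".join(lines) + "\n").encode("utf-8")

# Hypothetical session rows, as they might come back from a BigQuery query:
sample_rows = [
    {"session_id": "a1", "pageviews": 3, "country": "US"},
    {"session_id": "b2", "pageviews": 7, "country": "CA"},
]

payload = rows_to_ndjson(sample_rows, date(2025, 4, 26))
# In a real pipeline, this payload would be uploaded with boto3's
# s3_client.put_object(Bucket=..., Key=..., Body=payload).
```

Keeping the transform pure like this makes it easy to unit-test without cloud credentials; the BigQuery read and S3 write then reduce to thin client calls around it.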