Data Engineer II at PagerDuty
Santiago de Chile, Región Metropolitana, Chile
Full Time


Start Date

Immediate

Expiry Date

03 Jun, 25

Salary

0.0

Posted On

04 Mar, 25

Experience

3 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Data Modeling, Data Integration, Pipelines, Computer Science, Agile Environment, Metrics, Availability, Data Models

Industry

Information Technology/IT

Description

PagerDuty empowers teams of all kinds to do the critical work that moves business forward through the PagerDuty Operations Cloud.
Visit our careers site to explore life at PagerDuty, discover opportunities, and sign up for job alerts!
PagerDuty is growing and we are looking for an experienced Data Engineer for our Data team in IT to manage and contribute to the software and services that we provide to our users. As a Data Engineer at PagerDuty, you will help lead the team responsible for designing, building, deploying, and supporting solutions for teams across PagerDuty’s growing global user base. You are scrappy, independent, and excited about having a big impact on a small but growing team.
Together with the other members of the Data Platform team, you will have the opportunity to redefine how PagerDuty designs, builds, integrates, and maintains a growing set of software and SaaS solutions. In this role, you will work cross-functionally with business domain experts, analytics, and engineering teams to redesign and reimplement our Data Warehouse model(s).

ABOUT YOU: SKILLS AND ATTRIBUTES

  • You will design, implement and scale data pipelines that transform billions of records into actionable data models that enable data insights.
  • You will help lead initiatives to formalize data governance and management practices, and to rationalize our information lifecycle and key company metrics.
  • You will provide mentorship and hands-on technical support to build trusted and reliable domain-specific datasets and metrics.
  • You will have deep technical skills, be comfortable contributing to a nascent data ecosystem, and help build a strong data foundation for the company.
  • You will be a self-starter, detail and quality oriented, and passionate about having a huge impact at PagerDuty.

MINIMUM REQUIREMENTS

  • Bachelor’s degree in Computer Science, Engineering or related field, or equivalent training, fellowship, or work experience
  • 3+ years of experience working in data integration, pipelines, and data modeling
  • Experience designing and deploying code on data platforms in a cloud-based, Agile environment
  • Fluent English is required
  • Availability to work 2 days per month in our Santiago office
    PagerDuty is a flexible, hybrid workplace. We embrace and encourage in-person working as an integral part of our culture. Both our employees and external research tell us that co-located collaboration strengthens connections, drives innovation, and accelerates learning.
    For external applicants, including employee referrals, this role is expected to come into our Santiago office 2 times per month, so they can thrive in their new role and fully embrace being a Dutonian!

NOT SURE IF YOU QUALIFY?

Apply anyway! We extend opportunities to a broad array of candidates, including those with diverse workplace experiences and backgrounds. Whether you’re new to the corporate world, returning to work after a gap in employment, or simply looking to take the next step in your career path, we are excited to connect with you.

Responsibilities
  • Translate business requirements into data models that are easy to understand and used by different disciplines across the company. Design, implement, and maintain pipelines that deliver data with measurable quality within SLAs
  • Partner with business domain experts, data analysts, and engineering teams to build foundational data sets that are trusted, well understood, aligned with business strategy, and enable self-service
  • Champion the overall strategy for data governance, security, privacy, quality, and retention that satisfies business policies and requirements
  • Own and document foundational company metrics with a clear definition and data lineage
  • Identify, document, and promote best practices