Senior Analytics Engineer at Intercom
Dublin, County Dublin, Ireland
Full Time


Start Date

Immediate

Expiry Date

05 Jul, 25

Salary

Not specified

Posted On

05 Apr, 25

Experience

0 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Data Modeling, SQL, Airflow, dbt, Snowflake, Python

Industry

Information Technology/IT

Description

Intercom is the AI Customer Service company on a mission to help businesses provide incredible customer experiences.
Our AI agent Fin, the most advanced customer service AI agent on the market, lets businesses deliver always-on, impeccable customer service and ultimately transform their customer experiences for the better. Fin can also be combined with our Helpdesk to become a complete solution called the Intercom Customer Service Suite, which provides AI-enhanced support for the more complex or high-touch queries that require a human agent.
Founded in 2011 and trusted by nearly 30,000 global businesses, Intercom is setting the new standard for customer service. Driven by our core values, we push boundaries, build with speed and intensity, and consistently deliver incredible value to our customers.

WHAT’S THE OPPORTUNITY?

The Product Analytics Engineering team at Intercom has two main responsibilities. We provide and maintain a critical set of high-quality core data models, predominantly in the Product domain. We also own various frameworks and dbt-related tooling.
The combined impact enables a large group of AI Engineers, Data Scientists, BI Creators and Analysts across multiple teams to self-serve, delivering valuable insights for a wide range of purposes. Our work creates the solid engineering foundations for measuring how healthy our customers are, accelerates GTM initiatives, helps the AI/ML team continuously improve Fin, Intercom’s AI agent, and enables our own Customer Service team to provide an even more delightful experience for our own customers.
We’re looking for a Senior Analytics Engineer who is passionate about making quality data available to our stakeholders to join us and collaborate on data-related initiatives.

WHAT SKILLS DO I NEED?

  • You have a very strong understanding of SQL, including a knack for query optimisation, and significant experience in data modeling and warehouse design.
  • You have strong professional experience with, or understanding of, the tools and technologies in our stack, such as Snowflake and dbt, or equivalent technologies.
  • You have years of full-time, professional work experience using a modern programming language on a daily basis, with a strong preference for Python.
  • You have worked with Apache Airflow - we use Airflow extensively to orchestrate and schedule all of our data workflows (see the sketch after this list for an illustration).
  • You have great relationships with your stakeholders, such as Analysts and Data Scientists.
  • You can demonstrate the impact of your work.
  • You care about your craft.
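
To make the Airflow, dbt and Python requirements concrete, here is a minimal, illustrative sketch of the kind of workflow this role involves. It is not Intercom’s actual code: the DAG name, dbt selector, and paths are hypothetical and used for illustration only.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    # Hypothetical daily refresh of a set of core product models built with dbt.
    with DAG(
        dag_id="dbt_core_product_models",  # hypothetical DAG name
        start_date=datetime(2025, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        # Build the (hypothetical) core_product models in the warehouse.
        run_models = BashOperator(
            task_id="dbt_run_core_product",
            bash_command="dbt run --select core_product --profiles-dir /opt/dbt",
        )
        # Run dbt tests against the freshly built models.
        test_models = BashOperator(
            task_id="dbt_test_core_product",
            bash_command="dbt test --select core_product --profiles-dir /opt/dbt",
        )
        run_models >> test_models

A schedule along these lines is the kind of thing that keeps core data models fresh so that Analysts, Data Scientists and BI Creators can self-serve without waiting on the Analytics Engineering team.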
Responsibilities

Please refer to the job description above for details.
