Data Engineer - Tech Operations at Sigma Computing
San Francisco, California, USA - Full Time


Start Date

Immediate

Expiry Date

21 Nov 2025

Salary

$180,000

Posted On

21 Aug 2025

Experience

3+ years

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

SQL, Pipelines, Search, Machine Translation, Airflow, dbt

Industry

Information Technology/IT

Description

QUALIFICATIONS WE NEED

  • Strong experience working with APIs and building pipelines in cloud platforms (e.g., Snowflake, Databricks); a minimal extraction sketch follows this list
  • Expertise in SQL and dbt; fluency in at least one programming language (e.g., Python, R, Scala)
  • Experience implementing data governance frameworks that scale
  • 3+ years of experience in a Data Engineering role
  • Startup experience
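
To give a flavor of the API-to-pipeline work named in the first bullet, here is a minimal Python sketch of an extraction step. The endpoint, pagination scheme, file name, and function name are hypothetical illustrations, not Sigma's actual stack; the output is newline-delimited JSON that could then be staged and copied into a warehouse such as Snowflake or Databricks.

    import json
    import requests

    API_URL = "https://api.example.com/v1/events"  # hypothetical endpoint
    OUTPUT_PATH = "events.ndjson"                  # NDJSON file, ready for a warehouse stage

    def extract_events(page_size: int = 500) -> None:
        """Page through the API and write each record as one JSON line."""
        page = 1
        with open(OUTPUT_PATH, "w", encoding="utf-8") as out:
            while True:
                resp = requests.get(
                    API_URL,
                    params={"page": page, "per_page": page_size},
                    timeout=30,
                )
                resp.raise_for_status()
                records = resp.json()
                if not records:  # empty page means we've read everything
                    break
                for record in records:
                    out.write(json.dumps(record) + "\n")
                page += 1

    if __name__ == "__main__":
        extract_events()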

QUALIFICATIONS WE WANT (ALSO, SKILLS YOU’LL LEARN!)

  • A drive to continuously learn (and share those learnings) about the evolving data ecosystem
  • Experience with modern orchestrators (e.g., Astronomer, Airflow, Dagster); a minimal DAG sketch follows this list
  • Experience building scalable ML systems such as recommendation engines, search, or machine translation
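
As a sketch of the orchestrator experience mentioned above, a minimal Airflow DAG might look like the following. The DAG name, task, and schedule are illustrative only and assume Airflow 2.4+ (older versions use schedule_interval instead of schedule).

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def run_dbt_models() -> None:
        # Placeholder: in practice this might shell out to `dbt build`
        # or trigger a dbt Cloud job.
        print("running dbt models")

    with DAG(
        dag_id="daily_warehouse_refresh",  # illustrative name
        start_date=datetime(2025, 1, 1),
        schedule="@daily",                 # assumes Airflow 2.4+
        catchup=False,
    ) as dag:
        refresh = PythonOperator(
            task_id="run_dbt_models",
            python_callable=run_dbt_models,
        )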

ABOUT US:

Sigma is the only cloud analytics and business intelligence tool empowering business teams to break free from the confines of the dashboard, explore data for themselves, and make better, faster decisions. The award-winning software was built to capitalize on the performance power of cloud data warehouses to combine data sources and analyze billions of rows of data instantly via an intuitive, spreadsheet-like interface – no coding required.
Since launching with its unique interface, Sigma Computing has added features such as collaboration tools and embedded analytics capabilities. The most recent product launch included a set of AI tools such as forecasting capabilities, an AI copilot and a notebook interface for users who prefer a code-first environment.
Sigma announced $200M in Series D financing in May 2024 to continue transforming BI through its innovations in AI infrastructure, data application development, enterprise-wide collaboration, and business user adoption. Spark Capital and Avenir Growth Capital co-led the Series D funding round, with additional participation from a group of past investors including Snowflake Ventures and Sutter Hill Ventures. The Series D funding, raised at a valuation 60% higher than the company’s Series C round three years ago, promises to further accelerate Sigma’s growth.
Come join us!

How To Apply:

In case you would like to apply to this job directly from the source, please click here

Responsibilities

ABOUT THE ROLE

We’re hiring our first Data Engineer within Tech Operations at Sigma. In this role, you’ll build the data foundation that powers critical insights across Engineering and Tech Operations. You will architect, scale, and optimize data models and pipelines across Snowflake and Databricks, fueling everything from internal decision-making to external-facing environments.
Reporting into the engineering organization, you will take on a high-visibility, high-impact role with greenfield ownership and the opportunity to define how Engineering and Tech Operations leverage data at scale.

WHAT YOU WILL BE DOING

  • Design, build, and maintain core data models and visualizations in Sigma to support Engineering & Tech Operations initiatives, ensuring high data accuracy and usability.
  • Architect and manage our production data pipelines in Snowflake, as well as how they are consumed in Sigma (tech we use: Fivetran, dbt, Snowflake, Sigma, Hightouch, Metaplane); a minimal freshness-check sketch follows this list.
  • Build foundational data assets for Tech Operations, including Support insights and internal telemetry.
  • Create observability datasets from Sigma’s cloud infrastructure platforms (AWS, GCP, Azure).
  • Partner with our infrastructure engineering team to ensure high availability of all key data assets.
  • Build internal data products and enable self-service usage across Tech Operations.
  • Identify and execute high-impact data projects in ambiguous environments, working independently to define scope, set priorities, and deliver quickly.
  • Collaborate across Engineering, Product, and GTM teams to deliver on all of the above.
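
To illustrate the kind of pipeline and observability work listed above, here is a minimal freshness check against Snowflake using the snowflake-connector-python library. The warehouse, database, schema, table, and environment-variable names are placeholders, and a tool like Metaplane would normally cover this; the sketch only shows the shape of such a check.

    import os
    import snowflake.connector  # pip install snowflake-connector-python

    # Connection details are placeholders; real pipelines would pull
    # these from a secrets manager rather than raw environment variables.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH",  # hypothetical warehouse
        database="ANALYTICS",      # hypothetical database
        schema="TECH_OPS",         # hypothetical schema
    )

    # Freshness check on a hypothetical support-insights table.
    cur = conn.cursor()
    cur.execute(
        "SELECT DATEDIFF('hour', MAX(loaded_at), CURRENT_TIMESTAMP()) "
        "FROM support_tickets"
    )
    hours_stale = cur.fetchone()[0]
    if hours_stale is not None and hours_stale > 24:
        raise RuntimeError(f"support_tickets is {hours_stale} hours stale")
    conn.close()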