Lead Data Engineer at Dynatron Software
Remote (Oregon, USA)
Full Time


Start Date

Immediate

Expiry Date

20 Sep 2025

Salary

$170,000

Posted On

21 Jun 2025

Experience

5+ years

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Implementation Experience, Success Driven, Continuous Improvement, Relational Databases, Teamwork, Flexible Scheduling, Disability Insurance, Health, MySQL, Grit, Accountability

Industry

Information Technology/IT

Description

MUST HAVE (NON-NEGOTIABLE) SKILLS:

  • 5+ years of expert-level Snowflake design and implementation experience.
  • 5+ years of experience designing and implementing other data warehouse technologies, such as Databricks and Redshift.
  • 5+ years of AWS experience, including S3 and managed database services (Aurora, DynamoDB, RDS).
  • 5+ years of hands-on Python development experience.
  • 10 years of experience with MySQL or other SQL-based relational databases.
  • 10 years of experience with CI/CD pipelines and version control systems (e.g., Bitbucket) for managing codebases and deployments.
Responsibilities

ABOUT THE ROLE

We are seeking a true Snowflake data engineering expert who will hit the ground running. This role will be critical to the success of the Dynatron SaaS and DaaS product portfolio. Your focus will be ensuring our data pipelines, data warehouse, and overall data quality are rock-solid and will support the company’s strategic vision around data.

TO BE SUCCESSFUL IN THIS ROLE YOU MUST HAVE:

  • Both leadership and hands-on experience: you will be a thought leader in evaluating and recommending best practices, and hands-on in implementing them.
  • The ability to provide technical leadership and mentorship to a team of data engineers, fostering a culture of collaboration, innovation, and continuous learning.
  • The ability to challenge the current architecture based on your prior experience and justify your recommendations. Don’t just accept the status quo.
  • A desire to become an expert on all aspects of Dynatron data.
  • Extensive experience designing, implementing, and automating best-of-breed data pipelines using platforms and tools such as Snowflake, AWS, Python, Airbyte/Fivetran, Airflow, and stored procedures.
  • Extensive experience designing, modeling, and implementing data warehouses on the Snowflake platform. You MUST have expert-level Snowflake experience.
  • Extensive experience with approaches for ensuring data quality throughout the data pipelines and the data warehouse, and for addressing data quality issues in the current architecture.
  • A willingness to collaborate closely with cross-functional teams, including product owners, software engineers, data scientists, and business stakeholders, to understand data requirements and deliver solutions that meet evolving business needs.
  • A commitment to leading by example, demonstrating excellence, integrity, and professionalism in all aspects of your work.