Senior Data Engineer at Sequoyah Technologies
Oklahoma City, OK 73012, USA
Full-time


Start Date

Immediate

Expiry Date

26 Jun, 25

Salary

0.0

Posted On

26 Mar, 25

Experience

5 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Integration, Apache Spark, Snowflake, Tableau, Python, Data Engineering, Visualization, dbt, Power BI, Code, Automation, Azure, Design Principles, Decision Making, Reporting, AWS, Apache Kafka, Computer Science, Pipeline Development

Industry

Information Technology/IT

Description

JOB OVERVIEW

We are seeking an experienced Senior Data Engineer to design, develop, and maintain advanced data pipelines and warehousing solutions. This role demands expert-level SQL skills, a strong grasp of data warehousing models, API and streaming data integration, and the ability to optimize complex systems for performance and scalability.

REQUIRED SKILLS AND QUALIFICATIONS:

  • Bachelor’s degree in Computer Science, Engineering, or a related field (or equivalent experience).
  • 5+ years of experience in data engineering or a similar role.
  • Advanced proficiency in Python for data pipeline development and automation.
  • Extensive experience with Apache Airflow for workflow orchestration.
  • Hands-on expertise with cloud platforms: GCP (especially BigQuery), AWS, and Azure.
  • Proficiency with data ingestion tools such as Apache Kafka, Fivetran, Stitch, or similar technologies.
  • Experience supporting reporting and visualization tools like Tableau, Power BI, or equivalent platforms.
  • Strong knowledge of data warehousing concepts, design principles, and modeling techniques.
  • Proven ability to integrate APIs and manage streaming data workflows.
  • Expert-level SQL skills, with a track record of writing complex queries and tuning for performance.
  • Exceptional problem-solving abilities and a detail-oriented mindset.
  • Ability to thrive in both independent and collaborative settings in a fast-paced environment.

PREFERRED QUALIFICATIONS:

  • Familiarity with additional tools like Apache Spark, Snowflake, or dbt.
  • Experience with CI/CD pipelines or Infrastructure-as-Code (e.g., Terraform).
  • Knowledge of machine learning pipelines or integration with data science workflows.
Join us in leveraging the power of data to drive innovation and improve decision-making across our organization. If you are ready to take on this exciting challenge as a Senior Data Engineer, we encourage you to apply!
Job Types: Full-time, Part-time
Pay: $95,231.00 - $140,768.00 per year

Application Question(s):

  • Are you currently located in the Oklahoma City Metro Area?
  • Do you currently hold an Oklahoma Driver’s License?

Location:

  • Oklahoma City, OK 73102 (Required)

Work Location: Hybrid remote in Oklahoma City, OK 7310

Responsibilities
  • Design, build, and maintain scalable data pipelines using Python, Apache Airflow, and ingestion tools like Apache Kafka, Fivetran, or Stitch.
  • Implement and optimize data warehousing solutions, applying knowledge of data modeling and design principles (e.g., star schema, snowflake schema).
  • Manage data workflows across cloud platforms including Google Cloud Platform (GCP), AWS, and Azure, with a focus on BigQuery for large-scale analytics.
  • Integrate APIs and streaming data sources into pipelines to support real-time data processing.
  • Write and optimize complex SQL queries for performance, ensuring efficient data retrieval and transformation.
  • Collaborate with stakeholders to deliver data solutions compatible with reporting tools like Tableau, Power BI, or similar platforms.
  • Troubleshoot and enhance data pipeline reliability, ensuring seamless ingestion, processing, and visualization of data.
  • Work closely with data scientists, analysts, and business teams to provide high-quality data for analytics and decision-making.
  • Stay ahead of industry trends, exploring new tools and techniques to elevate SEQTEK’s data capabilities.