Senior Data Engineer at Interos Inc
Remote, Oregon, USA
Full Time


Start Date

Immediate

Expiry Date

10 Jul, 25

Salary

$160,000

Posted On

11 Apr, 25

Experience

2+ years

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Airflow, AWS, Data Streaming, Software Development, IT, Data Processing, Agile Environment, Argo, SQL, Kafka, Communication Skills, Snowflake, Computer Science, Code, Production Experience

Industry

Information Technology (IT)

Description

SENIOR DATA ENGINEER

About Interos
Interos is the supply chain risk intelligence company – building the most trusted and transparent supply chains in the world. Our pioneering discovery and monitoring intelligence spans the lifecycle of supply chain risk, enabling faster and more informed threat mitigation. As the world’s first and only automated supplier intelligence platform, we continuously map and monitor extended supply chains at speed and scale to protect organizations from regulatory fines, unethical labor, cyber-attacks, and other systemic vulnerabilities. Interos serves a variety of commercial, government, and public sector customers around the world, including a host of Global Fortune 500 companies and members of the Five Eyes nations. www.interos.ai.

MINIMUM QUALIFICATIONS:

  • 5+ years of experience in Software Development.
  • 3+ years of full-time professional Python experience including production experience with data pipelines.
  • 3+ years of experience with SQL in relational or columnar databases, especially Postgres and/or Snowflake.
  • 2+ years of experience working with Snowflake.
  • 2+ years of experience developing in AWS.
  • 2+ years of experience in data streaming or event-driven systems with Kafka or another stream processing system.
  • Bachelor’s degree.

PREFERRED QUALIFICATIONS:

  • Experience with distributed data processing, using tools like PySpark, Dask, etc.
  • Experience with job or pipeline orchestration, using tools like Prefect, Airflow, Argo, etc.
  • Passionate about writing code that is scalable, maintainable, reusable, and well-tested.
  • Interested in building reliable, fault-tolerant, production-grade systems.
  • Comfortable debugging large or complex applications and, when necessary, guiding their refactoring.
  • Experience designing, implementing, and maintaining APIs as a service for your team and customers.
  • Excellent communication skills with the ability to convey technical concepts to both technical and non-technical stakeholders.
  • Experience developing data software in an agile environment.
  • Enjoy optimizing complex processes and leaving things better than you found them.
  • A seasoned engineer who enjoys sharing their experience with the team.
  • Bachelor’s degree in Computer Science, a closely related field, or a foreign equivalent.

Responsibilities

  • Build and share knowledge of the data flow throughout the Resilience platform.
  • Optimize the storage, processing, and movement of data, such as by tuning data models or refactoring data software.
  • Develop, document, and maintain new data platform functionality.
  • Contribute to software and data architecture reviews.
  • Provide support to other engineers and internal customers on matters of data or software.
  • Implement and enforce best practices for code quality, testing, and documentation.
  • Improve or develop frameworks, packages, and/or documentation to support engineering standards.
  • Conduct code reviews to ensure adherence to coding standards and promote knowledge sharing within the team.