Senior Software Engineer, Data

at Interos Inc

Remote, Oregon, USA

Start Date: Immediate
Expiry Date: 06 Feb, 2025
Salary: USD 160,000 Annual
Posted On: 06 Nov, 2024
Experience: 2 year(s) or above
Skills: Argo, Software Development, Airflow, Code, IT, SQL, Snowflake, Data Streaming, Computer Science, Agile Environment, Production Experience, Communication Skills, Data Processing, Kafka, AWS
Telecommute: No
Sponsor Visa: No

Description:


About Interos
Interos is the supply chain risk intelligence company – building the most trusted and transparent supply chains in the world. Our pioneering discovery and monitoring intelligence spans the lifecycle of supply chain risk, enabling faster and more informed threat mitigation. As the world’s first and only automated supplier intelligence platform, we continuously map and monitor extended supply chains at speed and scale to protect organizations from regulatory fines, unethical labor, cyber-attacks, and other systemic vulnerabilities. Interos serves a variety of commercial, government, and public sector customers around the world, including a host of Global Fortune 500 companies and members of the Five Eyes nations. www.interos.ai.

MINIMUM QUALIFICATIONS:

  • 5+ years of experience in Software Development.
  • 3+ years of full-time professional Python experience including production experience with data pipelines.
  • 3+ years of experience with SQL via relational or columnar databases, especially Postgres and/or Snowflake.
  • 2+ years of experience working with Snowflake.
  • 2+ years of experience developing in AWS.
  • 2+ years of experience in data streaming or event-driven systems with Kafka or another stream processing system.
  • Bachelor’s degree.

PREFERRED QUALIFICATIONS:

  • Experience with distributed data processing, using tools like PySpark, Dask, etc.
  • Experience with job or pipeline orchestration, using tools like Prefect, Airflow, Argo, etc.
  • Passion for writing code that is scalable, maintainable, reusable, and well-tested.
  • Interest in building reliable, fault-tolerant, production-grade systems.
  • Comfort debugging large or complex applications and, when necessary, guiding their refactoring.
  • Experience designing, implementing, and maintaining APIs as a service for your team and customers.
  • Excellent communication skills, with the ability to convey technical concepts to both technical and non-technical stakeholders.
  • Experience developing data software in an agile environment.
  • Enjoyment of optimizing complex processes and leaving things better than you found them.
  • A seasoned engineer who enjoys sharing experience with the team.
  • Bachelor’s degree in Computer Science or closely related field or a foreign equivalent.

Responsibilities:

  • Build and share knowledge of the data flow throughout the Resilience platform.
  • Optimize the storage, processing, and movement of data, such as tuning data models or refactoring data software.
  • Develop, document, and maintain new data platform functionality.
  • Contribute to software and data architecture reviews.
  • Provide support for other engineers and internal customers in matters of data or software.
  • Implement and enforce best practices for code quality, testing, and documentation.
  • Improve or develop frameworks, packages, and/or documentation to support engineering standards.
  • Conduct code reviews to ensure adherence to coding standards and promote knowledge sharing within the team.


REQUIREMENT SUMMARY

Min: 2.0 year(s), Max: 3.0 year(s)

Information Technology/IT

IT Software - System Programming

Software Engineering

Graduate

Proficient

1

Remote, USA