Data Engineer at Tavily
Tel Aviv, Tel-Aviv District, Israel
Full Time


Start Date

Immediate

Expiry Date

04 Apr, 26

Salary

0.0

Posted On

04 Jan, 26

Experience

2 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Python, SQL, NoSQL, AWS, Data Pipelines, Data Architecture, ETL, ELT, Data Quality, Data Integrity, Data Security, Mongo, Snowflake, Redis, Airflow, Docker, Kubernetes

Industry

Technology; Information and Internet

Description
About Tavily - Search API

Tavily is a cutting-edge company focused on providing a powerful and efficient Search API. We empower developers and businesses by delivering high-quality, relevant search results quickly and reliably. Join our team to help build the data infrastructure that supports and scales our core product.

The Role

We are seeking a highly motivated and experienced Data Engineer to join our growing team. You will be responsible for designing, constructing, installing, testing, and maintaining highly scalable data management systems. You will work closely with our engineering and DevOps teams to build and optimize the data pipelines that are crucial to the performance and accuracy of our Search API.

Responsibilities

- Design, build, and maintain efficient and reliable ETL/ELT pipelines for data warehousing.
- Develop and improve our data infrastructure.
- Ensure data quality, integrity, and security across all data platforms.
- Optimize data systems for performance and scalability.
- Troubleshoot and resolve issues related to data pipelines and data infrastructure.
- Collaborate with cross-functional teams to understand data needs and deliver solutions.

Minimum Qualifications

- A degree in Computer Science, Statistics, Engineering, or a related quantitative field.
- 3+ years of professional experience as a Data Engineer or in a similar role focused on data infrastructure.
- Proficiency in Python.
- Solid experience with relational (SQL) and NoSQL databases.
- Experience with AWS and its data services.
- Proven experience in building and optimizing data pipelines and data architectures.

Preferred Qualifications

- Experience with Mongo, Snowflake, Redis, S3 General, or S3 Express.
- Experience with big data technologies.
- Experience with Airflow.
- Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes).
- Experience working in a company focused on API services or search technology.
- Knowledge of data governance and data security best practices.
Responsibilities
The Data Engineer will design, build, and maintain efficient ETL/ELT pipelines for data warehousing and improve the data infrastructure. They will also ensure data quality and troubleshoot issues across data pipelines and platforms.