Senior Data Engineer at Activate Talent
Argentina -
Full Time


Start Date

Immediate

Expiry Date

29 Jun, 26

Salary

0.0

Posted On

31 Mar, 26

Experience

5 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Snowflake, DBT, FiveTran, AWS, Python, SQL, ELT/ETL, Data Modeling, S3, Lambda, Glue, Athena, CloudWatch, Looker, Airflow, Dagster

Industry

Staffing and Recruiting

Description
Job Title: Senior Data Engineer
Type: Full-time; Remote
Schedule: US time zone
Industry: Fashion/Lifestyle

Job Overview

We are looking for a Senior Data Engineer to help build and scale our modern data platform. This hands-on role involves developing robust data pipelines, modeling data for analytics, and working closely with business and engineering teams. You will work with industry-leading tools such as Snowflake, DBT, FiveTran, and AWS services to create efficient and scalable data solutions.

Responsibilities

Design, build, and maintain scalable ELT/ETL pipelines (FiveTran, DBT, Python, Snowpipe) in Snowflake.
Integrate workflows using AWS services (S3, Lambda, Glue, Athena, CloudWatch).
Administer and optimize Snowflake (user roles, warehouse sizing, access, performance).
Develop and maintain dimensional/normalized data models in Snowflake.
Implement data transformation logic with DBT and SQL for accuracy and maintainability.
Transform raw data into reliable datasets for analytics and operations.
Collaborate with teams to understand data requirements and deliver solutions.
Ensure high data quality, consistency, and security.
Monitor and troubleshoot pipelines, implementing logging and alerting.
Participate in code reviews and documentation, and promote data engineering best practices.

Preferred Qualifications

Exposure to Shopify, NetSuite, or Segment.
Experience with Python and Looker or other BI/visualization tools.
Experience in e-commerce, SaaS, or financial analytics environments.
Familiarity with orchestration tools such as Airflow, Dagster, or similar frameworks.
B.S./M.S. in Computer Science or a related field.
5-7 years in software development, including 5+ in data/analytics engineering.
Expertise with Snowflake, DBT, FiveTran, and SQL in production.
Experience designing and managing AWS data pipelines.
Strong problem-solving, communication, and cross-functional skills.
Knowledge of data governance, performance tuning, and security.

