Data Engineer

at TradeStation

Remote, Costa Rica

Start Date: Immediate
Expiry Date: 12 Oct, 2024
Salary: Not Specified
Posted On: 13 Jul, 2024
Experience: 2 year(s) or above
Skills: B2, Git, Data Engineering, Pandas, Data Processing, Amazon Web Services, Transformation, Data Loading, SQL, Communication Skills, Python, Big Data, Pipelines, English, Azure, Data Warehousing, Kubernetes, AWS, Computer Science
Telecommute: No
Sponsor Visa: No

Description:

#WEARETRADESTATION

Who We Are:
TradeStation is an online brokerage firm seeking to level the playing field for self-directed investors and traders, empowering them to claim their individual financial edge. At TradeStation, we’re continuously pushing the boundaries of what’s possible, encouraging out-of-the-box thinking and a relentless search for innovation.

What We Are Looking For:

We are seeking a Data Engineer to join the Enterprise Analytics team. The team helps TradeStation extract valuable business insights from raw data scattered across dozens of silos and teams. Our Data Engineers ensure that our data is fresh, accurate, and meaningful to our business stakeholders.
The ideal candidate is a lifelong learner and self-starter who is constantly looking for new ways to solve old problems. Working with big data, you will find creative solutions that improve our efficiency and speed, reduce our costs, and can be developed one Agile sprint at a time. This position will be instrumental in migrating from legacy data pipelines to our new lakehouse architecture.

What You’ll Be Doing:

  • Assist in building a modern lakehouse data warehouse
  • Create data pipelines using DevOps practices to ensure code is continuously tested, performant, and delivered
  • Assist in migrating legacy SSIS/SSRS-based ETL jobs to our new Databricks/ADF-based ELT architecture
  • Write Python and SQL solutions for data transformation, applying test-driven development principles and automated data quality controls (see the illustrative sketch after this list)
  • Process structured and unstructured data from SQL databases, JSON, Parquet, and CSV files, and REST APIs
  • Assist in building data pipelines with scalability and efficiency in mind
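
As a purely illustrative sketch of the transformation and data-quality work described above: the snippet below extracts raw CSV, applies a simple transformation, runs an automated quality check, and loads the result into a Delta table. All paths, table names, and columns are hypothetical rather than TradeStation's actual schema, and it assumes a Spark environment with the Delta Lake library configured.

    # Illustrative only: hypothetical paths and columns; assumes Spark + Delta Lake.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("elt-sketch").getOrCreate()

    # Extract: load raw CSV files (header row; schema kept simple for the sketch).
    raw = spark.read.option("header", True).csv("s3://raw-bucket/trades/*.csv")

    # Transform: normalize types and derive a notional value per trade.
    trades = (
        raw.withColumn("qty", F.col("qty").cast("long"))
           .withColumn("price", F.col("price").cast("double"))
           .withColumn("notional", F.col("qty") * F.col("price"))
    )

    # Automated data quality control: fail fast on missing keys or negative prices.
    bad_rows = trades.filter(F.col("trade_id").isNull() | (F.col("price") < 0)).count()
    assert bad_rows == 0, f"data quality check failed: {bad_rows} bad row(s)"

    # Load: append to a Delta table for downstream analytics.
    trades.write.format("delta").mode("append").saveAsTable("analytics.trades")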

The Skills You Bring:

  • Excellent communication skills, with English at B2 level or above
  • Experience with Python, Pandas, PySpark, MLflow, and Delta Lake
  • Experience with Amazon Web Services and/or Azure clouds
  • Experience with SSIS, SSRS, and Azure Data Factory
  • Experience with Databricks and Synapse
  • Experience with Git, Azure DevOps/Jenkins/GitLab, and Kubernetes
  • Experience building data processing systems and pipelines on Databricks, APIs, and AWS services
  • Ability to build the infrastructure required for optimal extraction, transformation, and data loading from various sources (a minimal example follows this list)
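
Similarly, a minimal example of extraction and loading from one such source, a REST API, in Python: the endpoint and field names below are invented for illustration, and it assumes the requests and pandas packages are installed (with pyarrow for Parquet support).

    # Illustrative only: hypothetical endpoint and fields; assumes requests,
    # pandas, and pyarrow are installed.
    import requests
    import pandas as pd

    # Extract: fetch a page of records from a (hypothetical) REST endpoint.
    resp = requests.get("https://api.example.com/v1/orders",
                        params={"limit": 1000}, timeout=30)
    resp.raise_for_status()
    records = resp.json()["results"]

    # Transform: flatten nested JSON and stamp the ingestion time.
    df = pd.json_normalize(records)
    df["ingested_at"] = pd.Timestamp.now(tz="UTC")

    # Load: write Parquet for downstream pipeline stages.
    df.to_parquet("orders.parquet", index=False)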

Minimum Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
  • 2+ years of experience in data engineering, big data, and data warehousing

Desired Qualifications:

  • Experience in Python and SQL
  • Experience with cloud computing platforms such as AWS and Azure
  • Knowledge of data storage and data processing
  • Experience with data pipeline and workflow management tools
  • Knowledge of Databricks

What We Offer:

  • Collaborative work environment
  • Competitive Salaries
  • Yearly bonus
  • Comprehensive benefits for you and your family starting Day 1
  • Unlimited Paid Time Off
  • Flexible working environment
  • TradeStation Account employee benefits, as well as full access to trading education materials

#LI-Remote

REQUIREMENT SUMMARY

Experience: Min 2.0, Max 7.0 year(s)
Industry: Information Technology/IT
Category: IT Software - DBA / Datawarehousing
Specialization: Software Engineering
Education: Graduate (Computer Science, Engineering)
Proficiency: Proficient
Openings: 1
Location: Remote, Costa Rica