Sr. Data Engineer

at TradeStation

Remote, Costa Rica

Start Date: Immediate
Expiry Date: 27 Sep, 2024
Salary: Not Specified
Posted On: 28 Jun, 2024
Experience: 2 year(s) or above
Skills: Pipelines, Communication Skills, Amazon Web Services, Kubernetes, Data Engineering, Big Data, Architects, Data Loading, Azure, SQL, English, Data Processing, B2, Pandas, Developers, Git, Data Warehousing, Python, Transformation, Computer Science
Telecommute: No
Sponsor Visa: No

Description:

#WEARETRADESTATION

Who We Are:
TradeStation is an online brokerage firm seeking to level the playing field for self-directed investors and traders, empowering them to claim their individual financial edge. At TradeStation, we’re continuously pushing the boundaries of what’s possible, encouraging out-of-the-box thinking and a relentless search for innovation.

WHAT WE ARE LOOKING FOR:

We are seeking a Senior Data Engineer to join the Enterprise Analytics team. The team helps TradeStation extract valuable business insights from raw data scattered across dozens of silos and teams. Our Data Engineers ensure that our data is fresh, accurate, and meaningful to our business stakeholders.
The ideal candidate is a lifelong learner and self-starter who is constantly looking for new ways to solve old problems. Working with big data, you will find creative solutions that improve our efficiency, speed, and cost, and develop them one Agile sprint at a time. This position will be instrumental in migrating our legacy data pipelines to the new lakehouse architecture.

What You’ll Be Doing:

  • Help us build a modern lakehouse data warehouse
  • Create data pipelines using DevOps practices to ensure code is constantly tested, performant, and delivered
  • Migrate old SSIS/SSRS-based ETL jobs to our new Databricks/ADF-based ELT architecture
  • Work independently and with the business stakeholders to understand their requirements and drive business outcomes
  • Write Python and SQL solutions for data transformation, applying test-driven development principles and automated data quality controls
  • Process structured and semi-structured data from SQL databases, JSON, Parquet, and CSV files, or REST APIs (see the ingestion sketch after this list)
  • Provide engineering solutions, design, and build data pipelines while considering scalability and efficiency
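
To give a flavor of the ELT work described above, here is a minimal PySpark sketch that lands raw JSON files in a Delta table for later refinement. It assumes a Databricks-style runtime with Delta Lake available; the bucket path and table name (s3://example-bucket/landing/trades/, bronze.trades_raw) are hypothetical placeholders, not actual TradeStation resources.

    # Minimal ELT ingestion sketch: land raw source files in a Delta table.
    # All paths and table names are hypothetical placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("raw-ingest").getOrCreate()

    # Extract: read the semi-structured source as-is.
    raw = spark.read.json("s3://example-bucket/landing/trades/")

    # Load first, transform later (ELT): tag each row with lineage columns
    # and append to a bronze Delta table for downstream refinement.
    (raw
        .withColumn("_ingested_at", F.current_timestamp())
        .withColumn("_source_file", F.input_file_name())
        .write
        .format("delta")
        .mode("append")
        .saveAsTable("bronze.trades_raw"))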

The Skills You Bring:

  • Excellent communication skills with English at B2+ level
  • Experience with Python, Pandas, PySpark, MLflow, and Delta Lake (a test-driven transformation sketch follows this list)
  • Experience with Amazon Web Services and/or Azure clouds
  • Experience with SSIS, SSRS, and Azure Data Factory
  • Experience with Databricks and Synapse
  • Experience with Git, Azure DevOps/Jenkins/Gitlab, Kubernetes
  • Ability to design efficient and scalable data processing systems and pipelines on Databricks, APIs, and AWS Services
  • Ability to build the infrastructure required for optimal extraction, transformation, and data loading from various sources
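
As an illustration of the test-driven transformations and automated data quality controls mentioned above, here is a minimal Pandas sketch; the normalize_trades function and its column names are hypothetical examples, and the test is written to run under pytest.

    # Minimal test-driven data-quality sketch; names are hypothetical.
    import pandas as pd

    def normalize_trades(df: pd.DataFrame) -> pd.DataFrame:
        """Drop rows missing the required key and enforce column types."""
        out = df.dropna(subset=["trade_id"]).copy()
        out["qty"] = out["qty"].astype("int64")
        return out

    def test_normalize_trades_enforces_quality():
        raw = pd.DataFrame({"trade_id": ["a1", None], "qty": ["10", "5"]})
        clean = normalize_trades(raw)
        assert clean["trade_id"].notna().all()   # required key present
        assert clean["qty"].dtype == "int64"     # types enforced
        assert len(clean) == 1                   # invalid row dropped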

Minimum Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
  • 2+ years of experience in data engineering, big data, and data warehousing
  • Experience coordinating between multiple teams, such as Architects, Business Analysts, Scrum Masters, and Developers, to get technical clarity leading to the design, development, and implementation of business solutions

Desired Qualifications:

  • Proficient in Python and SQL
  • Strong experience with cloud computing platforms such as AWS and Azure
  • Knowledge of data storage and data processing
  • Experience with data pipeline and workflow management tools
  • AWS Certified Cloud Practitioner
  • Databricks Certified Data Engineer Professional

What We Offer:

  • Collaborative work environment
  • Competitive Salaries
  • Yearly bonus
  • Comprehensive benefits for you and your family starting Day 1
  • Unlimited Paid Time Off
  • Flexible working environment
  • TradeStation Account employee benefits, as well as full access to trading education materials

REQUIREMENT SUMMARY

Experience: Min: 2.0, Max: 7.0 year(s)
Industry: Information Technology/IT
Category: IT Software - DBA / Datawarehousing
Specialization: Software Engineering
Education: Graduate (Computer Science, Engineering)
English Proficiency: Proficient
Openings: 1
Location: Remote, Costa Rica