Middle/Senior Data Engineer (SP)

at Sigma Software

Lisboa, Área Metropolitana de Lisboa, Portugal

Start Date: Immediate
Expiry Date: 31 Oct, 2024
Salary: Not Specified
Posted On: 07 Aug, 2024
Experience: N/A
Skills: Communication Skills, Data Processing, Software, Code Design, Testing, Looker, Git, Snowflake, Data Modeling, PRs, Build Tools, Python, Spark
Telecommute: No
Sponsor Visa: No

Description:

Company Description
Spendesk is a 7-in-1 spending solution built for finance teams to make faster, smarter spending decisions. Founded in 2016, Spendesk is now one of the fastest-growing fintechs in Europe, with over 4,000 customers and an international team of 500+ employees based in Paris, Berlin, London, Hamburg, and remote. We’ve raised over €260M from leading investors and been named a French tech unicorn. And we’re not stopping there!
We would be glad to welcome a Data Engineering expert with an entrepreneurial, performance-driven mindset and strong experience in implementing and delivering data analytics solutions to join us as a Data Engineer.
We offer you a flexible and dynamic environment with opportunities to go beyond your comfort zone to grow personally and professionally.

PROJECT

Spendesk Financial Services (SFS) is a payments institution that offers a platform for embedding financial services into the Spendesk product. SFS offers capabilities such as account management, KYC, card payments, wire transfers, etc. SFS projects can be divided into three major categories:

  • To develop our core banking system to replace the legacy solution
  • To migrate clients from legacy to the new core banking system
  • To extend the core banking system with better payment capabilities

Job Description

REQUIREMENTS

  • 3+ years of strong experience with Python for building data pipelines and related tooling (see the illustrative sketch after this list)
  • Familiarity with distributed data processing in Spark, including optimizing data pipelines and monitoring workloads
  • Experience with Snowflake
  • Proven track record of building data transformations in a data build tool such as dbt
  • Excellent command of data modeling and data warehousing best practices
  • Developer-level proficiency with Looker
  • Strong data domain background: knowledge of how data engineers, data scientists, analytics engineers, and analysts work, so that you can collaborate closely with them and understand their needs
  • Good written and spoken English communication skills
  • Software engineering best practices: Testing, PRs, Git, code reviews, code design, releasing
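
As a rough, hypothetical illustration of the Python/Spark pipeline work described in the requirements above, the sketch below aggregates daily payment volumes per account. All paths, table names, and columns are assumptions for illustration only, not details of the actual project.

from pyspark.sql import SparkSession, functions as F

def build_daily_payment_totals(spark: SparkSession, source_path: str, target_path: str) -> None:
    # Read raw payment events (hypothetical schema: account_id, created_at, amount_eur).
    payments = spark.read.parquet(source_path)
    daily_totals = (
        payments
        .withColumn("payment_date", F.to_date("created_at"))
        .groupBy("account_id", "payment_date")
        .agg(
            F.count("*").alias("payment_count"),
            F.sum("amount_eur").alias("total_amount_eur"),
        )
    )
    # Overwrite the date-partitioned dataset consumed by downstream models.
    daily_totals.write.mode("overwrite").partitionBy("payment_date").parquet(target_path)

if __name__ == "__main__":
    spark = SparkSession.builder.appName("daily_payment_totals").getOrCreate()
    build_daily_payment_totals(
        spark,
        "s3://example-raw/payments/",          # hypothetical source path
        "s3://example-analytics/daily_totals/" # hypothetical target path
    )
    spark.stop()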

Responsibilities:

  • Contributing to investigations of new technologies and the design of complex solutions, supporting a culture of innovation with security, scalability, and reliability in mind, and focusing on building out our ETL processes
  • Working with a modern data stack, producing well-designed technical solutions and robust code, and implementing data governance processes
  • Working and communicating professionally with the customer’s team
  • Taking responsibility for delivering major solution features
  • Participating in the requirements gathering and clarification process, proposing optimal architecture strategies, and leading the data architecture implementation
  • Developing core modules and functions, designing scalable and cost-effective solutions
  • Performing code reviews and writing unit and integration tests (a minimal example follows this list)
  • Scaling the distributed system and infrastructure to the next level
  • Building the data platform using AWS cloud services
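
As a small, hypothetical sketch of the unit testing practice mentioned above, the snippet below tests a simple, illustrative helper of the kind a pipeline might contain; the function and its behaviour are assumptions, not part of the actual codebase.

import pytest

def cents_to_major_units(amount_cents: int) -> float:
    # Illustrative helper: convert an integer amount in cents to a major-unit value.
    if amount_cents < 0:
        raise ValueError("amount_cents must be non-negative")
    return round(amount_cents / 100, 2)

def test_cents_to_major_units_converts_values():
    assert cents_to_major_units(1999) == 19.99

def test_cents_to_major_units_rejects_negative_values():
    with pytest.raises(ValueError):
        cents_to_major_units(-1)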


REQUIREMENT SUMMARY

Experience: Min N/A, Max 5.0 year(s)
Industry: Information Technology/IT
Category: IT Software - DBA / Data Warehousing
Role: Software Engineering
Education: Graduate
Proficiency: Proficient
Openings: 1
Location: Lisboa, Portugal