Data Platform Engineer

at Amach

Dublin, County Dublin, Ireland

Start Date: Immediate
Expiry Date: 23 Apr, 2025
Salary: Not Specified
Posted On: 23 Jan, 2025
Experience: N/A
Skills: Airflow, Scala, Version Control, Spark, Data Warehousing, Tableau, SQL, Snowflake, Data Solutions, Jenkins, Programming Languages, Python, GitHub, Netezza, AWS
Telecommute: No
Sponsor Visa: No

Description:

ABOUT US:

Amach is an industry-leading, technology-driven company with headquarters in Dublin and remote teams across the UK and Europe.
Our blended teams of local and nearshore talent are optimised to deliver high-quality, collaborative solutions.
Established in 2013, we specialise in cloud migration and development, and digital transformation including agile software development, DevOps, automation, data and machine learning…
We are seeking an experienced Data Platform Engineer to join our team in Dublin. This role offers the chance to shape and support our data architecture, working with cutting-edge cloud technologies and driving the success of our data-driven projects. The ideal candidate will have a strong background in Databricks, Snowflake and AWS, and be proficient in MLOps to support seamless deployment and scaling of machine learning models. You’ll play a critical role in our mission to enhance data accessibility, streamline data sourcing pipelines and optimise performance for large-scale data solutions.

REQUIRED EXPERIENCE:

  • Experience in Data Platform Engineering: Proven track record in architecting and delivering large-scale cloud-native data solutions
  • Proficiency in Databricks and Snowflake: Strong skills in data warehousing and lakehouse technologies, with hands-on experience in Databricks, Spark, PySpark and Delta Lake
  • MLOps Expertise: Experience with MLOps practices, ideally with MLflow for model management and deployment
  • Cloud Platforms: Knowledge of AWS, with additional experience in Azure beneficial for multi-cloud environments
  • Programming Languages: Strong coding skills in Python, SQL and Scala
  • Tooling Knowledge: Experience with version control (GitHub), CI/CD pipelines (Azure DevOps, GitHub Actions), data orchestration tools (Airflow, Jenkins) and dashboarding tools (Tableau, Alteryx)

Desirable Skills:

  • Experience with Synapse Analytics, Netezza and legacy data systems
  • Familiarity with data governance tools and best practices
  • Strong problem-solving skills, with an ability to work both independently and as part of a cross-functional team

NOT FOR YOU?

Check out all of our open positions on our careers page and follow us on LinkedIn for future opportunities.
P.S. Share this with friends and co-workers! Don’t be afraid they’ll steal it from you; if you’re amazing and smart, we’ll find a role for you. We are growing fast and we are always looking for talented people.
At Amach, we strive to be an inclusive community of open-minded individuals with different backgrounds and we are committed to fostering, cultivating and preserving a culture of diversity, equity and inclusion. We strongly believe that a diversity of experience and background is essential to create a fulfilling environment and better solutions for our people and our customers. All Amach employees and contractors are expected to honour this policy and act to ensure that every individual is respected in the workplace.

Responsibilities:

  • Architect and Implement Cloud-Native Data Solutions: Design and develop scalable data platforms, focusing on a cloud-native approach, data mesh architectures, and seamless integration across multiple data sources
  • MLOps Pipeline Development: Build and maintain MLOps pipelines using tools like MLflow, ensuring efficient and reliable deployment of machine learning models to production environments
  • Data Governance and Quality Management: Create and enforce data governance standards, ensuring robust data quality and compliance through tools such as Databricks Unity Catalog
  • Data Integration & Migration: Lead migration projects from legacy data platforms to modern cloud solutions, optimising cost and operational efficiency
  • Performance Tuning and Optimisation: Leverage tools such as Snowflake and Delta Lake to improve data accessibility, reliability and performance, delivering high-quality data products that adhere to best practices


REQUIREMENT SUMMARY

Experience: Min N/A, Max 5.0 year(s)
Information Technology/IT
IT Software - DBA / Datawarehousing
Software Engineering
Graduate
Proficient
1
Dublin, County Dublin, Ireland