Full Stack Developer at Erode AI
Charlottetown, PE C1A 4L1, Canada
Full Time


Start Date

Immediate

Expiry Date

14 Nov 2025

Salary

70,000

Posted On

14 Aug 2025

Experience

3 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

GitHub, ECS, GitLab, Kubernetes, Pair Programming, Git, ECR, User Experience

Industry

Information Technology/IT

Description

ABOUT ERODE AI

Erode AI is a climate-tech startup on a mission to revolutionize renewable energy forecasting and environmental monitoring with AI-driven models of weather and weather-derived outcomes. We build scalable, user-friendly SaaS tools for energy traders, hydropower operators, and agricultural partners, leveraging cutting-edge AI technologies.
Live and work remotely, or at one of our offices in Calgary, Alberta or Charlottetown, PEI. Preference will be given to non-remote candidates.
Calgary, Alberta - Consistently ranked among the world’s most livable cities, Calgary combines big-city amenities with instant Rocky Mountain escapes. Enjoy Canada’s sunniest skies, an acclaimed culinary scene, and endless outdoor adventures—from mountain biking and hiking to world-class skiing—all just a short drive away.
Charlottetown, Prince Edward Island - Experience the charm of Canada’s smallest provincial capital, where historic brick streets meet red-sand beaches and lively culinary festivals. Savour farm-to-table dining, stroll scenic waterfront trails, and immerse yourself in a close-knit community that offers a vibrant arts scene and coastal adventures.

QUALIFICATIONS & EXPERIENCE

Required:

  • 3+ years building full-stack applications with React and modern JavaScript/TypeScript.
  • Demonstrated ability to design, build, and deploy fully responsive platforms that deliver a seamless user experience on both mobile and desktop devices.
  • Strong Python skills, with experience in data and model pipelines (NumPy, pandas, PyTorch); a brief illustrative sketch follows this list.
  • Hands-on experience with AWS Amplify (or equivalent serverless frameworks) and core AWS/GCP services.
  • Proven track record working with large datasets and optimizing for performance.
  • Experience deploying containers (e.g. Docker, ECS, ECR).
  • Experience integrating AI/LLM tooling to boost developer productivity and build end-user features.
  • Proficient in Agile workflows, including iterative sprints, code reviews, and pair programming, and skilled at collaborating on large, version-controlled codebases with Git (GitHub, GitLab, etc.).
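
To give a concrete flavour of the Python requirement above, here is a minimal, hedged sketch of a pandas-to-PyTorch feature pipeline. The file name, column names, and preprocessing steps are illustrative assumptions, not details taken from this posting.

    # Minimal sketch: tabular weather features -> PyTorch tensor.
    # The CSV path and column names below are hypothetical examples.
    import numpy as np
    import pandas as pd
    import torch

    def load_features(csv_path: str) -> torch.Tensor:
        """Read tabular observations and return a float32 feature tensor."""
        df = pd.read_csv(csv_path, parse_dates=["timestamp"])
        df = df.sort_values("timestamp").ffill()  # simple gap handling
        features = df[["wind_speed", "temperature"]].to_numpy(dtype=np.float32)
        return torch.from_numpy(features)

    if __name__ == "__main__":
        x = load_features("observations.csv")  # hypothetical input file
        print(x.shape, x.dtype)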

Preferred:

  • Familiarity with weather-specific data formats and tooling (GRIB, xarray, Zarr); see the sketch after this list.
  • Experience orchestrating data scraping or ETL workflows (e.g., Airflow, Prefect).
  • MLOps background: Kubernetes/Cloud Run, CI/CD, Terraform/CloudFormation.
  • Real-world experience using the Model Context Protocol (MCP) specification.
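
As a loose illustration of the weather-format familiarity listed above, the sketch below opens the same kind of data in GRIB and Zarr form with xarray. The paths, the variable name (t2m), and the chunking are assumptions for illustration; reading GRIB needs the optional cfgrib engine, and the S3 path needs s3fs/fsspec installed.

    # Minimal sketch: reading weather data in GRIB and Zarr formats.
    # Paths and variable names are hypothetical placeholders.
    import xarray as xr

    # GRIB input (requires the optional cfgrib engine)
    ds_grib = xr.open_dataset("forecast.grib2", engine="cfgrib")

    # Zarr input, opened lazily with Dask-backed chunks
    ds_zarr = xr.open_zarr("s3://example-bucket/forecast.zarr", chunks={"time": 24})

    # A typical reduction: daily-mean 2 m temperature
    daily_t2m = ds_zarr["t2m"].resample(time="1D").mean()
    print(daily_t2m)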

AT ERODE AI, WE BELIEVE THAT GREAT IDEAS COME FROM EVERYWHERE—AND WE’RE BUILDING A TEAM THAT REFLECTS A BROAD RANGE OF EXPERIENCES AND PERSPECTIVES.

We recognize that some individuals with non-traditional backgrounds may hesitate to apply unless they meet every qualification. But if you’re passionate about our mission and excited to grow with us, we strongly encourage you to apply — even if your experience doesn’t align perfectly with the job description.

How To Apply:

If you would like to apply to this job directly from the source, please click here.

Responsibilities

ROLE SUMMARY

You'll play a key role in the full-stack development of our core platform—building everything from scalable, serverless APIs and resilient data pipelines to polished, responsive front-ends. You'll partner closely with the entire team to ingest and process massive weather and climate datasets, craft intuitive React interfaces, and fine-tune performance across AWS and GCP at scale. We're seeking driven, collaborative team players who are passionate about turning complex data into actionable insights that help tackle the global climate crisis and make a positive impact on the planet.

KEY RESPONSIBILITIES

  • Develop and maintain interactive web applications using React, JavaScript/TypeScript, and AWS Amplify.
  • Design and implement Python back-end services for data processing with Dask/Zarr and Lambda functions (a sketch follows this list).
  • Integrate AI-powered tooling (e.g., LLMs, MCP) to augment developer and user workflows.
  • Architect scalable cloud infrastructure in AWS/GCP (CloudFormation/Terraform).
  • Collaborate with ML engineers to productionize ETL models via Docker, Cloud Run, and CI/CD pipelines.
  • Ensure data ingestion and orchestration workflows are robust—bonus if you have experience building operational scraping or ETL tools.
  • Monitor performance, reliability, and cost across large-scale datasets and distributed systems.
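
As a rough sketch of the serverless data-processing work described in the second bullet above, here is a minimal AWS Lambda-style handler that reduces one variable of a Zarr store with xarray. The bucket, dataset layout, variable name, and defaults are hypothetical, and a production handler would add validation, error handling, and logging.

    # Minimal sketch: Lambda-style handler computing a mean over a Zarr store.
    # All names below (bucket, variable) are hypothetical placeholders.
    import json
    import xarray as xr

    def handler(event, context):
        store = event.get("zarr_uri", "s3://example-bucket/weather.zarr")
        variable = event.get("variable", "wind_speed")

        ds = xr.open_zarr(store)                      # lazy, Dask-backed open
        value = float(ds[variable].mean().compute())  # triggers the actual read

        return {
            "statusCode": 200,
            "body": json.dumps({"variable": variable, "mean": value}),
        }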