Data Engineer, W2 only, 9+ years at TechnoKraft
Princeton, NJ 08540, USA
Full Time


Start Date

Immediate

Expiry Date

27 Jun, 25

Salary

$123,105 per year

Posted On

27 Mar, 25

Experience

5 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Machine Learning, Docker, Computer Science, Architecture, Decision Making, Unity Catalog, Airflow, Information Systems, Kubernetes, Python, AWS, Business Intelligence, Big Data, Scala, RDBMS

Industry

Information Technology/IT

Description

EDUCATION AND EXPERIENCE QUALIFICATION

  • Bachelor’s degree in Computer Science, Information Systems, or equivalent education or work experience
  • 5+ years of experience as a developer working with cloud technologies
  • Any AWS and/or Databricks certification will be a plus

SKILL SETS REQUIRED

  • Good decision-making and problem-solving skills
  • Solid understanding of Databricks fundamentals and architecture, with hands-on experience setting up Databricks clusters and working across Databricks modules (Data Engineering, ML, and SQL Warehouse)
  • Knowledge of the medallion architecture, Delta Live Tables (DLT), and Unity Catalog within Databricks (see the illustrative sketch after this list)
  • Experience in migrating data from on-prem Hadoop to Databricks/AWS
  • Understanding of core AWS services, their uses, and AWS architecture best practices
  • Hands-on experience across domains such as database architecture, business intelligence, machine learning, advanced analytics, and big data
  • Solid knowledge of Airflow
  • Solid knowledge of CI/CD pipelines built on AWS technologies
  • Experience with application migration of RDBMS workloads, Java/Python applications, model code, Elastic, etc.
  • Solid programming background in Scala and Python
  • Experience with Docker and Kubernetes is a plus
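
For illustration only, the sketch below shows the kind of bronze-to-silver Delta step implied by the medallion-architecture requirement above. It assumes a Databricks cluster with PySpark and Delta Lake available; the S3 path, table names, and columns are hypothetical and not taken from this posting.

```python
# Illustrative sketch only; not part of the posting. Assumes a Databricks
# cluster with PySpark and Delta Lake; the S3 path, table names, and columns
# below are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: land the raw JSON events as-is in a Delta table.
raw = spark.read.json("s3://example-bucket/raw/events/")  # hypothetical path
raw.write.format("delta").mode("append").saveAsTable("bronze.events")

# Silver: cleanse and deduplicate the bronze data for downstream consumers.
silver = (
    spark.table("bronze.events")
    .where(F.col("event_id").isNotNull())  # hypothetical column
    .dropDuplicates(["event_id"])
    .withColumn("ingested_at", F.current_timestamp())
)
silver.write.format("delta").mode("overwrite").saveAsTable("silver.events")
```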
Job Type: Full-time
Pay: $112,198.00 - $123,105.00 per year

Schedule:

  • 8 hour shift
  • Day shift

Ability to Commute:

  • Princeton, NJ 08540 (Required)

Ability to Relocate:

  • Princeton, NJ 08540: Relocate before starting work (Required)

Work Location: In person

Responsibilities

ROLES & RESPONSIBILITIES

  • Assess the current application infrastructure and suggest new concepts to improve performance
  • Document best practices and strategies associated with application deployment and infrastructure support
  • Produce reusable, efficient, and scalable programs, as well as cost-effective migration strategies
  • Develop data engineering and ML pipelines in Databricks and across AWS services, including S3, EC2, API, RDS, Kinesis/Kafka, and Lambda to build serverless applications (a minimal Lambda sketch follows this list)
  • Work jointly with the IT team and other departments to migrate data engineering and ML applications to Databricks/AWS
  • Comfortable working on tight timelines when required
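
As a rough illustration of the serverless side of the pipeline work described above, the sketch below shows a minimal AWS Lambda handler that drains Kinesis records into S3. It assumes a Lambda function with a Kinesis trigger and boto3 available; the bucket name, key layout, and record schema are hypothetical and not taken from this posting.

```python
# Illustrative sketch only; not part of the posting. Assumes a Lambda function
# with a Kinesis trigger and boto3 available; the bucket name and record
# schema are hypothetical.
import base64
import json

import boto3

s3 = boto3.client("s3")


def handler(event, context):
    """Decode incoming Kinesis records and land them in S3 as JSON lines."""
    records = []
    for record in event.get("Records", []):
        payload = base64.b64decode(record["kinesis"]["data"])
        records.append(json.loads(payload))

    if records:
        s3.put_object(
            Bucket="example-raw-events",  # hypothetical bucket
            Key=f"kinesis/{context.aws_request_id}.jsonl",
            Body="\n".join(json.dumps(r) for r in records),
        )
    return {"processed": len(records)}
```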