Snowflake AI Data Engineer at JP Techno Park
Alpharetta, Georgia, USA - Full Time


Start Date

Immediate

Expiry Date

14 Oct, 25

Salary

$75.00 per hour

Posted On

14 Jul, 25

Experience

10 year(s) or above

Remote Job

Yes

Telecommute

Yes

Sponsor Visa

No

Skills

Kafka, Integration, Python, SQL, Salesforce, Snowflake, SDK

Industry

Information Technology/IT

Description
  • In short, we are looking for an AI Data Engineer with strong Python skills and experience with LLM frameworks (RAG and vector search).
  • Proficiency with OpenAI integration is critical; exposure to Kafka, Snowflake, and Elasticsearch is preferred (an illustrative sketch of this RAG pattern follows the pay details below).
  • 10 years of hands-on experience with Snowflake and SQL Server, including data modeling, performance tuning, data visualization, and working with structured/semi-structured data, as well as data extraction, transformation, and analysis.
  • Proficient in designing and managing end-to-end ETL data pipelines using IDMC, Spring Batch, and Autosys.
  • Experience integrating data from diverse sources such as Genesys Cloud, Salesforce, and APIs exposed by internal/external systems, including monitoring, debugging, and optimizing data flows.
  • Knowledge of Genesys PureCloud APIs and SDKs, and Architect Flows.
    Job Type: Contract
    Pay: $70.00 - $75.00 per hour
    Expected hours: 40 per week
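
The posting does not prescribe a stack beyond Python and OpenAI integration, but as a rough illustration of the RAG / vector-search pattern named above, the following minimal sketch embeds a toy corpus with the OpenAI embeddings API, retrieves the closest document to a question by cosine similarity (standing in for Elasticsearch or a dedicated vector store), and grounds a chat completion on it. Model names, the sample documents, and the in-memory index are illustrative assumptions, not part of the job description.

```python
# Minimal RAG sketch (illustrative only): embed documents, retrieve the most
# similar one for a question, and ground an OpenAI chat completion on it.
# Model names and the in-memory index are assumptions, not from the posting.
import math
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def embed(texts: list[str]) -> list[list[float]]:
    """Return one embedding vector per input text."""
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return [item.embedding for item in resp.data]

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

# Toy corpus standing in for data landed from sources like Kafka, Snowflake,
# Genesys Cloud, or Salesforce.
docs = [
    "Order 1042 shipped on July 2 and was delivered on July 5.",
    "The support queue SLA is a first response within 4 business hours.",
]
doc_vectors = embed(docs)

def answer(question: str) -> str:
    q_vec = embed([question])[0]
    # Nearest-neighbour retrieval; a production system would use a vector
    # store (e.g. Elasticsearch kNN) instead of this linear scan.
    best_doc = max(zip(docs, doc_vectors), key=lambda dv: cosine(q_vec, dv[1]))[0]
    chat = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": f"Answer using only this context:\n{best_doc}"},
            {"role": "user", "content": question},
        ],
    )
    return chat.choices[0].message.content

print(answer("When was order 1042 delivered?"))
```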

Experience:

  • AI Data Engineer: 10 years (Required)
  • Python: 10 years (Required)
  • LLM Framework: 10 years (Required)
  • RAG: 10 years (Required)
  • Vector Search: 10 years (Required)
  • OpenAI Integration: 10 years (Required)
  • Elasticsearch: 10 years (Required)
  • Kafka: 10 years (Required)
  • Snowflake: 10 years (Required)
  • Genesys Cloud: 10 years (Required)
  • Salesforce: 10 years (Required)
  • API: 10 years (Required)
  • SQL: 10 years (Required)
  • IDMC: 10 years (Required)
  • SDK: 10 years (Required)

Work Location: In person

Responsibilities

Please refer to the job description above for details.
