Internship / Master Thesis: Evaluation of an on-premise Code Intelligence Platform
at ELCA Informatique SA
Zürich, ZH, Switzerland
Start Date | Expiry Date | Salary | Posted On | Experience | Skills | Telecommute | Sponsor Visa |
---|---|---|---|---|---|---|---|
Immediate | 22 Apr, 2025 | Not Specified | 23 Jan, 2025 | N/A | Good communication skills | No | No |
Description:
Today’s frontier LLMs excel at using popular open-source libraries to build simple apps from scratch. When generating code for private codebases, however, they run into hallucinated APIs, subtly incorrect code, and wrong or misleading answers to technical questions. A new generation of on-premise Code Intelligence Platforms is emerging to support enterprise developers who maintain large private, on-premise codebases. Their main challenge is to maintain and extend such a system over a long period of time, often with the source code as the only source of truth. A Code Intelligence Platform focuses on code search, analysis, and understanding of large codebases by modeling source code as an Abstract Syntax Tree (AST). Furthermore, every request to a Code Intelligence Platform depends on a specific context, e.g. the source code tabs currently open in the Integrated Development Environment (IDE).
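For illustration only (not part of the project scope), Python's built-in `ast` module shows the idea of modeling source code as an AST and indexing it for search; production platforms use language-specific parsers, but the principle is the same:

```python
import ast

source = '''
def load_config(path):
    return open(path).read()

class OrderService:
    def place_order(self, order):
        ...
'''

# Parse the source into an Abstract Syntax Tree and walk it to
# index every function/method definition with its line number.
tree = ast.parse(source)
for node in ast.walk(tree):
    if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
        print(f"{node.name} defined at line {node.lineno}")
```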
Sourcegraph/Cody is a promising emerging Code Intelligence Platform. Sourcegraph lets you search code and provides insights across repositories. Cody is a coding assistant based on a Retrieval-Augmented Generation (RAG) architecture: it runs as an IDE plugin and connects to popular LLMs.
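For orientation, the sketch below shows the core of a RAG loop over code: retrieve the chunks most similar to the developer's question, then hand them to an LLM as context. The `embed` and `generate` helpers are hypothetical placeholders for an embedding model and a chat-completion endpoint, not Cody's actual implementation:

```python
from typing import Callable

def answer(question: str,
           code_chunks: list[str],
           embed: Callable[[str], list[float]],
           generate: Callable[[str], str],
           top_k: int = 3) -> str:
    # Rank code chunks by cosine similarity to the question embedding,
    # then assemble the top matches into the prompt context.
    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        norm = (sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5)
        return dot / norm if norm else 0.0

    q_vec = embed(question)
    ranked = sorted(code_chunks, key=lambda c: cosine(embed(c), q_vec), reverse=True)
    context = "\n\n".join(ranked[:top_k])
    prompt = f"Answer using only this code context:\n{context}\n\nQuestion: {question}"
    return generate(prompt)
```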
Responsibilities:
- Get familiar with the features of Sourcegraph/Cody and RAG architectures
- Set up an on-premise, GPU-based infrastructure using a model serving framework to run LLMs for Cody to connect to (see the serving sketch after this list)
- Focus on providing optimal context by leveraging additional sources, e.g.:
  - Git history and commit information
  - Content of linked issues in JIRA or YouTrack
  - Up-to-date specifications on the filesystem and on Confluence
- Find ways to adapt existing LLMs to project specific needs
- Leverage the possibilities of the emerging Model Context Protocol (MCP)
- Set up Model Context Protocol servers and connect Cody via OpenCtx (see the MCP sketch after this list)
- Try to anticipate future trends such as long context windows
- Present the results in an ELCA Brownbag session
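As a hedged illustration of the model-serving responsibility above: once a serving framework such as vLLM exposes an OpenAI-compatible endpoint on the GPU infrastructure, a short smoke test could look as follows. The base URL, port, and model name are assumptions for illustration, not a prescribed setup:

```python
# Smoke test against a locally served, OpenAI-compatible LLM endpoint.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

response = client.chat.completions.create(
    model="Qwen/Qwen2.5-Coder-7B-Instruct",  # whichever model the server hosts
    messages=[{"role": "user",
               "content": "Explain what this repository's build.gradle does."}],
)
print(response.choices[0].message.content)
```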
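For the MCP responsibility, a minimal Model Context Protocol server exposing git history as a tool could be sketched as below, assuming the official Python SDK's `FastMCP` helper; the git-log tool itself is an illustrative example, not part of the posting:

```python
# Minimal MCP server sketch: exposes recent git commits as a tool.
import subprocess
from mcp.server.fastmcp import FastMCP

server = FastMCP("git-context")

@server.tool()
def recent_commits(path: str = ".", limit: int = 10) -> str:
    """Return the most recent commit messages for a repository."""
    result = subprocess.run(
        ["git", "-C", path, "log", f"-{limit}", "--oneline"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

if __name__ == "__main__":
    server.run()  # speaks MCP over stdio by default
```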
REQUIREMENT SUMMARY
Min: N/A | Max: 5.0 year(s)
Information Technology/IT
IT Software - Application Programming / Maintenance
Software Engineering
Graduate
Proficient
1
Zürich, ZH, Switzerland