AI use at the University of Cologne
The topic of (generative) artificial intelligence (AI) affects all areas of activity at the University of Cologne. This includes, for example, AI research, the use of AI tools in research, the use of AI in teaching and learning (including the transfer of related skills), as well as a wide range of applications in administration and university management.
These areas give rise to a large number of use cases and correspondingly specialized needs. The current and planned university-wide offerings are described below.
Background - in-house and commercial offerings and delivery modes
For various reasons, e.g. financing, compliance, security and data sovereignty, we are aiming for a mixed model of in-house and commercial offerings. In concrete terms, this is reflected in the provision of both open models (see pillars 1 and 2) and commercial models (see pillars 3 and 4).
In addition, different modes of provision are planned: access to selected AI applications in the cloud, the provision of programming interfaces (APIs), e.g. for use in custom software or in research, and the operation of models on the university's own (including local) hardware.
Overview of available and planned AI offerings
To address the diverse needs of the University of Cologne in the area of AI, we rely on a variety of tools with the aim of building a comprehensive AI offering that covers as many use cases as possible. To this end, a pillar model has been developed that serves the different target groups while making efficient use of human, technical and financial resources. The pillars are described below.
Pillar 1: GWDG AI chatbot / Academic Cloud
The GWDG (Gesellschaft für wissenschaftliche Datenverarbeitung Göttingen) provides an AI chatbot via its "Academic Cloud" service that can access different models. All holders of a university account - staff and students - have access to this chatbot. If you already have an "Academic ID", you can use it to log in; otherwise select "Federated login" and choose the University of Cologne as your home institution:
https://chat-ai.academiccloud.de/
Instructions and further information, including information on data protection and processing, can be found at:
https://gwdg.de/services/application-services/ai-services/
You can choose from various open source models such as LLaMA, Mistral or Qwen. In addition, each request can be individually parameterized, in particular to control how much freedom the model is granted when answering (for example via the sampling temperature).
Target group(s): Employees and students
Current status: Available.
Pillar 2: LLMs on the High Performance Computer / Open Source-KI.nrw
As part of the Open Source-KI.nrw project (OSKI.nrw), which is funded by the MKW (Ministry of Culture and Science NRW), high-performance NVIDIA graphics cards were procured in cooperation with Ruhr University Bochum for training and inference and integrated into our new high-performance computer "RAMSES".
The aim is to provide freely available large language models (LLMs), enhanced with our own content via retrieval-augmented generation (RAG) and accessible through our own front end, and thus to enable the professional use of high-performance AI models on local hardware in research and teaching. For this purpose, not only the funding from the MKW and the technology available through RAMSES are being used; the RRZK/ITCC has also explicitly assigned staff to build the offering. The first beta tests are expected in January 2025. The access modalities and the distribution of resources are still being planned and will be evaluated and adjusted on an ongoing basis after launch.
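For illustration, the following minimal Python sketch outlines the retrieval-augmented generation idea: relevant text snippets are retrieved from a (deliberately tiny) document store and passed to the model together with the question. The endpoint URL, model name and token are placeholders and not the actual access details, which are still being defined.

    # Minimal RAG sketch (illustrative only): retrieve the most relevant snippets
    # from a small document store and prepend them to the request sent to an LLM.
    from openai import OpenAI

    DOCUMENTS = [
        "The RRZK/ITCC operates the high-performance computer RAMSES.",
        "The Open Source-KI.nrw project is funded by the MKW.",
        "The GWDG AI chatbot is available via the Academic Cloud.",
    ]

    def retrieve(question: str, documents: list[str], k: int = 2) -> list[str]:
        """Naive retrieval by word overlap; a real system would use vector embeddings."""
        q_words = set(question.lower().split())
        ranked = sorted(documents, key=lambda d: len(q_words & set(d.lower().split())), reverse=True)
        return ranked[:k]

    def answer(question: str) -> str:
        context = "\n".join(retrieve(question, DOCUMENTS))
        client = OpenAI(base_url="https://llm.example.uni-koeln.de/v1",  # placeholder endpoint
                        api_key="YOUR_TOKEN")                            # placeholder token
        response = client.chat.completions.create(
            model="llama-3.1-70b-instruct",  # placeholder model name
            messages=[
                {"role": "system", "content": "Answer using only the provided context."},
                {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
            ],
        )
        return response.choices[0].message.content

    print(answer("Who operates RAMSES?"))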
Target group(s): Employees and students in research and teaching
Current status: Under construction, planned availability Q1/2025.
Pillar 3: Access to commercial models
In parallel with the activities mentioned so far, we expect initiatives such as OCRE (Open Clouds for Research Environments) and the KI:Connect.nrw project to provide discounted API access to commercial AI models such as OpenAI/ChatGPT in the medium term. The RRZK/ITCC is closely monitoring these developments and registering the corresponding requirements. In the meantime, the framework conditions for use, especially with regard to data protection and data classification, must be defined.
The framework agreements for the use of OCRE cloud services will be available from February 2025; individual agreements must then be concluded on this basis. The distribution of costs and the corresponding billing arrangements also still need to be clarified.
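As an illustration of what API-based use of a commercial model could look like once the contracts are in place, here is a minimal Python sketch using the Azure variant of the openai client. The endpoint, deployment name, API version and key are placeholders and do not describe the final setup.

    # Illustrative sketch of API access to a commercial model (here: Azure OpenAI).
    # Endpoint, deployment name, API version and key are placeholders; the actual
    # conditions of use will only be defined once contracts have been concluded.
    import os
    from openai import AzureOpenAI

    client = AzureOpenAI(
        azure_endpoint="https://example-resource.openai.azure.com",  # placeholder
        api_key=os.environ["AZURE_OPENAI_API_KEY"],                  # placeholder
        api_version="2024-02-01",                                    # example API version
    )

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # name of the Azure deployment, not the raw model
        messages=[{"role": "user", "content": "Summarize the pillar model in one sentence."}],
    )
    print(response.choices[0].message.content)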
Target group(s): Employees
Current status: Framework agreements have been concluded. The aim is to conclude a contract for Azure OpenAI and possibly further contracts.
Planned availability: Q1 or Q2 2025
Pillar 4: AI in the central university administration
At short notice, the RRZK/ITCC has acquired a contingent of tokens for an AI system that is fully encrypted and runs on public cloud providers. This encryption is a basic requirement for using public clouds with sensitive data. The system is used specifically by the central university administration: we want to rely increasingly on cloud applications in the administrative area, and integrating AI functions while complying with data protection requirements for personal data poses a particular challenge there.
As a prototype, this service will initially be made available in the form of a simple front end. The aim is to achieve a level of security and data protection that allows it to be used with texts such as job descriptions and other internal documents.
Target group(s): Employees of and applications in the central university administration
Current status: Under construction; prototype availability in January, integration into applications in Q4/2025.
Pillar 5: Entry-level model for initial experiments
This pillar is intended to give individual AI-savvy users as well as research groups and institutes a low-threshold entry point into developing AI(-supported) applications and the chance to "just try things out". The offering will use existing resources and deliberately not aim for particularly high performance, focusing instead on general availability and ease of use.
Target group(s): AI developers, research groups
Current status: In preparation.
Planned availability: Q2/2025
Pillar 6: AI on your own end devices
The increasing availability of smaller yet powerful AI models makes it possible to run these models on your own (sufficiently powerful) end devices, e.g. laptops. This approach ensures that no data leaves the device, so a very high level of data protection, security and control can be achieved.
With programs such as LM Studio, Jan or GPT4All, for example, models can be used on your own device through a familiar chat interface. However, performance depends heavily on the size of the model and the available hardware, e.g. whether the laptop has a suitable graphics card. The "offering" here is therefore primarily limited to providing instructions or a brief guide to using AI models on your own devices.
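As an illustration, tools such as LM Studio can also expose a local, OpenAI-compatible endpoint, so that scripts can query a model running entirely on your own device. The following minimal Python sketch assumes such a local server; host, port and model name are placeholders and depend on your local setup.

    # Sketch: querying a model that runs entirely on your own device via the
    # local, OpenAI-compatible server offered by tools such as LM Studio.
    from openai import OpenAI

    client = OpenAI(
        base_url="http://localhost:1234/v1",  # local server; no data leaves the device
        api_key="not-needed-locally",         # placeholder; local servers usually ignore it
    )

    response = client.chat.completions.create(
        model="llama-3.2-3b-instruct",  # placeholder: whichever model is loaded locally
        messages=[{"role": "user", "content": "Briefly explain retrieval-augmented generation."}],
    )
    print(response.choices[0].message.content)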
Target group(s): Employees and students in research and teaching
Current status: A guide or handout is being planned
Contact
If you have any questions or problems, please contact the RRZK-Helpdesk.