(Work in Progress) A cross-platform desktop client for offline LlaMA-CPU
Updated Nov 1, 2023 · C#
A terminal-style user interface for chatting with AI characters, powered by llama LLMs running locally.
PARRoT: Precise Audio Recognition and Recap over Transcription
Autocomplete anything using a GGUF model
LLM InferenceNet is a C++ project for fast, efficient inference from Large Language Models (LLMs) using a client-server architecture. It enables optimized interactions with pre-trained language models and simplifies deployment on edge devices.
Web API that summarizes multimedia from various sources using modern AI tools.
EDUAI is a virtual assistant developed by the Yachay Tech and UIDE universities in Ecuador. Its purpose is to help both school and university students with mathematics. It acts as an assistant, not as a user, and responds within its specific role.
An open-source AI app | running Mixtral 8x7B / llama.cpp | single-layer threads interface | multi-user | private | offline capable
Use your open-source local model from the terminal