ai-team

AI Team with Memory

AI Team is a group of expert personas you can interact with: James Ryan Wick, Jimmy Roar, Neil Knows, Sophia Akasha Lightray, Dr. Sigmund Leud, and Zoe Voss.

Note: You can also trigger communication between the AI characters for collaborative responses.


Example Outputs

See example_chat.png for a sample conversation.


How It Works

The AI characters are built on the llama3.2:3b and qwen2.5-coder models as a foundation, but you can customize them as needed. The system is powered by Ollama.

Language Support: Chats in German are supported, but responses may not always be perfect; for optimal results, use English.

Model Recommendations: qwen2.5-coder provides good suggestions, but larger models are recommended for more complex tasks or more detailed insights.
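As a rough sketch of how a single persona can be queried, the following uses Ollama's HTTP chat endpoint via Python's standard library. The persona name neil_knows matches one of the models created below; the default port 11434 and the non-streaming request shape are assumptions based on Ollama's standard API:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's default chat endpoint

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming chat request for the Ollama /api/chat endpoint."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # ask for one complete JSON response instead of a stream
    }
    return urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

def ask(model: str, prompt: str) -> str:
    """Send the prompt to the given persona model and return its reply text."""
    with urllib.request.urlopen(build_chat_request(model, prompt)) as resp:
        return json.loads(resp.read())["message"]["content"]

# Example (requires a running Ollama server and the neil_knows model):
# print(ask("neil_knows", "How do I reverse a list in Python?"))
```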


Preparation

Install Ollama

Download and install Ollama from their official website.

Create a Virtual Environment

In the root directory of the project:

python -m venv .venv
source .venv/bin/activate  # For Linux/macOS
.venv\Scripts\activate   # For Windows

Install Dependencies

Run the following command to install the required Python packages:

pip install -r requirements.txt

Install Models and Embeddings

Install via Script

Run the following command to download the models and create the persona configurations:

./create_models.sh

Usage

Start the Ollama Server

Ensure the Ollama server is running. Start it with:

ollama serve
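To verify the server is reachable before launching the app, you can run a small Python check. Nothing here is project-specific; it simply probes Ollama's default address (port 11434):

```python
import urllib.request
import urllib.error

OLLAMA_URL = "http://localhost:11434"  # Ollama's default address

def ollama_is_up(base_url: str = OLLAMA_URL, timeout: float = 2.0) -> bool:
    """Return True if an Ollama server answers at base_url with HTTP 200."""
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        # Connection refused or timed out: the server is not running.
        return False

# Example: ollama_is_up() returns True once `ollama serve` is running.
```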

Run the Script

Start the app using:

python app.py

Open main.html in your browser.

Additional Information

Install Models Manually

Run the following commands to download the models:

ollama pull llama3.2:3b
ollama pull qwen2.5-coder:7b
ollama pull nomic-embed-text

Run these commands to create the AI characters using the provided model files:

ollama create james_ryan_wick -f modelfiles/Modelfile-james_ryan_wick
ollama create jimmy_roar -f modelfiles/Modelfile-jimmy_roar
ollama create neil_knows -f modelfiles/Modelfile-neil_knows
ollama create sophia_akasha_lightray -f modelfiles/Modelfile-sophia_akasha_lightray
ollama create dr_sigmund_leud -f modelfiles/Modelfile-dr_sigmund_leud
ollama create zoe_voss -f modelfiles/Modelfile-zoe_voss
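Each persona is defined by a Modelfile in the modelfiles/ directory. As an illustrative sketch only (the base model, system prompt, and parameter shown here are assumptions, not the shipped configuration), a minimal persona Modelfile could look like:

```
FROM qwen2.5-coder:7b
SYSTEM "You are Neil Knows, a pragmatic senior software engineer. Answer concisely and show code where helpful."
PARAMETER temperature 0.7
```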

Offline Libraries

Note: To ensure offline functionality, download all necessary libraries and resources (e.g., jQuery, Bootstrap, Highlight.js) and replace the URLs in main.html with local paths.
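For example, a CDN reference in main.html would change from a remote URL to a local path (the jQuery URL and the libs/ directory below are placeholders, not the actual entries in main.html):

```html
<!-- before: loaded from a CDN (URL is illustrative) -->
<script src="https://code.jquery.com/jquery-3.7.1.min.js"></script>

<!-- after: served from a local copy -->
<script src="libs/jquery-3.7.1.min.js"></script>
```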