Introducing Reins - Empowering LLM researchers and hobbyists with seamless control over self-hosted models.
With Reins in your pocket, you’ll find:
* Remote Server Access—Connect to your self-hosted Ollama Server from anywhere, ensuring seamless interactions on the go.
* Per-Chat System Prompts—Tailor system prompts for each conversation to enhance context and relevance.
* Prompt Editing & Regeneration—Modify and regenerate prompts effortlessly to refine your interactions.
* Image Integration—Enrich your chats by sending and receiving images, making interactions more dynamic.
* Advanced Configuration—Adjust parameters such as temperature, seed, context size, and max tokens, along with other advanced options, to run your experiments (see the sketch after this list).
* Model Selection—Choose from various models to suit your specific research needs and preferences.
* Model Creation from Prompts—Save system and chat prompts as new models.
* Multiple Chat Management—Handle several conversations simultaneously with ease.
* Dynamic Model Switching—Change the current model within existing chats without interruption.
* Real-Time Message Streaming—Experience messages as they arrive, ensuring timely and efficient communication.
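
As a rough illustration of what those tuning knobs typically correspond to on the server side, the sketch below sends a chat request directly to Ollama's HTTP API with the same options set. The server address and model name are placeholders, and the option names (temperature, seed, num_ctx, num_predict) come from Ollama's API documentation rather than from Reins itself.

```python
# Hedged sketch: a direct Ollama chat request with the kinds of options
# Reins exposes. Server address and model name are placeholders; adjust
# them for your own setup.
import requests

OLLAMA_URL = "http://192.168.1.10:11434"  # assumed address of your self-hosted server

response = requests.post(
    f"{OLLAMA_URL}/api/chat",
    json={
        "model": "llama3.2",  # any model already pulled on the server
        "messages": [
            {"role": "system", "content": "You are a concise assistant."},
            {"role": "user", "content": "Explain temperature in one sentence."},
        ],
        "options": {
            "temperature": 0.7,   # sampling randomness
            "seed": 42,           # fixed seed for reproducible output
            "num_ctx": 4096,      # context window size
            "num_predict": 256,   # maximum tokens to generate
        },
        "stream": False,  # Reins streams responses; False keeps this example simple
    },
    timeout=60,
)
print(response.json()["message"]["content"])
```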
Note: Reins requires an active connection to a self-hosted Ollama Server to function.
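
Because everything runs against your own server, a quick way to confirm the app will be able to reach it is to query the Ollama API from the same network your device is on. The address below is a placeholder; Ollama listens on 127.0.0.1:11434 by default, and its documentation describes the OLLAMA_HOST environment variable for binding to other interfaces.

```python
# Minimal reachability check for a self-hosted Ollama server.
# The address is a placeholder for wherever your server is exposed.
import requests

OLLAMA_URL = "http://192.168.1.10:11434"

try:
    models = requests.get(f"{OLLAMA_URL}/api/tags", timeout=5).json()["models"]
    print("Server reachable, models available:", [m["name"] for m in models])
except requests.RequestException as exc:
    print("Server not reachable:", exc)
```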