# goviewmodels

A terminal-based tool for discovering, browsing, and managing local AI model files across multiple AI applications.

## Features
- **Multi-App Support**: Discovers models from Ollama, LM Studio, GPT4All, Jan, LocalAI, koboldcpp, text-generation-webui, and llama.cpp
- **Multiple Formats**: Supports GGUF (full metadata) and SafeTensors (basic) model formats
- **Interactive TUI**: A terminal user interface built with Bubble Tea for browsing models
- **Non-Interactive CLI**: Command-line tools for automation and scripting
- **Centralized Repository**: Hardlink-based model management that organizes models in one location
- **Rich Metadata**: Extracts detailed model information, including quantization, VRAM estimates, and parameter counts
- **Cross-Platform**: Works on Linux, macOS, and Windows
## Installation

### From Source

```bash
git clone https://github.com/JasonDoug/goviewmodels.git
cd goviewmodels
go build -o goviewmodels .
```

### Using Go Install

```bash
go install github.com/JasonDoug/goviewmodels@latest
```
## Quick Start

Launch the interactive TUI:

```bash
goviewmodels
```

List models in the terminal:

```bash
goviewmodels list
```

Scan for models, optionally linking them into the repository:

```bash
goviewmodels scan [--link]
```
## Usage

### Interactive Mode

Run `goviewmodels` with no arguments to launch the Terminal User Interface (TUI):

```bash
goviewmodels
```

Key bindings in the TUI:

- `j`/`k` or `↑`/`↓`: Navigate models
- `Enter`: View model details
- `s`: Scan for models
- `l`: Link the selected model to the repository
- `/`: Filter models by name, format, quantization, or source
- `Tab`: Cycle the sort column
- `v`: Toggle the VRAM estimation table (in the detail view)
- `?`: Show help
- `q` or `Ctrl+C`: Quit
### Command Line Interface

#### List Models

```bash
# List all discovered models
goviewmodels list

# List models with additional directories
goviewmodels list --paths /path/to/custom/models

# Skip specific applications
goviewmodels list --skip ollama,lm-studio
```

#### Scan Models

```bash
# Scan for models
goviewmodels scan

# Scan and link all discovered models
goviewmodels scan --link

# Scan with additional directories
goviewmodels scan --paths /path/to/custom/models

# Skip specific applications
goviewmodels scan --skip ollama,lm-studio
```
## Supported Applications

goviewmodels automatically discovers models from these AI applications:

| Application | Default Path |
|-------------|--------------|
| Ollama | `~/.ollama/models` |
| LM Studio | `~/LM Studio/models` |
| GPT4All | `~/Library/Application Support/nomic/gpt4all` (macOS) or `~/.local/share/nomic/gpt4all` (Linux) |
| Jan | `~/jan/models` |
| LocalAI | `~/localai/models` |
| koboldcpp | `~/koboldcpp/models` |
| text-generation-webui | `~/text-generation-webui/models` |
| llama.cpp | `~/llama.cpp/models` |
## Configuration

goviewmodels reads a JSON configuration file located at `~/.goviewmodels/config.json`:

```json
{
  "repo_path": "/home/user/.goviewmodels",
  "extra_paths": ["/path/to/additional/models"],
  "skip_apps": ["ollama"],
  "concurrency": 4,
  "prefer_hardlinks": true
}
```

Configuration options:

- `repo_path`: Directory for the centralized model repository
- `extra_paths`: Additional directories to scan for models
- `skip_apps`: Applications to skip during scanning
- `concurrency`: Number of concurrent scanning workers
- `prefer_hardlinks`: Whether to prefer hardlinks over symlinks
## Repository Management

goviewmodels creates a centralized model repository at `~/.goviewmodels/library` by default. Models are hardlinked into this location (or symlinked when the source sits on a different filesystem, where hardlinks are impossible) for organized management.

The repository structure:

```
~/.goviewmodels/library/
├── ollama/
│   ├── llama3/
│   │   └── llama3.gguf
│   └── mistral/
│       └── mistral.gguf
├── lm-studio/
│   └── TheBloke/
│       └── Llama-2-7B/
│           └── model.gguf
└── manifest.json
```
## VRAM Estimation
goviewmodels provides VRAM estimates for GGUF models based on quantization type and context length. The detail view includes a comprehensive VRAM table showing memory requirements at different context lengths.
## Building from Source

```bash
# Clone the repository
git clone https://github.com/JasonDoug/goviewmodels.git
cd goviewmodels

# Build
go build -o goviewmodels .

# Install
go install .
```
## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

- Fork the repository
- Create your feature branch (`git checkout -b feature/AmazingFeature`)
- Commit your changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
## License
This project is licensed under the MIT License - see the LICENSE file for details.
## Acknowledgments