折腾 Zhēteng

Tag: llm

3 items with this tag.

  • Jan 28, 2026

    GPU order helper script for Ollama

    • self-hosted
    • ollama
    • gpu
    • llm
    • cuda
  • Jan 28, 2026

    Ollama Models and GPU VRAM Usage

    • ollama
    • llm
    • gpu
    • self-hosted
  • Feb 05, 2024

    Ollama: A Simple Solution for Self-Hosted LLMs

    • ollama
    • self-hosted
    • llm
    • big-agi
    • openwebui
    • systemd
    • nodejs
    • ubuntu
    • apt
    • npm
    • docker-compose

Content is licensed under Creative Commons Attribution 4.0 International License

Created with Quartz v4.5.2 © 2026
