Run your local LLM,
access it from anywhere
LLMHub is a managed marketplace connecting LLM providers with consumers. Monetize your idle GPU or access diverse models at competitive prices.
Simple by design
Everything you need to share LLMs
- Text in, text out: a simple protocol with no model format requirements and no runtime mandates.
- Provider sovereignty: choose your model, configuration, pricing, and availability.
- Trust through transparency: open-source agents, quality scores, and canary prompts.
- Fair pricing: providers set their own prices. The platform takes 20%; you keep 80%.
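The 80/20 split is simple arithmetic; here is a minimal sketch of the payout math (the function name and cents-based accounting are illustrative, not part of LLMHub):

```python
def provider_payout_cents(gross_cents: int, platform_fee_pct: int = 20) -> int:
    """Return the provider's share of a sale, in cents.

    The split described above: the platform keeps `platform_fee_pct`
    (20% by default) and the provider keeps the rest (80%).
    Integer cents avoid floating-point rounding surprises.
    """
    return gross_cents * (100 - platform_fee_pct) // 100

# A $5.00 sale at the standard 20% platform fee:
print(provider_payout_cents(500))  # 400 cents, i.e. $4.00 to the provider
```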
Ready to get started?
Install the CLI, run a single command, and your LLM is live on the network.
```shell
llmhub publish --model llama3-70b --backend ollama
```
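The page doesn't specify the wire format, but "text in, text out" suggests a request/response exchange along these lines. The field names (`model`, `prompt`, `completion`) and JSON envelope below are assumptions for illustration, not the actual LLMHub protocol:

```python
import json

# Hypothetical request a consumer might send to a published model.
# The field names ("model", "prompt") are illustrative assumptions.
request_body = json.dumps({
    "model": "llama3-70b",
    "prompt": "Summarize the provider terms in one sentence.",
})

# A provider's agent would reply with text out; the "completion"
# envelope is likewise an assumption for illustration.
response_body = '{"completion": "Providers keep 80% of what they charge."}'
reply = json.loads(response_body)
print(reply["completion"])
```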