docs: update readme and links (#12809)
@@ -1,22 +1,22 @@
 # Documentation

 ### Getting Started
-* [Quickstart](../README.md#quickstart)
+* [Quickstart](https://docs.ollama.com/quickstart)
-* [Examples](./examples.md)
-* [Importing models](./import.md)
-* [MacOS Documentation](./macos.md)
-* [Linux Documentation](./linux.md)
-* [Windows Documentation](./windows.md)
-* [Docker Documentation](./docker.md)
+* [Importing models](https://docs.ollama.com/import)
+* [MacOS Documentation](https://docs.ollama.com/macos)
+* [Linux Documentation](https://docs.ollama.com/linux)
+* [Windows Documentation](https://docs.ollama.com/windows)
+* [Docker Documentation](https://docs.ollama.com/docker)

 ### Reference

-* [API Reference](./api.md)
+* [API Reference](https://docs.ollama.com/api)
 * [Modelfile Reference](./modelfile.md)
-* [OpenAI Compatibility](./openai.md)
+* [OpenAI Compatibility](https://docs.ollama.com/api/openai-compatibility)

 ### Resources

-* [Troubleshooting Guide](./troubleshooting.md)
-* [FAQ](./faq.md)
+* [Troubleshooting Guide](https://docs.ollama.com/troubleshooting)
+* [FAQ](https://docs.ollama.com/faq#faq)
 * [Development guide](./development.md)
@@ -108,7 +108,10 @@
"/modelfile",
"/context-length",
"/linux",
"/macos",
"/windows",
"/docker",
"/import",
"/faq",
"/gpu",
"/troubleshooting"
@@ -12,9 +12,3 @@ Ollama JavaScript examples at [ollama-js/examples](https://github.com/ollama/oll

## OpenAI compatibility examples
Ollama OpenAI compatibility examples at [ollama/examples/openai](../docs/openai.md)


## Community examples

- [LangChain Ollama Python](https://python.langchain.com/docs/integrations/chat/ollama/)
- [LangChain Ollama JS](https://js.langchain.com/docs/integrations/chat/ollama/)
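The OpenAI compatibility examples referenced in this hunk amount to pointing any OpenAI client at Ollama's local `/v1` endpoint. A minimal sketch in Python, assuming a local server on the default port 11434 and using `llama3.2` purely as a placeholder model name:

```python
# Minimal sketch of calling Ollama's OpenAI-compatible endpoint with the
# official openai Python client. Assumes Ollama is listening on the default
# http://localhost:11434 and that the placeholder model below has been pulled.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible API
    api_key="ollama",                      # required by the client, ignored by Ollama
)

response = client.chat.completions.create(
    model="llama3.2",  # placeholder; substitute any locally pulled model
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)
```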
@@ -1,4 +1,6 @@
-# Ollama for macOS
+---
+title: macOS
+---

 ## System Requirements

@@ -2,11 +2,7 @@
 title: Windows
 ---

-Welcome to Ollama for Windows.
-
-No more WSL required!
-
-Ollama now runs as a native Windows application, including NVIDIA and AMD Radeon GPU support.
+Ollama runs as a native Windows application, including NVIDIA and AMD Radeon GPU support.
 After installing Ollama for Windows, Ollama will run in the background and
 the `ollama` command line is available in `cmd`, `powershell` or your favorite
 terminal application. As usual the Ollama [API](/api) will be served on
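Once the Windows app is installed and the service is running in the background, the API mentioned above can be checked from any terminal or script. A small sketch, assuming the default listen address http://localhost:11434 (the hunk is cut off before the address itself; adjust if `OLLAMA_HOST` is set differently):

```python
# Quick reachability check against the local Ollama service, standard library
# only. Assumes the default address http://localhost:11434.
import json
import urllib.request

with urllib.request.urlopen("http://localhost:11434/api/tags") as resp:
    models = json.load(resp).get("models", [])

# /api/tags lists locally installed models; an empty list still proves the
# background service is up and answering.
print(f"Ollama is reachable; {len(models)} model(s) installed locally.")
for m in models:
    print("-", m.get("name"))
```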