docs: update readme and links (#12809)

Parth Sareen
2025-10-28 16:20:02 -07:00
committed by GitHub
parent 14977a9350
commit d828517e78
5 changed files with 17 additions and 22 deletions

View File

@@ -1,22 +1,22 @@
 # Documentation
 ### Getting Started
-* [Quickstart](../README.md#quickstart)
+* [Quickstart](https://docs.ollama.com/quickstart)
 * [Examples](./examples.md)
-* [Importing models](./import.md)
-* [MacOS Documentation](./macos.md)
-* [Linux Documentation](./linux.md)
-* [Windows Documentation](./windows.md)
-* [Docker Documentation](./docker.md)
+* [Importing models](https://docs.ollama.com/import)
+* [MacOS Documentation](https://docs.ollama.com/macos)
+* [Linux Documentation](https://docs.ollama.com/linux)
+* [Windows Documentation](https://docs.ollama.com/windows)
+* [Docker Documentation](https://docs.ollama.com/docker)
 ### Reference
-* [API Reference](./api.md)
+* [API Reference](https://docs.ollama.com/api)
 * [Modelfile Reference](./modelfile.md)
-* [OpenAI Compatibility](./openai.md)
+* [OpenAI Compatibility](https://docs.ollama.com/api/openai-compatibility)
 ### Resources
-* [Troubleshooting Guide](./troubleshooting.md)
-* [FAQ](./faq.md)
+* [Troubleshooting Guide](https://docs.ollama.com/troubleshooting)
+* [FAQ](https://docs.ollama.com/faq#faq)
 * [Development guide](./development.md)
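The README hunk above replaces relative in-repo links with hosted docs.ollama.com URLs. As a minimal sketch, the rewrites can be read as a lookup table taken directly from the diff; the `rewrite_link` helper below is hypothetical, for illustration only, and not part of the Ollama repo:

```python
# Old relative link -> new hosted URL, as changed in this commit's diff.
DOC_LINK_REWRITES = {
    "../README.md#quickstart": "https://docs.ollama.com/quickstart",
    "./import.md": "https://docs.ollama.com/import",
    "./macos.md": "https://docs.ollama.com/macos",
    "./linux.md": "https://docs.ollama.com/linux",
    "./windows.md": "https://docs.ollama.com/windows",
    "./docker.md": "https://docs.ollama.com/docker",
    "./api.md": "https://docs.ollama.com/api",
    "./openai.md": "https://docs.ollama.com/api/openai-compatibility",
    "./troubleshooting.md": "https://docs.ollama.com/troubleshooting",
    "./faq.md": "https://docs.ollama.com/faq#faq",
}

def rewrite_link(target: str) -> str:
    """Return the hosted docs URL for a relative link; unchanged links
    (e.g. ./examples.md, ./modelfile.md) pass through untouched."""
    return DOC_LINK_REWRITES.get(target, target)
```

Note the mapping is not a uniform path substitution: `./openai.md` moves under `/api/`, and `./faq.md` gains a `#faq` fragment, which is why a table rather than a string rewrite fits here.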

View File

@@ -108,7 +108,10 @@
 "/modelfile",
 "/context-length",
 "/linux",
+"/macos",
+"/windows",
+"/docker",
 "/import",
 "/faq",
 "/gpu",
 "/troubleshooting"

View File

@@ -12,9 +12,3 @@ Ollama JavaScript examples at [ollama-js/examples](https://github.com/ollama/oll
-## OpenAI compatibility examples
-Ollama OpenAI compatibility examples at [ollama/examples/openai](../docs/openai.md)
-## Community examples
-- [LangChain Ollama Python](https://python.langchain.com/docs/integrations/chat/ollama/)
-- [LangChain Ollama JS](https://js.langchain.com/docs/integrations/chat/ollama/)

View File

@@ -1,4 +1,6 @@
-# Ollama for macOS
+---
+title: macOS
+---
 ## System Requirements

View File

@@ -2,11 +2,7 @@
 title: Windows
 ---
 Welcome to Ollama for Windows.
-No more WSL required!
-Ollama now runs as a native Windows application, including NVIDIA and AMD Radeon GPU support.
+Ollama runs as a native Windows application, including NVIDIA and AMD Radeon GPU support.
 After installing Ollama for Windows, Ollama will run in the background and
 the `ollama` command line is available in `cmd`, `powershell` or your favorite
 terminal application. As usual the Ollama [API](/api) will be served on