Andy API Local Client

Contribute your GPU power to the distributed AI compute pool.

Could Not Connect to Local Client

We tried to connect to your local client at http://localhost:5000, but it doesn't seem to be running. The local client allows you to share your computer's processing power with the Andy API network, helping to run AI models for everyone.
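The check this page performs amounts to testing whether anything is accepting connections on localhost port 5000. If you want to verify that yourself from a terminal, here is a minimal Python sketch; the function name is illustrative, and the host and port come from the message above:

```python
import socket

def local_client_running(host: str = "localhost", port: int = 5000,
                         timeout: float = 2.0) -> bool:
    """Return True if something accepts a TCP connection on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    print("local client reachable:", local_client_running())
```

A plain TCP connect is used rather than an HTTP request so the check does not depend on any particular endpoint the client exposes.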

1. Install Ollama

Ollama is required to run the language models on your machine. Download and install it from the official website.

Download Ollama
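Once Ollama is installed and running, it serves an HTTP API on localhost:11434 by default, and its /api/tags endpoint lists the models you have pulled. A small Python sketch to confirm the install worked (the helper names are illustrative):

```python
import json
import urllib.request

def parse_model_names(tags_payload: dict) -> list[str]:
    """Extract model names from the JSON payload returned by /api/tags."""
    return [m["name"] for m in tags_payload.get("models", [])]

def list_ollama_models(base_url: str = "http://localhost:11434") -> list[str]:
    """Ask a local Ollama server which models are installed."""
    with urllib.request.urlopen(f"{base_url}/api/tags", timeout=5) as resp:
        return parse_model_names(json.load(resp))
```

If this raises a connection error, Ollama is installed but not running; start it first, then retry.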
2. Get the Local Client

Download either the Go or the Python version of the Andy API Local Client from GitHub. Each repository contains the scripts you need to run.

Get the Go Client on GitHub
Get the Python Client on GitHub
3. Run the Client

Follow the instructions in the `README.md` file on GitHub to install dependencies and start the client. Once it's running, refresh this page.
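Rather than refreshing by hand, you can poll until the client comes up. This sketch retries the same TCP check the page performs; the retry counts, delay, and function name are illustrative:

```python
import socket
import time

def wait_for_client(host: str = "localhost", port: int = 5000,
                    retries: int = 30, delay: float = 2.0) -> bool:
    """Poll host:port until a TCP connection succeeds or retries run out."""
    for _ in range(retries):
        try:
            with socket.create_connection((host, port), timeout=2.0):
                return True
        except OSError:
            time.sleep(delay)
    return False

if __name__ == "__main__":
    if wait_for_client():
        print("Local client is up - refresh the page.")
    else:
        print("Local client still unreachable - check its logs.")
```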