Connect to your local Ollama instance and start chatting. Configure the endpoint and model in the setup panel.
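Under the hood, a chat turn is a POST to Ollama's `/api/chat` endpoint. Here is a minimal sketch assuming Ollama's default local address (`http://localhost:11434`) and a placeholder model name (`llama3`); both correspond to the endpoint and model fields in the setup panel.

```python
import json
import urllib.request

# Assumed defaults; match whatever you set in the setup panel.
OLLAMA_URL = "http://localhost:11434/api/chat"
MODEL = "llama3"

def build_payload(prompt: str) -> dict:
    """Build a non-streaming request body for Ollama's /api/chat endpoint."""
    return {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

def chat(prompt: str) -> str:
    """Send one chat turn to the local Ollama instance and return the reply text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["message"]["content"]

if __name__ == "__main__":
    print(chat("Hello!"))
```

Running this requires `ollama serve` to be up locally with the named model pulled (`ollama pull llama3`); otherwise the request will fail with a connection error.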