Perhaps not the best way to handle this, but as a quick and dirty fix this might work for you. Edit /etc/docker/daemon.json, then restart the Docker daemon for the changes to take effect. Now when you start a container, Docker will populate its /etc/resolv.conf with the values from daemon.json. Why did I do this? I wanted my […]
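A minimal sketch of what that could look like, assuming daemon.json doesn't already contain other settings you need to preserve; the resolvers shown (8.8.8.8, 1.1.1.1) are placeholders for whatever you actually want your containers to use:

```bash
# Point the Docker daemon at explicit DNS servers (placeholder values)
sudo tee /etc/docker/daemon.json <<'EOF'
{
  "dns": ["8.8.8.8", "1.1.1.1"]
}
EOF

# Restart the daemon so the change takes effect
sudo systemctl restart docker

# New containers now pick up those resolvers in /etc/resolv.conf
docker run --rm alpine cat /etc/resolv.conf
```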
Running large language models (LLMs) on home hardware can be challenging because of the significant computational resources these models require. However, with the right setup and configuration, it is possible to run, and even fine-tune, LLMs on your personal computer or laptop. The first step in running an LLM on your home hardware […]
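As a concrete illustration of how approachable local inference can be, here is a sketch assuming Ollama (which the next post also touches on) and a machine with enough RAM for a quantized 7B model; the model name is just an example:

```bash
# Install Ollama using the official install script from the project site
curl -fsSL https://ollama.com/install.sh | sh

# Pull a quantized 7B model; expect a download of a few gigabytes
ollama pull llama2

# Run it locally and ask a question
ollama run llama2 "Why is the sky blue?"
```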
There are several ways you can run Ollama as a service, but one of the most popular options is Google Cloud Run. This platform lets you deploy and run containerized applications on demand without managing infrastructure. You can use a Docker container to package Ollama and your model, and then use Cloud Run to […]
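A rough sketch of that flow, assuming the public ollama/ollama image; the project ID, service name, region, model, and resource limits below are placeholders, and Cloud Run's CPU-only, per-request nature makes this a better fit for small models and light traffic than for anything heavyweight:

```bash
# Dockerfile: bake a model into the public Ollama image so cold starts
# don't have to download it (model choice is illustrative)
cat > Dockerfile <<'EOF'
FROM ollama/ollama
# Listen on all interfaces so Cloud Run can route requests to the container
ENV OLLAMA_HOST=0.0.0.0:11434
# Start the server briefly at build time so the model can be pulled into the image
RUN ollama serve & sleep 5 && ollama pull llama2
EXPOSE 11434
EOF

# Build the image and deploy it (placeholder project, region, and service names)
gcloud builds submit --tag gcr.io/my-project/ollama
gcloud run deploy ollama \
  --image gcr.io/my-project/ollama \
  --region us-central1 \
  --port 11434 \
  --memory 8Gi --cpu 4 \
  --allow-unauthenticated
```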