Categories: How-To, Software, Technical

Large Language Models (LLM) Running @ Home

Running large language models (LLMs) on home hardware can be challenging because of the significant computational resources these models require. However, with the right setup and configuration, it is possible to run, and in some cases fine-tune, LLMs on a personal computer or laptop. The first step in running an LLM on your home hardware […]
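To make the resource requirement concrete, here is a minimal back-of-the-envelope sketch in Python; the model sizes and quantization levels are illustrative assumptions, not figures from the post, and the estimate ignores KV-cache and runtime overhead:

```python
# Rough memory estimate for holding an LLM's weights in memory.
# Assumption: weights dominate memory use; KV cache and overhead are ignored.

BYTES_PER_PARAM = {"fp16": 2.0, "int8": 1.0, "int4": 0.5}

def weight_memory_gib(num_params_billion: float, precision: str) -> float:
    """Approximate GiB needed just to store the model weights."""
    total_bytes = num_params_billion * 1e9 * BYTES_PER_PARAM[precision]
    return total_bytes / (1024 ** 3)

if __name__ == "__main__":
    for params in (7, 13, 70):              # illustrative model sizes, in billions
        for precision in ("fp16", "int8", "int4"):
            gib = weight_memory_gib(params, precision)
            print(f"{params}B parameters @ {precision}: ~{gib:.1f} GiB")
```

Numbers like these are only a lower bound, but they show why quantization is usually what makes a given model fit on consumer RAM or VRAM.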

Categories: How-To, Software, Technical

Ways to run Ollama as a service

There are several ways to run Ollama as a service; one popular option is Google Cloud Run, a platform that lets you deploy and run containerized applications on demand without managing infrastructure. You can package Ollama and your model in a Docker container and then use Cloud Run to […]
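Once such a service is running, clients talk to it over Ollama's HTTP API. Below is a minimal sketch using Python's requests library; the service URL and model name are placeholder assumptions, while /api/generate is Ollama's standard generation endpoint:

```python
import requests

# Assumptions: OLLAMA_URL points at your deployed Ollama service (for example a
# Cloud Run URL, or http://localhost:11434 for a local instance), and the named
# model has already been pulled on that instance.
OLLAMA_URL = "https://your-ollama-service.example.com"  # placeholder URL
MODEL = "llama3"                                        # placeholder model name

def generate(prompt: str) -> str:
    """Send a single non-streaming request to Ollama's /api/generate endpoint."""
    resp = requests.post(
        f"{OLLAMA_URL}/api/generate",
        json={"model": MODEL, "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]

if __name__ == "__main__":
    print(generate("Explain what Ollama is in one sentence."))
```

The same call works against a local Ollama install, so you can test the client before pointing it at the hosted service.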