jecabeda
@jecabeda
TIL: You can very easily run genAI models locally using https://ollama.ai
With it you can just run `ollama run codellama:13b` and you are good to go!
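Besides the CLI, Ollama also exposes a local REST API (on port 11434 by default) once the server is running, so you can call the same model from code. A minimal sketch, assuming `ollama serve` is up and `codellama:13b` has been pulled; the prompt text is just an illustrative example:

```python
import json
import urllib.request

# Ollama's local REST API listens on http://localhost:11434 by default.
# This assumes `ollama serve` is running and codellama:13b is pulled.
payload = {
    "model": "codellama:13b",
    "prompt": "Write a Python function that reverses a string.",
    "stream": False,  # return a single JSON object instead of a stream
}

def generate(payload, url="http://localhost:11434/api/generate"):
    """POST a prompt to the local Ollama server and return the response text."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Usage (requires a running Ollama server):
#     print(generate(payload))
```

The non-streaming mode (`"stream": False`) is handy for quick scripts; the default streaming mode returns one JSON object per line as tokens arrive.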