5 Comments

I installed it. It was very easy. But it is slow, probably because my laptop isn't very powerful. Your article helped me a lot. Thanks


Happy to help, glad the article was useful! Do subscribe for more articles on emerging technologies and the latest research.


Thanks. I am looking into running an LLM locally. This will certainly help.

I am no developer, just an end-user wanting to have control over my own privacy.


How does this compare to using Ollama to run AIs locally?


Compared to Ollama: less setup, a user-friendly GUI, and equally powerful capabilities. Models can be served over the network as well. Multiple LLMs and multiple agents can run at the same time, with minimal effort and the GUI controlling everything instead of a CLI. And if you prefer a CLI, LM Studio has that too.

Refer to this link: https://lmstudio.ai/blog/lms

The real beauty of LM Studio is that it is developer-friendly and beginner-friendly at the same time. It has a Python SDK as well as a JS SDK.
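To give a sense of the Python SDK, here is a minimal sketch using the lmstudio package's convenience API. It assumes LM Studio is installed with its local server running, and that the named model has already been downloaded; the model identifier is only an example.

```python
# Minimal sketch with the LM Studio Python SDK (pip install lmstudio).
# Assumes LM Studio's local server is running and the model below is
# already downloaded; the model name is an example, not a requirement.
import lmstudio as lms

model = lms.llm("llama-3.2-1b-instruct")  # attach to a locally available model
result = model.respond("In one sentence, what does running an LLM locally mean?")
print(result)
```

The JS SDK follows a similar pattern, so the same workflow carries over if you are building in TypeScript instead of Python.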
