🎓LM Studio Demystified: Your Ultimate Guide to Running Local LLMs | Installation & Setup
Looking to run Large Language Models (LLMs) locally but frustrated with the limitations of Jupyter notebooks? LM Studio is here to change that.
Discover, download, and run local LLMs
That’s the tagline for LM Studio. It aims to run LLMs locally on almost any hardware, whether it’s Intel, Nvidia, or AMD (yes, AMD users like me are happy). Beyond just running LLMs locally, it has tons of features that you might want to explore to reach its full potential. And guess what? It’s absolutely free! 🤑 Download it today and start exploring the world of local LLMs.
WHAT is LM Studio?
LM Studio is like a magic wand 🪄 for deploying and running Large Language Models on your local machine. It is developed with love ❤️ (and a lot of coding) on top of the llama.cpp project (big shoutout to them! 🎉). The real draw is how simple it makes using local LLMs and researching with them: it supports multiple models straight from Hugging Face, and installation works out of the box with almost no setup effort.
WHY use LM Studio?
Using local LLMs is a piece of cake with LM Studio 🥧. Yes, you read that right: it supports multiple LLMs straight from Hugging Face. It’s also so easy to install that you’ll wonder why you weren’t using it sooner.
LM Studio’s interface is a dream come true for anyone interested in LLMs, regardless of their experience level in AI research. It’s simple, easy to use, and best of all, doesn’t require any coding!
Here are some features you might want to look out for:
- User-friendly interface — great for researchers at all levels, and a no-code solution for running LLMs.
- Extensive hardware support — Apple, Intel, Nvidia, and AMD are all welcome here (AMD is supported through ROCm).
- OpenAI-compatible local server — connect using the API, log the results, and even expose the server to devices on the same network. This is a good way to test LLMs.
- Automatic load sharing between GPU and CPU.
- Document chat supported through RAG (retrieval-augmented generation).
- Options to configure models before loading.
… and many more.
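To illustrate the local-server feature, here's a minimal sketch of calling it from Python. It assumes you've started the server from LM Studio's server tab with a model loaded, and that it listens on the default base URL (typically `http://localhost:1234/v1`); the model name `local-model` is just a placeholder for whatever model you have loaded.

```python
import json
import urllib.request

# Assumed default base URL for LM Studio's OpenAI-compatible server.
BASE_URL = "http://localhost:1234/v1"


def build_chat_request(model: str, user_message: str) -> dict:
    """Build an OpenAI-style chat-completions payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "temperature": 0.7,
    }


def chat(model: str, user_message: str) -> str:
    """POST the payload to the local server and return the reply text."""
    payload = build_chat_request(model, user_message)
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


if __name__ == "__main__":
    # Requires LM Studio's server to be running with a model loaded.
    print(chat("local-model", "Say hello in five words."))
```

Because the server speaks the OpenAI API shape, any OpenAI-compatible client library should also work by pointing its base URL at your local machine.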
Installation and Usage
Now that the pros of LM Studio are clear, let's move on to installation.
- Head over to lmstudio.ai and download the installer.
- Run the installer, and that's it. The application will launch straight away.
- By default, the CHAT menu will be open. Click Load Model in the center and you will be asked to select an LLM. Pick your model and start chatting.
- If you have any model previously loaded or installed, it will be shown in the popup list.
- If no LLM is available, head over to the Discover menu by selecting the second option in the left bar (the one with 5 icons).
- Using the top search bar, find any model from Hugging Face and install it. Then click USE IN CHAT in the model details panel on the right, and it will start loading the model for you.
Since LM Studio allocates resources automatically, you do not need to specify which device to use; it will pick the best available device for inference.
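Once the server is up, a quick way to confirm everything works is to ask it which models are loaded. The sketch below assumes the server exposes an OpenAI-style `/v1/models` endpoint on the default base URL (typically `http://localhost:1234/v1`); the parsing helper is separated out so it works on any response of that shape.

```python
import json
import urllib.request


def list_model_ids(models_response: dict) -> list:
    """Extract model ids from an OpenAI-style /v1/models response."""
    return [entry["id"] for entry in models_response.get("data", [])]


def fetch_loaded_models(base_url: str = "http://localhost:1234/v1") -> list:
    """Query the local server and return the ids of available models."""
    with urllib.request.urlopen(f"{base_url}/models") as resp:
        return list_model_ids(json.load(resp))


if __name__ == "__main__":
    # Requires LM Studio's server to be running.
    print(fetch_loaded_models())
```

If this prints an empty list, no model is loaded yet; load one from the Chat or Discover menu and try again.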
Final Thought
That wraps up this short tutorial: LM Studio is ready to use, and you can chat with your LLM in private 😏.
LM Studio is a game-changer for anyone interested in local LLMs. Its user-friendly interface, extensive hardware support, and easy installation process make it the go-to tool for AI research and development. Don’t miss out on revolutionizing your AI projects — download LM Studio today and explore the future of local LLMs!
Share Your Experience
If you found this guide helpful, consider sharing it with your network or leaving a comment below. Let’s spread the word about LM Studio and help others unlock the power of local LLMs! 🌟
Looking for a DirectML guide?
Want to build an MLOps pipeline?
Follow this playlist: List: MLOps Project Tutorial | Curated by Gurneet Singh | Medium
Alternative to LM Studio?
OLLAMA is here: OLLAMA — Your Local LLM Friend: Installation Tutorial 🖥️🚀 | by Gurneet Singh | Aug, 2024 | Medium
Hashtags:
#AIResearch, #MachineLearning, #LLMs, #TechTools, #ArtificialIntelligence