Webinars

Running LLMs locally: What? Why? How?

January 15, 2026

Are you ready to unlock a ChatGPT-like experience without internet access or data sharing? Running AI language models locally on your own computer is not only possible, it's easier than you might think. In this AI inhale, Juan will guide you through the ins and outs of running LLMs (Large Language Models) locally, using tools like LM Studio.

What You'll Learn:


  • The basics of running LLMs locally and why it’s beneficial for your workflow.

  • Tools like LM Studio that make it simple to operate AI models offline.

  • How to choose the right model size and quantization level for your hardware.

  • Practical tips on integrating local LLMs into your daily tasks, from content creation to problem-solving.
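One of the topics above, picking a model size and quantization level that fits your hardware, comes down to simple arithmetic: each parameter costs a fixed number of bits, plus some working overhead. The sketch below is a rough rule of thumb, not an exact formula; the 20% overhead factor is an assumption covering things like the KV cache and activations.

```python
def estimate_model_memory_gb(num_params_billion: float,
                             bits_per_weight: int,
                             overhead: float = 1.2) -> float:
    """Rough RAM/VRAM estimate for loading a quantized model.

    Memory ~= parameters * bytes per weight, inflated by an assumed
    ~20% overhead for the KV cache and activations.
    """
    bytes_per_weight = bits_per_weight / 8
    return num_params_billion * bytes_per_weight * overhead

# A 7B-parameter model at 4-bit quantization needs roughly 4.2 GB,
# while the same model at 8 bits needs about twice that.
print(round(estimate_model_memory_gb(7, 4), 1))  # → 4.2
print(round(estimate_model_memory_gb(7, 8), 1))  # → 8.4
```

Estimates like this explain why 4-bit quantized 7B models are a popular starting point: they fit comfortably in the 8 GB of RAM found on many laptops.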


Why Run LLMs Locally?

  • Data Privacy: Keep your data on your own device.

  • Flexibility: Work offline in any environment, such as on a plane.

  • Cost-Effectiveness: Utilize existing hardware without paying for external services.
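Tools like LM Studio can also serve a loaded model over an OpenAI-compatible HTTP API on your own machine (by default at http://localhost:1234/v1), so existing chat-completion code works unchanged. The sketch below is a minimal client using only the standard library; the default port and the `"local-model"` name are assumptions you should adjust to your setup.

```python
import json
import urllib.request

def build_chat_request(prompt: str, model: str = "local-model") -> dict:
    """Build an OpenAI-style chat-completion payload for a local server.

    The model name is a placeholder; local servers typically report the
    loaded model's actual identifier.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }

def ask_local_llm(prompt: str,
                  base_url: str = "http://localhost:1234/v1") -> str:
    """Send a prompt to a locally running server and return the reply text."""
    payload = json.dumps(build_chat_request(prompt)).encode()
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

if __name__ == "__main__":
    # Requires a local server to be running; nothing leaves your machine.
    print(ask_local_llm("Summarize why local LLMs help with data privacy."))
```

Because the request never leaves localhost, this keeps the privacy and cost benefits listed above while letting you reuse tooling written for hosted APIs.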