SLUUG Talk: Demystifying Large Language Models on Linux

I gave a talk for the St. Louis Unix Users Group (SLUUG) titled “Demystifying Large Language Models (LLMs) on Linux: From Theory to Application.” The goal was to walk through how LLMs actually work and how to run them locally on Linux.

I demoed two projects:

  1. A simple Colab notebook that uses plain Python to generate text with an n-gram model. The point was to illustrate the core idea behind language models and to show concretely where n-grams fall short, which motivates the transformer architecture.

  2. A project that uses Elasticsearch and LLMs to let users run natural-language search queries over databases.
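
The first demo's core idea can be sketched in a few lines of plain Python. This is a minimal bigram (n = 2) version, not the notebook's exact code: build a table of which word follows which, then generate by repeatedly sampling a follower of the last word. The fact that the model only ever sees one word of context is exactly the shortcoming that motivates transformers.

```python
import random
from collections import defaultdict

def build_bigram_model(text):
    """Map each word to the list of words observed following it.

    Repeats are kept, so sampling from the list reproduces the
    empirical next-word frequencies.
    """
    words = text.split()
    model = defaultdict(list)
    for prev, nxt in zip(words, words[1:]):
        model[prev].append(nxt)
    return model

def generate(model, start, length=10):
    """Generate text by repeatedly sampling a follower of the last word."""
    out = [start]
    for _ in range(length - 1):
        followers = model.get(out[-1])
        if not followers:  # dead end: the last word never had a successor
            break
        out.append(random.choice(followers))
    return " ".join(out)

# Toy corpus; a real demo would use a much larger text.
model = build_bigram_model("the cat sat on the mat the cat ran")
print(generate(model, "the", length=6))
```

Because the next word depends only on the single previous word, the output is locally plausible but drifts incoherently over longer spans; larger n helps a little but the table of contexts grows exponentially and most contexts are never seen in training.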
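
For the second demo, one common pattern (a sketch under assumptions, not the project's actual code) is to have the LLM translate the user's question into structured filters, then mechanically convert those filters into an Elasticsearch bool query. The `llm_extract_filters` stub and the field names (`status`, `assignee`, `created`) are hypothetical stand-ins.

```python
import json

def llm_extract_filters(question):
    """Hypothetical stand-in for an LLM call.

    In a real pipeline, the LLM would be prompted to return structured
    filters as JSON for the user's natural-language question.
    """
    # Hard-coded output an LLM might produce for:
    # "open tickets assigned to alice created after 2023"
    return json.dumps(
        {"status": "open", "assignee": "alice", "created_after": "2023-01-01"}
    )

def build_es_query(filters_json):
    """Turn the LLM's structured filters into an Elasticsearch bool query."""
    f = json.loads(filters_json)
    must = []
    if "status" in f:
        must.append({"term": {"status": f["status"]}})
    if "assignee" in f:
        must.append({"term": {"assignee": f["assignee"]}})
    if "created_after" in f:
        must.append({"range": {"created": {"gte": f["created_after"]}}})
    return {"query": {"bool": {"must": must}}}

query = build_es_query(llm_extract_filters("open tickets assigned to alice"))
print(json.dumps(query, indent=2))
```

Keeping the LLM's job limited to emitting structured filters, with code assembling the final query DSL, means a malformed model response can be rejected before it ever reaches the search cluster.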

The talk went well. Content and code are on GitHub.
