README updates, textbook polynomial cell, self-contained notebook
Same set of changes as che-computing-dev/LLMs:

- 03/04/05 READMEs: uv add workflow, required model caching
- 05-tool-use: add Setup section, requirements.txt
- 06-neural-networks: textbook cubic polynomial comparison cell
- 06-neural-networks: add nn_workshop_colab.ipynb (self-contained, inline data)
- vocab.md: catch up with terms from 02-05

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
This commit is contained in:
parent
a1f9d4d5ed
commit
f7d2b48f5a
7 changed files with 534 additions and 23 deletions
@@ -18,6 +18,42 @@ The LLM tools you use every day are not bare language models. They are agentic s
---
## Setup
This section uses the [Ollama Python library](https://github.com/ollama/ollama-python) to call local models programmatically.
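As a sketch of what calling a model programmatically looks like, the helper below builds the role/content message payload the library's chat API expects. The prompt is illustrative, and the live call is left commented out because it requires a running Ollama server:

```python
def make_messages(prompt: str) -> list[dict]:
    """Wrap a user prompt in the role/content chat format the Ollama API expects."""
    return [{"role": "user", "content": prompt}]

# With the server running (`ollama serve`) and the model pulled, the call is:
#   import ollama
#   resp = ollama.chat(model="llama3.1:8b", messages=make_messages("Say hello."))
#   print(resp["message"]["content"])
```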
### Install the Python library
A `requirements.txt` is provided. **If you are using `uv` for the workshop** (recommended):
```bash
cd /path/to/llm-workshop
uv add $(cat 05-tool-use/requirements.txt)
```
**With a plain venv:**
```bash
pip install -r requirements.txt
```
The only direct dependency is `ollama`. If you completed sections 03 or 04, this is likely already installed as a transitive dependency of `llama-index-llms-ollama`.
### Pull the model
Both scripts use `llama3.1:8b`:
```bash
ollama pull llama3.1:8b
```
You can substitute another tool-calling model by editing the scripts. See https://ollama.com/search?c=tool for the current list of models that support function calling. Smaller options like `llama3.2:3b` work for the simple examples; larger models tend to handle multi-step tool calls more reliably.
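For reference, tool definitions are passed to `ollama.chat()` as JSON-schema dicts (recent library versions also accept plain Python functions). The weather tool below is a made-up illustration, not part of this section's scripts:

```python
# Hypothetical tool definition in the JSON-schema shape the Ollama chat API accepts.
get_weather_tool = {
    "type": "function",
    "function": {
        "name": "get_current_weather",
        "description": "Return the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name, e.g. 'Boston'"},
            },
            "required": ["city"],
        },
    },
}

# Passed alongside the conversation:
#   ollama.chat(model="llama3.1:8b", messages=msgs, tools=[get_weather_tool])
# A tool-capable model that decides to invoke the function puts the call in
# response["message"]["tool_calls"] rather than in the text content.
```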
### A note on Exercise 8
The optional advanced exercise wires the RAG pipeline from sections 03-04 into a LlamaIndex `ReActAgent`. If you plan to attempt it, you'll need the LlamaIndex packages from those sections already installed and a working RAG store from `03-rag/`.
## 1. From LLM to agent: what changed?
### The early days