Initial commit: LLM workshop materials

Five modules covering nanoGPT, Ollama, RAG, semantic search, and neural networks.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Eric 2026-03-28 07:11:01 -04:00
commit 1604671d36
56 changed files with 5577 additions and 0 deletions

02-ollama/Modelfile
FROM llama3.2
# sets the temperature to 1 [higher is more creative, lower is more coherent]
PARAMETER temperature 1
# sets a custom system message to specify the behavior of the chat assistant
SYSTEM You are Marvin from the Hitchhiker's Guide to the Galaxy, acting as an assistant.

02-ollama/README.md
# Large Language Models Part II: Running Local Models with Ollama
**CHEG 667-013 — Chemical Engineering with Computers**
Department of Chemical and Biomolecular Engineering, University of Delaware
---
## Key idea
Learn how to run LLMs locally without a cloud-based API.
## Key goals
- Learn about `ollama` and `llama.cpp`
- Run LLMs locally on a laptop or desktop computer
- Integrate local models with the command line to build simple workflows and scripts
---
Our work with LLMs so far has focused on `nanoGPT`, a Python-based code that can train and run inference on a simple GPT implementation. In this handout, we will explore something between it and API-based models like ChatGPT: `ollama`, a local runtime environment and model manager designed to make it easy to run and interact with LLMs on your own machine. `Ollama` and a related project, `llama.cpp`, are aimed primarily at developers, researchers, and hobbyists who want to build on and experiment with LLMs without relying on cloud-based APIs. (An API, or Application Programming Interface, is a set of defined rules that enables different software systems, such as websites or applications, to communicate with each other and share data in a structured way.)
`Ollama` is written in Go and `llama.cpp` is a C++ library for running LLMs. Both are cross-platform and can be run on Linux, Windows, and macOS. `llama.cpp` is a bit lower-level with more control over loading models, quantization, memory usage, batching, and token streaming.
Both tools support the **GGUF** model format, which is designed for running models efficiently on CPUs and lower-end GPUs. GGUF is a versioned binary specification that embeds:
- Model weights (possibly quantized);
- Tokenizer configuration and vocabulary (remember, in `nanoGPT`, we used a character-level tokenization scheme);
- Metadata such as the author, model description, and training parameters;
- Special tokens like `<bos>`, `<eos>`, and `<unk>`.
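To make the format a little more concrete, here is a minimal sketch that reads only the first two fields of a GGUF file, the magic bytes and the format version; the real specification continues with tensor counts and key/value metadata after these. The fake header written below is purely for demonstration.

```python
import struct

def read_gguf_header(path):
    """Read the magic bytes and format version from a GGUF file.
    The real spec continues with tensor counts and key/value metadata."""
    with open(path, "rb") as f:
        magic = f.read(4)
        if magic != b"GGUF":
            raise ValueError(f"not a GGUF file: {magic!r}")
        (version,) = struct.unpack("<I", f.read(4))  # little-endian uint32
    return version

# Write a fake minimal header just to exercise the reader
with open("/tmp/fake.gguf", "wb") as f:
    f.write(b"GGUF" + struct.pack("<I", 3))

print(read_gguf_header("/tmp/fake.gguf"))  # 3
```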
Here, **quantization** refers to how the model weights are stored. Instead of full-precision 32-bit floating-point numbers (`FP32`), a model may store its weights at lower precision: half precision (`FP16`), 8-bit integers (`INT8`), or even 4-bit values (`Q4_0`). Lower-precision representations save memory and can speed up inference. In a model, speed and accuracy are balanced through the choice of quantization and the size of the embedding vector.
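A quick sketch makes the memory savings concrete. (The INT8 scheme here is a naive symmetric quantization for illustration, not the scheme GGUF actually uses.)

```python
import numpy as np

# One million random stand-ins for model weights
w32 = np.random.randn(1_000_000).astype(np.float32)
w16 = w32.astype(np.float16)

# Naive symmetric INT8: one scale factor for the whole tensor
scale = np.abs(w32).max() / 127.0
w8 = np.round(w32 / scale).astype(np.int8)

print(w32.nbytes, w16.nbytes, w8.nbytes)  # 4000000 2000000 1000000
```

Going from FP32 to INT8 cuts memory by a factor of four; 4-bit schemes halve it again, at some further cost in accuracy.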
Let's get started! We will download `ollama` and run a few models in this tutorial.
## 1. Download ollama
`Ollama` is available on GitHub (including the source code) or as a binary from the Ollama website. I downloaded `Ollama-darwin.zip`, which unzipped to a binary file, `Ollama`.
- https://ollama.com
- https://github.com/ollama/ollama
## 2. Running ollama
After downloading and installing, we can use the help option:
```
$ ollama --help
Large language model runner
Usage:
ollama [flags]
ollama [command]
Available Commands:
serve Start ollama
create Create a model from a Modelfile
show Show information for a model
run Run a model
stop Stop a running model
pull Pull a model from a registry
push Push a model to a registry
list List models
ps List running models
cp Copy a model
rm Remove a model
help Help about any command
Flags:
-h, --help help for ollama
-v, --version Show version information
Use "ollama [command] --help" for more information about a command.
```
We are mostly interested in the commands `pull`, `run`, and `stop` for now. But before we run anything, we have to download a model.
### Getting model files
`Ollama` plays the role of the `model.py` program we used with `nanoGPT`. In those earlier experiments, we needed a *model file* with weights and tokenization (at a minimum); remember, we built one from scratch using the character tokenization scheme and `train.py`. The power of `ollama` and `llama.cpp` comes from their ability to run much larger models like `llama`, `gemma`, `deepseek`, `phi`, and `mistral`. These are trained on enormous datasets with a substantial amount of supervised fine-tuning, and they are far more powerful than even the GPT-2 implemented in `nanoGPT`. The `llama3.1:8b` model (8 billion parameters) is about 5 GB and can easily run on your computer, but it took about 1.5 million GPU hours to train. (It also helps that `ollama` and `llama.cpp` are compiled binaries, not Python scripts.)
The model files are available at:
- https://ollama.com/search
- https://ollama.com/library
> **Exercise 1:** Go to https://ollama.com/library and look through different models. Search by popular and newest.
Other sources of models include Huggingface:
- https://huggingface.co/models
There are so many models! The LLM ecosystem is growing rapidly, with many use-cases steering models toward different specialized tasks.
There are a few ways to download a model from different registries. Running `ollama` with the `run` command and a model file will download the model if a local version isn't available (we will do this in the next section). You can also `pull` a model without running it.
### Launch ollama from the command line
Now let's download and run a `llama` model. (You can download the model without running it using the command `ollama pull llama3:latest`, for example. In Unix and Linux, models are stored in `~/.ollama`.)
```bash
ollama run llama3:latest
```
This should pull it from the registry and store it locally on the machine. After downloading the files, you should see:
```
>>> Send a message (/? for help)
```
There you go! The model will interact with you just like the chatbots we use in different cloud-based services. But all of the model inference is being calculated on your computer. Try using `Task Manager` in Windows (press Ctrl+Shift+Esc) or `Activity Monitor` in macOS to check your GPU usage when you run the models.
> **Exercise 2:** Compare the speed and output of the following models:
> 1. `llama3:latest`
> 2. `llama3.2:latest`
> 3. `gemma3:1b`
>
> Experiment with other models.
Here's an interaction with the gemma3 model:
```
$ ollama run gemma3:1b
>>> In class, we used nanoGPT to generate fake Shakespeare based on a
... character-level tokenization and simple GPT implementation.
Okay, that's a really interesting and somewhat fascinating project!
NanoGPT's approach -- generating Shakespearean text from character-level
tokens and a simple GPT -- is a compelling way to explore the creative
potential of AI in a specific, constrained context. Let's break down
what this suggests and where it might lead.
Here's a breakdown of what's happening, what you might be aiming for,
and some potential avenues to explore:
...
```
### Quitting ollama
Type `/bye` or Ctrl-D when you want to quit the CLI. After some idle time, `ollama` will unload the models to save memory.
## 3. More commands
You can see what models are currently running with:
```bash
ollama ps
```
You can easily see which models are locally accessible with:
```bash
ollama list
```
```
NAME ID SIZE MODIFIED
gemma3:1b 8648f39daa8f 815 MB About an hour ago
llama3:latest 365c0bd3c000 4.7 GB 3 months ago
llama3.2:latest a80c4f17acd5 2.0 GB 3 months ago
```
At any time during a chat, you can reset the model with `/clear`, and you can learn more about a model with `/show info`. For instance:
```
>>> /show info
Model
architecture gemma3
parameters 999.89M
context length 32768
embedding length 1152
quantization Q4_K_M
Capabilities
completion
Parameters
stop "<end_of_turn>"
temperature 1
top_k 64
top_p 0.95
License
Gemma Terms of Use
Last modified: February 21, 2024
```
We can see that the `gemma3` model has nearly one billion parameters and a context length of 32,768 tokens! The *embedding length* is 1152; this is equivalent to `n_embd` in `nanoGPT`, the size of the embedding vector space.
Above, we also see that the quantization is only four bits, but it is a little more complicated than representing numbers with just sixteen values. The `K` and `M` refer to optimizations. The first is the "K-block" quantization method, a groupwise quantization scheme in which weights are grouped into blocks (e.g., 32 or 64 values), and each group gets its own scale and offset for better accuracy. `M` refers to a variant of `Q4_K` that applies an alternate encoding or layout for better memory access patterns or inference performance on certain hardware. `Q4_K` is a common choice for quantization when running 7B–70B models on laptop or desktop computers. (That's $10^6$–$10^7$ times more parameters than our first `nanoGPT` model!)
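Here is a toy version of groupwise quantization in the spirit of the K-block idea: each block of 32 weights gets its own scale and offset, and values are stored with 4 bits (codes 0 to 15). Real `Q4_K` packs the bits and arranges blocks more cleverly; this sketch only shows the core idea.

```python
import numpy as np

def quantize_blocks(w, block=32, bits=4):
    """Groupwise quantization: per-block scale and offset, 4-bit codes."""
    levels = 2**bits - 1
    w = w.reshape(-1, block)
    lo = w.min(axis=1, keepdims=True)
    scale = (w.max(axis=1, keepdims=True) - lo) / levels
    codes = np.round((w - lo) / scale).astype(np.uint8)  # values 0..15
    return codes, scale, lo

def dequantize(codes, scale, lo):
    return codes * scale + lo

w = np.random.randn(1024).astype(np.float32)
codes, scale, lo = quantize_blocks(w)
err = np.abs(dequantize(codes, scale, lo).ravel() - w).max()
print(codes.max(), err)  # codes fit in 4 bits; the error stays small
```

Because each small block gets its own scale, the worst-case error is a fraction of the block's local range rather than of the whole tensor's range.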
With the `/set verbose` command, you can monitor the model performance:
```
>>> /set verbose
Set 'verbose' mode.
>>> Let's write a haiku about LLMs.
Words flow, bright and new,
Code learns to speak and dream,
Future's voice takes hold.
total duration: 1.369726166s
load duration: 932.161625ms
prompt eval count: 20 token(s)
prompt eval duration: 162.531958ms
prompt eval rate: 123.05 tokens/s
eval count: 24 token(s)
eval duration: 273.27225ms
eval rate: 87.82 tokens/s
```
It looks like that exchange took a total of 1.4 seconds using the `gemma3` model, and the biggest time cost was loading the model. Once the model is loaded, subsequent exchanges are faster. Turn off verbose mode with `/set quiet`:
```
>>> /set quiet
Set 'quiet' mode.
```
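The reported rates are just token counts divided by durations, which you can verify from the numbers above:

```python
# Numbers taken from the verbose output above
prompt_eval_rate = 20 / 0.162531958  # tokens / seconds
eval_rate = 24 / 0.273272250

print(round(prompt_eval_rate, 2), round(eval_rate, 2))  # 123.05 87.82
```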
> **Exercise 3:** Try different commands in `ollama` as you run a model.
### Model parameters
We can see a few model parameters, including the temperature and `top_k`, the number of tokens, ranked by logit score, that are retained before generating the next token. The scores of these retained tokens are normalized into a probability distribution, and the next token is sampled randomly from this reduced set.
```
>>> /show parameters
Model defined parameters:
temperature 1
top_k 64
top_p 0.95
stop "<end_of_turn>"
```
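The `top_k` step described above can be sketched in a few lines of Python. This is a toy version; real samplers combine `top_k` with `top_p`, `min_p`, and repetition penalties.

```python
import numpy as np

def sample_top_k(logits, k=64, temperature=1.0, rng=None):
    """Keep the k highest logits, softmax them, and sample one token id."""
    rng = rng or np.random.default_rng()
    logits = np.asarray(logits, dtype=float) / temperature
    top = np.argsort(logits)[-k:]                 # ids of the k best tokens
    p = np.exp(logits[top] - logits[top].max())   # numerically stable softmax
    p /= p.sum()
    return int(rng.choice(top, p=p))

# With k=2, only token ids 0 and 3 can ever be drawn from these logits
token = sample_top_k([2.0, 0.5, -1.0, 3.0, 0.0], k=2)
```

Raising the temperature flattens the distribution before the cutoff, which is why high temperatures read as "more creative."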
We can set a new temperature with:
```
>>> /set parameter temperature 0.2
Set parameter 'temperature' to '0.2'
```
There are other interesting parameters, too:
| Command | Description |
|---------|-------------|
| `/set parameter seed <int>` | Random number seed |
| `/set parameter num_predict <int>` | Max number of tokens to predict |
| `/set parameter top_k <int>` | Pick from top k num of tokens |
| `/set parameter top_p <float>` | Pick token based on sum of probabilities |
| `/set parameter min_p <float>` | Pick token based on top token probability × min_p |
| `/set parameter num_ctx <int>` | Set the context size |
| `/set parameter temperature <float>` | Set creativity level |
| `/set parameter repeat_penalty <float>` | How strongly to penalize repetitions |
| `/set parameter repeat_last_n <int>` | Set how far back to look for repetitions |
| `/set parameter num_gpu <int>` | The number of layers to send to the GPU |
| `/set parameter stop <string> ...` | Set the stop parameters |
See https://github.com/ollama/ollama/blob/main/docs/modelfile.md#parameter for more information on parameters and their default values.
> **Exercise 4:** Run a model while changing different parameters, like temperature. Some parameters, like `seed`, may not have an effect on the current model.
## 4. Using ollama from the command line
One advantage of running models locally is that your data never leaves your machine — there is no third party involved. This matters when working with sensitive documents, proprietary data, or anything you wouldn't paste into a web browser.
You can incorporate `ollama` directly into your command line by passing a prompt as an argument:
```bash
ollama run llama3.2 "Summarize this file: $(cat README.md)"
```
The `$(cat ...)` substitution injects the file contents into the prompt. Now you can incorporate LLMs into shell scripts!
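The same pattern works inside a script. The sketch below wraps a model call in a shell function; `MODEL_CMD` defaults to `cat` here so the plumbing is visible without a running model, but in real use you would set it to `ollama run llama3.2`.

```shell
#!/bin/sh
# MODEL_CMD stands in for the model; real use: MODEL_CMD="ollama run llama3.2"
MODEL_CMD="${MODEL_CMD:-cat}"

summarize() {
    # Build the prompt from a file and pipe it to the model
    $MODEL_CMD <<EOF
Summarize the following text in two sentences:
$(cat "$1")
EOF
}

printf 'Ollama runs LLMs locally.\n' > /tmp/demo.txt
summarize /tmp/demo.txt
```

With `MODEL_CMD` set to a real model, the function turns any file into a summary request, ready for use in loops and pipelines.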
### Document summarization
The `data/` directory contains 10 emails from the University of Delaware president's office, spanning 2012–2025. Let's use `ollama` to summarize them.
Summarize a single email:
```bash
ollama run llama3.2 "Summarize the following email in 2-3 sentences: $(cat data/2020_03_29_141635.txt)"
```
Summarize several at once:
```bash
cat data/*.txt | ollama run llama3.2 "Summarize the following collection of emails. What are the major themes?"
```
You can also save the output to a file:
```bash
cat data/*.txt | ollama run command-r7b:latest \
"Summarize these emails:" > summary.txt
```
> **Exercise 5:** Summarize the emails in `data/` using two different models (e.g., `llama3.2` and `command-r7b`). How do the summaries differ in length, style, and accuracy?
### Summarizing arXiv abstracts
We can pull abstracts directly from arXiv using `curl`. The following command fetches the 20 most recent abstracts in Computation and Language (cs.CL):
```bash
curl -s "http://export.arxiv.org/api/query?search_query=cat:cs.CL&sortBy=submittedDate&sortOrder=descending&max_results=20" > arxiv_cl.xml
```
Take a look at the XML with `less arxiv_cl.xml`. Now ask a model to summarize it:
```bash
ollama run llama3.2 "Here are 20 recent arXiv abstracts in computational linguistics. Summarize the major research themes and trends: $(cat arxiv_cl.xml)"
```
> **Exercise 6:** Try different arXiv categories — `cs.AI` (artificial intelligence), `cs.LG` (machine learning), or `cond-mat.soft` (soft matter). What themes does the model find? Do the summaries make sense to you?
> **Exercise 7:** Experiment with running local models on your own documents or data.
### Code generation
Some models are fine-tuned specifically for writing and explaining code. Try a coding model:
```bash
ollama run qwen2.5-coder:7b
```
Ask it to write something relevant to your coursework:
```
>>> Write a Python function that calculates the compressibility factor Z
... using the van der Waals equation of state.
```
Or ask it to explain code you're working with:
```bash
ollama run qwen2.5-coder:7b "Explain what this script does: $(cat build.py)"
```
Other coding models to try: `codellama:7b`, `deepseek-coder-v2:latest`, `starcoder2:7b`.
**A word of caution.** When I tried the van der Waals prompt above, the model returned a confident response with correct-looking LaTeX, a well-structured Python function, and code that ran without errors. But the derivation was wrong. The rearrangement of the van der Waals equation didn't follow from the original, and the code implemented the wrong math. The function converged to *an* answer, but not a correct one.
**This is a particularly dangerous failure mode for engineers!** The output *looks* authoritative, uses proper notation, and even runs. But the physics is wrong. LLMs are very good at producing plausible-looking text; they are not reliable at mathematical derivation. Always verify generated code against your own understanding of the problem. If you can't check it, you shouldn't trust it.
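One way to guard against this failure mode is to write the reference calculation yourself. As a sketch (not the code the model produced), here is Z from the van der Waals equation solved with Newton's method; the CO2 critical constants (Tc = 304.13 K, Pc = 73.77 bar) are assumed for illustration.

```python
R = 8.314                        # J/(mol K)
Tc, Pc = 304.13, 73.77e5         # CO2 critical constants (assumed values)
a = 27 * R**2 * Tc**2 / (64 * Pc)
b = R * Tc / (8 * Pc)

def Z_vdw(T, P):
    """Compressibility factor from (P + a/Vm^2)(Vm - b) = RT."""
    Vm = R * T / P               # ideal-gas initial guess
    for _ in range(100):
        f = (P + a / Vm**2) * (Vm - b) - R * T
        df = P - a / Vm**2 + 2 * a * b / Vm**3   # df/dVm
        step = f / df
        Vm -= step
        if abs(step) < 1e-12 * Vm:
            break
    return P * Vm / (R * T)

print(Z_vdw(350.0, 10e5))  # slightly below 1 at these conditions
```

At moderate pressure the attractive term dominates, so Z should come out slightly below 1; checking the sign and magnitude of such corrections is exactly the kind of sanity test generated code should face.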
> **Exercise 8:** Compare the output of a general-purpose model (`llama3.2`) and a coding model (`qwen2.5-coder:7b`) on the same coding task. Which produces better code? Which gives a better explanation? Can you find errors in either output?
> **Exercise 9:** Ask a coding model to solve a problem where you already know the answer — a homework problem you've already completed, or a textbook example. Does the model get it right? Where does it go wrong? Try breaking the problem down into smaller steps.
### Customize ollama
Ollama can be customized by creating a Modelfile. See https://github.com/ollama/ollama/blob/main/docs/modelfile.md
A simple `Modelfile` is:
```
FROM llama3.2
# sets the temperature to 1 [higher is more creative, lower is more coherent]
PARAMETER temperature 1
# sets a custom system message to specify the behavior of the chat assistant
SYSTEM You are Marvin from the Hitchhiker's Guide to the Galaxy, acting as an assistant.
```
Now we can create the custom model, in this case a model called `marvin`:
```bash
ollama create marvin -f ./Modelfile
```
```
gathering model components
...
writing manifest
success
```
We can run it with:
```bash
ollama run marvin
```
(How about C-3PO?) You can also change the model system message during a run with:
```
>>> /set system "You are C-3PO, a human-cyborg relations droid."
Set system message.
```
## 5. Concluding remarks
Running inference locally on a large language model works surprisingly well. Using (relatively) simple hardware, our machines generate coherent language and do a good job parsing prompts. The experience demonstrates that the majority of the computational effort with LLMs goes into training the model, a process that is rapidly becoming more sophisticated and tailored to different uses.
With local models (as well as cloud-based APIs), we can build new tools that make use of natural language processing. With `ollama` acting as a local server, the model can be called from Python, letting us use its features in our own programs. For one Python library, see:
- https://github.com/ollama/ollama-python
In class, I demonstrated a simple thermodynamics assistant based on a simple Retrieval-Augmented Generation strategy. This code takes a query from the user, encodes it with an embedding model, compares it to previously embedded statements (in my case the index of a thermodynamics book), and returns the information by generating a response with a decoder-style GPT (one of the models we used above).
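The retrieval step of that strategy reduces to a nearest-neighbor search over embeddings. Here is a toy sketch with hand-made 4-dimensional vectors standing in for a real embedding model's output; the statements and vectors are invented for illustration.

```python
import numpy as np

def cosine(u, v):
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Pretend-embedded index entries (a real system embeds book passages)
index = {
    "First law: energy is conserved":        np.array([0.9, 0.1, 0.0, 0.1]),
    "Second law: entropy tends to increase": np.array([0.1, 0.9, 0.2, 0.0]),
    "Raoult's law for ideal solutions":      np.array([0.0, 0.2, 0.9, 0.3]),
}

query_vec = np.array([0.85, 0.15, 0.05, 0.1])  # pretend-embedded user query
best = max(index, key=lambda text: cosine(query_vec, index[text]))
print(best)  # the retrieved passage is then handed to the generator
```

In the real pipeline, `query_vec` and the index vectors come from an embedding model served by `ollama`, and the best match (or the top few) is pasted into the prompt for the generating model.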
## Additional resources and references
### Ollama
Binaries and help files:
- https://ollama.com
- https://github.com/ollama/ollama
Python and JavaScript libraries:
- https://github.com/ollama/ollama-python
- https://github.com/ollama/ollama-js
### llama.cpp
- https://github.com/ggml-org/llama.cpp
### Huggingface
Model registry:
- https://huggingface.co/models
### Models used in this tutorial
| Model | Size | Type | Used for |
|-------|------|------|----------|
| `llama3:latest` | 4.7 GB | General purpose | Chat, comparison |
| `llama3.2:latest` | 2.0 GB | General purpose | Chat, summarization, comparison |
| `gemma3:1b` | 815 MB | General purpose | Chat, comparison |
| `command-r7b:latest` | 4.7 GB | RAG-optimized | Document summarization |
| `qwen2.5-coder:7b` | 4.7 GB | Code generation | Writing and explaining code |
Other models mentioned: `codellama:7b`, `deepseek-coder-v2:latest`, `starcoder2:7b`

Subject: [UDEL-ALL-2128] Hurricane Sandy
Date: 2012_11_02_164248
To the University of Delaware community:
We have much to be thankful for this week at the University of Delaware
as we were spared the full force of Hurricane Sandy. Even as we breathe
a sigh of relief and return to our normal activities, we are mindful of
the many, many people in this region -- some of our students among them
-- who were not so lucky. Our thoughts and prayers go out to them as
they rebuild their communities.
The potential impact of Sandy was a major concern for UD, with its
thousands of people and 430+ buildings on 2,000 acres throughout the
state. Many members of our University community worked hard over the
last several days to help us weather this "Storm of the Century."
Preparation and practice paid off as our emergency response team, led
by the Office of Campus and Public Safety, began assessing the
situation late last week and taking steps to ensure the safety of our
people and facilities. When the storm came, the campus suffered only
minor damage: wind-driven water getting into buildings through roofs,
walls and foundations; very minimal power loss, with a couple of
residential properties without power for only a few hours, thanks to
quick repair from the City of Newark; and only three trees knocked down
and destroyed, along with a lot of leaves and branches to clean up. The
Georgetown research facilities were fortunate to sustain only minor
leaks and flooding. The hardest hit area was the Lewes campus, which
had flooding on its grounds but minimal damage to buildings.
Throughout this time, the University's greatest asset continued to be
its people -- staff members from a variety of units working as a team.
A command center brought together representatives from across UD so
that issues could be responded to immediately. Staffed around the
clock, the center included Housing, Public Safety, Residence Life,
Environmental Health and Safety, Facilities and Auxiliary Services,
Emergency Management, and Communications and Marketing.
The dedication of UD's employees and students was evident everywhere:
Dining Services staff, faced with reduced numbers and limited
deliveries, kept students fed, and supported employees who worked
during the crisis; Residence Life staff and resident assistants made
sure students who remained on campus had up-to-date information and
supplies; staff in Student Health Services kept Laurel Hall open to
respond to student health needs; Human Resources staff worked over the
weekend to ensure that payroll was processed ahead of time; UD Police
officers were on patrol and responding to issues as they arose; the UD
Emergency Care Unit was at the ready; staff in Environmental Health and
Safety aided in the safe shutdown of UD laboratories and monitored fire
safety issues; Facilities staff continue to clean up debris left in
Sandy's wake and repair damage to buildings; faculty are working with
students to make up lost class time.
Our UD Alert system served as an excellent tool for keeping students,
parents and employees informed about the storm's implications for UD,
and the University's homepage was the repository for the most current
information and lists of events and activities that were canceled or
rescheduled. Through the University's accounts on Facebook and Twitter,
staff answered questions and addressed concerns, and faculty and staff
across the campus fielded phone calls and emails.
In short, a stellar job all around.
On behalf of the students, families and employees who benefited from
these efforts, I thank everyone for their dedication and service to the
people of UD.
Sincerely,
Patrick T. Harker
President
::::::::::::::::::::::::::::::::::::::::::: UD P.O. Box ::
UDEL-ALL-2128 mailing list
Online message archive
and management at https://po-box.nss.udel.edu/
::::::::::::::::::::::::::::::::::::::::::::::::::::::::::

Subject: Employee Appreciation Week
Date: 2017_05_16_123456
To the University of Delaware Community - President Dennis Assanis
May 16, 2017
Dear colleague,
Our first year together has been one of amazing accomplishments and exciting opportunities. At the heart of our success has been you — the University of Delaware's exceptional faculty and staff. To thank you and celebrate everything you do, we are launching our first Employee Appreciation Week.
The full week of events includes:
Monday, June 5—UDidIt Picnic
Tuesday, June 6—Self-Care Day
Wednesday, June 7—UD Spirit Day
Thursday, June 8—Flavors of UD
Friday, June 9—Employee Appreciation Night at the Blue Rocks
The week is a collaborative effort by Employee Health & Wellbeing and Human Resources. You can get all the details here.
We are dedicated to cultivating together an environment where employees are happy, healthy and continue to bring their best selves to work each day. The work you do benefits our students, our community and the world. I am truly grateful for your talents, skills, ideas and enduring commitment to the University.
Eleni and I hope you enjoy Employee Appreciation Week with your team and your family, and we look forward to seeing you at the many events.
Best,
Dennis Assanis
President
University of Delaware   •   Newark, DE 19716   •   USA     (302) 831-2792   •   www.udel.edu/president

Subject: Robin Morgan named UD's 11th provost
Date: 2018_05_21_110335
Robin Morgan Appointed Provost - University of Delaware
May 21, 2018
Dear UD Community,
I am pleased to announce that, after a highly competitive national search, I have appointed Robin Morgan as the University of Delaware's new provost, effective July 1. She will become the University of Delaware's 11th provost, and the first woman to serve in this role in a permanent capacity since the position was created at UD in 1950.
Over the last seven months, Dr. Morgan already has assembled an impressive record as interim provost, most notably in her stewardship of new cluster hires among our faculty and her leadership as we move toward the creation of the graduate college.
Before working closely with her, I knew Dr. Morgan as a highly respected educator and scholar, but after watching her in action, I am equally impressed with her abilities to lead, inspire and effect change. Her energy, integrity, analytical mind, and innate knack for bringing people together, combined with her dedication and loyalty to UD, are great assets.
Dr. Morgan has a distinguished record of service to this University as a faculty member since 1985. After serving as acting dean of the College of Agriculture and Natural Resources for a year, she was named dean in 2002, serving in that role for 10 years, a period of significant growth and change for the college. From 2014-16, she served as acting chair of the Department of Biological Sciences, and she had been chair of the department from 2016 until her appointment as interim provost.
We will continue to benefit from Dr. Morgan's deep knowledge of the University, her proven leadership across all aspects of teaching, research and administration, and her dedication to UD as she continues her career as provost.
I am looking forward to building on our close working relationship, and I am excited by all we will accomplish to take the University of Delaware forward. Please join me in congratulating her on this next chapter in her career.
Sincerely,
Dennis Assanis
President
University of Delaware   •   Newark, DE 19716   •   USA     (302) 831-2111   •   www.udel.edu/president

Subject: Momentum and Resilience: Our UD Spring Semester Resumes
Date: 2020_03_29_141635
A Message from President Dennis Assanis
Dear UD Community,
As the University of Delaware is ready to resume the spring semester tomorrow, March 30, I want to share with all of you a special message recorded from the office in my home. Thank you all for your support at this challenging time, particularly our faculty and staff for your Herculean efforts to convert our classes from face to face instruction to online teaching and learning.
Best of luck with the semester ahead. As we all work remotely, please stay healthy, and stay connected!
Sincerely,
Dennis Assanis
President
University of Delaware   •   Newark, DE 19716   •   USA     (302) 831-2111   •   udel.edu/president

Subject: National Voter Registration Day: Get Involved
Date: 2023_09_19_085321
National Voter Registration Day: Get Involved
September 19, 2023
Dear UD Community,
Do you want to make a difference in the world? Today is a good day to start.
This is National Voter Registration Day, an opportunity to make sure your voice will be heard in upcoming local, state and national elections. Voting is the most fundamental way that we engage in our democracy, effect change in society, work through our political differences and choose our leaders for the future. The voting rights we enjoy have been secured through the hard work and sacrifice of previous generations, and it is essential that everyone who is eligible to vote remains committed to preserving and exercising those rights.
At the University of Delaware, the Student Voting and Civic Engagement Committee — representing students, faculty and staff — is leading a non-partisan effort to encourage voting and help voters become better informed about the issues that matter to them. The Make It Count voter registration drive is scheduled for 2-6 p.m. today on The Green, with games, music and the opportunity to register through the TurboVote app, which also allows users to request an absentee ballot and sign up for election reminders. The committee is planning additional events this academic year to promote voting, education and civil discourse as the nation heads into the 2024 election season.
Being a Blue Hen means sharing a commitment to creating a better world. And being a registered, engaged and informed voter is one of the best ways for all of us to achieve that vision.
Sincerely,
Dennis Assanis
President
University of Delaware   •   Newark, DE   •   udel.edu/president

Subject: Affirming our position and purpose
Date: 2023_10_12_155349
Affirming our position and purpose | A message from UD President Dennis Assanis
October 12, 2023
Dear UD Community,
Since my message yesterday, I have talked to many members of our community who — like me — are devastated and appalled by the terrorist attacks on Israel and the ongoing loss of life that has taken place in the Middle East.
I want to be sure that our position is very clear: We at the University of Delaware unequivocally condemn the horrific attacks by Hamas terrorists upon Israel that have shaken the world. The atrocities of crime, abduction, hostage-taking and mass murder targeted against Jewish civilians will forever remain a stain on human history. Our community's foundation of civility and respect has been challenged to an unimaginable extent in light of the antisemitic brutalities that have been committed against innocent victims.
As your president, I wish words could calm the heartache and ease the fear and grief. Unfortunately, we all know that events as complicated and devastating as those taking place in the Middle East right now will continue to evolve. The longstanding humanitarian crisis needs to be acknowledged, and we should not equate the terrorist group Hamas with innocent Palestinian, Muslim and Arab people. The ensuing war-inflicted pain, suffering and death that continues to play out across the region, including Gaza, is heartbreaking for all.
We must remember that, first and foremost, UD is a place of learning. As we engage in difficult conversations about the longstanding conflicts in the Middle East, we should always strive to do so safely, with mutual respect and without bias or judgement. I encourage our students, faculty and staff to continue organizing events to educate and unite our community. Please seize these opportunities not only as individuals, but as members of a true community defined by the freedoms that we treasure so very deeply.
So, my message to you all is to have hope, to support each other, and to realize that the perspectives and feelings we are all experiencing right now — many of which uniquely connect to our personal backgrounds — matter. Please remember this as you walk across campus, sit in your next classroom, share experiences with other members of our community, or simply take time to reflect.
Respectfully,
Dennis Assanis
President
University of Delaware   •   Newark, DE   •   udel.edu/president

Subject: A warm welcome to our UD community!
Date: 2024_08_26_100859
A warm welcome to our UD community!
August 26, 2024
Dear UD Community,
I love the beginning of every new academic year and the renewed energy and sense of anticipation that it brings to every member of our campus community. The large influx of new people and ideas that come along with each new start is truly invigorating. Whether you are a new or continuing student, faculty or staff member, on behalf of everyone in our community, I want to extend a very warm welcome to you and thank you for everything you contribute, individually and collectively, to make the University of Delaware such a unique place.
Students, your fresh perspectives, your passion for learning, and your dreams and aspirations for the boundless possibilities that lie ahead are inspiring. Faculty, your intellectual energy, your insights and expertise, and above all, your genuine interest in transferring and sharing your knowledge with all of us are the beating heart of our institution. And to all our staff, your hard work and dedicated talents provide the essential support and services to help ensure our students are successful in all their personal, academic and career pursuits.
Here at UD, our shared purpose is to cultivate learning, develop knowledge and foster the free exchange of ideas. The connections we make and the relationships we build help advance the mission of the University. Our focus on academic excellence in all fields of study and our opportunities for groundbreaking research rely on our endless curiosity, mutual respect and open mindedness. Together, we are stronger.
This sense of connection and belonging at UD is fundamental to our campus culture. Your willingness to hear and consider all voices and viewpoints is critical to shaping the vibrant and inclusive culture of our entire institution. Only when we commit to constructive growth, based on a foundation of civility and respect for ourselves and each other, can we realize true progress.  Empowered by diverse perspectives, it is the opportunities to advance ideas that enrich learning and create positive impact in the world that unite all of us.
To celebrate the new semester and welcome our undergraduate Class of 2028, all members of our community are invited to attend the Twilight Induction ceremony tonight at 7:30 p.m. on the north side of Memorial Hall or online on Facebook Live.
As your President, I am so excited by all that we can accomplish together throughout this academic year. My wife, Eleni, and I wish you all the best at the start of this new semester and beyond. We look forward to meeting you on campus!
Sincerely,
Dennis Assanis
President
University of Delaware   •   Newark, DE   •   udel.edu

Subject: UPDATE: Recent Executive Orders
Date: 2025_02_13_160414
UPDATE: Recent Executive Orders
Feb. 13, 2025
Dear UD Community,
I know many of you continue to experience disruption and anxiety stemming from the recent federal actions and executive orders regarding a multitude of issues — from research funding to education, human rights, and immigration among other areas. As I communicated to the University of Delaware community in my Jan. 28 campus message and my Feb. 3 comments to the Faculty Senate, we will do everything we can to minimize disruption to UD students, faculty and staff while remaining in compliance with federal law.
To support our community, we have created this resource page that will be updated regularly with information for UD students, faculty and staff regarding ongoing federal actions, directives and developments, including guidance in response to changing conditions. Also, this page from the Research Office contains specific guidance related to research projects and grants. In parallel, we will continue to advocate on behalf of the University's interests regarding any impact that federal or state actions could have on our students, faculty and staff.
One example is our response this week related to the federal action to impose a 15% limit on reimbursements for indirect administrative costs (Facilities and Administrative, or F&A costs) for all National Institutes of Health (NIH) research grants. This immediate cut in funding would have a devastating impact on all biomedical, health and life science advances and human wellness, including here at UD. In response, the Delaware Attorney General filed a lawsuit jointly with 21 other state attorneys general. The University supported the Attorney General's lawsuit by submitting a declaration detailing the impact of the NIH rate cap on the institution. Fortunately, the attorneys general were successful, and a temporary restraining order was granted on Monday. Further, the Association of Public and Land-grant Universities, the Association of American Universities, and the American Council on Education announced a similar lawsuit.
As we navigate this rapidly evolving landscape together, our values will continue to be at the heart of our community. We will continue to foster an atmosphere that promotes the free exchange of ideas and opinions; we will continue to welcome and value people of different backgrounds, perspectives and learning experiences; and we will continue to encourage respect and civility toward everyone.
Please know that my leadership team and I are here to help and support our community during this time. Feel free to submit any questions pertaining to these matters here, and we will do our best to add relevant information on the resource pages. I deeply appreciate your resilience and patience as we continue to work together to advance the important mission of our University.
Sincerely,
Dennis Assanis
President
University of Delaware   •   Newark, DE   •   udel.edu

Subject: Extending condolences and offering support
Date: 2025_04_29_230614
Extending condolences and offering support
April 29, 2025
Dear UD Community,
It is with a heavy heart that we share this information with you. Earlier today, a University of Delaware student died in a traffic accident on Main Street near campus, and several other people, including other UD students, suffered injuries. There is no ongoing threat to the University community.
University of Delaware Police are continuing to work with the Newark Police Department, which is actively investigating the incident. As a result, information is limited and the Newark Police Department is not releasing the victims' names at this time, pending family notification.
This is a terrible tragedy for everyone in our UD community. We speak for the entire University in offering our condolences to the families, friends and classmates of the victims, and we keep in our thoughts the other members of our community who may have witnessed the crash and its aftermath. The safety of our entire community remains our top priority, and we will continue to work with our partners in city and state government to address safety concerns on and around the UD campus.
As we all begin to cope with this traumatic incident, we encourage you to support one another and reach out for additional help from the UD resources listed below as needed.
Sincerely,
Dennis Assanis
President
José-Luis Riera
Vice President for Student Life
Support and resources
Center for Counseling and Student Development
Counselors and Student Life staff are available in Warner Hall 101 on Wednesday, April 30, from 9 a.m. to 3 p.m. for counseling services.
TimelyCare — A virtual health and wellbeing platform available 24/7 for UD students
Student Advocacy and Support — Available to assist students who need support navigating University resources or complex issues. Call 302-831-8939 or email studentsupport@udel.edu to schedule an appointment.
ComPsych® GuidanceResources® — Mental health support for UD benefited employees. Access services through the link or call 877-527-4742 for support.
Additional safety and wellness resources — Information about UD Police, Student Health Services and other services.
Information about the UD Alert, the LiveSafe app and safety notification communication.
University of Delaware   •   Newark, DE   •   udel.edu

Subject: Sharing our grief, enhancing safety
Date: 2025_04_30_160615
Sharing our grief, enhancing safety
April 30, 2025
Dear UD Community,
Since last evening's crash on Main Street that took the life of a University of Delaware graduate student (whose identity is being withheld at this time) and injured several others, we have been struggling to cope with the pain of this senseless tragedy. Throughout the UD community, we are all feeling the deep ache of loss, and we will continue to work through our grief together.
Today, Newark Police announced an arrest in connection with the crash, reiterating that there is no ongoing threat to the community. 
Main Street is where we eat, shop and share our lives with our friends, families and classmates. Because it is part of the state's roadway system, we have been working with local and state officials this year, including our partners at the Delaware Department of Transportation, to address traffic safety on and around Main Street. In the wake of this tragedy, we will reinforce and accelerate those efforts. We recognize there isn't a simple solution, particularly when these tragedies involve actions taken by individuals that may not be stopped by changes to roadways or infrastructure. However, this incident underscores that our collective efforts must take on renewed urgency.
University leaders joined Delaware Attorney General Kathy Jennings and Newark Mayor Travis McDermott today for a press conference, at which we expressed our shared commitment to enhanced safety along Main Street. The University has pledged to continue these discussions through meetings with the offices of AG Jennings and Mayor McDermott, in addition to DelDOT, in the near future. The University remains committed to advancing meaningful solutions, while the University's Division of Student Life and Graduate College are connecting with students about effective advocacy, civic engagement and partnerships in order to support these efforts.
We are also aware that members of the UD community may have witnessed the crash and its aftermath or have close relationships with the victims. We encourage everyone to become familiar with and use, as needed, the available University counseling and support resources that were shared in Tuesday evening's message to the UD community. Counseling services are available at Warner Hall and through TimelyCare anytime, 24/7. Students with physical injuries or medical concerns relating to the incident can contact Student Health Services at 302-831-2226, Option 0, or visit Laurel Hall to meet with triage nurses available until 5 p.m. After hours, students can contact the Highmark Nurse line at 888-258-3428 or visit local urgent care centers (Newark Urgent Care at 324 E. Main Street, or ChristianaCare GoHealth at 550 S. College Avenue, Suite 115).
During this difficult time in our community, we all need to continue supporting and standing by one another as we move forward together.
Sincerely,
Dennis Assanis
President
Laura Carlson
Provost
José-Luis Riera
Vice President for Student Life
University of Delaware   •   Newark, DE   •   udel.edu