Bible_Project
A small utility that fetches a Bible verse each hour, uses an Ollama LLM to generate a reflection question, saves the verse and question to ~/bible_verse.txt, and sends a desktop notification.
- app.py — main Python script
- requirements.txt — Python dependencies
- bible_verse.txt — output file written to the home directory (created by the script)
Python dependencies
- Install Python packages from requirements.txt (recommended inside a virtualenv):
python3 -m venv venv
source venv/bin/activate
pip install --upgrade pip
pip install -r requirements.txt
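Note: the HTTP sketches later in this README assume the requests library; requirements.txt presumably lists it among the dependencies.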
System dependencies
- notify-send (usually provided by libnotify-bin on Debian/Ubuntu) — used to send desktop notifications.
- Ollama model server running and reachable at http://localhost:11434/api/generate (configurable in app.py).
On Debian/Ubuntu you can install notify-send with:
sudo apt update
sudo apt install libnotify-bin
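For context, sending the notification from Python typically amounts to a subprocess call to notify-send. A minimal sketch (the notify helper name is illustrative, not necessarily what app.py defines):

import subprocess

def notify(title: str, body: str) -> None:
    # notify-send fails if no notification daemon is running;
    # check=False keeps the script alive in that case.
    subprocess.run(["notify-send", title, body], check=False)

notify("Bible Verse", "New verse saved to ~/bible_verse.txt")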
Configuration
- Edit app.py to change OLLAMA_MODEL or OLLAMA_URL if your model or host/port differ (both constants are sketched below).
- The script writes to ~/bible_verse.txt by default. Update OUTPUT_FILE in app.py if you need a different path.
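For orientation, these constants sit at module level in app.py. A sketch of plausible defaults (the model name is an assumption; use whatever model your Ollama instance serves):

from pathlib import Path

OLLAMA_MODEL = "llama3"                             # assumption: any model pulled into Ollama
OLLAMA_URL = "http://localhost:11434/api/generate"  # default endpoint from this README
OUTPUT_FILE = Path.home() / "bible_verse.txt"       # i.e. ~/bible_verse.txt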
Running
Run the script directly (recommended inside the virtualenv):
source venv/bin/activate
python app.py
The script warms up the model, fetches a verse immediately, writes ~/bible_verse.txt, sends a desktop notification, and then continues to run the hourly task.
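In outline, that behavior reduces to a simple loop. A sketch reusing the constants and notify helper above, where fetch_verse and ask_ollama are hypothetical stand-ins for whatever app.py actually calls:

import time

def run_forever() -> None:
    while True:
        verse = fetch_verse()         # hypothetical: however app.py obtains the verse
        question = ask_ollama(verse)  # hypothetical: the LLM's reflection question
        OUTPUT_FILE.write_text(f"{verse}\n\n{question}\n")
        notify("Bible Verse", verse)
        time.sleep(3600)              # wait an hour before the next update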
Systemd user service
Place a unit file at ~/.config/systemd/user/bible.service with the following content (adjust paths as needed):
[Unit]
Description=Bible Verse hourly updater
After=network.target
[Service]
Type=simple
WorkingDirectory=%h/Desktop/Bible_Project
ExecStart=%h/Desktop/Bible_Project/venv/bin/python %h/Desktop/Bible_Project/app.py
Restart=on-failure
[Install]
WantedBy=default.target
Enable and start the user service:
systemctl --user daemon-reload
systemctl --user enable --now bible.service
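To confirm the service is healthy, systemctl --user status bible.service shows its state and journalctl --user -u bible.service shows its logs.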
Troubleshooting
- If Conky is reading ~/bible_verse.txt and you don't see the verse, prefer the full path in your Conky config: /home/<your-username>/bible_verse.txt.
- If long lines overflow in Conky, use fold to wrap lines in your Conky config, for example:
${execp fold -s -w 60 /home/stephen-awili/bible_verse.txt}
- Ollama may take time to load the model on first run. The script handles timeouts and returns a friendly message if the model is still warming up.
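For illustration, here is a hedged sketch of such an Ollama call with timeout handling, using the requests library (the prompt wording, the 120-second timeout, and the fallback message are assumptions, not necessarily what app.py uses):

import requests

def ask_ollama(verse: str) -> str:
    prompt = f"Write one short reflection question about this verse: {verse}"
    try:
        resp = requests.post(
            OLLAMA_URL,
            json={"model": OLLAMA_MODEL, "prompt": prompt, "stream": False},
            timeout=120,  # the first request can be slow while Ollama loads the model
        )
        resp.raise_for_status()
        return resp.json()["response"].strip()
    except requests.exceptions.RequestException:
        return "Reflection question unavailable (the model may still be warming up)."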
Skills Used:
- Python Programming
- API Integration
- AI Integration
- Automation
- System Integration
Subskills:
- Automation