Bible_Project

A small utility that fetches a Bible verse each hour, uses an Ollama LLM to generate a reflection question, saves the verse and question to ~/bible_verse.txt, and sends a desktop notification.

Contents

  • app.py — main Python script
  • requirements.txt — Python dependencies
  • bible_verse.txt — output file written to the home directory (created by the script)

Requirements

Python dependencies

  • Install Python packages from requirements.txt (recommended inside a virtualenv):
python3 -m venv venv
source venv/bin/activate
pip install --upgrade pip
pip install -r requirements.txt
 

System dependencies

  • notify-send (usually provided by libnotify-bin on Debian/Ubuntu) — used to send desktop notifications.
  • An Ollama server running and reachable at http://localhost:11434/api/generate (configurable in app.py); see the request sketch after the install commands below.

On Debian/Ubuntu you can install notify-send with:

sudo apt update
sudo apt install libnotify-bin
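
The endpoint above is Ollama's standard HTTP generate API. As a rough sketch of the kind of request app.py makes (the requests library and the model name here are assumptions; check app.py for the actual values):

import requests

OLLAMA_URL = "http://localhost:11434/api/generate"  # default Ollama endpoint

def generate_question(verse: str) -> str:
    # Non-streaming call: Ollama returns a single JSON object whose
    # "response" field holds the generated text.
    payload = {
        "model": "llama3",  # illustrative; use the OLLAMA_MODEL set in app.py
        "prompt": f"Write one short reflection question about this verse:\n{verse}",
        "stream": False,
    }
    resp = requests.post(OLLAMA_URL, json=payload, timeout=120)
    resp.raise_for_status()
    return resp.json()["response"].strip()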
 

Configuration

  • Edit app.py to change OLLAMA_MODEL or OLLAMA_URL if your model or host/port differ.
  • The script writes to ~/bible_verse.txt by default. Update OUTPUT_FILE in app.py if you need a different path; both settings are sketched below.
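
As a rough sketch, those settings could sit near the top of app.py like this (the values shown are placeholders; check app.py for the real ones):

from pathlib import Path

OLLAMA_MODEL = "llama3"  # placeholder; any locally pulled Ollama model
OLLAMA_URL = "http://localhost:11434/api/generate"
OUTPUT_FILE = Path.home() / "bible_verse.txt"  # default output location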

Running

Run the script directly (recommended inside the virtualenv):

source venv/bin/activate
python app.py
 

The script warms up the model, fetches a verse immediately, writes ~/bible_verse.txt, sends a desktop notification, and then repeats the task every hour.
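
As a rough sketch of that loop, reusing the names from the sketches above (fetch_verse and warm_up_model are hypothetical stand-ins for whatever app.py actually does):

import subprocess
import time

def run_once():
    verse = fetch_verse()                # hypothetical: retrieve the verse text
    question = generate_question(verse)  # ask Ollama for a reflection question
    with open(OUTPUT_FILE, "w") as f:
        f.write(f"{verse}\n\n{question}\n")
    # Desktop notification via notify-send (provided by libnotify-bin)
    subprocess.run(["notify-send", "Bible Verse", verse])

warm_up_model()  # hypothetical: the first request loads the model into memory
while True:
    run_once()
    time.sleep(3600)  # wait one hour between updates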

Optional: create a systemd user service

Place a unit file at ~/.config/systemd/user/bible.service with the following content (adjust paths as needed):

[Unit]
Description=Bible Verse hourly updater
After=network.target

[Service]
Type=simple
WorkingDirectory=%h/Desktop/Bible_Project
ExecStart=%h/Desktop/Bible_Project/venv/bin/python %h/Desktop/Bible_Project/app.py
Restart=on-failure

[Install]
WantedBy=default.target
 

Enable and start the user service:

systemctl --user daemon-reload
systemctl --user enable --now bible.service
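
To follow the service's output while troubleshooting:

journalctl --user -u bible.service -f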
 

Notes & Troubleshooting

  • If Conky is reading ~/bible_verse.txt and the verse does not appear, use the full path in your Conky config: /home/<your-username>/bible_verse.txt.
  • If long lines overflow in Conky, use fold to wrap lines in your Conky config, for example:
${execp fold -s -w 60 /home/<your-username>/bible_verse.txt}
 
  • Ollama may take time to load the model on first run. The script handles timeouts and returns a friendly message if the model is still warming up.
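
The pattern is roughly the following (illustrative; OLLAMA_URL and payload are as in the request sketch earlier):

import requests

try:
    resp = requests.post(OLLAMA_URL, json=payload, timeout=120)
    question = resp.json()["response"]
except requests.exceptions.Timeout:
    # The model is still loading on a cold start; degrade gracefully
    question = "The model is still warming up. Please try again shortly."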
 

 
