Troubleshooting

This guide helps you resolve common issues when using BMLibrarian.

Database Connection Issues

"Connection refused" or "Could not connect to server"

Symptoms:

psycopg.OperationalError: connection to server at "localhost" (127.0.0.1),
port 5432 failed: Connection refused

Solutions:

  1. Check if PostgreSQL is running:

    # macOS (Homebrew)
    brew services list | grep postgresql
    brew services start postgresql
    
    # Linux (systemd)
    sudo systemctl status postgresql
    sudo systemctl start postgresql
    
    # Windows (service name matches your installed major version)
    net start postgresql-x64-14
    
  2. Verify connection parameters:

    psql -h localhost -p 5432 -U your_username -d postgres
    

  3. Check PostgreSQL configuration:

    • Ensure listen_addresses in postgresql.conf includes localhost (the default) or *
    • Check pg_hba.conf for the authentication method applied to your user and host
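
A quick way to tell a stopped server apart from a credentials problem is to probe the port directly before involving psycopg at all. This is a minimal standard-library sketch; the port_open helper is illustrative, not part of BMLibrarian:

```python
import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds.

    Succeeding here but failing in psycopg points at authentication
    or database settings; failing here means the server is not
    listening at that address at all.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Is anything listening on the default PostgreSQL port?
print(port_open("localhost", 5432))
```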

"Password authentication failed"

Solutions:

  1. Verify credentials:

    psql -h localhost -U your_username -d postgres
    

  2. Check environment variables:

    echo $POSTGRES_USER
    echo $POSTGRES_PASSWORD
    

  3. Reset the password if needed (run in psql as a superuser):

    ALTER USER your_username PASSWORD 'new_password';
    

"Database does not exist"

Solutions:

  1. For first-time setup, run bmlibrarian migrate init instead of bmlibrarian migrate apply
  2. Create the database manually (in psql, as a user with the CREATEDB privilege):
    CREATE DATABASE bmlibrarian_dev;
    

Permission Issues

"Permission denied to create database"

Solutions:

  1. Grant CREATEDB privilege:

    ALTER USER your_username CREATEDB;
    

  2. Use superuser for initial setup:

    bmlibrarian migrate init --user postgres --password admin_password
    

"Permission denied for table"

Solutions:

Run the following in psql as a superuser or the database owner:

GRANT ALL PRIVILEGES ON DATABASE bmlibrarian_dev TO your_username;
GRANT ALL PRIVILEGES ON ALL TABLES IN SCHEMA public TO your_username;
GRANT ALL PRIVILEGES ON ALL SEQUENCES IN SCHEMA public TO your_username;

Ollama Issues

"Failed to connect to Ollama"

Symptoms:

ConnectionError: Failed to connect to Ollama server

Solutions:

  1. Start Ollama:

    ollama serve
    

  2. Verify Ollama is running:

    curl http://localhost:11434/api/tags
    

  3. Check Ollama URL in configuration:

    {
      "general": {
        "ollama_base_url": "http://localhost:11434"
      }
    }
    

"Model not found"

Solutions:

  1. List available models:

    ollama list
    

  2. Pull required model:

    ollama pull gpt-oss:20b
    ollama pull medgemma4B_it_q8:latest
    

  3. Check model name in configuration matches exactly
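
Step 3 matters because model names must match character for character, including the tag after the colon. A hypothetical helper to diff your configuration against the output of ollama list:

```python
def missing_models(configured: list[str], installed: list[str]) -> list[str]:
    """Names in the configuration that `ollama list` does not report.

    Comparison is exact: a configured "gpt-oss:20b" does NOT match an
    installed "gpt-oss:latest".
    """
    installed_set = set(installed)
    return sorted(m for m in configured if m not in installed_set)

# Example with the models mentioned above:
print(missing_models(
    configured=["gpt-oss:20b", "medgemma4B_it_q8:latest"],
    installed=["gpt-oss:20b"],
))  # → ['medgemma4B_it_q8:latest']
```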

Extension Issues

"Extension does not exist" (pgvector)

Solutions:

  1. Install pgvector:

    # Debian/Ubuntu (match the suffix to your PostgreSQL major version)
    sudo apt-get install postgresql-14-pgvector
    
    # macOS (Homebrew)
    brew install pgvector
    
  2. Enable extensions:

    CREATE EXTENSION IF NOT EXISTS vector;  -- the pgvector extension is named "vector"
    CREATE EXTENSION IF NOT EXISTS pg_trgm;
    CREATE EXTENSION IF NOT EXISTS "uuid-ossp";
    

  3. Check available extensions:

    SELECT * FROM pg_available_extensions WHERE name LIKE '%vector%';
    

GUI Issues

Application Won't Start

Solutions:

  1. Check Python version:

    python --version  # Must be 3.12+
    

  2. Verify dependencies:

    uv sync
    

  3. Check logs:

    cat ~/.bmlibrarian/gui_qt.log
    

  4. Reset configuration:

    rm ~/.bmlibrarian/gui_config.json
    

No Tabs Appear

Solutions:

  1. Check that the configuration file (~/.bmlibrarian/gui_config.json) exists
  2. Verify that the enabled_plugins list in it is not empty
  3. Check logs for plugin loading errors
  4. Reset configuration to defaults
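
Steps 1 and 2 above can be automated with a short script. The config path and the enabled_plugins key are taken from this guide; the exact schema may differ in your version, and check_plugins is an illustrative helper, not part of BMLibrarian:

```python
import json
from pathlib import Path

def check_plugins(config_path: str = "~/.bmlibrarian/gui_config.json") -> str:
    """Walk the checklist above against a GUI config file."""
    path = Path(config_path).expanduser()
    if not path.exists():
        return f"missing config file: {path}"
    try:
        config = json.loads(path.read_text())
    except json.JSONDecodeError as exc:
        return f"invalid JSON: {exc}"
    plugins = config.get("enabled_plugins", [])
    if not plugins:
        return "enabled_plugins is empty — no tabs will load"
    return f"ok: {len(plugins)} plugin(s) enabled"

print(check_plugins())
```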

Theme Not Changing

Solutions:

  1. Restart the application (required for theme changes)
  2. Verify theme setting in configuration
  3. Check stylesheet files exist

UI Freezing

Solutions:

  1. Wait for the current operation to complete (long tasks run in the background and can make the UI lag)
  2. Check that Ollama is responding (see "Ollama Issues" above)
  3. Check database query performance (see "Slow Searches" below)
  4. Restart the application if it remains frozen

Performance Issues

Slow Searches

Solutions:

  1. Add database indexes:

    -- requires the pg_trgm extension (see "Extension Issues" above)
    CREATE INDEX IF NOT EXISTS idx_pubmed_title_trgm
    ON pubmed_articles USING gin (title gin_trgm_ops);
    

  2. Use faster models for initial testing
  3. Reduce search result limits
  4. Refresh the query planner's statistics:

    ANALYZE pubmed_articles;
    

Out of Memory

Solutions:

  1. Use smaller models (7B-8B parameters)
  2. Reduce max_tokens settings
  3. Lower batch_size for scoring
  4. Process fewer documents at once
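
Items 3 and 4 amount to the same idea: bound how much work is in flight at once. A sketch of batching a document list (Python 3.12, which BMLibrarian requires, also ships itertools.batched for this in the standard library):

```python
from typing import Iterator

def batched(items: list, size: int) -> Iterator[list]:
    """Yield items in chunks of at most `size`.

    Scoring 500 abstracts in batches of 8 keeps only 8 prompts' worth
    of context in memory at a time instead of all 500.
    """
    if size < 1:
        raise ValueError("size must be >= 1")
    for start in range(0, len(items), size):
        yield items[start:start + size]

docs = [f"doc-{i}" for i in range(10)]
print([len(b) for b in batched(docs, 4)])  # → [4, 4, 2]
```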

Environment Issues

"Environment variable not set"

Solutions:

  1. Set required variables:

    export POSTGRES_USER=your_username
    export POSTGRES_PASSWORD=your_password
    export POSTGRES_HOST=localhost
    export POSTGRES_PORT=5432
    export POSTGRES_DB=knowledgebase
    

  2. Use .env file:

    cat > .env << EOF
    POSTGRES_USER=your_username
    POSTGRES_PASSWORD=your_password
    POSTGRES_HOST=localhost
    POSTGRES_PORT=5432
    POSTGRES_DB=knowledgebase
    EOF
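
If you use a .env file, something has to load it into the process environment; whether that happens automatically depends on your setup. A minimal loader sketch (real projects typically use the python-dotenv package, which also handles quoting and export prefixes):

```python
import os

def load_dotenv(path: str = ".env", override: bool = False) -> dict[str, str]:
    """Minimal .env loader: KEY=VALUE lines; blank lines and '#' comments skipped."""
    loaded: dict[str, str] = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            key, value = key.strip(), value.strip()
            loaded[key] = value
            # By default, values already set in the shell win.
            if override or key not in os.environ:
                os.environ[key] = value
    return loaded
```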
    

"Module not found"

Solutions:

  1. Install missing dependencies:

    uv sync
    

  2. Activate virtual environment:

    source .venv/bin/activate
    

  3. Reinstall package:

    uv pip install -e .
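
A quick pre-flight check for missing modules can be done with importlib from the standard library. The missing_modules helper is illustrative, and the module names in the example come from this guide:

```python
import importlib.util

def missing_modules(names: list[str]) -> list[str]:
    """Return the names that cannot be imported in this environment.

    find_spec locates a module without executing it, so this is a
    safe check to run before launching the CLI or GUI.
    """
    return [n for n in names if importlib.util.find_spec(n) is None]

# Example with dependencies this guide mentions:
print(missing_modules(["psycopg", "bmlibrarian"]))
```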
    

Debugging Tools

Useful Commands

# Check PostgreSQL version
psql --version

# Test database connection
psql -h localhost -U username -d database_name -c "SELECT version();"

# Check Python environment
pip list | grep -E "(bmlibrarian|psycopg)"

# List installed Ollama models
ollama list

# Check BMLibrarian logs
tail -f ~/.bmlibrarian/gui_qt.log

Enable Debug Logging

# CLI with debug output
uv run python bmlibrarian_cli.py --debug

# GUI with debug output
uv run python bmlibrarian_qt.py --debug

Getting Help

If these solutions don't resolve your issue:

  1. Check logs for detailed error messages
  2. Verify prerequisites are installed correctly
  3. Test with minimal configuration
  4. Create a test case to reproduce the issue
  5. Report the issue on the project's GitHub issue tracker