2. Configuration
After installation, point LevelApp at your LLM endpoints and set up credentials.
1. Create a .env file in your repo root:
# .env
OPENAI_API_KEY=sk-...
IONOS_API_KEY=eyJ...
IONOS_ENDPOINT=https://inference.de-txl.ionos.com/models
GOOGLE_APPLICATION_CREDENTIALS=./googlecreds.json
2. Verify env vars are loaded
# Python check (requires the python-dotenv package)
python - << 'EOF'
import os, dotenv
dotenv.load_dotenv()
print("OpenAI:", bool(os.getenv("OPENAI_API_KEY")))
print("Ionos:", bool(os.getenv("IONOS_API_KEY")))
EOF
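Optionally, you can run a similar check to confirm that the Google credentials path from step 1 resolves to a real file. This is a minimal sketch using the same python-dotenv approach; it only verifies the path, not the credentials themselves:
python - << 'EOF'
import os, dotenv
dotenv.load_dotenv()
creds = os.getenv("GOOGLE_APPLICATION_CREDENTIALS")
print("Google creds path set:", bool(creds))
print("Credentials file exists:", bool(creds) and os.path.isfile(creds))
EOF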
3. Model-specific config
By default, the FastAPI app registers both the openai and ionos providers if their keys are present.
You can override or add new evaluators in app.py:
import os
from level_core.evaluators.service import EvaluationService
from level_core.evaluators.schemas import EvaluationConfig

eval_svc = EvaluationService()  # constructor arguments may vary with your setup

eval_svc.set_config("openai", EvaluationConfig(
    api_key=os.getenv("OPENAI_API_KEY"),
    model_id="gpt-4o-mini"
))
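If you want to register the Ionos provider explicitly (for example, to pin a specific model), a sketch following the same pattern might look like the one below. The model ID is a placeholder, and how IONOS_ENDPOINT is passed in is an assumption; check EvaluationConfig's schema for the exact field names.
# Hypothetical sketch: registering the Ionos provider with the same API.
# "your-ionos-model-id" is a placeholder; the IONOS_ENDPOINT value may also need
# to be supplied, but the exact field name depends on EvaluationConfig's schema.
eval_svc.set_config("ionos", EvaluationConfig(
    api_key=os.getenv("IONOS_API_KEY"),
    model_id="your-ionos-model-id"
))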
4. Firestore Connection Methods
LevelApp supports two ways to connect to a Firestore database. Choose one of the following methods:
Option 1: Service Account JSON Key
Add the path to your service account JSON key in the .env file:
GOOGLE_APPLICATION_CREDENTIALS=./googlecreds.json
Option 2: Google Cloud SDK Default Credentials
Authenticate locally using the Google Cloud SDK and specify your project ID in the YAML config:
gcloud auth application-default login
database:
  type: firestore
  project_id: your-project-id
Important: You must configure either the Service Account JSON key or the Google Cloud SDK method. If both or neither are provided, LevelApp will raise an error.
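Whichever method you choose, you can confirm that the credentials actually reach Firestore with a quick standalone check. This sketch uses the google-cloud-firestore client directly (it is not part of LevelApp), and "your-project-id" is the same placeholder as in the YAML above; the client picks up either GOOGLE_APPLICATION_CREDENTIALS or the gcloud application-default credentials automatically.
python - << 'EOF'
from google.cloud import firestore

# The project argument can be omitted if it is already set in your credentials.
db = firestore.Client(project="your-project-id")
print("Connected to project:", db.project)
print("Collections visible:", [c.id for c in db.collections()])
EOF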