teleo-codex/ops/pipeline-v2/lib/log.py
m3taversal 05d74d5e32 sync: import all VPS pipeline + diagnostics code as baseline
Imports 67 files from VPS (/opt/teleo-eval/) into repo as the single source
of truth. Previously only 8 of 67 files existed in repo — the rest were
deployed directly to VPS via SCP, causing massive drift.

Includes:
- pipeline/lib/: 33 Python modules (daemon core, extraction, evaluation, merge, cascade, cross-domain, costs, attribution, etc.)
- pipeline/: main daemon (teleo-pipeline.py), reweave.py, batch-extract-50.sh
- diagnostics/: 19 files (4-page dashboard, alerting, daily digest, review queue, tier1 metrics)
- agent-state/: bootstrap, lib-state, cascade inbox processor, schema
- systemd/: service unit files for reference
- deploy.sh: rsync-based deploy with --dry-run, syntax checks, dirty-tree gate
- research-session.sh: updated with Step 8.5 digest + cascade inbox processing

No new code written — all files are exact copies from VPS as of 2026-04-06.
From this point forward: edit in repo, commit, then deploy.sh.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-07 00:00:00 +01:00


"""Structured JSON logging with rotation."""
import json
import logging
import logging.handlers
from datetime import datetime, timezone
from . import config
class JSONFormatter(logging.Formatter):
"""Format log records as JSON lines."""
def format(self, record):
entry = {
"ts": datetime.now(timezone.utc).isoformat(),
"level": record.levelname,
"logger": record.name,
"msg": record.getMessage(),
}
if record.exc_info and record.exc_info[0]:
entry["exception"] = self.formatException(record.exc_info)
# Include extra fields if present
for key in ("stage", "source", "pr", "model", "cost", "event"):
if hasattr(record, key):
entry[key] = getattr(record, key)
return json.dumps(entry)
def setup_logging():
"""Configure structured JSON logging with rotation."""
config.LOG_DIR.mkdir(parents=True, exist_ok=True)
handler = logging.handlers.RotatingFileHandler(
str(config.LOG_FILE),
maxBytes=config.LOG_ROTATION_MAX_BYTES,
backupCount=config.LOG_ROTATION_BACKUP_COUNT,
)
handler.setFormatter(JSONFormatter())
# Also log to stderr for systemd journal
console = logging.StreamHandler()
console.setFormatter(logging.Formatter("%(name)s [%(levelname)s] %(message)s"))
root = logging.getLogger()
root.setLevel(logging.INFO)
root.addHandler(handler)
root.addHandler(console)
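
The formatter picks up structured fields that callers pass via the standard `extra` mapping: anything under one of the whitelisted keys (`stage`, `source`, `pr`, `model`, `cost`, `event`) becomes an attribute on the `LogRecord` and lands in the JSON line. A minimal self-contained sketch of a caller (the logger name and field values are illustrative, and a `StringIO` handler stands in for the rotating file handler so the example runs anywhere):

```python
import io
import json
import logging
from datetime import datetime, timezone


class JSONFormatter(logging.Formatter):
    """Same shape as the module's formatter: one JSON object per line."""

    def format(self, record):
        entry = {
            "ts": datetime.now(timezone.utc).isoformat(),
            "level": record.levelname,
            "logger": record.name,
            "msg": record.getMessage(),
        }
        for key in ("stage", "source", "pr", "model", "cost", "event"):
            if hasattr(record, key):
                entry[key] = getattr(record, key)
        return json.dumps(entry)


# In-memory stream instead of RotatingFileHandler, purely for demonstration.
buf = io.StringIO()
handler = logging.StreamHandler(buf)
handler.setFormatter(JSONFormatter())

log = logging.getLogger("pipeline.extract")  # hypothetical logger name
log.addHandler(handler)
log.setLevel(logging.INFO)
log.propagate = False

# `extra` keys ride along on the LogRecord and appear in the JSON line.
log.info("extraction complete", extra={"stage": "extract", "cost": 0.042})

line = json.loads(buf.getvalue())
print(line["msg"], line["stage"], line["cost"])
```

Because unknown keys in `extra` are simply attached to the record, the whitelist in `format()` is what keeps the JSON schema stable: a caller can pass anything, but only the six known fields are serialized.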