# Performance Benchmark Metrics
## Overview
This document describes the performance benchmark suite for core system components. The benchmarks measure:
- Task dispatcher throughput (tasks/second)
- RBAC authorization latency (milliseconds)
- SQLite CRUD operation performance (operations/second)
## Running Benchmarks

```bash
pytest tests/performance/benchmarks.py -v --benchmark-enable --benchmark-json=benchmarks/results.json
```
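For orientation, a dispatcher throughput measurement can be sketched without the pytest-benchmark harness using plain `time.perf_counter`. The `TaskDispatcher` class below is a stand-in for the real component, not its actual implementation:

```python
import time

class TaskDispatcher:
    """Stand-in for the real dispatcher; the actual class lives elsewhere."""
    def dispatch(self, task):
        return task  # the real implementation would route the task

def measure_throughput(dispatcher, n_tasks=10_000):
    """Dispatch n_tasks sequentially and return tasks/second."""
    start = time.perf_counter()
    for i in range(n_tasks):
        dispatcher.dispatch(i)
    elapsed = time.perf_counter() - start
    return n_tasks / elapsed

throughput = measure_throughput(TaskDispatcher())
print(f"TaskDispatcher throughput: {throughput:.0f} tasks/sec")
```

The real suite uses pytest-benchmark's `benchmark` fixture instead, which handles warmup and statistical aggregation automatically.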
## Interpreting Results

Performance metrics are logged to `metrics/api_performance.log` with timestamps.

### Key Metrics
| Component | Metric | Target | Unit |
|---|---|---|---|
| TaskDispatcher | Throughput | ≥1000 | tasks/sec |
| RBACEngine | Auth Latency | ≤5 | ms |
| SQLiteAdapter | INSERT | ≤2 | ms/op |
| SQLiteAdapter | SELECT | ≤1 | ms/op |
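The targets in the table can be checked programmatically after a run. The key names and the "measured" figures below are illustrative, not the actual schema of `benchmarks/results.json`:

```python
# Targets from the table above: (limit, direction)
TARGETS = {
    ("TaskDispatcher", "throughput_tasks_per_sec"): (1000, "min"),
    ("RBACEngine", "auth_latency_ms"): (5, "max"),
    ("SQLiteAdapter", "insert_ms_per_op"): (2, "max"),
    ("SQLiteAdapter", "select_ms_per_op"): (1, "max"),
}

def meets_target(component, metric, value):
    """True if the measured value satisfies the target for this metric."""
    limit, direction = TARGETS[(component, metric)]
    return value >= limit if direction == "min" else value <= limit

# Illustrative measured values, not real benchmark output
print(meets_target("TaskDispatcher", "throughput_tasks_per_sec", 1250))  # True
print(meets_target("RBACEngine", "auth_latency_ms", 7.2))                # False
```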
## Baseline Targets

These targets are based on system requirements:

- Task Dispatcher
  - Must handle ≥1000 tasks/second under load
  - 95th percentile latency ≤10ms
- RBAC Authorization
  - Average check time ≤5ms
  - 99th percentile ≤10ms
- SQLite Operations
  - INSERT: ≤2ms average
  - SELECT: ≤1ms average for simple queries
  - Complex queries (joins): ≤10ms average
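As a rough sanity check of the SQLite targets, the stdlib `sqlite3` module can be timed directly. This sketch uses an in-memory database and a throwaway table, so absolute numbers will differ from the real `SQLiteAdapter`:

```python
import sqlite3
import time

def avg_ms(fn, n=1000):
    """Average wall-clock milliseconds per call over n calls."""
    start = time.perf_counter()
    for i in range(n):
        fn(i)
    return (time.perf_counter() - start) * 1000 / n

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")

# Time 1000 INSERTs, then 1000 point SELECTs against the inserted rows
insert_ms = avg_ms(lambda i: conn.execute(
    "INSERT INTO items (name) VALUES (?)", (f"item-{i}",)))
select_ms = avg_ms(lambda i: conn.execute(
    "SELECT name FROM items WHERE id = ?", (i + 1,)).fetchone())

print(f"INSERT: {insert_ms:.3f} ms/op, SELECT: {select_ms:.3f} ms/op")
```

An in-memory database avoids disk I/O, so real on-disk numbers will be higher; the targets above apply to the production configuration.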
## Performance Trends

Performance metrics are tracked over time in `metrics/api_performance.log`. Use this command to analyze trends:

```bash
grep "TaskDispatcher throughput" metrics/api_performance.log
```
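The grep output can be post-processed in Python. The log line layout below is an assumption for illustration; check the actual format of `metrics/api_performance.log` before relying on the regex:

```python
import re

# Hypothetical log format; the real log layout may differ
SAMPLE_LINES = [
    "2025-01-10T12:00:00 TaskDispatcher throughput: 1180 tasks/sec",
    "2025-01-11T12:00:00 TaskDispatcher throughput: 1240 tasks/sec",
    "2025-01-12T12:00:00 TaskDispatcher throughput: 1050 tasks/sec",
]

PATTERN = re.compile(r"TaskDispatcher throughput: (\d+) tasks/sec")

def throughput_series(lines):
    """Extract throughput readings in log order (oldest first)."""
    return [int(m.group(1)) for line in lines if (m := PATTERN.search(line))]

series = throughput_series(SAMPLE_LINES)
print(f"min={min(series)} max={max(series)} mean={sum(series) / len(series):.1f}")
```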
## Troubleshooting
If benchmarks fail to meet targets:
- Check system resource usage during tests
- Review recent code changes affecting components
- Compare with historical data in performance logs
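Comparison with historical data can be automated with a simple statistical gate. This is a sketch, not part of the suite; the history values are illustrative:

```python
from statistics import mean, stdev

def is_regression(history, current, sigma=2.0):
    """Flag `current` as a regression if it falls more than `sigma`
    standard deviations below the historical mean throughput."""
    if len(history) < 2:
        return False  # not enough data to judge
    return current < mean(history) - sigma * stdev(history)

history = [1180, 1240, 1210, 1195, 1225]  # illustrative tasks/sec samples
print(is_regression(history, 1190))  # False: within normal variation
print(is_regression(history, 900))   # True: well below historical range
```

For latency metrics the comparison flips (flag values above the mean); percentile-based gates are more robust once enough history accumulates.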