Step 1 — Log JSON, not plain text
Before you pick an aggregator, make your logs structured. Every aggregator's value is indexing fields, not substring-matching free text.
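In practice that means passing context as an array instead of interpolating values into the message string. A minimal sketch (the event name and field values here are illustrative, not from any real app):

```php
use Illuminate\Support\Facades\Log;

// Unstructured: the order ID is trapped inside a string —
// an aggregator can only substring-match it
Log::info("Order 4821 shipped to warehouse eu-1");

// Structured: each field becomes a top-level JSON key
// that an aggregator can index, filter, and group on
Log::info('order.shipped', [
    'order_id'  => 4821,
    'warehouse' => 'eu-1',
]);
```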
`config/logging.php`

```php
'channels' => [
    'stack' => [
        'driver' => 'stack',
        'channels' => ['json'],
        'ignore_exceptions' => false,
    ],

    'json' => [
        'driver' => 'daily',
        'path' => storage_path('logs/laravel.log'),
        'formatter' => Monolog\Formatter\JsonFormatter::class,
        'level' => env('LOG_LEVEL', 'debug'),
        'days' => 14,
    ],
],
```

Attach per-request context globally so every log line has a correlation ID:
`app/Http/Middleware/LogContext.php`

```php
use Closure;
use Illuminate\Support\Facades\Log;
use Illuminate\Support\Str;

class LogContext
{
    public function handle($request, Closure $next)
    {
        // Reuse an upstream request ID if a proxy or gateway set one,
        // otherwise generate a fresh UUID
        $requestId = $request->header('X-Request-Id', Str::uuid()->toString());

        Log::withContext([
            'request_id' => $requestId,
            'user_id' => auth()->id(),
            'env' => config('app.env'),
        ]);

        return $next($request);
    }
}
```

Register it globally (in `app/Http/Kernel.php` on Laravel 10 and earlier, or via `bootstrap/app.php` on Laravel 11+) so every request carries the context.

Option A — Grafana Loki + Promtail
Loki is Grafana's log aggregation database. It indexes metadata (labels) rather than log content, which makes it cheap to run, fast at filter queries, and horizontally scalable. Promtail tails your log files and ships them to Loki.
`promtail-config.yaml`

```yaml
server:
  http_listen_port: 9080

positions:
  filename: /tmp/positions.yaml

clients:
  - url: http://loki:3100/loki/api/v1/push

scrape_configs:
  - job_name: laravel
    static_configs:
      - targets: [localhost]
        labels:
          job: laravel
          env: production
          __path__: /var/www/storage/logs/*.log
    pipeline_stages:
      - json:
          expressions:
            level: level
            message: message
            request_id: request_id
      - labels:
          level:
          request_id:
```

One caveat: promoting `request_id` to a label creates a new Loki stream per request, which the Loki docs warn against at high traffic. Extracting it with the `json` stage and filtering at query time is the safer default.

Run Loki + Promtail + Grafana via Docker Compose. Expect $10-40/month in VPS costs for small-to-moderate apps. Loki scales to large volume but needs object storage (S3) for the big tiers.
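The three services fit in a single Compose file. A minimal sketch, assuming the Promtail config above sits next to it; image tags and mount paths are assumptions to adjust for your setup:

```yaml
# docker-compose.yml — minimal Loki + Promtail + Grafana stack
services:
  loki:
    image: grafana/loki:3.0.0
    ports:
      - "3100:3100"

  promtail:
    image: grafana/promtail:3.0.0
    volumes:
      - ./promtail-config.yaml:/etc/promtail/config.yml
      # Mount the Laravel log directory read-only so Promtail can tail it
      - /var/www/storage/logs:/var/www/storage/logs:ro
    command: -config.file=/etc/promtail/config.yml

  grafana:
    image: grafana/grafana:11.0.0
    ports:
      - "3000:3000"
```

Add Loki as a Grafana data source (`http://loki:3100`) and query with LogQL, e.g. `{job="laravel"} | json | level="error"`.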
Option B — OpenSearch + Filebeat
OpenSearch is the Apache-licensed Elasticsearch fork, so it will feel familiar if you've used the ELK stack. It does full-text indexing of log content: expensive on storage, but powerful for arbitrary search. Filebeat is the log shipper.
`filebeat.yml`

```yaml
filebeat.inputs:
  - type: filestream
    id: laravel-logs   # filestream inputs require a unique id
    paths:
      - /var/www/storage/logs/*.log
    parsers:
      - ndjson:
          overwrite_keys: true
          add_error_key: true

output.elasticsearch:
  hosts: ["https://opensearch:9200"]
  username: "filebeat"
  password: "${FILEBEAT_PASSWORD}"
  index: "laravel-%{+yyyy.MM.dd}"
```

This is heavier than Loki: expect 8GB+ of RAM for a production OpenSearch node. Worth it if you need full-text search across every log byte, not just labels. Not worth it for most Laravel apps.
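That full-text power is queried through the same `_search` API as Elasticsearch. A sketch in OpenSearch Dashboards Dev Tools syntax, assuming the index pattern from the Filebeat config above and that `request_id` is mapped as a keyword; the ID value is a placeholder:

```json
GET laravel-*/_search
{
  "query": {
    "bool": {
      "filter": [
        { "term": { "request_id": "<request-id>" } },
        { "term": { "level": "error" } }
      ]
    }
  },
  "sort": [{ "@timestamp": "asc" }]
}
```

Because the `request_id` from the Laravel middleware is a top-level JSON field, one query reconstructs everything a single request logged.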
Option C — BYOD Postgres (NightOwl)
If you already run PostgreSQL, logs as another table is the lowest-ops option. NightOwl ships a Laravel logs watcher that writes every log record with context into your own PostgreSQL — alongside request, query, and job telemetry.
```bash
composer require nightowl/agent
php artisan nightowl:install
# logs start streaming to your Postgres immediately
```

You get structured logs in a database you already know how to query, back up, and scale. Works for teams that don't want to run a separate log stack. Not right for teams doing ingest volumes Postgres can't handle — that's typically 100+ GB/day territory.
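The payoff is plain SQL against your own database. The table and column names below are hypothetical (NightOwl's actual schema may differ); they only illustrate the query shape against a logs table with a JSONB context column:

```sql
-- Hypothetical schema: swap in the table/column names
-- NightOwl actually creates in your database
SELECT created_at, level, message,
       context->>'request_id' AS request_id
FROM nightowl_logs
WHERE level = 'error'
  AND context->>'request_id' = :request_id
ORDER BY created_at DESC
LIMIT 50;
```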
How to pick
| If you want | Pick |
|---|---|
| Fully open-source, log-only, scales horizontally | Grafana Loki + Promtail |
| Full-text search across every log byte | OpenSearch + Filebeat |
| Logs alongside request/query/job APM data, zero extra infrastructure | NightOwl (BYOD Postgres) |
| Cheapest SaaS (not self-hosted) | Papertrail from $5/mo |
THE EASY WAY
Skip the separate log stack
NightOwl ingests Laravel logs — structured, with full request context — into a PostgreSQL database you own. You get log aggregation, request traces, query groupings, and job monitoring in one place. No separate Loki or OpenSearch cluster to operate.
Data stays in your PostgreSQL. From $5/month flat.