Setting up the Vercel Integration
Monitoring your Vercel Deployment with Consolidated Structured Logging

Structured Logging with Vercel
Vercel's log drains provide a robust and scalable way to send application logs from your Vercel app to any consolidated logging system. We've developed the Logflare Vercel integration on top of log drains to make it easy for you to get logs from Vercel into your Logflare account. Once set up, it's easy to stream, search, alert on, and build dashboards from your structured Vercel logs.
Not only that, but you can log a structured event from your Vercel app as a stringified JSON object and we'll automatically parse it into a JSON object on our end, which enables all the powerful streaming, searching, alerting and dashboarding features for each individual field of that object.
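As a minimal sketch, logging a stringified object from a Vercel serverless function looks like this (the event shape here is purely illustrative):

```javascript
// An illustrative structured event — the fields are assumptions,
// not a required schema.
const event = {
  type: "request",
  path: "/api/checkout",
  durationMs: 42,
};

// Vercel captures stdout, so a stringified object on a single line
// is what the log drain forwards to Logflare for parsing.
console.log(JSON.stringify(event));
```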
You can also set up pino-logflare to automatically log to Logflare from the client or server using the same logger interface. See the Isomorphic Logging section.
Install the Integration
Visit the Logflare integration page and click `add`. This will walk you through the Logflare sign up process. You will create a Logflare source during that process. A `source` is where your Vercel logs will go.
Install the Logflare Vercel integration
Set up a Log Drain
After installing the integration you'll navigate to the installed integration configuration page in your Vercel account. Now you need to add a log drain. This is what tells Vercel to send all your logs to Logflare.

Selecting a project: you can choose to filter the logs sent to Logflare by project. You will eventually want each project to be a `source` in your Logflare account, and this lets you do that. You can, however, choose to send Logflare all logs for all projects and use Logflare's routing to send logs to different sources. This is useful if you don't want to set up a new log drain for each new project.
Once you've successfully added your log drain you should be able to immediately see logs start streaming into your Logflare account. Try visiting your Vercel deployment URL, then check your Logflare dashboard!
Visit your Logflare dashboard
Build vs Static vs Lambda Logs
Vercel gives us three main kinds of logs: build, static and lambda.
You'll likely want to route all your build logs to a different source. This doesn't exclude them from the original source; it effectively copies them to another source.
To set this up, add a build source and create a rule on your main Vercel source to send logs to this new build source.

So now when you deploy you should see something like this:

Vercel also gives us some static and lambda logs depending on whether the event came from their CDN or their serverless infrastructure. You can set up different rules for these as well, but you'll probably want them in the same source so you can have them on the same dashboard later.
Example Searches
Here are some searches you might find useful with your Vercel deployment. It doesn't take long for your events to be searchable by Logflare.
This is a search for `m.source:"static"`, which lets you see all the logs generated from the Vercel CDN.

All 5xx status codes: `m.proxy.statusCode:>499`
All user agents with `bot` in the user agent string: `m.proxy.userAgent:~"bot"`
When Vercel sends lambda logs over, we automatically parse data out of that message for you into the `m.parsedLambdaMessage` object.
To see all lambda requests with a response time greater than 1,000 ms: `m.parsedLambdaMessage.report.duration_ms:>1000`
Example Dashboard
Logging a Custom JSON Object
To enable even more insights from your Vercel deployment you can log any JSON object and we'll parse that accordingly.
If you do plan on logging a lot of structured data, we suggest using the Logflare Pino transport, as Vercel's logging infrastructure will truncate long log lines.
When logging an object, your object keys should comply with BigQuery column requirements as we turn all keys into columns for you automatically. Specifically, a column name must contain only letters (a-z, A-Z), numbers (0-9), or underscores (_), and it must start with a letter or underscore.
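To illustrate, a hypothetical helper (not part of Logflare) that checks whether an object's top-level keys satisfy those BigQuery column rules might look like:

```javascript
// Hypothetical helper: flags keys that violate BigQuery's column-name
// rules — only letters, numbers, or underscores, starting with a
// letter or underscore.
const BQ_COLUMN = /^[A-Za-z_][A-Za-z0-9_]*$/;

function invalidKeys(obj) {
  return Object.keys(obj).filter((key) => !BQ_COLUMN.test(key));
}

invalidKeys({ user_id: 1, "user-name": "Joe" }); // → ["user-name"]
```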
When logging numbers from JavaScript, it's a good idea to force all numbers you plan on logging to integers. Logflare automatically manages your table schema for you, and it detects the schema based on the first time it sees a field-value pair. If you log a number and it's an integer, and you later log the same field as a float, that event will be rejected because it doesn't match the type.
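As a quick sketch, coercing a possibly fractional duration before logging it (the field names are illustrative):

```javascript
// Durations from timers are often floats; round them so the field
// is consistently an integer in the managed schema.
const rawDurationMs = 12.7;
const event = { type: "request", duration_ms: Math.round(rawDurationMs) };

console.log(JSON.stringify(event)); // logs {"type":"request","duration_ms":13}
```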
For example, in a Next.js project you can use Pino to log an object:

```javascript
const logger = require('pino')()

logger.info({
  user: { name: "Joe Schmo", email: "[email protected]", company: "Dunder Dev", id: 38 },
  event: { type: "request", tag: "api" },
})
```
Which then would give you an object in Logflare like this:

So you can do a search for `m.parsedLambdaMessage.lines.data.event.tag:"api"` like this:

Isomorphic Logging with Vercel, Pino and Logflare
Using pino-logflare you can automatically log from the browser and the server using the same logger interface.
When using pino-logflare with Vercel, be sure to instantiate your logger with the config described in the pino-logflare repo.
This sets up your server side logs to be logged to stdout which Vercel picks up and forwards to Logflare via the Vercel Logflare integration. It also sets up the browser client to send logs to Logflare via an HTTP request. When configured as such, your logs will automatically be handled appropriately, depending on where the log statement appears in your code.
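A sketch of that setup, based on the pino-logflare README (the `logflarePinoVercel` helper and the placeholder key/token values are assumptions — check the repo for the current API):

```javascript
const pino = require('pino')
const { logflarePinoVercel } = require('pino-logflare')

// stream: writes server-side logs to stdout, which Vercel forwards
// to Logflare via the log drain.
// send: posts browser-side logs to Logflare over HTTP.
const { stream, send } = logflarePinoVercel({
  apiKey: "YOUR_LOGFLARE_API_KEY",  // placeholder
  sourceToken: "YOUR_SOURCE_TOKEN", // placeholder
})

const logger = pino(
  {
    browser: {
      transmit: {
        level: "info",
        send: send,
      },
    },
  },
  stream
)
```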
If you have browser logs going to the same source as server logs you can easily filter out your browser logs with the LQL query `m.browser:true`, because all browser logs from pino-logflare have the key-value pair `browser: true` in the metadata.

Production vs Preview vs Development
When logging a custom JSON object with Pino you should separate your production logs from everything else. Environment variables make this easy.
Use Vercel's environment variables and set up an `ENV` variable for `production`, `preview` and `development`.
Then set up Pino to always log the environment. When instantiating, the fields in the `base` object are always included with your log events:
```javascript
const logger = require('pino')({
  base: {
    env: process.env.ENV || "ENV not set"
  }
})
```