Setting up the Vercel Integration

Monitoring your Vercel Deployment with Consolidated Structured Logging

Structured Logging with Vercel

Vercel's log drains provide a robust and scalable way to send application logs from your Vercel app to any consolidated logging system. We've built the Logflare Vercel integration on top of log drains to make it easy to get logs from Vercel into your Logflare account. Once set up, it's easy to stream, search, alert on, and dashboard your structured logs from Vercel.

Not only that, but you can log a structured event from your Vercel app as a stringified JSON object and we'll automatically parse it into a JSON object on our end, which enables all the powerful streaming, searching, alerting, and dashboarding features for each individual field of that object.
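As a minimal sketch of what that looks like in practice (field names here are illustrative, not a required schema), logging a stringified object with Node's built-in `console.log` is enough for Logflare to pick it up as a structured event:

```javascript
// Log a structured event as a single stringified JSON line;
// Logflare parses it back into an object on ingest.
// The field names below are illustrative only.
const event = {
  event: { type: "request", path: "/api/hello" },
  user: { id: 38 }
};

const line = JSON.stringify(event);
console.log(line);
```

Each log line must be one self-contained JSON object; multi-line output would be treated as separate log entries.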

Install the Integration

Visit the Logflare integration page and click `add`. This will walk you through the Logflare sign-up process, during which you will create a Logflare source. A `source` is where your Vercel logs will go.

Install the Logflare Vercel integration

Setup a Log Drain

After installing the integration, you'll be taken to the installed integration's configuration page in your Vercel account. Now you need to add a log drain. This is what tells Vercel to send all your logs to Logflare.

Selecting a project: you can choose to filter the logs sent to Logflare by project. You will eventually want each project to be its own `source` in your Logflare account, and this lets you do that. You can, however, choose to send Logflare all logs for all projects and use Logflare's routing to send logs to different sources. This is useful if you don't want to set up a new log drain for each new project.

Once you've successfully added your log drain, you should immediately see logs start streaming into your Logflare account. Try visiting your Vercel deployment URL, then check your Logflare dashboard!

Visit your Logflare dashboard

Build vs Static vs Lambda Logs

Vercel gives us three main kinds of logs: build, static, and lambda.

You'll likely want to route all your build logs to a different source. This doesn't exclude them from the original source; it effectively copies them to another source.

To set this up, add a build source and create a rule on your main Vercel source to send logs to this new build source.

So now when you deploy you should see something like this:

Vercel also gives us static and lambda logs, depending on whether the event came from their CDN or their serverless infrastructure. You can set up different rules for these as well, but you'll probably want them in the same source so you can put them on the same dashboard later.

Example Searches

Here are some searches you might find useful with your Vercel deployment. It doesn't take long for your events to become searchable in Logflare.

This is a search for `m.source:"static"`, which lets you see all the logs generated by the Vercel CDN.

All 5xx status codes: `m.proxy.statusCode:>499`

All requests with `bot` in the user agent string: `m.proxy.userAgent:~"bot"`

When Vercel sends lambda logs over, we automatically parse data out of that message for you into the `m.parsedLambdaMessage` object.

To see all lambda requests with a response time greater than 1,000 ms: `m.parsedLambdaMessage.report.duration_ms:>1000`

Example Dashboard

Logging a Custom JSON Object

To get even more insight from your Vercel deployment, you can log any JSON object and we'll parse it accordingly.

If you plan on logging a lot of structured data, we suggest using the Logflare Pino transport, as Vercel's logging infrastructure will truncate long log lines.

When logging an object, your object keys should comply with BigQuery column requirements, as we automatically turn all keys into columns for you. Specifically, a column name must contain only letters (a-z, A-Z), numbers (0-9), or underscores (_), and it must start with a letter or underscore.
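If some of your keys come from user input or external data, you may want to normalize them before logging. Here's a hypothetical helper (not part of Logflare or Pino) that rewrites keys to satisfy those rules:

```javascript
// Hypothetical helper: normalize object keys into BigQuery-safe
// column names (letters, digits, underscores only; must not start
// with a digit), recursing into nested objects.
function sanitizeKeys(obj) {
  const out = {};
  for (const [key, value] of Object.entries(obj)) {
    let safe = key.replace(/[^a-zA-Z0-9_]/g, "_"); // replace invalid chars
    if (/^[0-9]/.test(safe)) safe = "_" + safe;    // can't start with a digit
    out[safe] =
      value && typeof value === "object" && !Array.isArray(value)
        ? sanitizeKeys(value)
        : value;
  }
  return out;
}

const cleaned = sanitizeKeys({ "user-name": "Joe", "2fa": { "is enabled": true } });
// cleaned: { user_name: "Joe", _2fa: { is_enabled: true } }
```

Keys you control and write by hand don't need this; just name them within the rules from the start.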

For example, in a Next.js project you can use Pino to log an object:

```javascript
const logger = require('pino')()
logger.info({ user: { name: "Joe Schmo", email: "[email protected]", company: "Dunder Dev", id: 38 }, event: { type: "request", tag: "api" } })
```

Which then would give you an object in Logflare like this:

So you can do a search for `m.parsedLambdaMessage.lines.data.event.tag:"api"` like this:

Production vs Preview vs Development

When logging a custom JSON object with Pino, you should separate your production logs from everything else. Environment variables make this easy.

Use Vercel's environment variables and set up an `ENV` variable for production, preview, and development.

Then set up Pino to always log the environment. When instantiating Pino, the fields in the `base` object are always included with your log events.

```javascript
const logger = require('pino')({
  base: {
    env: process.env.ENV || "ENV not set"
  }
})
```
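The `base` fields are merged into every log line Pino emits, so each event records which environment it came from. A rough sketch of that merge behavior (plain JSON here, standing in for Pino's serialization):

```javascript
// Sketch of Pino's `base` option: base fields are merged into every
// emitted log line, so each event carries its environment.
const base = { env: process.env.ENV || "ENV not set" };

function logLine(fields) {
  // Pino writes one JSON object per line; base fields are included first.
  return JSON.stringify({ ...base, ...fields });
}

const line = logLine({ msg: "deployment handled", event: { tag: "api" } });
console.log(line);
```

You can then filter on that field in Logflare — for example, a search like `m.parsedLambdaMessage.lines.data.env:"production"`, following the same path pattern as the `event.tag` search above.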