
Serialization: preloading dashboards in a new Metabase instance

Oct 9, 2020 by The Metabase Team

This article covers how to use Metabase Enterprise’s serialization feature to copy questions, dashboards, collections, settings, and more from one Metabase instance to a new Metabase instance.

Metabase serialization

Many Enterprise customers use Metabase in a multi-tenant environment that requires uploading a predefined set of questions or dashboards, either to set up a new Metabase instance, or a new database connection. This article will cover how to:

  1. Create a default set of questions and dashboards.
  2. Export those dashboards.
  3. Re-import those dashboards to a new instance.

Specifically, we’ll use the dump and load commands in Metabase’s serialization feature to perform parts 2 and 3, plus a little bit of manual curation of the exported files.

We’ll use Docker to run our Metabase environments, and use the open source PostgreSQL for our application databases. We don’t recommend using the default H2 database for production. (H2 ships with Metabase because it’s a lightweight database that makes it easy to get users up and running with Metabase.)

If you’re using a trial version of Metabase Enterprise, be sure to enter your Metabase Enterprise Trial Token to activate Enterprise features.

The plan

We’ll create a Metabase instance (our origin environment), create a dashboard, and load that dashboard into a new Metabase instance (our target environment). Here’s the plan:

  1. Create a dedicated network called metanet.
  2. Spin up two instances of Metabase: origin and target.
  3. Create dashboards and collections in the origin environment.
  4. Dump the data from the origin environment.
  5. Load the origin dump into the target environment.
  6. Verify that our dashboard and collection are loaded in the target environment.

Prerequisites

You’ll need to have Docker installed on your machine.

Step 1 - Create a dedicated network

To create a dedicated network called “metanet”, run the following command from your terminal of choice:

docker network create metanet

You can confirm the network was created with:

docker network ls

The network will have a local scope and a bridge driver.
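You can confirm those properties directly with `docker network inspect`. A small sketch, guarded so it does nothing on a machine without Docker:

```shell
# Print the new network's driver and scope; expected output: "bridge local".
# The guard makes this a no-op on machines without Docker installed.
if command -v docker >/dev/null 2>&1; then
  docker network inspect metanet --format '{{.Driver}} {{.Scope}}' \
    || echo "metanet network not found"
fi
```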

Step 2 - Spin up two instances of Metabase

Spin up two Metabase environments, origin and target (though you can name these environments whatever you like).

Note that we pass --rm -d when creating these Docker containers: --rm removes each container when it stops, and -d runs it in the background. Feel free to change those flags to modify that behavior.

Origin environment

Create the Postgres database:

docker run --rm -d --name postgres \
    -p 5433:5432 \
    -e POSTGRES_USER=metabase \
    -e POSTGRES_PASSWORD=knockknock \
    --network metanet \
    postgres:12

Create the Metabase origin instance, and connect it to the Postgres database we just created:

docker run --rm -d --name metabase-origin \
    -p 5001:3000 \
    -e MB_DB_TYPE=postgres \
    -e MB_DB_DBNAME=metabase \
    -e MB_DB_PORT=5432 \
    -e MB_DB_USER=metabase \
    -e MB_DB_PASS=knockknock \
    -e MB_DB_HOST=postgres \
    --network metanet \
    metabase/metabase-enterprise:v1.36.7

You can check the container’s logs to track its progress:

docker logs metabase-origin

Once you see the line that contains “Metabase initialization COMPLETE”, you can open a browser to http://localhost:5001 to view your Metabase instance.
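Rather than re-running docker logs by hand, you can poll for that line. A minimal sketch; wait_for_marker is our own helper name, not a Metabase or Docker command:

```shell
#!/bin/sh
# Poll a command's output until it contains a marker string, or give up.
# wait_for_marker is a hypothetical helper, not part of Metabase or Docker.
wait_for_marker() {
  marker=$1; shift
  tries=0
  until "$@" 2>&1 | grep -q "$marker"; do
    tries=$((tries + 1))
    [ "$tries" -ge "${MAX_TRIES:-60}" ] && return 1
    sleep "${RETRY_DELAY:-5}"
  done
}

# Block until the origin instance reports it is ready (requires Docker):
# wait_for_marker "Metabase initialization COMPLETE" docker logs metabase-origin
```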

Target environment

Setting up a target environment is similar. On our metanet network, we’ll set up a Postgres database to serve as our application database, then spin up another instance of Metabase in another Docker container.

Note the changes to:

  • Ports for both Postgres (5434) and the Metabase server (5002)
  • Instance names: postgres-target and metabase-target

Application database:

docker run --rm -d --name postgres-target \
    -p 5434:5432 \
    -e POSTGRES_USER=metabase \
    -e POSTGRES_PASSWORD=knockknock \
    --network metanet postgres:12

Metabase instance:

docker run --rm -d --name metabase-target \
    -p 5002:3000 \
    -e MB_DB_TYPE=postgres \
    -e MB_DB_DBNAME=metabase \
    -e MB_DB_PORT=5432 \
    -e MB_DB_USER=metabase \
    -e MB_DB_PASS=knockknock \
    -e MB_DB_HOST=postgres-target \
    --network metanet \
    metabase/metabase-enterprise:v1.36.7

After our Metabase instances complete their initialization, we should have two Metabase environments up and running.

Add users to our metabase-origin environment

Let’s add some users to our metabase-origin instance: one Admin account, and two basic users.

You can add users to your Metabase environment manually (i.e., in the Metabase application), but here’s a quick bash script that creates an Admin user (the initial user) and two basic users:

#!/bin/bash

ADMIN_EMAIL=${MB_ADMIN_EMAIL:-admin@metabase.local}
ADMIN_PASSWORD=${MB_ADMIN_PASSWORD:-Metapass123}

METABASE_HOST=${MB_HOSTNAME}
METABASE_PORT=${MB_PORT:-3000}

echo "⌚︎ Waiting for Metabase to start"
while (! curl -s -m 5 http://${METABASE_HOST}:${METABASE_PORT}/api/session/properties -o /dev/null); do sleep 5; done

echo "😎 Creating admin user"

SETUP_TOKEN=$(curl -s -m 5 -X GET \
    -H "Content-Type: application/json" \
    http://${METABASE_HOST}:${METABASE_PORT}/api/session/properties \
    | jq -r '.["setup-token"]'
)

MB_TOKEN=$(curl -s -X POST \
    -H "Content-type: application/json" \
    http://${METABASE_HOST}:${METABASE_PORT}/api/setup \
    -d '{
    "token": "'${SETUP_TOKEN}'",
    "user": {
        "email": "'${ADMIN_EMAIL}'",
        "first_name": "Metabase",
        "last_name": "Admin",
        "password": "'${ADMIN_PASSWORD}'"
    },
    "prefs": {
        "allow_tracking": false,
        "site_name": "Metawhat"
    }
}' | jq -r '.id')


echo -e "\n👥 Creating some basic users: "
curl -s "http://${METABASE_HOST}:${METABASE_PORT}/api/user" \
    -H 'Content-Type: application/json' \
    -H "X-Metabase-Session: ${MB_TOKEN}" \
    -d '{"first_name":"Basic","last_name":"User","email":"basic@somewhere.com","login_attributes":{"region_filter":"WA"},"password":"'${ADMIN_PASSWORD}'"}'

curl -s "http://${METABASE_HOST}:${METABASE_PORT}/api/user" \
    -H 'Content-Type: application/json' \
    -H "X-Metabase-Session: ${MB_TOKEN}" \
    -d '{"first_name":"Basic 2","last_name":"User","email":"basic2@somewhere.com","login_attributes":{"region_filter":"CA"},"password":"'${ADMIN_PASSWORD}'"}'

echo -e "\n👥 Basic users created!"

You’ll need to have jq installed to handle the JSON in this script. Save the above code as create_users.sh, and make it executable (chmod +x create_users.sh), then run:

MB_HOSTNAME=localhost MB_PORT=5001 ./create_users.sh

With your metabase-origin instance up, and your users created, open up http://localhost:5001, and sign in as the admin user you created:

  • Email: admin@metabase.local
  • Password: Metapass123

You should see a fresh instance of Metabase (figure 1).

Figure 1. A fresh instance of Metabase.

Step 3 - Create dashboards and collections in the origin environment

We’ll need some application data to export, so let’s create some dashboards using the Sample Dataset included with Metabase. Or rather, let’s let Metabase create some dashboards for us!

As shown in figure 2, in the TRY THESE X-RAYS BASED ON YOUR DATA section, click on the card with a lightning bolt that says A look at your Products table.

Metabase will generate a set of questions for you that you can save as a dashboard (figure 2).

Figure 2. An X-ray of the Products table in the Sample Dataset included with Metabase.

Click on the green Save this button, and Metabase will save the dashboard and its questions in a collection titled A look at your Products table.

This collection will be saved to a parent collection titled Automatically Generated Dashboards. You can find this collection by clicking on the Metabase logo in the upper left of the navigation bar to return to Metabase home. From the home page, in the Our Analytics section, click on Automatically Generated Dashboards; from there you should see the collection A look at your Products table (figure 3).

Figure 3. A collection titled A look at your Products table.

Next, create a new collection. Call it whatever you like. We’ll give our collection the exciting name of Default Collection, and save it to the Our Analytics collection.

Figure 4. Creating a new collection, titled Default Collection.

Then we’ll move our A look at your Products table collection to our newly created Default Collection (figure 5).

Figure 5. Adding the A look at your Products Table collection to the Default Collection.

Step 4 - Dump from origin environment

Here’s where we actually start using Metabase’s serialization feature.

With our metabase-origin instance set up with some questions, now it’s time to dump this data and load it into our metabase-target environment. That way we don’t have to manually recreate our Default Collection in the target environment.

Let’s first create a directory in our /tmp directory called metabase_data to store our dump:

cd /tmp
mkdir metabase_data

Next, we’ll run the dump command:

docker run --rm --name metabase-dump \
    --network metanet \
    -e MB_DB_CONNECTION_URI="postgres://postgres:5432/metabase?user=metabase&password=knockknock" \
    -v "/tmp/metabase_data:/target" \
    metabase/metabase-enterprise:v1.36.4 "dump /target"

This command creates a temporary Metabase instance called metabase-dump. This temporary instance connects to the Postgres application database for the metabase-origin environment and exports that environment’s data.

If all goes well, after a few seconds you should see some output, followed by a message in your terminal that says “Finished running data migrations.”

To verify the dump, cd into your directory: /tmp/metabase_data. You should see two directories, and three YAML files:
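A quick way to eyeball the layout from the terminal (guarded so it’s a no-op if the dump directory doesn’t exist):

```shell
# Expected layout: collections/ and databases/ directories, plus the
# YAML files described below (manifest, settings, ...).
if [ -d /tmp/metabase_data ]; then
  ls -p /tmp/metabase_data
fi
```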

Manifest

The manifest file contains some basic information about the environment:

serialization-version: 1
metabase-version:
  date: '2020-08-19'
  tag: v1.36.4
  branch: enterprise-release-1.36.x
  hash: 0324e9c

Settings

The settings file contains a number of options that you can configure when setting up a new instance. Here’s a peek at the settings file:

enable-whitelabeling?: null
jwt-enabled: 'false'
ldap-host: null
jwt-attribute-email: null
engines: null
application-colors: '{}'
enable-embedding: 'false'
jwt-shared-secret: null
enable-xrays: 'true'
...

Databases

This directory contains all of the metadata settings for your connected databases. In this case, we only have the Sample Dataset included with Metabase.

Collections

In the collections directory, we’ll find the data we set up.

Here’s our Default_collection.yaml:

description: A default collection that features our default questions.
archived: false
slug: default_collection
color: '#509EE3'
name: Default collection
namespace: null

Here’s a peek at an example question, titled Days when Products were added:

enable_embedding: false
visualization_settings:
  graph.series_labels:
  - number
  graph.metrics:
  - count
  graph.dimensions:
  - CREATED_AT
  graph.colors:
  - '#509EE3'
  graph.x_axis.title_text: Created At by day of the month
dataset_query:
  type: query
  database: /databases/Sample Dataset
  query:
    source-table: /databases/Sample Dataset/schemas/PUBLIC/tables/PRODUCTS
    breakout:
    - - datetime-field
      - - field-id
        - /databases/Sample Dataset/schemas/PUBLIC/tables/PRODUCTS/fields/CREATED_AT
      - day-of-month
    aggregation:
    - - count
name: Days when Products were added
archived: false
collection_position: null
database_id: /databases/Sample Dataset
embedding_params: null
table_id: /databases/Sample Dataset/schemas/PUBLIC/tables/PRODUCTS
...

Step 5 - Load into target environment

You’ll need at least one admin account in our metabase-target environment in order to upload a dump. You can log in via the app to create that user, or use the script we used above. Just remember to change MB_PORT to 5002, since that’s the port we assigned to our metabase-target environment. E.g., cd into the directory where you saved your create_users.sh script, and run:

MB_HOSTNAME=localhost MB_PORT=5002 ./create_users.sh

We can upload all of these settings into the target environment, but let’s assume we only want to load our default collection.

Let’s copy our /tmp/metabase_data directory so we can keep the original contents and make changes to the copy.

cp -r /tmp/metabase_data /tmp/serialize_load

Change into our new /tmp/serialize_load directory. We’re going to make two changes:

  1. Since every Metabase instance includes the Sample Dataset, and we didn’t make any changes to the metadata, let’s delete the databases directory. From within the /tmp/serialize_load directory, run rm -r databases.
  2. Additionally, let’s remove the Automatically Generated Dashboards collection, since we’re only interested in uploading our Default collection:
cd /tmp/serialize_load/collections/root/collections && rm -r Automatically\ Generated\ Dashboards/
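The two removals above can be wrapped in a small script. This is just a sketch; prune_dump is our own helper name, not a Metabase command, and the paths match the copy made above:

```shell
#!/bin/sh
# Trim a copied dump before loading: drop database metadata (every instance
# already ships the Sample Dataset) and the auto-generated collection.
# prune_dump is a hypothetical helper name, not a Metabase command.
prune_dump() {
  dump_dir=$1
  rm -rf "$dump_dir/databases"
  rm -rf "$dump_dir/collections/root/collections/Automatically Generated Dashboards"
}

# prune_dump /tmp/serialize_load
```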

To verify the changes, you can run diff to see the differences between the original metabase_data directory and the serialize_load directory you’ll use to load into the metabase-target environment:

diff -r metabase_data serialize_load

And you should see the following:

Only in metabase_data/collections/root/collections: Automatically Generated Dashboards
Only in metabase_data: databases

Now, with our /tmp/serialize_load directory set, we can run the load command to load the metadata into our target environment, metabase-target.

docker run --rm --name metabase-load \
    --network metanet \
    -e MB_DB_CONNECTION_URI="postgres://postgres-target:5432/metabase?user=metabase&password=knockknock" \
    -v "/tmp/serialize_load:/target" \
    metabase/metabase-enterprise:v1.36.4 "load /target"

Step 6 - Verify dashboard and collection in target environment

Now, if you log in to the target environment at http://localhost:5002, you should see our Default collection ready to go, containing our A look at your Products table collection.
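If you prefer the terminal, you can also confirm the load over Metabase’s HTTP API. A sketch, assuming the admin credentials created by create_users.sh; /api/session, /api/health, and /api/collection are standard Metabase endpoints:

```shell
#!/bin/sh
# Log in via the API and list top-level collection names; after a successful
# load, "Default collection" should appear in the output. The guard skips
# everything if curl/jq are missing or the target instance isn't reachable.
HOST=http://localhost:5002

if command -v curl >/dev/null 2>&1 && command -v jq >/dev/null 2>&1 \
   && curl -s -m 2 "$HOST/api/health" -o /dev/null; then
  MB_TOKEN=$(curl -s -X POST "$HOST/api/session" \
    -H "Content-Type: application/json" \
    -d '{"username":"admin@metabase.local","password":"Metapass123"}' | jq -r '.id')

  curl -s "$HOST/api/collection" \
    -H "X-Metabase-Session: ${MB_TOKEN}" | jq -r '.[].name'
fi
```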

And that’s it: you’ve preloaded a fresh instance of Metabase with a collection containing a dashboard full of questions!
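When you’re done experimenting, teardown is simple: because the containers were started with --rm, stopping them also removes them. A sketch, guarded so the commands are skipped without Docker:

```shell
# Stop (and, thanks to --rm, remove) all four containers, then the network.
# The guard makes this a no-op on machines without Docker installed.
if command -v docker >/dev/null 2>&1; then
  docker stop metabase-origin metabase-target postgres postgres-target || true
  docker network rm metanet || true
fi
```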

Serialization limitations

Just note that serialization dumps do not contain certain data:

  • Permission settings
  • User accounts or settings
  • Alerts on saved questions
  • Personal Collections or their contents

Other use cases for serialization

Using the serialization feature to export questions and dashboards opens up some cool possibilities, including:

  • Adding version control to questions and dashboards. You can check in the downloaded metadata to a repository, and manage changes to that data via version control software like git.
  • Setting up a staging environment for Metabase. You can play around with a staging environment until you’re happy with the changes, then export the metadata, and upload it to a production environment.

Play around with the serialization feature, and let us know how you’re using it on our forum.

Further reading