Backing up Domino

Overview

The information necessary to recover Domino projects and data is stored in four systems, each of which has its own backup process. Cloud deployments have these systems backed up automatically to Amazon Simple Storage Service (S3) at least once per day.

The four storage systems are:

  • Blob Store: stores the files users upload or add to their projects
  • Git Server: stores the revision history of all projects
  • MongoDB: stores user accounts, run history, comments, environments, and configuration
  • etcd: key-value database used by Kubernetes to maintain quorum and verify consistency

See below for more information on each service, plus instructions on how to perform manual backups for on-premises deployments.

Using the domino-deployment CLI

The easiest way to manually trigger a backup of an entire Domino deployment with access to AWS is the domino-deployment command line interface. This tool is typically available on an instance in your deployment; ask Domino Support or your Customer Success Engineer if you do not know where to find it. To trigger a backup of your deployment, run:

bin/domino-deployment backup

This will trigger backups of MongoDB and Git to S3, and it will create Amazon Machine Image (AMI) snapshots of the host servers in your deployment. For more information on this command's options and behavior, read the domino-deployment CLI reference document.

Blob Store

The blob store contains the content of the files that users have in their projects. This is distinct from the Git server, which organizes those projects into revisions.

Cloud backups

For AWS installations, the blob store itself resides directly on Amazon S3. S3 provides durable, high-availability storage such that backups are transparent to the user. Blob store files stored in S3 by Domino are never modified or deleted by Domino.

Manual on-premise backups

For on-premises installations, the blob store is a directory on disk, either on a network (NFS) drive or on the local drive of a single-server installation. Its size depends on the project data in Domino and can grow to many terabytes. It can be backed up without downtime with the following procedure:

  1. Determine the location of the blob store on disk:
    1. Sign in to the Domino web interface as a user with administrative privileges
    2. Click your username on the top right, then click Admin
    3. Click Advanced, then click Central Config.
    4. Look for the line with key com.cerebro.domino.blobStorageMedium. Ensure that its value is FileSystem.
    5. Look for the line with key com.cerebro.domino.blobFileRoot. Its value will be the path to the blob store, e.g. /domino/blobs.
  2. SSH into the server hosting the Domino web interface. This will be the address you use to load the Domino application in your browser. If you do not have access to the private SSH key for this deployment, contact your administrator or Domino Support.
  3. On this host, the blob store directory (e.g. /domino/blobs per the example above) is the directory that you need to back up to preserve all project files. The blob store can get very large in an active Domino deployment, so setting up an incremental backup process where only new files are copied is recommended. Once created, files in the blob store are never modified or deleted by Domino.

Git server

The Git server contains the revision history of all projects. It tracks changes to files and is used to reconstruct your project at a particular state when you sync or browse files in the UI. It contains only metadata, in the form of links to files that are stored in the blob store.

Cloud backups

For AWS deployments, Git server data is automatically backed up daily and uploaded to S3. The most recent few days of backup are also available on the central server under /domino/backup/git. To confirm backups are being generated, SSH into the server hosting the Domino web interface, and run ls /domino/backup/git/. You should see output like this:

ls /domino/backup/git/
20171112-0651.tar.gz  20171115-0650.tar.gz  20171118-0637.tar.gz
20171113-0649.tar.gz  20171116-0647.tar.gz  20171119-0645.tar.gz
20171114-0630.tar.gz  20171117-0642.tar.gz  20171120-0628.tar.gz
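To check freshness automatically rather than by eye, a sketch like the following compares the newest filename's date prefix against a cutoff. The backups_current function is an assumption for illustration; it relies on the YYYYMMDD-HHMM naming shown above and on GNU date.

```shell
# Sketch: succeed if the newest backup's YYYYMMDD filename prefix is no
# more than two days old (assumed helper; requires GNU date).
backups_current() {
  dir=$1
  newest=$(ls "$dir" | sort | tail -n 1)
  newest_day=${newest%%-*}
  cutoff=$(date -d '2 days ago' +%Y%m%d)
  # YYYYMMDD strings sort chronologically, so the later of the two
  # strings is the later date.
  later=$(printf '%s\n%s\n' "$cutoff" "$newest_day" | sort | tail -n 1)
  [ "$later" = "$newest_day" ]
}

# Example: backups_current /domino/backup/git && echo "backups are current"
```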

Manual on-premise backups

On-premises deployments will need to set up their own backups for the Git server. Backing up the Git server involves backing up the directory which contains all of its Git repositories. Depending on the number of projects and revisions, this directory can be larger than 100MB. It can be backed up without downtime with the following procedure:

  1. Determine the Git server address:
    1. Sign in to Domino’s web interface as a user with administrative privileges
    2. Click your username on the top right, then click Admin
    3. Click Advanced, then click Central Config.
    4. Look for the line with key com.cerebro.domino.internalGitRepoHost. Its value should be something like http://10.0.13.163:9001. The server address in this example would be 10.0.13.163. Record this address for use in the next step.
  2. SSH into the Git server using the address from the previous step.
  3. Navigate to /domino/git. Within it there should be a directory called projectrepos. If /domino/git does not exist, search for projectrepos system-wide by running:

    find / -name projectrepos

  4. This is the directory that you need to back up. You can compress it to a single file by running:

    tar czf domino_git_backup.tar.gz projectrepos
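The steps above can be combined into a dated archive, mirroring the YYYYMMDD-HHMM naming used by the automated cloud backups. The backup_git function and output directory are illustrative assumptions, not Domino conventions:

```shell
# Sketch: archive projectrepos into a dated tarball (assumed helper).
backup_git() {
  parent=$1   # directory containing projectrepos, e.g. /domino/git
  out_dir=$2  # where to write the archive, e.g. /backup/git (example)
  mkdir -p "$out_dir"
  stamp=$(date +%Y%m%d-%H%M)
  # -C avoids embedding absolute paths in the archive.
  tar -C "$parent" -czf "$out_dir/$stamp.tar.gz" projectrepos
}

# Example: backup_git /domino/git /backup/git
```

Scheduling a call like this from cron gives you the same daily cadence as the cloud backups.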

MongoDB

MongoDB contains user account information, run history and textual output (stdout), comments, environments, and system configuration. In addition to user data and other app-related functionality, it contains a “projects” collection that links each project in the UI to a git repository.

Cloud backups

For AWS deployments, MongoDB data is automatically backed up daily and uploaded to S3. The most recent few days of backup are also available on the central server under /domino/backup/mongodb. To confirm backups are being generated, SSH into the server hosting the Domino web interface, and run ls /domino/backup/mongodb/. You should see output like this:

ls /domino/backup/mongodb/
20171112-0651.tar.gz  20171115-0650.tar.gz  20171118-0637.tar.gz
20171113-0649.tar.gz  20171116-0647.tar.gz  20171119-0645.tar.gz
20171114-0630.tar.gz  20171117-0642.tar.gz  20171120-0629.tar.gz

Manual on-premise backups

For on-premises deployments, backing up the Mongo database involves exporting the contents of the database to disk and then backing up the resulting files. Depending on the number of runs and the size of run outputs in Domino, the Mongo database can be larger than 10GB. It can be backed up without downtime with the following procedure:

  1. Determine the Mongo server address and the password for the domino Mongo user:
    1. Sign in to Domino’s web interface as a user with administrative privileges
    2. Click your username on the top right, then click Admin
    3. Click Advanced, then click Central Config.
    4. Look for the line with key mongodb.default.uri. Its value should be something like:

      mongodb://domino:kyf9C9b6OJ9gwI1GfNIb9iWUOLSSZK31@13.0.128.30:27017/domino?connectTimeoutMS=60000&socketTimeoutMS=60000&maxPoolSize=1000
      

      The password is delimited by domino: on the left and @ on the right. In the example above the password is kyf9C9b6OJ9gwI1GfNIb9iWUOLSSZK31. Record your database’s password for use in the next step.

      The server address is delimited by @ on the left and :27017 on the right. In the example above the address is 13.0.128.30. Record this address for use in the next step.

  2. SSH into the Mongo server using the address from the previous step.

  3. Navigate to a directory with enough free space, e.g. /tmp.

  4. Execute the following command to export the Mongo data to disk, replacing $PASSWORD with the password you retrieved and recorded earlier:

    mongodump -u domino -p $PASSWORD -d domino -o domino_mongo_backup

    This command exports the database contents to a new directory named domino_mongo_backup. This is the directory you need to back up. You can compress it to a single file by running:

    tar czf domino_mongo_backup.tar.gz domino_mongo_backup
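Extracting the password and server address from the URI in step 1 can also be scripted instead of done by eye. A sketch with sed, using the example URI from above (substitute your own mongodb.default.uri value):

```shell
# Sketch: pull the password and host out of a MongoDB connection URI.
# The URI below is the documentation example, not a real credential.
uri='mongodb://domino:kyf9C9b6OJ9gwI1GfNIb9iWUOLSSZK31@13.0.128.30:27017/domino?connectTimeoutMS=60000&socketTimeoutMS=60000&maxPoolSize=1000'

# Password: the text between the user's ":" and the "@".
password=$(printf '%s\n' "$uri" | sed -n 's|^mongodb://[^:]*:\([^@]*\)@.*|\1|p')
# Host: the text between the "@" and the ":" before the port.
host=$(printf '%s\n' "$uri" | sed -n 's|.*@\([^:]*\):.*|\1|p')

echo "host=$host password=$password"
# -> host=13.0.128.30 password=kyf9C9b6OJ9gwI1GfNIb9iWUOLSSZK31
```

The extracted values can then be passed to the mongodump command in step 4.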

etcd

etcd is the key-value database used by Kubernetes to maintain quorum and verify consistency between hosts in the Domino deployment. etcd requires at least three nodes that must be in constant communication to ensure consistency. Backups allow etcd to restore quorum and recover data in a situation where members have irrevocably lost access to each other.

Cloud backups

For AWS deployments, etcd data is automatically backed up daily and uploaded to S3. The most recent few days of backup are also available on the central server under /domino/backup/etcd. To confirm backups are being generated, SSH into the server hosting the Domino web interface, and run ls /domino/backup/etcd/. You should see output like this:

ls /domino/backup/etcd/
20180227-0625  20180301-0625  20180303-0625
20180305-0625  20180307-0625  20180309-0625
20180311-0625  20180313-0625  20180315-0625

Manual on-premise backups

Instructions for performing manual etcd backups for on-premises deployments are under development and will be available in a future version of this document.
