Setting Up Rendering Services and Shared Storage

Here are some additional tips on setting up an Apple Qmaster “render farm.”

Changing the Number of Rendering Service Instances

By default, Apple Qmaster enables one rendering service instance per core, so a dual-core system has two rendering services enabled. Out of the box, that means two copies of Shake can run on a dual-core system at the same time, and it may be necessary to disable a rendering service instance. For more information on changing the number of rendering service instances, see the “Apple Qmaster and Distributed Processing” chapter in the Compressor User Manual, available in Compressor Help.

Note: If you are using Shake, you can always change the number of processors Shake uses at submission time with the -cpus x option, where x is the number of threads to use.
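For example, a command-line submission might limit Shake to a single thread per instance. This is a sketch only; the script name and frame range below are hypothetical.

```shell
# Hypothetical script name and frame range; -cpus limits the number
# of threads this Shake render uses.
shake -exec myscript.shk -t 1-100 -cpus 1
```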

Shared Volumes

All the nodes in the cluster must have a common volume to work from; otherwise they will not know where to find assets needed for any given batch. There are many ways to set up file sharing. For more information, see the Mac OS X documentation on file sharing and the Mac OS X Server documentation.
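As one illustration, each client node could mount the shared storage at the same path as every other node, so that file paths in a batch resolve identically everywhere. The server name and volume paths below are hypothetical.

```shell
# Hypothetical NFS server and volume; every node should see the
# shared storage at the same mount point (here /Volumes/Media).
mkdir -p /Volumes/Media
mount -t nfs fileserver.example.com:/Volumes/Media /Volumes/Media
```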

Dedicated NFS Servers

You can set up a dedicated NFS server. For more information, see the Mac OS X Server documentation.
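On the server side, a minimal export entry might look like the following in /etc/exports; the volume path and network range are hypothetical, and the syntax shown is the Mac OS X exports format.

```shell
# Hypothetical /etc/exports entry: export /Volumes/Media to hosts
# on the local subnet. Restart the NFS service after editing.
/Volumes/Media -network 192.168.1.0 -mask 255.255.255.0
```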

Shared Volume Media Management with Shake

It may be necessary to relink your assets once you move your project to the shared volume. What needs relinking varies depending on the types of files, plug-ins, fonts, and environment requirements involved. Things that commonly need to be changed are the FileIn and FileOut paths, and environment variables such as NR_INCLUDE_PATH and NR_FONT_PATH. UNC should be disabled, because UNC paths are typically not resolvable by nodes in the cluster when arbitrary host names are used.
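For instance, the Shake environment variables could be pointed at directories on the shared volume before launching or submitting. The paths below are hypothetical; substitute your own mount point.

```shell
# Hypothetical shared-volume paths; adjust to your actual mount point.
export NR_INCLUDE_PATH=/Volumes/Media/shake/include
export NR_FONT_PATH=/Volumes/Media/shake/fonts
```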

To disable UNC and enable Apple Qmaster from within the Shake application
  1. Go to: ~/nreal/include/startup

  2. Create a file called: qmaster.h

  3. Add the following:

    script.uncFileNames = 0;
    sys.useRenderQueue = "Qmaster";
  4. Press Return several times after the last line.

  5. Save your work.
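The steps above can be sketched as a short shell sequence, assuming the default ~/nreal location described in step 1. The trailing blank lines inside the heredoc correspond to pressing Return several times after the last line.

```shell
# Create the startup directory if needed, then write qmaster.h,
# ending the file with blank lines as the steps above require.
mkdir -p ~/nreal/include/startup
cat > ~/nreal/include/startup/qmaster.h <<'EOF'
script.uncFileNames = 0;
sys.useRenderQueue = "Qmaster";


EOF
```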