High performance computing (HPC) facilities are very useful for Monte Carlo simulations – the simulation of particle transport is an application ideal for parallel processing. This post discusses how the HPC facilities at the Queensland University of Technology (QUT) can be used for EGSnrc simulations by students and staff. The High Performance Computing and Research Support Group have provided a number of user guides that detail how to access HPC resources such as the network file system and the job submission queues. Depending on your UNIX experience, I’d also recommend familiarising yourself with a text editor such as VIM, for example with this guide.

Setting up EGSnrc

Depending on your application for HPC resources, your account may already have EGSnrc installed. If not, you can install the EGSnrc and BEAMnrc user codes in your home directory using the following commands (where v4-2-3-2 is the latest version):

> cp /pkg/suse11/egs/v4-2-3-2/NewUser/profile_beam /home/$USER/.profile_beam
> echo "source .profile_beam" >> /home/$USER/.profile
> source ~/.profile_beam
> /pkg/suse11/egs/v4-2-3-2/scripts/finalize_egs_foruser
> /pkg/suse11/egs/v4-2-3-2/scripts/finalize_beam_foruser

These commands may prompt you to enter an installation directory, which should be


The first thing to do with a new EGSnrc installation is to copy over any existing files, with accelerator module specifications placed in:


and existing PEGS databases in:


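As a minimal sketch, assuming the standard EGSnrc directory layout (accelerator modules in beamnrc/spec_modules, PEGS databases in pegs4/data, both under ~/egsnrc) – and with ~/old_egsnrc as a purely hypothetical location for your existing files – the copy step might look like:

```shell
# Sketch only: EGS_HOME and SRC are assumptions -- adjust both to match
# your own installation and wherever your old files actually live.
EGS_HOME="${EGS_HOME:-$HOME/egsnrc}"
SRC="$HOME/old_egsnrc"   # hypothetical location of your existing files

mkdir -p "$EGS_HOME/beamnrc/spec_modules" "$EGS_HOME/pegs4/data"

# Accelerator module specification files (.module)
cp "$SRC"/*.module "$EGS_HOME/beamnrc/spec_modules/" 2>/dev/null || true

# PEGS material database files (.pegs4dat)
cp "$SRC"/*.pegs4dat "$EGS_HOME/pegs4/data/" 2>/dev/null || true
```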
Each of the accelerator module specifications needs to be ‘built’, using

> beam_build.exe module

where module is the name of the accelerator module specification file (without an extension). Each time beam_build.exe is executed, a new directory will be created in the EGSnrc user folder, with a name like


For each directory created, you should compile the accelerator both as an executable and a library. For the example mentioned, you would type:

> cd ~/egsnrc/BEAM_WRO_Varian_21iX_10X
> make && make library
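If you have several modules, the build-and-compile steps can be looped. The following is a dry-run sketch: it prints the commands rather than executing them, and assumes your modules sit in the standard spec_modules location – remove the echoes to actually run the commands.

```shell
# Dry-run sketch: print the build and compile commands for every accelerator
# module specification in a directory.  The spec_modules path is an assumed
# standard location; remove the echoes to actually execute the commands.
EGS_HOME="${EGS_HOME:-$HOME/egsnrc}"

list_build_commands() {
    for spec in "$1"/*.module; do
        [ -e "$spec" ] || continue           # skip when no modules are present
        module=$(basename "$spec" .module)
        echo "beam_build.exe $module"
        echo "cd $EGS_HOME/BEAM_$module && make && make library"
    done
}

list_build_commands "$EGS_HOME/beamnrc/spec_modules"
```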

The DOSXYZnrc user code will already have been compiled during installation, but will need to be recompiled to support larger phantoms. The maximum allowable phantom size is specified in the file


This file should be edited to increase the $IMAX, $JMAX and $KMAX values, for example:

REPLACE {$IMAX} WITH {256} "Maximum number of x cells"
REPLACE {$JMAX} WITH {256} "Maximum number of y cells"
REPLACE {$KMAX} WITH {256} "Maximum number of z cells"
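These edits can also be scripted with sed. The sketch below demonstrates the substitution on a small stand-in file so it can be tried safely – to use it for real, point it at dosxyznrc.mortran in your dosxyznrc folder and back the file up first.

```shell
# Sketch: raise the phantom-size macros with sed, demonstrated on a
# stand-in file.  For real use, set F to your dosxyznrc.mortran and
# back it up before editing.
F=/tmp/dosxyznrc_sample.mortran
cat > "$F" <<'EOF'
REPLACE {$IMAX} WITH {128} "Maximum number of x cells"
REPLACE {$JMAX} WITH {128} "Maximum number of y cells"
REPLACE {$KMAX} WITH {128} "Maximum number of z cells"
EOF

# Only touch the $IMAX/$JMAX/$KMAX lines, leaving other REPLACE macros alone.
sed -i '/\$IMAX\|\$JMAX\|\$KMAX/ s/WITH {[0-9]*}/WITH {256}/' "$F"
cat "$F"
```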

Once this is done, DOSXYZnrc should be recompiled using the commands:

> cd ~/egsnrc/dosxyznrc
> make

Job Submission

Before submitting a job, you must decide which queue it will be placed in. The following queues are available:

  • pbs_short – recommended if runtimes will be shorter than 10 hours.
  • pbs_medium – recommended if runtimes will be shorter than 100 hours.
  • pbs_long – for anything longer.
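The cut-offs above can be captured in a small helper function. This is purely a convenience sketch – pick_queue is not part of the HPC toolset:

```shell
# Hypothetical helper (not part of the HPC toolset): choose a queue from
# an estimated runtime in hours, using the cut-offs listed above.
pick_queue() {
    if   [ "$1" -lt 10 ];  then echo pbs_short
    elif [ "$1" -lt 100 ]; then echo pbs_medium
    else                        echo pbs_long
    fi
}

pick_queue 8     # pbs_short
pick_queue 48    # pbs_medium
pick_queue 300   # pbs_long
```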

Jobs are submitted using the EXB script distributed with EGSnrc. Usage is:

> exb user_code input_file pegs_file batch=batch_system p=N

where batch_system is the queue to be used, and N is the number of parts to split the job into (I suggest 10). To simulate the BEAM_WRO_Varian_21iX_10X accelerator, using the input file beam1.egsinp with the default materials database 700icru, in the short queue, in 10 parts, you would type:

> exb BEAM_WRO_Varian_21iX_10X beam1 700icru batch=pbs_short p=10

You can check whether the job was submitted successfully using qusers, or

> qstat -u $USER
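If you want to know when your jobs have all finished, a simple polling loop over qstat works. This is a convenience sketch, not an official tool, and it assumes PBS job lines in the qstat output begin with a numeric job id:

```shell
# Convenience sketch (not an official HPC tool): poll the queue once a
# minute until no jobs remain, then print a message.  Ctrl-C stops it early.
wait_for_queue() {
    # Assumption: job lines in `qstat -u $USER` output start with a numeric id
    while qstat -u "$USER" 2>/dev/null | grep -q '^[0-9]'; do
        sleep 60
    done
    echo "No jobs remaining in the queue."
}

wait_for_queue
```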

The EXB command will create multiple egsrun_* links in the user code directory. These are links to directories on the work nodes. For example, the folder


links to a directory on work node 34 of cluster 1. To access the contents of this directory (to check how many histories have been simulated, or to diagnose a problem with a simulation), you need to ssh into the node. For the above example, you would type:

> ssh cl1n034

Successful BEAMnrc runs produce multiple phase space files, which need to be combined using the addphsp command. For the exb example above, you would type:

> addphsp beam1 beam1 10

Successful DOSXYZnrc runs will automatically combine the resultant pardose files to produce a single 3ddose file.