
Using MFIX at the CHPC


MFIX is an open-source multiphase flow solver written in Fortran 90. It is used for simulating fluid-solid systems such as fluidized beds. As it has numerous potential uses in chemical engineering and mineral processing applications, a limited level of support for the package is now available at the CHPC, at /opt/gridware/non-supported/mfix.

Alternatively, users may download the code and install it in their home directory. A one-time free registration is required before the source code can be downloaded. To register, go to the MFIX website and click on “Register” at the bottom of the home page.

This wiki page details those issues specific to running MFIX on the Sun cluster at CHPC. For a more general explanation of the use of MFIX, consult the documentation and example cases included with the source tarball.

Building MFIX

MFIX generates a new executable for each case, which is copied into the case directory. First, cd to the case directory. Next, add modules for your choice of gcc version and the corresponding version of MPI, for instance:

module add gcc/4.7.2 openmpi/openmpi-1.6.5_gcc-4.7.2

Finally, the mfix executable is built and copied to the current case directory by the following command:

sh /opt/gridware/non-supported/mfix/model/make_mfix

if using the system mfix, or

sh <mfix_install_dir>/model/make_mfix

for other install locations.
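Putting the steps above together, a typical build session might look like the following. The case directory name ~/mfix_cases/fluidbed1 is a hypothetical example; the module names and install path are the ones given above.

```shell
# Go to the case directory containing mfix.dat
# (~/mfix_cases/fluidbed1 is a hypothetical example path)
cd ~/mfix_cases/fluidbed1

# Load a gcc/MPI pair (the versions from the module example above)
module add gcc/4.7.2 openmpi/openmpi-1.6.5_gcc-4.7.2

# Build MFIX; the script prompts for compiler, optimisation and
# parallelisation options, then copies the executable into this directory
sh /opt/gridware/non-supported/mfix/model/make_mfix
```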

This script prompts the user to specify a number of compile-time options, in particular the compiler, the desired level of optimisation and the type of parallelisation. The default compiler is gfortran. The Intel Fortran compiler may also be used, but this has not yet been tested on the Sun cluster. The parallelisation options are serial, parallel shared-memory, parallel distributed-memory, or parallel hybrid (shared and distributed).

At present, parallel distributed-memory appears to work best, and reasonable scaling seems to be obtained. Shared-memory parallel works for many cases, but does not yield a significant improvement for the cases tested to date. Hybrid parallelisation is currently under development and is best avoided at present. Note that use of the Johnson and Jackson partial-slip boundary condition (BC_JJ in the mfix.dat file) causes a crash for all methods of parallelisation, although it works for serial computations.

Setting up and submitting a job

The input for an MFIX case consists of an mfix.dat file, a text file which defines most or all of the properties of the case (geometry, boundary and initial conditions, choice of turbulence and friction models, and so on), as well as any Fortran source files containing user extensions to the standard MFIX solver, and any additional optional files describing geometry. In most cases the mfix.dat file is sufficient, and as this file is relatively small, it may be uploaded using scp.
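For example, a single mfix.dat can be copied to the cluster with scp. The hostname and remote path below are placeholders, not actual CHPC addresses; substitute your own username and case directory:

```shell
# Hypothetical hostname and remote case path; substitute your own
scp mfix.dat username@cluster.example.ac.za:mfix_cases/fluidbed1/
```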

Below is an example of a simple PBS submit script for an mfix job.

#PBS -l select=1:ncpus=4:mpiprocs=4:jobtype=westmere,place=free:group=nodetype
#PBS -l walltime=100:00:00
#PBS -q workq
#PBS -m be
#PBS -o /export/home/agill/local/opt/mfix/mintek/grid_res/coarse/coarse.out
#PBS -e /export/home/agill/local/opt/mfix/mintek/grid_res/coarse/coarse.err

# Run from the case directory containing mfix.dat and the built executable
cd $PBS_O_WORKDIR
exe=$PBS_O_WORKDIR/mfix.exe

nproc=`cat $PBS_NODEFILE | wc -l`
mpirun -np $nproc -machinefile $PBS_NODEFILE $exe -parallel >"coarse.log" 2>&1
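The last two lines of the script derive the MPI rank count from the machine file that PBS provides (one line per allocated slot). The snippet below demonstrates that counting logic on a mock node file with two 2-slot nodes; it is a stand-alone sketch, not part of the submit script:

```shell
# Simulate a $PBS_NODEFILE with two nodes, two slots each
printf 'node1\nnode1\nnode2\nnode2\n' > nodefile

# Same counting logic as the submit script: one MPI rank per line
nproc=$(wc -l < nodefile)
echo $((nproc))
```

Running this prints 4, matching the mpiprocs=4 requested in the example script.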

Remember to edit or add the following line in your mfix.dat file:

NODESI = NX    NODESJ = NY    NODESK = NZ

where NX, NY and NZ should be replaced by the number of partitions along each principal axis of the model, so that NX*NY*NZ equals the total number of cores requested in the submit script. In general, it is best to place the largest number of partitions along the axis (or axes) corresponding roughly with the average flow direction.
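A quick way to catch a mismatch before submitting is to check the product of the partition counts against the rank count. The values below are a hypothetical 8-core decomposition with the flow mainly along the y axis:

```shell
# Hypothetical decomposition: all partitions along y, per the advice above
NODESI=1; NODESJ=8; NODESK=1
nproc=8   # total MPI ranks requested in the submit script

# The product of the partition counts must equal the number of MPI ranks
if [ $((NODESI * NODESJ * NODESK)) -eq "$nproc" ]; then
    echo "decomposition OK"
else
    echo "decomposition MISMATCH"
fi
```

With these values the check prints "decomposition OK"; any other product would indicate that MFIX and the scheduler disagree about the core count.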


MFIX writes a .RES file in the case directory, in addition to optional VTK files representing the solution. Both types of file may be opened using ParaView or a similar VTK viewer. ParaView is installed at the following location on the Sun cluster:


If postprocessing is to be done on CHPC hardware over the network, the following link may help in obtaining decent performance: Remote OpenGL visualization with TurboVNC and VirtualGL

If the postprocessing is to be performed on the user's local machine, the entire contents of the MFIX case directory should be copied to the user's machine using rsync or scp.
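For instance, the whole case directory can be pulled down in one command. The hostname and paths below are again placeholders, not actual CHPC addresses:

```shell
# Hypothetical hostname and paths; -a preserves file attributes,
# -z compresses in transit, -v lists the files as they are copied
rsync -avz username@cluster.example.ac.za:mfix_cases/fluidbed1/ ./fluidbed1/
```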

/var/www/wiki/data/attic/howto/mfix.1415183848.txt.gz · Last modified: 2014/11/05 12:37 by agill