This article describes how to run simulations from the macOS terminal.
Using the design environment (CAD/GUI)
- FDTD: fdtd-solutions [options]
- MODE: mode-solutions [options]
- CHARGE, HEAT, DGTD and FEEM: device [options]
- INTERCONNECT: interconnect [options]
Options
filename
- optional, Opens the specified simulation or project file.
-v
- optional, Outputs the product version number.
scriptFile.lsf
- optional, Opens the specified script file.
-safe-mode
- optional, Turns on safe mode.
-trust-script
- optional, Turns off safe mode.
-run <scriptfile>
- optional, Runs the specified script file.
-nw
-hide
- optional, Hides the CAD window from appearing on the Desktop. Use -nw for FDTD only; use -hide for the other solvers.
-logall
- optional, Generates a log file for each simulation or sweep.
-exit
- optional, Exits the application after running the script file.
-o
- optional, Changes the location where log files are saved.
- All log files will be saved to the relative or absolute directory passed to -o.
- If the argument ends with .log, the last section is treated as a file name.
- Useful when running INTERCONNECT with the -logall option.
Examples
Opening the FDTD design environment (CAD/GUI)
/Applications/Lumerical\ [[verpath]].app/Contents/Applications/FDTD\ Solutions.app/Contents/MacOS/fdtd-solutions &
Running a script with the CAD window hidden.
/Applications/Lumerical\ [[verpath]].app/Contents/Applications/INTERCONNECT.app/Contents/MacOS/interconnect -hide -run scriptfile.lsf
Run simulations without using MPI
- FDTD: fdtd-engine [options]
- FDE: fde-engine [options]
- EME: eme-engine [options]
- varFDTD: varfdtd-engine [options]
- CHARGE: device-engine [options]
- HEAT: thermal-engine [options]
- DGTD: dgtd-engine [options]
- FEEM: feem-engine [options]
Options
filename
- required, the name of the simulation or project file to run.
-t
- optional, Controls the number of threads used. If omitted, all available threads are used.
-v
- optional, Outputs the product version number.
-fullinfo
- optional, Prints more detailed time benchmarking information to the log file, based on wall-time and CPU-time measurements.
-log-stdout
- optional, Redirects the log file data to the standard output, rather than saving it to file.
- This option will be ignored when the simulation runs in graphical mode.
-mesh-only
- optional, Mesh the geometry without running the simulation.
-inmaterialfile <file>
- optional, Loads simulation mesh data from <file>.
-outmaterialfile <file>
- optional, Saves simulation mesh data to <file> for use in another project.
-logall
- optional, Create a log file for each simulation or sweep.
- Log files are named filename_p0.log, filename_p1.log, filename_p2.log, and so on.
- By default, only filename_p0.log is created.
-mr
- optional, Prints a simple memory usage report for the given simulation file to the standard output. The output can be redirected or saved as a text file.
-o
- optional, Changes the location where log files are saved.
- All log files will be saved to the relative or absolute directory passed to -o.
- If the argument ends with .log, the last section is treated as a file name.
-resume
- optional, available for FDTD simulations only.
- Resumes the simulation from the last check point.
- If no check point is found it will start the simulation job from the beginning.
- Enable the simulation checkpoint feature in the "Advanced Options" of the FDTD Solver object.
Examples
Running on the local computer with the -resume flag when check point is enabled in FDTD.
/Applications/Lumerical\ [[verpath]].app/Contents/Applications/FDTD\ Solutions.app/Contents/MacOS/fdtd-engine -t 2 -resume /path/simulationfile.fsp
Running a varFDTD simulation with 2 threads and saving the log into a specific location.
/Applications/Lumerical\ [[verpath]].app/Contents/Applications/MODE\ Solutions.app/Contents/MacOS/varfdtd-engine -t 2 example.lms -o /<path>/logfiles/
Run a FEEM simulation job
/Applications/Lumerical\ [[verpath]].app/Contents/Applications/DEVICE.app/Contents/MacOS/feem-engine example.ldev
Running simulations via the MPI
Running simulation jobs with MPI covers the following use cases:
- Running several simulations at the same time on different machines or nodes (concurrent computing).
- Using several machines to run a single simulation, taking advantage of their combined memory (RAM) when the simulation requires it (distributed computing).
- Launching a simulation from a local machine and running it on a remote machine or node.
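The first use case (several jobs on one machine, run back to back) can be sketched with a plain shell loop. The engine path, thread count, and simulation file names below are assumptions; the loop only prints the commands it would run, so it can be tried without the solver installed:

```shell
# Hypothetical batch driver (dry run): print the engine command for each
# simulation file instead of launching the licensed solver.
ENGINE="/Applications/Lumerical [[verpath]].app/Contents/Applications/FDTD Solutions.app/Contents/MacOS/fdtd-engine"
for fsp in sim_a.fsp sim_b.fsp; do
  echo "\"$ENGINE\" -t 4 $fsp"
done > batch_commands.txt
cat batch_commands.txt
```

To actually launch the jobs sequentially, replace the echo line with "$ENGINE" -t 4 "$fsp" and drop the redirection.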
MPI is a complex application with many configuration options and versions. On macOS, MPICH2 is supplied with the Lumerical installation.
General MPI Syntax
mpiexec [mpi_options] solver [solver_options]
MPI Options
-n <#>
- FDTD and varFDTD, specifies the number <#> of MPI processes.
-hosts <hostlist>
-hosts <hostfile>
- FDTD and varFDTD, sends the job across multiple computers.
- Overrides the '-n' option.
- hostlist: comma-separated list of host names or IP addresses, each followed by a colon ':' and its number of processes.
- hostfile: text file with one hostname/IP per line, each followed by a colon ':' and its number of processes.
-nice -n19
- all solvers, specifies the process priority for load balancing.
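As a sketch of the two host formats (the host names node1/node2 and process counts are placeholders): a hostlist is passed inline, for example -hosts node1:4,node2:4, while a hostfile holds one host per line with its process count after the colon and can be created like this:

```shell
# Create a hypothetical hostfile "hosts.txt": one hostname or IP address
# per line, followed by ':' and the number of processes for that host.
cat > hosts.txt <<'EOF'
node1:4
node2:4
EOF
cat hosts.txt
```

The file would then be passed with the -hosts option listed above, e.g. mpiexec -hosts hosts.txt.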
For additional information on MPI options, consult the MPI documentation:
"/opt/lumerical/mpich2/bin/mpiexec" -help
Supported MPI variants
The solver build must match the version of MPI used to run it. See the list below for details.
MPICH2
The following solver engine executables, built for MPICH2 (nemesis), are included with the installation and can be run on remote machines or on the local host.
- fdtd-engine
- fde-engine
- eme-engine
- varfdtd-engine
- device-engine
- thermal-engine
- dgtd-engine
- feem-engine
Examples
Run on local computer with 2 processes
"/opt/lumerical/mpich2/bin/mpiexec" -n 2 "/Applications/Lumerical [[verpath]].app/Contents/Applications/FDTD Solutions.app/Contents/MacOS/fdtd-engine" -t 1 /tmp/example.fsp
Running on the local computer with the -resume flag when check point is enabled in FDTD.
"/opt/lumerical/mpich2/bin/mpiexec" -n 12 "/Applications/Lumerical [[verpath]].app/Contents/Applications/FDTD Solutions.app/Contents/MacOS/fdtd-engine" -t 1 -resume /tmp/example.fsp
Run on a remote computer with 2 processes
"/opt/lumerical/mpich2/bin/mpiexec" -n 2 -host node2 "/Applications/Lumerical [[verpath]].app/Contents/Applications/FDTD Solutions.app/Contents/MacOS/fdtd-engine" -t 1 /tmp/example.fsp
Distribute simulation between two computers, using 4 processes per node.
/opt/lumerical/mpich2/bin/mpiexec -hosts IP_node1:4,IP_node2:4 /Applications/Lumerical\ [[verpath]].app/Contents/Applications/FDTD\ Solutions.app/Contents/MacOS/fdtd-engine -logall -t 1 /tmp/example.fsp
Note: Use the IP address of the node instead of the hostname if the system cannot resolve host names.
Pipe standard output to a text file
- The standard output does not appear in the terminal window. To see the report, redirect the output to a text file using the redirection operator ">".
- For example, to save the engine version number or the memory usage report to a file, use the following syntax.
/Applications/Lumerical\ [[verpath]].app/Contents/Applications/FDTD\ Solutions.app/Contents/MacOS/fdtd-engine -v > $HOME/temp/version.txt
/Applications/Lumerical\ [[verpath]].app/Contents/Applications/DEVICE.app/Contents/MacOS/dgtd-engine -mr $HOME/temp/example.ldev > $HOME/temp/example_mem_usage.txt
CPi - MPI test program
This test application lets users verify that MPI is properly configured, without the additional complication of running a Lumerical solver.
For example, this avoids potential problems with product licensing, since neither MPI nor CPi is a licensed feature.
MPICH2 Nemesis
Run CPi using 4 processes on the local computer.
/opt/lumerical/mpich2/bin/mpiexec -n 4 /Applications/Lumerical\ [[verpath]].app/Contents/Applications/FDTD\ Solutions.app/Contents/MacOS/cpi
The output of the CPi test should look something like this:
Process 2 on localhost
Process 1 on localhost
Process 3 on localhost
Process 0 on localhost
pi is approximately 3.1416009869231249, Error is 0.0000083333333318
wall clock time = 0.000049