Returns a struct containing the names (keys) and filenames (values) of functions required by the supplied functions. If exclude (a cell array of strings) is given, exclude all functions whose filepaths start with one of the filepath prefixes in exclude.
The cell array extras returns any additional data files required by the functions. It is determined by calling any dependent function named ’__depends_extra_files__()’, which should return file names as multiple string arguments.
  octprefixes = cellfun(@octapps_config_info, {"fcnfiledir", "octfiledir"}, "UniformOutput", false);
  [deps, extras] = depends(octprefixes, "parseOptions");
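For example, a function in the dependency chain can declare additional data files by providing a function named __depends_extra_files__(); a minimal sketch, with hypothetical file names returned as separate string outputs:

  function [f1, f2] = __depends_extra_files__()
    ## each output is the name of an extra data file (hypothetical examples)
    f1 = "my_template_bank.dat";
    f2 = "my_ephemeris.dat";
  endfunction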
Set up a Condor DAG for running Condor jobs. Returns the name of the Condor DAG submit file.
dag_name
name of Condor DAG, used to name DAG submit file
job_nodes
struct array of job nodes, which has the following fields:
file
name of Condor submit file for this job
vars
struct of variable substitutions to make
child
array indexing child job nodes for this node
retries
how many times to retry Condor jobs (default: 0)
sub_dags
split DAG into this many subfiles (default: 1)
  oldpwd = pwd;
  jobdir = mkpath(tempname(tempdir));
  unwind_protect
    cd(jobdir);
    jobname = "test_makeCondorDAG";
    job = makeCondorJob("job_name", jobname, ...
                        "log_dir", pwd, "func_name", "__test_parseOptions__", ...
                        "func_nargout", 1, ...
                        "arguments", { ...
                          "--real-strictpos-scalar", "$(x)", ...
                          "--integer-vector", [3,9,5], ...
                          "--string", "Hi there", ...
                          "--cell", {1,{2,3}}, ...
                        }, ...
                        "data_files", { ...
                          fullfile(fileparts(file_in_loadpath("readSFT.m")), "SFT-good") ...
                        }, ...
                        "extra_condor", { ...
                          "requirements", "TARGET.has_avx == true", ...
                        } ...
                       );
    assert(exist("./test_makeCondorDAG.job") == 2);
    assert(exist("./test_makeCondorDAG.sh") == 2);
    assert(exist("./test_makeCondorDAG.in") == 7);
    assert(exist("./test_makeCondorDAG.in/.exec") == 7);
    assert(exist("./test_makeCondorDAG.in/.func") == 7);
    assert(exist("./test_makeCondorDAG.in/SFT-good") == 2);
    nodes = struct;
    node = struct;
    node.file = job;
    node.vars.x = 1.23;
    nodes(1) = node;
    node.vars.x = 4.56;
    nodes(2) = node;
    makeCondorDAG("dag_name", jobname, "job_nodes", nodes);
    assert(exist("./test_makeCondorDAG.dag") == 2);
    assert(exist("./test_makeCondorDAG_nodes.bin.gz") == 2);
    assert(exist("./test_makeCondorDAG.out") == 7);
    assert(exist("./test_makeCondorDAG.out/00") == 7);
    assert(exist("./test_makeCondorDAG.out/01") == 7);
  unwind_protect_cleanup
    cd(oldpwd);
  end_unwind_protect
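The demo above builds a DAG with two independent nodes. A minimal sketch of a dependency between nodes, assuming job is a submit file returned by makeCondorJob() as above, uses the child field so that the second node runs only after the first completes:

  nodes = struct;
  node = struct;
  node.file = job;       ## Condor submit file from makeCondorJob()
  node.vars.x = 1.0;
  node.child = 2;        ## node 2 is a child of node 1
  nodes(1) = node;
  node.vars.x = 2.0;
  node.child = [];       ## no children
  nodes(2) = node;
  makeCondorDAG("dag_name", "example_dag", "job_nodes", nodes, "retries", 1);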
Set up a Condor job for running Octave scripts or executables. Returns the name of the Condor job submit file.
job_name
name of Condor job, used to name submit file and input/output directories
log_dir
where to write Condor log files (default: $TMP)
data_files
cell array of required data files; each element may be either a file name, or a cell array of the form {ENVPATH, file_name_in_ENVPATH, …}, where ENVPATH is the name of an environment path
extra_condor
extra commands to write to Condor submit file, in form: {‘command’, ‘value’, …}
func_name
name of Octave function to run
arguments
cell array of arguments to pass to the function; use $(variable) to insert a reference to a Condor variable
func_nargout
how many outputs returned by the function to save
exec_files
cell array of executable files required by the function
output_format
output format of the file containing saved outputs from the function:
Oct(Text|Bin)(Z)
Octave (text|binary) (zipped) format; file extension will be .(txt|bin)(.gz)
HDF5
Hierarchical Data Format version 5 format; file extension will be .hdf5
Mat
Matlab (version 6) binary format; file extension will be .mat
Default is "OctBinZ"
executable
name of executable to run
arguments
cell array of arguments to pass to the executable; use $(variable) to insert a reference to a Condor variable
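A rough sketch of a job that runs an external executable instead of an Octave function (the executable name, arguments, and Condor variable below are illustrative, not taken from this manual):

  job = makeCondorJob( ...
    "job_name", "example_exec_job", ...
    "log_dir", pwd, ...
    "executable", "my_analysis_program", ...   ## hypothetical executable
    "arguments", { ...
      "--input", "$(infile)", ...              ## filled in per DAG node via 'vars'
      "--threshold", "2.5", ...
    } ...
  );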
Build a rescue Condor DAG to rerun jobs with missing or corrupt result files.
dag_name
Name of Condor DAG, used to name DAG submit file.
check_load
If true, check that result file can be loaded and its contents are not corrupted; otherwise, just check that result file exists [default: true].
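Conceptually, the check_load test amounts to attempting to load each result file and flagging the job for rerun on failure; a sketch in core Octave (the path is a placeholder, and this is not the actual implementation):

  result_file = "path/to/result.bin.gz";   ## placeholder path to one job's result file
  try
    tmp = load(result_file);
    ok = true;                 ## file exists and its contents could be loaded
  catch
    ok = false;                ## missing or corrupt: this job needs rerunning
  end_try_catch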
Merge results from a Condor DAG.
dag_name
Name of Condor DAG, used to name DAG submit file.
merged_suffix
Suffix to append to merged results file name: 'dag_name'_'merged_suffix'.bin.gz. Default is "merged".
args_filter
Job results which share the same job arguments are merged together. If specified, this is a function which may modify the job arguments before they are compared, i.e. to change which job results are merged together.
merge_function
Function(s) used to merge results from two Condor jobs with the same arguments, as determined by the DAG job name ’vars’ field. Syntax is:
merged_res = merge_function(merged_res, res, args)
where ’res’ are to be merged into ’merged_res’, and ’args’ are the arguments passed to the job. One function per element of job ’results’ must be given.
norm_function
If given, function(s) used to normalise merged results after all Condor jobs have been processed. Syntax is:
merged_res = norm_function(merged_res, n)
where ’n’ is the number of merged Condor jobs. One function per element of job ’results’ must be given.
save_period
How often merged results should be saved (default: 90 sec).
extra_data
Extra data to save to merged results file.
load_retries
How many times to try loading result files (default: 3).
retry_period
How long to wait between attempts to load result files (default: 30 sec).
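A minimal sketch of merging results, assuming this is the mergeCondorResults() function (the name is not shown in this section), that it takes ("option", value) pairs like the other functions here, and that each job returns a single scalar result:

  function merged_res = merge_fn(merged_res, res, args)
    ## accumulate a scalar result; start from zero on the first call
    if isempty(merged_res)
      merged_res = 0;
    endif
    merged_res += res;
  endfunction
  function merged_res = norm_fn(merged_res, n)
    merged_res /= n;   ## average over the number of merged Condor jobs
  endfunction
  mergeCondorResults( ...
    "dag_name", "test_makeCondorDAG", ...
    "merge_function", {@merge_fn}, ...
    "norm_function", {@norm_fn}, ...
    "save_period", 300 ...
  );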
Returns in deps a list of all the shared libraries on which the given executables/shared libraries (file, …) depend. If exclude (a cell array of strings) is given, exclude all libraries whose filepaths start with one of the filepath prefixes in exclude.
  libdir = octapps_config_info("libdir");
  version = octapps_config_info("version");
  dirs0 = glob(fullfile(libdir, "liboct*.so"));
  dirs1 = glob(fullfile(libdir, "octave", version, "liboct*.so"));
  deps = sharedlibdeps(dirs0{:}, dirs1{:});
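The returned list can also be filtered afterwards with core Octave, e.g. to drop libraries under system prefixes; a sketch, assuming deps is a cell array of absolute library paths and using illustrative prefixes:

  sysprefixes = {"/lib/", "/usr/lib/"};
  keep = true(size(deps));
  for i = 1:numel(deps)
    for j = 1:numel(sysprefixes)
      if strncmp(deps{i}, sysprefixes{j}, length(sysprefixes{j}))
        keep(i) = false;   ## library lives under a system prefix
      endif
    endfor
  endfor
  deps = deps(keep);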