lstmcpipe package#

Submodules#

lstmcpipe.logging module#

Helpers to set up logging. Copied and adapted from fact-project/aict-tools.

lstmcpipe.logging.setup_logging(logfile=None, verbose=False)#

Set up logging using the logging module from the Python standard library. A handler for the terminal and, optionally, a file handler are set up; the logger uses the INFO or DEBUG level depending on the value of verbose. Numba log records below WARNING are filtered out, because numba otherwise produces thousands of lines of output.

Parameters:
  • logfile (str or path-like) – File to save logs to

  • verbose (bool) – Whether to enable debug logging

Returns:

logging.Logger
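
A minimal usage sketch; the log file name is a placeholder:

    from lstmcpipe.logging import setup_logging

    # "lstmcpipe.log" is a placeholder file name.
    log = setup_logging(logfile="lstmcpipe.log", verbose=True)
    log.info("logging configured")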

lstmcpipe.lstmcpipe_start module#

lstmcpipe.lstmcpipe_start.build_argparser()#

Build the argument parser and return it.

lstmcpipe.lstmcpipe_start.main()#

Main lstmcpipe script. This launches the selected stages and starts the processing. The jobs are submitted but not awaited, meaning that the analysis keeps running after the script has exited. To inspect the submitted jobs, you can use e.g. squeue -u $USER.

Arguments that can be passed via the command line:

--config_mc_prod / -c

path to a YAML configuration file containing lstmcpipe settings. This defines the stages run, the files processed, …

--config_file_lst / -conf_lst

path to a YAML configuration file containing lstchain settings. This defines the processing parameters, like cleaning, models…

--config_file_ctapipe / -conf_cta

same for ctapipe

--config_file_rta / -conf_rta

same for HIPERTA

--log-file

Optional: path to a file to which lstmcpipe logging will be written.

--debug

Toggle to enable debug print messages.
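
As an illustration, the parser returned by build_argparser() can be used to check how these options are parsed. The configuration file names below are placeholders, not files shipped with lstmcpipe:

    from lstmcpipe.lstmcpipe_start import build_argparser

    # Placeholder file names for illustration only.
    parser = build_argparser()
    args = parser.parse_args([
        "--config_mc_prod", "lstmcpipe_config.yaml",
        "--config_file_lst", "lstchain_config.json",
        "--log-file", "lstmcpipe.log",
        "--debug",
    ])
    print(args)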

lstmcpipe.utils module#

class lstmcpipe.utils.SbatchLstMCStage(stage, wrap_command, slurm_output=None, slurm_error=None, job_name=None, slurm_account=None, slurm_dependencies=None, extra_slurm_options=None, source_environment='', backend='')#

Bases: object

Base class to sbatch (via Slurm) an lstMCpipe stage.

compose_wrap_command(wrap_command=None, source_env='', backend='')#
property dl1_dl2_default_options#
property dl1ab_default_options#
property dl2_irfs_default_options#
property dl2_sens_default_options#
property dl2_sens_plot_default_options#
property merge_dl1_default_options#
property r0_dl1_default_options#
property slurm_command#
property slurm_options#
stage_default_options(stage)#
submit()#
property train_plot_rf_feat_default_options#
property train_test_splitting_default_options#
property trainpipe_default_options#
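
A hypothetical construction sketch; the stage name, wrapped command, and Slurm settings below are illustrative placeholders, not defaults:

    from lstmcpipe.utils import SbatchLstMCStage

    # All values are placeholders for illustration only.
    stage_job = SbatchLstMCStage(
        stage="r0_dl1",
        wrap_command="lstchain_mc_r0_to_dl1 <arguments>",
        job_name="example_r0_dl1",
        slurm_account="example_account",
        source_environment="source activate lstchain;",
    )
    print(stage_job.slurm_command)  # inspect the composed sbatch command
    # stage_job.submit()            # would submit the job on a Slurm cluster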
lstmcpipe.utils.batch_mc_production_check(dict_jobids_all_stages, log_directory, prod_id, prod_config_file, batch_config, logs_files)#

Check that the dl1_to_dl2 stage, and therefore the whole workflow, has ended correctly. The machine information of each job will be dumped to a file named check_MC_prodID_{prod_id}_OK.txt.

Parameters:
  • dict_jobids_all_stages (dict) – dict containing the {stage: all_job_ids related} information

  • log_directory (Path)

  • prod_id (str)

  • prod_config_file (str)

  • batch_config (dict)

  • logs_files (dict) – Dictionary with logs files

Returns:

jobid

Return type:

str
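
A heavily hedged sketch of a call; the dictionary keys and all values below are illustrative assumptions, not a documented schema:

    from pathlib import Path
    from lstmcpipe.utils import batch_mc_production_check

    # All keys and values below are illustrative placeholders.
    check_jobid = batch_mc_production_check(
        dict_jobids_all_stages={"r0_to_dl1": "1001,1002", "dl1_to_dl2": "1003"},
        log_directory=Path("logs"),
        prod_id="20240101_example_prod",
        prod_config_file="lstmcpipe_config.yaml",
        batch_config={"source_environment": "", "slurm_account": ""},
        logs_files={"log_file": "logs/log.yaml", "debug_file": "logs/debug.yaml"},
    )
    print(check_jobid)  # job id of the submitted check job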

lstmcpipe.utils.dump_lstchain_std_config(filename='lstchain_config.json', allsky=True, overwrite=False)#
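
For example, the standard lstchain configuration can be written to the working directory; the arguments shown are the documented defaults, except overwrite:

    from lstmcpipe.utils import dump_lstchain_std_config

    dump_lstchain_std_config(filename="lstchain_config.json", allsky=True, overwrite=True)
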
lstmcpipe.utils.rerun_cmd(cmd, outfile, max_ntry=2, failed_jobs_dir=PosixPath('/home/runner/LSTMCPIPE_PROD_LOGS/failed_outputs'), **run_kwargs)#

Rerun the command up to max_ntry times. If all attempts fail, raise an exception.

Parameters:
  • cmd (list) – Command to run as a list of strings

  • outfile (Path) – Path to the cmd output file

  • max_ntry (int) – Maximum number of attempts to run the command

  • failed_jobs_dir (Path or str) – Subdirectory to move failed output files to

  • run_kwargs (kwargs) – Additional keyword arguments for subprocess.run

Raises:

RuntimeError – If the command fails after all retry attempts
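
A minimal sketch, assuming a hypothetical command my_processing_tool that writes the given output file:

    from pathlib import Path
    from lstmcpipe.utils import rerun_cmd

    # "my_processing_tool" and the output path are placeholders for illustration only.
    rerun_cmd(
        cmd=["my_processing_tool", "--output", "output/file.h5"],
        outfile=Path("output/file.h5"),
        max_ntry=3,
    )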

lstmcpipe.utils.run_command(*args)#

Runs the command passed through args, as a subprocess.Popen() call.

Based on: cta-observatory/cta-lstchain

Parameters:
  • args (str or iter) – Since shell is forced to True, a single string (the shell command) is expected.

Returns:

(subprocess.Popen.stdout).strip('\n') – the command's standard output with the trailing newline stripped.
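
A minimal sketch, assuming the hostname command is available in the shell:

    from lstmcpipe.utils import run_command

    # Runs "hostname" in a shell and returns its stdout with the trailing newline stripped.
    output = run_command("hostname")
    print(output)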

lstmcpipe.utils.save_log_to_file(dictionary, output_file, workflow_step=None)#

Dumps a dictionary (log) into a dict of dicts whose keys are the pipeline stages, and writes it to the output file.

Parameters:
  • dictionary (dict) – The dictionary to be dumped to a file

  • output_file (str or Path) – Output file to store the log

  • workflow_step (str) – Step of the workflow, to be recorded in the log

Return type:

None
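
A hypothetical usage sketch; the job ids and output file name are placeholders:

    from lstmcpipe.utils import save_log_to_file

    # Placeholder job ids and output file for illustration only.
    jobids = {"r0_to_dl1": "1001,1002", "train_pipe": "1003"}
    save_log_to_file(jobids, "logs/log_workflow.yaml", workflow_step="r0_to_dl1")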

lstmcpipe.version module#

Module contents#