Accessing and merging of results

Dear FLUKA experts,
I am new to Flair and FLUKA and haven’t yet found an answer to two particular problems:

  1. When parallelizing a run with the clone function and different seeds, I would like to merge all (!) of the results afterwards. This would allow me to run more cycles simultaneously (or so I hope) and save simulation time. But the Flair layout only lets me use the “Processing” button on the results of one clone, not all of them. Did I miss something? How do I do this?

Or is there a way to do this on the command line? (I haven’t worked with the command line for FLUKA, yet, but if this is the way to go… )

I would like to include a screenshot but… this brings me to my second question:

  2. How do I correctly reopen a Flair project after a run so that I can see the results? This is particularly important to me since I am running the simulation on a shared VM which ends my session after a certain idle time. So if my simulation finishes at night, I need to restart in the morning and can then only check the output files locally, but I would like to use Flair for this, too.
    Once the simulation has finished and the program has ended, it seems impossible to display or process the results. When I start Flair again and load the project, either a) only one clone (or the original) is displayed, or b) everything looks as it did before the run.
    How do I fix this?

Many thanks in advance!

Dear Inken,

First of all, sorry for the very late answer.

For your first question: to parallelize the runs, instead of cloning your input many times, launch several “Spawns” for the same “Run”. The “Processing” button will then merge all the spawns together.
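Regarding the command-line route mentioned in the question: as far as I know, FLUKA also ships small merging utilities in $FLUPRO/flutil (e.g. usbsuw for USRBIN binary output, with similar tools for the other estimators). They read the list of binary files to merge from standard input, one per line, terminated by a blank line, followed by the name of the merged output file. A minimal sketch, with hypothetical file names:

```shell
# Sketch: prepare the standard input that usbsuw expects -- one input
# file per line, a blank line to end the list, then the output name.
# The run01/run02 file names and merged.bnn are hypothetical examples.
printf '%s\n' run01_fort.21 run02_fort.21 '' merged.bnn > usbsuw.in

# To actually merge (requires a FLUKA installation):
#   "$FLUPRO"/flutil/usbsuw < usbsuw.in

cat usbsuw.in   # shows the prepared input list
```

This is only a sketch of the stdin convention these utilities use; the file unit numbers and names depend on your scoring cards and run setup.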

Have a look at this presentation: FLAIR Introduction

For the second question, I encourage you to save the Flair project after launching the simulations; once they are done, you can reopen the project and process the results with Flair.

If the problem is more related to the use of the VM itself, I would leave that question to a colleague with more experience in that area.

Kind regards,