Dear All, I’d like to run flair on my university’s supercomputer in order to run FLUKA in parallel.
- I downloaded the tar source archive, then uploaded it to the server and extracted it to get the flair-3.2 folder.
- cd flair-3.2
Is flair now completely installed? When I execute flair, an error occurs which says "No protocol specified: couldn’t connect to display “:0.0”". How can I solve this problem?
I suggest you prepare your input file with the help of flair on a personal computer. Then run the *.inp file on the supercomputer with the rfluka script, which is found in the bin directory of the FLUKA installation.
More information on this can be found under FLUKA on the command line | The official CERN FLUKA website
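As a minimal sketch of what such a command-line run could look like (the installation path, input name, and cycle count are assumptions; check the linked page for the exact rfluka options of your FLUKA version):

```shell
#!/bin/sh
# Hypothetical sketch: print the rfluka command one would run on the
# cluster for a single input file. $FLUPRO is assumed to point at the
# FLUKA installation; -N0 -M5 would run five statistically independent
# cycles of example.inp.
FLUPRO="${FLUPRO:-/path/to/fluka}"
echo "$FLUPRO/bin/rfluka -N0 -M5 example.inp"
```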
Best regards, Thomas
Thanks for your advice. I have tried it and the problem seems to be solved. What I have done is the following; is it correct?
- I prepared my input file on a personal computer. In flair I set the queue to “null”, so that executing the run only generates the input file without actually running it.
- I uploaded these input files to the supercomputer and, using the rfluka script, submitted several tasks (one per .inp file).
- After the tasks terminated, I downloaded the generated files to my personal computer and used flair to process them.
Is this a correct procedure for running FLUKA jobs in parallel?
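For the submission step, a hedged sketch of one way to script it, assuming a SLURM cluster (the job names, time limit, and rfluka path are assumptions; adapt to whatever batch system your supercomputer uses). The script writes one job file per input and only prints the submit command instead of calling sbatch:

```shell
#!/bin/sh
# Hypothetical sketch: generate one SLURM job script per FLUKA input
# file and print the sbatch command that would submit it.
for inp in example_01.inp example_02.inp; do
  name="${inp%.inp}"
  cat > "run_$name.sh" <<EOF
#!/bin/sh
#SBATCH --job-name=$name
#SBATCH --time=12:00:00
\$FLUPRO/bin/rfluka -N0 -M1 $inp
EOF
  echo "sbatch run_$name.sh"
done
```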
If I understand your mail correctly, you did everything right. Just one important point for generating meaningful results:
For parallel running you can launch several rfluka scripts at the same time, with differently named input files (e.g. example-01.inp, example-02.inp, …) and a different random-number seed in every input file on the RANDOMIZ card. This process is automated by flair when you “spawn” input files.
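The seed handling that “spawn” automates can be sketched as follows with a toy template (the RANDOMIZ line here is simplified free-text; real FLUKA cards are column-formatted unless free format is enabled):

```shell
#!/bin/sh
# Hypothetical sketch of flair's "spawn": copy a template input file
# several times, giving each copy a unique seed on its RANDOMIZ line.
printf 'TITLE\nseed demo\nRANDOMIZ 1.0 1\nSTOP\n' > example.inp
for i in 1 2 3; do
  sed "s/^RANDOMIZ .*/RANDOMIZ 1.0 $i/" example.inp \
    > "example_$(printf '%02d' "$i").inp"
done
grep '^RANDOMIZ' example_03.inp   # prints: RANDOMIZ 1.0 3
```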
Thank you very much! I generated the input files using flair’s “spawn” function, and I checked the RANDOMIZ card in every input file: “RANDOMIZ 1.0 1” for example_01.inp, “RANDOMIZ 1.0 2” for example_02.inp, and so on. I think it works correctly. Thanks again for your help.