EVENTBIN eating up disk space like crazy?

Dear FLUKA experts,

I ran into a problem with my simulation after I added 4 EVENTBIN cards. Now the simulation takes much longer to finish (the actual run time is completely different from the ETA shown by Flair). I suspect the program is dumping a huge amount of data to my HDD every second, and the HDD can only write so fast. I checked the size of the FLUKA_Work folder and it already takes up 400 GB of disk space, even though I opted to print only non-zero cells in the EVENTBIN options. Is there any way to address this issue?

Thanks in advance.

Martin

Dear @KillMartin,

EVENTBIN can indeed dump a lot of information, filling a lot of disk space, and the I/O can then add to the time required by the simulation.
Without knowing the details of your physics case, it is difficult to suggest a solution. The questions are: do you really need all that information? Can you obtain it in another way? Maybe an mgdraw user routine could come in handy (a rough sketch follows below).
If what you want can only be obtained with an EVENTBIN card, then I fear there aren't many workarounds.
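To give an idea, here is a minimal sketch of such an mgdraw.f, recording one number per primary event: the total energy deposited in a single region of interest. It is based on the standard mgdraw.f template distributed with FLUKA; the region number NRGAS and the logical unit 71 are placeholders to adapt to your input, the routine has to be compiled and linked into a custom executable, and it must be activated with a USERDUMP card (see the manual for the WHATs). A separation per particle species could be added by testing JTRACK from the TRACKR common block.

```fortran
*  Sketch: mgdraw.f recording the energy deposited per primary event
*  in one region.  NRGAS (region number of the volume of interest)
*  and the logical unit 71 are placeholders -- adapt to your input.
      SUBROUTINE MGDRAW ( ICODE, MREG )

      INCLUDE '(DBLPRC)'
      INCLUDE '(DIMPAR)'
      INCLUDE '(IOUNIT)'
      INCLUDE '(TRACKR)'

      PARAMETER ( NRGAS = 3 )
      DOUBLE PRECISION EDEP
      SAVE EDEP
      DATA EDEP / ZERZER /
*  Called at every step: sum the continuous energy deposition
*  points along the current track (MTRACK, DTRACK from TRACKR).
      IF ( MREG .EQ. NRGAS ) THEN
         DO 10 I = 1, MTRACK
            EDEP = EDEP + DTRACK (I)
 10      CONTINUE
      END IF
      RETURN
*
      ENTRY BXDRAW ( ICODE, MREG, NEWREG, XSCO, YSCO, ZSCO )
      RETURN
*
      ENTRY EEDRAW ( ICODE )
*  Called at the end of each event: write one record per primary
*  (unit 71 to be opened via an OPEN card or in usrini) and reset.
      IF ( EDEP .GT. ZERZER ) WRITE ( 71, * ) EDEP
      EDEP = ZERZER
      RETURN
*
      ENTRY ENDRAW ( ICODE, MREG, RULL, XSCO, YSCO, ZSCO )
*  Called at point-like energy deposition events: add them too.
      IF ( MREG .EQ. NRGAS ) EDEP = EDEP + RULL
      RETURN
*
      ENTRY SODRAW
      RETURN
*
      ENTRY USDRAW ( ICODE, MREG, XSCO, YSCO, ZSCO )
      RETURN
      END
```

This writes a single value per event instead of a full binned map, which keeps both the file size and the I/O load small.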

Dear @amario,

Thank you for your help and valuable information. Unfortunately, I do not have access to my .inp file right now, but I can certainly upload it tomorrow morning. In the meantime, please allow me to describe my simulation in words.

I am modeling a low-energy neutron detector. The detector consists of a cylindrical chamber filled with argon gas, with boron nitride paint covering the outside of the chamber. I would like to generate spectra of particle counts vs. energy deposited in the gas chamber by protons, alpha particles and lithium-7 nuclei separately.

Other experts on this forum suggested using EVENTBIN as the proper way to achieve this. I had thought about using USRBDX to generate the two spectra (from the paint into the gas, and from the gas into the paint, respectively) and taking the difference, but I do not think that is the correct way to do it.

I hope this helps you better understand my problem. Thanks in advance for any help.

Martin

The size of any event-by-event scoring, such as the one you need, obviously scales with the number of primary events. Just reduce the latter drastically in the START card, in order to first get something of limited size that you can process. Then you can increase the statistics accordingly, in a reasonable fashion, rather than starting with billions of events.
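Just to illustrate the scaling, with purely hypothetical numbers: if one event record took ~40 kB, 10^7 primaries would already produce ~400 GB. A short test could look like this in the input, where WHAT(1) of START is the number of primary histories:

```
* Short test run: 1E4 primaries (illustrative number)
START         10000.
```

Once you know the size written per event, you can choose a total number of primaries that fits your disk.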

Dear @ceruttif,

Thank you! I thought about that, but was worried about poor statistics. I'll now proceed step by step.

Martin