Improvement of statistics using photon spectrum

Dear FLUKA experts,

I’m trying to calculate equivalent dose using a photon spectrum with a maximum energy of 500 keV via a source.f routine.
In the geometry there are three angled copper scatterers. I want to see the dose inside TUBE1 and in the region near the third scatterer.
I simplified the geometry with black holes around the area of interest, but the problem is that the statistics seem like they will never be good enough, even if I spend a really long time running a lot of primaries. Is there some card that could help improve the statistics in this case?
The files are attached. (122.7 KB)

Best regards,


Hello Isabela,

You should try region importance biasing (BIASING card). The idea is to divide the path of the photons into a number of regions and to increase the “weight” of each region the farther it is from the source, and the more scatterers are in the way. Based on the calculation that you attached, you can set the weights inversely proportional to the photon fluence that you have scored without biasing.

Best regards, Thomas


Thanks for the answer, Thomas!

I added a BIASING card to increase the importance around the scatterers with high values (1000), but it seems that it doesn’t work. I looked in the FLUKA manual and it says that the BIASING card is “only for hadron, heavy ion, or muon/photon nuclear interactions”. The beam that I’m using is photons with a maximum energy of 500 keV, so I’m wondering if that card is applicable to my case.

Best regards,


Dear @isabela.moraes,
the card is applicable in your case, as @totto pointed out.
It can be used to different purposes, the limitation you took from the manual applies to multiplicity biasing (i.e. artificial reduction of the number of secondaries generated in a nuclear reaction), which represents another use with respect to the region importance biasing he suggested. The latter works also for electrons, positrons and photons (WHAT(1)=2.0) of whatever energy above threshold. The point is that you should segment the path from the first scatter (where you are supposed to get more easily a good statistics) up to the third scatter into regions of increasing importance. The default importance is 1, what matters is the importance ratio between two contiguous regions, which is anyway limited to 5, so assigning an isolated importance of 1000 makes no sense: one could at most adopt a sequence such as 1-5-25-125-625-etc. This way, a particle crossing any of these region boundaries (as moving from the first to the third scatter) will be replaced on average by five copies of reduced statistical weight.
Your problem is objectively challenging, but if you manage to properly set up the aforementioned region importance scheme, you should still see an effect.
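As a sketch, the importance chain described above could look like the following in the input file. The region names SEG1..SEG5 are placeholders for the actual geometry regions along the path from the first to the third scatterer, and WHAT(2)=1.0 is assumed so that multiplicity biasing is left untouched; only WHAT(1)=2.0, the importance in WHAT(3), and the region range are the relevant settings here:

```
* Region importance biasing for e-/e+/photons (WHAT(1)=2.0).
* SEG1..SEG5 are hypothetical region names; importances grow by the
* maximum allowed ratio of 5 between contiguous regions.
BIASING          2.0       1.0       1.0      SEG1      SEG1
BIASING          2.0       1.0       5.0      SEG2      SEG2
BIASING          2.0       1.0      25.0      SEG3      SEG3
BIASING          2.0       1.0     125.0      SEG4      SEG4
BIASING          2.0       1.0     625.0      SEG5      SEG5
```

Remember that scored quantities remain unbiased because the statistical weight of each split particle is reduced accordingly; only the variance improves.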


Dear @totto and @ceruttif,
I made these changes and I got results with better statistics! Thanks a lot!
Best regards,