Scoring energy deposition of radioactive source

Hi,

I would like to score the energy deposition of a radioactive contaminant in a pixel detector on a single-decay basis. For example, I would like to see the energy deposit of U-238 alpha-decaying to Th-234. From what I understood, the RADDECAY card in semi-analogue mode allows simulating the full decay chain of an isotope specified with the HI-PROPE card. However, if I try to score the energy deposition, I get the sum of the energy deposition of the entire decay chain and not that of the single reaction, e.g. U-238 → Th-234 + alpha. Is there a way to score a single process, or is there a smart way to normalise the energy scored by USRBIN on a grid? My .inp file is attached.

Thanks a lot
Stefano

test_raddecay.inp (1.7 KB)

Hi Stefano,

Thank you for the question.
Apologies, but I will only have time to look into your issue starting from next week.

Cheers,
Jerzy

Hi Stefano,

After looking into your problem, I think the most straightforward solution would be to simulate directly the alpha particles coming from the decay of U-238, i.e. alphas with an energy of 4.267 MeV emitted isotropically. Unfortunately, it seems that FLUKA does not provide a general solution to this problem at the moment. Luckily, U-238 has only one decay channel; otherwise, you would have to carefully work out the proper normalisation to express your result per decay.
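As a rough, untested sketch, such a monoenergetic isotropic alpha source could look like the two cards below (numbers are illustrative; please check in the manual the BEAM card conventions for heavy ions, in particular whether the kinetic energy refers to the whole ion, and adapt the position to your geometry):

```
* Monoenergetic isotropic alpha source replacing the U-238 decay.
* WHAT(1) < 0: kinetic energy in GeV  ->  4.267 MeV = 4.267E-3 GeV
* WHAT(3) larger than 2000*pi mrad makes the emission isotropic.
BEAM       -4.267E-3       0.0    1.0E+4       0.0       0.0       1.04-HELIUM
BEAMPOS          0.0       0.0       0.0
```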

From your geometry, I understand that you are using a very small silicon plate, which would normally be embedded in a bigger structure. In this case, simulating the decay at the centre is perfectly fine, but if you would like to simulate uranium contamination in a bigger structure, you could also use a user source routine to distribute the decay vertices uniformly in your material.
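The sampling logic such a source routine would implement (in Fortran, inside FLUKA's source template) can be sketched in Python; the function names and the box-shaped volume here are purely illustrative:

```python
import math
import random

def sample_decay_vertex(xmin, xmax, ymin, ymax, zmin, zmax, rng=random):
    """Decay vertex distributed uniformly inside a rectangular volume."""
    return (rng.uniform(xmin, xmax),
            rng.uniform(ymin, ymax),
            rng.uniform(zmin, zmax))

def sample_isotropic_direction(rng=random):
    """Direction cosines drawn uniformly on the unit sphere."""
    cos_theta = rng.uniform(-1.0, 1.0)
    phi = rng.uniform(0.0, 2.0 * math.pi)
    sin_theta = math.sqrt(1.0 - cos_theta * cos_theta)
    return (sin_theta * math.cos(phi),
            sin_theta * math.sin(phi),
            cos_theta)
```

Each primary would then start an alpha at the sampled vertex with the sampled direction, so that the contamination is spread uniformly through the material rather than concentrated at a point.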
Let me know if you have any other questions.

Cheers,
Jerzy