Ba-133 close to detector, experience question

Hello,

I am new to FLUKA. I am working on designing a NaI(Tl) detector.

I took FWHM values from experiments, and with the correct a, b, c resolution parameters I am getting spectra close to the measured ones for Na-22, Co-60, and Cs-137 point sources.

When I tried the same configuration for the Ba-133 isotope, I did not get what I expected.

Before I play with the detector and other simulation configurations, I want to be sure that I am not missing something about the program.

In a topic about a previous FLUKA version (not the version published here), I found this: RE: [fluka-discuss]: Possibile error in Barium133 isotope decay, from Eleftherios Skordis, 2017-03-24 (fluka-discuss archive),

which says “What FLUKA is doing (by design) in semi-Analog mode, is to reproduce these results on average after a statistically significant amount of decays. However since it is not fully-Analog the results cannot be used (out of the box) for event by event spectroscopy analysis since the coincidence summing would be unrealistic, and especially dominant if your detector is very close to the source.”

Is this still valid for the new version published here? Do you have any experience with this subject?

Thank you for your time.

It is. See Scoring energy spectrum Cs-137 in LaBr3 - #6 by cesc

Thank you,

I will check the source.f routine.

I tried source_newgen.f; I chose two random energies for photons using source_phase_space_example.dat.

The energies are 1100 and 1600 keV.
I got the graph below from the DETECT card; there are no 1100 or 1600 keV photons. The positions of the peaks change when I change the "weight of the particle" in the .dat file.

I can detect the 1100 and 1600 keV photons with the USRBDX card (fig 1), but the DETECT card gives me a different result (fig 2). I haven't figured out why I am getting energies different from the input ones.


Do I need to assign an energy value in the BEAM card? If so, what do I need to enter in the BEAM card for two energies, for example 1100 and 1600 keV?

Could you please guide me on how I can get the peak energies from the DETECT card? What am I missing?

The files I used are attached.

Thank you for your time.

source_phase_space_example.dat (204 Bytes)
hpge2018a.flair (2.9 KB)
hpge2018a.inp (2.1 KB)
source_newgen2.f (18.8 KB)

Dear @hakan.cetinkay,

Thank you for providing your input files; this allowed me to spot the problem. I believe the issue lies in the weights assigned in your "source_phase_space_example.dat" file. There you assign a weight of 0.5 to each energy, which is not compatible with the DETECT card. The plots you provide also show that you have not collected a sufficient amount of statistics. Can you reset the weights to 1.0 for each of the energies in the "source_phase_space_example.dat" file and run more primaries to get more statistics?
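For concreteness, a two-line phase-space file with both weights reset to 1.0 might look like the fragment below. The column layout here is only an assumption for illustration (FLUKA particle id 7 = PHOTON, position, direction cosines, energy, weight); keep whatever columns your source_phase_space_example.dat already has and change only the weight column:

```text
* id   x    y    z    u    v    w   E(keV)  weight   (hypothetical layout)
  7   0.0  0.0  0.0  0.0  0.0  1.0  1100.0   1.0
  7   0.0  0.0  0.0  0.0  0.0  1.0  1600.0   1.0
```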

Hope this helps,

Andreas


I performed two runs. In the first run each weight is 1, with 1 M primaries; I called it "runa":



The USRBDX and DETECT cards give the same energies.

I named the second run "runb"; the weight for 1100 keV is 1 and the weight for 1600 keV is 2. There are 1 M primaries, as in the previous run.


The graph shows that the weight changes the energy in the DETECT card by a factor of 2 (the weight value).
It works fine with the USRBDX card.

So I am not sure it is incompatible with the DETECT card, as you wrote in your message. There is no problem with statistics: 100 k or 1 M particles gave approximately the same results.

Dear @hakan.cetinkay, are you sure you are merging the results from the different cycles/runs in the correct way? When I run your input file for 100 k primaries I obtain the plot below for the DETECT output, which shows the peaks much more clearly. In your plots you seem to have collected only a couple of events, and you should easily see the difference between running 100 k primaries and 1 M primaries.

You are right indeed that setting the weights in a different way is not really compatible with DETECT, but it has no impact on the USRBDX results. Comparing the results of different scorings to check that everything is working as it should is a good approach.


Dear Andreas Waets,

Thank you very much.

Flair->Run->Data->Process

But the run takes only a minute even with 1 M primaries, which is weird. In other examples it takes at least 5 minutes for 100 k primaries.

I didn’t have this problem with usreou.f; maybe I am doing something wrong in the "Run" tab, I don’t know.

If I don’t select "source_newgen2.exe", the estimate is some hours to some days for 100 k and 1 M primaries.
I didn’t try this. Do I need to? What do you think?

Dear Hakan,

in addition to Andreas’ excellent answer, I would like to comment on why you don’t have any statistics in your simulation.

The problem is in the source routine, in the call to the read_phase_space_file() subroutine. You set the sequential option to .TRUE.. This means the routine reads the data file line by line and stops when it reaches its end. So you simulate only 2 × (number of cycles) primaries.

To fix the problem, set this option to .FALSE.. This way the routine chooses one line at random until the number of primaries specified in the input file has been simulated.
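The difference between the two modes can be sketched in a language-neutral way. This is purely illustrative Python (the real routine is Fortran inside source_newgen.f); the function names here are made up for the sketch:

```python
import random

def read_sequential(lines, n_primaries):
    """Sequential mode (sequential = .TRUE.): read line by line and stop
    at end of file. With a 2-line file, only 2 primaries are produced."""
    return lines[:n_primaries]

def read_random(lines, n_primaries):
    """Random mode (sequential = .FALSE.): sample a random line for every
    primary, so all requested primaries are simulated."""
    return [random.choice(lines) for _ in range(n_primaries)]

phase_space = ["1100 keV photon", "1600 keV photon"]  # a 2-line file
print(len(read_sequential(phase_space, 1_000_000)))   # 2: almost no statistics
print(len(read_random(phase_space, 1_000_000)))       # 1000000: full statistics
```

This also explains why the runs finished in a minute: only a handful of primaries were actually simulated.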

With this modification (and setting the weights to 1.0 as Andreas pointed out) I get the following result with your DETECT scoring:

Cheers,
David

P.S.: If you don’t use your custom-compiled executable, then FLUKA will use the beam definition found in the input file, which in your case is 200 GeV/c photons (the default value, since you left this field empty).


Dear David and Andreas,

Thank you both!

I have got enough statistics now.

Reading inputs from an external file is a practical way to weight particles in user routines, but I think I need more practice and comparison to understand whether I can use it when my particle weights are not equal.

Best regards,
Hakan