Why is there such a big difference between glare index values calculated by evalglare under Windows and under Linux?

I used evalglare to calculate DGP and DGI values based on an HDR image generated by Radiance. However, I accidentally found that the values calculated by the Windows version of evalglare were quite different from those calculated by the Linux version.
(screenshot: evalglare results showing different values on Windows and Linux)
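
For reference, a minimal sketch of the kind of call I ran on both platforms (`scene.hdr` stands in for my actual image):

```
# Detailed evalglare output, including DGP and DGI, for a fisheye HDR
# rendered with Radiance. The view is read from the image header; if it
# is missing, it has to be supplied with the usual -vt* view options.
evalglare -d scene.hdr
```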

The Radiance versions are shown below:
(screenshots: Radiance version output on Windows and on Linux)

Hi,
you must be using different versions on the two platforms. There was a change some years ago to the default glare source detection mode, from five times the average luminance to a fixed threshold of 2000 cd/m², based on the findings of Clotilde Pierson. Things like this (default values) are always documented in the man page that ships with each version.
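
If you want to rule that default change out, you can pin the detection mode explicitly via the `-b` option (a sketch; `scene.hdr` is a placeholder file name - double-check the exact semantics against the man page of your version):

```
# Values of -b above 100 are taken as an absolute luminance threshold
# in cd/m2 - this reproduces the current default explicitly:
evalglare -b 2000 scene.hdr

# Values below 100 act as a multiplier of the average luminance (of the
# task area if one is defined, otherwise of the image) - the old default:
evalglare -b 5 scene.hdr
```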
So please check the versions you are using with `evalglare -v` and post the versions for both platforms.
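
For example:

```
# Print the version string of the evalglare binary on each platform:
evalglare -v
```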
The change mentioned above was made in version 2.05 in August 2018. If you are using an older version on either platform, you should update.
In case you have exactly the same version on both platforms but still get different results, please share that image with me so that I can investigate further. But I have never experienced different results - at least not with the versions that I compile myself for Windows (which is a separate build process).

Jan


I just realized you posted the Radiance version you are using. It seems that on Windows you are using a version that is 12 years old! The evalglare version from 2010 was v0.9, which is totally outdated and has no safety features in it at all - in that case you definitely have to update. The minimum version you should use is v2.07 - there were plenty of bug fixes before that.
