I am attempting to extract raw luminance values from .hdr images using Radiance in order to run various calculations, but I am having trouble extracting the raw values with the pvalue command.

I have tried the following two commands:

1 - pvalue -b -h -H

2 - pvalue -b -h -H -o

Both commands give me the same raw values at the same coordinates. However, when I extract the minimum and maximum values from the output, they do not match the values displayed in Photosphere. I am using the histogram function in Photosphere to capture the max, min, median, and mean values to compare against the values I calculate from the pvalue output.

Depending on the image, I have seen effective conversion factors ranging from 174 to 177 instead of the documented factor of 179. The conversion is also not consistent within a single .hdr: I may need different factors to match the max and min values (for example, a factor of 175 for the minimum but 176 for the maximum). The means and medians I calculate from the output also differ from those reported in Photosphere, and I can't identify the source of the discrepancy at this point.

I'm running dry on ideas about the cause of this and potential solutions.

Hi Patrick,

Welcome to the forum!

The accuracy of HDR files is +/-1% at best, so some differences are to be expected. Also, the histogram function in Photosphere makes further approximations, so you can’t rely on the min and max values being closer than 2% or so.

Regarding the average values, Photosphere performs a log average in its histogram function, but reports the straight linear average if you select a region (or the whole image) in the upper-right section of the window. That is probably the difference you are seeing there.

If you want better accuracy from your values, you can always use rtrace and output floating-point images (matrices) with the -f?f option. The vwrays command can provide the rays corresponding to a particular view. This is a more advanced option, but you seem quite capable based on your first question. You can check out the man pages for vwrays and rtrace or search for other posts covering these commands.

Best,

-Greg

Thank you for the insight Greg.

Just to clear things up for me as I am still very much new to HDR and Radiance, is the discrepancy in values from the Radiance pvalue output and Photosphere expected? Should the 179 conversion factor be the value to use to convert back from the pvalue output to luminance?

Do you have any more information on how Photosphere computes the log average in its histogram function? I initially assumed a calculation of (y - x)/(ln y - ln x), but that isn't giving me a value comparable to what Photosphere outputs.

Thank you again for your help thus far.

The official conversion factor is 179. Computing the log average means computing:

exp(average(log(Li)))
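In other words, the geometric mean of the samples, not the logarithmic mean of two endpoints. A minimal Python sketch (the sample values below are made up purely for illustration):

```python
import math

# Hypothetical luminance samples; real values would come from pvalue output.
samples = [12.0, 150.0, 980.0, 47.5]

# Linear (arithmetic) average
linear_avg = sum(samples) / len(samples)

# Log average: exp(average(log(Li))), i.e. the geometric mean
log_avg = math.exp(sum(math.log(v) for v in samples) / len(samples))

print(linear_avg, log_avg)
```

Note that the log average is always less than or equal to the linear average, which is one reason the two reports in Photosphere will not agree.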

I did a quick check between pvalue results and Photosphere’s histogram report, and it seems to be off by around 2%:

pvalue -h -H -b -o -df test.hdr > /tmp/lums.flt

total -if -m /tmp/lums.flt

total -if -p -m /tmp/lums.flt

The first run of total gives the linear average radiance, and the second computes the log average. Other options can give the minimum (-l) and maximum (-u) values.
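If you would rather compute the statistics yourself, the raw float file written by pvalue -df can be read directly. Here is a rough sketch in Python, assuming pvalue wrote 32-bit little-endian floats; the file and its values are synthesized here as a stand-in for real pvalue output, and the 179 lm/W luminous efficacy converts radiance to luminance:

```python
import math
import struct

# Stand-in for pvalue's raw float output; a real run would produce /tmp/lums.flt.
path = "lums.flt"
synthetic = [0.05, 1.2, 3.7, 42.0, 310.5]  # made-up radiance values
with open(path, "wb") as f:
    f.write(struct.pack("<%df" % len(synthetic), *synthetic))

# Read the file back as 32-bit little-endian floats (4 bytes each).
with open(path, "rb") as f:
    data = f.read()
vals = struct.unpack("<%df" % (len(data) // 4), data)

K = 179.0  # standard Radiance luminous efficacy (lm/W)
lums = [K * v for v in vals]

stats = {
    "min": min(lums),
    "max": max(lums),
    "linear_avg": sum(lums) / len(lums),
    "log_avg": math.exp(sum(math.log(v) for v in lums) / len(lums)),
}
print(stats)
```

The min, max, and linear average here should match total -if -l, total -if -u, and total -if -m respectively, and log_avg should match total -if -p -m.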

Cheers,

-Greg