Wrong luminance readout after calibration in Matlab

Hi all,

(I appreciate everyone, especially the founders and maintainers of this forum/mailing list. I feel saved whenever I come here, because I was stuck for a long time while working on my dissertation.)

When I use the Matlab command ‘hdrread(*.hdr)’ to import a calibrated image (calibrated with Photosphere) into Matlab, the luminance readout is wrong. I guess this is because the header is discarded during reading. My question is: is there any way to control the header in Matlab? (The header problem may also exist in the ‘hdrwrite’ command.)

By the way, the ‘evalglare’ command is compatible with lenses with an angle of view of less than 180°, right?

Also, it is possible to build an integrated Matlab script that controls the whole process, from calibration to glare evaluation, right?

Apologies for my poor English!

I am no Matlab expert, and didn’t realize that the hdrread() function didn’t handle the EXPOSURE= line(s) properly in the header. This may be overcome by passing the HDR file through “ra_xyze -r -o” first.
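If you want to stay inside a script rather than pre-process with ra_xyze, the EXPOSURE factor can also be read directly from the file: the .hdr header is plain ASCII terminated by a blank line, and multiple EXPOSURE= lines multiply together. A minimal Python sketch of that parsing (the same logic ports directly to Matlab):

```python
def hdr_exposure(path):
    """Return the combined EXPOSURE factor from a Radiance .hdr header.

    Multiple EXPOSURE= lines multiply together; if none are present,
    the factor is 1.0. The header ends at the first empty line.
    """
    exposure = 1.0
    with open(path, "rb") as f:
        for raw in f:
            line = raw.strip()
            if not line:                      # blank line terminates the header
                break
            if line.startswith(b"EXPOSURE="):
                exposure *= float(line.split(b"=", 1)[1])
    return exposure
```

You would then divide the pixel values returned by hdrread() by this factor (see Greg's luminance formula later in the thread).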

Regarding the fisheye view for evalglare, if you have less than 180 degrees in your view, then there is no way to accurately compute the vertical illuminance, so the estimates may be slightly in error even if the image is accepted. You will probably get a warning message about that from evalglare.
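For context, the vertical illuminance is the cosine-weighted integral of luminance over the full hemisphere, which is why a sub-180° image cannot supply it exactly. A sketch of that computation in Python with NumPy, assuming a square luminance map from an equidistant (angular) fisheye aimed horizontally (an assumption — the correct per-pixel solid angle depends on your projection):

```python
import numpy as np

def vertical_illuminance(lum, view_angle=np.pi):
    """Approximate vertical illuminance (lux) at the lens from a square
    luminance map (cd/m^2) of an equidistant fisheye aimed horizontally.

    Ev = sum over pixels of L * cos(theta) * d_omega, where theta is the
    angle off the optical axis and d_omega is the pixel's solid angle.
    """
    n = lum.shape[0]
    R = n / 2.0
    y, x = np.mgrid[0:n, 0:n] + 0.5          # pixel centers
    r = np.hypot(x - R, y - R)               # radius from image center, pixels
    k = (view_angle / 2.0) / R               # radians of theta per pixel of radius
    theta = k * r
    # equidistant projection: d_omega = sin(theta) * k / r per unit pixel area
    domega = k * np.where(r > 0, np.sin(theta) / r, k)
    inside = theta <= view_angle / 2.0
    return float((lum * np.cos(theta) * domega)[inside].sum())
```

As a sanity check, a uniform luminance L over the full hemisphere should give Ev = pi * L.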

If you can “shell out” commands from Matlab, then you can do anything you like with Radiance. I am not sure why you would do that, but I suppose you have your reasons.

Thank you, Greg Ward, that helps me a lot!
In fact, besides glare, I want to calculate other visual comfort parameters such as local contrast, so I plan to integrate the HDR processing in Matlab.

Regarding accuracy, I know the vertical illuminance, so an external illuminance meter will help me complete the process.

It is embarrassing that I’m still studying Radiance, so I’m not very clear about the whole framework for post-processing HDR files. Could you please take a little time to check my understanding? Thank you!
From my perspective, the steps should be:

1. Use hdrgen or pfstools (my options are limited) to merge the pictures and calibrate the absolute luminance.

2. Use pcomb to convert the projection of the fisheye lens and correct the vignetting.

3. Use evalglare to evaluate the glare.
The luminance calibration info and the projection type should be written into the header.

Yes, I believe you have the proper procedure, though I’m not the expert on evalglare. If someone else has an opinion, hopefully they will add it…
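Since Greg mentioned shelling out, the three steps above can be chained from a single script. A hedged Python sketch that only assembles the commands — the hdrgen/pcomb/evalglare flags shown are illustrative from memory, not verified against the man pages; the pcomb expression is a placeholder for your real projection/vignetting correction, and pcomb writes its result to standard output:

```python
import subprocess

def run(cmd):
    """Run one external command, raising if it fails."""
    print("$ " + " ".join(cmd))
    subprocess.run(cmd, check=True)

def build_pipeline(jpegs, merged="merged.hdr", corrected="corrected.hdr"):
    """Return the three pipeline steps as argument lists (not yet executed).
    Flags are illustrative -- check each tool's man page for your setup."""
    return [
        # 1. merge the bracketed exposures and calibrate
        ["hdrgen", "-o", merged] + list(jpegs),
        # 2. projection / vignetting correction (placeholder expression;
        #    redirect stdout to `corrected`)
        ["pcomb", "-e", "lo=li(1)", merged],
        # 3. glare evaluation on the corrected 180-degree angular fisheye
        ["evalglare", "-vta", "-vh", "180", "-vv", "180", corrected],
    ]
```

Calling `run()` on each list in order would execute the chain, provided the Radiance tools and hdrgen are on your PATH.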

Thanks a lot! :rofl::rofl:

Hi Jing,

Regarding evalglare, you can also use an image with a smaller angle than 180°, but, as pointed out by Greg, the vertical illuminance cannot be calculated correctly. (Unfortunately there is NO such warning in evalglare so far — a good point to include in the next version.)

What lens are you using that you don’t have the 180°? In my view, it is really crucial to have the full 180° so you can check the accuracy of the image by comparing it against an illuminance sensor placed beside the lens. We did a lot of test measurements here with the hdrgen method and high-precision equipment, and we saw that in some cases the luminances are not correct, which also leads to deviations in the illuminance. Therefore a comparison with the measured illuminance is always recommended, to ensure at least that the overall integral is correct.

Another thing to consider: if you calculate “averages” over certain zones of the image in Matlab, don’t forget to weight by the solid angles of the pixels — simply averaging over the pixels is mathematically incorrect. All implemented functions in evalglare (including the zone evaluation) account for the solid angles of the pixels.
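The solid-angle weighting Jan describes can be sketched as follows, again assuming an equidistant fisheye projection (an assumption — adjust the per-pixel solid angle to your actual projection), in Python with NumPy:

```python
import numpy as np

def weighted_zone_average(lum, mask, view_angle=np.pi):
    """Solid-angle-weighted mean luminance over a zone (boolean mask)
    of an equidistant fisheye luminance image, instead of a plain
    per-pixel mean."""
    n = lum.shape[0]
    R = n / 2.0
    y, x = np.mgrid[0:n, 0:n] + 0.5
    r = np.hypot(x - R, y - R)
    k = (view_angle / 2.0) / R
    theta = k * r
    w = k * np.where(r > 0, np.sin(theta) / r, k)   # solid angle of each pixel
    w = np.where(mask & (theta <= view_angle / 2.0), w, 0.0)
    return float((lum * w).sum() / w.sum())
```

For a zone near the rim of the fisheye this can differ noticeably from the plain pixel mean, because pixels there cover smaller solid angles than pixels near the center.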

good luck

Thank you, Wienold!
The suggestion about the units is very useful!
And… please excuse me for asking another question: which field in the header of a *.hdr file specifies the luminance factor (real luminance / Y-value readout)?

Well, I just worry about how evalglare can obtain the actual luminance rather than an inaccurate Y-value. After merging the pictures with Photosphere, I found that the luminance factor does not seem to be attached to the files.

An HDR file has one or more EXPOSURE= lines in the header. The luminance of a pixel is the appropriate combination of RGB according to the color space, times 179 lumens/watt, then divided by the EXPOSURE factor(s). (The RGB coefficients for CCIR 709, typical for cameras, are 0.212*R + 0.715*G + 0.073*B.)
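Greg's recipe in code, using the CCIR 709 coefficients quoted above (a sketch of the formula as stated, not Radiance's own implementation):

```python
def luminance(r, g, b, exposure=1.0):
    """Absolute luminance in cd/m^2 from HDR pixel values, per the
    formula above: 179 lm/W times the 709 luminance combination of
    RGB, divided by the accumulated EXPOSURE factor."""
    y = 0.212 * r + 0.715 * g + 0.073 * b
    return 179.0 * y / exposure
```

For example, a pixel of (1.0, 1.0, 1.0) in a file with no EXPOSURE line corresponds to 179 cd/m².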

Thank you Greg!

I’ve learned a lot, though I have one small question: what is the range of the pixel values? 0–1? Or 0–255?
So luminance = 179 * Y / EXPOSURE, where Y ranges from 0 to 1.0.
In addition, EXPOSURE is calculated from the average aperture, shutter, and ISO of the image sequence.
Am I right?

Greatly appreciated!

Pixel range is 0-infinity, where 1.0 is equal to 179/EXPOSURE candelas/m^2 (luminance). The exposure value is based on average camera exposure as you say. An attempt is made to bring it into this absolute calibrated scale, but of course different cameras may not match exactly.
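The unbounded range is possible because pixels are stored in the RGBE format: three 8-bit mantissas sharing one 8-bit exponent. A sketch of the decoding, following my reading of the Radiance color conversion convention (value = (mantissa + 0.5) * 2^(exponent - 136), with exponent 0 meaning black) — treat the exact constants as an assumption to verify against the format documentation:

```python
import math

def decode_rgbe(r, g, b, e):
    """Decode one 4-byte RGBE pixel to floating-point RGB.
    The three mantissa bytes share the exponent byte e:
    value = (byte + 0.5) * 2**(e - 136); e == 0 encodes black."""
    if e == 0:
        return (0.0, 0.0, 0.0)
    f = math.ldexp(1.0, e - 136)
    return ((r + 0.5) * f, (g + 0.5) * f, (b + 0.5) * f)
```

So a pixel such as (128, 128, 128, 129) decodes to roughly (1.0, 1.0, 1.0), i.e. 179/EXPOSURE cd/m² as Greg describes.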

Many thanks!
You helped me surmount almost the last hurdle of the file format!
Your help will be acknowledged in my MSc dissertation!