Hi everyone -
I have just started using HDR images for glare analysis as part of my research. The issue I have run into is this: when I generate an image manually with Photosphere, I can calibrate it against an illuminance meter reading, and applying false colours then gives one range of luminance values. If I capture the same scene with canoncap, it also outputs an HDR image, but one whose upper luminance values are very different (roughly three times smaller). Have I missed a step in the method, and/or is there a way to calibrate the canoncap process so that its values match the calibrated Photosphere output?
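In case it is useful context: one common workaround I have seen is to derive a single multiplicative calibration factor from a spot measurement of a target in the scene and apply it to the uncalibrated HDR with Radiance's pcomb -s. A minimal sketch, with hypothetical measurement values:

```python
# Hypothetical values: a meter reading of a grey-card target and the
# luminance the uncalibrated canoncap image reports at the same spot.
measured_luminance = 1250.0  # cd/m^2, from the meter
image_luminance = 410.0      # cd/m^2, read off the uncalibrated HDR

# A single multiplicative factor scales the whole image so the two agree.
factor = measured_luminance / image_luminance

# This factor could then be applied in Radiance, e.g.:
print(f"pcomb -s {factor:.4f} capture.hdr > capture_cal.hdr")
```

This assumes the discrepancy is a uniform scaling (e.g. a missing exposure or response-curve normalisation) rather than a non-linear response error, so it is only a stopgap until the capture process itself is calibrated.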
I hope this is possible, so that I can set up a time-lapse capture as others have successfully done and process the images with evalglare.
Thanks for any assistance.