# HDRI capture of LED

Hello everybody!

I'll provide a summary of my research, with my questions embedded in the
summary. I would appreciate any help!

My research investigates whether the HDRI technique can accurately capture
the luminances of small, bright light sources (e.g. LED garage fixtures)
with narrow light distributions.

I was able to determine luminance values for a single LED, which can be
compared to the ones from HDR images. But I have a couple of
questions/concerns about the HDRI technique and Photosphere.

First, I used a "regular" scene to retrieve the response curve (RC) of the
camera (large smooth gradients with very dark and bright areas, plus
reflectance standards for the absolute calibration).

Camera: EOS T1i Rebel with 28-105mm lens, at 28mm

Calibrated at the grey reflectance sample: 186.45 cd/m2

CF=0.957

I’ve got the following RC for RGB:

red(r)   = -6.434199e-03 + 4.518039e-01*r + 1.291426e+00*r^2 + 1.802896e+00*r^3
green(g) = -5.804720e-03 + 4.175837e-01*g + 1.176582e+00*g^2 + 1.721643e+00*g^3
blue(b)  = -4.376831e-03 + 3.784418e-01*b + 1.075695e+00*b^2 + 1.658471e+00*b^3

If I look at the histogram of the scene, maximum luminance within the scene
is 60,291 cd/m2.
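For reference, the fitted response curves above can be evaluated directly. A minimal sketch in Python (coefficients copied from the fit; the assumption is that the input is a normalised pixel value in [0, 1] and the output is in relative linear units before the calibration factor is applied):

```python
# Evaluate the fitted cubic camera response curves (coefficients from above).
RC = {
    "red":   [-6.434199e-03, 4.518039e-01, 1.291426e+00, 1.802896e+00],
    "green": [-5.804720e-03, 4.175837e-01, 1.176582e+00, 1.721643e+00],
    "blue":  [-4.376831e-03, 3.784418e-01, 1.075695e+00, 1.658471e+00],
}

def response(channel, v):
    """Cubic response curve: c0 + c1*v + c2*v^2 + c3*v^3."""
    c = RC[channel]
    return c[0] + c[1]*v + c[2]*v**2 + c[3]*v**3

# A fully saturated pixel (v = 1) maps to roughly the sum of the coefficients:
print(response("red", 1.0))   # ~3.54 (relative units)
```

Note how shallow the curve is at the top end: a saturated pixel only maps to a few relative units, so everything above the brightest properly exposed pixel in the shortest exposure is out of reach.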

Then I use this RC to analyze an HDRI of a captured LED. The value is 230,000
cd/m2 for a single LED, which is low (it has to be around 7*10^6 cd/m2).
So, it underestimates the luminance.

It seems the calibration point is critical here. I decided to capture a
different scene, with a wider range, for deriving the RC. It would make
sense that the camera has to see higher luminance values in order to measure
them accurately later; the dynamic range has to cover the measured values.

1. How does Photosphere deal with/approximate/calculate the upper end of
the curve? I assume it gives more weight to mid-tone values, but what
happens with high luminance values?

So, a new, brighter scene was picked, with the direct sun! But to avoid
damaging the camera's sensor, the measurements were taken just before
sunset.

In the new, brighter scene, without calibration, all values for the
reflectance standards were overestimated, while the value for the sun was
underestimated. So I decided to calibrate my scene at the sun!

But when I apply the absolute calibration, it simply multiplies all values
by the CF.

2. I assumed that when the CF is applied, it would not change all values
equally, but proportionally to the RC (since it is not linear). Why does it
apply equally across the whole range?

Lsun = 80*10^6 cd/m2. And of course the CF is very big: 391.
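On question 2, my understanding (an assumption, not from the Photosphere documentation) is that the response curve has already linearised the pixel data by the time the CF is applied, so a single constant multiplier is the correct operation at that stage. A toy sketch:

```python
# Sketch of the two-stage pipeline (assumed order of operations):
# the response curve linearises pixel values first, then the calibration
# factor rescales the already-linear data. Because the data is linear at
# that point, one constant CF applies uniformly across the whole range.

def linearise(v):
    # toy monotonic response curve (stand-in for the fitted polynomial)
    return 0.45*v + 1.3*v**2 + 1.8*v**3

CF = 0.957  # calibration factor from the grey-card measurement

def luminance(v):
    return CF * linearise(v)

# CF scales every linearised value by the same ratio:
print(luminance(0.5) / linearise(0.5))   # 0.957
print(luminance(0.9) / linearise(0.9))   # 0.957
```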

New RC:

red(r)   = 3.219064e+00 - 2.655078e+01*r + 9.351069e+02*r^2 - 2.115052e+03*r^3 + 1.594538e+03*r^4
green(g) = 2.094164e+00 - 1.468109e+00*g + 7.306838e+02*g^2 - 1.720743e+03*g^3 + 1.380693e+03*g^4
blue(b)  = 1.049078e+00 + 1.591820e+01*b + 5.848958e+02*b^2 - 1.461635e+03*b^3 + 1.251033e+03*b^4

But then something interesting happened. When I analyze the LED, it gives a
value of 79*10^6 cd/m2. So, it jumps to the upper limit calibrated with the
sun previously.

(I had similar results for EOS 7D with the lens 16-35mm, at 16mm)

3. Does Photosphere compress the response curve, so that at the upper end
all values above a certain threshold get the same number?

4. Any additional suggestions on properly obtaining and calibrating
HDRI for this purpose?

--
Thank you,
*Yulia Tyukhova*
Fulbright Scholar, "Intern LC"
Architectural Engineering Graduate Student, UNL-Omaha, NE, USA
B.E. and M.E. in Lighting Engineering (MPEI), Moscow, Russia
[email protected]
[email protected]
+1 (402) 996 0910
PKI 247

Hi Yulia,

I, too, am looking at HDR measurements of artificial light sources in
the hope of measuring glare. The results I've been getting are rather
inconsistent, so I've been doing some googling and reading.

I was able to determine luminance values for a single LED, which can be
compared to the ones from HDR images. But I have a couple of
questions/concerns about the HDRI technique and Photosphere.

First, I used a "regular" scene to retrieve the response curve (RC) of the
camera (large smooth gradients with very dark and bright areas, plus
reflectance standards for the absolute calibration).

Camera: EOS T1i Rebel with 28-105mm lens, at 28mm
Calibrated at the grey reflectance sample 186.45 cd/m2
CF=0.957

Then I use this RC to analyze an HDRI of a captured LED. The value is 230,000
cd/m2 for a single LED, which is low (it has to be around 7*10^6 cd/m2).
So, it underestimates the luminance.

McCann and Rizzi have done some quite comprehensive research into the
dynamic range of cameras and how it is limited by veiling glare. They
have published an HDR book
"The Art and Science of HDR Imaging" (Wiley, 2011), and many of their
papers are available on
http://web.mac.com/mccanns/HDR/Glare_Limits_HDRI.html

To whet your appetite, I recommend
http://web.mac.com/mccanns/HDR/Glare_Limits_HDRI_files/07EI%206492-41_1.pdf
Their conclusion is that accurate HDR luminance measurements are not
actually possible because veiling glare that is generated within the
lens limits the dynamic range of the optical system. Apparently, there
is actually an ISO standard (9358:1994) that comes to the same
conclusion: The higher the dynamic range of the scene, the more
inaccurate the HDR measurement.

You will be aware of the 'flare removal' option in hdrgen (-f switch).
I'm not entirely sure where 'flare' sits between 'point spread
function' and 'veiling glare', but I believe it to be closer to the
former. The PSF is a function of any optical system that causes the
image to become 'smudged'. Back a few years ago when the
megapixel race was in full swing, many observers correctly stated that
the lenses on cheap digital cameras can't actually deliver a
resolution that would justify, say, 12 MP on a digital snapshot
camera. This is the PSF they were talking about--it's how a pixel
affects the neighbouring pixels.

While the PSF can be estimated (or even calculated, given enough
information about the lenses and their optical properties? Not
sure...), veiling glare, on the other hand, cannot because it depends
on the scene. Every 'pixel' of the scene affects every pixel of the
image. It's even worse than that: Even scene objects outside of the
field of view of the optical system have an impact on the sensor
image.

Hoefflinger (Ed.), "High-Dynamic-Range (HDR) Vision" (Springer, 2007),
has an entire section dedicated to HDR lenses. While true HDR low-res
(video) cameras are actually becoming commercially available (they
have a logarithmic response, with a dynamic range far exceeding that
of human vision), the problem is that they require special HDR
lenses that have to be carefully designed to minimise veiling glare.
Digital camera lenses (even pro-level DSLR ones) are not optimised for
this.

So there is nothing wrong with the camera calibration that you carried
out with an LDR scene. This is how it should be done. The problem
you're facing is not specific to Photosphere or the Mitsunaga RSP
recovery algorithm. The RSP is not compressed at the upper end--it's
just physics that you're up against.

It seems the calibration point is critical here. I decided to capture a
different scene, with a wider range, for deriving the RC. It would make
sense that the camera has to see higher luminance values in order to measure
them accurately later; the dynamic range has to cover the measured values.

1. How does Photosphere deal with/approximate/calculate the upper end of
the curve? I assume it gives more weight to mid-tone values, but what
happens with high luminance values?

2. I assumed that when the CF is applied, it would not change all values
equally, but proportionally to the RC (since it is not linear). Why does it
apply equally across the whole range?

Lsun = 80*10^6 cd/m2. And of course the CF is very big: 391.

3. Does Photosphere compress the response curve, so that at the upper end
all values above a certain threshold get the same number?

4. Any additional suggestions on properly obtaining and calibrating HDRI
for this purpose?

I'm afraid you have to lower your expectations with regards to the
achievable accuracy when it comes to HDR scenes that include bright
light sources.

Light modulation ('flicker') is another problem with HDR measurements
of electric light sources. Unless you are certain that your light
source is driven by an HF driver or ballast, I recommend you actually
measure the modulation of the light source. If the LEDs are mains-driven,
they will flicker at 100 or 120 Hz, depending on your mains
frequency. If the modulation factor is high, e.g. if the LEDs
effectively switch on and off at this frequency, HDR measurements at
short exposure times will be unpredictable. You can test this by
taking a number of photographs of the same scene (with the light source
in it) at short exposure times. There is no need to go HDR. If all images
have the same overall 'brightness', you're all right. If the images
are noticeably different, you've got yet another problem.
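Axel's test is easy to script: take N short-exposure frames of the static scene and compare their mean brightness; a large frame-to-frame spread suggests the source modulation is being sampled unevenly. A rough sketch with made-up pixel data (loading real images is left out):

```python
# Rough flicker check (assumption: N repeated shots of the same static
# scene at the same short exposure time). Frames are flat lists of pixel
# values, e.g. grey levels 0-255.

def frame_means(frames):
    return [sum(f) / len(f) for f in frames]

def modulation_spread(frames):
    """Relative spread of mean frame brightness across the shot set."""
    m = frame_means(frames)
    return (max(m) - min(m)) / (sum(m) / len(m))

# Fake data: a steady source vs. a heavily modulated one.
steady  = [[100, 102, 98]] * 5
flicker = [[100]*3, [10]*3, [95]*3, [15]*3, [100]*3]
print(modulation_spread(steady))    # 0.0 -> fine
print(modulation_spread(flicker))   # large -> flicker problem
```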

Cheers

Axel

Hi,

did you consider using a filter to work with higher luminances at still
reasonable shutter speeds?

Cheers, Lars.

Lars, if this is a question to me--then yes, I did consider using an
ND filter, but no, I can't use one because the lens I'm using is a
4.5mm fisheye lens that doesn't allow the use of filters, AFAIK.

Either way--you'd want to expose (much) longer than a full cycle which
is only 1/100 sec. This severely limits the range of exposure times
available for the HDR sequence. If I remember correctly, Santiago
Torres captured sunny skies in his PhD thesis, and he had to use two
identical cameras for this, one of which was fitted with an ND filter.
Aligning the two must have been a nightmare.

Axel

_______________________________________________
HDRI mailing list
[email protected]

Hi Axel!

Lars, if this is a question to me--then yes, I did consider using an
ND filter, but no, I can't use one because the lens I'm using is a
4.5mm fisheye lens that doesn't allow the use of filters, AFAIK.

OK, that sounds scary. It would probably be possible to build a setup with
a filter wheel behind the lens - but that would no longer be a small, handy
device to be carried around outdoors.

For capturing an LED there is no need for a fisheye, though, so here one
or more filters might be an option. Of course, one would have to take a very
close look at how "neutral" that density filter actually is.

Either way--you'd want to expose (much) longer than a full cycle which
is only 1/100 sec. This severely limits the range of exposure times
available for the HDR sequence. If I remember correctly, Santiago
Torres captured sunny skies in his PhD thesis, and he had to use two
identical cameras for this, one of which was fitted with an ND filter.
Aligning the two must have been a nightmare.

Actually, there is code around that does the alignment using shared
keypoints. I mentioned Hugin before - I have not used it to get
photometric values from images, but it claims to handle HDR formats,
includes a lot of alignment functionality, and is free. Maybe worth a
try? Another question is how close two "identical" cameras can get...

Cheers, Lars.

Hi Lars,

For capturing an LED there is no need for a fisheye, though, so here one
or more filters might be an option. Of course, one would have to take a very
close look at how "neutral" that density filter actually is.

True enough. I need a 180deg lens for the glare, though. The veiling
glare might be less of a problem with non-fisheye lenses. Note that
McCann and Rizzi use an 'ordinary' lens, not a fisheye.

Either way--you'd want to expose (much) longer than a full cycle which
is only 1/100 sec. This severely limits the range of exposure times
available for the HDR sequence. If I remember correctly, Santiago
Torres captured sunny skies in his PhD thesis, and he had to use two
identical cameras for this, one of which was fitted with an ND filter.
Aligning the two must have been a nightmare.

Actually there are codes around doing the alignment by the use of shared
keypoints. I mentioned Hugin before - I have not used it to get
photometric values from images, but it claims to handle HDR formats,
includes a lot of alignment functionality, and is free. Maybe worth to
try it? Another question is how close two "identical" cameras can get...

I'm trying to use Hugin for vignetting calibration. The results so far
are not very consistent among themselves (different apertures), or
with what little is published on the vignetting of the Sigma 4.5mm. I have
to say that I have not explored all the options yet. Using two or more
cameras is not an option in my case--I've only got one.

Axel

To compensate for flicker, you could take several shots at the same shutter speed and average them.
You can go as high as 1/8000 s if you do so.
Next, measure the low and high values within the picture set and determine where you sit within the flicker fluctuation of that source (just to check you are covering the fluctuation nicely).
The overall picture average should be reasonably close to the long-integration-time exposure.
This seems like a lot to write (in MATLAB), but it is not rocket science if you have the lamp data sheet and a camera with a lot of fps...
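G's averaging idea in outline (Python rather than MATLAB; the frame data below is hypothetical): average M frames taken at the same short shutter speed so the flicker cycle averages out, and check that the dimmest and brightest frames bracket the fluctuation of the source:

```python
# Sketch of G's suggestion: average several frames taken at the same short
# shutter speed (e.g. 1/8000 s) so that the source's flicker averages out.
# Frames are flat lists of pixel values (hypothetical data).

def average_frames(frames):
    """Per-pixel average over the shot set."""
    n = len(frames)
    return [sum(px) / n for px in zip(*frames)]

def coverage(frames):
    """Min and max frame means: a rough check that the shot set spans
    the flicker fluctuation of the source."""
    means = [sum(f) / len(f) for f in frames]
    return min(means), max(means)

shots = [[200, 40], [10, 2], [190, 38], [20, 4]]   # two pixels, four shots
print(average_frames(shots))   # [105.0, 21.0]
print(coverage(shots))         # (6.0, 120.0)
```

The averaged frame should then sit reasonably close to a single long-integration exposure of the same scene, as G notes.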

Using an ND filter is an option, but it is not going to get you close enough to the sort of luminance you are looking to measure.
Adding multiple filters is going to make the lens performance worse and in the end will not work.

And about lenses... you may need to consider a good one, like the symmetrical lens designs for large format or rangefinders (look at Zeiss/Leica to start with).
They usually correct a lot of aberrations very well, and flare is just one of the set...

G


Hi all!

Indeed, the alignment of the cameras was far from perfect, and I had to do some post-processing to correct this, which was done simply by shifting pixels relative to a reference point. Still, it was OK for what I needed (using the HDR as a light source), but I wouldn't use the same approach again.

Flare from the sun was quite a problem, especially as I was using a filter behind the fisheye. I'm still not sure whether a filter in front would reduce it, as the reduced flare would be offset by the need for a longer exposure... Still, the flare affects the lower end of the dynamic range, so it shouldn't be an issue for the measurement of the LED itself.

A note about ND filters: I've had very bad experiences with photography filters, especially at high ND values. It really needs to be an optical-grade filter (I ended up using a combination of filters from Edmund Optics).

Also, a very good reference is the thesis by Jessi Stumpfel, from USC (working with Paul Debevec), available here:
http://gl.ict.usc.edu/skyprobes/

Hope this helps,
Santiago


Hello Yulia,

Seems your question has spawned quite a bit of interesting discussion...

My main recommendation is to use camera RAW images for critical photometry, especially when there are saturated colors involved. It is impossible to correct the color of JPEG images and undo what the camera maker has done, so you need to start from the sensor data.

Photosphere does not accept camera RAW as input, but I have written a Perl script that uses dcraw with the command-line HDR image builder hdrgen to overcome this limitation. It also requires the use of another third-party program, exiftool, which I have packaged together for you at:

Unfortunately, I don't have a good set of documentation to go with it. Typing "raw2hdr" by itself shows the basic syntax:

Usage: raw2hdr [hdrgen opts][-h][-w][-C calib][-c cspace] -o output.hdr input1.raw ..

If your images are taken on a tripod (aligned exposures), you can use the default settings:

raw2hdr -o output.hdr expos1.cr2 expos2.cr2 expos3.cr2 ...

The hdrgen settings can be found in the included HTML man page, and so can the -h and -w option meanings in the included dcraw man page. The -C option is to provide a linear factor to correct the overall exposure based on previous calibrations. The -c option is to specify an output color space. The default is "sRGB" which is actually linear CCIR-709 primaries. The only other output color space I would recommend is AdobeRGB. There is a CIE XYZ space supported by dcraw, but I have found it to be somewhat unreliable, and I don't know where the fault lies in this.

Regarding Axel's mention of camera flare, this is less of an issue for sources that are brighter than the rest of the scene. It mostly affects darker, surrounding regions. The -f option will attempt to estimate the camera/lens PSF and remove it, but it cannot be relied upon to remove this source of error completely. Your problem with the accuracy of the LED sources is due no doubt (as others have said) to limitations in your short exposures combined with the color issues inherent to JPEG processing.

Other responses inline....

From: "Tyukhova, Yulia" <[email protected]>
Date: February 19, 2012 9:34:25 PM PST

Hello everybody!
I'll provide a summary of my research, with my questions embedded in the summary. I would appreciate any help!
My research investigates whether the HDRI technique can accurately capture the luminances of small, bright light sources (e.g. LED garage fixtures) with narrow light distributions.
I was able to determine luminance values for a single LED, which can be compared to the ones from HDR images. But I have a couple of questions/concerns about the HDRI technique and Photosphere.
First, I used a "regular" scene to retrieve the response curve (RC) of the camera (large smooth gradients with very dark and bright areas, plus reflectance standards for the absolute calibration).
Camera: EOS T1i Rebel with 28-105mm lens, at 28mm
Calibrated at the grey reflectance sample: 186.45 cd/m2
CF=0.957

I've got the following RC for RGB:
red(r)   = -6.434199e-03 + 4.518039e-01*r + 1.291426e+00*r^2 + 1.802896e+00*r^3
green(g) = -5.804720e-03 + 4.175837e-01*g + 1.176582e+00*g^2 + 1.721643e+00*g^3
blue(b)  = -4.376831e-03 + 3.784418e-01*b + 1.075695e+00*b^2 + 1.658471e+00*b^3
If I look at the histogram of the scene, the maximum luminance within the scene is 60,291 cd/m2.
Then I use this RC to analyze an HDRI of a captured LED. The value is 230,000 cd/m2 for a single LED, which is low (it has to be around 7*10^6 cd/m2). So, it underestimates the luminance.
It seems the calibration point is critical here. I decided to capture a different scene, with a wider range, for deriving the RC. It would make sense that the camera has to see higher luminance values in order to measure them accurately later; the dynamic range has to cover the measured values.
1. How does Photosphere deal with/approximate/calculate the upper end of the curve? I assume it gives more weight to mid-tone values, but what happens with high luminance values?

Photosphere (and hdrgen) use all the brightest pixels from the shortest exposure and all the darkest pixels from the longest exposure. Middle exposures have their brightest and darkest pixels downgraded.
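The weighting Greg describes might be sketched like this (the hat function for middle exposures is an assumption on my part; only the shortest/longest-exposure special cases follow his description):

```python
# Sketch of exposure-dependent pixel weighting for HDR assembly.
# Endpoint rules follow Greg's description: the shortest exposure keeps its
# brightest pixels at full weight, the longest exposure keeps its darkest
# pixels at full weight. Middle exposures have their extremes downgraded
# (here with a simple hat function -- an assumed, illustrative choice).

def weight(v, is_shortest=False, is_longest=False):
    """v: normalised pixel value in [0, 1]; returns merge weight in [0, 1]."""
    if is_shortest and v > 0.5:
        return 1.0   # trust the brightest pixels of the shortest exposure
    if is_longest and v < 0.5:
        return 1.0   # trust the darkest pixels of the longest exposure
    return 1.0 - abs(2.0*v - 1.0)   # downweight extremes of middle exposures

print(weight(0.95, is_shortest=True))  # 1.0
print(weight(0.95))                    # ~0.1 (downgraded in a middle exposure)
```

The practical consequence for the LED problem: if even the shortest exposure saturates on the source, there are no trustworthy bright pixels anywhere in the sequence, and no weighting scheme can recover the true luminance.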

So, a new, brighter scene was picked, with the direct sun! But to avoid damaging the camera's sensor, the measurements were taken just before sunset.
In the new, brighter scene, without calibration, all values for the reflectance standards were overestimated, while the value for the sun was underestimated. So I decided to calibrate my scene at the sun!
But when I apply the absolute calibration, it simply multiplies all values by the CF.
2. I assumed that when the CF is applied, it would not change all values equally, but proportionally to the RC (since it is not linear). Why does it apply equally across the whole range?
Lsun = 80*10^6 cd/m2. And of course the CF is very big: 391.
New RC:
red(r)   = 3.219064e+00 - 2.655078e+01*r + 9.351069e+02*r^2 - 2.115052e+03*r^3 + 1.594538e+03*r^4
green(g) = 2.094164e+00 - 1.468109e+00*g + 7.306838e+02*g^2 - 1.720743e+03*g^3 + 1.380693e+03*g^4
blue(b)  = 1.049078e+00 + 1.591820e+01*b + 5.848958e+02*b^2 - 1.461635e+03*b^3 + 1.251033e+03*b^4
But then something interesting happened. When I analyze the LED, it gives a value of 79*10^6 cd/m2. So, it jumps to the upper limit calibrated with the sun previously.
(I had similar results for an EOS 7D with the 16-35mm lens, at 16mm)

I don't think your shortest exposure properly captured the LED, and maybe didn't capture the sun, either.

3. Does Photosphere compress the response curve, so that at the upper end all values above a certain threshold get the same number?

Photosphere does not compress the curve.

4. Any additional suggestions on properly obtaining and calibrating HDRI for this purpose?

I would only reiterate others' suggestion to use a neutral density filter, and to use raw2hdr rather than Photosphere.

--
Thank you,
Yulia Tyukhova

Best,
-Greg

Everybody,

Thank you for the fast responses, tools, and suggestions!

Greg,

Thank you for your suggestions and files!
I am new to Radiance, and I assume that this is what I need to have
installed on my computer in order to use the suggested Perl script.
If you can provide me with a link/info on how to run it, that would be great.

Let me restate the question about the compression of the curve in
Photosphere.
1. Do manufacturers compress the response curve, or is it limited by
camera/optics/sensor saturation itself at the upper end?

2. And I'm still curious: how is the CF applied in Photosphere?

I've been using an ND filter (t=0.0094) on the luminance meter, because
otherwise it is impossible to measure such high luminances. I assume you
suggest using it on the camera as well.
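For completeness, the arithmetic for the meter's ND filter (t = 0.0094): the true luminance is the filtered reading divided by the transmittance. The example reading below is hypothetical, chosen to match the ~7*10^6 cd/m2 LED value:

```python
# Correcting a luminance-meter reading taken through an ND filter.
t = 0.0094                       # filter transmittance (from above)

def true_luminance(reading_cd_m2):
    return reading_cd_m2 / t

# e.g. a hypothetical reading of 65,800 cd/m2 through the filter
# corresponds to 7*10^6 cd/m2 at the source:
print(true_luminance(65800))     # 7000000.0
```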

I'm looking forward to analyzing my images with the suggested
hdrgen. Luckily, I've been taking them in both formats, JPEG and raw.
Greg, would you recommend having the regular calibration scene calibrated
at the grey card, instead of using the brighter scene?

Thank you,
Yulia

Hello Yulia,

Seems your question has spawned quite a bit of interesting discussion...

My main recommendation is to use camera RAW images for critical photometry,
especially when there are saturated colors involved. It is impossible to
correct the color of JPEG images and undo what the camera maker has done,
so you need to start from the sensor data.

Photosphere does not accept camera RAW as input, but I have written a Perl
script that uses dcraw with the command-line HDR image builder hdrgen to
overcome this limitation. It also requires the use of another third-party
program, exiftool, which I have packaged together for you at:

Unfortunately, I don't have a good set of documentation to go with it.
Typing "raw2hdr" by itself shows the basic syntax:

Usage: raw2hdr [hdrgen opts][-h][-w][-C calib][-c cspace] -o
output.hdr input1.raw ..

If your images are taken on a tripod (aligned exposures), you can use the
default settings:

raw2hdr -o output.hdr expos1.cr2 expos2.cr2 expos3.cr2 ...

The hdrgen settings can be found in the included HTML man page, and so can
the -h and -w option meanings in the included dcraw man page. The -C
option is to provide a linear factor to correct the overall exposure based
on previous calibrations. The -c option is to specify an output color
space. The default is "sRGB" which is actually linear CCIR-709 primaries.
The only other output color space I would recommend is AdobeRGB. There is
a CIE XYZ space supported by dcraw, but I have found it to be somewhat
unreliable, and I don't know where the fault lies in this.

Regarding Axel's mention of camera flare, this is less of an issue for
sources that are brighter than the rest of the scene. It mostly affects
darker, surrounding regions. The -f option will attempt to estimate the
camera/lens PSF and remove it, but it cannot be relied upon to remove this
source of error completely. Your problem with the accuracy of the LED
sources is due no doubt (as others have said) to limitations in your short
exposures combined with the color issues inherent to JPEG processing.

Other responses inline....

From: "Tyukhova, Yulia" <[email protected]>
Date: February 19, 2012 9:34:25 PM PST

Hello everybody!
I?ll provide the summary of my research and have questions within the

summary. I would appreciate any of your help!

My research investigates if HDRI technique can precisely capture

luminances of small bright light sources (e.g. LED garage fixtures) with
narrow light distributions.

I was able to figure out luminance values for a single LED, which can be

compared to the ones from HDR images. But I have a couple of
questions/concerns on HDRI technique and Photosphere.

At first, I?ve used ?regular? scene to retrieve response curve of the

camera (large smooth gradients with very dark and bright areas, and had
reflectance standards for the absolute calibration).

Camera: EOS T1i Rebel with 28-105mm lens, at 28mm
Calibrated at the grey reflectance sample 186.45 cd/m2
CF=0.957

I?ve got the following RC for RGB:
red(r) = -6.434199e-03 + 4.518039e-01*r + 1.291426e+00*r^2 + 1.802896e+00*r^3;
green(g) = -5.804720e-03 + 4.175837e-01*g + 1.176582e+00*g^2 + 1.721643e+00*g^3;
blue(b) = -4.376831e-03 + 3.784418e-01*b + 1.075695e+00*b^2 + 1.658471e+00*b^3
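As a minimal sketch (assuming the RC maps a pixel value normalized to [0, 1] onto relative linear output), the polynomials above can be evaluated directly in Python; the coefficients are copied from this message:

```python
# Evaluate the fitted response-curve polynomials quoted above.
# Coefficients are ordered from the constant term upward (x^0, x^1, ...).

def poly(coeffs, x):
    return sum(c * x ** i for i, c in enumerate(coeffs))

red_rc   = [-6.434199e-03, 4.518039e-01, 1.291426e+00, 1.802896e+00]
green_rc = [-5.804720e-03, 4.175837e-01, 1.176582e+00, 1.721643e+00]
blue_rc  = [-4.376831e-03, 3.784418e-01, 1.075695e+00, 1.658471e+00]

# Relative linear response of a mid-grey (0.5) pixel in each channel:
r_lin, g_lin, b_lin = (poly(rc, 0.5) for rc in (red_rc, green_rc, blue_rc))
print(r_lin, g_lin, b_lin)
```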

If I look at the histogram of the scene, the maximum luminance within the
scene is 60,291 cd/m2.

Then I use this RC to analyze an HDRI of a captured LED. The value is
230,000 cd/m2 for a single LED, which is low (it has to be around 7*10^6
cd/m2). So, it underestimates the luminance.

It seems like the calibration point is critical here. I've decided to try to
capture a different scene for deriving an RC with a wider range. It would make
sense that the camera has to see higher luminance values in order to accurately
measure them later. The dynamic range has to cover the measured values.

1. How does Photosphere deal with/approximate/calculate the upper end of
the curve? I assume it gives more weight to mid-tone values? But what
happens with high luminance values?
Photosphere (and hdrgen) use all the brightest pixels from the shortest
exposure and all the darkest pixels from the longest exposure. Middle
exposures have their brightest and darkest pixels downgraded.
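Greg's scheme can be sketched as a weighted average; the hat-shaped weight below is only an illustrative assumption (Photosphere's actual weighting function isn't spelled out here), but it shows how extremes are kept from the end exposures and downgraded elsewhere:

```python
# Sketch of the merge weighting described above (assumed hat weight).
# v is a linearized pixel value in [0, 1]; t is the exposure time in seconds.

def weight(v, shortest=False, longest=False):
    if shortest and v > 0.5:
        return 1.0  # keep the brightest pixels of the shortest exposure
    if longest and v < 0.5:
        return 1.0  # keep the darkest pixels of the longest exposure
    return 1.0 - abs(2.0 * v - 1.0)  # downgrade extremes of middle exposures

def merge(values, times):
    """Weighted radiance estimate for one pixel; exposures ordered shortest first."""
    n = len(values)
    num = den = 0.0
    for i, (v, t) in enumerate(zip(values, times)):
        w = weight(v, shortest=(i == 0), longest=(i == n - 1))
        num += w * v / t  # each exposure votes with its radiance estimate v/t
        den += w
    return num / den if den > 0.0 else 0.0
```

Note that a pixel clipped even in the shortest exposure gets no reliable vote at all, which is why an LED that saturates every exposure cannot be recovered by the merge.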

So, the new brighter scene was picked with the direct sun! But in order
to avoid damage to the camera's sensor, measurements were taken before
sunset.

In the new brighter captured scene, without the calibration, all values for
the reflectance standards were overestimated, while the value for the sun was
underestimated. Then I decided to calibrate my scene at the sun!

But when I apply absolute calibration, it simply multiplies all values by
the CF.

2. I assumed that when the CF is applied, it does not change all values
equally, but does so proportionally to the RC (since it is not linear). Why
does it do it equally for the whole range?

Lsun = 80*10^6 cd/m2. And of course the CF is very big: 391.
New RC:
red(r) = 3.219064e+00 - 2.655078e+01*r + 9.351069e+02*r^2 - 2.115052e+03*r^3 + 1.594538e+03*r^4;
green(g) = 2.094164e+00 - 1.468109e+00*g + 7.306838e+02*g^2 - 1.720743e+03*g^3 + 1.380693e+03*g^4;
blue(b) = 1.049078e+00 + 1.591820e+01*b + 5.848958e+02*b^2 - 1.461635e+03*b^3 + 1.251033e+03*b^4

But then something interesting happened. When I analyze the LED, it gives a
value of 79*10^6 cd/m2. So, it jumps to this upper limit calibrated with the
sun previously.

(I had similar results for EOS 7D with the lens 16-35mm, at 16mm)

I don't think your shortest exposure properly captured the LED, and maybe
didn't capture the sun, either.

3. Does Photosphere compress the response curve, so that at the upper end
all values above a certain threshold will have the same number?

Photosphere does not compress the curve.

4. Any additional suggestions on properly obtaining and calibrating
HDRI for this purpose?

I would only reiterate others' suggestion to use a neutral density filter,
and to use raw2hdr rather than Photosphere.

--
Thank you,
Yulia Tyukhova

Best,
-Greg

···

--
Thank you,
Yulia Tyukhova
Fulbright Scholar, "Intern LC"
Architectural Engineering Graduate Student, UNL-Omaha, NE, USA
B.E. and M.E. in Lighting Engineering (MPEI), Moscow, Russia
[email protected]
[email protected]
+1 (402) 996 0910
PKI 247

Responses inline...

From: "Tyukhova, Yulia" <[email protected]>
Date: February 20, 2012 12:48:37 PM PST

Everybody,

Thank you for fast responses/tools and suggestions!

Greg,

Thank you for your suggestions and files!
I am new to Radiance, and I assume that this is what I need to have installed on my computer in order to use the suggested Perl scripts.
If you can provide me with the link/info how to run it, that would be really helpful!

Actually, you don't need to have Radiance installed. You just need to move the executables (non-HTML files) from the unpacked directory to /usr/bin or /usr/local/bin or some other directory in your shell's PATH variable. These are command-line tools that must be run from the Terminal application under /Applications/Utilities. I.e., start Terminal and copy the files from your Downloads folder with:

tar xzf raw2hdr.tgz
cd raw2hdr
cp raw2hdr dcraw exiftool /usr/bin
cd
raw2hdr

This should give you the usage message I wrote you earlier if it all goes well. Some basic commands and pointers for Unix are available many places online. Googling "basic unix tutorial" gave this page at the head of the list:

Let me restate the question about the compression of the curve in Photosphere.
1. Do manufacturers compress the response curve, or is it limited by camera/optics/sensor saturation itself at the upper end?

Some camera makers do compress the top end of the response curve, and do funny things at the bottom as well. Photosphere attempts to discover the tone curve and correct for these manipulations, but it isn't perfect and if the camera is changing the tone curve dynamically, it's pretty hopeless. There are settings you can use on a DSLR to disable such manipulations, but using RAW files bypasses the problems entirely because the data is linear.

2. And I'm still curious, how is the CF applied in Photosphere?

A calibration factor is applied equally to all coefficients in the polynomial, which is exactly the same as applying a linear scale factor after the HDR merge operation.
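Because the polynomial is linear in its coefficients, scaling every coefficient by the CF is identical to scaling the decoded value afterwards; a small Python check (response-curve coefficients and CF = 0.957 copied from earlier in the thread):

```python
# Check: applying the calibration factor to the polynomial coefficients
# equals applying it as a linear scale after decoding.

CF = 0.957  # calibration factor from the first (grey-card) scene

def poly(coeffs, x):
    return sum(c * x ** i for i, c in enumerate(coeffs))

rc = [-6.434199e-03, 4.518039e-01, 1.291426e+00, 1.802896e+00]
scaled_rc = [CF * c for c in rc]

for x in (0.0, 0.25, 0.5, 0.75, 1.0):
    assert abs(poly(scaled_rc, x) - CF * poly(rc, x)) < 1e-12
```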

I've been using an ND filter (t = 0.0094) on the luminance meter, because otherwise it is impossible to measure such high luminances. I assume you suggest using it on the camera as well.

Whatever gives you a short exposure that is past the integration time of your source (1/60th second is acceptable) and not saturated is OK. Specifically, all values in the short exposure's histogram should be below 245.
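That criterion is straightforward to verify on the shortest exposure's pixel data; a tiny sketch with made-up 8-bit values (the 245 limit is the number quoted above):

```python
# Saturation check for the shortest exposure: every 8-bit channel value
# must stay below 245, otherwise the brightest source was clipped.

def is_unsaturated(pixels, limit=245):
    return max(pixels) < limit

assert is_unsaturated([12, 118, 201, 244])      # OK: headroom remains
assert not is_unsaturated([12, 118, 245, 160])  # clipped: shorten the exposure
```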

I'm looking forward to analyzing my images with the suggested hdrgen. Luckily, I've been taking them in both formats, JPEG and raw.
Greg, would you recommend having the regular calibration scene calibrated at the grey card instead of using the brighter scene?

The best scene for calibration is a white card in a scene with no bright sources directed at the camera. The calibration should hold in other scenes where lens flare is not problematic.

Thank you,
Yulia

Certainly,
-Greg

Greg,
I apologize for so many questions!

Are those executables for Linux OS?
I use Windows 7 32-bit OS.




Oh, oops! I assumed since you said you were using Photosphere that you were on a Mac. These are for Apple's OS X (Snow Leopard). I don't have versions for Linux or Windows.

-Greg

···

From: "Tyukhova, Yulia" <[email protected]>
Date: February 20, 2012 2:41:11 PM PST

Greg,
I apologize for so many questions!

Are those executables for Linux OS?
I use Windows 7 32-bit OS.

Hi Santiago,

good to hear from you. May I ask you to elaborate this:

Flare from the sun was quite a problem, especially as I was using a filter behind the fish-eye. I'm still not sure if the filter in front would reduce it, as the less flare would be compensated by the need of a longer exposure... Still, the flare affects the lower end of the dynamic range, so it shouldn't be an issue for the measurement of the LED itself.

How did you attach the filter behind the fisheye lens? Did you simply
stick it there with some glue or silicone? Or onto the sensor?

A note about ND filters, I've had very bad experiences with photography filters, especially at high ND values. It really needs to be an optical filter (I ended up using a combination of filters from Edmund Optics).

Do you mean that the light attenuation was not what was written on the
box? Or were there other issues? I assume you calibrated even the EO
filter, or did you just take it at face value?

Cheers

Axel

Yulia,

First of all, good on ya for dipping your toes into the HDR list with such informed questions! I think you, me, Jennifer, and Clarence can work together to get you applying Greg's advice at Lincoln. I am pretty sure Clarence thought there was a Mac available in the school, and he will of course be back from NREL full time in May-ish and has his own Mac too.

This has been a great discussion, all.

- Rob

···

On Feb 20, 2012, at 4:04 PM, Gregory J. Ward wrote:

Oh, oops! I assumed since you said you were using Photosphere that you were on a Mac. These are for Apple's OS X (Snow Leopard). I don't have versions for Linux or Windows.

-Greg

From: "Tyukhova, Yulia" <[email protected]>
Date: February 20, 2012 2:41:11 PM PST

Greg,
I apologize for so many questions!

Are those executables for Linux OS?
I use Windows 7 32-bit OS.


Greg,

I was able to run your scripts, but now I have more questions on what's
behind them.

1. I assume that after combining the raw images into one, I can analyze the
obtained HDR image in Photosphere.

–r *cam.rsp*

Use the given file for the camera’s response curves. If this file exists,
it must contain the coefficients of three polynomials, one for each color
primary. If the file does not exist, hdrgen will use its principal
algorithm to derive these coefficients and write them out to this file for
later use.

2. So, if I combine my images for the calibration scene, hdrgen will
derive the coefficients and write them to the file. Is there a way to see those
coefficients and to know what the response curve looks like?

The -C option is to provide a linear factor to correct the overall
exposure based on previous calibrations.

3. How do I make the absolute calibration? Usually it is a luminance value
of a reflectance standard measured with luminance meter applied in
Photosphere. But how do I do it with the given script?

The -c option is to specify an output color space. The default is "sRGB"
which is actually linear CCIR-709 primaries. The only other output color
space I would recommend is AdobeRGB. There is a CIE XYZ space supported by
dcraw, but I have found it to be somewhat unreliable, and I don't know
where the fault lies in this.

4. In order to have luminance values, Photosphere has an algorithm that does
color calculations from sRGB to CIE XYZ (standard illuminant D65), where Y
is the luminance value. Here I can specify an output color space, let's say
sRGB, but how would I get luminance values?

···

–s *stonits*

5. Do I need this option?

Thank you!

--
Thank you,
Yulia Tyukhova
Fulbright Scholar, "Intern LC"
Architectural Engineering Graduate Student, UNL-Omaha, NE, USA
B.E. and M.E. in Lighting Engineering (MPEI), Moscow, Russia
[email protected]
[email protected]
+1 (402) 996 0910
PKI 247

Hi Axel,

About the filter, actually it was attached with tape to the back of the FC-E8 adapter (and in front of the camera lens). So it was in the middle of the optics. I think this may have helped. The back lens of the adapter does not protrude, so it was easy to fix it parallel to the other lenses.

The problems with the ND filters were mostly with colour shifting. The cheaper darker filters make everything too green (and these were the best I could get from a photography shop, not that cheap). The calibration was done with the filters in place and compensated with measurements. If I remember correctly, the stated tolerance for the transmissivity was about 5% (for an ND3, that's 1/20 000th?), but I didn't verify whether the real one was within this.
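Santiago's parenthetical arithmetic checks out: an ND3 filter nominally transmits 10^-3 of the light, so a 5% tolerance on that transmissivity is an absolute uncertainty of about 1/20,000. In Python:

```python
# 5% tolerance on an ND3 filter's nominal transmission of 10^-3.

nominal_t = 10 ** -3           # ND3: optical density 3
tolerance = 0.05 * nominal_t   # stated 5% tolerance, as absolute transmission
assert abs(tolerance - 5e-5) < 1e-15
assert round(1.0 / tolerance) == 20_000   # i.e. about 1/20,000
```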

Cheers,
Santiago

···

________________________________________
From: Axel Jacobs [[email protected]]
Sent: Tuesday, February 21, 2012 1:38 AM
To: High Dynamic Range Imaging
Subject: Re: [HDRI] HDRI capture of LED

Hi Santiago,

good to hear from you. May I ask you to elaborate this:

Flare from the sun was quite a problem, especially as I was using a filter behind the fish-eye. I'm still not sure if the filter in front would reduce it, as the less flare would be compensated by the need of a longer exposure... Still, the flare affects the lower end of the dynamic range, so it shouldn't be an issue for the measurement of the LED itself.

How did you attach the filter behind the fisheye lens? Did you simply
stick it there with some glue or silicone? Or onto the sensor?

A note about ND filters, I've had very bad experiences with photography filters, especially at high ND values. It really needs to be an optical filter (I ended up using a combination of filters from Edmund Optics).

Do you mean that the light attenuation was not what was written on the
box? Or were there other issues? I assume you calibrated even the EO
filter, or did you just take it at face value?

Cheers

Axel

Hi Yulia,

Not all of the options of hdrgen are relevant for raw2hdr. See inline....

From: "Tyukhova, Yulia" <[email protected]>
Date: February 21, 2012 1:21:05 PM PST

Greg,

I was able to run your scripts, but now I have more questions on what's behind them.

1. I assume that after combining the raw images into one, I can analyze the obtained HDR image in Photosphere.

Yes, of course.

>> –r cam.rsp
Use the given file for the camera’s response curves. If this file exists, it must contain the coefficients of three polynomials, one for each color primary. If the file does not exist, hdrgen will use its principal algorithm to derive these coefficients and write them out to this file for later use.

2. So, if I combine my images for the calibration scene, hdrgen will derive the coefficients and write them to the file. Is there a way to see those coefficients and to know what the response curve looks like?

The raw2hdr script doesn't need to derive a response curve, since the sensor data is linear. Instead, it creates an output from dcraw that follows a 2.0 gamma and creates an artificial response curve of x^2 to decode it. This reduces quantization errors from the 8-bit intermediate images.
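The point of the gamma-2.0 intermediate can be illustrated with a small round-trip sketch (a simplification of what raw2hdr/dcraw actually do): square-root encoding before 8-bit quantization spends more code values on the dark end, so shadows survive far better than with linear 8-bit storage.

```python
# Round-trip a deep-shadow value through an 8-bit intermediate,
# once with the gamma-2.0 encoding and once stored linearly.

def encode(lin):            # linear [0, 1] -> 8-bit code, gamma 2.0
    return round(lin ** 0.5 * 255)

def decode(code):           # 8-bit code -> linear, the x^2 "response curve"
    return (code / 255) ** 2

lin = 0.001                 # a deep shadow value
err_gamma = abs(decode(encode(lin)) - lin) / lin
err_linear = abs(round(lin * 255) / 255 - lin) / lin
assert err_gamma < err_linear   # the gamma intermediate loses far less
```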

>>The -C option is to provide a linear factor to correct the overall exposure based on previous calibrations.

3. How do I make the absolute calibration? Usually it is a luminance value of a reflectance standard measured with luminance meter applied in Photosphere. But how do I do it with the given script?

Look at your raw2hdr result in Photosphere and select the measured area. Divide your measurement by the value Photosphere gives you. This is the calibration factor to use with the -C option for conversions for this camera and lens.
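As arithmetic, the -C factor is just that ratio. The Photosphere reading below is hypothetical, chosen only to illustrate the division; the 186.45 cd/m2 meter value is the grey-sample measurement from earlier in the thread:

```python
# Deriving the -C calibration factor: meter reading over HDR reading.

meter_luminance = 186.45     # cd/m2, luminance-meter measurement (from thread)
photosphere_value = 194.8    # cd/m2, hypothetical value read off the HDR
CF = meter_luminance / photosphere_value
print(f"raw2hdr -C {CF:.3f} -o output.hdr input1.cr2 input2.cr2 ...")
```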

>>The -c option is to specify an output color space. The default is "sRGB" which is actually linear CCIR-709 primaries. The only other output color space I would recommend is AdobeRGB. There is a CIE XYZ space supported by dcraw, but I have found it to be somewhat unreliable, and I don't know where the fault lies in this.

4. In order to have luminance values, Photosphere has an algorithm that does color calculations from sRGB to CIE XYZ (standard illuminant D65), where Y is the luminance value. Here I can specify an output color space, let's say sRGB, but how would I get luminance values?

As I mentioned, the sRGB and AdobeRGB spaces will both work, and Photosphere will adjust its Y value calculations accordingly. The color space is recorded in the HDR output.
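For reference, with linear Rec.709/sRGB primaries, Y is a fixed weighted sum of the linear channels; the sketch below uses the standard Rec.709 luminance coefficients as an assumption about what Photosphere computes internally:

```python
# Luminance (Y) from linear Rec.709/sRGB channel values.
# If R, G, B are already in cd/m2, Y comes out in cd/m2 as well.

def luminance(r, g, b):
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

# A neutral (grey) pixel has Y equal to its channel value:
assert abs(luminance(100.0, 100.0, 100.0) - 100.0) < 1e-9
```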

>>–s stonits

You do not need it -- this option is for when the camera doesn't record the necessary aperture, ASA, and shutter speed settings in the image file.

Best,
-Greg

Hello Greg!

I was able to combine my images of the LED with the ND filter using raw2hdr,
but I need to clarify a couple of things.

As the intermediate step of combining the images I have the following:

Writing data to standard output ...
Can't locate Image/ExifTool.pm in @INC (@INC contains: /usr/bin/lib
/System/Library/Perl/5.12
/System/Library/Perl/Extras/5.12 .) at /usr/bin/exiftool line 30.
BEGIN failed--compilation aborted at /usr/bin/exiftool line 30.

1. What does it mean? Is there a problem with data input/output? I just
want to make sure that the data is processed properly.

2. At first it didn't make sense why I would need hdrgen, since it uses tiff
or jpeg as input while I'm combining an hdr from raw. But then I noticed
that raw2hdr generates temporary tiff photos and then uses them in the hdrgen
step. But if I want to include some additional settings for hdrgen
(like flare removal) besides the default ones, I get the following error:
raw2hdr hdrgen -f -o output6.hdr IMG_02??.CR2
Missing -o output file specification
How do I write my settings?

Thank you!
Yulia

Hi Yulia,

The error with Exiftool is my fault, I'm afraid. I naively thought that the program was self-contained, when it is not. You need to download and install it on your machine via the following URL:

As for your second question, you can use other hdrgen options, but you don't need to write "hdrgen" on the command line as you have done. The raw2hdr script knows how to sort out the various options.

Best,
-Greg

···

From: "Tyukhova, Yulia" <[email protected]>
Date: February 22, 2012 12:59:45 PM PST

Hello Greg!

I was able to combine my images of the LED with the ND filter using raw2hdr, but I need to clarify a couple of things.

As the intermediate step of combining the images I have the following:

Writing data to standard output ...