errors from hdrgen

Terribly sorry for sending that mammoth attachment to the list, I didn't
realize it was that big before it was too late. :(

Here they are instead:
http://www.juventuz.com/_temp/good_set.tar.gz
http://www.juventuz.com/_temp/bad_set.tar.gz

Martin

···

---------- Forwarded Message ----------

Subject: errors from hdrgen
Date: Thursday 10 June 2004 13:11
From: Martin Matusiak <alex@juventuz.net>
To: radiance-general@radiance-online.org

I seem to have trouble using hdrgen to generate HDRs. The other day it
seemed to be working fine, but now I'm getting errors like
"Poor convergence of order 1 fit"
"Cannot solve for response function"

Ultimately it won't generate the image. Perhaps you could tell me what they
mean and what I'm doing wrong? I've attached the five images that gave this
error. I've also attached another set of 3 that gave no error or warning at
all, even though the resulting hdr is quite blurry.

Thanks a lot!

Martin

Martin Matusiak wrote:

"Poor convergence of order 1 fit"

Ignore this one; it is a debug message and only means that things are not running as smoothly as they could.

"Cannot solve for response function"

That is more of a problem. Essentially, the dynamic range of the images has gaps in it, or is clipped at either end of the range. Try:

- take images at smaller f-stop intervals
- remove images that are too bright or too dark
- include darker or lighter images
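One rough way to check a bracket for such gaps before running hdrgen (a sketch only; the shutter speeds below are invented):

```python
# Rough sketch of the advice above: measure the EV spacing between
# consecutive exposures in a bracket. Shutter speeds here are invented.
import math

def stop_gaps(shutter_speeds):
    """EV (f-stop) spacing between consecutive exposures, fastest first."""
    times = sorted(shutter_speeds)
    return [math.log2(b / a) for a, b in zip(times, times[1:])]

speeds = [1/500, 1/125, 1/60, 1/30, 1/8]   # hypothetical bracket, in seconds
gaps = stop_gaps(speeds)
# Gaps much larger than ~1 stop leave holes in the response-curve fit;
# those are the places to add in-between exposures.
flagged = [g for g in gaps if g > 1.5]
```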

We keep having this problem with webhdr, but there doesn't seem to be a rule of thumb for getting it right. If the worst comes to the worst, you can disable hdrgen's built-in auto-exposure routine and do it by hand, but this is more difficult.
https://luminance.londonmet.ac.uk/webhdr/

You should also save the coefficients from a run that doesn't produce any errors/warnings and re-use them when you do get them. Please note that the response curve can differ even between two cameras of the same make/model, so don't mix-n-match them.

Axel

just a quick extra question about HDR and light sources:

supposing that with a camera we have taken some images,

1. we compute the HDR image
2. we measure the luminance on the Mac screen with ximage
3. we measure the luminance in the real scene
4. we compare the numbers and find a factor ( k = lum(hdr) : lum(real) )

is this factor independent of the spectrum of the light source once the
camera is 'calibrated', or is it related to it?
like in the case of: sun, low-pressure sodium, MH, mercury lamps, theatre
blue-filtered light...
is the correction factor (k) constant across these different conditions?
if not, is there a best calibration environment (D65 or something else)?
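As a sketch of that four-step comparison (every number below is invented, purely for illustration):

```python
# Sketch of steps 1-4: derive k = lum(hdr) / lum(real) per light source
# and see how constant it is. All values below are invented.
measurements = {
    # source: (luminance read off the HDR with ximage, meter reading), cd/m2
    "daylight": (412.0, 400.0),
    "halogen":  (205.0, 198.0),
    "LPS":      (140.0, 100.0),   # hypothetical narrow-spectrum outlier
}
factors = {src: hdr / real for src, (hdr, real) in measurements.items()}
spread = (max(factors.values()) - min(factors.values())) / min(factors.values())
# A small spread (say, under 10%) would mean one k works across sources;
# a large one means k depends on the source spectrum.
```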

many thanks,
giulio

路路路

-----Original Message-----
From: radiance-general-bounces@radiance-online.org
[mailto:radiance-general-bounces@radiance-online.org]On Behalf Of Axel
Jacobs
Sent: 10 June 2004 15:26
To: Radiance general discussion
Subject: Re: [Radiance-general] Fwd: errors from hdrgen

[...]

_______________________________________________
Radiance-general mailing list
Radiance-general@radiance-online.org
http://www.radiance-online.org/mailman/listinfo/radiance-general

___________________________________________________________________
Electronic mail messages entering and leaving Arup business
systems are scanned for acceptability of content and viruses.

Hi Pillo,

Giulio Antonutto wrote:

[...] we compare the numbers and find a factor ( k=lum(hdr):lum(real) ).

I think that there are a lot of reasons you will come up with a factor, and often they will vary (the factors, that is), making the factor an unreliable means of conversion. In zones of extreme contrast, it's quite difficult to sample a scene accurately with a handheld luminance meter with the same precision that pextrem -o can find the peaks. The factor derived from peak values is different from the factor derived from other values, I've found. Which is right? If the real scene and the HDR are within a certain percentage of each other, do you call them synonymous? I'd love to hear others weigh in on this topic.

···

----

Rob Guglielmetti

e. rpg@rumblestrip.org
w. www.rumblestrip.org

Hi Martin,

To get a reasonable fit to a particular camera, Photosphere (or hdrgen) needs more than just a few exposures. I find that it likes at least 7 exposures, better 9, of a wide dynamic-range scene. Each exposure should be separated by an f-stop or so by varying the shutter speed only. This is explained in the Photosphere quickstart_pf.txt file, which I have quoted on this list recently. (See my response to Barbara Matusiak on May 28 under the subject "conversion of digital pictures to Radiance?") Once you have established the response function for a particular camera, you can then reuse it for a shorter exposure sequence, or even a single image (though you should not expect a high dynamic-range result in that case). This storage and reuse is accomplished with hdrgen's -r option, or via the preferences file in Photosphere.

If the algorithm cannot arrive at a reasonable response function for your camera, Photosphere offers the option of applying a generic response function, instead. Although I don't recommend this if your goal is accuracy, the following response file will permit hdrgen a similar fall-back using the -r option:

2 1 0 0
2 1 0 0
2 1 0 0

This simply assigns each channel the polynomial f(x) = x^2, which roughly corresponds to a standard gamma curve. It's really a crude approximation, but since you are taking overlapping exposures, the global errors in the mid-exposure region are reasonably small.
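A small sketch of how such a response line could be read, assuming the format is the polynomial order followed by the coefficients from the highest power down (which matches the f(x) = x^2 reading; check the hdrgen documentation before relying on the exact format):

```python
# How a "2 1 0 0" line maps to f(x) = x^2, under the assumption that the
# format is: polynomial order, then coefficients from the highest power down.
def eval_response(line, x):
    nums = [float(v) for v in line.split()]
    order, coeffs = int(nums[0]), nums[1:]
    assert len(coeffs) == order + 1
    result = 0.0
    for c in coeffs:          # Horner's rule, highest power first
        result = result * x + c
    return result

# With pixel values normalized to [0, 1]:
assert eval_response("2 1 0 0", 0.5) == 0.25   # i.e. 0.5 ** 2
```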

By the way, I managed to convert both your sequences without complaints in Photosphere, though the three-image sequence is a bit blurry. Obviously, the alignment algorithm didn't quite work on this one. Unfortunately, it doesn't know when it's failed, so no errors or warnings are issued in most cases.

-Greg

P.S. In response to Pillo's inquiry, the luminance reported by ximage should correspond roughly to the luminance you would measure with a photometer, accounting for the photopic response of the probe's filter. However, you are definitely better off measuring something as close to white as possible, and preferably not a light source -- something more in the middle of the exposure range, like a grey card. The latest version of Photosphere even includes a calibration option to make this work easier.

···

From: Martin Matusiak <alex@juventuz.net>
Date: June 10, 2004 5:20:10 AM PDT

[...]

My question arises from some experiments measuring the same paper surface
under different light conditions:

Unfortunately, it appears that if the source has a really narrow spectrum,
this affects the results by as much as 40% with my camera (whose name I
cannot give... 001DNOKIN ;) ).

I am stress-testing the methodology to see if it is reliable enough to
substitute for luminance readings over a wide range of conditions with
consumer digital cameras (street lighting - interior lighting - daylighting).
I really want to underline the concept of 'wide use' (wide = cheap ;) ).

With this sort of equipment I would like to achieve an error within +-10%,
and therefore a correction factor that stays roughly constant (within
+-10%); is that feasible?

I found a very interesting product whose documentation touches on this question:
http://www.technoteam.de/pdf/lmkrollei_e.pdf

"
Measuring variations [1]:
halogen, D65, ... <5%
fluorescent lamp, MHN-T, ... <10%
[1] Due to spectral distributions deviating from CIE standard illuminant A
within the scene.
"

and also

"
Under certain circumstances, spectral distributions deviating from the
CIE standard illuminant A may cause bad measuring errors. The value
displayed can be corrected by applying an additional factor (Colour
Correction Factor, ccf) during recalibration in the software. The
calculation of the ccf value requires knowing the spectral distributions
within the measuring scene, or else using a mean correction factor...

...In addition, the user will also be provided with the relative spectral
sensitivity of the single colour channels and the resulting relative
spectral sensitivity of standard matrix formation. Thus, he will be able
to calculate other ccf values or other weighting factors.
"

This suggests to me, I am guessing, that calibration is really important,
that it must be performed under very strict and controlled conditions, and
that all the measurements should be taken within a reasonable range of
conditions similar to the calibration environment (if I calibrate with D65,
I cannot expect to measure LPS or HPS accurately, can I?).

Do you have any field experience about this?

thanks a lot,
giulio

PS - sorry for this long and labyrinth style email!

···

-----Original Message-----
From: radiance-general-bounces@radiance-online.org
[mailto:radiance-general-bounces@radiance-online.org]On Behalf Of Greg
Ward
Sent: 10 June 2004 16:23
To: Radiance general discussion
Subject: Re: [Radiance-general] Fwd: errors from hdrgen

[...]

Hi all,

I'm not sure about this, but the way I understand the calibration procedure,
you have three curves (one for each RGB color) that relate the pixel values
in the JPEGs to the luminance values. On the other hand, most cameras have
some built-in correction functions for different light sources, so I guess
each correction function (daylight, tubes, etc.) will have different
response curves for the three colors (in order to compensate for the
different source spectra). However, if you always use the same correction
function in the camera (not the auto-correction, which is changing all the
time; this really had me confused for a while), you'll get good results with
any light source (though probably there is still some error caused by the
RGB primaries being different from the CIE observer... is there?)

So, for example, if you do your calibration with the camera set up for
daylight, then you should always use it with the daylight correction, even
if you are measuring a scene lit with tubes. The images will look greenish,
but the values will be calculated according to the correct calibration
curves. I've tried some measurements (not over such a wide range of
conditions) using daylight compensation for an indoor scene, with good
results.
I hope this makes some sense; it got too long...
Saludos,

Santiago

···

-----Original Message-----
From: radiance-general-bounces@radiance-online.org
[mailto:radiance-general-bounces@radiance-online.org]On Behalf Of
Giulio Antonutto
Sent: Friday, June 11, 2004 1:06 AM
To: 'Radiance general discussion'
Subject: RE: [Radiance-general] Fwd: errors from hdrgen

[...]

Ah yes, I had been meaning to try the camera calibration technique but I
hadn't gotten around to it. It turns out to be very useful. But I still have
some issues. I took 13 pictures in all of an indoor scene looking out the
window for the calibration. I then had to remove the darkest and the
lightest of them to get an HDR image. That leaves about 8 images, and I do
get an HDR image, but there are errors in the conversion process.

The resulting HDR does not look very good. First of all, the alignment
algorithm always distorts it like crazy. The images were taken with a
tripod, so they should be reasonably well aligned, but the algorithm moves
them all over the place, so I disable that. I tried setting -g to kill the
ghosts, but that totally messed up the color balance of the image.

If I only turn off the alignment algorithm, the image still looks very
green/brown, however. I take it this isn't supposed to happen? I toggle the
exposure adjustment, but it makes no difference.

Pardon my ignorance, but is the HDR image supposed to look a lot like the
original? Is it a bad sign when the color balance is all wrong?

The images I took are enclosed. They were taken sequentially from lowest to
highest exposure, -2 to +2 EV in 1/3 steps. White balance was set to
"overcast" and should be the same for all; they were taken within 1 minute
of each other.
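For what it's worth, that bracket enumerates to 13 frames (a trivial sketch, nothing camera-specific):

```python
# The bracket described above, enumerated: -2 EV to +2 EV in 1/3 steps.
from fractions import Fraction

evs = [Fraction(i, 3) for i in range(-6, 7)]
assert len(evs) == 13                    # matches the 13-image sequence
# Relative exposure factor of each frame versus the middle one:
rel_exposure = [2.0 ** float(ev) for ev in evs]
```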

url: http://www.juventuz.com/_temp/for_calibration.tar.bz2

If you have more luck with them, I'd love to know what options were used in
the conversion process.

Martin

···

On Thursday 10 June 2004 17:23, Greg Ward wrote:

[...]

Martin,

Ah yes, I had been meaning to try the camera calibration technique but I hadn't come around to it. It turns out to be very useful. But I still have some issues. I took in all 13 pictures of an indoor scene looking out the window for the calibration. I then had to remove the darkest and the lightest of them to get a hdr image. That leaves about 8 images and I get a hdr image but there are errors in the conversion process.

I hdr'ed all of the 13 images in one go, and apart from some 'poor convergence' warnings (orders 1, 3, 4 and 5), it worked fine. There is no apparent distortion.

The resulting hdr does not look very good, first of all the alignment algorithm always distorts it like crazy. The images were taken with a tripod so they should be reasonably well aligned but the algorithm moves them all over the place. So I disable that. I tried setting -g to kill the ghosts but that totally messed up the color balance of the image.

As you say, the indoor scene is a bit greenish, but outdoors it is fine. There seems to be a lot of brown in the room (table, wall panelling); I don't know if this could cause the problem.

Axel

PS: I don't know if Greg has recently updated hdrgen. My copy is >6 months old (LINUX).

PPS: With the two darkest and the two brightest images excluded, there is a poor convergence for orders 1 and 5 fit, but still no distortions.

I hdr'ed all of the 13 images in one go, and apart from some 'poor
convergence' warnings (orders 1, 3, 4 and 5), it worked fine. There is
no apparent distortion.

I repeated the process using the manual mode of the camera rather than
aperture priority. Again 13 images (same room; the scene was a bit different
as I eliminated the sky from the framing), and I was able to compute an HDR.
But the alignment problem still persists, so I use the -a switch to turn it
off.

Later I took a couple more images and used the calibration data to compute
an HDR and read the luminance values off it. It turns out that those are
somewhat accurate, so that's a good sign!

However, I'm still not quite satisfied. Obviously, the calibration procedure
is meant to give the most accurate static description of the camera, but I
used the camera's automatic white-balance correction, so that's a variable.
My question is a complete novice one: how does the white balance affect the
luminance? What can I do to eliminate this variable (if it is significant)?

Martin

I forgot to mention that my "cvs build" of hdrgen is from last week, so there
could have been changes from the version you have.

Martin

···

---------- Forwarded Message ----------

Subject: Re: [Radiance-general] Fwd: errors from hdrgen
Date: Friday 11 June 2004 14:17
From: Martin Matusiak <alex@juventuz.net>
To: Radiance general discussion <radiance-general@radiance-online.org>

[...]

-------------------------------------------------------

Hi Martin,

I believe that hdrgen isn't coming up with a good response function for your camera with this scene, probably due to the strong color cast of the walls. Since hdrgen (and Photosphere) compute the responses for each of the RGB channels, using a lot of colorful patch samples can skew the result. I've never seen such an extreme example before, though -- most walls are off-white so this hadn't come up. Come to think of it, the tendency of cameras to super-saturate their colors could spell trouble in such cases, and I probably should modify my algorithm to avoid colorful patch samples.

Thanks for the image set -- I'll run some experiments on my end to see if I can improve on this. It is important to fix the white balance whenever you take an HDR sequence, as the camera in auto white mode will alter the coefficients between exposures otherwise, making it impossible to get a consistent result.

-Greg

···

From: Martin Matusiak <alex@juventuz.net>
Date: June 11, 2004 5:17:29 AM PDT

[...]

Hi again.

Well, I played around with limiting the saturation on patches, and it only seems to make matters worse; I'm not sure why. I think it's because you don't have enough unsaturated pixels in your scene for it to work, but I'll have to perform some more experiments to determine that for certain. For now, you'll just have to find a more "neutral" scene to perform your calibration on, then save the response function and reuse it for other scenes, which is what I've been doing.

-Greg

···

From: Martin Matusiak <alex@juventuz.net>
Date: June 11, 2004 5:21:55 AM PDT

[...]

Hi Pillo,

Basically, I'm writing to agree with Santiago. If you want repeatable measurements, you have to fix your camera's white balance to a particular setting. Otherwise, the response curves for the three channels will change. Technically, the camera merely applies a different multiplier to its three channels, but it does this in a color space other than RGB -- a space we cannot know after the image has been recorded in a JPEG or TIFF. Because we cannot know when and how the camera's gamma/contrast curve is applied relative to its color transformation(s), it is not possible to back this out. The best we can do is keep the color space transitions constant by fixing the white balance and contrast settings on the camera, and let hdrgen do its best to estimate the response function on the result.
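A toy numeric illustration of why the gain cannot be backed out (the sRGB-style curve below is only a stand-in for the camera's unknown processing, and the gain value is made up): a multiplier applied before a curve with a linear toe is not equivalent to any single multiplier applied after it.

```python
# Toy illustration: a white-balance gain applied before a non-power-law
# curve cannot be undone by any gain applied after it. An sRGB-style
# transfer curve stands in for the camera's unknown response.
def srgb(x):
    # linear toe below the breakpoint, power law above -- not a pure gamma
    return 12.92 * x if x <= 0.0031308 else 1.055 * x ** (1 / 2.4) - 0.055

gain = 1.5               # hypothetical per-channel white-balance multiplier
a, b = 0.002, 0.4        # two raw sensor values in the same channel

# The "after the curve" gain that would be needed differs from value to
# value, so no single post-hoc multiplier reproduces the camera's correction:
k_a = srgb(gain * a) / srgb(a)   # both values on the linear toe
k_b = srgb(gain * b) / srgb(b)   # both values on the power segment
```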

-Greg

···

From: Giulio Antonutto <Giulio.Antonutto@arup.com>
Date: June 10, 2004 9:06:09 AM PDT

[...]