ambient setting for very large scenes

Hi,

I'm struggling a little with setting my ambient parameters.

Thing is that my scene is far larger (~10000m) than my area of interest
(~90m), which is an interior scene.

If I follow the logic of the 'setting Radiance ambient parameters' part of Lash's thesis, setting -ar should follow this formula:

minimum_distance_between_samples = (aa * max_scene_dimension) / ar

Which in my scene would lead to a -ar setting of ~5000, which seems absurd to me.
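(For what it's worth, the arithmetic behind that number: assuming the default -aa of 0.15 and a desired minimum spacing of roughly 0.3 m between ambient samples (both figures are just my assumptions), ar = 0.15 * 10000 / 0.3 = 5000.)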

So what would be the appropriate thing to do here?

Many thanks,

-jelle

Jelle Feringa / EZCT Architecture & Design Research wrote:

I’m struggling a little with setting my ambient parameters.

Join the club! =8-)

Which in my scene would lead to a -ar setting of ~5000, which seems absurd to me.

So what would be the appropriate thing to do here?

Absurd it is. This is a problem when you have a scene like yours. You want some distant object or scenery to be visible out the window, for example, so you put it in your model, and then the images look really flat or have light leaks because the -ar isn't high enough. One thing you could do is cheat, and place the distant objects closer to your area of interest than they really are, scaling them up a bit. This works for treelines and things like that, anyway. Another option is to exclude the distant objects from the ambient calculation with the -ae option. The problem there (and maybe some folks here have input on this) is that your excluded objects are outside and your area of interest is inside, so your -av is likely to be too low for the exterior objects (the ambient value is how all the indirectly lit surfaces derive their radiance information when their materials are on the -ae list).
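Just so we're talking about the same thing, the exclusion would look something like this; the modifier names are made up, and the -av is the part you would have to fiddle with, per the above:

    rpict -vf view.vf -ab 2 -ar 64 -av .1 .1 .1 \
          -ae ext_terrain -ae ext_buildings scene.oct > out.hdr

(There is also -aE file if the exclusion list gets long.)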

-ar 5000 would take forever to render!

Good luck, Jelle.

- Rob

Hi Rob and Jelle,

Although I have not tried this, what about rendering a light probe of the exterior environment (using parameters suitable for an exterior scene) and then using the light probe to supply the background for the interior scene?
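The probe itself would just be an angular fisheye rendering of the exterior model taken from near the window, something like the line below, where the view point and parameters are only placeholders. Mapping the resulting picture back onto a background source for the interior run is the part I have not tried, so treat this as a sketch of the first half only.

    rpict -vta -vh 180 -vv 180 -vp 15 2 1.5 -vd 0 -1 0 -ab 1 exterior.oct > probe.hdr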

-Jack


Jack de Valpine wrote:

Hi Rob and Jelle,

Although I have not tried this, what about rendering a light probe of the exterior environment (using parameters suitable for an exterior scene) and then using the light probe to supply the background for the interior scene?

Hiya Jack,

This was discussed at length on this list a couple years ago, I started the thread and I think my feeble understanding of lightprobes caused the discussion to get a bit scattered. Bottom line: there are issues with this approach, especially if you are going for numeric accuracy, but I also think that Greg's mksource addition may have solved some of that (as I understand it, mksource is sorta like a "mkillum for lightprobes").

I still do not have a fisheye lens, so my lightprobe experiments... have not begun. Jack, I was really impressed with your presentation at the Workshop (actually, I was impressed with all of the ones I've read so far!); it renewed my interest in playing around with that stuff.

Sliding off the topic a bit, has everyone seen the new Canon PowerShot G7 that's coming out next month? It looks really nice, with film emulation up to ASA1600, and the all-important autobracket for us HDR heads, in a compact size (and a separate film speed dial!). With a 10mp sensor, I'm really curious to see how well the ASA1600 pics come out, but even a low-noise 800 would make me really happy. Guess I'll wait 'till the folks at www.dpreview.com put it through its paces...

Anyhoo, the thread I mentioned way back in the first paragraph of this reply begins here:
http://www.radiance-online.org/pipermail/radiance-general/2003-May/000818.html

The discussion spills over into June, so look for those too. For some reason the posts don't appear continuously even when sorted by thread.

- Rob

Hey Rob,

Well there you go. However, we do not know exactly what Jelle's objectives are with respect to numeric accuracy. I would suggest that anyone interested go back and read both the May and June threads, as Rob points out. It is a very interesting discussion and suggests mapping directly onto the interior window surfaces to get both the environment mapping and the distribution output; I had been thinking of mapping onto a source.

I still do not have a fisheye lens so my lightprobe experiments... have not begun. Jack, I was really impressed with your presentation at the Workshop (actually, I was impressed with all of the ones I've read so far!), it renewed my interest in playing around with that stuff.

Thanks! We are working on another night lighting project now where I hope to use similar techniques. One thing that is intriguing from a design standpoint is that when the architect says they want to try out some different things, you can run rtcontrib once with all the needed sources (including optional and "what if" ones) and then generate various images showing the possible scenarios as a result.
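Just to sketch the mechanics; the fixture modifier names, view file and resolution below are made up, and the pcomb weights would come from whatever dimming or switching scenario is being tested:

    vwrays -ff -vf view.vf -x 512 -y 512 \
      | rtcontrib -ffc `vwrays -vf view.vf -x 512 -y 512 -d` \
          -o contrib_%s.hdr -m fixture_a -m fixture_b scene.oct

    # recombine without re-tracing, e.g. fixture_b dimmed to 50%:
    pcomb -s 1 contrib_fixture_a.hdr -s .5 contrib_fixture_b.hdr > scenario_1.hdr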

Anyhoo, the thread I mentioned way back in the first paragraph of this reply begins here:
http://www.radiance-online.org/pipermail/radiance-general/2003-May/000818.html

This is a great thread, I recommend it....


Hi,

I just have to quote Greg, who replied in great detail to a similar question I posted a few days ago:

> One other note -- you might try setting -ar 0 in your case.
> If you have really large and small geometry that require
> enormous -ar settings, you may be better off without it.

I have a scene here that would also require an incredibly high -ar, and I got processes close to 2GB (which will always lead to slow rendering on today's machines).
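For completeness, that just means dropping the resolution limit on the rpict command line; the other values below are only placeholder interior settings, not recommendations:

    rpict -vf view.vf -ab 2 -aa .15 -ad 1024 -ar 0 scene.oct > out.hdr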

Good luck! Lars.

Hi Jelle,

Well here is a question. I am not sure how large the interior spaces are that you are trying to show, but wouldn't using illums (mkillum) at the openings enable you to get a reasonable (for the purposes of this study) approximation of the ambient light while using lower overall ambient settings?
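In case it helps, here is the rough shape of what I mean, assuming the glazing sits in its own windows.rad; all the other file names and numbers are placeholders:

    oconv exterior.rad room.rad > outside.oct
    mkillum -ab 1 -ad 1024 -as 64 outside.oct < windows.rad > window_illums.rad
    oconv exterior.rad room.rad window_illums.rad > final.oct
    rpict -vf view.vf -ab 1 -aa .15 -ar 16 final.oct > interior.hdr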

-Jack

Jelle Feringa / EZCT Architecture & Design Research wrote:

Hi!

Thanks so much for the overwhelming replies!

-ar 5000 would take forever to render!

Actually, it wasn't *that* bad! An hour and a half (fairly lo-res, but good enough for its purpose) isn't that dramatic, so I suppose the rule of thumb for getting a reasonable -ar setting really is quite OK! Still, I'm sure there are more intelligent strategies than brute-force computing power...

I'm keen to get fairly optimal ambient settings since I'm producing a sun-path study, so I need to render quite a lot of imagery here...

//if fellow pythonistas on the list are interested in my script, let me know; it would be interesting to have a solid Python script for sun-path studies that runs beautifully cross-platform and on multi-core machines...
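//in essence it is just a loop over dates and hours; a bare-bones shell version would look like the sketch below, where the site values and file names are placeholders and skyglow.rad holds the usual skyfunc glow / source sky and ground definitions (the real script also spreads the renders over the cores):

    for hour in 9 12 15; do
        gensky 6 21 $hour +s -a 48.86 -o -2.35 -m -15 > sky_$hour.rad
        cat skyglow.rad >> sky_$hour.rad
        oconv -f sky_$hour.rad building.rad > run_$hour.oct
        rpict -vf view.vf -ab 2 -ad 1024 -aa .15 run_$hour.oct > sun_$hour.hdr
    done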

Good luck, Jelle.

Thanks Rob!

Hi Rob and Jelle,

Although I have not tried this, what about rendering a light probe of the exterior environment (using parameters suitable for an exterior scene) and then using the light probe to supply the background for the interior scene?

-Jack

Hey Jack!

That's an interesting suggestion!

Although, since I'm producing a sun-path study, HDR isn't really the way to go; it'd be fairly work-intensive ;')

Interesting idea though, did you apply this technique perhaps in some of the work shown on your site?

Well there you go. However, we do not know exactly what Jelle's objectives are with respect to numeric accuracy.

It's in between; my goal is to give architects accurate feedback about the lighting conditions in the project they're working on. Their interest is more visual than numerical. Missing a couple of lux isn't too problematic; the idea is to be able to communicate the lighting conditions in the building in a visual and fairly accurate manner.

Hi Lars!

> One other note -- you might try setting -ar 0 in your case.
> If you have really large and small geometry that require
> enormous -ar settings, you may be better off without it.

That's a very interesting suggestion!

Will try that right away!

Cheers,

-jelle


Jack suggests a good trick. I've often resorted to mkillum with challenging exteriors, setting -aa 0 for the mkillum process and then switching the ambient cache back on for the interior renderings (as needed). This is what we did on the New York Times model (sponsored by NYSERDA).
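In command terms the split is simply something like this, with the file names as placeholders and the illums folded into the final octree; the settings are for illustration only:

    mkillum -ab 2 -ad 2048 -as 512 -aa 0 exterior.oct < windows.rad > window_illums.rad
    rpict -vf interior.vf -ab 1 -aa .15 -ar 16 final.oct > interior.hdr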

-Greg


Greg, can that 3D model of the NY Times be made accessible to us?

Martin


Hi Martin,

It's up to LBNL and the people who funded it, I suppose. You should write to Eleanor to ask. I have no problem sharing the model, but bear in mind that it's more like a set of scripts than a model. We only ever look at one floor at a time, since constructing the whole tower with all those thousands of tubes would be a waste of resources. Also, there is the urban model to consider. I'm not sure we even have the rights to distribute that.

-Greg
