# Integral of radiation at one point - spherical illuminance

Thanks Giulio,

It's an interesting thought; I am interested in values rather than images though.

The spherical sensor approach seems the best for what I am trying to achieve, so I'll see if I manage to implement that; otherwise the composite cubical sensor is the quick and dirty road...

My initial confusion arose because I thought that six right-angle sensors would give me a perfectly spherical sampling of the space; now I understand that's not the case, and how to go about this.

Thanks everybody for the ideas and guidance,

Best,

G

···

From: giulio antonutto [mailto:antonutto@yahoo.it]
Sent: 09 May 2012 08:55

Giovanni,

I think there is another way, which requires some more fiddling, but that can give you a lot more flexibility.

Especially with visualisations.

I would start by taking six 90° wide angular images of luminance from the observer position.

You could use vwrays to work with rtrace.

Then derive the illuminance at the view point, since you know the solid angle of each pixel/direction and its luminance.

See the IESNA book for details. You could write a little script with rcalc, python or octave.

I guess Andy or Greg would do it all in one line with sed/awk.

Once done, you could have polar maps of your illuminance components, directionality of lighting in the space, etc.

To start playing with the idea you could do as Mark suggests: six illuminance values are good, as you could easily plot the vector illuminance to visualise the directionality of lighting.

This is a useful metric for museums.
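A quick sketch of the six-value route (the names are my own, not from any script mentioned here): with six cubical illuminance values, the illuminance vector is just the difference of each pair of opposing readings, and its magnitude and direction describe the flow of light.

```python
# Illuminance vector from six cubical illuminance values (lux),
# one per axis direction. The vector component along each axis is
# the difference between the two opposing sensor readings.
def illuminance_vector(e_pos_x, e_neg_x, e_pos_y, e_neg_y, e_pos_z, e_neg_z):
    return (e_pos_x - e_neg_x, e_pos_y - e_neg_y, e_pos_z - e_neg_z)

# Made-up example: daylight mostly from above and a little from one side.
ex, exn = 120.0, 80.0     # +x / -x
ey, eyn = 100.0, 100.0    # +y / -y
ez, ezn = 500.0, 60.0     # +z (up) / -z (down)
vec = illuminance_vector(ex, exn, ey, eyn, ez, ezn)
mag = sum(c * c for c in vec) ** 0.5
print(vec, mag)   # vector (40.0, 0.0, 440.0), magnitude ~441.8
```

Plotting that vector at each sample point gives exactly the kind of directionality map mentioned above.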

Either way, have fun!

G

On 8 May 2012, at 14:16, Giovanni Betti wrote:

Dear Martin, Minki,

Thanks for the replies. I guess what I was trying to calculate (and what Dr. Martin refers to) is better called cubical illuminance; the script Greg shared with Minki (thanks as always, Greg!) allows sampling a full sphere in one go.

I think I have my ideas a lot clearer now; I'll just need to implement one of the approaches.

Thanks,

Giovanni

From: Moeck, Dr. Martin [mailto:m.moeck@osram.com]
Sent: 08 May 2012 12:52

Hi Giovanni,

That is called spherical illuminance. As an approximation, you could calculate six illuminance values (up, down, east, west, north, south) and average them. Christopher Cuttle wrote a few papers on spherical illuminance.
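In code the approximation is just an average of the six readings (a sketch of my own; note it is exact only for a uniform luminous field, which is why it is an approximation for directional lighting):

```python
import math

def approx_spherical_illuminance(readings):
    """Average of six cosine-corrected illuminance readings (lux)
    taken facing up, down, east, west, north and south."""
    assert len(readings) == 6
    return sum(readings) / 6.0

# For a perfectly uniform luminous field of L cd/m^2, every cosine
# sensor reads pi*L lux, so the average equals the exact mean
# spherical illuminance pi*L; error appears only when the lighting
# is directional.
L = 100.0
uniform = [math.pi * L] * 6
print(approx_spherical_illuminance(uniform))   # pi * 100
```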

Regards

Martin Moeck

OSRAM

________________________________

From: Giovanni Betti [mailto:gbetti@fosterandpartners.com]
Sent: Tuesday, May 08, 2012 1:42 PM

Dear list,

I have a question that I hope you'll help me get my head around.

I want to calculate the overall illuminance at a point in space, that is, regardless of directionality.

I have made some simplified 2d sketches for clarity.

As I understand it, a Radiance sensor point in rtrace will have cosine-related sensitivity (image01).

If I place two coincident sensors with opposing normals (image2), I'll miss contributions from the sides.

Rotating the normals by 90 degrees at a time (figure 3) and summing contributions might not work either, because it will overestimate diagonal contributions (figure 4).
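That overestimate is easy to see numerically (my own illustration, not from the sketches attached here): a small source on a sensor axis is counted once, but the same source on a diagonal is seen by two cosine sensors at once and gets counted √2 times.

```python
import math

def summed_cosine_response(direction):
    """Sum of cosine-law responses of six sensors with normals along
    +/- x, y, z for a small source in the given unit direction; a
    sensor only sees the source when the cosine is positive."""
    x, y, z = direction
    normals = [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
               (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    return sum(max(0.0, x * nx + y * ny + z * nz) for nx, ny, nz in normals)

on_axis = summed_cosine_response((1.0, 0.0, 0.0))
s = 1.0 / math.sqrt(2.0)
diagonal = summed_cosine_response((s, s, 0.0))
print(on_axis, diagonal)   # 1.0 vs ~1.414: the diagonal source is over-counted
```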

So I'm not getting too much closer to the solution...

Is there something that I am missing here?

Any light on this will be appreciated,

Best,

Giovanni Betti

_______________________________________________

Just a quick note about the images: with vwrays you can get the directions and origins of the picture vectors, so you can use a view file and create a grid.
Then, as usual, with rtrace and that grid you can get values.
The power of using vwrays is that it allows you to sample the space around you in all directions, so as to have a spherical projection of the illuminance contribution at your position.
Imagine now plotting all the values on a spherical surface and scaling the mesh according to the values.
You would get some weird-looking solid which embeds the directionality of light: a light signature of a space/point pair.
But there is a lot of fiddling for probably just a cool visualisation.
Ah, those were the times…
If only I had the time now.
G
