BTW, I found some references on direct HDR capture of the sun by Jessi Stumpfel:
Stumpfel, J., Jones, A., Wenger, A., Tchou, C., Hawkins, T., & Debevec, P. (2004). Direct HDR Capture of the Sun and Sky.
http://gl.ict.usc.edu/Data/skyprobes/skycapture_poster.pdf
Stumpfel, J., Jones, A., Wenger, A., & Debevec, P. (2004). Direct HDR capture of the sun and sky. Paper presented at the 3rd
International Conference on Virtual Reality, Computer Graphics, Visualization and Interaction in Africa, Cape Town, South
Africa.
Stumpfel, J. (2004). HDR Lighting Capture of the Sky and Sun. (Master of Science Thesis), California Institute of Technology.
Retrieved from http://gl.ict.usc.edu/Data/skyprobes/ms_thesis_stumpfel.pdf
- Joe
On Tue, Jul 1, 2014 at 3:06 AM, Christian Humann <chris@christianhumann.com> wrote:
Hi Joe,
The global values in the CSV file are in footcandles, so you will need to multiply them by 10.76 to get lux.
Also, you'll need to add a source description for the sun in order to get the solar contribution, since the camera
sensor cannot capture the full intensity of the sun in the HDR image. Essentially, the HDR image gives you a close
approximation of the global diffuse value. I use gendaylit (see below) to generate the sun and sky scene. You can get
the altitude, azimuth, direct-normal illuminance and diffuse-horizontal illuminance from the CSV file (be sure to
multiply the latter two values by 10.76 to convert them from footcandles to lux for input into gendaylit). Also be sure
to adjust your rtrace results by dividing by 10.76 to get footcandles if you want to compare to the
global-horizontal-illuminance readings in the CSV file.
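For example, the -L values in the gendaylit line below, 80463.28 lx and 19916.76 lx, are just the CSV's
direct-normal and diffuse-horizontal readings of 7478 and 1851 footcandles multiplied by 10.76. If you want to
script the conversion, something like the following works (just a sketch: the file name and the column numbers
$4 and $5 are placeholders, so check your CSV layout first):

tr ',' ' ' < skycam_140621.csv | rcalc -e '$1=10.76*$4;$2=10.76*$5'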
######## Sun and sky scene -----> global.rad
!gendaylit -ang 45.41 85.92 -w -O 0 -L 80463.28 19916.76 | xform -e -rz 0
void colorpict skypict
11 red green blue 140621_1530.hdr fisheye.cal fish_u fish_v -rx 90 -rz 180
0
0
skypict glow skyglow
0
0
4 1 1 1 1
skyglow source sky
0
0
4 0 0 1 180
skypict glow groundglow
0
0
4 1 1 1 0
groundglow source ground
0
0
4 0 0 -1 180
#########
######## cmd.sh
oconv ./global.rad > ./scene_empty.oct
echo '0 0 0 0 0 1' | rtrace -I -h -w -ab 1 -oov ./scene_empty.oct > ./results_position_irradiance_RGB_wm2.txt
cat ./results_position_irradiance_RGB_wm2.txt | rcalc -e '$1=179*(0.265*$4+0.670*$5+0.065*$6)/10.76' > ./results_illuminance_fc.txt
##########
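For reference, the rcalc expression applies Radiance's usual irradiance-to-illuminance conversion: the 179 lm/W
white-light luminous efficacy times the 0.265/0.670/0.065 RGB weights gives lux, and the final division by 10.76
turns that into footcandles so it can be compared directly with the meter. A stand-alone check (just a sketch, not
part of the workflow; the 35/40/30 W/m2 triple is made up):

echo '0 0 0 35 40 30' | rcalc -e '$1=179*(0.265*$4+0.670*$5+0.065*$6)/10.76'
# prints about 633 footcandles (roughly 6806 lux)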
When I run the above I get a global horizontal illuminance value from 'rtrace' of approx. 6900 footcandles. The
photometer gave a reading of 7176 footcandles.
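That puts the simulation within about 4% of the measurement; rcalc can do the arithmetic if you want the exact
figure (using the two readings above):

rcalc -n -e '$1=100*(7176-6900)/7176'
# about 3.8 (percent below the photometer reading)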
I'm still wrapping my head around all this as well, and I hope these discussions will foster a better understanding
of how to use the HDR images with the highest level of accuracy possible.
Hope this helps.
Best,
Chris
On Jun 28, 2014, at 10:06 AM, Joe Smith <the.oat.cracker@gmail.com> wrote:
Hi, I found two references and did a test to generate an HDR image-based rendering; the steps are explained below.
But I'm still scratching my head trying to understand how the Cartesian coordinates (the position vector of a point
on the sky) are transformed into the UV coordinates of the fisheye image, as shown in the "angmap.cal" file. Any
advice is greatly appreciated!
Thanks!
Joe
References:
1. Debevec, P. (2002). Image-based lighting. IEEE Computer Graphics and Applications, 22(2), 26-34. doi:
10.1109/38.988744
2. Au, P. Y. P. (2013). HDR Luminance Measurement: Comparing real and simulated data. (Master of Building
Science Thesis), Victoria University of Wellington.
Steps:
Step 1. Prepare the following six files and put them in the same folder.
#### 1.1 geom.rad ################################################
red_plastic sphere ball
0
0
4 2 2 0.5 0.5
steel sphere ball1
0
0
4 2 -2 0.5 0.5
gold sphere ball2
0
0
4 -2 -2 0.5 0.5
white_matte sphere ball3
0
0
4 -2 2 0.5 0.5
crystal sphere ball4
0
0
4 0 0 1 1
!genbox gray_plastic pedestal_top 8 8 0.5 | xform -t -4 -4 -0.5
#### 1.2 materials.mat ################################################
void plastic red_plastic
0
0
5 .7 .1 .1 .06 .1
void metal steel
0
0
5 0.6 0.62 0.68 1 0
void metal gold
0
0
5 0.75 0.55 0.25 0.85 0.2
void plastic white_matte
0
0
5 .8 .8 .8 0 0
void dielectric crystal
0
0
5 .5 .5 .5 1.5 0
void plastic black_matte
0
0
5 .02 .02 .02 .00 .00
void plastic gray_plastic
0
0
5 0.25 0.25 0.25 0.06 0.0
#### 1.3 sky_and_ground.rad ################################################
void colorpict hdr_image
7 red green blue 140621_1530.hdr angmap.cal u v
0
0
hdr_image glow sky_glow
0
0
4 1 1 1 0
sky_glow source HDR_sky
0
0
4 0 0 1 180
# ground
void glow ground_glow
0
0
4 1 1 1 0
ground_glow source ground
0
0
4 0 0 -1 180
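The colorpict above maps each pixel of the HDR capture onto the sky glow. The pixel values in a Radiance picture
are radiance in W/sr/m2 per RGB channel, and luminance follows from the same 179 lm/W efficacy and
0.265/0.670/0.065 weights used elsewhere in this thread. A rough stand-alone check on the capture itself (a sketch
only: it averages the whole frame, including the black border outside the fisheye circle):

pvalue -o -h -H -d 140621_1530.hdr | rcalc -e '$1=179*(0.265*$1+0.670*$2+0.065*$3)' | total -m
# mean pixel luminance of the capture, in cd/m2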
#### 1.4 angmap.cal ################################################
{
angmap.cal
Convert from directions in the world (Dx, Dy, Dz) into (u,v)
coordinates on the light probe image
+z is up (toward top of sphere, i.e. the zenith)
+y is North
}
d = sqrt(Dx*Dx + Dy*Dy);
r = acos(Dz)/PI;
u = if(d, 0.5 - Dx/d * r, 0.5);		{ guard against d = 0 at the zenith }
v = if(d, 0.5 + Dy/d * r, 0.5);
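In words: acos(Dz) is the angle from the zenith, so r = acos(Dz)/PI grows linearly from 0 at the image center (the
zenith) to 0.5 at the horizon, i.e. the rim of a 180-degree equidistant fisheye frame, while (Dx/d, Dy/d) carries
the azimuth; the minus sign on u flips that axis (nominally east-west, assuming +x is East) to match the
orientation of the upward-looking capture. A quick numerical check with rcalc (the test direction is arbitrary):

echo '0 0.7071 0.7071' | rcalc -e 'Dx=$1;Dy=$2;Dz=$3;d=sqrt(Dx*Dx+Dy*Dy);r=acos(Dz)/PI' -e '$1=0.5-Dx/d*r;$2=0.5+Dy/d*r'
# a direction 45 degrees above the northern horizon lands at (u,v) = (0.5, 0.75),
# halfway between the zenith (image center) and the horizon (image edge)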
#### 1.5 view.vf ################################################
# looking towards east
#rvu -vtv -vp -12 0 0.5 -vd 1 0 0 -vu 0 0 1 -vh 60 -vv 40
# looking towards west
#rvu -vtv -vp 12 0 0.5 -vd -1 0 0 -vu 0 0 1 -vh 60 -vv 40
# looking towards north
rvu -vtv -vp 0 -12 0.5 -vd 0 1 0 -vu 0 0 1 -vh 60 -vv 40
# looking towards south
#rvu -vtv -vp 0 12 0.5 -vd 0 -1 0 -vu 0 0 1 -vh 60 -vv 40
#### 1.6 cmd.sh ################################################
oconv ./materials.mat ./sky_and_ground.rad ./geom.rad > ./scene.oct
rvu -vf ./view.vf ./scene.oct
#ximage ./sky.hdr
rpict -x 2400 -y 2400 -t 30 -ab 1 -ar 50000 -aa 0.08 -ad 128 -as 64 -st 0 -lw 0 -lr 8 -vf ./view.vf ./scene.oct > ./image.hdr
pfilt -1 -x /3 -y /3 -r 1 ./image.hdr > ./image_filtered.hdr
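Optional extras for inspecting the result (not part of the original steps; the falsecolor scale is a guess):

ximage ./image_filtered.hdr
falsecolor -i ./image_filtered.hdr -s 10000 > ./image_falsecolor.hdr
ximage ./image_falsecolor.hdr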
Step 2. Put the 140621_1530.hdr file provided by LBNL (http://flexskycam.lbl.gov) in the same folder.
Step 3. Run the cmd.sh script to produce the rendering.
On Sat, Jun 28, 2014 at 7:28 AM, Andrew McNeil <amcneil@lbl.gov> wrote:
Hi All,
I haven't done any testing myself; I wanted to make the data available right away so that others
could tinker too (and maybe make it easier for me).
Joe - You're correct that the HDR sky image essentially replaces the skyfunc modifier; here's a thread where
Kyle was doing the same thing:
http://www.radiance-online.org/pipermail/radiance-general/2012-October/008962.html
Rob - In a clear-sky condition our HDR images won't capture the full luminance of the sun. mksource would be
helpful for zeroing out those pixels, but the source it makes won't be useful without adjusting its radiance
to match that of the sun.
Best,
Andy
On Fri, Jun 27, 2014 at 2:58 PM, Rob Guglielmetti <rob.guglielmetti@gmail.com> wrote:
Hi German, and everyone else. Certainly one could use these HDR images to generate sky vectors
and apply them to daylight coefficients for a given model (or models). Greg Ward has created a cool tool
called mksource to facilitate this process in Radiance: it identifies small, intense pixels in the
image, creates and places Radiance light sources in their stead, and zeroes out those pixels to
avoid double counting.
Considerations:
- Capturing the true (full) dynamic range of an exterior scene with direct sun is difficult.
- Using locally-captured HDR images for daylight availability analysis is statistically dubious.
Granted, so is using TMY data, for different reasons. This is why I changed the title of this list to
"considerations", from "problems". =)
On Fri, Jun 27, 2014 at 3:38 PM, CHI-German Molina <gmolina@hdlao.com> wrote:
Wow, I have been thinking about doing this for a while... although I have no idea where to
start.
Is it possible to calculate the daylight coefficients of the building, and then use the HDR image to
generate sky vectors and evaluate different options for optimizing daylighting?
I am picturing a computer that, every 5 minutes, calculates the sky vector, computes the interior
lighting conditions, and simulates the different lighting options, performing whole-building
lighting control with no photosensors. Even more, maybe a whole neighborhood could use the same
camera. Nonsense?
Thanks for sharing!
2014-06-27 0:59 GMT-04:00 Joe Smith <the.oat.cracker@gmail.com>:
Hi Andy, thanks for sharing LBNL's sky mapping experiment!
Can you kindly advise on resources that explain how to use an HDR sky image for daylight
simulation? Does it involve specifying the HDR image, rather than a "skyfunc", as the
modifier for the sky geometry? And how is the pixel value of a given point in the
HDR image converted to the luminance of the corresponding position in the sky?
Thanks!
- Joe
On Friday, June 27, 2014, Andrew McNeil <amcneil@lbl.gov> wrote:
Hi Everyone,
LBNL has installed an HDR sky camera at our new FLEXLAB
site: http://flexskycam.lbl.gov. I've uploaded sample data, including HDR
images and CSV data files, recorded by the camera for three days over the past
week (clear, partly cloudy and overcast). We're happy to share more data with
other researchers and daylight practitioners (but we don't have much to offer
yet).
The images can be mapped to a Radiance sky for simulation under real sky conditions.
I have not used the sky HDR images yet, myself, so if anybody uses them successfully
please report back and share what you've done!
Questions about the camera hardware and capabilities should be directed to Chris
Humann at Terrestrial Light.
Best,
Andy
--
Germán Molina L.
Trainee Engineer
Hunter Douglas Chile S.A.
Mobile +569 89224445
_______________________________________________
Radiance-general mailing list
Radiance-general@radiance-online.org
http://www.radiance-online.org/mailman/listinfo/radiance-general