Prague (A. Wilkie) spectral sky model integration in Radiance?

Dear all,

I want to make a couple of render tests at sunrise/sunset using the recently developed sky model from the Prague team.

The second version is provided for simplified integration into rendering platforms such as Radiance.

A C source code file is provided, and it works like this:

  • You have to initialize a structure (the state).
  • This structure loads, in the background, the data file containing the information needed by the sky model.
  • Then there are three functions to query the model, each taking this initialized structure:
// Computes sky radiance arriving at the view point.
double arpragueskymodelground_sky_radiance(
	const ArPragueSkyModelGroundState  * state,
	const double                   theta,
	const double                   gamma,
	const double                   shadow,
	const double                   wavelength
	);

// Computes solar radiance arriving at the view point (i.e. including transmittance through the atmosphere).
double arpragueskymodelground_solar_radiance(
	const ArPragueSkyModelGroundState  * state,
	const double                   theta,
	const double                   wavelength
	);

// Computes transmittance along a ray segment of a given length going from the view point along the view direction. Can be used e.g. for computing attenuation of radiance coming from the nearest intersection with scene geometry, or of radiance coming from outside the atmosphere (just use a huge value for the distance parameter; that is how it is done internally for the solar radiance).
double arpragueskymodelground_transmittance(
	const ArPragueSkyModelGroundState  * state,
	const double                   theta,
	const double                   wavelength,
	const double                   distance
	);

I don’t really know how to integrate this correctly into Radiance…

We could do the same as gendaylit and generate a file of some kind. But the problem is how to instantiate this C state object and handle the data file correctly (it does not look doable in .cal files).

I’m a bit of a newbie with Radiance, so I have possibly missed something…

Do you have any suggestions to help me?

Hi Erwann,

Given the complexity of this model, you would do best to use their C functions to create an HDR (Radiance) picture by sampling 3 representative wavelengths for RGB (say, 604, 519, and 467 nm) over the sky dome.

I am not sure about the arguments to the arpragueskymodelground_sky_radiance() call, but you can compute the ray directions corresponding to a fisheye image using the Radiance “vwrays” program at an appropriate resolution. You can then write your own tool to take these vectors, call arpragueskymodelground_sky_radiance() to compute the RGB values, and send them to the “pvalue” tool with -r to convert to a Radiance picture.

You can then use a simple .cal file such as the one below to map the image into your scene. It assumes you used vwrays with the -vta option, and may need to be rotated for a Z-up vector:

{
	Calculate coordinates for a 180 degree fisheye lens.
	Assume view direction is (0,1,0), view up (0,0,1), (1,0,0) right.
}

fish_Rxz = sqrt(Dx*Dx + Dz*Dz);
fish_Ry = acos(Dy) / PI;

fish_u = .5 + Dx/fish_Rxz * fish_Ry;
fish_v = .5 + Dz/fish_Rxz * fish_Ry;

If you have trouble with the details, Chapter 4 of Rendering with Radiance may help, or you can follow up with specific questions here.


Thank you very much. This is a simpler approach than the one I had envisaged, and I was not sure whether it was doable with Radiance (or how to do it!).

It looks not too complicated to achieve, then. Let me give it a try and come back if needed.

For my application, I need to run simulations with additional spectral bands for irradiance mapping (W/m²). My guess is that it should work, but possibly I’m missing something. Do you have any advice/warnings/concerns regarding this idea? (given a proper scene configuration…)

Many thanks again for your support.

There are a number of papers covering multispectral rendering with Radiance. You can actually designate the RGB channels to mean whatever you like; the calculations for each channel are independent. So, you could designate R to be total radiation, G to match photopic sensitivity, and B to match melanopic sensitivity if you wished. You just need to make sure you are consistent throughout your scene description, so that the transmission and reflection values correspond to the channels you are simulating.


I’m testing the workflow with a simple scene. Sorry it took me a little while to get all this working (I’m a bit of a newbie…).

I was able to generate a sky image from the Prague model code with vwrays. The image is saved as an HDR image; below is a capture of the sky image after conversion to TIFF with ra_tiff:

I tried to integrate this image within a scene using the following sky definition:

void colorpict sky
9 red green blue demo.hdr fish_u fish_v -rz 90

sky glow sky_mat
4 1 1 1 0

sky_mat source sky
4 0 0 1 180

sky glow ground_glow
4 0.9966115208291808 0.9966115208291808 0.9966115208291808 0

ground_glow source ground
4 0 0 -1 180

The sky image ends up mapped onto the ground, and the sky itself is black…

It’s possibly linked to this Z-axis rotation. I initially imagined the -rz parameter would solve the problem, but it’s not working.

I’m probably doing something wrong.

Any idea?

An additional question: can I use this “env” map for the diffuse sky part and generate a light source for the direct component? I tested that using gendaylit outputs, but with no success… I’m trying to avoid a complex sampling of the sun disk, to avoid aliasing due to the image resolution limits…

You are doing really well to have gotten this far!

I think you want to rotate about the x or y axis, not z, to get the sky pattern to face upwards. (I.e., try -rx 90 instead of -rz 90 in your colorpict arguments.)

Also, your ground seems a bit dim to me in absolute terms – less than 170 nits, which means your horizontal illuminance is probably around 100 lux. That would have to be after sunset or before sunrise, I think.

Regarding your additional question, you should definitely model the sun separately. If you place a “light”-modified “source” of the appropriate size and position, it will overlay your sky correctly and hopefully avoid sampling issues. You might consider making the sun slightly larger and dimmer than it should be to include some of the circumsolar region, which by itself also creates some sizeable variance during sampling.



Regarding the sky image: indeed it was totally wrong, and I have now got everything OK. The view vector used to generate it was wrong (a scripting bug…). I’m not able to upload the new, nicer image here today…

Regarding my additional question: I could use some help…

  • I tested gendaylit to generate a sky source (discarding the brightfunc to keep the direct part only).
  • The two sources do not mix correctly, and I get a grey image (while the sky alone is a fantastic blue).

I have a possible explanation, but maybe what I am about to say is stupid…

  • When I generate the Prague sky HDR image, for now I get single-band irradiance values (well, RGB values).
  • For gendaylit, I use true broadband ground measurements (DNI/DHI).
  • So the problem could come from the relative energy scaling between the two light sources.

I possibly need to rescale the Prague sky values by the local DHI value (normalize the 3 bands over the entire image and scale by DHI?).

But I could be totally wrong. Any idea?

Again thanks for your support!

I’m not sure I have enough information to address your question, but it could be a scaling issue or your calculation parameters. Did you set -ab 1 (at least)? Without the indirect calculation, the sky will not contribute at all to illumination. Also, is your sky radiance in units of watts/sr/meter^2? This is the basic assumption in Radiance.


So, I’m continuing to work on this topic…

I read through the code of gendaylit to understand how the Perez model was integrated (and to gain knowledge of Radiance as well…).

I have two questions about it:

  • In the function get_eccentricity(), the global variable daynumber is used but never initialized in my case, as I defined the sun position angles as input (a bug?).

  • In the function theta_phi_to_dzeta_gamma, the gamma angle is defined as the angle between the sun direction and the view direction. I don’t understand why the sun azimuth is not used within the dot product… We compute a cosine from two vectors, so I assume this is a dot product, but I could be wrong…

I was following the gendaylit approach to integrate the Prague model; that is basically why I was investigating the gendaylit code…

I don’t know if someone could help…