Light simulation on 3D mesh with Radiance

Hi @Greg_Ward, many thanks for the references and the short explanation, much appreciated. I’ve also found another old thread where you explain a similar pipeline. I will try to create a kind of Wavefront2Radiance translator and possibly share it here. Just to clarify on the color: should I use the values from Kd or Ka? Above you mention Kd, but in your older message you say Ka, and Terrance also refers to Ka as the one to use for color.

Must be Ka, though I don’t know what Kd is in that case.

Kd values specify the diffuse reflectivity using RGB values.

@Greg_Ward I am trying to parse my Wavefront .mtl file to create a corresponding .rad file based on our discussion above. However, looking at my file, I have materials that contain no Ka parameter, only Kd, e.g.:

newmtl m11_material_9
Kd 0.901961 0.109804 0.321569
Ks 0.0666667 0.0666667 0.0666667
Ns 30
map_Kd ../../room/0004d52d1aeeb8ae6de39d6bd993e992/../../texture/wallp_0.jpg

newmtl m12_material_10
Kd 0.501961 0.501961 0.501961

newmtl m13_material_11
Kd 1 1 1
Ks 0.0666667 0.0666667 0.0666667
Ns 30
map_Kd ../../room/0004d52d1aeeb8ae6de39d6bd993e992/../../texture/bricks_1.jpg

newmtl m14
Ka 0.35 0.35 0.35
Kd 0.278431 0.278431 0.278431
Ks 0.762295 0.762295 0.762295
Ns 150

newmtl m15
Ka 1 1 1
Kd 0.556863 0.556863 0.556863
Ks 0.5 0.5 0.5
Ns 200

newmtl m37
Ka 0.569738 0.748604 0.8
Kd 0.454902 0.596078 0.639216
Ks 0.5 0.5 0.5
Tf 0.5 0.5 0.5
Tr 0.5
d 0.5
Ns 100

In that case I guess I can use Kd instead of Ka; could you please confirm?

Moreover, material m37 has the parameters Tf and Tr/d, which correspond to the transmission filter and the transparency/opacity/dissolve of the material respectively, but based on this guide I am not sure how to handle them. Should it be a dielectric material? I would appreciate an example, if possible.

Hi Theo,

As Greg mentioned, the Wavefront materials do not follow a physical model,
so much of how to interpret the different parameters as Radiance materials
depends on what assumptions were made when creating the Wavefront .obj:

Whether Ka or Kd maps to the color of a Radiance material (and for that
matter, whether to use a plastic or a metal) really depends on the export
and on how the values were selected in the source program. Depending on the
export settings, the materials could match either the layer color, layer
material, object color, or object material.
Furthermore, depending on the means of assigning those materials, how
specularity and the like are set can also vary. Whether the values in the
.mtl even hold physically descriptive information, whether the colors need
gamma correction or not, and whether the parameters can even describe a
physically possible material (which Radiance more or less requires) depends
entirely on where the values are coming from, which you have not described.
From a quick glance at your included snippet, by a conventional
interpretation of the Wavefront spec, these are not physically possible
materials (the specular reflections are way too high, total reflection plus
transmission exceeds 1, Ka usually corresponds to an ambient light, i.e. a
self-luminous surface, etc.). With these materials, any calculated physical
quantity (like illuminance) would be meaningless.

If in spite of those warnings you want to write an .mtl-to-Radiance
interpreter, I would suggest getting a firm handle on the Wavefront material
specification (start here: and then check the references). After that,
familiarize yourself with how the program you are using maps its native
materials to the Wavefront .obj (paying attention to physics and to what
colorspace the native renderer operates in). Then make decisions about what
properties are important for your model translation (colored highlights,
index of refraction, etc.). Finally, spend some time understanding the
Radiance material primitives and pick out the subset of materials to map to
(my guess is you can use plastic, glass, and perhaps a couple of others).

During this step you may decide it will be easier to write a material file
(or translator) directly from your native format to Radiance, bypassing the
.mtl file altogether except to gather the names (as Greg assumed you were
doing). To address the issue with thousands of rooms: it is unlikely that
means you have thousands of different materials (and if you do, how were
they defined?). You can take two paths: either simplify the naming in the
native model, or make use of the alias descriptor (see: and search these
archives). It should be straightforward enough to automate the generation
of thousands of aliases that point to a handful of user-defined materials.
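As a rough sketch (the base-material names and the mapping rule here are placeholders you would have to supply yourself), generating those aliases is only a few lines of Python:

```python
def make_aliases(material_names, pick_base):
    """Emit one Radiance 'alias' per model material name, each pointing
    at one of a handful of hand-written base materials.

    The 'alias' primitive takes no arguments; its form is simply:
        void alias <new_name> <existing_name>
    """
    return "\n".join(f"void alias {name} {pick_base(name)}"
                     for name in material_names)

# Hypothetical mapping: every model material falls back to one generic plastic.
print(make_aliases(["m11_material_9", "m12_material_10"],
                   lambda name: "generic_plastic"))
```

Each emitted line is a complete alias statement, so the result can go straight into a .rad file alongside your hand-written base materials.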

Good Luck!



Hi Stephen, thanks a lot for the thorough feedback. Well, from your description it seems that even if I manage to translate the Wavefront materials to Radiance ones, it would be like shooting in the dark, with no guarantee that my output would be 100% accurate.

The tools that I am using to extract the scene can be found here; they translate the JSON file from the model attached to my second message into the .obj and .mtl files of my scene. From what I understand, the material mapping to the .mtl is done while reading the scene, since my room layouts and included objects are already predefined. Thus, I guess translating the .mtl files to .rad is my only option.

The point is that going with this kind of model is my only option for now, since I cannot find any other large-scale dataset with multiple rooms and objects that follows a physical model. If you are aware of anything I could use, I would be happy if you could share some info. I’ve opened a corresponding thread here, but so far there is no response.

@stephanwaz the other scenario, which for now might be fine for my needs, is to assume a fully Lambertian environment. In that case, the Kd values from the Wavefront .mtl files should be sufficient.
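For that Lambertian case, a minimal sketch of the translation (my assumption: only Kd is kept, and the plastic’s specularity and roughness arguments are both forced to 0) could be:

```python
def mtl_to_rad(mtl_text):
    """Translate Wavefront .mtl materials into Radiance 'plastic'
    primitives, keeping only the diffuse color (Kd).

    Every material is treated as perfectly Lambertian: the plastic's
    specularity and roughness arguments are both set to 0.
    """
    out = []
    name = None
    for line in mtl_text.splitlines():
        tokens = line.split()
        if not tokens:
            continue
        if tokens[0] == "newmtl":
            name = tokens[1]
        elif tokens[0] == "Kd" and name is not None:
            r, g, b = tokens[1:4]
            # plastic: 0 string args, 0 integer args, 5 real args
            out.append(f"void plastic {name}\n0\n0\n5 {r} {g} {b} 0 0\n")
    return "\n".join(out)

print(mtl_to_rad("newmtl m12_material_10\nKd 0.501961 0.501961 0.501961"))
```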

Hi @Greg_Ward I have extracted my materials in materials.rad file e.g.

void colorpict m11_material_9_pat
7 red green blue ../../room/0004d52d1aeeb8ae6de39d6bd993e992/../../texture/wallp_1_1.pic . frac(Lu) frac(Lv)
0
0

m11_material_9_pat plastic m11_material_9
0
0
5 0.901961 0.109804 0.321569 0 0.2581988897471611

void plastic m12_material_10
0
0
5 0.501961 0.501961 0.501961 0 0

void colorpict m13_material_11_pat
7 red green blue ../../room/0004d52d1aeeb8ae6de39d6bd993e992/../../texture/bricks_1.pic . frac(Lu) frac(Lv)
0
0

m13_material_11_pat plastic m13_material_11
0
0
5 1 1 1 0 0.2581988897471611

void plastic m14
0
0
5 0.278431 0.278431 0.278431 0 0.11547005383792516

void plastic m15
0
0
5 0.556863 0.556863 0.556863 0 0.1

void plastic m37
0
0
5 0.454902 0.596078 0.639216 0 0.1414213562373095
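The roughness value in each plastic above is derived from the Phong exponent as sqrt(2/Ns):

```python
import math

def ns_to_roughness(ns):
    """Map a Wavefront Phong exponent Ns to a Radiance roughness value.

    Heuristic used for the file above: roughness = sqrt(2/Ns), so a
    sharper highlight (larger Ns) gives a smoother surface.
    """
    return math.sqrt(2.0 / ns)

print(ns_to_roughness(30))   # 0.2581988..., as in m11_material_9
```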

but then when I run obj2rad -m materials.rad house_room2.obj > house_room2.rtm it doesn’t really seem to do anything. My .rtm file contains something like:

# obj2rad -m materials.rad house_room2.obj

# Done processing file: house_room2.obj
# 214054 lines, 214054 statements, 1 unrecognized

Any idea what could be wrong?

I think you meant to run obj2mesh, not obj2rad. And you want the -a option rather than -m.

The output will be a binary “Radiance triangle mesh” that is placed in your scene via a “mesh” primitive, e.g.:

void mesh testmesh
1 house_room2.rtm
0
0

I don’t think you want to map your textures in this way. You need to use the Lu and Lv variables. There are other posts on this if you search.

Ok, I thought obj2rad was the one I should use then. I will test it tomorrow.

Regarding texture mapping, if I understood correctly you suggest using Lu/Lv instead of frac(Lu)/frac(Lv), right? What is the difference between them? There is also tile_u/tile_v. I’ve already gone over some previous threads, but I did not really understand when to use each.

Whether you use Lu, Lv, some factor of those, or frac() depends on how the scene was texture-mapped. Best to start simple and correct any problems.
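For instance (picture file name assumed), wrapping the coordinates with frac() repeats the image whenever the mapping runs past 1:

```
void colorpict brick_pat
7 red green blue bricks_1.pic . frac(Lu) frac(Lv)
0
0
```

whereas plain Lu and Lv use the .obj texture coordinates exactly as given. Render a small test view with one form and switch if the texture comes out tiled or stretched the wrong way.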


Hi @Greg_Ward, how can I manually set a light source, if possible, in a room where initially no light source is specified?

The simplest is probably putting a spherical light source of an appropriate size slightly below the ceiling plane. If you have a 4x6x3 room in the positive quadrant, for instance, and Z is your up direction, you might use:

void light bright
0
0
3 1000 1000 1000

bright sphere lamp
0
0
4 2 3 2.8 0.1

Yup, I’ve done this already. But then I need to map this material in the .obj file, or give it as an extra one, right? This is the part I do not understand how to do.

You don’t need to map it – just add it to the Radiance file(s) you are already rendering with. If you are using obj2mesh, then put the above near the “mesh” primitive that includes it in your scene. See Chapter 1 of Rendering with Radiance for the basics if you don’t know them.


Thanks Greg, the tutorial in that chapter is really helpful.

If I render my scene, though, I get only the light source, as you can see in the image below:

I used the following commands:

$ obj2mesh -a house_room3.mat house_room3.obj > house_room3.rtm
$ oconv light_source.rad house_room3.rad > house_room3.oct
$ rpict -vp 7.481131553649902 -6.5076398849487305 5.34366512298584 -vd -0.6515582203865051 0.6141704320907593 -0.44527149200439453 -vu -0.32401347160339355 0.3054208755493164 0.8953956365585327 -av 1 1 1 -ab 2 house_room3.oct > render.pic

where my house_room3.rad and light_source.rad are

void mesh testmesh
1 ./house_room3.rtm
0
0


# this is the material for my light source:
void light bright
0
0
3  1000  1000  1000
#  red_emission  green_emission  blue_emission  #

# here is the light source:
bright sphere fixture
0
0
4  2  3  2.8  0.125 # <-- tried replacing these values with 4 45 3 44.8 0.125; see the getbbox output below
#  xcent  ycent  zcent  radius  #


Trying to visualize my mesh with objview gives me the following output:

I can see the mesh in the top left corner, but the output window seems a bit strange (I am not sure whether it should look like that; it appears stretched).

The bounding box of the scene is the following:

$ getbbox house_room3.rad 
xmin      xmax      ymin      ymax      zmin      zmax
43.1173   48.8672  -1.50496   4.24496   40.4901     46.24

Based on these, I tried to play with the position of my light source as well as the camera vp, vd and vu parameters, but I couldn’t get any good rendering output.

Any hints on what might be wrong?

Are you trying to view your mesh from the inside or the outside? The objview script adds distant lights for you, and should work fine for viewing the house from the exterior. Did you try resizing your window? It looks like it started out way too small for some reason – perhaps to do with your X11 window manager. Maybe someone else has experience with this. I’ve never witnessed that particular problem.

You only need the light source for viewing interior rooms, and you need to know where the rooms are and what size they are. Do you even know which way is up in this model? It could be that the Y-axis is up, in which case you should render with -vu 0 1 0 and set your source somewhere around Y=3 to be near the ceiling. You can even create an array of lights using the -a option in xform.
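As a sketch (the drop distance, radius, and emission values below are just the example numbers from earlier; pick the up-axis once you know it), you could compute a starting position for the source from the getbbox output:

```python
def sphere_source_from_bbox(xmin, xmax, ymin, ymax, zmin, zmax,
                            up="y", drop=0.2, radius=0.125):
    """Return a Radiance light + sphere centered in the bounding box,
    pushed slightly below the top face along the chosen up-axis."""
    cx = (xmin + xmax) / 2.0
    cy = (ymin + ymax) / 2.0
    cz = (zmin + zmax) / 2.0
    if up == "y":
        cy = ymax - drop
    else:
        cz = zmax - drop
    return ("void light bright\n0\n0\n3 1000 1000 1000\n\n"
            f"bright sphere fixture\n0\n0\n4 {cx} {cy} {cz} {radius}\n")

# Bounding box reported by getbbox for house_room3:
print(sphere_source_from_bbox(43.1173, 48.8672, -1.50496, 4.24496,
                              40.4901, 46.24))
```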

It doesn’t really matter; I just wanted to view the mesh with objview to confirm that it loads correctly. My guess is that the stretched window output is also X11-related. I’ve tried to resize the window, but then the output image in the top left corner becomes totally black; moreover, while the window resizes, that top left corner output seems to stay fixed.

BTW, I am getting a similar view with rpict and rvu as well.

That’s why I was trying to get the info with the getbbox command, so I could use it to position my light sphere, but changing the values didn’t really help.

This is most likely the case because, as you can see, the output from objview is tilted.

Let me see if I can test that somehow.

Is there a way, from the Wavefront format, to make a better guess about the type of material to assign to each object, other than setting everything to plastic?

I was checking, for example, the Blender material properties once a model is loaded, but the type does not seem easy to find there. See the image below (which corresponds to a window glass):

@Dion_Moult you already have some experience with Blender, so I wonder whether you have anything in mind that might be helpful.

I just relied on the object name coming from the OBJ, nothing else. I’m not sure how good an idea it is to “map” OBJ material properties to Radiance materials, as it may lead to incomplete or inaccurate material definitions in Radiance.

The Blender material properties depend on the engine that is loaded in Blender. Your screenshot shows material data that can be interpreted by Blender’s Cycles and Eevee engines. However, once it gets exported to OBJ, a lot of that data is lost, so unless you plan to map Blender materials to Radiance directly, it may be better to look at the OBJ output itself. Also note that in Blender there are many ways to define a material, from very simple to very complex definitions. What you’ve shown there is a single shader node, which is relatively simple, but there is also a glass shader node, and many other Blender nodes that can be combined in many ways.
