Light simulation on 3D mesh with Radiance

Hi,

I have a full model of a room in .ply format (you can download it from the following link: https://filesender.garr.it/?s=download&token=1337e276-7a3f-4cb3-b57f-02c1a6b0088e; apologies for the file size, approx. 2.5 GB). The file has the following structure:

Description:
ply
format binary_little_endian 1.0
comment generated by omnimap
element vertex 3786298
property float x
property float y
property float z
property float nx
property float ny
property float nz
property float radiance_blue
property float radiance_green
property float radiance_red
property float irradiance_blue
property float irradiance_green
property float irradiance_red
property float irradiance_lamp_0_blue
property float irradiance_lamp_0_green
property float irradiance_lamp_0_red
property float irradiance_lamp_1_blue
property float irradiance_lamp_1_green
property float irradiance_lamp_1_red
property float irradiance_lamp_2_blue
property float irradiance_lamp_2_green
property float irradiance_lamp_2_red
property float irradiance_lamp_3_blue
property float irradiance_lamp_3_green
property float irradiance_lamp_3_red
property float irradiance_lamp_4_blue
property float irradiance_lamp_4_green
property float irradiance_lamp_4_red
property float irradiance_lamp_5_blue
property float irradiance_lamp_5_green
property float irradiance_lamp_5_red
property float reflectance_blue
property float reflectance_green
property float reflectance_red
element face 7570727
property list uchar uint vertex_indices
element lamp 6
property list uint uint vertex_indices
end_header

I guess the precomputed radiosity data (the radiance/irradiance properties) can be ignored, or used for comparison.

In any case, with the above data I should in theory have all the information needed to run a light simulation with Radiance: the 3D model (vertices, faces, normals), the reflectance, and the vertices corresponding to the six light sources in the scene, hence their positions, from which I could possibly apply a light source model using an IES file or similar. My question is the following: could someone give me some details on how I could use these data to model radiance and irradiance with the Radiance simulation tools, so as to obtain the corresponding flux/lux values?
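As a side note, the ASCII header of such a PLY file can be inspected with a few lines of standard-library Python before any conversion is attempted; a minimal sketch (not Radiance-specific, function name is my own):

```python
def parse_ply_header(text):
    """Parse an ASCII PLY header into {element: (count, [property names])}."""
    elements, current = {}, None
    for line in text.splitlines():
        parts = line.split()
        if not parts:
            continue
        if parts[0] == "element":            # e.g. "element vertex 3786298"
            current = parts[1]
            elements[current] = (int(parts[2]), [])
        elif parts[0] == "property" and current:
            elements[current][1].append(parts[-1])   # last token is the name
        elif parts[0] == "end_header":
            break
    return elements
```

This makes it easy to check which per-vertex channels (reflectance, irradiance, etc.) are actually present before deciding how to restructure the data.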

Also, does Radiance support output in 3D space, i.e. per vertex/face?

I haven’t used Radiance before, so I would appreciate any feedback on how to deal with such data, or even on how to restructure it so that such a light simulation becomes possible with Radiance.

Thanks.

Hi guys,

What is the process to include the .mtl info and parameters from a model in the Radiance simulation? For example, in the attached model.

As far as I know, there are no general material converters for Radiance. Most people redefine the materials and assign them to surfaces based on material IDs. The obj2rad program has a complicated means of mapping materials to surfaces, but you can usually just use the assigned material name.

Ok, so since my .obj file includes usemtl <material_id> statements, obj2rad will automatically apply the conversion.

@Greg_Ward can you explain why the oconv command is complaining below:

$ obj2rad house_room2.obj > house_room2.rad
$ oconv house_room2.rad > house_room2.oct
oconv: fatal - (house_room2.rad): undefined modifier "m0_material_0"

I found a similar question in the following link (obj2mesh -> oconv question) but I do not really understand whether it is related or not.

Well, you have the material NAME assignments, but not the actual materials. You have to create those in a text editor or assign them from a library of materials. Either way, you need to know/learn enough about Radiance to define your materials based on what the objects are supposed to look like. The information in your *.mtl file may offer some clues, but they won’t translate directly.

I see. What is the proper way to create these materials? I guess there should be some automatic way, right? I cannot imagine that this has to be done manually.

Imagine that I have hundreds of such rooms; there should be a way to automate this procedure.

I’ve found this thread where a similar discussion took place. According to @Terrance_McMinn1, I could create my materials by porting them from the .mtl files, but it is not clear to me how the r, g, and b values can be extracted from the variables Ka, Kd, and Ks, since each of them has its own correspondence. Is it fine to use only the values from Ka? Also, how do I define the values of specularity and roughness?

That other thread is a decent guide. You’re somewhat on your own with translating the non-physical Wavefront model to the physical materials in Radiance. Generally speaking, the specularity is related to Ks and the color to Kd. You can look into the Radiance material models here. The only other formula I can offer is an approximate conversion from specular power to roughness, which is:

roughness = sqrt(2/Ns)
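Expressed in code (Python here, with Ns being the Wavefront specular exponent):

```python
import math

def spec_power_to_roughness(ns):
    """Approximate Radiance roughness from the Wavefront specular exponent Ns."""
    return math.sqrt(2.0 / ns)

# e.g. Ns 30 gives a roughness of about 0.258, Ns 150 about 0.115
```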

I can also suggest looking at this paper.


Hi @Greg_Ward, many thanks for the references and the brief explanation, much appreciated. I’ve also found another old thread where you explain a sort of pipeline to be used. I will try to create a kind of Wavefront2Radiance translator and possibly share it here. Just to clarify, for the color, is it the values from Kd or Ka that I should be using? Above you mention Kd, but in your older message you say Ka, and Terrance also refers to Ka being used for color.

Must be Ka, though I don’t know what Kd is in that case.

Kd values specify the diffuse reflectivity using RGB values.

@Greg_Ward I am trying to parse my Wavefront .mtl file to create a corresponding .rad file based on our discussion above. However, if I look at my file, I have materials which do not contain a Ka parameter but only Kd, e.g.:

newmtl m11_material_9
Kd 0.901961 0.109804 0.321569
Ks 0.0666667 0.0666667 0.0666667
Ns 30
map_Kd ../../room/0004d52d1aeeb8ae6de39d6bd993e992/../../texture/wallp_0.jpg

newmtl m12_material_10
Kd 0.501961 0.501961 0.501961

newmtl m13_material_11
Kd 1 1 1
Ks 0.0666667 0.0666667 0.0666667
Ns 30
map_Kd ../../room/0004d52d1aeeb8ae6de39d6bd993e992/../../texture/bricks_1.jpg

newmtl m14
Ka 0.35 0.35 0.35
Kd 0.278431 0.278431 0.278431
Ks 0.762295 0.762295 0.762295
Ns 150

newmtl m15
Ka 1 1 1
Kd 0.556863 0.556863 0.556863
Ks 0.5 0.5 0.5
Ns 200

newmtl m37
Ka 0.569738 0.748604 0.8
Kd 0.454902 0.596078 0.639216
Ks 0.5 0.5 0.5
Tf 0.5 0.5 0.5
Tr 0.5
d 0.5
Ns 100

In that case I guess I can use Kd instead of Ka; could you please confirm?

Moreover, in material m37 I have the parameters Tf and Tr/d, which correspond to the transmission filter and the transparency/opacity/dissolve of the material respectively, but based on this guide I am not sure how to handle them. Should it be a dielectric material? I would appreciate an example, if possible.

Hi Theo,

As Greg mentioned, the wavefront materials do not follow a physical model.
So much of how to interpret the different parameters to radiance materials
depends on what assumptions were made when creating the wavefront obj:

Whether Ka or Kd maps to the color of a radiance material (and, for that
matter, whether to use a plastic or a metal) really depends on the export
and how the values were selected in the source program. Depending on the
export settings, the materials could match the layer color, layer material,
object color, or object material.
Furthermore, depending on the means of assigning those materials, how
specularity and such are assigned can also vary. Whether the values in the
mtl even have physically descriptive information, whether the colors need
gamma correction or not, and whether the parameters can even describe a
physically possible material (which radiance more or less requires) depends
entirely on where the values are coming from, which you have not described.
From a quick glance at your included snippet, by a
conventional interpretation of the wavefront spec these are not physically
possible materials: the specular reflections are way too high, total
reflection plus transmission exceeds 1, Ka usually corresponds to ambient
light (a self-luminous surface), etc. With these materials, any calculated
physical quantity (like illuminance) would be meaningless.

If in spite of those warnings you want to write a mtl to radiance
interpreter (always keep this in mind: https://xkcd.com/1319/), I would
suggest getting a firm handle on the wavefront material specification,
start here:
https://en.wikipedia.org/wiki/Wavefront_.obj_file#Basic_materials and then
check the references. After that familiarize yourself with how the program
you are using maps its native materials to the wavefront obj (paying
attention to physics and what colorspace the native renderer operates in).
And then make decisions about what properties are important for your model
translation (colored highlights, index of refraction etc.) Finally, spend
some time understanding the radiance material primitives (
https://floyd.lbl.gov/radiance/refer/ray.html#Materials) and pick out the
subset of materials to map to (my guess is you can use plastic, glass, and
trans).
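For instance, the trans primitive takes seven real arguments (red green
blue spec rough trans tspec); the following is a syntax illustration only,
with made-up values, not a verified translation of any of your materials:

```
void trans example_trans
0
0
7 .45 .60 .64 .02 .05 .5 0
```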

During this step you may decide it will be easier to write a material file
(or translator) directly from your native format to radiance, bypassing the
mtl file altogether, except to gather the names (as Greg assumed you were
doing). To address the issue of thousands of rooms: it is unlikely that
this means you have thousands of different materials (and if you do, how
were they defined?). You can take two paths: either simplify the naming in
the native model, or make use of the alias descriptor (see:
https://floyd.lbl.gov/radiance/refer/ray.html#Scene and search these
archives). It should be straightforward to automate the generation of
thousands of aliases that point to a handful of user-defined materials.
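An alias simply gives an existing material a second name, so a generated
file might look like this (material and alias names are purely
illustrative):

```
void plastic generic_paint
0
0
5 .5 .5 .5 0 0

void alias m12_material_10 generic_paint
void alias m14 generic_paint
```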

Good Luck!

Stephen


Hi Stephen, thanks a lot for the thorough feedback. Well, from your description it seems that even if I manage to translate the wavefront materials to radiance ones, it would be like shooting in the dark, without any guarantee that my output would be 100% accurate.

The tools that I am using to extract the scene can be found here; they translate the json file from my attached model in my second message into the .obj and .mtl files of my scene. From what I understand, the material mapping to the .mtl is done while reading the scene, since my room layouts and the included objects are already predefined. Thus, I guess translating the .mtl files to .rad is my only option.

The point is that going with such models is my only option for now, since I cannot find any other large-scale dataset with multiple rooms and objects that follows a physical model. If you are aware of anything I could use, I would be happy if you could share some info. I’ve opened a corresponding thread here, but so far there is no response.

@stephanwaz the other scenario, which might be fine for my needs for now, is to assume a fully Lambertian environment. In that case, the Kd values from the Wavefront .mtl files should be sufficient.
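A minimal sketch of such a Lambertian-only translation, mapping each Kd to a Radiance plastic with zero specularity and roughness (function name and structure are my own; materials without a Kd line are simply skipped):

```python
def mtl_to_rad(mtl_text):
    """Translate Wavefront materials into purely diffuse 'plastic' definitions.

    Only Kd is used (Lambertian assumption); specularity and roughness are 0.
    """
    out, name, kd = [], None, None

    def flush():
        # Emit the previous material, if it had a Kd line.
        if name and kd:
            r, g, b = kd
            out.append(f"void plastic {name}\n0\n0\n5 {r} {g} {b} 0 0\n")

    for line in mtl_text.splitlines():
        parts = line.split()
        if not parts:
            continue
        if parts[0] == "newmtl":
            flush()
            name, kd = parts[1], None
        elif parts[0] == "Kd":
            kd = parts[1:4]
    flush()
    return "\n".join(out)
```

Transparent materials like m37 would still need a trans (or glass) definition by hand; this sketch covers only the diffuse case.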

Hi @Greg_Ward, I have extracted my materials into a materials.rad file, e.g.:

void colorpict m11_material_9_pat
7 red green blue ../../room/0004d52d1aeeb8ae6de39d6bd993e992/../../texture/wallp_1_1.pic . frac(Lu) frac(Lv)
0
0

m11_material_9_pat plastic m11_material_9
0
0
5 0.901961 0.109804 0.321569 0 0.2581988897471611

void plastic m12_material_10
0
0
5 0.501961 0.501961 0.501961 0 0

void colorpict m13_material_11_pat
7 red green blue ../../room/0004d52d1aeeb8ae6de39d6bd993e992/../../texture/bricks_1.pic . frac(Lu) frac(Lv)
0
0

m13_material_11_pat plastic m13_material_11
0
0
5 1 1 1 0 0.2581988897471611

void plastic m14
0
0
5 0.278431 0.278431 0.278431 0 0.11547005383792516

void plastic m15
0
0
5 0.556863 0.556863 0.556863 0 0.1

void plastic m37
0
0
5 0.454902 0.596078 0.639216 0 0.1414213562373095

but then when I run obj2rad -m materials.rad house_room2.obj > house_room2.rtm it doesn’t really seem to do anything. My .rtm file contains something like:

# obj2rad -m materials.rad house_room2.obj

# Done processing file: house_room2.obj
# 214054 lines, 214054 statements, 1 unrecognized

Any idea what could be wrong?

I think you meant to run obj2mesh, not obj2rad. And you want the -a option rather than -m.

The output will be a binary “Radiance triangle mesh” that is placed in your scene via a “mesh” primitive, e.g.:

void mesh testmesh
1 house_room2.rtm
0
0
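Putting that together, the commands would look roughly like this (assuming the materials.rad file from the previous post, and a scene.rad holding the mesh primitive shown above):

```
obj2mesh -a materials.rad house_room2.obj house_room2.rtm
oconv scene.rad > house_room2.oct
```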

I don’t think you want to map your textures in this way. You need to use the Lu and Lv variables. There are other posts on this if you search.

Ok, I thought obj2rad was again the one I should use. I will test obj2mesh tomorrow.

Regarding texture mapping, as I understood it, you suggest using Lu/Lv instead of frac(Lu)/frac(Lv), right? What is the difference? There are also tile_u/tile_v. I’ve already gone over some previous threads but did not really understand when I should use each.

Whether you use Lu, Lv, some factor of those, or frac() depends on how the scene was texture-mapped. Best to start simple and correct any problems.
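Roughly speaking, Lu/Lv stretch the picture once over the local surface coordinates, while frac(Lu)/frac(Lv) wrap those coordinates into [0,1) so the picture tiles; a schematic example (picture name is illustrative):

```
# picture stretched once across the surface's (u,v) range
void colorpict wall_stretch
7 red green blue wallp.pic . Lu Lv
0
0

# picture repeated every unit of (u,v)
void colorpict wall_tile
7 red green blue wallp.pic . frac(Lu) frac(Lv)
0
0
```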


Hi @Greg_Ward, how can I manually set a light source, if that is possible, in a room where initially no light source is specified?