Create a view matrix for the three-phase method simulating ceiling sensors and workplane points

Dear All,

Best wishes for a happy 2022. I am simulating a room with some ceiling sensors and some workplane points (no shield), using the three-phase method and rsensor (with a custom sensitivity distribution). I think my simulation differs from the three-phase method tutorials (for example, Sarith's) in how the view matrix is created. This is Sarith's command for creating a view matrix:

 rfluxmtx -v -I+ -ab 4 -ad 5000 -lw 0.0002 -n 16 -y 100 - objects/GlazingVmtx.rad -i octrees/room3ph.oct < points.txt > matrices/vmtx/v.mtx

The file points.txt contains only the workplane points. I also need to add some sensor views with a custom sensitivity distribution. This is a command I found on this forum to create a view matrix with rsensor:

 rsensor -h -rd 5000 -vf view_test.vf narrow10.dat . | rcontrib -c 5000 -f klems_int.cal -bn Nkbins -b kbinS -m glazing -I+ -ab 11 -ad 50000 -ds .1 -lw 1e-5 test_r.oct > narrow10.vmx

And the commands for running multiple ceiling sensors:

 rcalc -o 'rsensor -h -rd 10000 -vp ${$1} ${$2} ${$3} -vd ${$4} ${$5} ${$6} sensor.dat .' posdir.txt \
     | sh \
     | rtrace -n 8 -af calc.cache [rtrace options] -h -ov octree \
     | total -10000 -m \
     | rlam posdir.txt - \
     | rcalc -e '$1=$1;$2=$2;$3=$3;$4=179*(.265*$7+.670*$8+.065*$9)'
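(An aside on the "total -10000 -m" stage above: total with a numeric count and -m averages each successive block of that many records, i.e. one mean per sensor here, matching the -rd 10000 sample count per sensor. A stand-in sketch of that blocking behavior, using awk with blocks of 3 and made-up values:)

```shell
# Average every block of 3 records, as "total -3 -m" would.
# Input values are invented for illustration.
printf '1\n2\n3\n10\n20\n30\n' | awk '
    { sum += $1 }
    NR % 3 == 0 { print sum / 3; sum = 0 }
'
# prints 2, then 20
```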

So I will need to combine rfluxmtx with rsensor and rcalc in order to create a view matrix for the three-phase method. Could you give me some advice on combining them?

Thanks in advance,
Cong

Hi Cong,

I’m not sure which post you got the first “rsensor | rcontrib” command from, but I don’t think there should be a “-I+” option in it. The rsensor command with a “.” argument in place of an octree generates a set of ray samples, which are then traced by rcontrib to a window modified by “glazing”. The output will be a single 145 x 3-component vector that can be used in place of an “illuminance” view matrix in Sarith’s 3-phase commands.

You can use rfluxmtx in place of rcontrib, just to simplify things. Something like:

rsensor -h -rd 5000 -vf view_test.vf narrow10.dat . | rfluxmtx -c 5000 -ab 11 -ad 50000 -ds .1 -lw 1e-5 - window.rad -i test_r.oct > single_row.mtx

The lone hyphen ('-') in the rfluxmtx command is very important, as it says to read the sample rays from the standard input rather than generating them from a sender description. The “window.rad” file will contain a description of the sample surface(s) with comments indicating a Klems distribution, as described in the rfluxmtx man page and in my 2014 presentation in London.
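For illustration, a minimal "window.rad" receiver might look like the following. This is a sketch, not from the thread: the surface name, vertex coordinates, and glow material are placeholders, h=kf requests the full Klems basis, u=+Z sets the "up" reference for the bin ordering, and the vertex order should make the surface normal face into the room:

```
#@rfluxmtx h=kf u=+Z
void glow glazing
0
0
4 1 1 1 0

glazing polygon window_pane
0
0
12
    0  0  1
    3  0  1
    3  0  2.5
    0  0  2.5
```

The modifier name ("glazing" here) is what identifies the receiver and names its output columns.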

The second rcalc command that generates rsensor commands can be used in place of the original command to create a matrix rather than a vector, i.e.:

rcalc -o 'rsensor -h -rd 5000 -vp ${$1} ${$2} ${$3} -vd ${$4} ${$5} ${$6} sensor.dat .' posdir.txt | sh | rfluxmtx -c 5000 -ab 11 -ad 50000 -ds .1 -lw 1e-5 - window.rad -i test_r.oct > sensor_set.mtx

Note how the rfluxmtx command replaces rtrace, and the -rd and -c option arguments must agree. Again, you can use the matrix generated above just as you would use any illuminance “view matrix” in a 3-phase calculation.

Hope this helps (and works!)
-Greg

Hi Greg,

Thank you for your clear advice. Your suggested commands are only for sensors, so I need to run another three-phase calculation for the workplane points when I want to evaluate the correlation between workplane points and ceiling sensors. This is the command I am planning to use to calculate the view matrix for the workplane points:

rfluxmtx -v -I+ -ab 11 -ad 50000 -lw 1e-5 -n 16 -y 10 - window.rad -i test_r.oct < photo.txt > workplane.mtx

With the same high settings (-ab 11 -ad 50000), do you think it is reasonable to compare the results between the two simulations?

To generate an image as seen from the sensor, I tried rvu, but there is no option to supply the sensor’s .dat file. Is there a way to generate this image, or can I use rfluxmtx with rsensor to create an image of the view matrix?

Regards,
Cong

Hi Cong,

Your “workplane.mtx” calculation looks good, and reminds me that you probably want to add a “-y” option to the rfluxmtx stage of the rcalc/rsensor command, indicating the number of sensor positions in “posdir.txt”. That way, your output matrix will have a well-defined size in the header, rather than missing the number of rows. (You can use "rfluxmtx -y $(wc -l < posdir.txt)" if your file has one sensor position & orientation per line.)
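As a small illustration of the row-counting step (the rsensor/rfluxmtx stages are elided; only the counting is shown, and the posdir.txt contents here are a made-up stand-in):

```shell
# Stand-in posdir.txt: one "px py pz dx dy dz" record per sensor.
printf '1 1 2.7 0 0 -1\n2 1 2.7 0 0 -1\n3 1 2.7 0 0 -1\n' > posdir.txt

# Row count for rfluxmtx's -y option (tr strips BSD wc padding).
NROWS=$(wc -l < posdir.txt | tr -d ' ')
echo "$NROWS"    # prints 3
```

The count would then be passed as `-y "$NROWS"` on the rfluxmtx stage of the pipeline.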

And yes, you should be able to compare the results between the two simulations. Your rfluxmtx -I+ command will have a cosine weighting, whereas the rsensor command will use whatever weighting is in your “sensor.dat” file.

Finally, you can generate an image from the point of view of a sensor with a fisheye view such as “-vth -vh 180 -vv 180”, which would match the view of a cosine-weighted sensor. There is no direct way to simulate views with different cut-off angles, although you could use the pcomb command to apply a cut-off afterwards, with some effort. If you wanted to apply a weighting function directly from your sensor.dat file, it would be possible but quite a challenge to write a C program to do that, and I’m not sure what the tangible benefit would be.

-Greg

Hi Greg,

Your advice helps me a lot. Thank you.

Regards,
Cong

Hi @Greg_Ward and Radiance community,

I am using the three-phase method with rsensor to calculate the signals of cosine, 55cos, and Lithonia (very narrow sensitivity) sensors mounted on the ceiling at 2 m depth from the window (with a roller shade installed). I uploaded the results here: Compare cosine & 55cos & Lithonia.xlsx - Google Sheets. The Excel file includes two sheets: the first with the roller shade pulled up completely (0% shading) and the second with the roller shade pulled down 85% (85% shading). The shading material is black (0% OF and 0% VT).

I have 2 questions:

  1. Why is the cosine sensor’s signal higher than the 55cos’s at 0% shading but lower at 85% shading? I would expect the cosine signal to be higher than the 55cos’s, and the results to be consistent across shading states.
  2. Why is the Lithonia’s signal higher than the 55cos’s at 0% shading? Since the Lithonia has a very narrow sensitivity, its signal should be lower.

I would really appreciate any help.

Regards,
Cong

Hi Cong,

You should give us some more information, such as the input files to rsensor and the commands you are using for your 3-phase method. I am not sure you can directly compare the results in this way, since the signal generated from the rsensor ray directions is based only on samples in those directions, and does not carry the weight values from the provided sensitivity array.

It often helps during debugging to generate fisheye images from your ceiling position at selected time points, to see what the sensor is seeing at those times.

Best,
-Greg

Hi Greg,

These are my input files and commands: Inputs and Commands - Google Drive. Using fisheye images is a good idea to study the signals of the photosensors. Could you please write a sample command for creating a view matrix using rsensor? Then I can generate fisheye images using the Radiance three-phase method. Thank you.

Regards,
Cong

Hi Cong,

First, a comment on your calculation procedure, which is quite good. It would be more efficient to compute all your view matrices at once with a single rfluxmtx command. To do this, gather all your “varShading/GlazingVmtx_60_*.rad” files together into one file, which we’ll call “AllGlazing.rad”:

#@rfluxmtx o=matrices/vmtx/v1.mtx
!xform varShading/GlazingVmtx_60_1.rad

#@rfluxmtx o=matrices/vmtx/v2.mtx
!xform varShading/GlazingVmtx_60_2.rad

# etc...

Since rfluxmtx allows any number of receivers, this will save time by sharing the results of the rays sent out from your sensor positions. The #@rfluxmtx directives ensure that your results go where you want them to. Then, you can run a single command:

rcalc -o 'rsensor -h -rd 5000 -vp ${$4} ${$5} ${$6} -vd ${$8} ${$9} ${$10} sensors/55cos.dat .' posdir.txt | sh | rfluxmtx -c 5000 -ab 11 -ad 50000 -ds .1 -lw 1e-5 - AllGlazing.rad -i octrees/room3ph.oct

There is no need to redirect the output, as we have designated different output matrix files in “AllGlazing.rad”. Unfortunately, there is no similar trick for the daylight matrix. You still need to do those one at a time, since rfluxmtx only supports a single sender surface.
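If the glazing files follow the numbered pattern above, a short loop could assemble “AllGlazing.rad” automatically. This is a sketch, shown for three files (extend the list to however many windows you have); it only writes directives and !xform lines, so no Radiance tools are needed to run it:

```shell
# Write one #@rfluxmtx output directive plus an !xform line per glazing file.
> AllGlazing.rad
for i in 1 2 3; do
    {
        echo "#@rfluxmtx o=matrices/vmtx/v$i.mtx"
        echo "!xform varShading/GlazingVmtx_60_$i.rad"
        echo
    } >> AllGlazing.rad
done
```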

My original supposition about comparing absolute results from different sensor distributions holds no matter what you do. Your 3-phase sensor results will be uncalibrated, and cannot be compared between sensor types unless you compute your own calibration factors from the sensitivity function integrals and apply them post hoc. Sensor calibration factors are not otherwise defined.
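As a numerical illustration of what such a post-hoc factor might involve: if a sensor's sensitivity were tabulated as s(θ) at even steps in polar angle (a simplifying assumption; real sensor.dat files are two-dimensional in θ and φ), the hemispherical integral ∫ s(θ) sin θ dθ could be approximated as below, and the ratio of two sensors' integrals used as a relative scale factor. The table values here are invented:

```shell
# Riemann-sum the made-up sensitivity table s(theta) over the hemisphere
# with the sin(theta) solid-angle weight.  Columns: theta_degrees sensitivity.
printf '0 1.0\n30 0.9\n60 0.5\n90 0.0\n' | awk '
    BEGIN { pi = atan2(0, -1); step = 30 * pi / 180 }   # 30-degree bins
    { sum += $2 * sin($1 * pi / 180) * step }
    END { printf "%.4f\n", sum }
'
```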

Regarding views from the sensor positions, you can simply pass your “posdir.txt” file containing the different sensor views to rpict with the following options to render 12 pictures:

rpict -x 512 -y 512 [render options] -S 1 < posdir.txt -o pos%d.hdr scene.oct

If you like, you can run N of these commands in the background, and they will share the load if you have N processors. The octree “scene.oct” should contain your scene geometry, windows, and a suitable sky created by gendaylit or gensky and the associated sources. Be sure to share the ambient cache using the -af option if you set -ab 1 or greater, as you probably should. (You don’t need to go as high as -ab 11 for visualization.)
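The N-background-processes idea could reuse the same generate-commands-then-run pattern as the rcalc/rsensor examples above. A sketch that only prints the command lines (the render options here are illustrative, not from the thread), so it runs without Radiance installed; piping the loop to sh would launch the processes, which share the work because each skips frames whose output file already exists:

```shell
# Print N identical rpict command lines; "| sh" after the loop would run
# them.  The trailing "&" backgrounds each process so they run in parallel,
# sharing the -af ambient file and dividing frames via -S/-o.
N=4
for i in $(seq 1 "$N"); do
    echo 'rpict -x 512 -y 512 -ab 3 -af shared.amb -S 1 -o pos%d.hdr scene.oct < posdir.txt &'
done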

If you still want to render the views from the sensor positions over the whole course of the year using the 3-phase method, it gets a bit more complicated. I think Sarith’s tutorial goes into details on that in Section 7.1.1.2.

Cheers,
-Greg

Hi Greg,

Thank you for your advice. Gathering all my glazing files will reduce my simulation time.

I understand that I should not directly compare the signals from different photosensors. However, considering only one photosensor type, is it reasonable to evaluate the correlation between the sensor’s signal and the workplane illuminance; and can I combine the signals from different shading states to get hourly signals for a whole year under a shading control? I think the answer is yes, because I only consider one sensitivity distribution.

Thank you again for your advice on sensor view images.

Regards,
Cong

Yes, that should work just fine. Sensor values from the same distribution may be compared with each other, and with other times in the simulation, as relative values. It is only the absolute calibration that is at issue, which is why different sensor distribution types will have some unknown ratio between them relative to the physical sensors.

Hi Greg,

I understand. Thanks for the help.

Regards,
Cong