Rtrace with Rsensor

Hi all,
I'm using rtrace to compute the irradiance at a specific point. However, the light incoming from the various directions doesn't obey the cosine-weighting law, and I found that rsensor may help me. I have looked through most of the topics on rsensor in the community and tried to verify my understanding with a simple example.

sky.mat:

! gensky 12 09 14:00 -a 51 -o 0 -m 0

sky.rad:

skyfunc glow skyglow
0
0
4 1 1 1 0

skyglow source sky
0
0
4 0 0 1 180

skyfunc glow groundglow
0
0
4 1 1 1 0

groundglow source ground
0
0
4 0 0 -1 180

oconv sky.mat sky.rad > sky.oct
When I run

echo '0 0 0 0 0 1' | rtrace -I -ab 1 sky.oct

the result is 5.598167e+01 5.598167e+01 5.598167e+01.

When I run

rsensor -h -ab 1 -vp 0 0 0 -vd 0 0 1 sens.dat sky.oct

the result is 5.4437e+01 5.4437e+01 5.4437e+01.

Specifically, I set my sens.dat according to the cosine law as:
degrees 0 20 40 60 80 100 120 140 160 180 200 220 240 260 280
0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
10 .985 .985 .985 .985 .985 .985 .985 .985 .985 .985 .985 .985 .985 .985 .985
20 .940 .940 .940 .940 .940 .940 .940 .940 .940 .940 .940 .940 .940 .940 .940
40 .766 .766 .766 .766 .766 .766 .766 .766 .766 .766 .766 .766 .766 .766 .766
60 .500 .500 .500 .500 .500 .500 .500 .500 .500 .500 .500 .500 .500 .500 .500
70 .342 .342 .342 .342 .342 .342 .342 .342 .342 .342 .342 .342 .342 .342 .342
80 .174 .174 .174 .174 .174 .174 .174 .174 .174 .174 .174 .174 .174 .174 .174
90 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
I thought the two results should be theoretically equal, but they are not. I am not sure whether this is due to a misunderstanding of rtrace and rsensor on my part, or to non-optimized parameters.

Many thanks for sharing your thoughts.
Yating

You are missing a portion of your sensor distribution. I think you misunderstood the man page. Since you are giving azimuth in 20-degree increments, you must continue all the way around to 340°, not stop at 280°. Otherwise, the missing matrix elements will be assigned sensitivity values of 0.
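For example, something along these lines (untested; adjust the angle steps to suit) would write out a full cosine-law sensor file with theta rows every 10° and azimuth columns every 20° out to 340°:

# Rough sketch: generate a cosine-law sensor file covering the full azimuth range.
awk 'BEGIN {
    printf "degrees";
    for (p = 0; p <= 340; p += 20) printf " %d", p;
    printf "\n";
    for (t = 0; t <= 90; t += 10) {
        printf "%d", t;
        for (p = 0; p <= 340; p += 20)
            printf " %.3f", cos(t*3.14159265/180);
        printf "\n";
    }
}' > sens.dat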

It is not clear to me why you are using rsensor in this case. What do you mean by your light sources not obeying the cosine law? From which perspective, that of the lights or that of the measurement point?

Hi Greg,

I set up this sens.dat to obey the cosine law in order to figure out the syntax of rsensor, since that way I could compare its results with those from rtrace. My real sensitivity distribution doesn't exactly obey the cosine law.

And in the case above, I set the position and orientation of the sensor to be the same as those of the measurement point. Actually, I am also somewhat confused about the view file (view.vf) of rsensor.

I tried to correct the sens.dat according to your guidance, and the result was

5.4528e+01 5.4528e+01 5.4528e+01.

This is a little closer, but it still differs from the rtrace result.

Assuming the sensor distribution is correct, try increasing the calculation parameters in both your rtrace and rsensor runs, e.g.:

-ad 4096 -as 1024 -lw 1e-5 -aa .05
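That is, applied to your earlier commands, something like this (assuming rsensor accepts the same rendering options here):

echo '0 0 0 0 0 1' | rtrace -I -ab 1 -ad 4096 -as 1024 -lw 1e-5 -aa .05 sky.oct
rsensor -h -ab 1 -ad 4096 -as 1024 -lw 1e-5 -aa .05 -vp 0 0 0 -vd 0 0 1 sens.dat sky.oct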

Hi Greg,

I tried setting the parameters to the 'high quality' values you suggested. However, the difference between the irradiance from rtrace and from rsensor remained relatively stable (around 3%).
I guessed it was a calculation error caused by interpolation of the sensitivity distribution, so I tried to refine my sens.dat as:
degrees 0 10 20 30 40 50 60 … 330 340 350 360
0 1 1 1 1 1 1 1 … 1 1 1 1
10 .985 .985 .985 .985 .985 .985 .985 … .985 .985 .985 .985
20 .940 .940 .940 .940 .940 .940 .940 … .940 .940 .940 .940
(rows for theta = 30 through 80, values elided)
90 0 0 0 0 0 0 0 … 0 0 0 0
Running this gave 'fatal - maximum theta must be positive in sensor file'.
Do you know where I can learn the rules for specifying the angles in the sensitivity distribution?

I learned some of Radiance's calculation methods from your book. I am wondering whether the irradiance calculation in rsensor is the same as in 'rtrace -I' apart from the sensitivity distribution. Specifically, is the contribution from a given direction the same for rsensor and 'rtrace -I', with the only difference being that rtrace weights it by the cosine while rsensor weights it by the custom sensitivity distribution?
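In other words, my understanding (which may well be wrong, and I am not sure whether rsensor applies any normalization to the sensitivity values) is roughly:

E_rtrace  = \int_{2\pi} L(\theta,\phi)\,\cos\theta \, d\omega
E_rsensor = \int_{2\pi} L(\theta,\phi)\, s(\theta,\phi) \, d\omega

where s(\theta,\phi) is the sensitivity given in sens.dat.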

The latest version of rsensor does not produce this error. You should make sure you have the latest Radiance download, which is 5.2 (or 5.3a if you are running from the HEAD). What does 'rtrace -version' say?

Hi Greg,

I tried them under Radiance 5.2 and the error no longer appears. However, the deviation between the irradiance values calculated by rtrace and rsensor still exists (around 3%). I am not sure what causes this deviation, or whether it is reasonable.

If this deviation is a systematic error that we cannot remove, which program do you think is more accurate, rtrace or rsensor?

My actual sensitivity distribution is close to the cosine law, so the relative accuracy of rtrace and rsensor is quite important for my simulation.

You could also try increasing the polar angle resolution of your sensitivity file, since interpolation with 10° theta increments may be introducing a slight bias into the calculation. You can reduce the number of azimuth values to 90° increments if you like, but try 5° increments for theta.
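With the same sort of awk loop as before, that would just mean changing the two step sizes, e.g. (untested):

# Rough sketch: cosine-law sensor file with 5-degree theta rows and 90-degree azimuth columns.
awk 'BEGIN {
    printf "degrees";
    for (p = 0; p <= 270; p += 90) printf " %d", p;
    printf "\n";
    for (t = 0; t <= 90; t += 5) {
        printf "%d", t;
        for (p = 0; p <= 270; p += 90)
            printf " %.3f", cos(t*3.14159265/180);
        printf "\n";
    }
}' > sens.dat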

Hi Greg,

Yes, you are right. The polar angle resolution is indeed more important for my sensitivity distribution. After several trials, the combination of a 1° increment for theta and 10° increments for phi gives the same irradiance value from the two programs. However, I found that this does not hold for all scenes; for some relatively complicated scenes, the differences (around 3%) still exist.

Another problem I found is that the output can differ between runs of the same command. Is this due to the Monte Carlo algorithm, and is a difference of 2% reasonable? If I want a more accurate value, should I run the same command several times and calculate the mean?
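For example, something like this (untested) is what I have in mind, averaging each output column with 'total -m':

# Run the same calculation several times and take the mean of each column.
for i in 1 2 3 4 5 6 7 8 9 10
do
    echo '0 0 0 0 0 1' | rtrace -I -ab 1 -ad 4096 -as 1024 -lw 1e-5 -aa .05 sky.oct
done | total -m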

Yes, Monte Carlo sampling probably accounts for the differences in complex scenes. You can keep increasing the -ad parameter until you are happy with the agreement, but 2% is already very good by our standards.