Radiance animation questions

Hello group,

I am attempting my first full blown Radiance animation and have several related questions. This link has a rough version of the animation, done with -ab 0.

http://www.archenergy.com/downloads/pub/images/Radiance/walkthrough_init3.mov

and yes, I seem to trip up the curb, have several neck spasms, and stare at a wall at one point... all have since been fixed, but this leads to my first question.

1.) What programs do people use to produce animation paths? I initially wanted to use rshow but couldn't get it working on our SuSE 8.2. I understand programs like Maya, 3D Studio VIZ, and form-Z(?) can generate these animation paths, but they are too much (expensive) for what I need. It would be nice to be able to just export my spline path from AutoCAD and then have the ability to adjust frames/sec, slow down portions of the path, and adjust the view direction throughout.

2.) There are portions of this path that I want to slow down, and I wanted to use pinterp to interpolate some extra frames. How can I generate a z-buffer file for a view without recreating the picture? I thought I could just run it at very low settings if nothing else. And yes, I am running ranimate using the Windows version (our Linux machines were filled up with CFD calcs at go time and I was forced to use our NT machines; btw, ranimate does not work on XP - error: "Windows socket operations not supported"), and it gives errors using INTERPOLATE and MBLUR, which would have created the zbf files automatically. Hence, I was planning on using pinterp afterwards to achieve the interpolation, but now I have no z-buffer files. Which leads to my next questions...

3.) Where is pinterp? I just installed the HEAD version on our Linux machines (yes, it's been a while since I last dabbled in the Linux release) and I can't find pinterp. It shows up in the man pages but not in the bin directory. BTW, pinterp does not seem to work in the Windows version either.

4.) So, we then tried to install the last official release, 3.5, and it gave us a bunch of errors. The error log is here. Help!

http://www.archenergy.com/downloads/pub/images/Radiance/installLog.txt

5.) Using an ambfile does not seem to be doing much for my animation times. I was expecting a big decrease in calc time the further I got into my animation, but at about 5/6 of the way through it has not sped up much. It has been building the ambient file the whole time, although it is only 7MB after roughly 1000 frames, which seems too small (a set of 10 panoramic views I've been running on a different machine already has an ambient file of 7MB). Here is what one of my ranfiles looks like:

DIRECTORY = anim/walkthrough_sect2
VIEWFILE = anim_path2.pts
START = 311
END = 622
OCTREE = octrees/bc.oct
RESOLUTION = 768
render = -av 2.5 2.5 2.5 -ar 46 -aa .1 -ad 4096 -as 2048 -ab 1 -af ambfiles/bc.amb
render = -dp 512 -ds .3 -dj 0 -dt 0.2 -dr 1 -dc .5
render = -lr 6 -lw .002 -sj .9 -st 0.01

6.) Any suggestions on the parameters used? I am no expert at optimizing parameters. These parameters give me a fairly good image with about a half-hour calc time (which was the speed needed to meet the deadline). But there are still a few splotches, the light from my electric lights is a little too crisp, and I am not too happy with the specular reflections. You can see the quality I am getting in one of the final images here:

http://www.archenergy.com/downloads/pub/images/Radiance/frame741.tif

7.) As can be seen, I have many curved surfaces in this model, most of which have been faceted. I would like to use a function to smooth the shading on these. Is there any way to do this given that the geometry is already built? Something like the Phong shading -s option in gensurf, or maybe the new mesh primitive? I apologize for my ignorance on this topic; I rarely require this much detail. I'm imagining a modifier or something I could give to the polygons I want to smooth over.

8.) I was initially mapping wood grain onto a lot of the interior wood surfaces, but given the varying distances of the wood I could never get it to look very good. When it looked good up close, it looked like particle board far away. And when it looked good far away (i.e. at a larger scale), it looked like a zebra up close. I would like to have the grain at the correct scale, but don't want it looking like particle board (I'm assuming this is just because there are a limited number of pixels to display this detail). What strategies have others used to get a good-looking pattern at varying distances? Also, I would like to map the textures onto these surfaces better, rather than using xgrain, ygrain, etc. How could this be done?

Sorry for the long e-mail, for so many questions clumped into one, and for the various tangents my questions may have taken.

Best Regards,
Zack

···

--
Zack Rogers
Staff Engineer
Architectural Energy Corporation
2540 Frontier Avenue, Suite 201
Boulder, CO 80301 USA

tel (303)444-4149 ext.235
fax (303)444-4304

I'm going to take a stab at one question - I've been using obj2mesh.

7.) As can be seen, I have many curved surfaces in this model most of which have been faceted. I would like to use a function to smooth the shading on these. Is there any way to do this given the geometry is already built? Something like the Phong shading -s option in gensurf or maybe with the new mesh primitive??? I apologize for my ignorance on this topic, I rarely require this much detail. I'm imagining a modifier or something I could give to the polygons I want to smooth over.

You can use obj2mesh to get smooth shading: isolate the curved surfaces, export them as .obj, run obj2mesh, and put the resulting .rtm into a .rad file. Here's a big honkin' long URL that explains efforts to get Phong shading with bitmap textures:

http://www.daiservices.btinternet.co.uk/Radiance/Radiance_v3_5_UNIX/Phong_shading_and_bitmap_textures/Phong_shading_and_bitmap_textures.htm

Near the bottom is an obj2mesh example, posted by John Graham of Digital Architectural Illustration Services:

The new OBJ2MESH utility is another way of importing a mesh object into
RADIANCE and rendering with both phong smoothing and UV mapping
co-ordinates. Ensure that RADIANCE 3.6A (patched V3.5) or greater is used.

obj2mesh -a Cyl01.mtl -n 15 Cyl01.obj Cyl01.rtm

Once the .RTM and corresponding .RAD files have been created, a material can
be specified to utilise both the UV mapping co-ordinates and smoothing data.
Below is a possible example of an appropriate material definition (note the
dot between the bitmap name and Lu and the necessary second modifier
Mat.Wood):

void colorpict Mat.WoodGrain
7 red green blue ashwood.pic . Lu Lv
0
0

Mat.WoodGrain plastic Mat.Wood
0
0
5 0.75 0.75 0.75 0 0

Mat.Wood mesh CYLIN
1 Cyl01.rtm
0
0

HTH

Rob F

Zack Rogers wrote:

i seem to trip up the curb, have several neck spasms, and stare
at a wall at one point....

You may also want to set the exposure to a fixed value - say, one
for exterior and one for interior shots.

2.) There are portions of this path that I want to slow down and I
wanted to use pinterp to interpolate some extra frames. How can I
generate a z-buffer file for a view without recreating the picture?
I thought I could just run it at very low settings if nothing else.

That's the easiest way I can think of.

btw: ranimate does not work on XP - error "Windows socket
operations not supported") and it is giving errors using INTERPOLATE and
MBLUR which would have created the zbf files automatically.

Ranimate is one of the really hard ones to port to non-POSIX
platforms, which is probably the reason why the folks doing it
for DR left it at "good enough".

3.) Where is pinterp? I just installed the HEAD version on our linux
machines (yes, its been a while since I last dabbled in the linux
release) and i can't find pinterp.

Should be there. Are you sure you installed the additional
support package before building? Because otherwise you're missing
out on the tifflib, which will cause other stuff in the same
directory not to get built (even if it doesn't actually use the
tifflib).

BTW - pinterp does not seem to work in the
windows version either.

Try to make sure that the file paths you feed to pinterp are as
short as possible (i.e. use relative paths). This is fixed in
HEAD, but many of the picture manipulation programs used to
suffer from buffer overflows when command lines got too long.

4.) So, we then tried to install the last official release 3.5

Use your HEAD! :wink:

5.) Using an ambfile does not seem to be doing much for my animation
times?

I'm not sure about that, but the ambient file may have limited
savings effects with just one ambient bounce. You're also moving
around a lot, so that most of your previous ambient values will
be irrelevant for the part of the scene you're looking at a few
frames later.

6.) Any suggestions on the parameters used?

You'll have to bribe Jack to reveal some of his secrets... :wink:

7.) As can be seen, I have many curved surfaces in this model most of
which have been faceted. I would like to use a function to smooth the
shading on these. Is there any way to do this given the geometry is
already built?

Some future versions of Radout and dxf2rad may do this (with the
new mesh primitive), but it's not an urgent issue for me as long
as Rayfront still uses Radiance 3.1/3.2.

8.) I was initially mapping wood grain onto a lot of the interior wood
surfaces but given the varying distances of the wood I could never get
it to look very good. When it looked good up close, it looks like
particle board far away.

You'd need to do oversized renderings and filter them down to
avoid the particle look in distant patterns. On the other hand,
there's no real point in mapping a wood grain pattern to surfaces
that you never look at from up close.

  Also, I would like to map the
textures better onto these surfaces rather than using xgrain,
ygrain...etc. How could this be done?

There's an "arbmap.cal" included with Rayfront for that (and it
should also be in the archives here somewhere for those who don't
have that).

-schorsch

···

--
Georg Mischler -- simulations developer -- schorsch at schorsch com
+schorsch.com+ -- lighting design tools -- http://www.schorsch.com/

Hi Zack,

Nice animation, even with the path problems. This is quite a model. I'll address your questions inline...

From: Zack Rogers <[email protected]>
Date: December 16, 2003 4:48:18 PM PST

Hello group,

I am attempting my first full blown Radiance animation and have several related questions. This link has a rough version of the animation, done with -ab 0.

http://www.archenergy.com/downloads/pub/images/Radiance/walkthrough_init3.mov

and yes, I seem to trip up the curb, have several neck spasms, and stare at a wall at one point... all have since been fixed, but this leads to my first question.

1.) What programs do people use to produce animation paths? I initially wanted to use rshow but couldn't get it working on our SuSE 8.2. I understand programs like Maya, 3D Studio VIZ, and form-Z(?) can generate these animation paths, but they are too much (expensive) for what I need. It would be nice to be able to just export my spline path from AutoCAD and then have the ability to adjust frames/sec, slow down portions of the path, and adjust the view direction throughout.

I posted something to the list about a year ago on generating animation paths, which I've attached to this message. Perhaps this can help.

2.) There are portions of this path that I want to slow down and I wanted to use pinterp to interpolate some extra frames. How can I generate a z-buffer file for a view without recreating the picture? I thought I could just run it at very low settings if nothing else. And yes, I am running ranimate using the windows version (our linux machines were filled up with CFD calcs at go time and I was forced to use our NT machines - btw: ranimate does not work on XP - error "Windows socket operations not supported") and it is giving errors using INTERPOLATE and MBLUR which would have created the zbf files automatically. Hence, I was planning on using pinterp afterwards to achieve interpolation but now I have no z-buffer files. which leads to my next questions....

You can take out all your light sources, and this should give you a fast rendering of the z-buffer if that's all you're after. With one source (the sun) and no indirect, there shouldn't be much difference from rendering all over, though. (Also set -st 1 to avoid specular sampling -- then it's just like ray casting, really.)
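As a concrete sketch of the approach above: render only a cheap depth pass at the same resolution with rpict's -z option, then hand pairs of pictures and z-buffers to pinterp. The frame and view file names below are assumptions for illustration; the .pic files are the ones your ranimate run already produced.

```shell
# Cheap depth-only passes: -ab 0, -st 1 (no specular sampling), light
# sources removed from the octree if possible; the picture output is
# discarded, only the -z file matters.  File names are hypothetical.
rpict -vf frame0100.vf -x 768 -y 768 -ab 0 -st 1 -ps 16 \
      -z frame0100.zbf octrees/bc.oct > /dev/null
rpict -vf frame0101.vf -x 768 -y 768 -ab 0 -st 1 -ps 16 \
      -z frame0101.zbf octrees/bc.oct > /dev/null

# Interpolate an in-between view from the two rendered frames:
pinterp -vf between.vf -x 768 -y 768 \
        frame0100.pic frame0100.zbf frame0101.pic frame0101.zbf \
        > between.pic
```

The z-buffers must be rendered at the same view and resolution as the pictures they accompany, or pinterp's reprojection will be off.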

3.) Where is pinterp? I just installed the HEAD version on our linux machines (yes, its been a while since I last dabbled in the linux release) and i can't find pinterp. It shows up in the man pages but not in the bin directory. BTW - pinterp does not seem to work in the windows version either.

Beats me. It should compile OK -- maybe you had a problem with ra_tiff or something and the files in src/px weren't installed as a result. Go into src/px and run "rmake -k install" manually to see what happens.

4.) So, we then tried to install the last official release 3.5 and it gave us a bunch of errors. The error log is here. Help!

http://www.archenergy.com/downloads/pub/images/Radiance/installLog.txt

There's no fixing the 3.5 build on Linux at this point. You have to work from the binaries or the HEAD -- sorry!

5.) Using an ambfile does not seem to be doing much for my animation times? I was expecting a big decrease in calc time the further i got into my animation but at about 5/6 through it has not sped up much. It has been building the ambient file the whole time, although it is only 7MB after roughly 1000 frames, which seems too small (10 panoramic views I've been running on a different machine already has an ambient file of 7MB). Here is what one of my ranfiles looks like:

DIRECTORY = anim/walkthrough_sect2
VIEWFILE = anim_path2.pts
START = 311
END = 622
OCTREE = octrees/bc.oct
RESOLUTION = 768
render = -av 2.5 2.5 2.5 -ar 46 -aa .1 -ad 4096 -as 2048 -ab 1 -af ambfiles/bc.amb
render = -dp 512 -ds .3 -dj 0 -dt 0.2 -dr 1 -dc .5
render = -lr 6 -lw .002 -sj .9 -st 0.01

The time reduction after the first frame isn't noticeable because it doesn't progress much from there -- each new frame only adds incrementally to the ambient file. You are still much better off than not using one (MUCH), but don't expect your rendering times to continue decreasing as the animation progresses.

6.) Any suggestions on the parameters used? I am no expert on optimizing parameters. These parameters give me a fairly good image with about 1/2 hour calc time (which was the speed needed to meet deadline). But there are still a few splotches, the light from my electric lights is a little too crisp, and I am not too happy with the specular reflections. You can see the quality I am getting in one of the final images here:

http://www.archenergy.com/downloads/pub/images/Radiance/frame741.tif

Looks pretty good to my eyes. You could improve rendering times by removing some of the fussy details from the ambient calculation with an ambient exclude file (the -aE mat_excl.txt option). I would remove the window frames in particular. You might also increase your -ar value, as 46 seems very low for a model of this size, especially with all the exterior geometry; I think you're losing resolution in the interreflection calculation as a result. Divide the octree cube size (getinfo -d octree) by the length of your arm in your world coordinates, and that should give you a reasonable number.
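Greg's rule of thumb above can be turned into quick arithmetic. Both numbers below are assumptions for illustration: read the real cube size from `getinfo -d octrees/bc.oct`, and pick your own detail size.

```shell
# Hedged arithmetic for the -ar rule of thumb: octree cube size divided
# by the smallest detail you care about resolving.
cube=128       # octree cube edge length in world units (assumed)
detail=0.75    # ~ an arm's length in the model's units (assumed)
awk -v s="$cube" -v d="$detail" 'BEGIN { printf "-ar %d\n", s/d }'
```

With these made-up numbers that suggests a value well above the 46 in the ranfile.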

7.) As can be seen, I have many curved surfaces in this model most of which have been faceted. I would like to use a function to smooth the shading on these. Is there any way to do this given the geometry is already built? Something like the Phong shading -s option in gensurf or maybe with the new mesh primitive??? I apologize for my ignorance on this topic, I rarely require this much detail. I'm imagining a modifier or something I could give to the polygons I want to smooth over.

gensurf will take an ordered list of vertices on a grid and interpolate normals, but without simple connectivity, there's no general way to do this for an arbitrary mesh. Obj2mesh won't do it either -- normals and (u,v) need to be specified on the input if you want them in your output. You can pass the output of the new version of gensurf with the -o option to obj2mesh, however.

8.) I was initially mapping wood grain onto a lot of the interior wood surfaces but given the varying distances of the wood I could never get it to look very good. When it looked good up close, it looks like particle board far away. And when it looks good far away (ie larger scale) it looks like a zebra up close. I would like to have the grain at the correct scale, but don't want it looking like particle board (i'm assuming this is just cause there are limited number of pixels to display this detail). What strategies have others used to get a good looking pattern at varying distances? Also, I would like to map the textures better onto these surfaces rather than using xgrain, ygrain...etc. How could this be done?

Regarding sampling errors, you can use the fade(near_val,far_val,T) function defined in rayinit.cal to use an average value at a distance. Otherwise, you have to rely on oversampling to clear up these problems, which we know is expensive.
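For instance, a distance-faded brightness pattern might look like the sketch below. This is only a hedged illustration: the grain formula, the average value (.6), and the file/material names are all made up, and it assumes fade() takes (near_val, far_val, T) as Greg describes.

```
{ distwood.cal -- hedged sketch: blend a detailed grain value toward
  its average brightness as ray distance T grows, using fade() from
  rayinit.cal.  The grain formula and the average (.6) are stand-ins. }
grain = .6 + .2 * sin(40*Px);
woodval = fade(grain, .6, T);
```

which might then drive a pattern on the wood material:

```
void brightfunc dist_wood
2 woodval distwood.cal
0
0

dist_wood plastic wood_mat
0
0
5 .55 .35 .2 0 0
```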

Regarding arbitrary mapping onto surfaces, I don't have a good solution for woodgrain, which is a solid texture. You sort of have to decide what the general orientation of your object is, or it's not going to look right. The only other option is to get (u,v) coordinates into a .OBJ model and export through obj2mesh.

Sorry for the long e-mail, for so many questions clumped into one, and for the various tangents my questions may have taken.

Best Regards,
Zack

---- From radiance-general archives ----

···

Date: Wed, 27 Nov 2002 10:04:58 -0800
From: Greg Ward <[email protected]>
To: [email protected]
Subject: Re: how to specify the animation path

Hi Gurneet,

There really needs to be a FAQ on generating animation paths. Peter
Apian-Bennewitz wrote a program called rshow that may be used to
interactively walk through a space using OpenGL rendering, and has
options for storing keyframes and generating view sequences, which may
then be rendered in batch mode or using ranimate.

I often choose keyframes myself in rview (or the new rholo program),
utilizing the "view" or 'V' commands to output views at desired points
along the animation path. These may then be constructed into a path
using a spline generator or the spline.cal program that may be found in
ray/src/cal/cal in the radiance distribution. You take the views
output by rview or rholo, which look like so (these are not commands --
they are views stored by rview into a file):

rview -vtv -vp 5.46 15.5 4.2 -vd 0.91674 0.347628 -0.196833 -vu 0 0 1 -vh 60 -vv 49 -vs 0 -vl 0 -t 1
rview -vtv -vp 10.5 15.6 4.2 -vd 0.9896 -0.1408 -0.0294619 -vu 0 0 1 -vh 60 -vv 49 -vs 0 -vl 0 -t 2
rview -vtv -vp 20.4 13.7 4.2 -vd 0.680414 -0.680414 -0.272166 -vu 0 0 1 -vh 60 -vv 49 -vs 0 -vl 0 -t 4

The -t options at the end were added manually, indicating the expected
time (in seconds) between the previous frame and this one. These
options may be added within rview by appending them to the view
command, like so:

: view key.vf -t 2

Once you have the above file (key.vf), you can create a format file for
rcalc to extract the values you are interested in, and pass these to
the tabfunc program to get them into the form you need for spline.cal.
Create a file called "key.fmt" that contains the following single line:

rview -vtv -vp ${px} ${py} ${pz} -vd ${dx} ${dy} ${dz} -vu 0 0 1 -vh ${vh} -vv ${vv} -vs 0 -vl 0 -t ${t}

Then, run the following command to extract the desired values from your
keyframe file and put them into a form that may be passed to rcalc as a
.cal file:

% rcalc -i key.fmt -e '$1=recno;$2=px;$3=py;$4=pz;$5=dx;$6=dy;$7=dz;$8=vh;$9=vv;$10=t' \
  key.vf | tabfunc Px Py Pz Dx Dy Dz H V T > key.cal

This new file, "key.cal", may then be used with rcalc to generate a set
of desired inbetween views, spaced evenly in time. If your total sum
of times is 30 seconds, for example, and you want to render 10
frames/sec, you might use the following command to generate the
individual frame views for ranimate:

% cnt 300 | rcalc -o key.fmt -f key.cal -f spline.cal -e 't=$1/10' \
  -e 'px=s(Px);py=s(Py);pz=s(Pz);dx=s(Dx);dy=s(Dy);dz=s(Dz);vh=s(H);vv=s(V)' > anim.vf

In this command, I have used the same "key.fmt" file to generate the
animation views, but you may want to produce something slightly
different, which has only the changing view point and direction, for
example. The above sequence allows you to vary the view position,
direction, and zoom, but does not permit the up vector to change. This
is rarely needed, but if you do have a tilting camera, you can always
modify the commands to include this information.

I hope this is enough to get you started. Another excellent place to
look for tips is Peter's chapter on animation in "Rendering with
Radiance."

-Greg

Zack Rogers wrote:

Hello group,

I am attempting my first full blown Radiance animation and have several related questions. This link has a rough version of the animation, done with -ab 0.

http://www.archenergy.com/downloads/pub/images/Radiance/walkthrough_init3.mov

and yes, I seem to trip up the curb, have several neck spasms, and stare at a wall at one point... all have since been fixed, but this leads to my first question.

1.) What programs do people use to produce animation paths? I initially wanted to use rshow but couldn't get it working on our SuSE 8.2. I understand programs like Maya, 3D Studio VIZ, and form-Z(?) can generate these animation paths, but they are too much (expensive) for what I need. It would be nice to be able to just export my spline path from AutoCAD and then have the ability to adjust frames/sec, slow down portions of the path, and adjust the view direction throughout.

rshow is high on my todo list, but I can't promise any deadlines - I'm up to my neck in other things. The OpenGL version doesn't offer animation paths yet (apart from not having been ported to R3.5 - shame!), but it will (promises...).

2.) There are portions of this path that I want to slow down and I wanted to use pinterp to interpolate some extra frames. How can I generate a z-buffer file for a view without recreating the picture? I thought I could just run it at very low settings if nothing else. And yes, I am running ranimate using the windows version (our linux machines were filled up with CFD calcs at go time and I was forced to use our NT machines - btw: ranimate does not work on XP - error "Windows socket operations not supported") and it is giving errors using INTERPOLATE and MBLUR which would have created the zbf files automatically. Hence, I was planning on using pinterp afterwards to achieve interpolation but now I have no z-buffer files. which leads to my next questions....

pinterp is great, but fails at reflections and refractions (which is probably why you thought of using it piecewise anyway).

3.) Where is pinterp? I just installed the HEAD version on our linux machines (yes, its been a while since I last dabbled in the linux release) and i can't find pinterp. It shows up in the man pages but not in the bin directory. BTW - pinterp does not seem to work in the windows version either.

4.) So, we then tried to install the last official release 3.5 and it gave us a bunch of errors. The error log is here. Help!

http://www.archenergy.com/downloads/pub/images/Radiance/installLog.txt

5.) Using an ambfile does not seem to be doing much for my animation times? I was expecting a big decrease in calc time the further i got into my animation but at about 5/6 through it has not sped up much. It has been building the ambient file the whole time, although it is only 7MB after roughly 1000 frames, which seems too small (10 panoramic views I've been running on a different machine already has an ambient file of 7MB). Here is what one of my ranfiles looks like:

For the long ISE outdoor animation, I had to pre-run along the path multiple times at low res to fill up the ambient file. Be sure that it is fairly "stable" and has enough ambient points; otherwise it "discovers" more light during the main animation, leading to brightness jumps or flicker (depending on the sequence in which the image frames are generated).
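One hedged way to script such a pre-run (the ranfile names and the idea of rewriting it with sed are my assumptions, not a documented workflow) is to reuse the same octree and ambient file at a throwaway resolution and directory:

```shell
# Pre-pass: same octree and ambient file, low resolution, separate
# frame directory, so the shared ambient file fills in cheaply first.
sed -e 's|^RESOLUTION = .*|RESOLUTION = 64|' \
    -e 's|^DIRECTORY = .*|DIRECTORY = anim/prepass|' \
    walkthrough.ran > prepass.ran
ranimate prepass.ran      # run once or twice until bc.amb stabilizes
ranimate walkthrough.ran  # production frames reuse the ambient values
```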

....

6.) Any suggestions on the parameters used? I am no expert on optimizing parameters.

Which you probably will be, after you finished the animation on time ....

...

7.) As can be seen, I have many curved surfaces in this model most of which have been faceted. I would like to use a function to smooth the shading on these.

Nope - the info is lost. Theoretically one could fit a spline surface to the polygons and approximate the surface normals from it, but the better solution is to export the surface normals from the CAD program too. Some CAD programs export VRML format, which may contain normals.

...

8.) I was initially mapping wood grain onto a lot of the interior wood surfaces but given the varying distances of the wood I could never get it to look very good.

RenderMan (and probably others) uses different textures/patterns depending on distance (more exactly, on the solid angle the pixel covers). This angle is not supplied in the function files (at least it wasn't when I discussed this with Greg two years ago), so view-dependent textures are a bit tricky to do. One reason for not making it available was that it is known in rvu/rpict, but not in rtrace. If we agree (on radiance-dev) that it would be useful, I'd be happy to see this variable added.

  When it looked good up close, it looks like particle board far away.

Your spatial resolution has to be very high to avoid aliasing.

...

cheers
Peter

···

--
pab-opto, Freiburg, Germany, www.pab-opto.de

Hi Rob,

Thanks for the input! I had seen that webpage before, I guess I was hoping I didn't have to export to obj or 3ds from my CAD file, although that really shouldn't be too difficult.

So, I took one of the objects I wanted to smooth and exported it as a 3ds file. I use AutoCAD 2000 and could not see any way to export an obj file. Anyway, my 3ds seemed to generate fine, and so I ran:

3ds2mgf entry_roof.3ds entry_roof.mgf -s40 -om

and got:

3ds2mgf: unexpected EOF

It seems like an end-of-file error would have to do with a Linux vs. Windows problem, but on the webpage John mentioned he also used the DR version of 3ds2mgf. Any ideas what's causing this error?
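One hedged guess worth ruling out: if the binary .3ds file crossed the Windows/Linux boundary via an ASCII-mode FTP transfer or some other CRLF-converting copy, its chunk lengths no longer match the file contents, and the reader hits end-of-file early. Comparing byte counts on the two machines is a quick check:

```shell
# If the sizes differ between the Windows and Linux copies, the file
# was corrupted in transit; re-transfer it in binary mode.
wc -c entry_roof.3ds
```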

I would like to try the obj2mesh approach as well, but am not sure how to get an obj file from my AutoCAD 3D model.

Now, stepping back just so I understand: I'm guessing you need to use obj2mesh or 3ds2mgf because these will generate the A2 through A10 parameters required by tmesh.cal. That webpage gives this example of .rad data created by 3ds2mgf:

WOOD texfunc T-nor
4 dx dy dz tmesh.cal
0
10 0
-0.00163623 0.00042471 1.21556195
0.00738964 -0.00120012 -0.94298270
0.00000000 0.00000000 -0.00000000

where A2 through A10 is the surface normal perturbation matrix - I have no clue what this is or how to generate it. Couldn't I also just apply this modifier to the polygons I want to smooth, if I could create the necessary tmesh.cal matrix? Do the polygons have to be triangles, or could I just apply this as a modifier in my .rad file created with radout (assuming I have a correct matrix)?

....an hour later..... Well, I just saw Peter's and Greg's responses. I'm guessing that I cannot just apply it to the polygons in my radout .rad file because there are no normals... but can't the normal be determined fairly easily as the cross product of two sides of the polygon? Do you really need to define each surface normal when the surface vertices are defined? Is it just a more expensive calculation?

I hope my confusion makes sense.
Regards,
Zack

···

--
Zack Rogers
Staff Engineer
Architectural Energy Corporation
2540 Frontier Avenue, Suite 201
Boulder, CO 80301 USA

tel (303)444-4149 ext.235
fax (303)444-4304

Peter Apian-Bennewitz wrote:

RenderMan (and probably others) use different textures/patterns depending on distance (more exactly on the solid angle the pixel covers). This angle is not supplied in the function files (at least it wasn't when I discussed that with Greg two years ago), so view dependent textures are a bit tricky to do. One reason for not making it available was that it is known in rvu/rpict, but not rtrace. If we agree (on radiance-dev) it would be useful, I'd be happy to see this variable.

Actually, a solid angle in the context of rtrace would be useful in general. In situations where you don't need this info, you can simply set it to zero, denoting an infinitesimal solid angle.

···

--
Roland Schregle
PhD candidate, Fraunhofer Institute for Solar Energy Systems
RADIANCE Photon Map page: www.ise.fhg.de/radiance/photon-map

END OF LINE. (MCP)

Zack Rogers wrote:

[about inter-polygon smoothing]
I'm guessing that I can not just apply it to my polygons in my radout rad
file cause there are no normals...but can't the normal be determined
fairly easily as the cross product of this sides of the polygon? Do you
really need to define each surface normal when the surface vertices are
defined? Is it just a more expensive calculation?

Unfortunately, a typical rad file is just a heap of unrelated
triangles. Or worse, other types of polygons, which you then need
to split into triangles first.

If the normal vectors aren't supplied, then you'll have to try
to reconstruct the topology of the surface. Basically, you run
through the list of triangles multiple times, searching for all
the ones that share vertices, the assumption being that the normal
at each vertex will be the average of those of all adjoining
triangles. As long as the geometry is topologically simple (no
non-manifold connections or intersections, etc.), this can be done,
but it isn't very efficient, and may produce unexpected results in
some cases.
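The averaging Georg describes can be sketched in a few lines of awk. The input format (one triangle per line, nine coordinates) is an assumption for illustration; a real tool would have to parse the .rad or .obj geometry, and this toy version ignores the threshold-angle problem entirely.

```shell
# Hedged sketch of vertex-normal averaging: compute each triangle's face
# normal (cross product of two edges), accumulate it at each of the
# triangle's vertices, then normalize the per-vertex sums.  The two test
# triangles here are coplanar, so every vertex normal comes out (0,0,1).
printf '0 0 0 1 0 0 0 1 0\n1 0 0 1 1 0 0 1 0\n' | awk '
{
  ax=$4-$1; ay=$5-$2; az=$6-$3              # edge v1 -> v2
  bx=$7-$1; by=$8-$2; bz=$9-$3              # edge v1 -> v3
  nx=ay*bz-az*by; ny=az*bx-ax*bz; nz=ax*by-ay*bx   # face normal
  for (v=0; v<3; v++) {                     # accumulate at each vertex
    key=$(3*v+1)" "$(3*v+2)" "$(3*v+3)
    sx[key]+=nx; sy[key]+=ny; sz[key]+=nz
  }
}
END {
  for (k in sx) {                           # normalize per-vertex sums
    l=sqrt(sx[k]^2+sy[k]^2+sz[k]^2)
    printf "%s -> %g %g %g\n", k, sx[k]/l, sy[k]/l, sz[k]/l
  }
}' | sort
```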

If your surface has any discontinuities (e.g. edges: triangles
connecting at an angle that shouldn't be smoothed away), then
you need to start guessing threshold angles, which is at best
unreliable. If the normals are supplied, then you can just smooth
each triangle individually and not worry about connections,
because you *know* the result will be correct in any case.

In your animation, the most obvious places where I noticed any
lack of smoothness in supposedly curved surfaces were the
handrails, and possibly the roof of the entrance pavilion. Those
effects *won't* go away with normal vector interpolation, because
it's not primarily the shading that gives them away, but the
segmented outlines. You'll have to use shorter segments for that
to improve. For some entity types (extruded arcs and arc segments
in polylines), you can reduce the angle and distance tolerances
in Radout to improve their appearance.

-schorsch

···

--
Georg Mischler -- simulations developer -- schorsch at schorsch com
+schorsch.com+ -- lighting design tools -- http://www.schorsch.com/

Hi Zack,

Nice animation. Having read the replies, I find that most of the salient
points have already been covered, but I would like to add that this is
why I use Maya for animation path generation. Since you can see the
graphs for the camera movement, these can be tweaked to give a smooth
flow from point to point, and by adjusting the gradients of the graphs
the accelerations of the camera can be controlled.

I am not sure whether it is possible to import obj or other file types
into the "Personal Learning Edition" of Maya, but if so, this version
is a free download and could be used to set these paths visually, which
might help.

Rich

···

____________________________

Richard Gillibrand

Department of Computer Science,

University of Bristol,

Tel: 0117 9545256

____________________________

There's no way to get a solid angle associated with a sample in rtrace, because there is no image plane. That's what Peter was saying. A value of zero would have to be given, which would have to be treated specially by the handling .cal file.

Except for perspective view rays, we'd probably always be giving a solid angle of zero. The vast majority of rays in a rendering are secondary rays, which have reflected one or more times from various surfaces, making the solid angle problematic, if not impossible, to track. What is the solid angle of an ambient ray? A source ray? You see what I'm driving at. (Parallel views also cannot have a solid angle associated with their rays.)

Even if we made the solid angle available under the rare circumstances where it made sense and we knew it, what would you do with it? It's not as though you're going to resample your texture on the fly in a .cal file. When I first considered this problem (many years ago), I decided that ray distance from the origin was as good a measure as any for driving these sorts of approximations, which is why you have access to T. If you've tried it and it really can't work for you, then it's time to start looking at alternatives.
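To make the T-based approach concrete, here is a hypothetical sketch (in Python rather than the .cal language, and with made-up near/far limits) of the kind of distance-driven fade one might express in a function file: blend a detail pattern toward a flat average color as the ray distance grows.

```python
def fade_mix(t, near=1.0, far=20.0):
    """Blend factor for a detail pattern based on ray distance t.

    Returns 1.0 (full pattern) at or below `near`, 0.0 (flat color)
    at or beyond `far`, and a linear ramp in between -- the sort of
    rule one might write with T in a Radiance .cal file. The limits
    here are illustrative, not recommended values.
    """
    if t <= near:
        return 1.0
    if t >= far:
        return 0.0
    return (far - t) / (far - near)


def shaded_value(pattern_val, avg_val, t):
    """Blend a pattern value toward the average surface value
    according to the distance-based fade factor."""
    w = fade_mix(t)
    return w * pattern_val + (1.0 - w) * avg_val
```

At t halfway between near and far, the pattern contributes half its contrast; beyond `far`, surfaces render with the flat average color, which also sidesteps the "particle look" of undersampled distant patterns.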

-Greg

···

From: Roland Schregle <[email protected]>
Date: December 17, 2003 4:58:03 AM PST

Peter Apian-Bennewitz wrote:

RenderMan (and probably others) use different textures/patterns depending on distance (more exactly, on the solid angle the pixel covers). This angle is not supplied in the function files (at least it wasn't when I discussed this with Greg two years ago), so view-dependent textures are a bit tricky to do. One reason for not making it available was that it is known in rvu/rpict, but not rtrace. If we agree (on radiance-dev) that it would be useful, I'd be happy to see this variable.

Actually, a solid angle in the context of rtrace would be useful in general. In situations where you don't need this info, you can simply set it to zero, denoting an infinitesimal solid angle.

Georg's response

You may also want to set the exposure to a fixed value, or rather
one for exterior and one for interior shots.

Yeah, I need to work on how I set the exposure or human sensitivity. I compiled a pre-final animation for my deadline today, and it's posted here.

http://www.archenergy.com/downloads/pub/images/Radiance/walkthrough_final.mov

My last frames just finished overnight (they had been running on four NT machines since Thursday; I greatly overestimated the savings an ambient file would add) and I had to compile them quickly this morning, so I just applied human sensitivity to the frames. As you can see, there are some erratic frames in the animation, and I am not sure why. Aside from the few frames that stick out, the rest flow pretty well. Some of the erratic frames are due to the instant eye adjustment that the human sensitivity function makes when bright objects come into or out of view.

One reason I used human sensitivity rather than just setting an exposure is that I have quite a range of brightness, especially in my lobby, and I want to be able to see the floor tiles in the direct sun patches rather than just a white patch, while keeping the darker corners visible. I'm all ears for other ways to approach this. It seems you would need a time-dependent human sensitivity adjustment for better accuracy, though that looks tough since it depends on frames/sec. Also, I don't like the way human sensitivity sometimes makes my red doors and mullions look almost black; I'm guessing this is due to its mesopic adjustments. I thought maybe I should just rerun pcond with -a+, -v+, and -s+ instead. Any other suggestions are certainly welcome. I offered to run a higher quality simulation over my Christmas break, so I will probably have one more pass at this animation.
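One possible route to a time-dependent adjustment, without modifying pcond: compute a per-frame exposure target first (e.g. derived from pextrem or phisto output), low-pass filter that sequence across frames, and apply the smoothed values with pfilt -e. The per-frame-target idea and the adaptation rate below are assumptions on my part, not an established recipe; this sketch only shows the filtering step:

```python
def smooth_exposures(exposures, alpha=0.1):
    """Exponentially smooth a per-frame exposure sequence.

    alpha is the adaptation rate per frame: small values mimic the
    eye's gradual adjustment, so a bright object entering the view
    shifts the exposure over several frames instead of instantly.
    Note that a sensible alpha depends on the frame rate.
    """
    out = []
    current = exposures[0]
    for e in exposures:
        current += alpha * (e - current)
        out.append(current)
    return out


# A bright frame in the middle only nudges the adapted exposure:
print(smooth_exposures([1.0, 1.0, 8.0, 1.0, 1.0], alpha=0.2))
```

The sudden 8x target moves the applied exposure only part of the way, then decays back, which should remove the single-frame brightness pops while still tracking slow changes like moving from exterior to interior.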

Should be there. Are you sure you installed the additional
support package before building? Because otherwise you're missing
out on the tifflib, which will cause other stuff in the same
directory not to get built (even if it doesn't actually use the
tifflib).

I did not pass the information on to our IT guy very well... we did not install the support package first. Thanks! We will just use the HEAD version.

BTW - pinterp does not seem to work in the
windows version either.
   

Try to make sure that the file paths you feed to pinterp are as
short as possible (ie. use relative paths). This is fixed in
HEAD, but many of the picture manipulation programs used to
suffer from buffer overflows when command lines got too long.

I tried this again with a very short path (z:\tmp\) and it still gave me the same error...or actually "no error" :-). The output is:
"read: no error"
Once I get our Linux Radiance up and running, I plan on using it for my next round of simulations anyway.

Some future versions of Radout and dxf2rad may do this (with the
new mesh primitive), but it's not an urgent issue for me as long
as Rayfront still uses Radiance 3.1/3.2.

This would be a great added feature!

You'd need to do oversized renderings and filter them down to
avoid the particle look in distant patterns. On the other hand,
there's no real point in mapping a wood grain pattern to surfaces
that you never look at from up close.

Yeah, I ended up taking off the wood grain on most surfaces except the handrails. The problem with an animation is that the distance to these surfaces is always changing. I am planning on exploring the fade function approach in rayinit.cal.

There's an "arbmap.cal" included with Rayfront for that (and it
should also be in the archives here somewhere for those who don't
have that).

I've tried to use arbmap.cal before and was having some problems - but that wouldn't be necessary for a solid texture anyway, correct? I can't remember the errors I was getting; I will try again and report back if I have any problems.

Greg's response

I posted something to the list about a year ago on generating animation paths, which I've attached to this message. Perhaps this can help.

Thanks Greg. I remember seeing this and had forgotten about the spline.cal function. I was hoping for something more visual, but this method would definitely be easier than the way I did it (essentially with an exported spline from CAD and a spreadsheet).

The time reduction after the first frame isn't noticeable as it doesn't progress much from there -- each new frame adds incrementally to the ambient file. You are still much better off than not using one (MUCH), but don't expect your rendering times to continue decreasing as the animation progresses.

I was thinking that since each frame is so similar (6 inches - 15 cm - apart), there would be a lot of ambient data that could be shared from one frame to the next. Is it because the incident angle to each surface changes enough that the previous ambient data cannot be used? I don't have too great a grasp of how the ambient file works, so I apologize.

Looks pretty good to my eyes. You could improve rendering times by removing some of the fussy details from the ambient calculation with an ambient exclude file (-aE mat_excl.txt option). I would remove the window frames in particular.

I shied away from using an ambient exclude file because I use the same material on many objects. I think for my next pass though I will take this advice, and either name the materials differently or use an alias.

You might also increase your -ar value, as 45 seems very low for a model of this size, especially with all the exterior geometry. I think you're losing resolution on the interreflection calculation as a result. Divide the octree cube size (getinfo -d octree) by the length of your arm in your world coordinates, and that should give you a reasonable number.

Oops, I had set my -ar value before I added in the extensive parking lot. Getinfo -d gives me this:

bc.oct: -214.5790355 -249.1505 -170.126913 352.68657

My model is in meters. I don't quite understand this. My xmin and ymin may be as stated, but I know my zmin is not much more than -1 meter. Am I misunderstanding this output? My arm is about a meter, so would a good -ar setting be 352/1 = 352? This would make my calculation time much longer. I've been setting this in Rayfront, where you input xsize, ysize, zsize, and the octree size (the longest of the three).

This isn't the cause of some of the splotches I am still seeing, is it? I have my -ad and -as parameters really high, but perhaps too high? Here is a panoramic animation I also created for this model that is even more splotchy:

http://www.archenergy.com/downloads/pub/images/Radiance/panoramic_walkthrough.exe

I had these running on some XP machines (since they couldn't handle ranimate) and with a separately generated ambient file. These were the settings from my .rif file:

QUALITY = medium
DETAIL = high
VARIABILITY = high
INDIRECT = 1
oconv = -r 4096
render = -ad 4096
render = -st 0.01
render = -dp 512
render = -av 2.5 2.5 2.5
render = -as 2048
render = -sj 0.1
render = -dt 0.1
render = -aa 0.08
render = -ar 60

There are some pretty significant splotches - I am not sure why these are so much worse than my walkthrough - both ambfiles were about 7MB by the end.

Peter's response

For the long ISE outside animation, I had to pre-run along the path multiple times at low res to fill up the ambient file. Be sure that it is fairly "stable" and has enough ambient points; otherwise it "discovers" more light during the main animation, leading to brightness jumps or flicker (depending on the sequence in which the image frames are generated).

Good suggestion. I will try this on my next round of simulations. I don't think this is why I had so many flickers in my final animation, though, is it? I thought my flicker was more due to pcond issues.

Georg's response 2

In your animation, the most obvious places where I noticed any
lack of smoothness in supposedly curved surfaces were the
handrails, and possibly the roof of the entrance pavilion. Those
effects *won't* go away with normal vector interpolation, because
it's not primarily the shading that gives them away, but the
segmented outlines. You'll have to make shorter segments for that
to improve. For some entity types (extruded arcs and arc segments
in polylines), you can reduce the angle and distance tolerances
in Radout to improve their appearance.

I almost did reduce the tolerances but wasn't sure what effect this would have on octree size, or more importantly, rendering times. In general, would Phong smoothing or an increased number of facets be more expensive?

Hey, thanks everyone for all the input. This group is such an essential Radiance resource! Sorry for such lengthy emails.

Zack

···

--
Zack Rogers
Staff Engineer
Architectural Energy Corporation
2540 Frontier Avenue, Suite 201
Boulder, CO 80301 USA

tel (303)444-4149 ext.235
fax (303)444-4304

Hi Zack,

I'll just respond to a couple of things that caught my eye...

From: Zack Rogers <[email protected]>
Date: December 17, 2003 3:07:35 PM PST

Greg's response

The time reduction after the first frame isn't noticeable as it doesn't
progress much from there -- each new frame adds incrementally to the
ambient file. You are still much better off than not using one (MUCH),
but don't expect your rendering times to continue decreasing as the
animation progresses.

I was thinking that since each frame is so similar (6 inches - 15 cm - apart), there would be a lot of ambient data that could be shared from one frame to the next. Is it because the incident angle to each surface changes enough that the previous ambient data cannot be used? I don't have too great a grasp of how the ambient file works, so I apologize.

The thing is, you are saving a lot of time over not having an ambient file and you are sharing a lot of values. It's just that the time doesn't continuously decrease -- it quickly drops to a small fraction of what it would have been and stays there as your frames progress. You have to think about it I guess, but once you are sharing 95% of the values from the previous frame, just calculating the new ones needed for whatever small movement you've made, what more can you save? You reach this level of sharing after one or two frames, and your rendering times stay constant from there on in a walkthrough. If you make a "jump cut" to another position or rapidly rotate the view, you might see a momentary slow-down where it has to add a lot of new ambient values, but after that your calculation times will level off again.

The biggest savings after ambient value sharing in a walkthrough will come from view interpolation using pinterp. Even if you decide you can't afford to space your rpict renderings any further apart, you can still add inbetween frames with pinterp and improve the smoothness of your results at little additional cost.
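As a sketch of the pinterp bookkeeping this implies: each in-between frame needs a target view plus the neighboring rendered pictures and their z-buffers (the pinterp man page also allows a single constant distance as the zspec where no z-buffer exists, though check your version). All file names and the per-frame .vf view files below are hypothetical.

```python
def pinterp_inbetweens(frame_a, frame_b, n_between, out_prefix):
    """Build pinterp command lines for frames interpolated between
    two rendered frames.  Each source picture is paired with its
    z-buffer, and the target view for each in-between frame is
    given via a -vf view file assumed to exist already."""
    cmds = []
    for i in range(1, n_between + 1):
        cmds.append(
            "pinterp -vf %s%04d.vf %s.hdr %s.zbf %s.hdr %s.zbf > %s%04d.hdr"
            % (out_prefix, i, frame_a, frame_a, frame_b, frame_b,
               out_prefix, i)
        )
    return cmds


# Four interpolated frames between two renderings:
for c in pinterp_inbetweens("frame0100", "frame0105", 4, "tween"):
    print(c)
```

Spacing the expensive rpict frames further apart and filling the gaps this way is usually far cheaper than rendering every frame.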

You might also increase your -ar value,
as 45 seems very low for a model of this size, especially with all the
exterior geometry. I think you're losing resolution on the
interreflection calculation as a result. Divide the octree cube size
(getinfo -d octree) by the length of your arm in your world
coordinates, and that should give you a reasonable number.
Oops, I had set my -ar value before I added in the extensive parking lot. Getinfo -d gives me this:

bc.oct: -214.5790355 -249.1505 -170.126913 352.68657

My model is in meters. I don't quite understand this. My xmin and ymin may be as stated, but I know my zmin is not much more than -1 meter. Am I misunderstanding this output? My arm is about a meter, so would a good -ar setting be 352/1 = 352? This would make my calculation time much longer. I've been setting this in Rayfront, where you input xsize, ysize, zsize, and the octree size (the longest of the three).

Getinfo reports the minimum x, y, and z coordinates plus the size of the bounding cube. An octree is always a cube, so one or two of the dimensions will always be smaller than the actual minimum -- oconv centers the cube around your geometry.

This isn't the cause of some of the splotches I am still seeing, is it? I have my -ad and -as parameters really high, but perhaps too high? Here is a panoramic animation I also created for this model that is even more splotchy:

http://www.archenergy.com/downloads/pub/images/Radiance/panoramic_walkthrough.exe

Can't look at an .exe file, I'm afraid. Increasing -ar to 352 is not a bad idea, though you may end up seeing a few more splotches. The too-low value you currently have is causing interreflected (ambient) values to be spread around too generously, giving an artificially smooth appearance to your rendered frames.
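The arithmetic Greg describes, as a quick check (the "arm length" of roughly a meter is Zack's own number from above):

```python
import math


def ambient_resolution(octree_size, detail_size):
    """Suggested -ar value: octree cube size divided by the smallest
    geometric detail of interest in the interreflection calculation."""
    return math.ceil(octree_size / detail_size)


# Cube size from `getinfo -d bc.oct` is 352.68657 (meters); the
# reported minima are corners of the bounding *cube*, which oconv
# centers on the geometry -- so e.g. the z axis spans
# -170.126913 .. -170.126913 + 352.68657, well beyond the model.
print(ambient_resolution(352.68657, 1.0))  # -> 353
```

This is why Zack's zmin looked wrong: the octree is always a cube, so one or two axes extend past the actual geometry.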

I had these running on some XP machines (since they couldn't handle ranimate) and with a separately generated ambient file. These were the settings from my .rif file:

QUALITY     = medium
DETAIL      = high
VARIABILITY = high
INDIRECT    = 1
oconv       = -r 4096
render      = -ad 4096
render      = -st 0.01
render      = -dp 512
render      = -av 2.5 2.5 2.5
render      = -as 2048
render      = -sj 0.1
render      = -dt 0.1
render      = -aa 0.08
render      = -ar 60

There are some pretty significant splotches - I am not sure why these are so much worse than my walkthrough - both ambfiles were about 7MB by the end.

I'd only be guessing without seeing your image, but a frequent cause of splotches in a model such as yours is solar reflections off specular surfaces, such as glass.

-Greg

Zack Rogers wrote:

>There's an "arbmap.cal" included with Rayfront for that (and it
>should also be in the archives here somewhere for those who don't
>have that).
>
I've tried to use arbmap.cal before and was having some problems - but
that wouldn't be necessary for a solid texture anyway, correct?

No, this would be in place of picture.cal. For a moment I didn't
remember that the woodgrain functions are fully procedural
solid patterns.

>I posted something to the list about a year ago on generating animation
>paths, which I've attached to this message. Perhaps this can help.

Thanks Greg. I remember seeing this and had forgotten about the
spline.cal function. I was hoping for something more visual, but this
method would definitely be easier than the way I did it (essentially with
an exported spline from CAD and a spreadsheet)

If you're exporting them from Autocad anyway, then you might
want to look at the spline fitting functions in tube.lsp:
  http://www.schorsch.com/download/tube/
It's not ideal for this application, as it automatically
selects the point spacing to optimize geometric deviation
against the number of generated points. In other words, it
will generate more points in narrow curves, and speed up
on the straight stretches.

This isn't the cause of some the splotches i am still seeing, is it? I
have my -ad and -as parameters really high, but perhaps too high??

If you want to go for speed and non-splotchy appearance, and
don't mind the resulting luminances to be inaccurate, then
you could reduce -ad and -as, while *increasing* -aa.

Those three need to be "balanced" to get nice renderings.
You can balance them for accuracy or for speed...

> For some entity types (extruded arcs and arc segments
>in polylines), you can reduce the angle and distance tolerances
>in Radout to improve their appearance.
>
I almost did reduce the tolerances but wasn't sure the effect this would
have on octree size, or more importantly, rendering times.

The amount of geometry usually has a surprisingly small effect
on rendering times, so I wouldn't be too shy here. If in doubt,
pick the layers where you expect the visual benefit to be the
highest, and only export those (separately) with lower tolerances.

In general,
would Phong smoothing or an increased number of facets be more expensive?

A texfunc applied to each polygon will most likely eat more
CPU cycles than a number of additional plain polygons.
Procedural modifiers are quite expensive.

-schorsch

···

--
Georg Mischler -- simulations developer -- schorsch at schorsch com
+schorsch.com+ -- lighting design tools -- http://www.schorsch.com/

Hi!

1.) What programs do people use to produce animation paths? I initially wanted to use rshow but couldn't get it working on our Suse 8.2. I understand programs like maya, studio vis, form-z??

I wrote down some key viewpoints and interpolated between them. There is some help on this topic in the archives - I didn't know how to do it either ;-) I don't know whether there is a way to convert from a vector drawing tool to view files; translating from eps or dxf to Radiance viewpoints might be possible, but you would still have to define both splines (position and direction). So interpolating keyframes is certainly not the worst approach.
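For what it's worth, a minimal keyframe interpolator along those lines could look like the following sketch (linear interpolation only; the coordinates are made up, and a spline fit such as spline.cal would give smoother motion):

```python
def interp_views(keyframes, steps):
    """Interpolate view points (vp) and directions (vd) between
    successive keyframes, emitting one Radiance view spec per frame.

    keyframes: list of (vp, vd) tuples, each a 3-vector.
    steps: frames per keyframe segment.  The final keyframe itself
    is not emitted; append it if needed.  A real tool would also
    renormalize vd and ease in/out of each segment.
    """
    def lerp(a, b, t):
        return tuple(a[i] + t * (b[i] - a[i]) for i in range(3))

    views = []
    for (p0, d0), (p1, d1) in zip(keyframes, keyframes[1:]):
        for s in range(steps):
            t = s / steps
            vp = lerp(p0, p1, t)
            vd = lerp(d0, d1, t)
            views.append("-vtv -vp %g %g %g -vd %g %g %g" % (vp + vd))
    return views


# Two keyframes at eye height, four frames along the segment:
keys = [((0.0, 0.0, 1.5), (1.0, 0.0, 0.0)),
        ((4.0, 0.0, 1.5), (0.0, 1.0, 0.0))]
for v in interp_views(keys, 4):
    print(v)
```

Each emitted line is a view specification that can be written to a .vf file or pasted into a rif/ranimate view list.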

5.) Using an ambient file does not seem to be doing much for my animation times. I was expecting a big decrease in calc time the further I got into my animation, but at about 5/6 through it has not sped up much. It has been building the ambient file the whole time, although it is only 7MB after roughly 1000 frames, which seems too small (10 panoramic views I've been running on a different machine already have an ambient file of 7MB). Here is what one of my ranfiles looks like:

I have made an animation with -ab 3 and saved a lot of time by reusing ambient data. However, you will need an overture calculation, as was mentioned before. It depends on the model and the importance of the ambient data; if you need only -ab 1, you can't save much this way. I am also fairly sure that in my case, using illums instead of the open windows would have allowed faster settings in the ambient calculation, but I haven't tried that so far.

7.) As can be seen, I have many curved surfaces in this model, most of which have been faceted. I would like to use a function to smooth the shading on these. Is there any way to do this given that the geometry is already built? Something like the Phong shading -s option in gensurf, or maybe the new mesh primitive? I apologize for my ignorance on this topic; I rarely require this much detail. I'm imagining a modifier or something I could give to the polygons I want to smooth over.

I remember that the 3ds-translator did this fine.

CU Lars.

···

--
Lars O. Grobe
[email protected]