running radiance on multiple cores

Dear List,

I am running into some issues while running Radiance on multiple cores.

I have Radiance running on a Linux virtual machine and I am using the rad -N option.

Sometimes though, I happen to notice in the final image tiles of varying
brightness or even misplaced tiles.

I thought it had to do with poor synchronization when reading and writing over the network, but it also seems to happen when I have all the files on my local drive.

Has anyone run into a similar problem and/or knows a solution for it?

Thanks in advance,

Giovanni

Some people have complained about misplaced or incomplete tiles using rpiece, but I haven't been able to reproduce or find the problem.

As for varying tile brightness, this is usually caused by forgetting to specify an ambient file (rad AMBFILE variable), causing redundant and wasteful as well as inconsistent calculations.
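
For reference, a minimal sketch of the relevant lines in a rad input (rif) file; the file names are placeholders rather than anything from an actual project:

  OCTREE= scene.oct
  AMBFILE= scene.amb

rad then passes the matching -af option to each rendering process, so they all share the same ambient cache.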

-Greg


Seen this before, many times.
It seems to be related to the machine being under pressure and not able to cope with the synchronization.
So my suggestion is not to run too many things at the same time, and to moderate the machine's workload.
Alternatively, you could look into splitting the rpict job with vwrays: you should be able to split the image into tiles fairly easily, render them, and assemble them at the end.
This would avoid the synchronization issue, since you would be doing the final assembly manually.

One trick I have been using (unless you have -aa 0) is to render the same thing, with a shared ambient file, in many processes at the same time.
It seems (and practice has borne this out) to help the rendering finish faster.
Nothing difficult: you just launch the same rendering on each core, share the ambient file, keep one image, and discard the others.
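
A minimal csh sketch of that trick, assuming an octree, a view file, and four cores (all names and numbers here are illustrative, not Giulio's actual setup):

  #!/bin/csh -f
  # launch the same rendering on every core, all sharing one ambient file
  set oct=scene.oct
  set amb=shared.amb
  set n=4
  set i=1
  while ($i <= $n)
      rpict -vf view.vf -af $amb -x 1024 -y 1024 $oct > out_$i.hdr &
      @ i++
  end
  wait
  # keep one image and discard the rest
  mv out_1.hdr final.hdr
  rm out_[2-9].hdr
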
Simple and effective.
:-)

G


I have had this problem (tiles getting written to the wrong x-y image location) as well, and tried to track down the issue to little avail. I think the easiest thing to do is to render separate tiles and assemble them as a post process. This is probably also preferable, at least from the standpoint of file locking over a network. There are probably a variety of ways to do this.
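
One possibility, sketched here with made-up tile names and a 2x2 layout of 512-pixel tiles, is to place the finished pieces with pcompos, giving each its lower-left pixel position in the assembled picture:

  pcompos tile_ll.hdr 0 0 tile_lr.hdr 512 0 tile_ul.hdr 0 512 tile_ur.hdr 512 512 > assembled.hdr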

I have not tried Giulio's trick of populating the ambient cache with multiple simultaneous processes; I would like to give it a try if I get a chance. Typically I have run one process to populate the ambient cache before distributing the tiles.
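
For what it is worth, that overture step might look something like this in csh (the file names and the low resolution are assumptions):

  # quick low-resolution pass whose only purpose is to seed the shared ambient file
  rpict -vf view.vf -af shared.amb -x 128 -y 128 scene.oct > /dev/null
  # then distribute the tiles, pointing every process at the same -af file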

-Jack


On 2012-05-23 16:15:36 +0000, Greg Ward said:

> Some people have complained about misplaced or incomplete tiles using rpiece, but I haven't been able to reproduce or find the problem.

The problem seems--I say cautiously--to occur on Linux systems and not BSD systems. I think I'm going to run a test.

Randolph


Unfortunately, I suspect you are correct...

-Jack


Another workaround is to use vwrays and rtrace instead of rpiece, like so:

  set vargs=`rad -s -n -V -v $vw $rif OPTF=render.opt`
  vwrays -ff -x $xr -y $yr $vargs | rtrace -n $nprocs -ffc `vwrays -d -x $xr -y $yr $vargs` @render.opt $octree > render.hdr

The first command sets the view and puts the rendering options in a file for rtrace. One side effect is that every pixel gets rendered, so you will usually want to run pfilt on the output. (You can add it to the command with "| pfilt -1 -x /3 -y /3 -r .6" after rtrace.)
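
In case it helps, here is one way those variables might be set beforehand in csh; the view name and file names are illustrative assumptions, and the resolution assumes a final 1195 x 761 picture produced by the 3x pfilt reduction mentioned above:

  set vw=View10
  set rif=scene.rif
  set octree=scene.oct
  set xr=3585    # 3 x 1195, leaving room for the "| pfilt -1 -x /3 -y /3" reduction
  set yr=2283    # 3 x 761
  set nprocs=8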

-Greg


So much for that hypothesis.

I've just managed to reproduce versions of the problem on a Linux, a FreeBSD, and a Mac OS system. :-(

Same Radiance model and scripts; probably three slightly different versions of Radiance.

Randolph


Hi Randolph,

Actually, having a repeatable test case is great! Maybe you can get this to me off-list and I'll look at it when I have some time.

Thanks!
-Greg


I can verify that missing sections of images happen on both Linux and Mac computers when an ambient file is used, though not consistently.

Often I have suspected/verified that the -N value specified is greater than the actual number of cores available. For example, an i7 MacBook that a student had in my office 10 minutes ago shows only 4 cores, so -N 4 would seem appropriate. Some images would complete while others were left with black patches. All I did was delete the offending HDR image and re-issue the rad command exactly as before to have the image rebuilt, which was much faster since the ambient file had already been populated.

I suspect the -N value should generally be kept at least 1 below the maximum number of cores available.
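
A hypothetical way to follow that advice from the shell (hw.ncpu is the Mac OS/BSD spelling; on Linux, nproc reports the same count), with scene.rif standing in for the actual project file:

  set ncores=`sysctl -n hw.ncpu`
  @ nrender = $ncores - 1
  rad -N $nrender scene.rif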

Questions for the development list:
1) Can multi-core processing be added to trad (-n for rvu and -N for rad)? While it is possible to add a -n 4 in the trad options line for rvu, you cannot use -N in the options line for rad.
2) Can the image resolution for single-core processing be made the same as for multi-core, to allow further processing with falsecolor? When rad -N processing uses rpiece, the image size can be modified/rationalised. This does not happen in single-core processing, so falsecolor fails when mixing images because of the size mismatch.

Terrance Mc Minn
Lecturer
School of Built Environment
Curtin University of Technology


Yes to #1, no to #2. This was discussed before, and I added a warning from rpiece when the resolution changes, but I can't keep it from changing. Rpiece (rpict really) needs the tiles to be of equal size, which means both dimensions must be multiples of the respective number of tile divisions.

You can work around the problem with some difficulty by noting what rpiece changes the resolution to and making sure you set -x and -y to this with -pa 0 on any other runs you do. Unfortunately, the actual resolution will change with different -N settings because rad will make a different number of tiles to optimize processor loads.
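
As a hypothetical example of that workaround, if rpiece warns that it changed the picture size to, say, 1188 x 756, the matching single-process run would force the same size with aspect-ratio correction disabled:

  rpict -vf view.vf -x 1188 -y 756 -pa 0 scene.oct > single.hdr

(The 1188 x 756 figure and the file names are made up; use whatever rpiece actually reports.)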

Regarding #1, I haven't changed trad in ages, so it may take a while to remember enough Tcl/Tk to add this.

Best,
-Greg


Thanks everybody for the many replies.

Contrary to what Greg suggests, the error (varying brightness) happens even with an ambient file specified.

Trying different -av values, a triplet of zeros seems to give the best results.

I haven't yet tried the vwrays and rtrace combination Greg suggested; I will let you know how it goes.

Thanks,

Giovanni


You need to tell us more about your scene and rendering parameters. Sometimes, the direct calculation can cause variances. Try setting -dt 0.

-Greg


Hi Greg,

Thanks for the tip, and sorry for the late reply.

Setting -dt to 0 actually sorted out the issue (although, as expected, rendering time went up a lot!)

The scene I am rendering is an interior scene with a lot of artificial lights, no sun or sky (yet).

The rif file looks something like:

scene= Materials.rad Geometry.rad

AMBFILE= test.amb
OCTREE= test.oct

RESOLUTION= 1195 761
DETAIL= MEDIUM
VARIABILITY= MEDIUM
QUALITY= LOW

INDIRECT= 2
REPORT= 2

render= -ad 2048 -as 1024 -dt 0.05 -dc 0.15 -ds 0.02 -dj 0 -av 0 0 0

view= View10 -vtv -vp -220.898262 490.000427 9.770000 -vd -0.310859 2636.535561 0.000000 -vu 0 0 1 -vh 71 -vv 51 -vs 0 -vl 0 -vo 0 -va 0

Best,

Giovanni


OK, at least we found the issue with the tile variances. A less expensive parameter change that might work as well is "-dt .03 -dc 1". This will turn off the effect of statistics gathered by the individual rpict processes on the results.
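
For Giovanni's rif file, that would mean changing only those two direct parameters on the render line, something like:

  render= -ad 2048 -as 1024 -dt .03 -dc 1 -ds 0.02 -dj 0 -av 0 0 0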

-Greg


Quick follow-up to this... I did add a slider to trad to control the number of processes. It's checked into the latest HEAD.

Regarding falsecolor and image size, I recommend using pfilt to correct the image size on the way in.
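
For example, something along these lines should bring the single-core picture down to the size rpiece produced before handing both to falsecolor (the 1188 x 756 target and the file names are stand-ins for the actual rpiece output):

  pfilt -1 -x 1188 -y 756 -r .6 single_core.hdr > single_core_resized.hdr
  falsecolor -i single_core_resized.hdr > single_core_fc.hdr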

Cheers,
-Greg


Works a treat - thank you Greg.

Terrance Mc Minn
