High-performance graphics cards

I just saw this announcement about a new Nvidia graphics card.

http://www.gwn.com/news/story.php/id/10042/

At $18,000 it is a bit out of my price range, but it raises the question: would it be possible to harness the horsepower of such a card to render Radiance scenes in real time?

Chris Jessee

Hi, I am far from being an expert, but at least I have a history of asking such questions on the list (remember my Cell-processor thread?) :wink:

So the little I understood is as follows:

1) Radiance needs double(!) precision, while most of these 3D accelerators mainly deliver single-precision floating-point and integer performance (see the first sketch below the list).

2) It is hard to run the calculation as implemented in Radiance on many processing units in parallel. If the card had ONE processor that powerful, one could try to push the code onto that core, but most cards are based on multi-CPU or multi-core designs, and then it becomes hard to use them. The problem is that tracing one pixel results in a "tree" of secondary rays with an unknown number of leaves, so calculating even 8 pixels at once leads to an unknown number of parallel calculations (see the second sketch below).
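To make 1) concrete, here is a little stand-alone C toy (nothing from the Radiance source; the scene numbers are invented purely for illustration) that intersects a long, shallow ray with the plane z = 0 once in float and once in double:

    /* precision.c -- toy ray/plane intersection in float vs. double.
     * Not Radiance code; the numbers are made up for illustration.
     * Build: cc -o precision precision.c -lm
     */
    #include <stdio.h>
    #include <math.h>

    /* shoot a long, shallow ray at the plane z = 0 and return the z
     * coordinate of the computed hit point (exact answer: 0)        */
    static float hit_z_float(void)
    {
        float org[3] = {0.0f, 0.0f, 1000.0f};   /* 1 km above the plane */
        float dir[3] = {1.0f, 0.0f, -0.001f};   /* very shallow angle   */
        float len = sqrtf(dir[0]*dir[0] + dir[1]*dir[1] + dir[2]*dir[2]);
        float t;

        dir[0] /= len; dir[1] /= len; dir[2] /= len;
        t = -org[2] / dir[2];                   /* parametric hit distance */
        return org[2] + t*dir[2];
    }

    static double hit_z_double(void)            /* same computation in double */
    {
        double org[3] = {0.0, 0.0, 1000.0};
        double dir[3] = {1.0, 0.0, -0.001};
        double len = sqrt(dir[0]*dir[0] + dir[1]*dir[1] + dir[2]*dir[2]);
        double t;

        dir[0] /= len; dir[1] /= len; dir[2] /= len;
        t = -org[2] / dir[2];
        return org[2] + t*dir[2];
    }

    int main(void)
    {
        printf("residual error at the hit point:\n");
        printf("  float : %g\n", fabsf(hit_z_float()));
        printf("  double: %g\n", fabs(hit_z_double()));
        return 0;
    }

On my estimate the float residual comes out on the order of 1e-4 (a tenth of a millimetre if the model is in metres), while the double version stays around 1e-13, so the precision requirement in 1) is not academic.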

For 2), I think one might still try to port a complete rpict to the instruction set of the processing units and do parallel processing as on SMP machines, at least with something like a Cell architecture. But that still leaves 1) unsolved.
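To make 2) concrete as well, here is a second hypothetical toy (again not taken from rpict; the "surfaces" are invented) that only models the control flow: every ray that hits something spawns a data-dependent number of child rays, so a fixed batch of 8 pixels fans out into very different amounts of work:

    /* raytree.c -- toy model of the "tree of rays" branching problem.
     * Hypothetical code, not from the Radiance source; the surface
     * properties are made up purely to show the data-dependent fan-out.
     */
    #include <stdio.h>
    #include <stdlib.h>

    struct surface {
        int specular;     /* spawns a reflected child ray?       */
        int transparent;  /* spawns a transmitted child ray?     */
        int nambient;     /* number of indirect samples to spawn */
    };

    static long nrays;    /* work actually done for one pixel */

    static void trace(const struct surface *scene, int nsurf, int depth)
    {
        const struct surface *s;
        int i;

        nrays++;
        if (depth <= 0)
            return;
        s = &scene[rand() % nsurf];          /* pretend we hit some surface */
        if (s->specular)
            trace(scene, nsurf, depth - 1);  /* reflected ray               */
        if (s->transparent)
            trace(scene, nsurf, depth - 1);  /* transmitted ray             */
        for (i = 0; i < s->nambient; i++)
            trace(scene, nsurf, depth - 1);  /* indirect (ambient) samples  */
    }

    int main(void)
    {
        struct surface scene[] = {
            {0, 0, 0},   /* plain diffuse surface, no children       */
            {1, 0, 0},   /* mirror                                   */
            {1, 1, 0},   /* glass                                    */
            {0, 0, 4},   /* diffuse patch needing 4 indirect samples */
        };
        int px;

        for (px = 0; px < 8; px++) {         /* "calculate 8 pixels at once" */
            nrays = 0;
            trace(scene, 4, 5);
            printf("pixel %d: %ld rays\n", px, nrays);
        }
        return 0;
    }

The 8 "pixels" come back with wildly different ray counts, which is exactly why a fixed array of identical processing units is hard to keep busy with this algorithm, while running one complete rpict per core (the SMP-style approach above) side-steps the problem.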

So I doubt that this card will help you much with Radiance. Of course, it might still be of use with rad -ogl or rshow...

CU Lars.

Sure. Write a C compiler for the card, port Linux to it, then compile Radiance on there. Should be a snap!

-Greg


Not knowing anything about GPUs, I ran your response by a researcher at Nvidia.

From David Luebke, a researcher at Nvidia:

GPUs are a very different architecture from CPUs and porting anything to them is a significant effort, even if the problem fits the GPU model nicely. That said, there is a lot of research activity on using GPUs for real-time global illumination. But nothing out there yet approaches the goal of a straight-up C compiler or a GPU that runs Linux. One could certainly accelerate Radiance with GPUs, but that would be a full-blown research project.

Perhaps not all that practical after all. But it never hurts to ask.

Take care,

Chris Jessee


I was joking!

Nice of David to write such a cogent reply...

-G


Joking? Whoops. Time to call the credit card company and take back that $18,000.

Mark
