Has anyone tried this, or know anything about it? With the 32-bit (and now 64-bit) capabilities of OpenGL 3, it seems like some things might be possible, though I'm not sure how to get a GPU to do a global illumination calculation. Anyone know? Tried? Seen any papers?
···
--
Randolph Fritz • RFritz@lbl.gov
Environmental Energy Technologies Division • Lawrence Berkeley Labs
There are several approaches to getting global illumination on GPUs:
es.wikipedia.org/wiki/OpenCL
es.wikipedia.org/wiki/CUDA
http://www.luxrender.net/wiki/index.php?title=Luxrender_and_OpenCL (Open Source)
http://www.mentalimages.com/products/iray.html (commercial)
http://www.youtube.com/watch?v=eRoSFNRQETg (commercial)
···
Thank you. That makes it clear that something is possible. But what? As of version 3.5, Mental Ray (the basis of iray) had some serious limitations in physically based rendering. In Labayrade and Fontoynont's 2009 CIE 171:2006 testing, it could not, for instance, simulate a clear sky. Iray is based on Mental Ray 3.8, and it is possible that some of those problems have been resolved since version 3.5, but no published testing seems to say so. Mental Images' iray site doesn't mention CIE 171, and the Luxrender site doesn't mention any testing at all. Are you aware of any testing that shows the accuracy of either of these tools in the physical simulation of light?
Greg, have you looked at this at all?
Randolph
Ref: Labayrade, Raphael, and Marc Fontoynont. “Use of CIE 171:2006 Test Cases to Assess the Scope of Lighting Simulation Programs.” CIE Light and Lighting 2009. (Seen in preprint.)
···
On 2010-07-11 23:46:21 -0700, Ignacio Munarriz said:
_______________________________________________
Radiance-general mailing list
Radiance-general@radiance-online.org
http://www.radiance-online.org/mailman/listinfo/radiance-general
I'm not aware of any physically based light-simulation software using the GPU, but it would probably be interesting to externalize Radiance's ray-geometry intersections. I think GPU code (as libraries or open source) that performs those intersections already exists. Radiance instances, meshes, memory management, and so on would be issues to solve; alternatively, if we wait a bit, ray-geometry intersection will become standardized on GPUs, and it will be easier to send rays and geometry from Radiance to the cards.
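To make "externalizing ray-geometry intersections" concrete, here is a minimal, pure-Python sketch of the Möller–Trumbore ray-triangle test run over a batch of rays. All names here are illustrative, not Radiance code; a GPU version would run the per-ray body as one kernel thread per ray, which is why the batched data layout matters.

```python
# Minimal sketch: batched ray-triangle intersection (Moller-Trumbore).
# On a GPU, each ray in the batch would be handled by one thread/work-item;
# here we simply loop, to show the data layout a card-friendly design needs.

def sub(a, b): return (a[0]-b[0], a[1]-b[1], a[2]-b[2])
def cross(a, b): return (a[1]*b[2]-a[2]*b[1], a[2]*b[0]-a[0]*b[2], a[0]*b[1]-a[1]*b[0])
def dot(a, b): return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def intersect(orig, direc, v0, v1, v2, eps=1e-9):
    """Return distance t along the ray, or None if it misses the triangle."""
    e1, e2 = sub(v1, v0), sub(v2, v0)
    p = cross(direc, e2)
    det = dot(e1, p)
    if abs(det) < eps:              # ray parallel to triangle plane
        return None
    inv = 1.0 / det
    s = sub(orig, v0)
    u = dot(s, p) * inv
    if u < 0.0 or u > 1.0:
        return None
    q = cross(s, e1)
    v = dot(direc, q) * inv
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, q) * inv
    return t if t > eps else None

def intersect_batch(rays, tri):
    """CPU stand-in for a GPU kernel: one intersection test per ray."""
    return [intersect(o, d, *tri) for o, d in rays]

# Unit triangle in the z=0 plane, rays shot along -z.
tri = ((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0))
rays = [((0.2, 0.2, 1.0), (0.0, 0.0, -1.0)),   # hits at t = 1.0
        ((0.9, 0.9, 1.0), (0.0, 0.0, -1.0))]   # misses (outside triangle)
print(intersect_batch(rays, tri))               # -> [1.0, None]
```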
···
The basic problem is that you need to have the entire scene description available to every process, and GPUs don't work that way. They are optimized for SIMD (single-instruction, multiple-data) processing, meaning that you run the same operation on many operands simultaneously. This is great if you are multiplying huge matrices together, but not much use in a ray-tracing context.
Even if the GPU were optimized for this problem, much of what goes on in Radiance is ray tree evaluation. It all sort of happens together -- you need to trace rays, figure out what to do at surfaces based on material properties, evaluate .cal files, share ambient data (reading and writing a shared cache) and all sorts of things that would have to be completely rethought for a GPU implementation.
-Greg
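A toy way to see the SIMD-divergence problem described above: put "rays" with different branch outcomes into a lockstep group of lanes (a warp, in CUDA terms) and compare utilization against uniform matrix-style work. This is an illustrative model only; all names and numbers are made up for the sketch.

```python
# Toy model of SIMD divergence: a "warp" of lanes executes in lockstep,
# so the warp serializes through every distinct branch any lane takes.

WARP = 32

def warp_utilization(branches):
    """Fraction of lane-steps doing useful work: each lane does one
    useful step, but all lanes must sit through every distinct branch."""
    distinct = len(set(branches))
    return len(branches) / (distinct * len(branches))

# Coherent work, e.g. a big matrix multiply: every lane runs the same code.
coherent = ["matmul"] * WARP

# Ray tracing: neighboring rays hit different materials and want
# different shading code, so lanes in one warp take different branches.
divergent = ["glass", "plastic", "light", "miss"] * (WARP // 4)

print("coherent utilization: ", warp_utilization(coherent))   # 1.0
print("divergent utilization:", warp_utilization(divergent))  # 0.25
```

With four different material branches in one warp, the lockstep model does useful work only a quarter of the time, which is the flavor of inefficiency that makes incoherent ray trees a poor fit for SIMD hardware.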
···
Thanks, Greg. It sounds like it would take a reimplementation of Radiance in some combination of GLSL and C, and that's a huge amount of work. On the other hand, it does sound possible, which is interesting. One more research project...