I thought I'd let you know about my recently-completed 4-minute 3D-360
video that was rendered with Radiance. The subject is a dynamic
triangle mesh from a computational fluid dynamics simulation of a
sphere of fluid with a density discontinuity. (Lots of grey
cylinders.)
The original (2D rendering) video is here:
https://www.youtube.com/watch?v=OShSC1VyBi0

The new video (3D-360, for GearVR and other VR HMDs) is in multiple places:
https://www.youtube.com/watch?v=zGFMqEKiAGM
https://vimeo.com/173000788

I'll provide a direct download link for all you Radiance fans, who
know how hard this actually was:
http://markjstock.org/vr/MarkStock_SmokeWaterFire_UHDlo_360_TB.mp4
The rendering took 8 months on a 4.4GHz, 8-core Intel Haswell chip.
Each original frame was 11520x11520 (in HDR, of course), but reduced
to 3840x3840 for storage, and further reduced to 3840x2160 for
publishing. In all, almost 1 Terapixel of primary rays were traced.
The directory takes up ~600 GB. I used an rcalc/rtrace trick to
generate the 3D-360 frames. The command line for each frame looked
like:

cnt 11520 11520 | rcalc -f 3d360.cal -e
"XD=11520;YD=11520;X=0;Y=0;Z=0;IPD=0.02;EX=0;EZ=0" | rtrace -ab 2 -aa
0 -ad 32 -as 0 -dj 0.7 -ds 0.06 -u+ -dv- -x 11520 -y 11520 -n 3 -fac
scene0695.oct > img_0695temp.pic

I attached the 3d360.cal file in case anyone else is interested.

Have fun, and good night!

Mark
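The post doesn't say which tool handled the 11520x11520 to 3840x3840 reduction
for storage; one plausible way to do that step, using Radiance's pfilt with
illustrative file names, would be:

# reduce each rendered frame by a factor of 3 in each dimension, with filtering
pfilt -x /3 -y /3 img_0695temp.pic > img_0695.pic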
Mark! That is awesome!

I've been playing around with 360 stereo renderings in Radiance too. I use
Google Cardboard and created an rcalc command for generating view rays
based on this: https://developers.google.com/vr/jump/rendering-ods-content.pdf
I was planning to present some of my renderings at the workshop in August.
But your video puts my renderings to shame!
Earlier this week I was looking at image.c to see what it would take to add
native support for equirectangular view types, including both mono and
over-under stereo. I think the awkward part would be specifying the pupil
spacing for stereo renderings. I think we could use the length of the
direction vector for pupil spacing, but this would prevent adding depth of
field blur.
Andy
The mesh is the set of computational elements for the fluid
simulation, and, yes, a lot of the effort went into creating a dynamic
mesh refinement and re-coalescence algorithm.
Make sure to download the mp4 when you get your hands on a
GearVR---the quality through YouTube leaves a lot to be desired, though
it does give some of the feeling of being enveloped by this writhing,
grey mesh.
This is really impressive, Mark! I'll have to try it out next time I have
access to a GearVR. Did you create your own dynamic meshing algorithm?
-Greg
I didn't see that document before I took on this project, but I think
my method was pretty similar. If it wasn't, I'm not about to re-render
this one. It'd be great if you shared your method to the list!
Note that the Samsung Internet movie player (for playing YouTube
movies on GearVR) expects your movie data to fill the 16:9 frame,
otherwise it won't map correctly. The Oculus player (for movies
downloaded to your phone+GearVR) is a little smarter, though, and will
play both 3840x1920 (2:1) and 3840x2160 (16:9) just fine.

Mark
I'm happy to share, though I think your cal file is more useful to others,
especially since I didn't comment my script. The main difference between
what I did and the pseudocode in the PDF I linked was that I added
pixel jittering by adding a random number from 0 to 1 to the pixel position.
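As a rough illustration of that jitter (Andy's script itself wasn't posted),
the pixel coordinates at the top of a cal file such as 3d360.cal could be
perturbed with the rand() function from the standard Radiance calc library:

{ illustrative sketch only: add a random offset in [0,1) to each pixel coordinate }
px = $2 + rand(recno*5.316);
py = YD - $1 + rand(recno*-7.249);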
As for the viewer, I used Mobile VR Station (from the App Store) on my
iPhone with a Google Cardboard viewer, and your video worked perfectly. The
Mobile VR Station app itself is a pain in the rear, but I haven't found
anything better for downloading and storing VR images or videos on the
iPhone.
Since it seems that the attachment was stripped from my original
message, here are the contents of the 3d360.cal file that I used:
{
3d360.cal
Definitions for full 360 over-under stereo equirectangular projection
(c)2014 Mark J. Stock
Use it like this:
X=2048; Y=2048; cnt $Y $X | rcalc -f 3d360.cal -e
"XD=$X;YD=$Y;X=0;Y=0;Z=-0.1;IPD=0.06;EX=0;EZ=0" | rtrace [rpict
options] -x $X -y $Y -fac scene.oct > out.hdr
Parameters defined externally:
X : neck rotation origin x
Y : neck rotation origin y
Z : neck rotation origin z
XD : horizontal picture dimension ( pixels )
YD : vertical picture dimension ( pixels )
IPD : inter-pupillary distance
this is between 0.055m and 0.07m on most humans
These don't seem to work all that well:
EX : forward distance between neck rotation center and bridge of
nose (between eyes)
this is between 0.05m and 0.07m on most humans
EZ : vertical distance between neck rotation center and eye
elevation when altitude is 0 degrees
this is around 0.1m on most humans
}
{ Direction of the current pixel (both angles in radians) }
px = $2;
py = YD - $1;
frac(x) : x - floor(x);
altitude = (frac((py-0.5)/(YD/2)) - 0.5) * PI;
{ to do over-under stereo, azimuth is easy }
azimut = px * 2 * PI / XD;
{ Transformation into a direction vector }
xdir = cos(azimut) * cos(altitude);
ydir = sin(azimut) * cos(altitude);
zdir = sin(altitude);
{ Transform the viewpoint to account for the eye position }
dx = EX;
dy = if($1 - YD/2, 0.5*IPD, -0.5*IPD);
dz = EZ;
xpos = X + xdir*dx - sin(azimut)*dy + cos(azimut)*zdir*dz;
ypos = Y + ydir*dx + cos(azimut)*dy + sin(azimut)*zdir*dz;
zpos = Z - zdir*dx + 0*dy + cos(altitude) *dz;
{ Output line to rtrace; each ray needs: xorg yorg zorg xdir ydir zdir }
$1 = xpos; $2 = ypos; $3 = zpos;
$4 = xdir; $5 = ydir; $6 = zdir;
{ EOF }
Note that the above will generate a 1:1 ratio final image, with left
on the top half and right on the bottom. To knock that down to 16:9
for YouTube, I used the following mencoder command:
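The mencoder command itself wasn't included above. As a rough, illustrative
equivalent only (not the original command), an ffmpeg invocation that squashes
an already-encoded 3840x3840 over-under movie into a 3840x2160 (16:9) frame
could look like:

# illustrative only: anamorphically rescale the 1:1 over-under video to fill 16:9
ffmpeg -i video_1x1.mp4 -vf scale=3840:2160,setsar=1 -c:v libx264 -crf 18 -pix_fmt yuv420p video_16x9.mp4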
This is absolutely fascinating. Thank you very much for sharing the video and also your procedure.
I just tested it with a current project (no video, just a rendering) and it is amazing.
One thing that came up: my rendering seems to be mirrored about the vertical axis, i.e. left and right are switched. pflip does the job, no problem, but I am just curious whether maybe I did something wrong.
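For reference, the per-frame horizontal flip mentioned above can be done with
Radiance's pflip (file names illustrative):

# flip the picture left-to-right
pflip -h rendering.hdr > rendering_flipped.hdr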