3D-360 video rendered with Radiance

Folks,

I thought I'd let you know about my recently-completed 4-minute 3D-360
video that was rendered with Radiance. The subject is a dynamic
triangle mesh from a computational fluid dynamics simulation of a
sphere of fluid with a density discontinuity. (Lots of grey
cylinders.)

The original (2D rendering) video is here:
https://www.youtube.com/watch?v=OShSC1VyBi0

The new video (3D-360, for GearVR and other VR HMDs) is in multiple places:
https://www.youtube.com/watch?v=zGFMqEKiAGM
https://vimeo.com/173000788
https://www.facebook.com/mark.stock/posts/10153805081796376

I'll provide a direct download link for all you Radiance fans, who
know how hard this actually was:
http://markjstock.org/vr/MarkStock_SmokeWaterFire_UHDlo_360_TB.mp4

The rendering took 8 months on a 4.4GHz, 8-core Intel Haswell chip.
Each original frame was 11520x11520 (in HDR, of course), but reduced
to 3840x3840 for storage, and further reduced to 3840x2160 for
publishing. In all, almost 1 Terapixel of primary rays were traced.
The directory takes up ~600 GB. I used an rcalc/rtrace trick to
generate the 3D-360 frames. The command line for each frame looked
like:

cnt 11520 11520 \
  | rcalc -f 3d360.cal -e "XD=11520;YD=11520;X=0;Y=0;Z=0;IPD=0.02;EX=0;EZ=0" \
  | rtrace -ab 2 -aa 0 -ad 32 -as 0 -dj 0.7 -ds 0.06 -u+ -dv- \
      -x 11520 -y 11520 -n 3 -fac scene0695.oct > img_0695temp.pic
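
(For scale: at the video's 30 fps, 4 minutes is roughly 7200 frames, and
7200 x 11520 x 11520 is about 0.96 x 10^12 primary rays, hence the
near-terapixel figure.)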

I attached the 3d360.cal file in case anyone else is interested.

Have fun, and good night!

Mark

3d360.cal (1.69 KB)

This is really impressive, Mark! I'll have to try it out next time I have access to a GearVR. Did you create your own dynamic meshing algorithm?

-Greg

···

Mark! That is awesome!

I've been playing around with 360 stereo renderings in Radiance too. I use
Google Cardboard, and I created an rcalc command for generating view rays
based on this:
https://developers.google.com/vr/jump/rendering-ods-content.pdf

I was planning to present some of my renderings at the workshop in August.
But your video puts my renderings to shame!

Earlier this week I was looking at image.c to see what it would take to add
native support for equirectangular view types, including both mono and
over-under stereo. I think the awkward part would be specifying the pupil
spacing for stereo renderings. We could use the length of the direction
vector for that, but it would prevent adding depth-of-field blur later.
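
For what it's worth, the core of that mapping is small enough to inline.
Here's a minimal, self-contained sketch, untested and distilled from the
ODS math in that doc (RES, IPD, scene.oct, and stereo_ou.hdr are all
placeholder names). The +/-IPD/2 origin offset on the second-to-last
expression is exactly the extra pupil-spacing parameter a native view
type would need to carry:

# hedged sketch: over-under stereo equirectangular rays, one per pixel;
# top half of the frame is one eye, bottom half the other
RES=512; IPD=0.063
cnt $RES $RES | rcalc \
  -e "XD=$RES;YD=$RES;IPD=$IPD" \
  -e 'az=2*PI*$2/XD' \
  -e 'frac(x):x-floor(x);alt=(frac((YD-$1-0.5)/(YD/2))-0.5)*PI' \
  -e 'off=if($1-YD/2,IPD/2,-IPD/2)' \
  -e '$1=-sin(az)*off;$2=cos(az)*off;$3=0' \
  -e '$4=cos(az)*cos(alt);$5=sin(az)*cos(alt);$6=sin(alt)' \
  | rtrace -ab 2 -x $RES -y $RES -fac scene.oct > stereo_ou.hdr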

Andy

···

Greg,

The mesh is the set of computational elements for the fluid
simulation, and, yes, a lot of the effort went into creating a dynamic
mesh refinement and re-coalescence algorithm.

Make sure to download the mp4 when you get your hands on a
GearVR---the quality through YouTube leaves a lot to be desired, though
it does give some of the feeling of being enveloped by this writhing,
grey mesh.

Mark

···

Andy,

I didn't see that document before I took on this project, but I think
my method was pretty similar. If it wasn't, I'm not about to re-render
this one. It'd be great if you shared your method to the list!

Note that the Samsung Internet movie player (for playing YouTube
movies on GearVR) expects your movie data to fill the 16:9 frame,
otherwise it won't map correctly. The Oculus player (for movies
downloaded to your phone+GearVR) is a little smarter, though, and will
play either 3840x1920 (2:1) or 3840x2160 (16:9) just fine.

Mark

···

I'm happy to share, though I think your cal file is more useful to others,
especially since I didn't comment my script. The main difference between
what I did and the pseudocode in the linked PDF is that I added pixel
jittering, adding a random number from 0 to 1 to each pixel position.

As for the viewer, I used Mobile VR Station (
https://itunes.apple.com/us/app/mobile-vr-station/id959820493?mt=8) on my
iPhone with a Google Cardboard viewer, and your video worked perfectly. The
Mobile VR Station app itself is a pain in the rear, but I haven't found
anything better for downloading and storing VR images or videos on the
iPhone.

Here's the script I use:

#! /bin/bash

## Viewpoint coordinates
vpx=4200
vpy=-1520
vpz=163

## Image resolution
res=16384

## Octree and ambient files
oct=octs/octree.oct
amb=octs/ambient.amb

## Image filename
out=image.hdr

## Distance between pupils (pick the one matching your model units)
# ipd=2.48031    # inches
# ipd=0.206693   # feet
ipd=6.30         # cm
# ipd=0.063      # meters

halfres=$((res/2))

## Jitter the pixel position, map it to azimuth (theta) and altitude (phi)
## for an over-under equirectangular frame, offset each eye by half the
## IPD, and emit one origin+direction ray per pixel.
cnt $res $res | \
rcalc -of \
  -e 'px=$2+rand(($2+1)*($1+3)); py=$1+rand(($2+1)*($1+3)*($1+2))' \
  -e 'theta=px/'"${res}"'*2*PI-PI/2' \
  -e 'phi=if(py-'"${halfres}"',PI/2-(py-'"${halfres}"')/'"${halfres}"'*PI,PI/2-py/'"${halfres}"'*PI)' \
  -e "IPD=${ipd}" \
  -e 'scale=if($1-'"${halfres}"'+1,IPD/2,-IPD/2)' \
  -e '$1=scale*sin(theta)+'"${vpx}"';$2=scale*cos(theta)+'"${vpy}"';$3='"${vpz}"';$4=sin(theta-PI/2)*cos(phi);$5=cos(theta-PI/2)*cos(phi);$6=sin(phi)' | \
rtrace -ffc -n 40 -ab 5 -ad 8000 -as 2000 -lw 5e-4 -af $amb -dj 1 -st 0 -ss 50 \
  -x $res -y $res -ld- $oct | \
ra_rgbe -r - > ${out}

···

Since it seems that the attachment was stripped from my original
message, here are the contents of the 3d360.cal file that I used:

{
  3d360.cal

  Definitions for full 360 over-under stereo equirectangular projection

  (c)2014 Mark J. Stock

  Use it like this:
  X=2048; Y=2048; cnt $Y $X \
    | rcalc -f 3d360.cal -e "XD=$X;YD=$Y;X=0;Y=0;Z=-0.1;IPD=0.06;EX=0;EZ=0" \
    | rtrace [rpict options] -x $X -y $Y -fac scene.oct > out.hdr

  Parameters defined externally:
  X : neck rotation origin x
  Y : neck rotation origin y
  Z : neck rotation origin z
  XD : horizontal picture dimension ( pixels )
  YD : vertical picture dimension ( pixels )
  IPD : inter-pupillary distance
       this is between 0.055m and 0.07m on most humans
  These don't seem to work all that well:
  EX : forward distance between neck rotation center and bridge of nose
       (between eyes); this is between 0.05m and 0.07m on most humans
  EZ : vertical distance between neck rotation center and eye elevation
       when altitude is 0 degrees; this is around 0.1m on most humans
}

{ Direction of the current pixel (both angles in radians) }
px = $2;
py = YD - $1;
frac(x) : x - floor(x);
altitude = (frac((py-0.5)/(YD/2)) - 0.5) * PI;
{ to do over-under stereo, azimuth is easy }
azimut = px * 2 * PI / XD;

{ Transformation into a direction vector }
xdir = cos(azimut) * cos(altitude);
ydir = sin(azimut) * cos(altitude);
zdir = sin(altitude);

{ Transform the viewpoint to account for the eye position }
dx = EX;
dy = if($1 - YD/2, 0.5*IPD, -0.5*IPD);
dz = EZ;
xpos = X + xdir*dx - sin(azimut)*dy + cos(azimut)*zdir*dz;
ypos = Y + ydir*dx + cos(azimut)*dy + sin(azimut)*zdir*dz;
zpos = Z - zdir*dx + 0*dy + cos(altitude)*dz;

{ Output line to rtrace; each ray needs: xorg yorg zorg xdir ydir zdir }
$1 = xpos; $2 = ypos; $3 = zpos;
$4 = xdir; $5 = ydir; $6 = zdir;

{ EOF }

Note that the above will generate a final image with a 1:1 aspect ratio,
with the left eye on the top half and the right on the bottom. To knock
that down to 16:9 for YouTube, I used the following mencoder command:

mencoder "mf://@allframes.txt" -mf w=3840:h=3840:type=png:fps=30 \
  -o MarkStock_SmokeWaterFire_UHD_360_TB.mp4 \
  -sws 9 -of lavf -lavfopts format=mp4 -nosub \
  -vf softskip,dsize=16/9,scale=3840:2160,harddup \
  -nosound -ovc x264 -x264encopts \
  crf=24:nointerlaced:force_cfr:frameref=3:mixed_refs:bframes=1:b_adapt=2:weightp=1:direct_pred=auto:aq_mode=1:me=umh:me_range=16:subq=6:mbtree:psy_rd=0.8,0.2:chroma_me:trellis=1:nocabac:deblock:partitions=p8x8,b8x8,i8x8,i4x4:nofast_pskip:nodct_decimate:threads=auto:ssim:psnr:keyint=300:keyint_min=30:level_idc=30:global_header
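
For anyone without mencoder, a roughly equivalent ffmpeg command (a
simplified sketch: frame%04d.png stands in for your actual frame names,
and the x264 tuning above is collapsed to a plain CRF) would be:

ffmpeg -framerate 30 -i frame%04d.png \
  -vf scale=3840:2160,setsar=1 \
  -c:v libx264 -crf 24 -pix_fmt yuv420p -an \
  MarkStock_SmokeWaterFire_UHD_360_TB.mp4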

Also, two pieces of advice for makers of 360 or 3D-360 videos, from
Oculus and YouTube respectively:

https://support.oculus.com/help/oculus/1044498395609952/?ref=hc_fnav
https://support.google.com/youtube/answer/6178631?hl=en

Mark

···

Hi Mark,

this is absolutely fascinating. Thank you very much for sharing the video and also your procedure.

I just tested it with a current project (no video, just a rendering) and it is amazing.

One thing that came up: my rendering seems to be mirrored about the vertical axis, meaning left and right are switched. pflip does the job, no problem, but I am just curious whether I did something wrong.
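
(For reference, the flip is just something like "pflip -h Pano1_ol.hdr >
Pano1_ol_flipped.hdr", using the filenames from my command below.)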

This is my command:

cnt 3000 3000 \
  | rcalc -f 3d360.cal -e "XD=3000;YD=3000;X=62.854;Y=29.687;Z=7.1;IPD=0.02;EX=0;EZ=0" \
  | rtrace -ab 3 -x 3000 -y 3000 -n 12 -af pano_ambs_ol -fac T1_ol.oct > Pano1_ol.hdr

X, Y, and Z are the position of the head, and I did not change anything in 3d360.cal.

On the iPhone I use Mobile VR Station, where I use the 3D Spherical Top/Bottom playback type.

Many thanks and best regards

Martin Klingler