Hi Thomas & Greg,
Many thanks for your suggestions here!
Thomas, the texture-based approach is something that had crossed my mind as well.
The other thing I was wondering about is whether the colordata
primitive might be relevant here?
For the texture-based approach, ra_xyz is probably my friend here, though
I'm wondering more about how I could generate the proper set of uv
coordinates to get the right mapping.
I'm not using textures in the usual sense; the per-vertex coloring is
closer to scientific data visualization.
Each vertex has a scalar value assigned to it, which is mapped to a
color by a lookup table, and those lookup-table colors are what I'm
trying to get onto the geometry.
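That scalar-to-colour step might look roughly like the following sketch (the two-entry blue-to-red LUT, the value range, and all names are illustrative assumptions, not the actual lookup table used here):

```python
# Map per-vertex scalar values to RGB colours via a lookup table.
# The LUT entries and value range below are illustrative only.

def scalar_to_rgb(value, vmin, vmax, lut):
    """Normalise value into [0, 1] and linearly interpolate in the LUT."""
    t = (value - vmin) / (vmax - vmin)
    t = min(max(t, 0.0), 1.0)              # clamp out-of-range values
    pos = t * (len(lut) - 1)
    i = int(pos)
    if i >= len(lut) - 1:                  # exactly at the top entry
        return lut[-1]
    f = pos - i                            # blend between neighbouring entries
    a, b = lut[i], lut[i + 1]
    return tuple(a[k] + f * (b[k] - a[k]) for k in range(3))

# A tiny two-entry LUT: blue at the low end, red at the high end.
LUT = [(0.0, 0.0, 1.0), (1.0, 0.0, 0.0)]

vertex_values = [0.0, 2.5, 5.0]            # one scalar per mesh vertex
colors = [scalar_to_rgb(v, 0.0, 5.0, LUT) for v in vertex_values]
```

A real LUT would have many more entries, but the interpolation logic stays the same.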
Many thanks for your suggestions!
Nice to be right at least once, but I was thinking of your mail
about 'extraction of luminance values' from 17/08/2007, especially
the part with the coloured 3D plot at the end.
Reading it again, it might not be that simple to apply to this
problem, though. My idea was to read the colour value and save
it at a known picture position, then assign this position as the
UV coordinate of the mesh vertex. The picture, used as a texture,
should give the mesh the desired colouring.
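That baking idea could be sketched like this (pure Python, names invented for illustration): give each vertex its own pixel in a small square image, store its colour there, and use the pixel centre as the vertex's UV coordinate.

```python
import math

def bake_vertex_colors(colors):
    """Pack one colour per vertex into an n x n image.

    Returns (image, uvs): image[row][col] is an RGB tuple, and uvs[i]
    is the (u, v) pixel-centre coordinate assigned to vertex i.
    """
    n = math.ceil(math.sqrt(len(colors)))          # smallest square that fits
    image = [[(0.0, 0.0, 0.0)] * n for _ in range(n)]
    uvs = []
    for i, rgb in enumerate(colors):
        row, col = divmod(i, n)
        image[row][col] = rgb
        # Pixel centre in [0, 1] texture space; v runs bottom-up as in OBJ.
        u = (col + 0.5) / n
        v = 1.0 - (row + 0.5) / n
        uvs.append((u, v))
    return image, uvs

colors = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]         # three example vertices
image, uvs = bake_vertex_colors(colors)
```

One caveat with this layout: if the renderer interpolates between texels, colours from unrelated neighbouring pixels can bleed in, so nearest-neighbour lookup (or spacing the pixels out) may be needed.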
What I probably had in mind was something Blender users can do:
create a mesh (for example, a sphere), 'paint' a mouth, nose, eyes and
hair onto the sphere in 3D, and 'unwrap' it to a texture image.
Then you can use Photoshop etc. to refine that sketchy image into
a proper face texture.
Applied here it would be:
1) get the mesh into Blender (...)
2) unwrap and save texture of vertex colours
3) export mesh to *.obj with uv-texture
4) convert texture image to *.pic
5) use obj2mesh to get shape and texture into Radiance
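On the Radiance side, the end of that pipeline might look roughly like the sketch below. This is untested; the file names are made up, ra_ppm's reverse mode (-r) is one way to get from an intermediate PPM to *.pic, and whether colorpict's frac(Lu)/frac(Lv) lookup lines up with the uv layout Blender writes would need checking:

```
# 4) + 5): convert the texture and compile the mesh
ra_ppm -r texture.ppm texture.pic
obj2mesh -a materials.rad mesh.obj mesh.rtm

# scene description: colorpict pattern modifying the mesh material
void colorpict vert_colors
7 red green blue texture.pic . frac(Lu) frac(Lv)
0
0

vert_colors plastic vis_mat
0
0
5 1 1 1 0 0

vis_mat mesh data_mesh
1 mesh.rtm
0
0
```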