# Radiance pic - float values to normal RGB?

hi,

I want to write a simple HDRI (Radiance .pic) viewer.
I have decoded an HDR image, and as a result I have floats from the 32-bit
float triplets.
How can I convert these values into a reasonable BMP?
Is there a formula for this?
Right now I do the following:

for example:

r_float_value = 2.00;
g_float_value = 3.00;
b_float_value = 5.00;

r = r_float_value * factor;
if (r > 255) r = 255;
g = g_float_value * factor;
if (g > 255) g = 255;
b = b_float_value * factor;
if (b > 255) b = 255;

This displays the HDR image. `factor` is a user-defined value that can be
changed at runtime; as it changes, the picture reveals different details.
This works, but it feels like a hack, and I don't know how to do the
conversion properly (with gamma correction and so on).
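A common approach for display is to apply the user exposure factor first, then a gamma encode, and only then quantize to 8 bits. Here is a minimal sketch in C++ of that idea; the helper name `to_display` and the gamma value of 2.2 are my own assumptions for illustration, not part of Radiance or any particular API:

```cpp
#include <algorithm>
#include <cmath>
#include <cstdint>

// Map one linear HDR channel value to an 8-bit display value.
// 'factor' is the user-controlled exposure scale; 'gamma' defaults
// to 2.2, a common display gamma (an assumption, adjust as needed).
std::uint8_t to_display(float channel, float factor, float gamma = 2.2f)
{
    float v = channel * factor;                     // exposure scaling
    v = std::pow(std::max(v, 0.0f), 1.0f / gamma);  // gamma encode
    v = v * 255.0f + 0.5f;                          // scale and round
    return static_cast<std::uint8_t>(std::min(v, 255.0f)); // clamp to 8 bits
}
```

With this, `r = to_display(r_float_value, factor);` (and likewise for g and b) replaces the raw multiply-and-clamp, and mid-tones no longer crush to black the way they do with a purely linear scale.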

(I am working with Visual C++.)
Thanks for any help.
Greetings

···


Why don't you simply modify the code in ra_bmp.c in the Radiance HEAD distribution? This may be found at:

-Greg

···

From: [email protected]
Date: October 10, 2004 11:10:15 AM GMT+01:00
