Because you generally don't want to convert it as-is; you want it toned into a range that LDR images (8 and 16 bit/channel) can handle.
And because after you convert from 32 bit/channel down to 8 or 16, you've lost a lot of data that you can't recover.
I tested this: if I convert to 8-bit using the Exposure and Gamma method with default settings, the appearance of the HDR image stays the same, so it seems no color data was lost there (this relates to the second sentence of your post). To me, adjusting all those lighting settings is something you choose to do no matter what type of picture is in front of you; it doesn't look like something HDR images specifically need. I may be very much mistaken, but that's because I still can't make sense of it.
Yes, you are mistaken.
HDR is High Dynamic Range. You can't see the whole range on your display, or in an 8 or 16 bit image. LDR images represent a range of 0 to 1. HDR represents a range of -infinity to +infinity.
When you converted to 8 bit, you lost the data brighter than white, and darker than a few levels above black.
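To make that loss concrete, here's a minimal sketch using NumPy (just an illustration of the math, not how any particular application does its conversion). An HDR pixel of 4.0 is four times brighter than display white, but after a straight conversion to 8-bit it becomes indistinguishable from 1.0:

```python
import numpy as np

# A few HDR pixel values (linear light, 1.0 = display white).
# Values above 1.0 are "brighter than white" highlight detail.
hdr = np.array([0.25, 1.0, 4.0, 16.0], dtype=np.float32)

# Straight conversion to 8-bit: clip to [0, 1], then quantize.
ldr = np.clip(hdr, 0.0, 1.0)
ldr8 = np.round(ldr * 255).astype(np.uint8)

# The 4.0 and 16.0 pixels both collapse to 255 -- their difference
# is gone, and no later adjustment of the 8-bit file can restore it.
```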
You also have to remember that to make an HDR image, you need many LDR images made at different exposures, or a high precision 3D rendering of a scene. You can't just convert an 8 or 16 bit image to 32 bit/channel and instantly have high dynamic range data.
Bear with me if you please, and I hope I'm not bugging you too much :)
"When you converted to 8 bit, you lost the data brighter than white, and darker than a few levels above black."
I thought I couldn't see that data anyway, due to the limitation of the monitor. So what difference does it make?
Just because you couldn't see the data on your monitor does not mean that the data does not exist, or shouldn't be visible when the image is toned correctly.
Many of Ansel Adams's negatives would have had blown out white areas when printed - but he spent time tweaking the exposure, dodging and burning to make more detail visible. That's the same sort of thing you are doing with HDR toning.
OK, one more thing I want to ask. Is it safe to say that normally viewing an HDR in, say, ACDSee wouldn't reveal its true quality, but in Photoshop, with the necessary adjusting, all that "hidden" data becomes available? So to speak, only in Photoshop?
Unless you do some toning to make the HDR data visible in LDR, or use a specialized HDR display, you can't see all the HDR data at one time. But there are other applications that also work with HDR data, and can do toning or at least change the exposure in realtime.
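That "change the exposure" part is really just scaling the linear HDR values before they are clipped for display; a rough sketch (the function name here is my own, not any application's API):

```python
import numpy as np

def expose(hdr: np.ndarray, ev: float) -> np.ndarray:
    """Simulate an exposure slider: shift by `ev` stops, then clip for display."""
    return np.clip(hdr * (2.0 ** ev), 0.0, 1.0)

scene = np.array([0.1, 1.0, 8.0], dtype=np.float32)

# At 0 EV the 8.0 highlight is blown out to 1.0; at -3 EV it lands
# exactly at 1.0 while the midtone sinks to 0.125 -- the highlight
# data was there in the HDR file all along, just outside the window.
bright = expose(scene, 0.0)
dark = expose(scene, -3.0)
```

Sliding the EV value up and down is like moving a small viewing window across the much larger HDR range, which is why those applications can show different slices of the data in realtime.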
I don't know if this will help, but I've heard about HDR monitors (still very expensive), which supposedly very accurately display the HDR values. If you hang one on the wall in your house, people might think it's a window, since it is literally a source of light, in the same way a window to the outside world is.
Possibly the most surprising use of HDR imagery is in 3D apps where global illumination models are used. HDR imagery literally is the only source of light for these images. There are no lights in the scene, only HDR imagery. This is because of the great dynamic range of HDR.
A more practical (to me) use of HDR is when resurrecting antique images. If you scan multiple exposures from old prints, then create HDR, sculpting values in 16-bit becomes a very powerful tool, only due to the HDR value range.
Photoshop, although it can edit very well in 16-bit, can only show 8-bit representations of HDR imagery, since almost all monitors are 8 bits per channel. Because of the monitors, we can only see a small part of what is actually present in the HDR file.
I've never done it, but I imagine that if we all had HDR monitors, we'd need to wear protective glasses (like you wear driving in your car when the sun is out) most of the time to do our work. :+)
Yeah, "white" on the HDR displays kinda demands sunglasses.
As you said, when converting HDR to 8 or 16 bit, information is lost. Then how can I reveal that information back (via tonal mapping) if it has been lost in the process? Adjusting gamma, for example, isn't adding information back to each pixel, is it? It's still 8 or 16 bit. It sounds to me like "do all this adjusting and you will see the 8-bit version as good as the HDR would show on an HDR display device, whatever that may be, but by definition of HDR you simply can't," and that's why I'm so confused.
Thanks for the time and patience. If I'm pushing it too far, feel free to stop answering; I will totally understand.
Once you reduce the bit depth from an HDR image, the values are still there, there just aren't as many of them. Your 8-bit monitor will still show all the values that it can show you, but if the image is 8-bit, what you see on your monitor is [all] the values there are in the image.
I find the best strategy is to retain the 32-bit image for any potential future changes, reduce it to 16-bit for any post-HDR retouch and adjustment and retain that version as well. Then for emailing clients or putting images on the web, an 8-bit conversion (as in a jpeg format) works perfectly. And you still have the higher bit-depth versions in case something needs to be adjusted in the future.
When you do the toning, you are compressing the larger HDR tonal range into the smaller LDR tonal range.
You aren't adding anything back, you are compressing the range to make more things visible before finalizing the conversion to an LDR image. And no, LDR images are never as good as HDR images.
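One classic way of doing that compression is a global operator of the form x / (1 + x) (the Reinhard operator), which squeezes the whole 0-to-infinity HDR range into 0-to-1 without hard clipping. This is just a sketch of the idea, not how Photoshop or any other application implements its toning:

```python
import numpy as np

def reinhard(hdr: np.ndarray) -> np.ndarray:
    """Global Reinhard tone mapping: compresses [0, inf) into [0, 1)."""
    return hdr / (1.0 + hdr)

hdr = np.array([0.5, 4.0, 16.0], dtype=np.float32)
toned = reinhard(hdr)

# Unlike straight clipping, the 4.0 and 16.0 pixels stay
# distinguishable after toning (0.8 vs ~0.94), so some highlight
# detail survives the trip down to an LDR image.
```

Compare this with clipping: both end in an LDR range, but the toning chose how to spend the limited values, which is exactly the "compressing the larger HDR tonal range" described above.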