Phocus and The Colorspace


Hassilistic

Prologue:
There is an old topic I came across that goes back eight years, but it still comes up in various forms today all over this forum.

Please see the link before continuing to read below:  http://www.hasselbladdigitalforum.com/index.php?topic=151.0

The couple of points I'll highlight today apply to the issue witnessed back then, and to what could still be happening with our modern devices.  (This is, of course, not comprehensive.)

However, these are the most common scenarios:

1- When we view images in the Phocus software, 100% of the time we are reviewing and editing the [RAW] Hasselblad files.  What you need to know is that raw files have no colorspace, so to speak; raw data is linearly mapped.  A colour working space is only applied when the image is exported, in this example to Photoshop.  At that point the linear data is gamma-mapped into the space you have selected in ACR.  Working on the linear data is less destructive than working on the gamma-mapped data, which is one of the advantages of shooting raw.
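For illustration, here is what that gamma mapping looks like for the sRGB working space.  This is a minimal sketch of the standard sRGB transfer function applied to scene-linear values, not Phocus's or ACR's actual conversion code:

import numpy as np

def linear_to_srgb(x):
    # Standard sRGB encoding: a short linear segment near black,
    # then a power curve (roughly gamma 2.2) for the rest of the range.
    x = np.clip(x, 0.0, 1.0)
    return np.where(x <= 0.0031308,
                    12.92 * x,
                    1.055 * np.power(x, 1.0 / 2.4) - 0.055)

linear = np.array([0.0, 0.001, 0.18, 0.5, 1.0])   # example scene-linear values, 0..1
print(np.round(linear_to_srgb(linear), 4))         # the same values after gamma mapping

Note how the curve stretches the dark tones: that is exactly the redistribution of values you want to happen only once, at export, rather than editing data that has already been bent this way.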

The raw data in a DNG does not have a colorspace, but the embedded preview does.  When you open the raw data in a DNG - say, by opening the file in Adobe Camera Raw - the data will be opened *into* a colorspace.

The embedded preview is built in the sRGB colorspace, but does not actually carry a tag saying so (that would only confuse matters).  Applications that know how to render DNGs know that this preview is effectively an sRGB file, and may report it as if it carried that tag.
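You can see this for yourself with a small sketch using the third-party rawpy and Pillow libraries: it pulls the embedded preview out of a DNG and shows that it decodes as an ordinary JPEG with no ICC profile attached (the file name here is just a placeholder).

import io

import rawpy
from PIL import Image

# Hypothetical file name; any DNG with an embedded JPEG preview will do.
with rawpy.imread("example.dng") as raw:
    thumb = raw.extract_thumb()

if thumb.format == rawpy.ThumbFormat.JPEG:
    preview = Image.open(io.BytesIO(thumb.data))
    # Typically prints the preview size and False: no embedded ICC profile,
    # which is why well-behaved viewers simply assume sRGB.
    print(preview.size, preview.info.get("icc_profile") is not None)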

Note that setting the colorspace in your camera will not alter the raw image data.  Rather, it just puts a tag in the file indicating which colorspace the manufacturer's software should prefer to open the image into.  It will also affect how the JPEG is built if you are shooting JPEGs.
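If you want to see that tag for yourself, here is a minimal sketch using the third-party exifread library.  The file name is a placeholder, and the exact tag names vary between plain JPEGs and raw containers such as 3FR or DNG, where the equivalent information may live in maker-specific tags:

import exifread

with open("example.jpg", "rb") as f:
    tags = exifread.process_file(f, details=False)

# For JPEGs this is the in-camera colorspace setting:
# 1 = sRGB, 65535 = "uncalibrated" (typically Adobe RGB, flagged elsewhere).
print(tags.get("EXIF ColorSpace"))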

2- The output devices - in this case the EIZO monitor - are greatly affected by how the PC connects to the monitor.  Nowadays that is usually via HDMI, which is no good at all: HDMI 1.3 and 1.4 are not capable of carrying the wide colour gamut generated by these modern cameras and computers (hence the delay in 4K material; until HDMI 2.0 is ready that won't change).  In addition, HDMI cables have varying levels of throughput depending on their quality, so make sure yours can consistently carry more than 10 Gbit/s.
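To put a rough number on that bandwidth claim, here is a back-of-envelope sketch of the raw pixel data rate for 4K video.  The real link rate has to be higher still, because blanking intervals and TMDS encoding overhead are ignored here:

def pixel_rate_gbps(width, height, fps, bits_per_channel, channels=3):
    # Uncompressed pixel data only, in gigabits per second.
    return width * height * fps * bits_per_channel * channels / 1e9

for bpc in (8, 10, 12):
    rate = pixel_rate_gbps(3840, 2160, 60, bpc)
    print(f"4K 60 Hz at {bpc}-bit: ~{rate:.1f} Gbit/s of pixel data")

# HDMI 1.4 "High Speed" tops out around 10.2 Gbit/s total, while HDMI 2.0
# raises that to about 18 Gbit/s - which is why 4K at 60 Hz, let alone
# deeper bit depths for wide gamut, had to wait for the newer standard.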

PS. I have personally found that a Thunderbolt cable connected to my EIZO monitor yields much better results, instantly recognisable without a shred of doubt.

On another note:
You have probably heard me mention on more than one occasion: never process a raw file twice (it's redundant) by converting it into another raw file, which is what happened here when the files were exported from 3F to DNG.  I understand the intention was to save HDD space, but it is not a good workflow.  As suggested, an 8-bit TIFF would have been a much better choice.

Cheers,