Why do I need a Decklink?

    #10009

    JoeSchr
    Participant

    Hi,

    maybe someone here can help me understand why people keep telling me to get a DeckLink for use with Resolve. My situation is as follows:

    I have:

    • A wide-gamut 10-bit monitor that I can calibrate and load a Rec. 709 LUT into:
      • Samsung U32D970q via their Natural Color Expert Software
    • Windows 10
    • NVIDIA Settings
      • Change Resolution->Use NVIDIA color settings
        • Desktop Color depth: Highest,
        • Output Color Depth: 10bpc,
          • (but 10-bit didn’t actually work the last time I tested it)
        • Color Format: RGB
        • Output dynamic range: Full
      • Adjust video color settings -> Advanced -> Full (0-255)
    • Resolve
    • GTX 970 connected to the monitor via DisplayPort

    The way I see it, if I just use the method from the FAQ guide “Creating a 3D LUT for the GUI color viewer” to create and use the LUT in the Reference GUI Viewer, I should be fine without the DeckLink, right?
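
    (As I understand it, the LUT that guide produces is just a lattice of corrected RGB values that gets interpolated per pixel. Below is only a rough Python sketch of the idea, not DisplayCAL’s or Resolve’s actual code; the .cube parsing is simplified and the file name is a made-up placeholder.)

        import numpy as np

        def load_cube(path):
            """Minimal .cube reader: returns (size, lattice) with lattice shaped
            (size, size, size, 3). Assumes a plain LUT_3D_SIZE file with red
            varying fastest (the usual .cube layout)."""
            size, rows = None, []
            with open(path) as f:
                for line in f:
                    line = line.strip()
                    if not line or line.startswith("#"):
                        continue
                    if line.upper().startswith("LUT_3D_SIZE"):
                        size = int(line.split()[1])
                    elif line[0].isdigit() or line[0] == "-":
                        rows.append([float(v) for v in line.split()[:3]])
            return size, np.array(rows).reshape(size, size, size, 3)  # indexed [b, g, r]

        def apply_lut(rgb, size, lattice):
            """Trilinear interpolation of one normalized RGB triplet through the lattice."""
            pos = np.clip(np.asarray(rgb, float), 0.0, 1.0) * (size - 1)
            lo = np.floor(pos).astype(int)
            hi = np.minimum(lo + 1, size - 1)
            f = pos - lo
            out = np.zeros(3)
            for dr in (0, 1):  # weighted sum over the 8 surrounding lattice points
                for dg in (0, 1):
                    for db in (0, 1):
                        w = ((f[0] if dr else 1 - f[0]) *
                             (f[1] if dg else 1 - f[1]) *
                             (f[2] if db else 1 - f[2]))
                        r = hi[0] if dr else lo[0]
                        g = hi[1] if dg else lo[1]
                        b = hi[2] if db else lo[2]
                        out += w * lattice[b, g, r]
            return out

        # size, lattice = load_cube("gui_viewer_rec709.cube")  # placeholder file name
        # print(apply_lut([0.1, 0.5, 0.9], size, lattice))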

    This is how I would use the Decklink (I just bought it):

    Why is this any better than the other method, where I just use the GUI color viewer?

    Assuming I have my stuff in order (always using the same Rec. 709 mode on my monitor, always using the same profile in the Windows CMS), that should be enough. At least this is how I understood this passage in the Resolve manual:

    When the image looks the same on the GUI Viewer (in fullscreen) and on the DeckLink output, it means I didn’t mess something up on the OS side, right?

    The only remaining plus point for the DeckLink would be the 10-bit output, wouldn’t it? Or as somebody else on the BMD forums described it to me:

    The last paragraph in that extract says it all from “Strictly speaking…” onwards. Consumer gpus and computer displays are limited to 8bit and often the displays can in fact sometimes be 6 bit. The colour space is completely different to video. Although you can emulate 601/709 you are still limited by the bit depth. Most of that extract refers to grading for the web which is far less critical.

    You can only grade for professional broadcast online HD with full spec I/O hardware and a true grade A calibrated 10 bit monitor to Rec. 709. Happily most cost effective I/O hardware, even the Shuttle, can achieve this, these days, but the required quality monitors are still not cheap. When you come to grading for 2K/4K and new technologies such as HDR, those costs can spiral and the technical demands are even more critical.

    But Full HD is 8-bit as far as I know, and sRGB is not a completely different color space than Rec. 709, right?
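
    (For what it’s worth: sRGB and Rec. 709 share the same primaries and D65 white point, so the gamut is identical; what differs is the transfer function and the typical encoding range, full range vs. 16-235 video levels. Here is a quick plain-Python comparison of the two published transfer curves, just to illustrate; in practice Rec. 709 displays are usually modelled with BT.1886 / gamma ~2.4, which this does not cover.)

        def srgb_eotf(v):
            """sRGB decoding per IEC 61966-2-1 (piecewise linear/power curve)."""
            return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

        def rec709_inverse_oetf(v):
            """Inverse of the Rec. 709 camera OETF (scene light from signal)."""
            return v / 4.5 if v < 0.081 else ((v + 0.099) / 1.099) ** (1 / 0.45)

        for v in (0.1, 0.25, 0.5, 0.75, 1.0):
            print(f"signal {v:.2f}: sRGB {srgb_eotf(v):.4f} vs Rec. 709 {rec709_inverse_oetf(v):.4f}")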

    I just have the feeling that behind all this talk of reference monitors and extra cards and equipment there is a lot of FUD. So people throw money at it to have peace of mind and to save themselves the time of understanding what they actually need.

    Am I wrong here? If so, please help me understand, thx!

    #10050

    Florian Höch
    Administrator

    Hi,

    NVIDIA Settings
    […]
    Output Color Depth: 10bpc (but 10-bit didn’t actually work the last time I tested it)

    It needs software specifically written to make use of the additional available bits (basically, it requires using DirectX 11 or OpenGL, as far as I understand). Otherwise, only the videoLUT (graphics card gamma tables) will benefit from the additional bit depth (but due to bugs/quirks in NVIDIA’s implementation, even that does not work reliably; e.g. the 10 bpc videoLUT is reproducibly lost after resuming from hibernation or standby on my system).
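
    To illustrate the videoLUT part with a minimal plain-Python sketch (an arbitrary made-up correction curve, not DisplayCAL’s actual code): when the gamma table entries are quantized to 8 bits, some of the 256 input levels collapse onto the same output value (visible as banding), while 10-bit entries keep them distinct.

        def quantized_gamma_table(gamma, out_bits, in_levels=256):
            """Gamma correction curve quantized to out_bits output codes."""
            out_max = (1 << out_bits) - 1
            return [round(((i / (in_levels - 1)) ** gamma) * out_max)
                    for i in range(in_levels)]

        for bits in (8, 10):
            table = quantized_gamma_table(0.8, bits)  # arbitrary mild correction
            print(f"{bits}-bit videoLUT: {len(set(table))} distinct outputs for 256 inputs")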

    The way I see it, if I just use the method from the FAQ guide “Creating a 3D LUT for the GUI color viewer” to create and use the LUT in the Reference GUI Viewer, I should be fine without the DeckLink, right?

    Yes.

    This is how I would use the Decklink (I just bought it):

    […]

    Why is this any better than the other method, where I just use the GUI color viewer?

    Some people prefer a dedicated video monitor for a variety of reasons:

    • To bypass the system graphics card and drivers and any restrictions that may come with them (e.g. bit depth)
    • To use the GUI monitor mainly for scopes, a small preview, etc., with a bigger preview available via the video monitor
    • etc.

    But Full HD is 8-bit as far as I know

    The point of high bit depth output when applying a video calibration 3D LUT to 8-bit data is to not lose effective bit depth (which would happen if staying in 8-bit). Thus it makes sense to use a higher output bit depth even when only working on 8-bit data (or at least dither down to 8-bit on output after applying the 3D LUT if no higher output bit depth is available).
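
    A minimal sketch of that point (made-up numbers, not DisplayCAL’s actual pipeline): push an 8-bit ramp through a correction at high precision, then either round straight back to 8 bits (adjacent codes collapse, i.e. effective bit depth is lost) or dither down to 8 bits (the fractional part survives on average).

        import random

        def correct(v):                        # stand-in for the 3D LUT output, 0..1
            return v ** (1 / 1.15)             # arbitrary mild correction curve

        corrected = [correct(i / 255) * 255 for i in range(256)]  # high-precision result

        # 1) Staying in 8 bits by rounding: neighbouring codes collapse -> banding.
        rounded = [round(v) for v in corrected]
        print("rounded to 8 bit:", len(set(rounded)), "distinct levels out of 256")

        # 2) Dithering down to 8 bits: each output value is still 8 bit, but averaged
        #    over time/area the fractional part is preserved, so nothing is effectively lost.
        random.seed(0)
        v = corrected[200]
        avg = sum(int(v + random.random()) for _ in range(10000)) / 10000
        print(f"target {v:.3f} -> dithered 8-bit average {avg:.3f}")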
