Topic: So is 10bit support going to happen? We'll need it for HDR...
HDR playback needs at least 10bit color if you don't want to end up with color banding on an HDR-capable display. HDR media isn't that widespread yet, but it'd be better to get support implemented before it is, so SVP doesn't end up behind the curve.
Here's a sample 10bit HDR clip in 4k:
http://files.hdrsamples.com/downloads/h … HD_HDR.mp4
EDIT: It must be noted, however, that Dolby Vision HDR uses 12bit, so perhaps it'd be wiser to implement support in a way that's flexible and independent of the source's color depth (maybe even process at 16bit internally and then dither down to whatever depth the display is running at?).
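Just to illustrate the idea, here's a rough sketch of what that could look like as a VapourSynth script (SVP is driven through VapourSynth anyway). This is NOT SVP's actual pipeline, the filename is a placeholder, and it assumes the L-SMASH Works source plugin is installed:

import vapoursynth as vs
core = vs.core

# Open the clip (placeholder path, not the real sample file)
clip = core.lsmas.LWLibavSource("sample_UHD_HDR.mp4")

# Up-convert to 16bit so 10bit HDR10 and 12bit Dolby Vision sources
# both go through the same internal pipeline with headroom to spare
clip = core.resize.Point(clip, format=vs.YUV420P16)

# ... frame interpolation would run here at 16bit ...

# Dither back down to whatever the display accepts (10bit in this example)
clip = core.resize.Point(clip, format=vs.YUV420P10, dither_type="error_diffusion")

clip.set_output()

That way the interpolation itself never has to care what the source or display depth is, only the final conversion step does.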