What's the consensus on 8bit vs. 10bit? (2713H vs. HM)
I suppose I could search for threads; I'd imagine that's been discussed plenty.
10-bit has virtually no software support apart from a very few OpenGL apps, so simply being 10-bit does almost nothing. Windows is 8-bit all the way; any 10-bit rendering must be done through the graphics card driver, bypassing the Windows graphics functions entirely (so it needs specifically written software).
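To give a feel for what "specifically written software" means, here's a rough sketch of what an app has to do just to ask for a 10-bit framebuffer (using the Python glfw/PyOpenGL bindings purely as an illustration); whether the driver actually grants it is another matter.

```python
# Hypothetical sketch: requesting a 10-bit ("30-bit") framebuffer explicitly.
# Assumes the `glfw` and `PyOpenGL` packages; whether 10 bits are actually
# granted depends entirely on the GPU, the driver and the monitor.
import glfw
from OpenGL import GL

def probe_10bit_framebuffer():
    if not glfw.init():
        raise RuntimeError("GLFW init failed")
    try:
        # Ask the driver for 10 bits per color channel.
        glfw.window_hint(glfw.RED_BITS, 10)
        glfw.window_hint(glfw.GREEN_BITS, 10)
        glfw.window_hint(glfw.BLUE_BITS, 10)
        glfw.window_hint(glfw.VISIBLE, glfw.FALSE)   # no need to show a window

        window = glfw.create_window(640, 480, "10-bit probe", None, None)
        if not window:
            raise RuntimeError("Could not create a window")
        glfw.make_context_current(window)

        # Report what was actually granted; many setups silently fall back to 8.
        print("red bits granted:", GL.glGetIntegerv(GL.GL_RED_BITS))

        glfw.destroy_window(window)
    finally:
        glfw.terminate()

if __name__ == "__main__":
    probe_10bit_framebuffer()
```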
However, 10-bit has a minor advantage if you are recalibrating. Most graphics cards can store a look-up table (LUT) that lets the card adjust colors/brightness after rendering, but before the signal is transmitted to the monitor. With a 10-bit monitor you get a 10-bit LUT, so the adjustments throw away fewer distinct levels. Even so, calibrating through the card's LUT is very much a poor-man's option, 10-bit or not.
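A rough illustration of why the LUT bit depth matters (the calibration curve is made up, just to show the counting):

```python
# Made-up calibration curve (tweaking gamma from 2.2 to 2.4), applied through
# the video card LUT, to show how many of the 256 input levels survive.
import numpy as np

levels_in = np.arange(256) / 255.0        # 8-bit framebuffer values, 0..1
corrected = levels_in ** (2.4 / 2.2)      # hypothetical calibration curve

out_8bit = np.round(corrected * 255).astype(int)    # 8-bit LUT output
out_10bit = np.round(corrected * 1023).astype(int)  # 10-bit LUT output

print("distinct levels through an 8-bit LUT :", len(np.unique(out_8bit)))   # fewer than 256: banding
print("distinct levels through a 10-bit LUT:", len(np.unique(out_10bit)))   # all 256 preserved
```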
As it is, the H supports internal 14-bit calibration, and any decent calibration tool can upload a new calibration into the H's internal signal processor.
However, this is not the only difference between the H and the HM.
The H is wide-gamut, so it can display most of the Adobe RGB color space, whereas the HM is just about sRGB-capable.
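If you want to see the gamut difference in numbers, here's a quick sketch using the standard published matrices for Adobe RGB (1998) and sRGB (linear light only, transfer curves ignored): a saturated Adobe RGB green simply has no legal sRGB encoding.

```python
# Standard D65 matrices for Adobe RGB (1998) and sRGB, linear light only.
# A pure Adobe RGB green has no valid sRGB value.
import numpy as np

ADOBE_TO_XYZ = np.array([[0.5767, 0.1856, 0.1882],
                         [0.2973, 0.6274, 0.0753],
                         [0.0270, 0.0707, 0.9911]])

XYZ_TO_SRGB = np.array([[ 3.2406, -1.5372, -0.4986],
                        [-0.9689,  1.8758,  0.0415],
                        [ 0.0557, -0.2040,  1.0570]])

adobe_green = np.array([0.0, 1.0, 0.0])              # saturated Adobe RGB green
srgb = XYZ_TO_SRGB @ ADOBE_TO_XYZ @ adobe_green

print(srgb)   # roughly [-0.40, 1.00, -0.04]; the negative components are out of gamut
```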
The other difference is that the H supports internal re-calibration, whereas the HM does not. The HM is supplied from the factory calibrated to sRGB, but if you need a different calibration (e.g. for Mac compatibility, or for medical use) the HM cannot be recalibrated without seriously degrading image quality.