Monitor for Photography


Puffnstuff

Lifer
Mar 9, 2005
16,033
4,798
136
Believe it or not, Windows 8.1 Pro x64 with Classic Shell is better than Win 7. I ran Win 7 Ultimate x64 from '09 until a few months ago, when I received Win 8.1 and decided that the superior underpinnings were more important to me. Once I had it installed I immediately added Classic Shell to restore the Win 7 appearance, and with 8GadgetPack I got my desktop gadgets back too. With my desktop finished, I did the same to my laptop, replacing Win 7 Home Premium x64 with 8.1 Pro x64. There's nothing wrong with Win 7, so don't think I'm bashing it; I keep several valid copies in reserve in case I ever want to roll back. Win 8 does load its security features first so nothing can usurp them, which is a nice improvement over Win 7.

I've tried the Win 10 tech preview and I like how it gives the user the choice between the desktop and Metro appearance. MS should never have removed it from Win 8.x, and I'm glad it's coming back in official form. Meanwhile, Classic Shell is holding down the fort.
 

nvsravank

Member
Jul 14, 2005
54
0
0
Oh, nothing against Win 8. I was planning on upgrading to it until Microsoft announced the free upgrade to 10!
 

xthetenth

Golden Member
Oct 14, 2014
1,800
529
106
Thanks for all the replies, folks. I want to use a PCIe 3.0 card for future upgradeability. I checked, and my Asus P8Z77-V supports PCIe 3.0, so I don't want to use a very old card. So it's either a GTX 970/980, a Quadro K2200, or an R9 285/X.

I found one 970 that supports multiple DisplayPort outputs.

I understand that I won't get 10-bit color with the 970/980.

I'm going to find out whether Maxwell 2 is being released as a Quadro at a price I can afford. Obviously the Quadro 6000 is way out of my range!

If I need to run my Samsung as a second monitor, I don't know which of these cards can support it. Some more research to do.

My current card is a Radeon 6870. Would it be easier to stay with ATI/AMD for multi-monitor support, with the new monitor run from the new card and the current monitor from the current card? Can I even do that?

I'd appreciate any direction on what research to do. I hope HP comes out with a compatibility list to help as well, but I doubt it, since there are far more choices now.

I read somewhere that I need Windows 8.1. Is that really necessary? I was hoping to get the free upgrade to 10 and skip 8 altogether. I'm currently on Windows 7.

The PCIe version on the card is basically irrelevant: as long as the motherboard has an equal or later PCIe generation, the card won't lose any speed. So I wouldn't worry about that; PCIe is made to be forwards and backwards compatible.
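
For anyone curious, the arithmetic behind that is simple. A minimal sketch (the per-lane figures below are nominal one-direction rates after encoding overhead, not measured throughput):

```python
# Rough per-lane throughput for each PCIe generation, in MB/s
# (one direction, after encoding overhead; nominal figures only).
PCIE_MB_PER_LANE = {1: 250, 2: 500, 3: 985}

def link_bandwidth(card_gen: int, slot_gen: int, lanes: int = 16) -> int:
    """A PCIe link trains to the lower generation of its two ends."""
    gen = min(card_gen, slot_gen)
    return PCIE_MB_PER_LANE[gen] * lanes

# A PCIe 3.0 card in a PCIe 3.0 x16 slot (e.g. the P8Z77-V):
print(link_bandwidth(3, 3))  # 15760 MB/s
# The same card in an older PCIe 2.0 board still works, just slower:
print(link_bandwidth(3, 2))  # 8000 MB/s
```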
 

nvsravank

Member
Jul 14, 2005
54
0
0
I have to open up my box and see if I have enough power to drive an R9 290X. Seems like that's what I need if I want to play games at all!

Thanks for all the ideas, folks.
 

deanx0r

Senior member
Oct 1, 2002
890
20
76
4K or 5K under Windows doesn't look so good. If this is strictly for photo editing, I would look at the iMac with the 5K Retina display instead.
 

Kippa

Senior member
Dec 12, 2011
392
1
81
Just for clarification: do all Nvidia GeForce cards only do 8-bit colour, whilst the latest mainstream AMD cards do 10-bit? I currently have a Titan Mk I but am thinking of crossing over to AMD, as my 3D render program (Octane) is going to support OpenCL directly and I think AMD cards are better value for money (not Nvidia bashing; I'm brand agnostic).

Basically, my next upgrade is a true 10-bit 4K monitor and a graphics card to go with it. I will be using it for gaming, photo editing (true 10-bit would be nice), small amounts of video editing, and 3D rendering. The plan is to wait a month or two and see what the 390X is like.
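
For scale, the 8-bit vs 10-bit difference is easy to put in numbers. A minimal sketch (the banding figures assume a simple linear gray ramp):

```python
# Tonal levels per channel and total colors for 8-bit vs 10-bit output.
for bits in (8, 10):
    levels = 2 ** bits   # steps per channel
    total = levels ** 3  # R x G x B combinations
    print(f"{bits}-bit: {levels} levels/channel, {total:,} colors")
# 8-bit:  256 levels/channel,  16,777,216 colors
# 10-bit: 1024 levels/channel, 1,073,741,824 colors

# Why photo editors care: a smooth gray ramp across a 5120 px wide
# screen gets 5120/256 = 20 px per step at 8 bits (visible banding)
# versus 5120/1024 = 5 px per step at 10 bits.
print(5120 / 256, 5120 / 1024)  # 20.0 5.0
```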
 

nvsravank

Member
Jul 14, 2005
54
0
0
4K or 5K under Windows doesn't look so good. If this is strictly for photo editing, I would look at the iMac with the 5K Retina display instead.


Deanx0r, I had a Mac Pro for the longest time. I got out of Mac because upgrading it wasn't easy; I kept it so long because upgrading mostly meant swapping the video card (hard to find the latest ones that work) or the hard disk! I prefer to upgrade components every year, so that after four years I basically have a new computer. Last year it was the SSD move and an HDD increase; this year it's the monitor and card; next year it's back to motherboard and CPU. You can't really do that with a Mac.
 

xthetenth

Golden Member
Oct 14, 2014
1,800
529
106
I thought the 300 series were all rebadges? Will power consumption come down for AMD cards any time soon?

The 390X is emphatically not a rebadge. I'm not sure what the power draw will be on them, but efficiency should go up in a big way from the added performance.
 

nvsravank

Member
Jul 14, 2005
54
0
0
OP, what are you shooting with?


A 5D Mark III.

I'm going to try the 5DS and 7D Mark II this summer before plunging in. I'm hoping we get some details on the 5D Mark III's replacement by then! And no, I don't consider the 50 MP cameras a replacement for a do-it-all like the Mark III.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
AMD makes zero cards that can output 5K resolution. Even their most expensive $3,100 professional card cannot output above 4K: http://www.amazon.com/Sapphire-Disp..._sbs_pc_4?ie=UTF8&refRID=0C9P0NJF0ES07H1PD5QK

nVidia makes multiple cards under $250 that can handle 5K resolution: http://www.newegg.com/Product/Produ...7128&cm_re=GTX_960_4GB-_-14-487-128-_-Product http://www.newegg.com/Product/Produ...5778&cm_re=GTX_960_4GB-_-14-125-778-_-Product

The cards have plenty of bandwidth. It's hard to tell from the promo materials, but that should just be a matter of a driver update; it may already have been done. I would check a bit deeper before simply dismissing AMD. 10-bit was a requirement from the OP.
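
Back-of-the-envelope numbers show why 5K is a question of link capacity rather than GPU horsepower. A rough sketch (blanking overhead ignored, which only raises the real requirement):

```python
# Raw video bandwidth needed for 5K at 60 Hz with 10 bits per channel.
width, height, refresh_hz = 5120, 2880, 60
bits_per_pixel = 3 * 10  # RGB, 10 bits per channel

gbits = width * height * refresh_hz * bits_per_pixel / 1e9
print(f"needed: {gbits:.1f} Gbit/s")  # ~26.5 Gbit/s

# DisplayPort 1.2 carries about 17.28 Gbit/s of payload per link, so
# one DP 1.2 cable can't do it -- which is why early 5K panels are
# driven as two tiles over two DP 1.2 streams.
dp12_payload_gbits = 17.28
print(gbits <= dp12_payload_gbits)  # False
```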
 

ruben_c

Junior Member
May 30, 2014
4
0
0
As far as a calibration device goes, I'd go with an i1 Display Pro. They run $225-250 and are considered quite a bit better than anything Spyder makes. They're also supposed to hold their accuracy longer than other colorimeters. I have one, as well as a ColorMunki spectro to keep it accurate, and after two years it hasn't drifted much.

I would also recommend calibrating the monitor; this is always the first and most important step.

Actually, I'm surprised by your statement. I've been using a Spyder4Pro for four years now and everything seems OK; my prints come out just as I saw the pictures on screen when editing... I'll keep an eye on it, though...
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126

Wow, really dude, you've just GOT to post misinformation at every turn where it might push a guy toward an nVidia purchase?

Your point is disproven by a Google search that took me literally five seconds: http://lmgtfy.com/?q=amd+card+2x+display+port

This is blatant partisan trolling.

==================

OP: 10-bit really is important if you're serious about photo reproduction on screen. I'd say bite the bullet and go with a professional card that supports it, even if it means stepping down from 5K to 4K.

Also look into what alternative codepaths exist for the plugins/effects you use most frequently. A good few effects in Photoshop can use both CUDA and OpenCL GPU-accelerated code, but some are still CUDA-only, so you might prefer an nVidia card if you need those effects accelerated.
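
If you want to check what compute devices a given card actually exposes before buying around a plugin, a quick enumeration does it. A minimal sketch, assuming the pyopencl package is installed (it is not part of the standard library):

```python
# List every OpenCL platform and device the installed drivers expose.
# OpenCL-accelerated effects need a device to show up here; CUDA-only
# effects additionally require an NVIDIA card.
import pyopencl as cl

for platform in cl.get_platforms():
    print(platform.name)
    for device in platform.get_devices():
        print("  ", device.name, "-", cl.device_type.to_string(device.type))
```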
 

lopri

Elite Member
Jul 27, 2002
13,209
594
126
Doesn't the 6870 have two DP outputs already? Why not try it and see if it works?

You can get 10-bit color on consumer AMD cards with some hack; I forget how, but there are ways. It comes down to the drivers, which is why professional cards do 10-bit color. What you read is correct: everything has to support 10-bit for it to work, display, video card, OS, and software.
 

nvsravank

Member
Jul 14, 2005
54
0
0
Doesn't the 6870 have two DP outputs already? Why not try it and see if it works?

I think I will, though I don't expect it to work for 5K :)


OP: 10-bit really is important if you're serious about photo reproduction on screen. I'd say bite the bullet and go with a professional card that supports it, even if it means stepping down from 5K to 4K.

I think the NVIDIA Quadro K2200 does both. So if I go with a professional card, I have at least one that covers nearly all my needs/wants and is in the price range I'm looking at as well. I'm just worried that it's previous-generation. I normally like to buy the latest-generation mainstream stuff, but I guess wanting 5K means I can't do that! I just need to get used to that idea and talk myself into it. Maxwell 2 is very, very tempting, except, as you said, it doesn't do 10-bit! What is the direct replacement for the Quadro K2200?

Also look into what alternative codepaths exist for the plugins/effects you use most frequently. A good few effects in Photoshop can use both CUDA and OpenCL GPU-accelerated code, but some are still CUDA-only, so you might prefer an nVidia card if you need those effects accelerated.

So you're saying CUDA support is better for most GPU acceleration in Photoshop. I don't do many effects/plugins; Nik and Perfect Photo are the only two suites I use, and I might run them on maybe 10 photos per shoot, so time isn't much of an issue (I do a shoot every week during the season). I guess that's another reason to go with NVIDIA!

Any reason why NVIDIA doesn't support 10-bit color? Just arbitrary differentiation?
 

nvsravank

Member
Jul 14, 2005
54
0
0
I think the NVIDIA Quadro K2200 does both. So if I go with a professional card, I have at least one that covers nearly all my needs/wants and is in the price range I'm looking at as well. I'm just worried that it's previous-generation. I normally like to buy the latest-generation mainstream stuff, but I guess wanting 5K means I can't do that! I just need to get used to that idea and talk myself into it. Maxwell 2 is very, very tempting, except, as you said, it doesn't do 10-bit! What is the direct replacement for the Quadro K2200?


Doh! I thought that because the 600 series was Kepler, the K2200 was also Kepler. I found out today that it's a GM107 part! I guess decision time is over. Thanks, folks!
 
Nov 20, 2009
10,048
2,576
136
5K resolution on a 27" monitor seems a bit absurd when you ask yourself whether you can even resolve that resolution with your own eyes. How close would one need to sit to the monitor to benefit from it? Everyone is buying into the marketing hype, even the blind.

Woohoo, I got bling!
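
The "can you even resolve it" question is answerable with arithmetic. A quick sketch, taking the usual 20/20-acuity rule of thumb of one arcminute per pixel:

```python
import math

# Pixel density of a 27" 5120x2880 panel.
ppi = math.hypot(5120, 2880) / 27
print(f"{ppi:.0f} ppi")  # ~218 ppi

# Distance at which one pixel subtends one arcminute (20/20 acuity):
# sit closer than this and the pixel grid is resolvable, farther and
# it isn't.
pixel_pitch_in = 1 / ppi
one_arcmin_rad = math.radians(1 / 60)
distance_in = pixel_pitch_in / math.tan(one_arcmin_rad)
print(f"~{distance_in:.0f} in (~{distance_in * 2.54:.0f} cm)")  # ~16 in
```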
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
5K resolution on a 27" monitor seems a bit absurd when you ask yourself whether you can even resolve that resolution with your own eyes. How close would one need to sit to the monitor to benefit from it? Everyone is buying into the marketing hype, even the blind.

Woohoo, I got bling!

This is for photo work... with potentially very high-DPI printing... so yes, it's important.

OP: you'll have to Google the particular photo suites you use and see whether they utilize GPU acceleration.