At least AMD didn't fubar tri and quad scaling for single monitors. Srsly, that's just pure fail there.
Fail review. U shouldn't test single monitor gaming with 3 or 4 680s or 7970s.
At least AMD didn't fubar tri and quad scaling for single monitors. Srsly, that's just pure fail there.
He probably has them running on PCIe 3.0, whereas the review is on 2.0.
You have to be a true fan to cite scaling while ignoring the fact that the scaling comes with more pronounced MS. Everyone knows the 5xxx series had issues with scaling, and AMD solved that with the 6xxx series by no longer attempting to combat MS.
Maybe "U" in fact should when even 4 flagship cards can't get you a solid 60fps at 2560x1600.Fail review.U shouldn't test single monitor gaming with 3 or 4 680s or 7970s
Maybe "U" in fact should when even 4 flagship cards can't get you a solid 60fps at 2560x1600.
Don't be silly, no card needs more than 256MB of memory, because the Frostbite engine dynamically adjusts the level of detail so you wouldn't suffer any loss of quality.
And you are missing MY POINT. You don't gimp your fastest card by not giving it enough VRAM to handle every playable situation. You don't cut down the VRAM and tell customers not to play games with settings that create oddball situations. 1.5GB will limit you now in some cases, so it would be stupid to equip a 7970 with it.
Maybe "U" in fact should when even 4 flagship cards can't get you a solid 60fps at 2560x1600.
That proves how poorly optimized the game is. Tri- or quad-fire or SLI should be tested at Eyefinity resolutions.
Read the forum thread over at HardOCP... it scares me that so many people are ignorant about microstuttering.
It's like watching lemmings...go ape*bleep* over the current FOTM:
OMG...GHZ!!!!
OMG...FPS!!!!
WTF...IPC?!
WTF...Micro-what?!
Who ever promised you that?
What? I was responding to a post stating single monitor shouldn't be tested at all, and my point was that it can be far from a useless inclusion.
Fallacy time?
That proves how poorly optimized the game is. Tri- or quad-fire or SLI should be tested at Eyefinity resolutions.
Personally I agree about the poor optimization and the desire for more focus on the (multi-monitor) resolutions people will actually be buying these cards to play at. I'm just remarking that in certain cases even single-monitor resolutions aren't maxed out, so it's informative to include those (as a measure of just how unoptimized some things remain, if nothing else).
Using a frame limiter to match the refresh rate of the monitor can eliminate practically all traces of MS, but it can be annoying to get it to work across all games.
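If anyone wants to picture what a frame limiter is actually doing, it's basically just padding each frame out to the panel's refresh interval so frames go out at an even cadence instead of the bursts that read as microstutter. A rough sketch of the idea only, assuming a 60 Hz monitor; renderFrame() here is a made-up stand-in for the game's draw call, not any real API:

```cpp
#include <chrono>
#include <iostream>
#include <thread>

// Hypothetical stand-in for the game's draw/present call; it just burns a
// varying amount of time to mimic uneven frame delivery from AFR.
void renderFrame(int i) {
    std::this_thread::sleep_for(std::chrono::milliseconds(5 + (i % 3) * 4));
}

int main() {
    using clock = std::chrono::steady_clock;
    const auto frameBudget = std::chrono::microseconds(16667); // ~60 Hz panel assumed

    auto deadline = clock::now() + frameBudget;
    for (int i = 0; i < 10; ++i) {
        const auto start = clock::now();
        renderFrame(i);
        // Sleep off whatever is left of this frame's budget so frames go out
        // at an even pace instead of in bursts.
        std::this_thread::sleep_until(deadline);
        deadline += frameBudget;
        const double ms =
            std::chrono::duration<double, std::milli>(clock::now() - start).count();
        std::cout << "frame " << i << " took " << ms << " ms\n";
    }
}
```

The annoying part mentioned above is exactly the hook: getting that timing applied inside every game's present path is what the various limiter tools struggle with.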
Again... BF3 adjusts LOD depending on available VRAM; the idea is to max out your VRAM regardless of how much you have. If you have 3GB, BF3 will max it out. If you have 2GB, BF3 will max it out. It's not like your performance will degrade if you go from 3GB to 2GB, the LOD just lowers slightly and the performance will be the same. This was all covered by a DICE developer at last year's GeForce LAN...
This is the only game I'm aware of that even has such a mechanic. Anyway, I'm saying, give customers a choice. I don't see an issue with that - it won't happen anyway because AMD are clueless.
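For what it's worth, the mechanic being argued about boils down to something like this: the engine picks the richest detail tier that fits in whatever VRAM the card reports, so extra VRAM buys you a bit more LOD rather than more raw speed. The tier names and sizes below are invented for illustration, not DICE's actual code or numbers:

```cpp
#include <cstdint>
#include <iostream>

// Made-up detail tiers; the VRAM figures are illustrative only.
struct LodTier { const char* name; std::uint64_t vramNeededMiB; };

const LodTier kTiers[] = {
    {"ultra", 2600}, {"high", 1800}, {"medium", 1100}, {"low", 600},
};

// Pick the richest tier that still fits in the VRAM the card reports.
const LodTier& pickTier(std::uint64_t availableVramMiB) {
    for (const auto& tier : kTiers)
        if (tier.vramNeededMiB <= availableVramMiB) return tier;
    return kTiers[3]; // nothing fits, fall back to the lowest tier
}

int main() {
    for (std::uint64_t vram : {3072u, 2048u, 1536u, 1024u})
        std::cout << vram << " MiB of VRAM -> " << pickTier(vram).name << " detail\n";
}
```

That's consistent with the claim above: frame rates barely move between 2GB and 3GB because the engine trades the extra VRAM for detail instead of speed.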
I was getting really tired of microstutter. Last night I tried adaptive vsync for the first time, and so far the signs of MS have been reduced to the point that it no longer bothers me. Sure, it's only after one night of using it, but it looks promising.
I'm looking forward to trying out adaptive vsync, it sounds great on paper.
Again, it's only after one night but you won't be disappointed. Gameplay was a much smoother experience, just as they promised.
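For anyone wondering what adaptive vsync is actually doing under the hood: vsync stays on while the game can hold the refresh rate and gets switched off the moment it can't, so instead of the hard drop to half refresh you get with normal vsync you just get a bit of tearing. A toy illustration of that decision only; setVsync() is a made-up placeholder, not the real driver interface:

```cpp
#include <iostream>

// Made-up placeholder for whatever the driver actually toggles; not a real API.
void setVsync(bool on) { std::cout << (on ? "vsync ON" : "vsync OFF"); }

int main() {
    const double refreshBudgetMs = 1000.0 / 60.0; // 60 Hz panel assumed
    const double frameTimesMs[] = {14.0, 15.5, 19.0, 22.5, 16.0, 12.8};

    for (double ms : frameTimesMs) {
        // Keep vsync on only while the renderer holds the refresh budget;
        // switch it off as soon as a frame misses, so the game tears a little
        // instead of dropping straight to 30 fps.
        setVsync(ms <= refreshBudgetMs);
        std::cout << "  (frame time " << ms << " ms)\n";
    }
}
```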
Important information for the discerning customer willing to spend $2,000 on graphics cards but no more than $200 on a monitor. I'm going to go out on a limb and guess that people who are actually going to run three or four 7970s or 680s are going to be running multiple monitor setups, and, as of right now, the only ATI driver that allows that is 5 months old. That's a pretty major oversight, and it makes nVIDIA the de facto winner at the bleeding edge level (multi-GPU with multi-monitor).
AMD does seem to be pretty slow with driver fixes and support recently, and some problems have just been ignored completely. I guess it's due to supporting 3 architectures at once (VLIW-4, VLIW-5 and GCN), so it's good to hear they are dropping VLIW-5 support in the near future, but I hope this translates into strides with their GCN drivers.
Oh, I don't think they are dropping VLIW-5 any time soon. There are 7xxx series cards (mobility) that are still VLIW-5, and Llano. That's here for a looooong time to come.
It's the pre-DX11 cards that they have pushed in front of the bus.
Looks like AMD really needs to sell more GPUs than they do:
http://www.theinquirer.net/inquirer/news/2169235/amd-reports-usd580m-loss-graphics-sales-fall
So would I, but the problem I have is that it's more than 6 months since they released this card, and AMD typically stop making performance improvements to a card once its successor arrives on the scene, so we probably have until this autumn before the 7900 is 'retired' and we likely see no further improvements. I hope I'm wrong, but that appears to me to be the case (I've been with ATI/AMD since a 4870 replaced my 8800 GTX, but the green team's driver support is pushing me back towards their camp despite myself). AMD need to spend a little more on driver engineers for their GPUs so stuff works off the bat, not 6 months after launch.
