Here's a fun game: count the number of technical blunders and flat-out PR lies in this article.

Golgatha

Lifer
Jul 18, 2003
12,396
1,069
126
http://www.wired.com/news/technology/computers/0,70870-0.html?tw=rss.index

Favorite quote:

Even with today's systems, there are caveats. Don't expect dual graphics cards to offer much improvement in older titles such as Quake 3, Unreal Tournament or Half-Life 2, or in other games that rely more on CPU computation and less on graphics processing. The frame rates might even decrease when playing some older games using SLI or CrossFire dual-graphics cards compared to using a single card, especially if the monitor resolution is set to a low value.
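That CPU-bound claim is at least plausible. A toy sketch (my own illustrative model, not from the article; all frame times are made-up numbers) shows why a second GPU helps only when the GPU is the bottleneck:

```python
# Toy model of frame times (in milliseconds) to illustrate why a CPU-bound
# game sees little benefit from a second GPU. Numbers are invented for
# illustration; assumes ideal alternate-frame-rendering (AFR) scaling.

def frame_rate(cpu_ms: float, gpu_ms: float, num_gpus: int = 1) -> float:
    """FPS when CPU and GPU work overlap; the slower side limits the pipeline."""
    effective_gpu_ms = gpu_ms / num_gpus  # ideal scaling across cards
    frame_ms = max(cpu_ms, effective_gpu_ms)
    return 1000.0 / frame_ms

# Older, CPU-bound title at low resolution: CPU takes longer than the GPU.
print(frame_rate(cpu_ms=8.0, gpu_ms=4.0, num_gpus=1))  # 125.0 fps
print(frame_rate(cpu_ms=8.0, gpu_ms=4.0, num_gpus=2))  # still 125.0 fps

# GPU-bound title at high resolution: second card nearly doubles frame rate.
print(frame_rate(cpu_ms=4.0, gpu_ms=16.0, num_gpus=1))  # 62.5 fps
print(frame_rate(cpu_ms=4.0, gpu_ms=16.0, num_gpus=2))  # 125.0 fps
```

In this model the CPU-bound case gains nothing from the second card, which matches the article's (rare) correct point; real SLI/CrossFire overhead can even make it slightly worse.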
 
Corporate Thug

Apr 17, 2003
37,622
0
76
Originally posted by: fierydemise
Wow, I didn't know $1200 was a modest investment

I don't think it says that:

Such systems could cost $1,200 or more when they first became available from ATI and Nvidia last year. But now gamers who make a relatively modest investment can easily tap the improved resolution and faster frame rates that dual-card systems offer.
 

fierydemise

Platinum Member
Apr 16, 2005
2,056
2
81
Originally posted by: Corporate Thug
Originally posted by: fierydemise
Wow, I didn't know $1200 was a modest investment

I don't think it says that:

Such systems could cost $1,200 or more when they first became available from ATI and Nvidia last year. But now gamers who make a relatively modest investment can easily tap the improved resolution and faster frame rates that dual-card systems offer.
Ah, my mistake. Still, it's wrong: in general it's better to go with one faster card than two slower ones; you'll get better performance and fewer potential problems.
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
It's not meant for an "expert" (and I use the term both facetiously and sarcastically), because anyone in this forum attempting to talk down to David Kirk (or a scientist from ATI, for that matter) over an article he might have provided info for is a twit.

HL2 is much less advanced in its shading and shadows than Doom 3 and Oblivion. I believe that's what they meant by calling it an old game. As far as engines go, in a forced ranking, Source would probably sit closer to UT04 than to the Doom 3 engine.

$1,200 is a modest investment for a college-educated white-collar worker, though I'm not sure the article even said that, as Corporate Thug pointed out. In the scheme of life, $1,200 is little for hard-working, college-educated folk.
What the article actually says is that it's much cheaper now, opening up dual cards to the mass public.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: imported_Crusader

It's not meant for an "expert" (and I use the term both facetiously and sarcastically), because anyone in this forum attempting to talk down to David Kirk (or a scientist from ATI, for that matter) over an article he might have provided info for is a twit.

Regardless of who the article is meant for, even as a non-technical informative article it is bad. I don't really have an issue with the quotes from NV or ATI employees, since I don't expect them to say anything bad about their products. It is the content like this that bugs me:
Higher frame rates let gamers splatter monsters faster and allows them to set their games to the maximum resolution with less risk of system instability.

This is simply not true. If your system isn't stable, adding another video card to increase your frame rate isn't going to make it more stable or resolve anything other than performance-related issues. Furthermore, a dual-card setup wouldn't let me "splatter monsters" any faster than I do now. That sentence reads more like a string of buzzwords than anything resembling useful knowledge.
 

Golgatha

Lifer
Jul 18, 2003
12,396
1,069
126
2nd place for favorite quote...

"The concept of dual-card graphics is nothing new, but previous iterations simply failed to make the mainstream. A company called 3DFX, which Nvidia eventually bought, introduced dual-card graphics several years ago, but the technology suffered technical hurdles and never took off with motherboard makers."

Never took off with motherboard makers? This was back in the day of boards with 4-6 PCI slots, and you only needed two of them to make SLI work. The only technical hurdle I can think of for V2 SLI was at higher 2D resolutions: the pass-through cable sometimes made the picture a bit fuzzy when you were on the desktop at 1600x1200. Back then it was mainstream for the high end, just like SLI and CrossFire are mainstream for the high end today.