Originally posted by: Frackal
How come? I've read it tends to give good performance gains without much, if any, image quality degradation
Originally posted by: Frackal
Neither of what you say is valid in this case.
The same cable from monitor to video card (it's DVI) is being used in both cases
And I am familiar with digital vibrance / saturation, and it's not the case either.
How exactly is it that the ATI card gives a sharper picture in my bios (where I can see each pixel rather than a white semi-blurry letter) if it's just "digital vibrance?"
Read the post, man.
I don't know what the difference is, but I still think the ATI "part" responsible is simply higher quality than the nvidia one, which is unacceptable on nvidia's part due to the high cost of the cards.
Originally posted by: redbox
You're the only one I have ever heard of that has a problem with the signal quality. I swear it is all in your head.
Originally posted by: Frackal
Originally posted by: josh6079
Originally posted by: josh6079
Okay, here is a clip of me deer hunting in the woods, submerged in foliage near some temple ruins:
Click
Now, here is what happened when I found those deer:
Click
I flubbed up and accidentally took out my torch, but I got my bow out soon enough and started shooting. The frames were good enough for me to accurately hit the deer on the run while I was jumping over a rock, then land and hit another deer on the run.
4xAA+HDR, 16xHQAF, 1680x1050, max Oblivion settings except for self shadows and grass shadows, Vsync+Triple Buffering, 2048x2048 LOD mod, enhanced tree shadows, Jarrod's re-texture mod, Natural Environment mod (many other mods, but none else that would affect the frame performance, e.g. weapons and armor and such)
Did........um.......no one see.......those................clips..............
My system specs are in my sig, but just in case a certain crow thinks that I'm gaming with a $1000 CPU, I'm not. Opteron 148 Venus presently at stock.
Hey is that "Lord of the Rings" music? How'd you get that?
Originally posted by: josh6079
Originally posted by: Frackal
Hey is that "Lord of the Rings" music? How'd you get that?
I replaced the mp3's that Oblivion uses by default with the Lord of the Rings mp3's that I have. I have them set to have the Nine Riders song play when I get into fights and things, as well as the Bridge of Khazad-dûm or whatever you want to call it. Basically, I went through and matched the appropriate songs with the appropriate atmospheres. The Hobbits' Shire song is set to play in cities and sometimes when exploring and things too.
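(For anyone who wants to do the same swap, the basic idea is just dropping mp3s into Oblivion's stock music folders. A minimal sketch, assuming the default Data\Music layout with Battle, Dungeon, Explore, Public and Special subfolders; the install path and file names below are only examples.)

```python
# Minimal sketch: copy custom mp3s into Oblivion's music "mood" folders.
# Assumes the default Data\Music layout; back up the original tracks first.
import shutil
from pathlib import Path

OBLIVION_MUSIC = Path(r"C:\Games\Oblivion\Data\Music")   # example install path

replacements = {  # mood folder -> mp3s to play there (illustrative file names)
    "Battle":  [r"D:\LOTR\nine_riders.mp3", r"D:\LOTR\bridge_of_khazad_dum.mp3"],
    "Explore": [r"D:\LOTR\the_shire.mp3"],
    "Public":  [r"D:\LOTR\the_shire.mp3"],
}

for mood, tracks in replacements.items():
    dest = OBLIVION_MUSIC / mood
    dest.mkdir(parents=True, exist_ok=True)
    for track in tracks:
        shutil.copy(track, dest)   # the game simply shuffles whatever mp3s it finds here
        print(f"copied {track} -> {dest / Path(track).name}")
```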
Originally posted by: tuteja1986
This thread needs to die ;( it's a total flame war :!
Originally posted by: Frackal
Originally posted by: redbox
You're the only one I have ever heard of that has a problem with the signal quality. I swear it is all in your head.
It isn't just me. Others mentioned it before me. I think perhaps (no offense) it takes a higher quality monitor to highlight the difference. Go check Anand's review of the 2005FPW; it's a top quality monitor (why I bought it) and has a superb panel. From seeing it directly I can understand why it might not show up on all monitors.
On the first display, the MSI NX6800 Ultra-T2D256 produces the UXGA resolution at a frequency of only 141 MHz, i.e. with a reduced blanking interval. This can lead to problems with older monitors (see MSI FX-5700 Ultra-TD128). Other than that, the eye diagrams show a very homogeneous distribution of data.
Result: DVI compliant at 162MHz
The ATI Radeon X800 XT PE (Built by ATI) is also fully DVI compliant at 162MHz (UXGA). The eye diagrams show a very homogeneous distribution of the data. An exemplary result.
Result: DVI compliant at 162MHz
As the tests further on in this article will show, a general statement can't always be made about the quality of the DVI output on ATi and NVIDIA cards. Even if a separate TMDS chip known for its quality is used on a card - that does not automatically mean that every card that features this chip will offer a good DVI signal. Even the placement on the PCB of the graphics card can have a huge impact on the end result!
The DVI standard specifies a maximum pixel frequency of 165 MHz (Single Link). Thanks to the ten-fold multiplication of the frequency described above, this results in a peak data rate of 1.65 Gbit/s per data channel, which is enough for a resolution of 1600x1200 at 60Hz. If higher resolutions are required, the display would need to be connected via Dual Link DVI, which uses two DVI transmitters together, resulting in twice the bandwidth.
I thought we've been using 24" LCDs with just one DVI connection.
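(As a rough sanity check on those numbers: single-link DVI tops out at a 165 MHz pixel clock, and each of its three data channels runs at ten times that, i.e. about 1.65 Gbit/s per channel. The sketch below is back-of-the-envelope only; the ~20% blanking overhead is an assumption, and 24" 1920x1200 panels get away with a single link precisely because they use reduced blanking.)

```python
# Back-of-the-envelope check of whether a mode fits single-link DVI.
# Assumption: ~20% blanking overhead (CVT reduced blanking is closer to ~10%,
# which is how 1920x1200@60 squeezes under the limit in practice).

SINGLE_LINK_MAX_MHZ = 165.0   # single-link DVI pixel clock ceiling

def pixel_clock_mhz(width, height, refresh_hz=60, blanking_overhead=1.20):
    """Estimate the pixel clock a mode needs, in MHz."""
    return width * height * refresh_hz * blanking_overhead / 1e6

for w, h in [(1600, 1200), (1680, 1050), (1920, 1200)]:
    clk = pixel_clock_mhz(w, h)
    verdict = "fits single link" if clk <= SINGLE_LINK_MAX_MHZ else "needs dual link or reduced blanking"
    print(f"{w}x{h}@60: ~{clk:.0f} MHz -> {verdict}")
```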
Originally posted by: Frackal
Neither of what you say is valid in this case.
The same cable from monitor to video card (it's DVI) is being used in both cases
And I am familiar with digital vibrance / saturation, and it's not the case either.
Signal quality cannot be an issue in a pure digital connection. This isn't open to debate. Logically, you must be tying together unrelated problems and/or seeing things. Then again, you haven't said anything remotely useful in identifying the root cause of your "poor signal quality" problem.
Originally posted by: Frackal
How exactly is it that the ATI card gives a sharper picture in my bios (where I can see each pixel rather than a white semi-blurry letter) if it's just "digital vibrance?"
It is because you're introducing an entirely unrelated "problem." The fonts drawn when in the system BIOS are stored in the video card's BIOS, and they vary between ATI and nVidia cards. It's called a character display mode.
Originally posted by: Frackal
Read the post, man.
I did. Know what you're talking about before you set ultimatums for a company, man.
Originally posted by: Frackal
I don't know what the difference is, but I still think the ATI "part" responsible is simply higher quality than the nvidia one, which is unacceptable on nvidia's part due to the high cost of the cards.
I think it's unacceptable to hold a company responsible for an unspecified problem.
You might try reading the article. FYI,
Originally posted by: nts
Originally posted by: redbox
You're the only one I have ever heard of that has a problem with the signal quality. I swear it is all in your head.
ATi signal quality > NVIDIA signal quality
Here is an old article, http://www.tomshardware.com/2004/11/29/the_tft_connection/index.html
AFAIK nothing has changed and it is more noticeable now as the resolutions scale up.
Try a shorter DVI cable for NVIDIA cards...
Overall I don't think that signal quality is a feature to get all wrapped up in, but if that is your cup of tea then all the power to you.
Originally posted by: nullpointerus
Originally posted by: Frackal
Neither of what you say is valid in this case.
The same cable from monitor to video card (it's DVI) is being used in both cases
And I am familiar with digital vibrance / saturation, and it's not the case either.
Signal quality cannot be an issue in a pure digital connection. This isn't open to debate. Logically, you must be tying together unrelated problems and/or seeing things. Then again, you haven't said anything remotely useful in identifying the root cause of your "poor signal quality" problem.
Originally posted by: Frackal
How exactly is it that the ATI card gives a sharper picture in my bios (where I can see each pixel rather than a white semi-blurry letter) if it's just "digital vibrance?"
It is because you're introducing an entirely unrelated "problem." The fonts drawn when in the system BIOS are stored in the video card's BIOS, and they vary between ATI and nVidia cards. It's called a character display mode.
The system BIOS, text-mode OS, boot loader, or whatever plots a string of character codes (e.g. "AGP Multiplier:", "4x") along with a few bits to determine color, background color, and effects (e.g. blinking), leaving the actual font and effect rendering up to the video BIOS. (There are even special character codes for the lines you see.)
Now, if you prefer the ATI text-mode fonts, that's entirely up to you, but it has nothing to do with signal quality.
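(To make the character display mode point concrete, here is a small, purely illustrative sketch of what the firmware actually hands the card in text mode: just character codes and attribute bytes; the glyph shapes come from the card's own BIOS font. The 0xB8000 address and colour nibbles are the standard VGA conventions; the snippet only builds the bytes and doesn't touch hardware.)

```python
# Each cell in classic VGA text mode is two bytes: a character code and an
# attribute byte (low nibble = foreground colour, high nibble = background).
# The glyph bitmaps for those codes come from the video card's BIOS font,
# which is why two cards can draw the same BIOS screen differently.

def text_cell(ch, fg=0x07, bg=0x00):
    """Return the 2-byte cell a BIOS/boot loader would place in the text
    buffer (conventionally at physical address 0xB8000)."""
    attribute = ((bg & 0x0F) << 4) | (fg & 0x0F)
    return bytes([ord(ch), attribute])

# "A" in bright white (0x0F) on blue (0x01) -> bytes 41 1f
print(text_cell("A", fg=0x0F, bg=0x01).hex())
```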
Originally posted by: Frackal
Read the post, man.
I did. Know what you're talking about before you set ultimatums for a company, man.
Originally posted by: Frackal
I don't know what the difference is, but I still think the ATI "part" responsible is simply higher quality than the nvidia one, which is unacceptable on nvidia's part due to the high cost of the cards.
I think it's unacceptable to hold a company responsible for an unspecified problem.
One glaring omission in my previous post was the mention of output scaling. As you're undoubtedly aware, LCD monitors have a fixed native resolution. AFAIK, video cards still boot at 640x480 resolution, so the output has to be scaled somehow.
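(A toy illustration of that scaling point; the resolutions are just examples, and which behaviour you actually get depends on the card's and monitor's scaler settings.)

```python
# The 640x480 boot/BIOS screen has to be stretched to the panel's fixed
# native resolution one way or another; example panels only.

def fit_aspect(src_w, src_h, dst_w, dst_h):
    """Size after aspect-preserving ('centered'/'aspect') scaling; a plain
    'fill' mode would instead stretch straight to dst_w x dst_h."""
    scale = min(dst_w / src_w, dst_h / src_h)
    return round(src_w * scale), round(src_h * scale)

print(fit_aspect(640, 480, 1680, 1050))   # (1400, 1050): pillar-boxed on a widescreen panel
print(fit_aspect(640, 480, 1600, 1200))   # (1600, 1200): same 4:3 shape, no distortion
```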
Originally posted by: Frackal
This was the progression of my experience: I take out nvidia card, plug in ATI card. Turn on PC. First thing I notice is the letters in my bios appear sharper and brighter. Whereas with the nvidia card all I saw was a white letter, the ATI card showed each subtle pixel.
Fast forward to a game, Battlefield 2. I notice on medic packs, supply packs, characters, little tiny texturing details that were not present period on the nvidia card (even on Nvid. HQ mode).
HL2 EP1 had a different look as well, along the same lines.
It sounds like the game settings are different, either because the game detected a new card and picked defaults for you, or because the game has different paths and/or incomparable settings for ATI and nVidia cards. 3D IQ really isn't my thing, though. Wasn't there a thread specifically comparing the IQ of ATI and nVidia cards?
Originally posted by: Frackal
What I'm talking about is hard to explain, but easy to show; that's why it sounds vague.
Yet you should have been able to pick out one part of a scene, one blotchy-looking part of an XP window or image, or one particular letter that shows what you are talking about. Specifics, man. And if it's a scaling problem, I don't think it'll show up in a screenshot, but even then you should be able to hold your eyes close to the screen and explain the differences as best you can.
Originally posted by: Frackal
Reasons for my contempt:
(Watch your contempt if you actually want to continue this discussion.)
Originally posted by: Frackal
The best way I can characterize it would be like this:
If you wear contacts or glasses, look at a wall with your contacts/glasses out/off. You can see the wall, maybe some detail. Then you put your contacts or glasses on, and all of a sudden, on the same wall, you can see more detail; everything is sharper, less washed out. More clarity. That's the closest I can come to describing what I mean without showing someone side-by-side comparisons.
And you see this on the Windows desktop? With a digital connection? I doubt it.
Originally posted by: Frackal
I am not the only person who noticed this. I can't find the other member's post which mentioned the same thing; he phrased it roughly as 'a little like getting a new LCD' ... this could all be bull, but there is a difference here. Perhaps it's simply as straightforward as different letters in the bios, and then regular old better image quality on ATI's part in games.
FWIW, it's also possible that there were some nVidia cards with minor defects.
Originally posted by: Frackal
Well, here's my view:
HDR+AA is great, but I only use it in Oblivion at the moment.
Now, if you're going to be keeping your 7950GX2 for a long time, keep in mind that many more games may come out that will have HDR+AA, such as Crysis and UT2007. By that time G80 will be out and there will be wider support for it.
I am a performance nut too. It's almost pointless atm for me to go Conroe, but I am because I have a great deal of fun with computer components that are really fast. (Although I do think with this next gen or its refresh that 1600x1200 may become the new 1280x1024, i.e., where CPU limitation can begin to show up.)
So XTX vs. GX2... ?
Hmm ... Legit Reviews has a review of an overclocked GX2 versus a stock XTX; they don't test all the games I'd have liked, but the XTX holds its own quite well against an overclocked GX2.
If you go to Anand's review of the GX2, they compare it to a stock XT. I add 15% to the XT's scores because I have my memory at 20% over the XT's stock and my core at 10%, and I have found that splitting the difference between memory and core OCs tends to be close to the actual gain (i.e., 10% core + 20% memory = about a 15% boost in performance).
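(Spelled out, that adjustment is just this rule of thumb; the fps figure below is made up, and averaging core and memory overclocks is only a rough heuristic, not a measured scaling law.)

```python
# Rule of thumb only: assume the performance gain is roughly the average of
# the core and memory overclock percentages. Illustrative numbers.

def estimated_score(stock_score, core_oc_pct, mem_oc_pct):
    """Scale a stock benchmark score by the mean of core/memory OC."""
    gain = (core_oc_pct + mem_oc_pct) / 2.0 / 100.0
    return stock_score * (1.0 + gain)

# e.g. a stock X1900 XT result of 60 fps, with +10% core and +20% memory:
print(round(estimated_score(60, 10, 20), 1))   # 69.0 fps, i.e. roughly a 15% boost
```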
Comparing those scores, you'll find the XTX performing very close to the GX2 in most situations. I think you could also figure that since the XT is benched at higher stock image settings and the GX2 is benched at lower image quality settings, the GX2 MAY not perform as well apples-to-apples as it seems to in the benchmarks.
I looked at it like this to summarize:
Getting XTX over Gx2
Pros: (remember this is in my judgement)
- Possibly very close in performance in many situations (see above)
- HDR+AA in Oblivion and future games (though I doubt I'll keep XTX past G80 or R600)
- Better picture quality per that "scaler" discussion (may not even be worth worrying about, not resolved yet)
- HQ-AF which is pretty nice but not a dealmaker/breaker IMO
- Will put less heat into case
Cons: (reason to go Gx2)
- GX2 ought to be faster overall. If you play CoD2 for instance, the Gx2 seems much faster than anything else. I tend to think it may not be faster in Oblivion because an XTX beats even 7800GTX 512 in SLI, and matches 7900GTX SLI in minimum frames, and probably approaches it in max frames once OC'ed a bit. You can use HDR+AA in Oblivion too with ATI. I think Gx2 should be a bit faster in BF2, although again, most benchies DON'T use equal quality settings, and when they do, Nvidia cards lose a fair chunk of performance (10%-20% oftentimes)
- If you want to overclock XTX, and turn the fan up to do it, it can be loud. That's really only if you want to push it to the limit though. Otherwise, it's not that loud.
It's really tough to say. The above was basically my decision-making process. I decided that the GX2 was not necessarily all that much faster than the XTX in the games I play. When, at equal image quality settings, a 700MHz/1800MHz 7900GTX gets beaten pretty badly (all things considered) by an XTX at stock, that says something.
Also, the XTX would (should) allow me to use HDR+AA in Crysis, Oblivion and UT2007, was cheaper, and had better image quality across the board. I game at 1680x1050, so an XTX is often overkill; a GX2 would be even more so. I mean, I don't REALLY need 120+ fps in BF2, 92-95 fps in DOD-S, etc. with everything cranked up, but it's fun as hell to see it, which is why I think a GX2 would be a lot of fun too.
Remember, this post will seem more pro-X1900 because I'm giving you the reasoning I used that led me to pick the X1900 over the GX2. I had the opportunity to borrow an XTX at the time the GX2 came out, so I had a unique chance to try the XTX and then decide whether to buy one of my own or go GX2. I wish I could have tried the GX2. At this point I definitely wouldn't change my decision because the XTX gives a better visual experience, and will be plenty until G80 or R600 comes along, especially when I put it on water.
If you lived in CO, maybe we could trade for a weekend and you could try the XTX and I could try the GX2, because I am curious about it.