What happens to nvidia?


Kenmitch

Diamond Member
Oct 10, 1999
Please excuse my ignorance, but why would I want to spend 600+ bucks on the high-end card when I can simply SLI two GTX 460s for better performance at 400 bucks?

The single fastest card will always cost more....Everybody knows that :)

The single fastest card will never outperform an SLI or CrossFire setup of the fastest single card(s)....Most people know that

GTX 460s in SLI are nice and a good bang for the buck....Nobody argues that yet, and most likely nobody will if the prices keep dropping.
 

JAG87

Diamond Member
Jan 3, 2006
Of course the GF100 parts released for the HPC market are all cool parts that don't consume insane amounts of power... and HPC really loves power-hungry parts...

And of course the GeForce FX 5800 never existed... and all the recent GPUs from NVIDIA are best known for being tiny, cool, and low-power.



Except those 512-SP parts are nowhere to be seen.



There is little doubt that the GF100 isn't a gaming design.

The GF104 is a much more balanced part from a gaming point of view. Although it must have been someone's hobby project, because clearly NVIDIA wouldn't spend resources on the gaming market.

But soon we will see whether AMD retakes the single-GPU performance crown with a smaller power budget and a smaller GPU.


Why are you bringing up the fact that nvidia has made bad design choices, with the 5800 just like with Fermi?

I'm not flaunting the architecture as being better than ATI's; I'm telling you what it was designed for, because that's where nvidia knows the money is.

You are derailing the topic.



Doubtful. Fermi with the remaining 32 stream processors enabled will not make a huge difference in performance. Everything else in the uncore is enabled (I think one TMU was disabled), so simply adding 32 stream processors and one TMU will not make up for the fact that the HD 5970 is considerably faster. Look at these benchmarks.

http://en.expreview.com/2010/08/07/more-benchmarking-results-of-512sp-gtx-480-exposed/9018.html

"Under the Extreme mode with 3DMark Vantage, 512SP GTX 480 got a GPU score of 10072 while the 480SP version scored at 9521, showing a slight performance difference.

According to the Feature Test 1, 512SP delivers better texture fill performance of 41.55 GTEXELS/S, compared to 38.82 GTEXELS/S on the 480SP-equipped model. Under the enthusiast mode with Crysis Warhead (1920×1200, 8xAA on), the 512SP GTX 480 runs at 34.72fps, while the 480SP model at 32.96fps."
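
(For scale, those quoted figures work out to single-digit gains. A quick check of the arithmetic, using only the numbers cited above; plain C++ host code, illustrative only:)

Code:
// Sanity check of the quoted expreview numbers (arithmetic only,
// using just the figures cited in the quote above).
#include <cstdio>

// Percent gain of the 512SP GTX 480 over the 480SP model.
static double gain(double sp512, double sp480) {
    return 100.0 * (sp512 - sp480) / sp480;
}

int main() {
    printf("Vantage GPU score: +%.1f%%\n", gain(10072.0, 9521.0)); // ~ +5.8%
    printf("Texture fill:      +%.1f%%\n", gain(41.55, 38.82));    // ~ +7.0%
    printf("Crysis Warhead:    +%.1f%%\n", gain(34.72, 32.96));    // ~ +5.3%
    return 0;
}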


You are right, but the difference in performance was hardly the point. The point is it can't be done. The chip is capable of running with 512 SPs at 800 MHz, but it can't be done. So nvidia cut it down until it was faster than the 5870, and they still had to make two huge compromises: 1. Heat, 2. Noise.

It clearly wasn't designed with GeForce in mind, and neither will nvidia's future products be, so get used to it. AMD is still designing "Radeon" GPUs; that's why they are successful at it. nvidia doesn't design for GeForce anymore; they stopped doing that when they started developing G80. They had other targets in mind.

This is why I am trying to explain to the moron flame-baiter OP, who hasn't shown up yet by the way, that AMD's HD6000 series isn't going to do anything to nvidia.

Personal attacks are not permitted in here. -Admin DrPizza
 
Last edited by a moderator:

evolucion8

Platinum Member
Jun 17, 2005
First, it is IZ3D. It isn't software, but a hardware setup consisting of its own monitor and shutter glasses. It is a 22" LCD @ 1680x1050 and doesn't support multi-display.

Second, PhysX does have its value, and people who have bought a new ATI card will try to hack its drivers for it. People wouldn't be doing this if it were just a gimmick.

Everything else is covered by the post you <snipped>.

I snipped you to make the thread less cluttered when quoting; nothing personal.

Second, PhysX is a gimmick because it has been badly implemented. I like the idea of GPU-accelerated physics making games more immersive, but there are fewer than 10 GPU-accelerated PhysX games, and in most of them the effects are unrealistic and make no difference to playability. Batman: AA was indeed a great game with outstanding PhysX effects, but besides it, what about Mafia II? Unrealistic rubber-like debris. Cryostasis? Poltergeist-like glitches across the whole game. What about Mirror's Edge? Shattered glass and minor debris effects. And Dark Void? Horribly unrealistic debris and one of the worst uses of PhysX. What about Metro 2033? More debris and some nice-looking fog that doesn't interact much with the player. So far PhysX has been a failure. I like nVidia's initiative, but it is currently mediocre and only valuable to the people who value it. I enjoyed playing Mirror's Edge and Batman: AA with my AGEIA card, but it is not a must-have feature for now. PhysX should be called DebrisX. :(
 

evolucion8

Platinum Member
Jun 17, 2005
This is why I am trying to explain to the moron flame-baiter OP, who hasn't shown up yet by the way, that AMD's HD6000 series isn't going to do anything to nvidia.

Well, it will impact nVidia's bottom line for sure in the graphics card market overall. AMD releasing a new lineup of cards while nVidia hasn't yet finished its Fermi-based lineup will hurt nVidia and its market share. But in the professional and HPC markets we will have to wait and see; I doubt we will see significant strides on AMD's part. nVidia will hold that market tight.
 

Janooo

Golden Member
Aug 22, 2005
Why are you bringing up the fact that nvidia has made bad design choices, with the 5800 just like with Fermi?

I'm not flaunting the architecture as being better than ATI's; I'm telling you what it was designed for, because that's where nvidia knows the money is.

You are derailing the topic.






You are right, but the difference in performance was hardly the point. The point is it can't be done. The chip is capable of running with 512 SPs at 800 MHz, but it can't be done. So nvidia cut it down until it was faster than the 5870, and they still had to make two huge compromises: 1. Heat, 2. Noise.

It clearly wasn't designed with GeForce in mind, and neither will nvidia's future products be, so get used to it. AMD is still designing "Radeon" GPUs; that's why they are successful at it. nvidia doesn't design for GeForce anymore; they stopped doing that when they started developing G80. They had other targets in mind.

This is why I am trying to explain to the moron flame-baiter OP, who hasn't shown up yet by the way, that AMD's HD6000 series isn't going to do anything to nvidia.
This is only your wish.
 

Kenmitch

Diamond Member
Oct 10, 1999
<snip,snip>

This is why I am trying to explain to the moron flame-baiter OP, who hasn't shown up yet by the way, that AMD's HD6000 series isn't going to do anything to nvidia.

I don't think that comment was called for, as many others would agree!

As far as flame baiting goes....The followers of the nvidia cult are way better at it.

I don't think AMD will be able to take nvidia out of the picture anytime soon. But combine that with a couple more bad moves on nvidia's part, and who knows what could happen. :)
 

Madcatatlas

Golden Member
Feb 22, 2010
Why are you bringing up the fact that nvidia has made bad design choices, with the 5800 just like with Fermi?

I'm not flaunting the architecture as being better than ATI's; I'm telling you what it was designed for, because that's where nvidia knows the money is.

You are derailing the topic.






You are right, but the difference in performance was hardly the point. The point is it can't be done. The chip is capable of running with 512 SPs at 800 MHz, but it can't be done. So nvidia cut it down until it was faster than the 5870, and they still had to make two huge compromises: 1. Heat, 2. Noise.

It clearly wasn't designed with GeForce in mind, and neither will nvidia's future products be, so get used to it. AMD is still designing "Radeon" GPUs; that's why they are successful at it. nvidia doesn't design for GeForce anymore; they stopped doing that when they started developing G80. They had other targets in mind.

This is why I am trying to explain to the moron flame-baiter OP, who hasn't shown up yet by the way, that AMD's HD6000 series isn't going to do anything to nvidia.


moron flame baiter OP... lol


You really don't think AMD being almost a full cycle ahead of Nvidia in putting out new graphics cards will do anything to Nvidia's situation? Have you been awake since September/November 2009?

Pardon me for saying so, but perhaps you need a new processing unit. A logical one at that... har har

Why the favoritism, JAG? What does it give or take away from you, whichever GPU producer leads the race?
As long as both GPU producers are IN the race and there is competition, it's all good for us, isn't it?


And also... what exactly is doable and what isn't? Can you blame people for calling you out on this stuff when you manage to contradict yourself to this degree?:

It's clear that Fermi can run even at 800 MHz with all 512 SPs enabled, and that would probably make even the 5970 beg for mercy, but it wasn't feasible in terms of power and heat. So nvidia settled for the fastest single GPU and called it a day.


The point is it can't be done. The chip is capable of running with 512 SPs at 800 MHz, but it can't be done.


Peace
 
Last edited:

Seero

Golden Member
Nov 4, 2009
<Snip><Snip><Snip>
I snip you because I don't like you, and it is very personal. :twisted:

Have you played DAO with PhysX enabled? Games pretty much require a very strong video card to play at a decent FPS, yet most of the time the GPU is not doing much while gaming. Again, PhysX is a tool; whether or not it makes a difference depends on the programmers who use it. Keep in mind that not every user who plays the game can experience the work that is done with PhysX. I am not saying it is the difference between night and day, but there are differences.

Back to the snipping business.

<Snip> your next post

<Snip> your next next post

damn, where did my "S" key go?
 

SolMiester

Diamond Member
Dec 19, 2004
How can you ask for no trolling when you're instigating it yourself? And how can you be so naive? You are either ignorant or in denial if you think AMD could even put a dent in nvidia's profits with Radeon cards.

Nvidia has profit margins on Quadro and Tesla cards that AMD can only dream of, in a market that AMD can't even touch; nvidia has no competition there at the moment except Intel.

You have to look at the big picture.
http://www.nvidia.com/object/quadro-fermi-home.html
Up to $5000 apiece.
http://www.nvidia.com/object/tesla_computing_solutions.html
I don't even know how much they cost, but you certainly can't buy one.

Plenty of huge corporations purchase these. All you're focusing on is what you are interested in: the small gaming market, which isn't even 30% of their profits. And even there they are doing just fine; as you can clearly see, Fermi cards sell well.

If anything, you should be thankful that AMD is early to the game with a good, competitive end-user gaming product, because if they had kept releasing products like the R600 they probably wouldn't exist by now. The two companies compete, but they aren't the same.

+1
 

SlowSpyder

Lifer
Jan 12, 2005
You are right, but the difference in performance was hardly the point. The point is it can't be done. The chip is capable of running with 512 SPs at 800 MHz, but it can't be done. So nvidia cut it down until it was faster than the 5870, and they still had to make two huge compromises: 1. Heat, 2. Noise.

It clearly wasn't designed with GeForce in mind, and neither will nvidia's future products be, so get used to it. AMD is still designing "Radeon" GPUs; that's why they are successful at it. nvidia doesn't design for GeForce anymore; they stopped doing that when they started developing G80. They had other targets in mind.

This is why I am trying to explain to the moron flame-baiter OP, who hasn't shown up yet by the way, that AMD's HD6000 series isn't going to do anything to nvidia.

So do we agree yet that AMD's Radeon can in fact put a dent in Nvidia's bottom line? The 6000 series will likely do the same, but not for some time. Once they get the entire lineup out, if Nvidia has nothing until late 2011, then the 6 series can indeed put a dent in Nvidia's bottom line.

And correct me if I'm wrong, but wasn't the GF104 designed to be a GeForce? I thought it was pretty average in the CUDA/HPC world. I could be wrong on that, but I thought it was more of a gaming GPU than the GF100, which was more HPC-oriented.
 

busydude

Diamond Member
Feb 5, 2010
I am not saying it is the difference between night and day, but there are differences.

No one here is denying that, but a 40-50% performance hit for effects which do not change the way we play the game(s)? No sir.

[chart: Mafia2_DX11_Benchmark.jpg]
 
Last edited:

Madcatatlas

Golden Member
Feb 22, 2010
That's what I've read as well. More of a gaming GPU. And look at its success on these forums... and you know what?
That success will help Nvidia penetrate even deeper into the market with their GPGPU stuff, just by the power of being in people's rigs and not being totally crippled for GPGPU purposes like some other cards are.

Mindshare, and Nvidia is good at it.
If JAG is correct that Nvidia is moving away from gaming cards on the whole, Nvidia will lose a lot of mindshare/publicity and will struggle in the future.

Let's hope he is not correct.
 

Kenmitch

Diamond Member
Oct 10, 1999
And correct me if I'm wrong, but wasn't the GF104 designed to be a GeForce? I thought it was pretty average in the CUDA/HPC world. I could be wrong on that, but I thought it was more of a gaming GPU than the GF100, which was more HPC-oriented.

Hmm....What does the GF in GF104 stand for anyways....Good Fart? :)
 

evolucion8

Platinum Member
Jun 17, 2005
I snip you because I don't like you, and it is very personal. :twisted: Back to the snipping business.

<Snip> your next post

<Snip> your next next post

damn, where did my "S" key go?

Well, if it's personal then you have issues. In reality I doubt you'd stand a chance against me, a 195+ pound guy who goes to the gym :biggrin:

No one here is denying that, but a 40-50% performance hit for effects which do not change the way we play the game(s)? No sir.

[chart: Mafia2_DX11_Benchmark.jpg]

1+

Not only am I bigger than you, but I'm a mod too. So I'm instituting a smackdown for physically threatening a poster and continued personal attacks. Come join us when your hospital vacation is over in a couple of weeks.

-ViRGE
 
Last edited by a moderator:

SolMiester

Diamond Member
Dec 19, 2004
It is funny how you want to keep this civil when you began this thread as a troll thread. I will try to break down your arguments one by one, in a civil way.

First, it is true that the Cypress series from AMD is a success, but there is no guarantee for the 6xxx series. Look at Nvidia: with the success of the G92 core, they rolled out the entire 2xx series successfully. It was at least as good as ATI's 4xxx series at the time, and they could have simply recycled the design just like ATI did with their 4xxx series, but they didn't. The Fermi series is more or less a complete redesign in terms of architecture. The first product isn't perfect, but that is to be expected. The GTX 460 clearly indicates that the Fermi architecture works at multiple ends, contrary to the people who said "it isn't for gaming."

So now Nvidia has the first generation of their new architecture out, plus all sorts of CUDA, 3D, and PhysX to back it up. What does ATI have? Are they going to recycle the 4xxx design again and use 25 nm this time? Sooner or later ATI needs to make a new design, and it will take a hit just like Fermi did. You can't expect something new to just work without problems. Here is the catch, though: if they take a hit like the one Nvidia took, they may not be able to come back the way Nvidia has.

How did Nvidia come back? Well, with that proprietary stuff. Those who have Nvidia 3D Vision probably won't even try an ATI card unless it offers something like 150% of the performance at 50% of the price. Those utilizing the power of CUDA will not switch until ATI comes up with something that suits their needs. Yes, some can hack their way through and use their old Nvidia card as a PhysX card, which many have done, but Nvidia's new drivers won't support such a setup, so it is a matter of time before they sell their ATI card for an Nvidia card. Why do I say that? Well, they could have sold their old Nvidia card in the first place and ditched PhysX, but they didn't. An outdated Nvidia card can serve as a PhysX card. An outdated ATI card can do what?

The one thing that led to the success of the Cypress series wasn't what ATI did right, but the number of screw-ups Nvidia made. Nvidia ceased production of the GTX 285 way too early. The reason may have been their confidence in the Fermi architecture; they were ready to mass-produce it as soon as it was done. On one hand, my beloved Charlie was busy generating FUD about how Fermi was unmanufacturable, too big, and too hot, while Nvidia was busy making their big hit. All was supposed to go well until TSMC delivered the message to both Nvidia and ATI about the manufacturing problems with their 40 nm yields. While ATI could modify their design with ease, since it really wasn't new, Nvidia was more or less screwed. They couldn't go all out with Fermi, and a redesign would have taken too long since the factory line had stopped. They had no choice but to go forward with what they had, just not at full force. Rumors said that Fermi would be in very limited supply due to yields; that was busted. Charlie's FUD about it being unmanufacturable was busted. Rumors about no tessellation were also busted. It was, however, power-hungry, as it isn't 100% efficient and generates too much heat.

Usually this wouldn't have been as serious as it was, but it happened right when Windows 7 and DirectX 11 first arrived. This was a golden time when consumers wanted to buy new hardware, and Nvidia had nothing to sell between September 2009 and March 2010. Meanwhile, ATI's Cypress was the only option for a new video card, which eventually led to such unexpected sales that ATI had to raise prices to smooth out demand. All the OEMs were asking for goods, and Nvidia really had nothing: the new wasn't ready and the old had ceased. Just when things were bad enough on this end, their other product, Tegra 2, was discovered to have a problem all the way down at the design level. Many new products (tablets) never made it to market in time, which benefited the iPad, and therefore the iPhone.

As if things were not bad enough, some smart programmers actually caused some aftermarket heatsink fans not to spin with a new Nvidia driver. They knew SC2 was a demanding game that would cause video cards to get very hot. The intention was to adjust the fan speed (spin faster), but somehow some aftermarket heatsink fans decided not to follow instructions/specifications. Now we know that SC2 kills video cards, but at the time people believed the driver was the sole cause of the dying cards. GG!

As you can see, Nvidia's wounds were not caused by ATI, but by TSMC and themselves. Of course, public attention focused only on the 480, and people related every failure to the 480, as if the 480 were the only thing Nvidia ever made. Many believed it was too big and too hot. The truth is, the 480 is not bad and heat isn't a problem. I was expecting to find lots of threads about dead GTX 480s in the SC2 forum, or at least a few. To my surprise, no dead 480s! Many G92-based cards were fried, some ATI 4xxx and Cypress cards too, but no 480s! Too hot? I don't think so. The idea that it runs hot, however, stayed.

So despite the six months of glory ATI had, people are slowly selling their ATI cards for Nvidia cards, even if it is a side-grade or downgrade. Why? Well, there are things you can only do with an Nvidia card. The FUD about the 480/470 keeps people away from them, which lets the new 460 sell. Interestingly, the 460 is indeed a better make, with lower cost and higher yields. That makes the 460 the only card they should make, so they are producing it in volume and are therefore able to reduce the BoM cost. Eventually, a 485/475 will replace the 480/470, and a dual-GPU card is not far away.

It is true that most Cypress users aren't going to drop their cards now, because there is no reason to, but that is going to change really soon. Lots of 3D movies are coming out, and it won't be long until 3D home theater kicks in. Lots of laptop manufacturers see this and are now using Nvidia chipsets for Nvidia 3D. Where is AMD's version? I hope the 6xxx series will retrofit 3D, or else the market will swing back to Nvidia as soon as the Avatar 3D box set arrives. IMO, ditching the ATI name is not a smart thing for AMD to do at this time.

Programmers are not gods. Think of them as smiths who craft software, and of video cards and processing units as wood and nails. Without the proper tools, the quality of the wood and nails is meaningless. The thing about CUDA is not the quality, but the tools that drive it. Although the 2xx architecture supports CUDA, the new Fermi architecture allows better utilization for CUDA computing; think of this as making nails that go into the wood more easily. This is an improvement in quality where ATI has not even begun. CUDA is based on C++, which is something every programmer knows, and Nvidia has provided a CUDA development suite that programmers can forge software with, while using "open whatever" and/or ATI's Stream is like trying to make a table with bare hands. Someday someone will invent the tools to utilize them, but none have been invented yet. Even if AMD starts now, they are still years behind.
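
(As a rough illustration of that "familiar tools" point — a minimal CUDA sketch, not from the original post; the array names and sizes are arbitrary. The kernel is ordinary C++ plus a few extra keywords:)

Code:
#include <cstdio>
#include <cuda_runtime.h>

// Each GPU thread adds one pair of elements.
__global__ void addArrays(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1024;
    float ha[1024], hb[1024], hc[1024];
    for (int i = 0; i < n; ++i) { ha[i] = float(i); hb[i] = 2.0f * i; }

    // Allocate device buffers and copy the inputs over.
    float *da, *db, *dc;
    cudaMalloc(&da, n * sizeof(float));
    cudaMalloc(&db, n * sizeof(float));
    cudaMalloc(&dc, n * sizeof(float));
    cudaMemcpy(da, ha, n * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, n * sizeof(float), cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements.
    addArrays<<<(n + 255) / 256, 256>>>(da, db, dc, n);
    cudaMemcpy(hc, dc, n * sizeof(float), cudaMemcpyDeviceToHost);

    printf("hc[10] = %.1f\n", hc[10]); // expect 30.0
    cudaFree(da); cudaFree(db); cudaFree(dc);
    return 0;
}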

Video games are important; more important than food, some may believe. They also believe that the video card that pushes the most FPS is going to have the last laugh, and I agree. The question is, how do they get there? AMD creates quality wood and nails and hopes someone will figure out how to utilize them. Nvidia creates quality wood, nails, and the tools to use them. Who will have the last laugh?

Forget about theorycraft; look at what we have now. Nvidia users were the first to experience GPU-accelerated Adobe Flash. Video editing is also accelerated only for Nvidia users. When will ATI users get a taste of what the GPU can do for them other than playing games?

Will the 6xxx series change all this? Will Fusion change all this? We will see.

Wicked post!
 

Madcatatlas

Golden Member
Feb 22, 2010
This is absolute rubbish. Both Nvidia and ATI cards have been used by people for things other than gaming since the companies started producing graphics cards.

Your whole post is just one long "praise Nvidia" psalm. I'm disgusted, but to each his own.
 

SolMiester

Diamond Member
Dec 19, 2004
No one here is denying that, but a 40-50% performance hit for effects which do not change the way we play the game(s)? No sir.

[chart: Mafia2_DX11_Benchmark.jpg]


LOL, that's actually an 18% hit for the 460 from AA to Med, not 40% or 50%.. The 5850 takes a 55% hit for the same, however!
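
(For anyone re-checking numbers like these: the "hit" is just (fps_without_effect - fps_with_effect) / fps_without_effect. A small sketch; the FPS values below are hypothetical placeholders, since the chart's actual readings aren't reproduced in the text:)

Code:
#include <cstdio>

// Performance hit, in percent, when turning an effect on.
static double hitPercent(double fpsOff, double fpsOn) {
    return 100.0 * (fpsOff - fpsOn) / fpsOff;
}

int main() {
    // Hypothetical FPS readings only, NOT the chart's real numbers:
    // 60 -> 49 fps is an ~18% hit; 60 -> 27 fps is a 55% hit.
    printf("60 -> 49 fps: %.0f%% hit\n", hitPercent(60.0, 49.0));
    printf("60 -> 27 fps: %.0f%% hit\n", hitPercent(60.0, 27.0));
    return 0;
}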
 

bryanW1995

Lifer
May 22, 2007
It's really only the mid-to-high-end consumer desktop graphics cards where nvidia has any chance of losing out badly to amd, and while they matter, they don't matter that much.

This winter's big profit area for nvidia, I expect, will be laptops. They now have a complete 4-series line-up there, plus Optimus, which basically means that if it's an intel laptop with a discrete gpu (which these days is a lot of them), it'll have nvidia inside. That's a lot of laptops.

intel sells a LOT more laptops with the gma4500 than with discrete graphics, however, and that trend will get even worse when SB gets here. maybe this fall nvidia is ok, but what about 1 year, 2 years, 3 years down the road?

nvidia's roadmap is ok for now, but they'll be in trouble when/if amd and/or intel decide to really target the hpc market. of course, if it weren't for jhh's determination to remain ceo, intel probably would have already bought nvidia and we wouldn't even be having this discussion.
 

evolucion8

Platinum Member
Jun 17, 2005
LMAO...blow your horn much?

That's the best you could come up with? PhysX is running on the CPU on the AMD graphics cards, sheesh :biggrin:

Only in your dreams can the GTX 460 1GB outperform the HD 5870.
 
Last edited:

SolMiester

Diamond Member
Dec 19, 2004
That's the best you could come up with? PhysX is running on the CPU on the AMD graphics cards, sheesh :biggrin:

Only in your dreams can the GTX 460 1GB outperform the HD 5870.

Who cares what it's running on? The 460 is not taking a 50% hit; AMD is!
 

evolucion8

Platinum Member
Jun 17, 2005
Who cares what it's running on? The 460 is not taking a 50% hit; AMD is!

Whatever, troll. PhysX on AMD hardware runs on the CPU, while nVidia uses the GPU for acceleration; talk about an apples-to-oranges comparison. But look at these results with PhysX off and enjoy the show.

http://www.techspot.com/review/312-mafia2-performance/page4.html

[charts: PhysX_Off_AA_On_2560.png, PhysX_Off_AA_On_1920.png, PhysX_Off_AA_Off_2560.png, PhysX_Off_AA_Off_1920.png]


The HD 5870 is a match for the far more expensive and power-hungry GTX 480, and the HD 5970 obliterates the GTX 480; heck, even the HD 5850 is close to it. Hard to swallow, right? An old-ass GTX 285 outperforming the GTX 460 1GB :awe: Need a beer?
 
Last edited: