The fall and rise of Fermi and nVidia.

Cattykit

Senior member
Nov 3, 2009
On one hand, Fermi is treated as a trash GPU that has no appeal (on hardware review sites where the focus is set on gaming performance). On the other hand, Fermi is seen as the most amazing revolution (for video editing guys).

Though Adobe CS5 is not officially out yet, I saw a brief benchmark done on the beta version. With its native CUDA support, encoding an H.264 clip that took 45 minutes without CUDA took only 4 minutes. Unlike before, this was done without visual penalties and limitations. Needless to say, it is a dramatic difference. Even better, the test was done on a GTX 285. Considering that Fermi is designed to perform much better than the GTX 285 in terms of CUDA performance, I can only wonder what Fermi can do.

Keep in mind, unlike gamers, video guys haven't seen this kind of revolution. Modern 4-6 core i7 CPUs that are blazingly fast in most situations are still painfully slow for video editing. I know a guy who shot 5,400 minutes of documentary footage on a 5D Mk II (H.264, 40 Mbps video), and it literally took him a month just to transcode it into an editable codec. (He used two Mac Pros along with two of these: http://www.bhphotovideo.com/c/produc...320_Array.html.)
With Fermi and CS5, that need for transcoding will be gone, and real-time editing of high-bitrate H.264 video will finally become reality. Encoding the final work will take far less time.
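As a rough back-of-envelope (my own arithmetic; the only inputs are the 45-minute vs. 4-minute figures above, plus the assumption that the speedup scales linearly to a big batch job, which is far from guaranteed), here is what that would mean for a month-long transcode like the documentary example:

```python
# Back-of-envelope only: scale the reported CS5 beta result
# (45 min without CUDA -> 4 min with CUDA on a GTX 285) to a
# month-long transcoding job. Linear scaling is an assumption.
cpu_minutes = 45.0    # reported encode time without CUDA
cuda_minutes = 4.0    # reported encode time with CUDA
speedup = cpu_minutes / cuda_minutes  # ~11x

month_hours = 30 * 24                 # "a month" of round-the-clock transcoding
cuda_hours = month_hours / speedup    # same job on the CUDA path

print(f"speedup: {speedup:.2f}x")
print(f"month-long job: ~{cuda_hours:.0f} hours with CUDA")
```

Even if the real-world scaling is half that, the job drops from weeks to days, which is exactly why the video crowd is excited.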

Given that video guys haven't seen this sort of revolution, and that they are willing to drop big bucks, I do not doubt Fermi will be very successful in that market.
The notion and usage of the GPU is changing, yet we are only talking about games and the fps it can do. That, I think, is something we need to reconsider when talking about Fermi and nVidia.

--------------------------------------------------------------------------------------------------------------------------------------
What I wrote in the Digital and Video Cameras section:

Too bad Fermi is getting trashed among hardware sites and gamers. Since its appeal in the gaming market is terrible, I think nVidia will try to cash in on the video market. After all, nVidia is the pioneer in the video market, and it's a blue ocean where competition does not exist.

So far, only the cards mentioned below are currently supported:

* GeForce GTX 285 (Windows and Mac OS)
* Quadro FX 3800 (Windows)
* Quadro FX 4800 (Windows and Mac OS)
* Quadro FX 5800 (Windows)
* Quadro CX

However, Adobe mentioned that they are "planning to support additional cards in the future, including some of the new NVIDIA solutions based on the upcoming Fermi parallel computing architecture."
Keep in mind there's a limitation set on the GTX 285: only 3 tracks on the timeline are CUDA supported. Fermi will, I bet, be supported fully. Given the limitation set on the GTX 285, the future of other cheap GPUs being supported looks somewhat gloomy. If they want to cash in with Fermi, they will keep limiting other cards, I foresee.

Nonetheless, this is a revolution and a great step toward a great future. I have high hopes!
 

Insomniator

Diamond Member
Oct 23, 2002
Since when is Fermi trash with no appeal? Maybe to some ATI fanboys...

It's hot and power hungry but provides great performance. It is certainly no 5800 Ultra.

I would also love to see how it handles editing.
 

Madcatatlas

Golden Member
Feb 22, 2010
And since when was Fermi labeled "trash with no appeal"? Some have labeled it trash, but certainly trash WITH appeal.
Most have said: best single-GPU performance, but with flaws. Hell, 480 SLI is even given medals of honor for its performance. Where was the trash thing again?
 

Genx87

Lifer
Apr 8, 2002
Looking at the discrete market numbers from the AMD conference call thread, there is plenty of reason for Nvidia to open up their gaming GPU to other revenue streams.

I am really interested in hearing about Tesla implementations that replace entire data centers worth of CPUs from Intel or AMD.
 

Kenmitch

Diamond Member
Oct 10, 1999
Since when is Fermi trash with no appeal? Maybe to some ATI fanboys...

It's hot and power hungry but provides great performance. It is certainly no 5800 Ultra.

I would also love to see how it handles editing.

The biggest problem was the huge delay of Fermi. If it had been released closer to ATI's 5xxx series, things would be looked at differently.

Now I guess for somebody who does a lot of encoding with CS5, one could offset the power usage with the time saved. :)
 

yh125d

Diamond Member
Dec 23, 2006
I really don't see the point of this thread. We all know Fermi kicks arse at GPGPU, and we all knew video transcoding with CUDA was drastically faster than without. :\
 

dguy6789

Diamond Member
Dec 9, 2002
Since when is Fermi trash with no appeal? Maybe to some ATI fanboys...

It's hot and power hungry but provides great performance. It is certainly no 5800 Ultra.

I would also love to see how it handles editing.

The bolded part applied 100% to the 5800 Ultra when it came out. What the 5800 Ultra failed to do was provide a benefit to those who waited for it, the same as the GTX 480/470 failed to do. Both NV30 and GF100 were much, much faster than their predecessors (NV25 and GT200), but they were not better than ATI's next-gen cards that had already been out for months, which is why there is disappointment.
 

1h4x4s3x

Senior member
Mar 5, 2010
without visual penalties and limitations.
Since you've seen it mind showing me an example? I've been looking for it for quite some time now.

I think nVidia will try to cash in the video market.
Exactly, that's why your logic fails at some point.

Keep it mind there's limitation set on GTX 285: only 3 tracks on timeline are CUDA supported.
You realize that 3 tracks are rather useless for people who do it professionally?
Guess why it's limited.

Fermit will, I bet, be supported fully.
I'm sure you're aware of Nvidia's Quadro cards which have a way higher margin.
So, you really think Nvidia would allow the Mercury Engine to run as well on their consumer cards as on their workstation cards?
I'd be pleasantly surprised but I don't think this is going to happen.


In all your praise you forgot to mention that OpenCL support is in the pipeline.
 

yh125d

Diamond Member
Dec 23, 2006
The bolded part applied 100% to the 5800 Ultra when it came out. What the 5800 Ultra failed to do was provide a benefit to those who waited for it, the same as the GTX 480/470 failed to do. Both NV30 and GF100 were much, much faster than their predecessors (NV25 and GT200), but they were not better than ATI's next-gen cards that had already been out for months, which is why there is disappointment.

I wasn't knowledgeable at the time, so correct me if I'm wrong, but I think the 5800 Ultra was late, hot, loud, expensive, and slow, right? Fermi is late, loud, and hot, but it is not slow or that expensive. That's the difference, and why Fermi isn't an NV30 repeat.
 

Lonyo

Lifer
Aug 10, 2002
Since you've seen it mind showing me an example? I've been looking for it for quite some time now.


Exactly, that's why your logic fails at some point.


You realize that 3 tracks are rather useless for people who do it professionally?
Guess why it's limited.


I'm sure you're aware of Nvidia's Quadro cards which have a way higher margin.
So, you really think Nvidia would allow the Mercury Engine to run as well on their consumer cards as on their workstation cards?
I'd be pleasantly surprised but I don't think this is going to happen.


In all your praise you forgot to mention that OpenCL support is in the pipeline.

I'm sure I read somewhere that something GPGPU related was only going to be allowed on the workstation cards, although I can't for the life of me remember what it was.
 

exar333

Diamond Member
Feb 7, 2004
The biggest problem was the huge delay of Fermi. If it had been released closer to ATI's 5xxx series, things would be looked at differently.

Now I guess for somebody who does a lot of encoding with CS5, one could offset the power usage with the time saved. :)

Who cares about the delay? Sure, Fermi earlier would have been great, but it has nothing to do with its actual performance.
 

Nemesis 1

Lifer
Dec 30, 2006
There is a small beloved patriot lurking in all this talk of the amazing video editing power of Fermi over Intel CPUs. It's called Sandy Bridge. Since the current form of Fermi really isn't all that, we'll have to wait for Fermi II, and by then Intel will be ready with Sandy Bridge.

http://www.youtube.com/watch?v=SZrVlurvRu4
 

pmv

Lifer
May 30, 2008
I wasn't knowledgeable at the time, so correct me if I'm wrong, but I think the 5800 Ultra was late, hot, loud, expensive, and slow, right? Fermi is late, loud, and hot, but it is not slow or that expensive. That's the difference, and why Fermi isn't an NV30 repeat.

Fermi is pretty damn expensive everywhere but the US. So expensive as to be pointless. Also, currently non-existent; it's still vapourware outside the US.

It's not slow, though, I give you that.

I thought the original post was interesting. I don't know why everyone's reacting so aggressively to it*. Fermi's clearly by no means an all-round disaster; it's just rather disappointing for gamers. It might yet be a good starting point for future cards, though.

* sorry, forgot that here 'aggressive and combative' is the default mode.
 

Cattykit

Senior member
Nov 3, 2009
Since you've seen it mind showing me an example? I've been looking for it for quite some time now.


Exactly, that's why your logic fails at some point.


You realize that 3 tracks are rather useless for people who do it professionally?
Guess why it's limited.


I'm sure you're aware of Nvidia's Quadro cards which have a way higher margin.
So, you really think Nvidia would allow the Mercury Engine to run as well on their consumer cards as on their workstation cards?
I'd be pleasantly surprised but I don't think this is going to happen.


In all your praise you forgot to mention that OpenCL support is in the pipeline.

Ah... you... another typical angry internet persona throwing ill-directed anger over the internet. :hmm:

Try reading carefully. I never said I saw the end result. It was a brief benchmark and comments I saw.

How did my logic fail? Given (a) the close partnership between Adobe and nVidia over recent years, (b) Fermi's negative reception in the gaming market, and (c) Fermi's GPGPU-oriented design, among various other factors, the video production market is where Fermi will shine the most. Even better, it's a market where Fermi has absolute dominance. A good place and time to cash in.

Of course I know "3 tracks are rather useless for people who do it professionally." Did I say it's still all good and perfect? Did I say such a limitation is nothing to worry about? No. I stated the limitation because there is a limitation. I pointed out that more limitations will probably follow, considering the current situation. I don't like it, but I foresee that will be the way. That was the point. Don't imply something other than that. Plus, don't forget that professionals are already drooling over its encoding performance alone. Even if such limitations are universal, CS5 + CUDA will have major appeal as long as its encoding performance remains.


"In all your praise you forgot to mention that OpenCL support is in the pipeline"??
I'm here talking about Fermi and CS5. Those products are already out (though PP cs5 is still in beta stage). OpenCL support on CS whatever is not.
When OpenCL becomes reality, I'll praise it. Until then, as of now, it's CUDA only.

Fermi is not the name of a consumer-level GPU. Rather, it is a code name for
"The Next Generation CUDA Architecture."

Keep that in mind. Now, re-read what you said against my saying "Fermi will, I bet, be supported fully":

"I'm sure you're aware of Nvidia's Quadro cards which have a way higher margin. So, you really think Nvidia would allow the Mercury Engine to run as well on their consumer cards as on their workstation cards? I'd be pleasantly surprised but I don't think this is going to happen."

Again, you fail at implication. You see, I never said CS5 will fully support consumer-level cards. I said "CS5 will support Fermi fully." Before you accuse me of stuff I never said, think about your own limitations and reading comprehension problems. Are you succeeding with those problems? "I don't think this is going to happen."
 

yh125d

Diamond Member
Dec 23, 2006
Fermi is pretty damn expensive everywhere but the US. So expensive as to be pointless. Also, currently non-existent; it's still vapourware outside the US.

It's not slow, though, I give you that.

I thought the original post was interesting. I don't know why everyone's reacting so aggressively to it*. Fermi's clearly by no means an all-round disaster; it's just rather disappointing for gamers. It might yet be a good starting point for future cards, though.

* sorry, forgot that here 'aggressive and combative' is the default mode.

I'm no expert on foreign stocking status, so all my comments are based on the assumption that there are no countries other than the USA/Canada.


But come on, it's "been available" for less than a week; of course there are stock issues, and of course some vendors are price gouging. The 5970 has been out for months and still has some stock issues and sells for over MSRP.
 

Anarchist420

Diamond Member
Feb 13, 2010
I didn't think there was anything wrong with the 5900 Ultra, the refresh of the 5800 Ultra (other than that it was slower than the 9700 Pro), and there is nothing wrong with Fermi, considering nVidia's filtering quality is still superior (AF is angle-invariant on ATi, but their trilinear LOD calculation is worse); plus, ATi does a Z-range optimization that can't be turned off.
 

Braznor

Diamond Member
Oct 9, 2005
I don't get comments about Fermi being vapourware. My retailer can get Fermi for me and I live in India!
 

nitromullet

Diamond Member
Jan 7, 2004
I don't get comments about Fermi being vapourware. My retailer can get Fermi for me and I live in India!

I'm pretty sure you can get a GTX 480 pretty much anywhere in the world right now; it just depends on how much you are willing to pay for it.
 

cbn

Lifer
Mar 27, 2009
I am really interested in hearing about Tesla implementations that replace entire data centers worth of CPUs from Intel or AMD.

Are you talking about a data center containing only GPUs? Is this possible now?
 

OCGuy

Lifer
Jul 12, 2000
I didn't realize that I had trash in my system. I r sad now. :(


We don't really know Fermi's appeal in gaming yet. Every one that becomes available gets bought up right away, but we have no way of knowing just how many that is.

Supply is low, but it will get better. nV pulled a pretty slick move by reserving 40nm wafers for their mobile cards and then switching them to Fermi.*

*If you believe the usual rumour sites.
 

Magusigne

Golden Member
Nov 21, 2007
I buy the best price/performance ratio. I'm neither Red nor Green (although I do lean green). As you can see, this round the Red Team got my money. Last round, I had a GTX 285.

Fermi is not trash; it's just the first revision and has A LOT of potential.