Fermi huh? Supposed code name for GT300

Status
Not open for further replies.

Hauk

Platinum Member
Nov 22, 2001
2,806
0
0
Nucular reactors are good
Yeah, Fuad's been posting some interesting tidbits lately. Black Friday's been thrown around a few times; maybe pure speculation, but who knows. Methinks this thing is gonna be a freakin' beast. Can hardly wait, but I'm gonna try..
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
LMAOOO.. Video was pretty funny.

Well, I hope GT300 is a beast. A new architecture doesn't leave much room for anything but speculation. We don't know squat about it except that it most likely has GDDR5, but does it still retain a 512-bit bus like GT200? How many shaders, and what exactly are they? How are they different from or similar to GT200 shaders? What's changed, what hasn't? Yeesh.. so many questions, and the suspense is killing me.
 

Hauk

Platinum Member
Nov 22, 2001
2,806
0
0
2x the same stuff as GT200 with 2GB GDDR5 and some new goodies too. Here's to hoping.. :beer:
 

MODEL3

Senior member
Jul 22, 2009
528
0
0
Just boring.
The only purpose of the article is for just a few more hits.

----------------------------------------------------

The chip that we ended up calling GT300 has internal codename Fermi. The name might suit it well as Enrico Fermi was the chap that came up with the first nuclear reactor.

"Wow, very nice new info, although irrelevant regarding what specs we can expect from GT300. (nuclear power marketing...)"
"I hope the internal name doesn't hint at the GT380's TDP. lol"


-----------------------------------------------------

The new Nvidia chip is taped out and running, and we know that Nvidia showed it to some important people.

"Anything new? I mean, we heard before that it's taped out; of course now GT300 is running like the wind, lol."

-------------------------------------------------------

The chip should be ready for very late 2009 launch.

"Yes it should be!
But will it be?"

------------------------------------------------------

This GPU will also heavily concentrate on parallel computing and it will have bit elements on chip adjusted for this task.

"What a shock, nobody expected this with DXcompute/OpenCL/etc and with GDDR5..."

------------------------------------------------------

Nvidia plans to earn a lot of money because of that.

"GTFO, unbelievable."

------------------------------------------------------

The chip supports GDDR5 memory, has billions of transistors and it should be bigger and faster than Radeon HD 5870.

"So, we have:
1. Finally, NV is going to use GDDR5, a year and a half after ATI adopted it.
What a shock, I thought they would use GDDR3 until GT400.
2. It has billions of transistors.
(that's 2 billion or more) Fantastic.
3. It should be bigger and faster than the Radeon HD 5870.
What? You mean they are going to do exactly what they did for the last 3 years? I thought they would adopt ATI's <$300, Q4 2007/Q2 2008 die-size/performance-target strategy."

--------------------------------------------------------

The GX2 dual version of card is also in the pipes.

"Does he mean at the GT380 launch, or sometime in the future?
If he means the latter, wow, I never thought about it. Honestly."

------------------------------------------------------

It's well worth noting that this is the biggest architectural change since G80 times as Nvidia wanted to add a lot of instructions for better parallel computing.

"But why? NV made such 'radical' architectural changes with its DX10 transition (especially in the <$200 parts; you just have to look at the model numbers (renames), lol)."

-------------------------------------------------------

The clocks will be similar or very close to one we've seen at ATI's DirectX 11 card for both GPU and memory but we still don't know enough about shader count and internal structure to draw any performance conclusions.

"He means the HD 5870, right? So GT300 will have an 825MHz or 850MHz core clock in the standard config?
WTF? This will be interesting, if true..."

---------------------------------------------------------

Of course, the chip supports DirectX 11 and Open GL 3.1

"But of course."

-------------------------------------------------------

Don't get me wrong, I hope NV has something good with GT300.
I just find articles like this meaningless.
That's why the tongue-in-cheek post.
 

MarcVenice

Moderator Emeritus
Apr 2, 2007
5,664
0
0
This is an article I wrote myself: Prolly not allowed so removed to be sure :)

Text added (translated from Dutch):

"Nvidia's DirectX 11 GPU is code-named Fermi and should go on sale this year. The video chip manufacturer is said to be combining a 384-bit wide memory interface with GDDR5 memory. The GPU would still need a respin.

Until now Nvidia's upcoming GPU has been labeled GT300, but according to anonymous sources the chip carries the code name Fermi inside Nvidia. Nvidia's partners were recently given sparse information about the new chip, which among other things would have a 384-bit memory bus and use GDDR5 memory. Its bandwidth would be about 1.5 times higher than that of AMD's DirectX 11 graphics cards. Ten days ago Nvidia claimed that DirectX 11 support would not be an important factor in consumers' purchasing decisions, but the manufacturer has now reportedly told partners that Fermi does support DirectX 11 and will arrive soon.

Nvidia was less explicit about the number of shader processors, but it is reportedly at least twice as high as in the previous generation. The GT200 has 240 shader processors on board, so if the rumors are correct, Fermi has at least 480. The clock speeds of the GPU and shader processors are not yet known; they reportedly cannot be pinned down yet because, according to anonymous sources, the GPU still needs a respin. Only once the first real working samples roll off the line can the clock speeds be determined. The respin means it will be about six to eight weeks before that happens. If that holds, it is plausible that the first Fermi-based graphics cards will still come to market this year."

I figure linking isn't a problem, since it's in Dutch? If it is, plz repost and I'll remove it :p Anyways, I've had Fermi confirmed by my own sources, as well as the 384-bit memory bus and GDDR5. The number of shaders is supposed to be > 2x GT200, but information regarding shaders was shady. This is Nvidia PR btw; I mean, this is what they tell partners. It could all be bullshit, trying to draw attention away from ATI/Cypress.

The respin piece is from a different source btw.
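For what it's worth, the "1.5 times higher" bandwidth claim is just bus-width arithmetic, assuming both cards run GDDR5 at the same effective data rate (the 4.0 Gbps figure below is an illustrative placeholder, not a confirmed spec):

```python
# Peak memory bandwidth = (bus width in bytes) * (effective data rate per pin).
# 4.0 Gbps per pin is an illustrative GDDR5 rate, not a leaked clock.
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Return peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

fermi = peak_bandwidth_gbs(384, 4.0)    # rumored Fermi bus width
cypress = peak_bandwidth_gbs(256, 4.0)  # HD 5870's 256-bit bus

print(fermi, cypress, fermi / cypress)  # the ratio is simply 384/256 = 1.5
```

At identical memory clocks the ratio depends only on bus width, which is presumably where the article's "1.5 times" comes from.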
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: MODEL3
Just boring.
The only purpose of the article is for just a few more hits.

(snip)

Don't get me wrong, I hope NV has something good with GT300.
I just find articles like this meaningless.
That's why the tongue-in-cheek post.

Well, we don't have very much else right now. Any little tidbit here and there is at least something. The Fermi codename seems to have been confirmed though. That is at least one thing we didn't know before. Albeit not very useful.
 

Hauk

Platinum Member
Nov 22, 2001
2,806
0
0
Speaking of boring, someone must be r-e-a-l-l-y bored to dissect a Fud article.. ;)
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: yacoub
Hopefully it's not referring to how hot the core runs. lol

All high-end GPUs run hot. That's a given. They are meant to perform and run hard, and they run hot. But I agree. Nobody needs the China Syndrome in their PC. ;)
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
Originally posted by: Keysplayr
Originally posted by: yacoub
Hopefully it's not referring to how hot the core runs. lol

All high-end GPUs run hot. That's a given. They are meant to perform and run hard, and they run hot. But I agree. Nobody needs the China Syndrome in their PC. ;)

Oh, and I can't wait till somebody starts saying... "oh noes, it uses too much power"
 

MODEL3

Senior member
Jul 22, 2009
528
0
0
Originally posted by: Hauk
Speaking of boring, someone must be r-e-a-l-l-y bored to dissect a Fud article.. ;)

But, but, but the site traffic at the time I wrote it was really slow. :(

 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
Looks promising, but I want to see some benchmarks. :)

Come on Nvidia - throw us a bone!
 

T2k

Golden Member
Feb 24, 2004
1,665
5
81
"The chip supports GDDR5 memory, has billions of transistors and it should be bigger and faster than Radeon HD 5870."

Briiiiiillllliant - perfectly shows what an utterly clueless PoS this article is. And we, idiots, after so many similarly worthless posts, are still generating clicks for this clueless loser every time he posts another piece of re-hashed sh!t without any info like this.

Originally posted by: Keysplayr
Any little tidbit here and there is at least something.

Clicks for this loser, right.

Ah and there are no tidbits anywhere AFAICT.
 

Rusin

Senior member
Jun 25, 2007
573
0
0
Naah.. Nvidia's new DX10.1 chip GT300 is still over 6 months from releasing
 

Hauk

Platinum Member
Nov 22, 2001
2,806
0
0
Originally posted by: MODEL3
Originally posted by: Hauk
Speaking of boring, someone must be r-e-a-l-l-y bored to dissect a Fud article.. ;)

But, but, but the site traffic at the time I wrote it was really slow. :(

Must have been.. :D

 

Hauk

Platinum Member
Nov 22, 2001
2,806
0
0
Originally posted by: ArizonaSteve
First company to throw out a single GPU card with 2GB of memory gets my business...

Yea I'm wanting 2GB for my DX11 solution. With as much power as these next gen cards will have, 2GB is future proofing if anything. An X2 card from either camp should demolish games for at least two years. I rerouted some cables this weekend to allow for a 5870 X2. Come on nV, show your cards..
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Originally posted by: ArizonaSteve
First company to throw out a single GPU card with 2GB of memory gets my business...
Well, the GT300 will be 1536MB if it's 384-bit.
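The 1536MB figure follows directly from how GDDR5 hangs off the bus: a 384-bit interface is twelve 32-bit channels, so with one memory chip per channel the total is twelve times the per-chip capacity. A quick sketch (the 128MB-per-chip density is an assumption, common for GDDR5 at the time, not a confirmed spec):

```python
# Total framebuffer = (number of 32-bit channels) * (capacity per chip).
# 128MB (1Gbit) per chip is an assumed density, not a leaked spec.
def framebuffer_mb(bus_width_bits: int, chip_mb: int = 128) -> int:
    channels = bus_width_bits // 32  # GDDR5 chips have 32-bit interfaces
    return channels * chip_mb

print(framebuffer_mb(384))  # 12 * 128 = 1536MB
print(framebuffer_mb(256))  # 8 * 128 = 1024MB, e.g. a 1GB HD 5870
print(framebuffer_mb(512))  # 16 * 128 = 2048MB, so a 512-bit bus gets 2GB
```

Which is also why a 2GB card falls out naturally from a 512-bit bus (or from doubled-density chips on a narrower one).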

 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: Keysplayr
LMAOOO.. Video was pretty funny.

Well, I hope GT300 is a beast. A new architecture doesn't leave much room for anything but speculation. We don't know squat about it except that it most likely has GDDR5, but does it still retain a 512-bit bus like GT200? How many shaders, and what exactly are they? How are they different from or similar to GT200 shaders? What's changed, what hasn't? Yeesh.. so many questions, and the suspense is killing me.

yes, it makes me want to go nucular on them
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
Originally posted by: ArizonaSteve
First company to throw out a single GPU card with 2GB of memory gets my business...

Wasn't this already released? They made a GTX 285 with 2GB memory already....
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Fuad may equate Enrico Fermi with nuclear reactors, but he was so much more than just the Manhattan Project guy.

To this day, many people in engineering and the sciences employ his so-called Fermi estimate approach to answering questions: eliminating the impossible as well as the improbable on the way to estimating the probable.

(used in the so-called Fermi paradox)

And perhaps an equally great contribution, in the field of statistical mechanics, was his development of Fermi-Dirac statistics.

None of these contributions lends itself nicely to needlessly sensationalistic journalism, though, so I don't have to stretch my imagination very far to fathom why they were overlooked as competing explanations for why Nvidia might be compelled to name a MIMD-based architecture chip after him.
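For anyone curious, a Fermi estimate is nothing fancier than multiplying a chain of order-of-magnitude guesses. The classic "piano tuners in Chicago" version, where every number is a deliberately rough assumption:

```python
# Classic Fermi estimate: how many piano tuners work in Chicago?
# Every input below is an order-of-magnitude guess -- that's the point.
population = 3_000_000           # rough population of Chicago
people_per_household = 2         # rough average household size
piano_ownership = 1 / 20         # guess: ~5% of households own a piano
tunings_per_year = 1             # a piano gets tuned about once a year
tunings_per_tuner = 4 * 5 * 50   # 4 tunings/day, 5 days/week, 50 weeks/year

pianos = population / people_per_household * piano_ownership
tuners = pianos * tunings_per_year / tunings_per_tuner
print(round(tuners))  # ~75; the goal is only to land within an order of magnitude
```

None of the inputs is accurate, yet the product usually lands within a factor of ten of the truth, which is the whole trick.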
 

Barfo

Lifer
Jan 4, 2005
27,539
212
106
Hopefully this doesn't turn out to be the Prescott of GPUs, don't like that code name at all :(
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: Barfo
Hopefully this doesn't turn out to be the Prescott of GPUs, don't like that code name at all :(

That's because you're equating a GPU with the heat of a nuclear reactor.
Do some research on Enrico Fermi. You might like the name better then.


 

Ben90

Platinum Member
Jun 14, 2009
2,866
3
0
I think that article was written by a third grader:

This GPU will also heavily concentrate on parallel computing
Nvidia plans to earn a lot of money because of that.
The gaming part is also going to be fast
has billions of transistors



ALSO IN THE NEWS
GT400 does not yet have a code name, but it has billions of transistors! This GPU will also heavily concentrate on parallel computing! The gaming part is also going to be fast so you know Nvidia plans to earn a lot of money because of that.


ALSO IN THE FUTURE NEWS
GT500 does not yet have a code name, but it has billions of transistors! This GPU will also heavily concentrate on parallel computing! The gaming part is also going to be fast so you know Nvidia plans to earn a lot of money because of that.


ALSO IN THE FUTURER'R NEWS
GT600 does not yet have a code name, but it has tens of billions of transistors! This GPU will also heavily concentrate on parallel computing! The gaming part is also going to be fast so you know Nvidia plans to earn a lot of money because of that.
 
Status
Not open for further replies.