Nvidia Launches 300-Series Cards

jpeyton

Moderator in SFF, Notebooks, Pre-Built/Barebones
Moderator
Aug 23, 2003
25,375
142
116
http://www.pcpro.co.uk/blogs/2010/02/24/its-at-it-again-nvidia-rebrands-yet-more-gpus/

[Image: gf100-ready.jpg]
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
The GT 330 has either 1GB or 2GB of VRAM and an interface of 128 bits, 192 bits or 256 bits. The models can also have either 96 or 112 cores, along with a GPU frequency of 500-550MHz, a memory clock of 500-800MHz and the shader running at 1250-1340MHz.

Woo, sounds great.
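The bus-width and memory-clock combinations quoted above bound the card's peak memory bandwidth. A quick sketch of the standard calculation, assuming DDR-style memory (two transfers per clock; whether the article's 500-800MHz figures are base or effective clocks is not stated, so treat these as rough estimates):

```python
def mem_bandwidth_gbps(bus_width_bits, mem_clock_mhz, transfers_per_clock=2):
    """Peak memory bandwidth in GB/s, assuming double-data-rate memory."""
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * mem_clock_mhz * 1e6 * transfers_per_clock / 1e9

# Best quoted GT 330 config: 256-bit bus at 800 MHz
print(mem_bandwidth_gbps(256, 800))  # 51.2
# Worst quoted config: 128-bit bus at 500 MHz
print(mem_bandwidth_gbps(128, 500))  # 16.0
```

For comparison, the 8800 GT's 256-bit bus at 900 MHz works out to 57.6 GB/s by the same formula, which is why the low-end configs here look unimpressive.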
 

Kuzi

Senior member
Sep 16, 2007
572
0
0
Lol, something like that. You get the same performance as the (2007) 8800 GT, but with less power draw. Awesome.

Maybe a G92 version 5.0 released in 2011 would be small enough to use in smartphones.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Lol, something like that. You get the same performance as the (2007) 8800 GT, but with less power draw. Awesome.

Maybe a G92 version 5.0 released in 2011 would be small enough to use in smartphones.

If they could get a G92 into a smartphone, that would be tight as hell!
 

yh125d

Diamond Member
Dec 23, 2006
6,886
0
76
Before reading the links: we've got the 8800GT/8800GTS 512, except on newer processes, with 1GB/2GB of memory and narrower RAM buses (though they may be GDDR5, which would mean more bandwidth - haven't read yet).

And let me guess, they're not gonna cost only ~$90-100, which is about what they're worth.
 

yh125d

Diamond Member
Dec 23, 2006
6,886
0
76
Yep, pretty much. I see the GT320/330 are definitely G92 rebadges, but is the GT340 using GT200 shaders?
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
G92 isn't a bad core, if only nvidia could work on scaling it up a little.
 

yh125d

Diamond Member
Dec 23, 2006
6,886
0
76
Imagine the plight of someone upgrading their aged 8800GT to a brand new GT330
 

mikek753

Senior member
Dec 21, 2005
358
0
0
Those would be lower-power-draw cards, but with no move to DX11.

How many people will get those NEWER cards only to discover later that they don't come with DX11?
But if the GT 340 goes on sale for $50 and is low profile, it will be perfect for an HTPC.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Like DX8, DX9, and DX10 before it, DX11 will matter about three generations from now. And honestly, a low-end card won't do squat in DX11 except add a bullet point to its marketing material.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Like DX8, DX9, and DX10 before it, DX11 will matter about three generations from now.

Are you saying this because we have to wait for Xbox 3 console ports in order to get substantial numbers of DX11 games?
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Are you saying this because we have to wait for Xbox 3 console ports in order to get substantial numbers of DX11 games?

I'd say that may have something to do with it. But I was thinking more along the lines of game engines that utilize DX11 to its fullest typically showing up down the road, and then games that use them taking time to be developed. DX10 has been out for how many years? Is it a major driving force in game development yet? It took years for DX9 to become mainstream. It is the nature of the beast. And on a low-end part? Outside of the bullet point, it really shouldn't be a major concern when making a purchase today.
 

zerocool84

Lifer
Nov 11, 2004
36,041
472
126
I'd say that may have something to do with it. But I was thinking more along the lines of game engines that utilize DX11 to its fullest typically showing up down the road, and then games that use them taking time to be developed. DX10 has been out for how many years? Is it a major driving force in game development yet? It took years for DX9 to become mainstream. It is the nature of the beast. And on a low-end part? Outside of the bullet point, it really shouldn't be a major concern when making a purchase today.

Well of course it'll take time 'cause it's so new, but since W7 is selling infinitely faster than Vista ever did, I see DX11 adoption being faster as well. We already have a few DX11 games and even more coming out.
 

NoQuarter

Golden Member
Jan 1, 2001
1,006
0
76
I'd say that may have something to do with it. But I was thinking more along the lines of game engines that utilize DX11 to its fullest typically showing up down the road, and then games that use them taking time to be developed. DX10 has been out for how many years? Is it a major driving force in game development yet? It took years for DX9 to become mainstream. It is the nature of the beast. And on a low-end part? Outside of the bullet point, it really shouldn't be a major concern when making a purchase today.

But there were a lot of reasons DX10 was never widely used:
1. Console ports
2. No one bought into Vista
3. Nvidia forced MS to nerf the hell out of DX10 to the point it was basically DX9 with different memory management
4. You had to write a DX9 engine and a DX10 engine to cover both bases

Reasons why DX11 should be more prevalent:
1. The next gen of consoles isn't too far out
2. Vista and Win7 are both well established now
3. DX11 has everything DX10 was supposed to have, plus everything that was wanted for DX11
4. DX11 uses a compatibility switch, so one DX11 engine can operate on DX9, DX10, or DX11 hardware, meaning there's no reason not to write in DX11 and let it switch down to DX9 mode.
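The "compatibility switch" in point 4 refers to Direct3D 11's feature levels: an application lists the levels it can drive, best first, and the runtime hands back the highest one the hardware actually supports. A minimal conceptual sketch of that negotiation (the level names mirror D3D's, but this is an illustration, not the real API):

```python
# Direct3D-style feature levels, highest capability first
FEATURE_LEVELS = ["11_0", "10_1", "10_0", "9_3", "9_2", "9_1"]

def negotiate_level(hardware_max, requested):
    """Return the first requested level the hardware supports, or None.

    A level is supported when it sits at or below the hardware's maximum,
    i.e. at the same index or later in the descending FEATURE_LEVELS list.
    """
    floor = FEATURE_LEVELS.index(hardware_max)
    for level in requested:
        if FEATURE_LEVELS.index(level) >= floor:
            return level
    return None

# A DX10 part (say, an 8800 GT-class card) running a D3D11-written engine:
print(negotiate_level("10_0", ["11_0", "10_1", "10_0", "9_3"]))  # 10_0
```

The point is that one codepath, written against the D3D11 interfaces, can target older hardware by restricting itself to the features of the level it was handed.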
 

mikek753

Senior member
Dec 21, 2005
358
0
0
But there's a lot of reasons DX10

Reasons why DX11 should be more prevalent:
1. The next gen of consoles isn't too far out
2. Vista and Win7 are both well established now
3. DX11 has everything DX10 was supposed to have, plus everything that was wanted for DX11
4. DX11 uses a compatibility switch, so one DX11 engine can operate on DX9, DX10, or DX11 hardware, meaning there's no reason not to write in DX11 and let it switch down to DX9 mode.

5. DX11 runs on both Vista and Win7, so it will reach far more people, unlike DX10, which was Vista-only.
 

WicKeD

Golden Member
Nov 20, 2000
1,893
0
0
They are going to sell tons to OEM builders. They couldn't care less about "people like us" when it comes to this card. It was not made for us.
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
So these will be the 'midrange' GPUs when Fermi hits? Instead of giving us a smaller, cut-down Fermi we get the same old crap again... I admit it probably performs where you'd expect a midrange card to, but it goes to show how right ATI/AMD got the execution with Cypress and all the lower-end derivatives that are out now.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
No, it won't "perform like a midrange card should" - it will suck. That is the whole idea behind Nvidia building the fastest card out there: they use the halo effect to convince people that their ancient G92 derivative is faster than a 5830 or whatever. Why couldn't they at least take a GT200 derivative instead??? Why???!!! /rant
 

gorobei

Diamond Member
Jan 7, 2007
4,110
1,613
136
4. DX11 uses a compatibility switch, so one DX11 engine can operate on DX9, DX10, or DX11 hardware, meaning there's no reason not to write in DX11 and let it switch down to DX9 mode.

FYI, there is no DX9 compatibility option in DX11. It is a superset of DX10, meaning that DX10 hardware can use the DX10 base of the engine and simply won't implement the extra DX11 features.
Currently you write a DX9 version of a game and then a separate version for DX10. There is some overlap, but they are fundamentally different. DX11 is better thought of as DX10+: the card can execute the DX10 code and ignores the + code. There is no option to convert it to DX9.