EVGA 7800GT GDDR2 vs GDDR3

Fuelrod

Senior member
Jul 12, 2000
369
0
76
Newegg has two different EVGA 7800GT's for sale.

256-P2-N515-AX GDDR2 version

256-P2-N518 GDDR3 version

Both have the same core clock (445) and memory clock (1070). As best I can tell, the only difference is that one uses GDDR2 and the other GDDR3. Is the GDDR3 version really worth the $80 difference? What would be the difference in real-world benchmarks? If there is a real-world difference, then to me this is like when video card makers sell 128-bit memory interface cards at reduced prices compared to 256-bit cards and fool the uneducated customer.
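For a rough sense of the numbers involved, here's a back-of-the-envelope peak bandwidth calculation in Python (a sketch only: it assumes the 7800GT's 256-bit bus and the listed 1070 MHz effective memory clock, and ignores real-world efficiency and any latency difference between GDDR2 and GDDR3):

# Rough peak memory bandwidth: (bus width in bits / 8) bytes per transfer,
# multiplied by the effective memory clock in MHz, gives MB/s; divide by 1000 for GB/s.
def peak_bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    bytes_per_transfer = bus_width_bits / 8      # 256-bit bus -> 32 bytes per transfer
    return bytes_per_transfer * effective_clock_mhz / 1000

print(peak_bandwidth_gb_s(256, 1070))  # ~34.2 GB/s for either 7800GT at this memory clock
print(peak_bandwidth_gb_s(128, 1070))  # ~17.1 GB/s for a cut-down 128-bit card at the same clock

Halving the bus width halves the peak bandwidth, which is what makes the 128-bit trick so effective on uneducated buyers.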
 
Jun 14, 2003
10,442
0
0
Originally posted by: Acanthus
there could be a small latency difference, but at the same clockspeed, very, very little difference in performance.


doesn't DDR2 run a little hotter as well?

are you sure this isn't a typo... looks odd to me

here's a good article highlighting the differences between DDR2 and DDR3.

the G simply indicates that it's intended for graphics use.

according to that, by the time DDR2 arrived on the motherboard scene with Intel Socket 775, GDDR2 had already become obsolete, so I don't know why they are putting GDDR2 on new cards like the 7800

that is, GDDR2 has already become obsolete, the reasons being high power consumption along with thermal problems that negate the marginal performance increase found over the original DDR design.

In the graphics sector, the next generation of DDR, that is, GDDR3 has almost overnight become a main player, with a simplified design, lower power consumption and final riddance of some historical DRAM protocol baggage. The design improvements of GDDR3 compared to (G)DDR-II are extremely intuitive, which makes us wonder whether a similar transition could happen overnight in the system memory / desktop segment as well.


there would probably be a marginal, i.e. unnoticeable, performance difference. The GDDR2 won't overclock as well, and may make the card as a whole draw more power and therefore create more heat, further reducing OC potential.

but the price difference is a big one, so you can save a lot of money and potentially end up with a card whose memory won't overclock all that well and that uses more power as a whole. For a nearly $100 saving, that seems worth it. GDDR2 is old tech, though.
 

shabby

Diamond Member
Oct 9, 1999
5,782
45
91
GDDR2 never came out on any video cards; it's either DDR or GDDR3.
No idea why one is cheaper though.
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
Originally posted by: shabby
GDDR2 never came out on any video cards; it's either DDR or GDDR3.
No idea why one is cheaper though.

yes it did ;)

9800 Pro 256mb IIRC...
 

monster64

Banned
Jan 18, 2005
466
0
0
Someone posted this before and IT IS A TYPO.
Wanna know why they have different prices? The more expensive one comes with a free eVGA (Jetway) mobo.
 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
Originally posted by: dug777
Originally posted by: shabby
GDDR2 never came out on any video cards; it's either DDR or GDDR3.
No idea why one is cheaper though.

yes it did ;)

9800 Pro 256mb IIRC...



Also, NV made a huge spiel back in the day about how GDDR2 was going to help the 5800 Ultra so much.

Yeah, GDDR2 saw a little use, but mostly got skipped for GDDR3.
 

MobiusPizza

Platinum Member
Apr 23, 2004
2,001
0
0
GDDR2 is crap. You know why GeForce FX sucks in performance? Because it used GDDR2, which has crappy latency.
 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
Originally posted by: AnnihilatorX
GDDR2 is crap. You know why GeForce FX sucks in performance? Because it used GDDR2, which has crappy latency.


Um...

actually GeForce FX sucks in performance for two main reasons:

1) It's a 4-pipeline architecture, which means that in today's largely shader-limited environment, its ability to process only 4 pixels per clock (in most situations) leaves it extremely crippled.

2) Its pixel pipelines take a HUGE hit if code is written out of order for them (which most DX9 shader code based on HLSL is). This is why Nvidia introduced its compiler in its drivers during the FX series, in order to reorder shader code as it was being processed and improve performance. There was still some overhead involved in this, though.

Luckily, with the GeForce 7 series, such shader replacement is no longer needed, and has been removed from the driver.


GeForce FX's suckage had VERY LITTLE to do with GDDR2.
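To put the pipeline point in rough numbers, here's a quick per-clock throughput comparison in Python (a sketch: the 4-pixels-per-clock figure is from the post above, while the example clocks and the 8-pipeline Radeon 9800 Pro comparison are illustrative values from that era, not benchmark results):

# Rough pixel fill rate: pipelines * core clock (MHz) -> megapixels per second.
def fill_rate_mpixels(pipelines, core_clock_mhz):
    return pipelines * core_clock_mhz

print(fill_rate_mpixels(4, 450))  # ~1800 Mpixels/s for a 4-pipeline FX part at 450 MHz
print(fill_rate_mpixels(8, 380))  # ~3040 Mpixels/s for an 8-pipeline Radeon 9800 Pro at 380 MHz

Even at a lower clock, the wider design pushes far more pixels per second, which is why the memory type alone couldn't close the gap.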