[GamersNexus] GTX 1030 DDR4 "A Disgrace of a Graphics Card"

Page 4 - AnandTech Forums

NTMBK

Lifer
Nov 14, 2011
10,411
5,677
136
You're not supposed to play games with the GTX 1050. Its sole purpose is the 2D desktop, and it does a great job at that. They put this junk in ready-made systems from Costco or Fry's and whatnot. It's a POS, or else it would be called 1080, hehe.

If you want to display the desktop, integrated graphics works fine. The proper version of the 1030 can achieve playable framerates in Overwatch at reasonable settings; this crippled version gets half the performance.

I have no issue with the card existing. A cheap card that can decode modern video codecs and support modern APIs serves a purpose. My problem is with the marketing. Why the hell does this have the same name as a card with twice the performance?
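The "twice the performance" gap tracks directly with memory bandwidth. A quick back-of-the-envelope sketch, assuming the commonly cited specs for the two variants (64-bit bus on both, ~6000 MT/s effective for the GDDR5 card vs. 2100 MT/s for the DDR4 card):

```python
def peak_bandwidth_gbs(bus_width_bits: int, transfer_rate_mts: float) -> float:
    """Theoretical peak memory bandwidth in GB/s.

    bus_width_bits:    memory bus width in bits (64 for both GT 1030 variants)
    transfer_rate_mts: effective transfer rate in MT/s
    """
    bytes_per_transfer = bus_width_bits / 8               # bits -> bytes per transfer
    return bytes_per_transfer * transfer_rate_mts / 1000  # MB/s -> GB/s

gddr5 = peak_bandwidth_gbs(64, 6000)  # GT 1030 GDDR5
ddr4 = peak_bandwidth_gbs(64, 2100)   # GT 1030 DDR4
print(gddr5, ddr4, gddr5 / ddr4)      # 48.0 16.8 ~2.86x
```

On paper the DDR4 card has barely a third of the bandwidth, which is consistent with the roughly halved framerates in bandwidth-bound games.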
 

Ranulf

Platinum Member
Jul 18, 2001
2,822
2,423
136
That must be Intel integrated, because AMD's kicks it to the curb. I'd actually like to see a real Intel IGP comparison, as I wonder if current Intel IGPs are closer than that graphic suggests.

Judging by this comparison of the Intel HD 630 IGP to the regular GT 1030, the IGP looks about as far behind the GT 1030 as the GT 1030 DDR4 is, which should make the 1030 DDR4 vs. HD 630 IGP something of a contest:
https://www.youtube.com/watch?v=h0MeI0sQfy4

I have to wonder why any business with integrity would be involved with this product at all. It's such garbage that no customer would buy it if properly informed, so it really only sells via some level of deception.

I wonder if board partners are required by contract to build every variation that NVidia churns out.

I'd give high marks to the board partner that said: "No thanks, we aren't building that variation, because it's garbage we won't sully our name with." Note that this definitely is not you, EVGA.

If you look at the full-size pic, it says they (EVGA) tested their DDR4 1030 on an i3-6100 vs an i5-6600 in 3DMark 11 Performance. So, same 6th-gen chips, both with Intel® HD Graphics 530, but the i5 with 4 cores and a slower 3.3 GHz base clock (it boosts to 3.9; the i3 runs a 3.7 GHz base).

I'm thinking the blame goes all around here. Mostly with Nvidia, but for AIBs to put this out, meh. It's happened before, but give me a break.
 

thigobr

Senior member
Sep 4, 2016
243
185
116
Not defending nVidia: they should make it clear that this is a capped version! But we have seen this before without this much surprise.

Who remembers the GeForce4 MX440 using 64-bit SDR memory (1/4 the bandwidth compared to 128-bit DDR)? Or the various versions of the GeForce4 MX4000, the FX5200, the Radeon 9200/9550, etc.?

It has always been like this for end-of-life GPUs: they cut costs in every imaginable way without passing the savings on to the end user, usually increasing the memory size while reducing speed and bus width. Classic move! It's not new!
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
Not defending nVidia: they should make it clear that this is a capped version! But we have seen this before without this much surprise.

Who remembers the GeForce4 MX440 using 64-bit SDR memory (1/4 the bandwidth compared to 128-bit DDR)? Or the various versions of the GeForce4 MX4000, the FX5200, the Radeon 9200/9550, etc.?

It has always been like this for end-of-life GPUs: they cut costs in every imaginable way without passing the savings on to the end user, usually increasing the memory size while reducing speed and bus width. Classic move! It's not new!


Links or it didn't happen.

I have been following video cards since we were arguing about the quality of RAMDACs for VGA output, and about which cards had the fastest frame buffers for DOS games.

I don't remember a case as egregious as this one. Usually there is a designation like "LE" that warns you away, or they are bulk OEM cards for Dell only, or the card gets a bigger buffer at the same time and wasn't that bandwidth-constrained, so the results aren't that different.

But this card is often about HALF the speed, and there is no "LE" designation to warn buyers away.

Regardless of what happened before, we should get out the pitchforks every time NVidia tries to pull this redacted, just like when they tried the GPP redacted.

Enough negative reaction online might have them rethink this kind of "strategy".

Profanity is not allowed
in the tech areas.

AT Mod Usandthem
 
Last edited by a moderator:
  • Like
Reactions: loafbred and .vodka

thigobr

Senior member
Sep 4, 2016
243
185
116
I totally agree we should react negatively! Many customers won't even know they are getting a way slower card because of the naming scheme. What I don't get is why many people are reacting as if this issue were a new thing!

About "links or it didn't happen", here is an example:
https://www.techpowerup.com/gpudb/2132/geforce4-mx-440-8x
https://www.evga.com/products/pdf/nv65-lx.pdf

Same "MX440-8X" but with a 32-bit DDR bus! Less than 1/4 the bandwidth, as I said before, compared to the original 128-bit DDR at higher clocks.
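Since both variants use DDR memory, the ratio claimed here is essentially the bus-width ratio; a minimal check, using the widths from the two links (128-bit for the original MX440-8X, 32-bit for the EVGA NV65-LX):

```python
# Bandwidth scales linearly with bus width when memory type and clock match.
original_bus_bits = 128  # original GeForce4 MX440-8X
cut_down_bus_bits = 32   # EVGA NV65-LX variant
print(original_bus_bits / cut_down_bus_bits)  # 4.0 -> 1/4 the bandwidth at equal clocks
```

With the original card's higher memory clocks on top of the narrower bus, the cut-down card ends up below 1/4, as the post says.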
 

SPBHM

Diamond Member
Sep 12, 2012
5,066
418
126
Not defending nVidia: they should make it clear that this is a capped version! But we have seen this before without this much surprise.

Who remembers the GeForce4 MX440 using 64-bit SDR memory (1/4 the bandwidth compared to 128-bit DDR)? Or the various versions of the GeForce4 MX4000, the FX5200, the Radeon 9200/9550, etc.?

It has always been like this for end-of-life GPUs: they cut costs in every imaginable way without passing the savings on to the end user, usually increasing the memory size while reducing speed and bus width. Classic move! It's not new!

Yes... I bought an X1550 with 32-bit memory once... a card that originally had 128-bit memory.

It's not something new or exclusive to Nvidia, but it's good that they are getting negative press for it anyway.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
I totally agree we should react negatively! Many customers won't even know they are getting a way slower card because of the naming scheme. What I don't get is why many people are reacting as if this issue were a new thing!

About "links or it didn't happen", here is an example:
https://www.techpowerup.com/gpudb/2132/geforce4-mx-440-8x
https://www.evga.com/products/pdf/nv65-lx.pdf

Same "MX440-8X" but with a 32-bit DDR bus! Less than 1/4 the bandwidth, as I said before, compared to the original 128-bit DDR at higher clocks.

Actually, I think that is an error in the reporting. Find me a review that shows it has 1/4 the bandwidth. It looks more like they are reporting the width of the chips used rather than the bus on the card.

Even the SE and 420 don't have a 32-bit bus.

If this happened this drastically before, then there should be similar reviews complaining about it.

The reason this looks worse to me is that I have yet to see those reviews.

I do remember lots of little ones, like the recent AMD case, but nothing as big as losing half the performance. It seems like these extreme cases would be easy to find if they were really that extreme.

Previous reviews are important because as I indicated in a previous post, there were gimped cards that were for big OEMs (like Dell) only. So not many people cared about a special Dell card.

But name brand cards that cut performance in half with insufficient designation would have had similar outrage to this one, and I don't remember those outrages.
 
Last edited:

Tweak155

Lifer
Sep 23, 2003
11,449
264
126
Yes... I bought an X1550 with 32-bit memory once... a card that originally had 128-bit memory.

It's not something new or exclusive to Nvidia, but it's good that they are getting negative press for it anyway.
What was bad then is still bad now. Shame on any company that pulls this crap.
 

coercitiv

Diamond Member
Jan 24, 2014
7,225
16,982
136
What I don't get is why many people are reacting as if this issue is a new thing!
First of all, many posters in this thread know it happened before; in fact, I'd like to see the thread post that says otherwise, along the lines of "this has never happened before, from either AMD or Nvidia". Second of all, this is how history works: people forget, weren't interested at the time, or are just too young to remember.

What I don't get is why it's more important for some that this happened before apparently without retaliation than the fact that some of the media doesn't want to turn a blind eye to this crap anymore and turns up the heat instead. If the lack of reaction in the past is of utmost importance, then read how Tom's Hardware commented on the GT 730 announcement four years ago:
We especially do not understand why the first model with just 96 CUDA cores is called a GT 730. This will only cause confusion among customers.
and Geeks3D
This card is available in three versions, with important differences between versions which make the choice of this almost unworthy card difficult.

This happened before; the media warned and complained about deceptive branding years ago, albeit with less intensity. I'm surprised that some people around here think complaining about it is a first-time occurrence. The irony is strong here.

Start your search engine and look around. Even the forums discussed the subject.
 

Samwell

Senior member
May 10, 2015
225
47
101
Links or it didn't happen.

I have been following video cards since we were arguing about the quality of RAMDACs for VGA output, and about which cards had the fastest frame buffers for DOS games.

I don't remember a case as egregious as this one. Usually there is a designation like "LE" that warns you away, or they are bulk OEM cards for Dell only, or the card gets a bigger buffer at the same time and wasn't that bandwidth-constrained, so the results aren't that different.

But this card is often about HALF the speed, and there is no "LE" designation to warn buyers away.

Regardless of what happened before, we should get out the pitchforks every time NVidia tries to pull this bullshit, just like when they tried the GPP bullshit.

Enough negative reaction online might have them rethink this kind of "strategy".

Seems you missed all the redacted happening in the low end then. It has happened in every generation we have had so far. There were always cards with the same name but just DDR3/2, etc., which had half the speed.

http://www.sapphiretech.com/productdetial.asp?pid=D4693F7D-75BC-482C-AB2F-909D638C838B&lang=eng
http://www.sapphiretech.com/productdetial.asp?pid=B4DBE4FA-071A-4291-B92E-F4434BD796FA&lang=eng

Sometimes, like above, they even gave the slower cards more RAM, so people thought they were buying the faster one.
It's even worse in the notebook space. There was so much redacted released, it's unbelievable.

The scandal here is that reviewers totally didn't care for the last 10-15 years. It should have been called out much earlier.

Profanity (even when abbreviated)
is not allowed in the tech areas.

AT Mod Usandthem
 
Last edited by a moderator:

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
Seems you missed all the redacted happening in the low end then. It has happened in every generation we have had so far. There were always cards with the same name but just DDR3/2, etc., which had half the speed.

http://www.sapphiretech.com/productdetial.asp?pid=D4693F7D-75BC-482C-AB2F-909D638C838B&lang=eng
http://www.sapphiretech.com/productdetial.asp?pid=B4DBE4FA-071A-4291-B92E-F4434BD796FA&lang=eng

Sometimes, like above, they even gave the slower cards more RAM, so people thought they were buying the faster one.
It's even worse in the notebook space. There was so much redacted released, it's unbelievable.

The scandal here is that reviewers totally didn't care for the last 10-15 years. It should have been called out much earlier.

I actually found a review for that one with some compares to the GDDR version.
https://www.youtube.com/watch?v=xb9QjaybewQ

Not quite as bad as how far the 1030 falls with DDR4.

If it is a case of reviewers paying more attention today, then good for the reviewers of today. We can't gain anything by chastising different reviewers 10+ years in the past for not being on the ball.

All I can say is good job to today's reviewers for doing a better job and getting the outrage going on this redacted.

As I said several times, I am aware of lots of minor cases like this in the past, but often there were mitigating factors that weren't quite as bad as this case: memory size variations that made it more obvious it was a different model, significant naming differences, or significantly lower prices to compensate.

This card is named the same, priced about the same, with the same amount of memory, but loses up to half its performance. Or, stated another way, the "real" card is up to TWICE as fast. This feels like an outright scam more than any previous case I remember.

I just went to Micro Center and typed GT 1030 into the search engine, and there was only one card available for shipping: $95 for the DDR4 version.

Profanity isn't allowed
in the tech areas.

AT Mod Usandthem
 
Last edited by a moderator:
  • Like
Reactions: crisium

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
I can see a bit of sense in most of the other deceptive offerings.

I can't see any sense in the 64-bit DDR4-2100 GT 1030 offering. No sense for Nvidia to make it, no sense for retailers to try to sell it, no sense for customers to buy it.
I don't see an upside.

DDR4-2100 RAM can't be that cheap, can it? Especially while DDR4 is otherwise very expensive.
Did they accidentally make a batch of extra-slow DDR4 chips they had to get rid of? They aren't even 2133.
 
  • Like
Reactions: wilds and Cerb

NTMBK

Lifer
Nov 14, 2011
10,411
5,677
136
I can see a bit of sense in most of the other deceptive offerings.

I can't see any sense in the 64-bit DDR4-2100 GT 1030 offering. No sense for Nvidia to make it, no sense for retailers to try to sell it, no sense for customers to buy it.
I don't see an upside.

DDR4-2100 RAM can't be that cheap, can it? Especially while DDR4 is otherwise very expensive.
Did they accidentally make a batch of extra-slow DDR4 chips they had to get rid of? They aren't even 2133.

If DDR4 is very expensive right now, you can bet that GDDR5 will be even more expensive.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
If DDR4 is very expensive right now, you can bet that GDDR5 will be even more expensive.
If there were a real price advantage to it, then we'd expect to see more of it on low-end cards, wouldn't we?
Shouldn't we have seen it earlier if it made the cards more profitable?

Is there a price advantage to the 64-bit DDR4 arrangement over at least a 128-bit one?
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
The scandal here is that reviewers totally didn't care in the last 10-15 years. It should've been called out much earlier.
10-15 years ago, the slow versions of one of these cards were still far faster than IGP, so there wasn't quite as much to raise a fuss about. With Intel's integrated video no longer sucking, and AMD's being competitive with sub-$100 cards, especially when given fast RAM, the value proposition of every current GeForce lower than a GTX 1050 is questionable. That is something very different from 10-15 years ago, when even the slow version offered good value compared to IGP.

I dislike the model-numbering tricks, but in the past I've bought some of the DDRx cards, because there wasn't any compelling reason to spend even $10 more, yet there were compelling reasons to buy a current-generation video card. Today, not so much.

You can be better off today, in bang/buck terms, by spending $20 more on RAM and completely forgoing a video card instead of buying a GT 1030 GDDR5, or by getting a few cheaper parts to afford a GTX 1050 or RX 560 on the same budget. The GT 1030 GDDR5 still has a place for upgrading a crappy desktop, but not really for a new build. The DDR4 version of it is not only a poor value as an upgrade to even Intel's IGP, but can be worse than not having a card at all against the R5 2400G, and it does not offer any special features to make up for it (for instance, several simultaneously usable DP ports).
 
Last edited:
  • Like
Reactions: coercitiv

whm1974

Diamond Member
Jul 24, 2016
9,436
1,569
126
It really looks pretty stupid to cripple a dGPU to the point that the iGPU on a $100 processor can outperform it by a good margin.
 
  • Like
Reactions: crisium and Cerb

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
You're not supposed to play games with the GTX 1050. Its sole purpose is the 2D desktop, and it does a great job at that. They put this junk in ready-made systems from Costco or Fry's and whatnot. It's a POS, or else it would be called 1080, hehe.
The GTX 1050 is a fine lower-end gaming card, and also a good low-end content-creation card. For the 2D desktop, IGPs from about the AMD 780 chipset and Intel's HD Graphics from Haswell onward will be more than enough, even for a few monitors. Substantially cheaper cards than even this GT 1030 DDR4 will do for other uses (like getting newer HDMI DRM support). The GT 1030 GDDR5 is an OK low-end card, capable of acceptable 720p gaming, even if it does compete with the latest IGPs.
 
  • Like
Reactions: SlowBox

whm1974

Diamond Member
Jul 24, 2016
9,436
1,569
126
The GTX 1050 is a fine lower-end gaming card, and also a good low-end content-creation card. For the 2D desktop, IGPs from about the AMD 780 chipset and Intel's HD Graphics from Haswell onward will be more than enough, even for a few monitors. Substantially cheaper cards than even this GT 1030 DDR4 will do for other uses (like getting newer HDMI DRM support). The GT 1030 GDDR5 is an OK low-end card, capable of acceptable 720p gaming, even if it does compete with the latest IGPs.
If I recall from reading the benchmarks, the 1030 GDDR5 has better graphics performance than the 2200G APU does, but the 2400G APU comes close enough that casual and low-end games don't need the low-end dGPU to run at playable framerates.

However, the 2400G is kind of an odd duck, given that it is around the same price as the 2600.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
If I recall from reading the benchmarks, the 1030 GDDR5 has better graphics performance than the 2200G APU does, but the 2400G APU comes close enough that casual and low-end games don't need the low-end dGPU to run at playable framerates.

However, the 2400G is kind of an odd duck, given that it is around the same price as the 2600.
That's because not using it, but wanting similar graphics performance, effectively adds another $100 to your costs (GT 1030 GDDR5 or RX 550) with a stronger/better CPU, and results in worse bang/buck with a weaker/cheaper CPU. The R5 2600 all of a sudden becomes a $300+ proposition against the <$200 2400G, and the 2400G often edges out the Core i3s in CPU performance. Why charge the same as a Core i3 if you can get more money than that, right?
 

whm1974

Diamond Member
Jul 24, 2016
9,436
1,569
126
That's because not using it, but wanting similar graphics performance, effectively adds another $100 to your costs (GT 1030 GDDR5 or RX 550) with a stronger/better CPU, and results in worse bang/buck with a weaker/cheaper CPU. The R5 2600 all of a sudden becomes a $300+ proposition against the <$200 2400G, and the 2400G often edges out the Core i3s in CPU performance. Why charge the same as a Core i3 if you can get more money than that, right?
Well, I'm thinking more along the lines of a midrange gaming system, which would have a dGPU anyway. For such builds the 2600 will make more sense than the 2400G, given that they are the same price and the 2600 has more cores and threads.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
Well, I'm thinking more along the lines of a midrange gaming system, which would have a dGPU anyway. For such builds the 2600 will make more sense than the 2400G, given that they are the same price and the 2600 has more cores and threads.
If you're planning on spending more than about $600 on the box, I don't think the 2400G makes sense. It makes sense when your other option for gaming is a Pentium, Core i3, or Ryzen 3 combined with a low-end dGPU, or if the faster options would necessitate too little RAM, a low-quality PSU, a tiny OS SSD, a lower-quality monitor, etc.
 

whm1974

Diamond Member
Jul 24, 2016
9,436
1,569
126
If you're planning on spending more than about $600 on the box, I don't think the 2400G makes sense. It makes sense when your other option for gaming is a Pentium, Core i3, or Ryzen 3 combined with a low-end dGPU, or if the faster options would necessitate too little RAM, a low-quality PSU, a tiny OS SSD, a lower-quality monitor, etc.
Around $600 sounds about right for a budget build using Windows:
https://pcpartpicker.com/list/7hRrcY
Although at that price, the 2200G APU sans video card does look tempting, depending on what games the user plays.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
Around $600 sounds about right for a budget build using Windows:
https://pcpartpicker.com/list/7hRrcY
Although at that price, the 2200G APU sans video card does look tempting, depending on what games the user plays.
Here's one with a 2400G, using an SSD and fast RAM. Windows omitted for easier comparison. Came out a little cheaper than I was expecting, actually.

PCPartPicker part list / Price breakdown by merchant
CPU: AMD - Ryzen 5 2400G 3.6GHz Quad-Core Processor ($154.99 @ Newegg Marketplace)
Motherboard: ASRock - AB350 Pro4 ATX AM4 Motherboard ($54.99 @ Newegg)
Memory: Team - T-Force Vulcan 8GB (2 x 4GB) DDR4-3200 Memory ($89.99 @ Newegg)
Storage: SanDisk - SSD PLUS 240GB 2.5" Solid State Drive ($53.99 @ Adorama)
Case: Corsair - SPEC-01 RED ATX Mid Tower Case ($34.99 @ Newegg)
Power Supply: Corsair - CXM (2015) 450W 80+ Bronze Certified Semi-Modular ATX Power Supply ($29.99 @ Newegg)
Total: $418.94
Prices include shipping, taxes, and discounts when available
Generated by PCPartPicker 2018-07-12 16:58 EDT-0400
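The quoted total checks out against the individual part prices; a quick sum as a sanity check (prices as listed in the post):

```python
# Re-add the individual part prices quoted above to verify the $418.94 total.
prices = [
    154.99,  # Ryzen 5 2400G
    54.99,   # ASRock AB350 Pro4
    89.99,   # T-Force Vulcan 2x4GB DDR4-3200
    53.99,   # SanDisk SSD PLUS 240GB
    34.99,   # Corsair SPEC-01 case
    29.99,   # Corsair CXM 450W PSU
]
print(round(sum(prices), 2))  # 418.94
```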
 

whm1974

Diamond Member
Jul 24, 2016
9,436
1,569
126
Here's one with a 2400G, using an SSD and fast RAM. Windows omitted for easier comparison. Came out a little cheaper than I was expecting, actually.

PCPartPicker part list / Price breakdown by merchant
CPU: AMD - Ryzen 5 2400G 3.6GHz Quad-Core Processor ($154.99 @ Newegg Marketplace)
Motherboard: ASRock - AB350 Pro4 ATX AM4 Motherboard ($54.99 @ Newegg)
Memory: Team - T-Force Vulcan 8GB (2 x 4GB) DDR4-3200 Memory ($89.99 @ Newegg)
Storage: SanDisk - SSD PLUS 240GB 2.5" Solid State Drive ($53.99 @ Adorama)
Case: Corsair - SPEC-01 RED ATX Mid Tower Case ($34.99 @ Newegg)
Power Supply: Corsair - CXM (2015) 450W 80+ Bronze Certified Semi-Modular ATX Power Supply ($29.99 @ Newegg)
Total: $418.94
Prices include shipping, taxes, and discounts when available
Generated by PCPartPicker 2018-07-12 16:58 EDT-0400
Personally, I would go for a 500GB SSD, but then again, the money I save by using Linux instead of Windows can be put toward that.