The bottom end: AMD accepting Intel is good enough? (+Poll)


Sub $100 GPUs - would you?

  • I already have bought one recently

  • I would never buy a sub-$100 GPU

  • I would buy sub-$100 if making a low power comp/HTPC (even if the CPU had integrated graphics)

  • Don't know

  • Other

  • Integrated is good enough for that sort of computer



RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
If you don't, the current crop of Integrated graphics are just fine.

If you do, you're not doing yourself any favors with a <$100 card.

For casual gamers on a budget, IVB is going to bring a decent increase in performance over SB.

Performance is pretty good considering it's "free".

2012-02-21-image-6.png

Source

I'd even argue that IVB will already make the HD7750 somewhat questionable at $100 unless a person needs specific features of GCN for HTPC, etc. For people who don't play at higher resolutions with DX11 settings on, or who don't play the most modern games, I don't think something like an HD7750 is even worth $100. IVB will also support 3 displays and even faster video encoding. Also, as the industry further embraces the APU path, I would expect fairly large performance increases from one APU generation to the next, especially compared to what appear to be marginal upgrades in the $100-125 GPU space. It took AMD almost 2.5 years to introduce the HD7770 @ $159, which is barely 20-25% faster than an $80 HD5770; but Haswell's APU will probably double the performance of IVB's APU in Q2 of 2013. The future is much brighter for APU performance increases, while performance at the $100-125 level is pretty stagnant at the moment for discrete GPUs.

AMD's Kaveri looks to be as fast as an HD7750 by next year. Unless these estimates are way off or we see much more powerful entry-level GPUs, I'd imagine <$100 GPUs are more or less irrelevant. NV and AMD would need to seriously increase the performance offered at the $50-100 level for those cards to make any reasonable sense. I think they'd need HD6850 level of performance at minimum for $80 and HD6870 at $100 by next year to sway casual gamers to upgrade from the APU.
 
Last edited:

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
I think this picture paints a better view of the lower end graphics landscape:

vsgt240.png


Ivy Bridge is pretty well nestled below mainstream dedicated graphics. It's also only at 1280x720 resolution, though looking at the 3DMark scores versus the actual gaming scores, I wonder why the HD Graphics parts are so strongly positioned. Perhaps it's not so dependent on memory and memory bandwidth? Intel has to increase their memory bandwidth potential, by raising the supported DDR3 speeds or by adding another memory channel, in order to keep scaling the actual graphics capability. Main processor TDP is another potential problem, especially in mobile systems. I've had my own recent "fun" with Intel's Turbo Boost BS throttling my system, and that was while not even using the on-board IGP (lappy in sig).

Whatever the case, lower-end dedicated graphics still has a niche. I would like to see how the scores scale as the resolution is increased, too. I also think computer noobs or cheap asses, whichever the case, would eventually get fed up with the issues involved with IGP gaming: low relative performance and processor throttling. It's only too bad you're stuck with the components for the most part unless you do some (typically) risky upgrading. If only USB 3.0 and other high-speed connections were more prevalent and faster for external graphics.

Ivy Bridge sits in the performance area of the 8600 GT, a mainstream/mid range gaming card from 5 years ago.

How could dedicated GPU makers continue to make smaller/cheaper dedicated graphics processors that can really compete (at least in desktops)? Pull an Intel and ramp up the graphics clock. HD 4000 seems to be right at the level of AMD's Caicos (Radeon 6450). Something like Caicos on 28 nm could definitely be pushed into 1100 or 1200 MHz territory, keeping it useful as a cheap add-in graphics processor while being significantly more powerful than IB.

Perhaps for the HD 8000 line, AMD could start completely fresh, with the GCN architecture across the entire board. The lowest end should be a 256:16:8 configured GPU @ 1 GHz max with a 128-bit memory interface. It could replace Caicos, Redwood and Turks in one fell swoop and raise what to expect out of the low end.
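Just as a rough sanity check on what a part like that would mean on paper, here's a quick back-of-envelope sketch (purely hypothetical numbers: I'm assuming GCN-style 2 FLOPs per ALU per clock and GDDR5 at 4.5 Gbps on that 128-bit bus, neither of which is anything AMD has announced):

```python
# Back-of-envelope theoretical throughput for a hypothetical 256:16:8 part.
# Assumptions (not AMD specs): 2 FLOPs per ALU per clock (fused multiply-add),
# 1.0 GHz core clock, 128-bit bus, GDDR5 at an assumed 4.5 Gbps per pin.
core_clock_ghz = 1.0
alus, tmus, rops = 256, 16, 8
bus_width_bits = 128
mem_rate_gbps = 4.5  # assumed effective data rate per pin

gflops = alus * 2 * core_clock_ghz                # single-precision GFLOPS
texel_rate = tmus * core_clock_ghz                # GTexels/s
pixel_rate = rops * core_clock_ghz                # GPixels/s
bandwidth = (bus_width_bits / 8) * mem_rate_gbps  # GB/s

print(f"{gflops:.0f} GFLOPS, {texel_rate:.0f} GT/s, "
      f"{pixel_rate:.0f} GP/s, {bandwidth:.0f} GB/s")
# -> 512 GFLOPS, 16 GT/s, 8 GP/s, 72 GB/s
```

That's roughly Redwood-class in raw FLOPS, though GCN tends to get more out of each FLOP than VLIW5, which is what would make a one-chip-replaces-three part plausible.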
 
Last edited:

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
Of course APUs will eventually settle into their niche and a certain segment of the GPU market will cease to exist. It hasn't quite happened yet, though; the market still needs to be saturated. Based on what I've seen, I don't think that will quite reach the $100 level. More like $75. Any advance that pushes forward APUs will probably push forward contemporary discrete GPUs as well.
 

palladium

Senior member
Dec 24, 2007
539
2
81
I have a laptop with an i3-2100M with HD3000 and a Radeon HD 6770 (switchable gfx). I only use the Intel IGP for 2D work; for anything 3D (even at 1366x768) I switch to the Radeon. Heck, the IGP can't max out Warcraft 3 at 1366x768 while maintaining 60fps (in DotA).
 

T_Yamamoto

Lifer
Jul 6, 2011
15,007
795
126
I have a laptop with an i3-2100M with HD3000 and a Radeon HD 6770 (switchable gfx). I only use the Intel IGP for 2D work; for anything 3D (even at 1366x768) I switch to the Radeon. Heck, the IGP can't max out Warcraft 3 at 1366x768 while maintaining 60fps (in DotA).

Well... it is an iGPU
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
The $100 GPU is exactly the sort of thing I look for when getting a laptop - the kids do game on it, but I'm not willing to pay the extortionate "gaming" laptop prices. Hence the $100 GPU market tends to be just the one I look at - significantly faster than an IGP (and it always will be, due to the separate memory + bus) but still affordable.
 

Olikan

Platinum Member
Sep 23, 2011
2,023
275
126
You either care about games or you don't.

If you don't, the current crop of Integrated graphics are just fine.

If you do, you're not doing yourself any favors with a <$100 card.

In the few rare situations where I need a PCIe video card that isn't particularly powerful, I'll just use an older/retired card like a ~9800GT.

Hi, I am a CAD user...

I have a Radeon 4650, and I bought it last year... super cheap :thumbsup:... I am not using Sandy Bridge right now, just because Intel's OpenGL is broken.

Actually, this is the reason that sub-$100 cards exist..... Intel's poor drivers
 

blckgrffn

Diamond Member
May 1, 2003
9,687
4,348
136
www.teamjuchems.com
I buy plenty of sub $100 GPUs.

I'd say that it is rare that I pay more than $100 for one. Call me cheap if you want, but the utility of higher-end cards is rather low for people who rarely bother to mess with settings beyond getting their native resolution dialed in. GTX 275s (260, 285, etc.)? GTS 250? 5770/6770? 4850/4870? They get the job done just fine for many and are probably still pretty powerful compared to what most people use.

TF2 and SC2 aren't really that taxing.

Typically they are super cheap AMIR and a generation or two old, but given the performance "progress" made of late they are still very relevant. As long as you buy a PSU that can handle them (and if you are buying a decent PSU, a single card south of those "two GPUs on one PCB" monsters should not be an issue), you can be gaming with reasonable performance for not much dough.

I've been particularly pleased with my little 6570 as a new purchase. Less than $50, sips power, good features, and enough oomph to play just about anything so long as you turn the details down. As a secondary card it is typically for LAN parties, when it is more about having fun than inspecting walls and flora.

Recently (as in yesterday) I started spec'ing out a Llano-based system for a friend who wanted a PC to play games on sometimes, but really just to be well rounded and reasonably priced. It just didn't make sense; I ended up looking at getting a 5670 w/GDDR5 instead, as the motherboards are just too pricey to go with the APUs, IMHO.
 
Last edited:

Concillian

Diamond Member
May 26, 2004
3,751
8
81
Want to bet that it's not going to happen until DDR4? Juniper has 76GB/s of bandwidth. How do you expect an integrated GPU, with less than half the bandwidth that it has to share with the CPU, to achieve the same performance?

Trinity is supposed to be quad channel DDR3, which would be 51.2 GB/sec for DDR3-1600, 60 GB/sec at 1866 and 68 at 2133.

Given how well they did with GCN on the same bandwidth moving from 5770 to 7770, it seems at least possible to get 5770 level performance or close to it on quad channel DDR3. Don't need to wait for DDR4.
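For anyone who wants to check those figures, a quick sketch of the peak-bandwidth math (each DDR3 channel is 64 bits wide; this is theoretical peak only, and the CPU has to share it):

```python
# Peak theoretical DDR3 bandwidth = channels * 8 bytes per transfer * MT/s.
def ddr3_bandwidth_gbs(channels, mts):
    return channels * 8 * mts / 1000  # GB/s

for mts in (1600, 1866, 2133):
    print(f"quad-channel DDR3-{mts}: {ddr3_bandwidth_gbs(4, mts):.1f} GB/s")
# -> 51.2, 59.7 and 68.3 GB/s; still short of Juniper's 76.8 GB/s,
#    but in the same ballpark.
```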
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
Trinity is supposed to be quad channel DDR3, which would be 51.2 GB/sec for DDR3-1600, 60 GB/sec at 1866 and 68 at 2133.

Given how well they did with GCN on the same bandwidth moving from 5770 to 7770, it seems at least possible to get 5770 level performance or close to it on quad channel DDR3. Don't need to wait for DDR4.

Trinity quad channel? Seriously? Where did you get that information? No way a low-cost platform is going to be quad channel. Everything I've read so far points to dual channel.

AMD_Trinity_1M.jpg


See those 2 DDR3 DIMMs? The only improvement in the memory department is support for DDR3-2133.
 
Last edited:

Mopetar

Diamond Member
Jan 31, 2011
8,523
7,782
136
That may be part of the reason, but an even bigger factor is that older products become the sub-$100 card segment as new products are released and the old ones are discounted. Newegg has a GeForce 210 for $20, 5450s run for as low as $30, and a 6450 goes for about $40. I also found a GeForce GT 520 going for $30, and a 6570 going for $50. There was also a Radeon 7xxx card for under $100; well, sort of.

There are still plenty of good-value cards out there for under $100. Right now, with the new process, AMD and nVidia are trying to get as many of their mainstream and other high-value cards to market as they can. Those areas account for more revenue and profit than the lowest of the low end.
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
My music jukebox has an i3-2100 with HD2000. It's all the GPU power I need for normal windows use and media playback.

When the GTX 560 Ti in my gaming system was off being RMA'd, it was good enough for even light gaming (Half-Life 2, Torchlight at 1024x768), and I did not see any "driver issues" in 40+ hours of playing Torchlight.

Since IVB will be better than HD2000, sub-$100 cards make no sense for most users.
 

blckgrffn

Diamond Member
May 1, 2003
9,687
4,348
136
www.teamjuchems.com
Trinity quad channel? Seriously? Where did you get that information? No way a low-cost platform is going to be quad channel. Everything I've read so far points to dual channel.

See those 2 DDR3 DIMMs? The only improvement in the memory department is support for DDR3-2133.

Hah, I wish too. I'd even settle for a triple-channel part on the desktop. I mean, if they are truly going to an APU-only future (are they?), then they need to get serious.

If they made a 4-module/960-shader/quad-channel part, would we consider that "high end" or would we laugh at them? I suppose it would come down to price...
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
Hah, I wish too. I'd even settle for a triple-channel part on the desktop. I mean, if they are truly going to an APU-only future (are they?), then they need to get serious.

If they made a 4-module/960-shader/quad-channel part, would we consider that "high end" or would we laugh at them? I suppose it would come down to price...

It's really less about channels than the overall bandwidth. Until we have DDR4, more channels and/or higher-frequency DDR3 is what we have today. That said, Intel has said they prefer a channel per 2 cores. We could see triple-channel mainstream setups down the road, but that brings extra costs to many OEMs: it means 3 DIMMs per standard build, and that is more components. I see higher-bandwidth dual-channel continuing as 'the norm' for some time.
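To put rough numbers on that tradeoff, here's the same peak-bandwidth math as above (the exact configs are just illustrative):

```python
# Channels vs. frequency: either route raises peak DDR3 bandwidth.
def ddr3_bandwidth_gbs(channels, mts):
    return channels * 8 * mts / 1000  # 64-bit (8-byte) bus per channel

configs = [("dual-channel DDR3-1600", 2, 1600),
           ("dual-channel DDR3-2133", 2, 2133),
           ("triple-channel DDR3-1600", 3, 1600)]
for name, channels, mts in configs:
    print(f"{name}: {ddr3_bandwidth_gbs(channels, mts):.1f} GB/s")
# -> 25.6, 34.1 and 38.4 GB/s: faster dual-channel DDR3 closes much of the
#    gap without the extra DIMM and board traces a third channel would need.
```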
 

LoneNinja

Senior member
Jan 5, 2009
825
0
0
Sorry if this has already been addressed as I didn't read the whole thread.

AMD hasn't dropped the sub-$100 GPU market; they just don't intend to release anything below the 7750 on the new architecture at this time. They wanted to keep Trinity Crossfire-capable with their low-end 7000 line, which was only possible by using the old architecture found in the 6000 series, since that is what Trinity's GPU is based on. Besides, 28nm is new and there are bound to be production capacity constraints; why cause a shortage of higher-end 7000 parts when they're still manufacturing most of their 40nm 6000 series parts?

To those who don't know, the 7670 has already been released; it's a 6670 rebadge currently available only to OEMs. Just like the 6770 was OEM-only at launch, I'm sure we'll be seeing the 7450, 7670, and other rebadged parts in retail later this year.
 

Homeles

Platinum Member
Dec 9, 2011
2,580
0
0
AMD hasn't dropped their sub $100 GPUs. They're still available to OEMs as rebranded 6xxx parts (which are upclocked rebrands themselves). Pretty much what the poster above me said.

There really isn't a point to creating 28nm versions of these cards either - 28nm is too expensive currently, and there simply isn't enough wafer production to justify adding more designs to the already overbooked HKMG process.

I'm sure we'll see new designs with the 8000 series though... it's been far too long since the last time AMD has given love to the low end GPU market.
 

pelov

Diamond Member
Dec 6, 2011
3,510
6
0
Is a 5750 considered a sub-$100 GPU? I have one. Those things rock. High settings on nearly all games at 720p, or medium at x900. No current CPU/APU from either Intel or AMD can match that performance or get anywhere close.

The poll should be <$60, not $100. Also, I'd imagine nVidia would be losing the most from the cannibalizing of the low end. They've got no x86 license, and even though Intel holds the majority of the market share, at least AMD has products out there. With Trinity they should be within 6670 range on an APU, and we'll see where IB goes. And as others have stated, as 28nm gets cheaper to produce and yields go up, we may be seeing some low-end stuff on that node as well.
 

blckgrffn

Diamond Member
May 1, 2003
9,687
4,348
136
www.teamjuchems.com
It's really less about channels than the overall bandwidth. Until we have DDR4, more channels and/or higher-frequency DDR3 is what we have today. That said, Intel has said they prefer a channel per 2 cores. We could see triple-channel mainstream setups down the road, but that brings extra costs to many OEMs: it means 3 DIMMs per standard build, and that is more components. I see higher-bandwidth dual-channel continuing as 'the norm' for some time.

True, but a bummer. Let's hope they figure out DDR4 and get it out in time for the GCN APU refresh. If I am dreaming about more channels, I might as well throw DDR4 in, right? :)

I also want to be able to put more than 16GB of RAM cheaply into a box.

That puts me officially on the hunt for a used 1366 platform, I suppose... Or I could drop $200 in RAM and probably/maybe go to 32GB for my ESXi box. The site says 32GB, the manual says 16GB; what to believe?

Anyway, I've also read that AMD could extract more bandwidth from their DDR3 controller. Let's hope they do that as well.
 

videoclone

Golden Member
Jun 5, 2003
1,465
0
0
Sorry if this has already been addressed as I didn't read the whole thread.

AMD hasn't dropped the sub-$100 GPU market; they just don't intend to release anything below the 7750 on the new architecture at this time. They wanted to keep Trinity Crossfire-capable with their low-end 7000 line, which was only possible by using the old architecture found in the 6000 series, since that is what Trinity's GPU is based on. Besides, 28nm is new and there are bound to be production capacity constraints; why cause a shortage of higher-end 7000 parts when they're still manufacturing most of their 40nm 6000 series parts?

To those who don't know, the 7670 has already been released; it's a 6670 rebadge currently available only to OEMs. Just like the 6770 was OEM-only at launch, I'm sure we'll be seeing the 7450, 7670, and other rebadged parts in retail later this year.

32nm Trinity will be made by GlobalFoundries and will use the newest GCN HD7000 series for its IGP...
"That's from AMD Direct"

so I'm not sure what you're talking about!
 

Obsoleet

Platinum Member
Oct 2, 2007
2,181
1
0
For casual gamers on a budget, IVB is going to bring a decent increase in performance over SB.

Performance is pretty good considering it's "free".

2012-02-21-image-6.png

Source

I'd even argue that IVB will already make the HD7750 somewhat questionable at $100 unless a person needs specific features of GCN for HTPC, etc. For people who don't play at higher resolutions with DX11 settings on, or who don't play the most modern games, I don't think something like an HD7750 is even worth $100. IVB will also support 3 displays and even faster video encoding. Also, as the industry further embraces the APU path, I would expect fairly large performance increases from one APU generation to the next, especially compared to what appear to be marginal upgrades in the $100-125 GPU space. It took AMD almost 2.5 years to introduce the HD7770 @ $159, which is barely 20-25% faster than an $80 HD5770; but Haswell's APU will probably double the performance of IVB's APU in Q2 of 2013. The future is much brighter for APU performance increases, while performance at the $100-125 level is pretty stagnant at the moment for discrete GPUs.

AMD's Kaveri looks to be as fast as an HD7750 by next year. Unless these estimates are way off or we see much more powerful entry-level GPUs, I'd imagine <$100 GPUs are more or less irrelevant. NV and AMD would need to seriously increase the performance offered at the $50-100 level for those cards to make any reasonable sense. I think they'd need HD6850 level of performance at minimum for $80 and HD6870 at $100 by next year to sway casual gamers to upgrade from the APU.

Those IB numbers are pretty impressive IMO. Considering it's sitting on the same chip as the CPU and it's all in one, I think it's pretty decent. Good enough for me and most of the games I'd play on a laptop, probably even a bit faster than I need. That's 720p, which is high enough res for me.

I might not be happy with that at the desktop level, but in 1 or 2 generations I plan on not buying PCIE GPUs at all. That 1-2 generations probably only applies to AMD integrated, but hopefully Intel will be right behind them.
 

pelov

Diamond Member
Dec 6, 2011
3,510
6
0
32nm Trinity will be made by GlobalFoundries and will use the newest GCN HD7000 series for its IGP...
"That's from AMD Direct"

so I'm not sure what you're talking about!

Well that was an interesting twist... During a talk on the next generation of GPU technology at the AMD Fusion Developer Summit, one of the engineers was asked about Trinity, the next APU to be released in 2012 (and shown running today for the very first time). It was offered that Trinity in fact used a VLIW4 architecture rather than the VLIW5 design found in the just released Llano A-series APU.

http://www.pcper.com/news/Graphics-...rinity-APU-will-use-VLIW4-Cayman-Architecture

What we know for a fact is that Trinity – the 2012 Bulldozer APU – will not use GCN, it will be based on Cayman’s VLIW4 architecture.

http://www.anandtech.com/Show/Index...-core-next-preview-amd-architects-for-compute

Trinity is VLIW4. Any rumors stating it's GCN are just rumors =P
 

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
Wait, Trinity will use VLIW4? Wasn't the 6900 series the only one to use VLIW4? So there is no budget-level VLIW4 GPU chip already in existence; they had to make one from scratch and put it in Trinity. It seems to me like it would have been easier to take the Turks VLIW5 GPU and put it in Trinity -- you don't have to spend R&D money adapting an architecture that is already obsolete, but you still get improved performance over Llano and its Redwood-based GPU.