nVidia and ATI just might be out of business by 2016


LOUISSSSS

Diamond Member
Dec 5, 2005
8,771
58
91
Nobody in the world can make predictions about what exactly will happen 6 or 10 years from now. If you're going to try (like the OP), then you're an IDIOT.

And btw, that OP was some of the worst broken English on this forum. I know foreigners who speak/write better English.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
I have actually read that Intel will one day have a chip that does GPU and CPU. Then we won't need a GPU card.

Well ATI has this coming next year in the form of "Fusion".

However, I am still trying to figure out how ATI will integrate Fusion with Discrete cards.

Shared frame rendering as a way to reduce input lag and increase FPS over a larger Nvidia discrete video card used by itself?
 
Last edited:

cbn

Lifer
Mar 27, 2009
12,968
221
106
Theoretically speaking, a CPU and a GPU from the same time period can perhaps be combined into a single unit, but as two separate units they will perform their respective jobs much more efficiently.

Speaking of efficiency...

Do HPC tasks require the same amount of memory bandwidth as a high-end graphics card? Could some money/energy/space be saved by getting rid of the extra video card memory chips, PCBs, etc.?
 
Last edited:

faxon

Platinum Member
May 23, 2008
2,109
1
81
He has admitted he has some mental health issues. I would just shrug it off, tbh. He's trying to bring up a valid discussion, but because his perception of the world is different, he processes the info differently, making it harder for us to understand. I got the idea of what he was saying, and so did some others. These threads are always good discussions though :)

As far as the topic goes, yeah, a lot of the patents expire soon, but there's always newer tech being patented as well, so you still have to work around that. Look at x86 as an architecture, from pre-x86 chips to now, and you will see my point. Between AMD and Intel they have the whole tree covered, and most of the patents you have to deal with today are from within the last couple of years. If it were as simple as letting patents expire, there would be tons of companies producing cheap, obsolete x86 chips about as good as an overglorified Pentium 3, but without all of the SSE extensions and whatnot.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
The CPU portion is 32nm, the GPU + NB portion is on 45nm.
Surprise, surprise, it's bigger.


OK, like Tweaky here, you're confused. The i3 is a 32nm CPU with an added 45nm IGP. Sandy is one chip; it's all 32nm. There's a pic in the wild. Sandy's GPU, or whatever it is, uses the L3 cache on Sandy; the full L3 cache is shared. It will be interesting. As for performance, that's a wait-and-see, but I am sure all sorts of claims will be made.
 
Last edited:

SHAQ

Senior member
Aug 5, 2002
738
0
76
Guys...guys...I think you are underestimating the OP's source in silicon valley. He knew about the 5890 before it even came out!
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
By 2020, expect the majority of computing to be cloud-based, so what they use will not matter.
 

alyarb

Platinum Member
Jan 25, 2009
2,425
0
76
Well, I guess the majority of simple computing could be done remotely. High-performance computing will remain on site. As far as rendering goes, we would need hundred-gigabit internet service to almost every high-performance video device by 2020.

3D rendering and display technology in the '90s and early 21st century has grown much faster than the consumer's access to internet infrastructure over that same timeframe. Since the first cable broadband service became available, most subscribers got to 8 or 10 megabits and it has pretty much stayed that way, unless you lived in a weird town and had to suffer with DSL. Some people can get 20 Mbps with high-speed cable, FiOS, etc., but they are a minority, and either way network performance still pales in comparison to the growth of display technology and the performance of the interconnects required to send a high-resolution video signal. The cable in my city tops out at 1.2 MB/s during ideal low-usage conditions.

Some time between now and 2020, this would have to change so that, year over year, our internet service is getting faster than our video standard. Will HDMI and DisplayPort still be in use in 2020, or will we have something else because we have to run our new 16-megapixel AMOLED television? Then you would be talking 50 Gb/s of video bandwidth alone. How fast will lossless compression improve in 10 years? Maybe in ten years they will be streaming lossless 1080p output, and that would be cool for video and game services, but there will be a high-performance standard that must be handled locally.
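As a sanity check on that 50 Gb/s figure, here's a quick back-of-the-envelope sketch. The color depth and refresh rate are my assumptions; the post above only gives the 16-megapixel count:

```python
# Rough uncompressed-bandwidth estimate for a hypothetical 16-megapixel
# display. 30-bit color and 120 Hz are assumptions, not figures from
# the post above.
pixels = 16_000_000   # 16-megapixel panel
bits_per_pixel = 30   # 10 bits per RGB channel (assumed)
refresh_hz = 120      # assumed refresh rate

raw_gbps = pixels * bits_per_pixel * refresh_hz / 1e9
print(f"Uncompressed video bandwidth: ~{raw_gbps:.1f} Gb/s")
# prints ~57.6 Gb/s, in the same ballpark as the ~50 Gb/s quoted above
```

At 24-bit color and 60 Hz the same math gives only about 23 Gb/s, so the quoted figure implicitly assumes a fairly aggressive refresh rate or color depth.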
 
Last edited:

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
So-called "cloud" computing really isn't. It's just another way of saying "client-server".

TRUE "cloud computing", would involve a singular distributed OS, that runs on BOTH the client and the server, allowing apps, data, resources, etc., to be seamlessly migrated and/or shared.
 

jihe

Senior member
Nov 6, 2009
747
97
91
Looks like Intel hasn't learned how to do GPU.


I have actually read that Intel will one day have a chip that does GPU and CPU. Then we won't need a GPU card. By the way, this might happen in 2020 also, but I know for a fact, through a source I have in Silicon Valley, that Intel has many plans, and one day their chips will do 3D as well.
I've read many different dates, like:

2016 (what the thread title is), 2020, and even as early as 2014, but I doubt that one.

I would say nVidia and ATI might be in deep jeopardy when Intel learns how to do GPU as well. By that time we would not even use a PCIe slot for the video card, as the CPU has the GPU and it's faster than anything nvidia or ati have created. Anyhow, that is when I'm upgrading next: CPU and GPU on the same chip. :D Until then, CPUs will run 3DMark 3D tests @ 1 fps lol
 

exdeath

Lifer
Jan 29, 2004
13,679
10
81
Never.

If a chip with one trillion transistors has an integrated CPU and GPU, then a separate CPU and GPU with two trillion transistors on two separate chips will still be faster.
 

Sylvanas

Diamond Member
Jan 20, 2004
3,752
0
0
I saw this thread title and thought 'that's ridiculous' then I saw it was a Tweakboy thread so I clicked it just for the lols.

It was worth it.
 

manimal

Lifer
Mar 30, 2007
13,559
8
0
In 2016 we're gonna have backpacks with jets on them. Of course, they will have Matrox Parhelia cards running them.
 

zsdersw

Lifer
Oct 29, 2003
10,505
2
0
I have to wonder about the future of the desktop PC as we know it today, at least in the everyday consumer space. Smartphones and other portable computing devices (and their successors) seem to be the future for consumers, maybe even the corporate world. As smartphones/MIDs/portable computing devices advance and become capable of doing more, the necessity of having a desktop computer (or even a traditional laptop) becomes less and less.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
I saw this thread title and thought 'that's ridiculous' then I saw it was a Tweakboy thread so I clicked it just for the lols.

It was worth it.

lol, ditto, which makes it a double lol

I just had a feeling I would find epic fail inside, and I was not disappointed.
 

LoneNinja

Senior member
Jan 5, 2009
825
0
0
I have to wonder about the future of the desktop PC as we know it today, at least in the everyday consumer space. Smartphones and other portable computing devices (and their successors) seem to be the future for consumers, maybe even the corporate world. As smartphones/MIDs/portable computing devices advance and become capable of doing more, the necessity of having a desktop computer (or even a traditional laptop) becomes less and less.

Until a smartphone gets some sort of 15"+ expandable screen and a full keyboard I can actually type on, I won't use one. I've used a BlackBerry (my ex-gf has one) and I hated that thing. I've also messed with the iPhone and some other recent touchscreen phone that T-Mobile just released, whose name I don't remember; I don't like either of those either. I still stick to a desktop over a laptop, simply because I enjoy the extra horsepower for less money, and my 32" screen. I'm sure desktop market share will continue to decline as the laptop market grows, but toys like smartphones and the iPad are far from replacing the PC. And most anyone who needs to do real work on a computer will still have a desktop/workstation; it's not like Valve will be making games on a phone. lol
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
He has admitted he has some mental health issues. I would just shrug it off, tbh. He's trying to bring up a valid discussion, but because his perception of the world is different, he processes the info differently, making it harder for us to understand. I got the idea of what he was saying, and so did some others. These threads are always good discussions though :)

That means he has a TLB issue that will cause data corruption when overloaded and thus will lower his performance below that of a typical architecture, due to his superscalar approach. We need better compilers so we can understand his language better and maximize our execution-engine utilization. :awe:
 

zsdersw

Lifer
Oct 29, 2003
10,505
2
0
Until a smartphone gets some sort of 15"+ expandable screen and a full keyboard I can actually type on, I won't use one. I've used a BlackBerry (my ex-gf has one) and I hated that thing. I've also messed with the iPhone and some other recent touchscreen phone that T-Mobile just released, whose name I don't remember; I don't like either of those either. I still stick to a desktop over a laptop, simply because I enjoy the extra horsepower for less money, and my 32" screen. I'm sure desktop market share will continue to decline as the laptop market grows, but toys like smartphones and the iPad are far from replacing the PC. And most anyone who needs to do real work on a computer will still have a desktop/workstation; it's not like Valve will be making games on a phone. lol

Think a bit further ahead than the existing limitations and inadequacies, and your own personal preferences. I'm talking about the long term.