Speculation: AMD's 50-year anniversary


What remarkable things are AMD likely to do in 2019?


  • Total voters
    42
  • Poll closed.

scannall

Golden Member
Jan 1, 2012
1,333
147
136
#26
I think the next innovation will be to incorporate the memory into the chip, which will free up the mobo real estate for other uses. And overclocking speeds up everything. Five GHz 64GB anyone?
There may be a few steps in between. ;-) On high-end chips, anyway, 4 GB of HBM2 or HBM3 as an L4 cache would speed things up nicely.
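A back-of-envelope way to see why an HBM L4 cache would help is the standard average-memory-access-time (AMAT) formula. This is just a sketch; the latency and hit-rate numbers below are illustrative assumptions, not measured figures for any real part.

```python
# AMAT sketch for an assumed on-package HBM L4 cache.
# All numbers are illustrative assumptions, not vendor specs.

DRAM_LATENCY_NS = 80.0   # assumed off-package DDR latency
HBM_LATENCY_NS = 40.0    # assumed on-package HBM latency
L4_HIT_RATE = 0.7        # assumed fraction of LLC misses served by the HBM L4

def amat_with_l4(hit_rate, hbm_ns, dram_ns):
    """Average latency seen by a last-level-cache miss:
    hits are served by HBM; misses pay HBM lookup plus DRAM."""
    return hit_rate * hbm_ns + (1 - hit_rate) * (hbm_ns + dram_ns)

without_l4 = DRAM_LATENCY_NS
with_l4 = amat_with_l4(L4_HIT_RATE, HBM_LATENCY_NS, DRAM_LATENCY_NS)
print(f"miss latency without L4: {without_l4:.0f} ns")
print(f"miss latency with L4:    {with_l4:.0f} ns")
```

With these assumed numbers the average miss cost drops from 80 ns to 64 ns; the real win depends entirely on the hit rate your workload achieves.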
 

Thunder 57

Senior member
Aug 19, 2007
617
59
136
#27
Yes, for a reason. When I first started my second IT career, the company was very large (one of its 10 data centers had over one square MILE of floor space). This was in 2002, when the Opteron used less power, was cheaper, and performed better than its Intel counterpart. But when I asked why we did not buy them, upper management at the data center said "we only use REAL CPUs, Intel". That kind of stupidity is still around. The EPYC processor is less expensive than its Intel counterpart, performance across different situations is about even, and I think they use less power. And that's before 7nm, which is sampling now. But will they gain market share?
To be fair, this was when AMD had no track record in the server world. The chipsets of the day were probably questioned as well. I'd like to think it would be different these days, with pretty much everything being on-chip, plus AMD's proven CPU and chipset designs that kept them going until the Bulldozer days.

That said, I'm sure there were a lot of idiots saying stupid things like that with no knowledge of the subject to back it up.
 
Apr 8, 2002
40,923
65
126
#28
Man this made me feel old. 20 years since Athlon?
 

ericlp

Diamond Member
Dec 24, 2000
5,999
48
106
#31
Pretty expensive to do....
Yes it is, but, like anything, I think the gains are worth it.

Once they start making them for the masses, the cost will come down. When everyone sees the speed and efficiency gains, there will be no going back. The only drawback is that upgrading the RAM, or moving to a faster GPU, would be impossible without replacing the entire silicon chip.
 

whm1974

Diamond Member
Jul 24, 2016
7,405
473
96
#32
Yes it is, but, like anything, I think the gains are worth it.

Once they start making them for the masses, the cost will come down. When everyone sees the speed and efficiency gains, there will be no going back. The only drawback is that upgrading the RAM, or moving to a faster GPU, would be impossible without replacing the entire silicon chip.
This may end up just being suitable only for certain niches as there will be very limited space for memory and GPU, if it is viable at all.
 

maddie

Platinum Member
Jul 18, 2010
2,442
371
136
#33
This may end up just being suitable only for certain niches as there will be very limited space for memory and GPU, if it is viable at all.
No rule says that all the memory has to be integrated.
 

ericlp

Diamond Member
Dec 24, 2000
5,999
48
106
#35
What if... Theoretically, say you bought your chips in modules. The base chip would be, say, a 2400G: 4 cores, 8 threads, with a GPU and 16 GB of RAM in the package...

If you found the system was too slow for your needs and wanted 32 GB of RAM, you could just take another 2400G base chip and slap it on; now you've got your 32 GB, 8-core/16-thread CPU for a $300 upgrade. I think this would make much more sense than just adding a PCB with empty RAM slots that would only SLOW everything down.

I think "niches" and limited space are the last things you'd want to worry about when integrating RAM into the CPU package. It would speed up RAM tenfold, and latency would be basically ZERO. Doing this may be the key to making real performance for AI chips a reality. AMD isn't the only one working on this idea.

Look here.

https://www.theverge.com/circuitbre...smallest-computer-grain-of-salt-solar-powered

According to IBM, by integrating RAM and CPU, power consumption could be cut by over 80x.
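The power argument comes down to data-movement energy: driving bits across a motherboard costs far more energy per bit than moving them within a package. A rough sketch, using illustrative picojoule-per-bit figures of roughly the right order of magnitude (assumptions, not measurements from IBM or anyone else):

```python
# Rough data-movement energy comparison.
# The pJ/bit figures are assumed order-of-magnitude values, not vendor specs.

PJ_PER_BIT_OFF_PACKAGE = 20.0  # assumed cost to reach off-package DRAM
PJ_PER_BIT_ON_PACKAGE = 1.0    # assumed cost for on-package/stacked memory

def joules_to_move(gigabytes, pj_per_bit):
    """Energy in joules to move the given amount of data."""
    bits = gigabytes * 8e9
    return bits * pj_per_bit * 1e-12

gb = 100  # move 100 GB of data
off = joules_to_move(gb, PJ_PER_BIT_OFF_PACKAGE)
on = joules_to_move(gb, PJ_PER_BIT_ON_PACKAGE)
print(f"off-package: {off:.1f} J, on-package: {on:.1f} J, ratio: {off/on:.0f}x")
```

With these assumed figures the savings on memory traffic is about 20x; a larger claimed factor like IBM's 80x would imply an even bigger gap between the on-package and off-package costs.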
 
