
Forbes: AMD is the gadfly that still bothers Intel

That's what we said back when dual cores rolled around. Now it seems like a no-brainer to avoid single cores in general applications. I do think we need better inventors to create a new market for all this processing power. Maybe make something like Project Natal mainstream and used in the regular household. I'm sure that could require a buttload of processing power.

Intel has a compiler project known as "mitosis" that is supposed to "help" break single-threaded calls into SMP-enabled ones that will execute on multiple cores, thereby increasing both performance and efficiency at the same time.

This was a couple of years ago, I don't know what ever came of it.
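None of us outside Intel have seen what mitosis actually emits, but the basic idea of turning a single-threaded loop into SMP-enabled work can be sketched like this (a Python stand-in, not Intel's compiler output; real speculative threading also handles iterations that might conflict and rolls them back, which this toy version skips by assuming independent iterations):

```python
from concurrent.futures import ThreadPoolExecutor

def work(x):
    # stand-in for one independent loop iteration
    return x * x

def sequential(data):
    # the single-threaded version a compiler would start from
    return [work(x) for x in data]

def parallel(data, workers=4):
    # the SMP-enabled version: the same iterations,
    # fanned out across a pool of worker threads
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(work, data))

data = list(range(8))
print(parallel(data))  # [0, 1, 4, 9, 16, 25, 36, 49]
```

Note that in CPython the GIL limits the actual speedup for CPU-bound work like this; the point is the structure of the transformation, not the timing.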
 
That, indeed, would be awesome: it would sell vidcards and gain PC market share all at the same time.

Yeah, that's what I'm thinking, provided it wasn't a buggy mess at launch. It's all about execution.

Intel has a compiler project known as "mitosis" that is supposed to "help" break single-threaded calls into SMP-enabled ones that will execute on multiple cores, thereby increasing both performance and efficiency at the same time.

This was a couple of years ago, I don't know what ever came of it.

Ah, good old mitosis. Thank you for remembering the project name for their speculative threading venture. I haven't even been able to find any academic papers on the subject written later than 2006.

Intel had a presentation of some kind related to mitosis at the 2005 IDF, where they said it would take them 5-10 years to make anything useful out of it: http://www.dvhardware.net/article6594.html

So, 2015 for mitosis?
 
Intel has a compiler project known as "mitosis" that is supposed to "help" break single-threaded calls into SMP-enabled ones that will execute on multiple cores, thereby increasing both performance and efficiency at the same time.

This was a couple of years ago, I don't know what ever came of it.

Anaphase
 
Yeah, that's what I'm thinking, provided it wasn't a buggy mess at launch. It's all about execution.

It probably would have some bugs. It would be difficult to do, especially on older hardware. I suspect vidcards would require special instructions to pull it off well.
 
The issue isn't whether we need better silicon. We will always need better silicon. You simply cannot have enough computational power.

I completely agree. We will always need more computing power. That said, until something is developed and hits the mainstream, something that makes the mass of users need more computing power, continued development of better, faster processors will become too expensive. There's a relatively small number of developers, power gamers, enthusiasts and scientists who can never have too much number-crunching power. Because the group that truly needs, and will pay for, more power is so small, unless the masses are brought on board that power will cost more and more. This will cause the group that can afford it to shrink, until finally it's just too expensive to develop.


We need better computing power. But it doesn't matter if there is only one outfit developing it, or if there are 1000 competitors. We're simply not going to continue to get it unless mainstream consumers have a need for more power. With the current products, we will soon arrive at a point where the normal computing experience will be perceived as instantaneous. No consumers will pay money for anything faster than that.


I don't know where it will come from, but have no doubt this need will be created. It will probably be something that you have never imagined before.
 
There's a relatively small number of developers, power gamers, enthusiasts and scientists who can never have too much number-crunching power. Because the group that truly needs, and will pay for, more power is so small, unless the masses are brought on board that power will cost more and more. This will cause the group that can afford it to shrink, until finally it's just too expensive to develop.

Gee, kinda sounds EXACTLY like how every other computing model predecessor evolved and died. From mega-mainframes with terminal access in the '60s to vector-based supercomputers of the '80s to big-iron servers of the '90s.

Each of these models of iterating computing performance eventually stalled and toppled from lack of breadth in the end-user base.

I personally think we will see a similar pattern with the extreme edition as well as the mid-tier SKUs, exactly as you have laid out. Even at a mere $200, just how many people actually need a hex-core chip?

The TAM is rather limited when you get right down to the "good enough" computing levels delivered by $100 CPUs.

The same thing has happened to the disk-drive makers. 3TB drives? Yeah, we'll buy them if they are $100, and maybe 5% of us will actually use the capacity, but everyone else will not likely touch large parts of the platter over the drive's entire service lifetime.
 
There is a leapfrog of hardware and software. Right now hardware is ahead but you shouldn't always count on that.

It used to be that software developers always had clock speed improvements to count on for more performance. They could make code bigger because every year it was running faster.

But eventually that model tops out. Now that we have multicore as the basis for every computer being sold these days, the programming models are shifting to more threads and counting on single fast cores less and less. But these things take time. Look how well the newest versions of Photoshop handle multiple threads vs. 5 years ago when everything was single-core.

Software will get there.
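The shift described above, from counting on a faster single core to spreading the same work across however many cores are present, looks roughly like this in code (a hedged sketch of the tile-based approach image editors commonly use; the `brighten` filter and the row-per-task split are illustrative, not anything Photoshop actually does):

```python
import os
from concurrent.futures import ThreadPoolExecutor

def brighten(tile):
    # stand-in for a per-tile image filter
    return [min(255, px + 10) for px in tile]

def filter_image(rows):
    # one task per row: throughput scales with core count
    # instead of relying on a single faster core
    workers = os.cpu_count() or 1
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(brighten, rows))

image = [[0, 100, 250], [5, 200, 255]]
print(filter_image(image))  # [[10, 110, 255], [15, 210, 255]]
```

The design point is that the sequential and parallel versions compute identical results, so the thread count can track the hardware without changing the output.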
 
Look how well the newest versions of photoshop handle multiple threads vs. 5 years ago when everything was single core.

Can someone answer the following questions:

Does Photoshop processing thru extra CPU cores vs GPU give better quality? Ease of use? Efficiency?

How about Encoding?
 
Can someone answer the following questions:

Does Photoshop processing thru extra CPU cores vs GPU give better quality? Ease of use? Efficiency?

How about Encoding?

For Photoshop it is pretty transparent, with the exception of dramatic speed differences.

For encoding, there are quality differences because ATi and Nvidia run proprietary compilers, and as of the last status I knew, there IS quality loss while encoding with your video card, including a few game-breaking bugs while transcoding (encoding from one video codec to another).
 
Can someone answer the following questions:

Does Photoshop processing thru extra CPU cores vs GPU give better quality? Ease of use? Efficiency?

Photoshop primarily uses the CPU. There are some new plugins that are starting to use the GPU a little more, but nothing earth-shattering. A fast CPU trumps a fast video card.

How about Encoding?

The best encoders are entirely CPU-based. Unfortunately, it is almost impossible to make a good GPU encoder (believe me, many projects have looked at this).
 
sigh...

Intel will not be focusing on 22nm this year and next year.

If they do focus on 22nm next year, it will be LATE LATE in Q4, possibly the year after.

My friend did tell me Intel has SSD FEVER.
They want to gobble up the SSD market, which is still developing.

They're gonna shift over to 25nm controllers and SSDs while people hash out their opinions on SB.

They don't care if SB loses to Bulldozer, because AMD has no SSD market.
So why would they dig themselves a deeper grave and outpace AMD when they can shift resources somewhere else and wait for the competition to catch up?

Dayam, Americans... GREED is not a NEED. (Speaking as an American.)
We are all riding on GREED unless you belong in the enterprise sector.

And in the enterprise sector, SSDs are a NEED.

You guys are talking about engines... Intel says their engines are good enough; you need a new transmission.
 
sigh...

Intel will not be focusing on 22nm this year and next year.

If they do focus on 22nm next year, it will be LATE LATE in Q4, possibly the year after.

My friend did tell me Intel has SSD FEVER.
They want to gobble up the SSD market, which is still developing.

They're gonna shift over to 25nm controllers and SSDs while people hash out their opinions on SB.

They don't care if SB loses to Bulldozer, because AMD has no SSD market.
So why would they dig themselves a deeper grave and outpace AMD when they can shift resources somewhere else and wait for the competition to catch up?

Dayam, Americans... GREED is not a NEED. (Speaking as an American.)
We are all riding on GREED unless you belong in the enterprise sector.

Didn't AMD use to be in the market of making flash chips? I could have sworn that at one time it was their biggest money maker.

Ah, here it is http://www.amd.com/us/press-releases/Pages/Press_Release_656.aspx

That, of course, isn't to say that one flash memory tech = another. Just that AMD might be able to enter this market as well.
 
LULZ... I don't think AMD and Cisco are partnered anymore.

SUNNYVALE, CA -- 12/19/2000

A lot has happened in 10 yrs.
 