OILFIELDTRASH
Lifer
- May 13, 2009
- 12,333
- 612
- 126
Well, that pretty much kills the argument if there ever was one
And mother of god, Intel makes both AMD and Nvidia look like ants
A .08% increase in share prices kills the argument?
But I thought smaller die sizes translated into lower costs for consumers?
I remember this being the cry against Nvidia's big-die strategy. Now that AMD IS charging bigger bucks for its product, suddenly R&D comes into play.
Funny stuff.
Hmm, why not? I'm in, I guess.
Doesn't help any knowing that the AMD CEO sees his customers as prey.
Good man! I think you have the better odds, since you can win if only one of the two conditions fails, whereas I need both conditions to be true to win. Nevertheless, I think the end results will be pretty close to both of our conditions regardless of who wins. Let's try to remember that this is all for "fun," although one of us gets to reward the other with a prize when the end of the rainbow is reached.
What do you think Jen-Hsun Huang sees you as? His best friend?
A small die is cheaper once the process has matured enough to provide good yields. A smaller die means more chips per wafer, which ultimately means lower cost once production scales up. However, 28nm is still very new, so the process has not yet matured to a level where yields are high. If you can only harvest half the chips on a given wafer, that ultimately affects TSMC's ability to make money. So they charge much more per wafer until they can get to a point where losses are low.
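To see why yield dominates early-process cost, here's a rough sketch of the math. The wafer cost and yield figures below are illustrative assumptions, not actual TSMC numbers; only the ~365 mm² Tahiti die size and 300 mm wafer are publicly known:

```python
import math

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """Rough gross-die estimate: wafer area over die area, minus an edge-loss term."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

def cost_per_good_die(wafer_cost: float, gross_dies: int, yield_rate: float) -> float:
    """The whole wafer cost is spread over only the dies that actually work."""
    return wafer_cost / (gross_dies * yield_rate)

gross = dies_per_wafer(300, 365)                 # Tahiti: ~365 mm^2 on a 300 mm wafer
mature = cost_per_good_die(5000, gross, 0.85)    # assumed mature-process yield
early = cost_per_good_die(5000, gross, 0.45)     # assumed early-28nm yield
print(gross, round(mature, 2), round(early, 2))
```

Halving the yield roughly doubles the cost per good die even though nothing about the die itself changed, which is the point being made about early 28nm pricing.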
I will admit I paid full MSRP on a GTX 580 once. Believe me, there were cuss words uttered about Nvidia when I bought it. I really didn't think about price/performance or any of that. Just wanted the top performer. Turned out I had buyer's remorse about 2-3 months in and sold it. So while yes, I have been on the paying-premium side, I am no longer. Just like anything else, you learn from your mistakes. So yes, with my new outlook on GPU buying I'm giving the 7970 a failing grade.
Lol
Everyone in class turned to look at me
Fun (and fairly useless) facts:
The 5870 was approximately 40% faster than the 4890 at release.
The 7970 was approximately 40% faster than the 6970 at release.
The 5870 was 110% more expensive than the 4890 at release (380 USD versus 180 USD)
The 7970 was 60% more expensive than the 6970 at release (550 USD versus 350 USD)
In absolute terms both the 5870 and the 7970 raised the price by 200 USD (which is effectively less in the 7970's case due to inflation)
So all in all the 7970 came with a smaller price hike than the 5870 in both absolute and relative terms, but people still complain, even though no one really complained all that much with the 5870.
And I guess all this just goes to show that talking about the size or extremity of a price increase is pointless, all that really matters is the given price, at a given performance level.
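Plugging the quoted launch prices into a quick check (the MSRPs are the poster's figures as stated above, not independently verified):

```python
def pct_increase(old: float, new: float) -> float:
    """Percent price increase going from the old card to the new one."""
    return (new - old) / old * 100

print(round(pct_increase(180, 380)))  # 5870 over 4890: ~110% more expensive
print(round(pct_increase(350, 550)))  # 7970 over 6970: ~60% more expensive
print(380 - 180, 550 - 350)           # both are a 200 USD absolute jump
```

Same 200 USD jump in both cases, but off a higher base for the 7970, which is why the relative increase looks smaller.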
So was there no R&D to recoup from prior launches? In other words, does the point you're trying to make only apply to 7970?
So with your own words you state this product is of no interest to you and have exited this price bracket completely.
Yet you belittle the people who buy it, and do nothing but complain about the price? Want some cheese to go with that whine? Do you also sit there and point out how store-brand mac and cheese is cheaper than Kraft Mac n Cheese?
Outside of raining on people's parade, have you added anything of value beyond "waaaaaah"? Seriously?
.08% increase? AMD shares increased by ~30% from Jan 9 to Feb 9... Are you looking at the same chart, or am I the one seeing it wrong?
Q: With AMD's acquisition of ATI and Intel becoming more involved in graphics, what will NVIDIA do to remain competitive in the years to come?
Jen-Hsun Huang, CEO and founder of NVIDIA: The central question is whether computer graphics is maturing or entering a period of rapid innovation. If you believe computer graphics is maturing, then slowing investment and “integration” is the right strategy. But if you believe graphics can still experience revolutionary advancement, then innovation and specialization is the best strategy.
We believe we are in the midst of a giant leap in computer graphics, and that the GPU will revolutionize computing by making parallel computing mainstream. This is the time to innovate, not integrate.
The last discontinuity in our field occurred eight years ago with the introduction of programmable shading and led to the transformation of the GPU from a fixed-pipeline ASIC to a programmable processor. This required GPU design methodology to include the best of general-purpose processors and special-purpose accelerators. Graphics drivers added the complexity of shader compilers for Cg, HLSL, and GLSL shading languages.
We are now in the midst of a major discontinuity that started three years ago with the introduction of CUDA. We call this the era of GPU computing. We will advance graphics beyond “programmable shading” to add even more artistic flexibility and ever more power to simulate photo-realistic worlds. Combining highly specialized graphics pipelines, programmable shading, and GPU computing, “computational graphics” will make possible stunning new looks with ray tracing, global illumination, and other computational techniques that look incredible. “Computational graphics” requires the GPU to have two personalities – one that is highly specialized for graphics, and the other a completely general purpose parallel processor with massive computational power.
While the parallel processing architecture can simulate light rays and photons, it is also great at physics simulation. Our vision is to enable games that can simulate the interaction between game characters and the physical world, and then render the images with film-like realism. This is surely in the future since films like Harry Potter and Transformers already use GPUs to simulate many of the special effects. Games will once again be surprising and magical, in a way that is simply not possible with pre-canned art.
To enable game developers to create the next generation of amazing games, we’ve created compilers for CUDA, OpenCL, and DirectCompute so that developers can choose any GPU computing approach. We’ve created a tool platform called Nexus, which integrates into Visual Studio and is the world’s first unified programming environment for a heterogeneous computing architecture with the CPU and GPU in a “co-processing” configuration. And we’ve encapsulated our algorithm expertise into engines, such as the OptiX ray-tracing engine and the PhysX physics engine, so that developers can easily integrate these capabilities into their applications. And finally, we have a team of 300 world-class graphics and parallel computing experts in our Content Technology team whose passion is to inspire and collaborate with developers to make their games and applications better.
Some have argued that diversifying from visual computing is a growth strategy. I happen to believe that focusing on the right thing is the best growth strategy.
NVIDIA’s growth strategy is simple and singular: be the absolute best in the world in visual computing – to expand the reach of GPUs to transform our computing experience. We believe that the GPU will be incorporated into all kinds of computing platforms beyond PCs. By focusing our significant R&D budget to advance visual computing, we are creating breakthrough solutions to address some of the most important challenges in computing today. We build Geforce for gamers and enthusiasts; Quadro for digital designers and artists; Tesla for researchers and engineers needing supercomputing performance; and Tegra for mobile users who want a great computing experience anywhere. A simple view of our business is that we build Geforce for PCs, Quadro for workstations, Tesla for servers and cloud computing, and Tegra for mobile devices. Each of these targets different users, and thus each requires a very different solution, but all are visual computing focused.
For all of the gamers, there should be no doubt: You can count on the thousands of visual computing engineers at NVIDIA to create the absolute best graphics technology for you. Because of their passion, focus, and craftsmanship, the NVIDIA GPU will be state-of-the-art and exquisitely engineered. And you should be delighted to know that the GPU, a technology that was created for you, is also able to help discover new sources of clean energy and help detect cancer early, or to just make your computer interaction lively. It surely gives me great joy to know that what started out as “the essential gear of gamers for universal domination” is now off to really save the world.
Keep in touch.
Jensen
Why do the two have to be mutually exclusive? BTW, what is the source of that quote? "This is the time to innovate, not integrate."
So you're saying AMD couldn't sell the 7970 at $399 and still make a ton of profit? I think they could. So why are they $550-$600? Hmm..
Wake up, man. Getting fleeced for your hard-earned dollars is not cool. I can't for the life of me figure out how getting hard-earned dollars taken out of your pocket is a good thing? Then to come into a thread and defend it is really the puzzling thing.
Wonder if these same guys drive by the gas station and get all giddy when gas goes up by a dollar. "Oh look, honey, gas is $1.00 higher than yesterday! Yippee!!
Go Exxon!! A big nameless faceless corporation needs this money so much more than me. Here you go sir. I don't need it. Just so happy to pay premiums!"
Jason Paul, Product Manager, GeForce: Fermi has dedicated hardware for tessellation (sorry Rys). We’ll share more details when we introduce Fermi’s graphics architecture shortly!