
NVIDIA has canned NV50

I don't trust the Inquirer, but who knows?
Will there be nVidia/ATI teams here just like in the "nVidia has canned NV48" thread 😛 😀?
By the way, there is no DirectX 10; it's now called WGF (Windows Graphics Foundation). I read it in Maximum PC a while ago.
 
YEY now they will make us ALL buy SLI!

w00t! Great marketing.

Thing is, what happens if something changes, maybe something comes out unexpectedly? Maybe the R500 core has more tech than just a simple addition of SM3; maybe they have something new?

You can buy an R500 with these new features and then you're stuck with buying an old 6800. It could turn out to be a reverse of the X800 vs the 6800 situation.

lol oh well...
 
Originally posted by: Drayvn
YEY now they will make us ALL buy SLI!

w00t! Great marketing.

Thing is, what happens if something changes, maybe something comes out unexpectedly? Maybe the R500 core has more tech than just a simple addition of SM3; maybe they have something new?

You can buy an R500 with these new features and then you're stuck with buying an old 6800. It could turn out to be a reverse of the X800 vs the 6800 situation.

lol oh well...

Whoah whoah now... This is where everyone gets jumpy. R500 is not here yet and won't be here for a good 6 months. Nvidia will have something to compete with ATI by that time; it won't just be the 6800 series.

When have ATI or Nvidia not responded to one another since they became the #1 and 2 video companies?
 
Originally posted by: jiffylube1024
Originally posted by: Drayvn
YEY now they will make us ALL buy SLI!

w00t! Great marketing.

Thing is, what happens if something changes, maybe something comes out unexpectedly? Maybe the R500 core has more tech than just a simple addition of SM3; maybe they have something new?

You can buy an R500 with these new features and then you're stuck with buying an old 6800. It could turn out to be a reverse of the X800 vs the 6800 situation.

lol oh well...

Whoah whoah now... This is where everyone gets jumpy. R500 is not here yet and won't be here for a good 6 months. Nvidia will have something to compete with ATI by that time; it won't just be the 6800 series.

When have ATI or Nvidia not responded to one another since they became the #1 and 2 video companies?

When TSMC let nVidia down on the fabrication of the 5800 Ultra would be the last time. It came out 5 months late due to TSMC's failure to deliver. 🙁

Anyway, I doubt the Inquirer has all knowledge of nVidia's future plans. I'm hoping to buy an SLI rig, I'm sure it can last me till late next year/early the year after to switch to whatever is best of ATI/nVidia's line at that point.

 
Sorry guys, that view I stated was an extreme kind of scenario; I doubt it would be anything like that.

I was just saying that if it WAS like that, it's a slightly sickening thing for nV to do to us: instead of bringing out new hardware in the refresh cycles, they ask us to buy another card that is a year old.

Personally I don't mind doing that, because I can see myself doing it anyway 😛 but it is a very evil kind of marketing strategy.

It's just that instead of buying one card every 6-9 months, you buy one card now... later you buy another card because there are no refreshes, and because of that prices depreciate very slowly, so it would still be expensive to buy. Then when the new card comes out, everyone scraps their two old cards, maybe covering part of the cost of one of the new-gen cards. Then it gets into the expensive part: another 6-9 months later, you buy another card for your SLI setup. The cost keeps increasing, and if that scenario happens, I would probably go along with it as well.

It's like: spend $400 now, say $300 on the next card, then sell both of them for about $300-400 and buy the new card with that, so you're already about $300-400 out of pocket, and 6-9 months later you spend more money to get more performance, as the top games coming out that year will need you to.

Again, this is an extreme scenario that probably won't happen, but if it does, it could be very bad for us financially and very profitable for nV and ATi, who would essentially be robbing us 😛

But again, I doubt they would do that to us.
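The upgrade arithmetic in the post above can be sketched in a few lines; note the dollar figures are the hypothetical ones from the post, not real prices:

```python
# Sketch of the SLI upgrade-cost scenario described above.
# All prices are the hypothetical figures from the post, not real data.

first_card = 400      # buy the first card now
second_card = 300     # add a second card later for SLI
resale = 350          # sell both old cards (midpoint of the $300-400 guess)

# Net spend before the next-gen card even enters the picture.
out_of_pocket = first_card + second_card - resale
print(out_of_pocket)  # 350, i.e. within the $300-400 range from the post
```

With those numbers the two-card route leaves you roughly $350 down before the next generation arrives, which is the "cost keeps increasing" point the post is making.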
 
Originally posted by: Lonyo
ATi scrapped the R400 and made the R420 instead.
Maybe nVidia will reuse their current tech in some way to make something involving SLI or multiple chips.
Or they will introduce the NV60 sooner.
Or the NV50 isn't really canned.

R400 was never scrapped. Its codename was changed to R520. The R420 is a completely different product based on the R360 architecture.
 
Originally posted by: Dean
The only reason I could see Nvidia canning (postponing) the NV50 would be that current or near-future wafer production technology may not be advanced enough. I bet they will put it on hold until fabrication tech is ready for it and release something else to fill the gap.

They could be having the same issues the CPU manufacturers are having. They may need to move to dual-core GPUs.

The only issue with the fab process at this point is that the die size on the R520 will be enormous. Heat output will probably also be astronomical. nVidia has SLI, so they probably don't feel as pressured to go for the single-slot performance crown right now. They are probably developing a beast of a card for when two 6800U's are no longer king of the hill. If the R520 can take them out it will be very impressive, and I'll probably buy one. Maybe Rollo will too, only to sit it on a shelf somewhere for posterity. 😀 :beer:
 
Going to call BS.

They started work on the NV50 around the time of the NV30. The NV50 is supposed to be Nvidia's big chip that brings a lot of stuff to the table.

I doubt they canned it this late in the dev cycle.

 
I think the problem is that, like CPUs, graphics chips have hit a ceiling. They cannot cram in more than 16 pipes and/or clock them much higher, even with a die shrink. 32-bit FP for each color component is more than we will ever need for the display technology we have. The current instruction limits are insanely big. All they can add are special tweaks like UltraShadow or better compression.

ATI is still one generation behind in their hardware, so they still have room to improve upon for R500.

Maybe we will finally see the Gigapixel tile-based rendering core.
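To put a number on the 32-bit FP point above: an FP32 value carries 24 significant bits, while a typical desktop panel of the era resolves only 8 bits (256 levels) per channel, so most of that precision vanishes at the display. A minimal sketch of the gap (the sample value is arbitrary):

```python
# Quantize an FP32-precision colour component down to what an
# 8-bit-per-channel display can actually show.

value = 0.123456789            # arbitrary high-precision colour value in [0, 1]
level = round(value * 255)     # nearest of the 256 display levels
recovered = level / 255        # what the panel effectively outputs

print(level)                              # 31
print(abs(value - recovered) < 1 / 255)   # True: error below one display step
```

Even FP16, with 11 significant bits, already exceeds what an 8-bit panel can distinguish on output; the extra FP32 precision matters for intermediate shader math, not for the final displayed value.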
 
Originally posted by: Stoneburner
You guys exaggerate the Inquirer's tendency to be wrong. The last thing they were wrong about was maybe the NV30, and that was because they were reciting what Nvidia people told them. And I don't know why people get so defensive about NVIDIA. Are you people stockholders?

Yes I am and I also have stock in Imagination Technologies (ie PowerVR etc).

 
IMO, think about "scrapped" in terms of the NV40. NV40 is a very nice chip--not quite as big a leap as the R300 was at the time, but it's definitely a chip they can ride for a couple refresh cycles with some clock-boosting respins and the odd tweak (a la R300). It's possible R520 isn't as big a leap as they expected (really fast branching and FP32), so they felt they could push the risk of a new GPU back a year or so. Or they may be waiting for a different process, a la R400 (now R600, which ATi said required 90nm).

NV50's release plans may have been scrapped, but surely nV will reuse the tech elsewhere.

If NV47 is in fact a 24-pipe NV40, and can be clocked at least as high, nV should be pretty comfortable for a while--assuming ATi's supar SM3 chip isn't coming out anytime soon, or isn't as supar as they predicted with branching.
 
My speculation would be that nvidia sees the pointlessness of trying to bring a new chip to market when the NV45 has barely hit the shelves. I also think they see the serious weakness of Creative Labs (seriously, when was the last time those guys released a revolutionary product? The last revolutionary sound product was nvidia's SoundStorm 2). I think nvidia realizes just how bad the sound market is right now. With a sound chip that could encode DTS or lossless audio, they could make a killing. (I won't even go into the fact that a lot of people would kill just to rid themselves of Creative drivers, much less second-rate hardware at first-rate prices.)
 
nvidia should just die-shrink the card and add another 8-12 pipes, then put the work into a real next-gen part, since it doesn't look like the R520 will be anything to write home about.
 
Originally posted by: Lonyo
Originally posted by: RussianSensation
I am sure that, just like ATI, Nvidia has multiple teams working on next-generation GPUs. Besides, it's probably better to cancel an inferior GPU design and start from scratch than to have another 5800 or 5900 failure.

I am sure Nvidia will just take one of their other teams' chips and call it NV50. Just because the "original" NV50 has been cancelled doesn't mean there won't be an NV50. Besides, at the rate ATI is going, Nvidia can take 12 months to redesign the graphics card and the X850XT PE still won't be in retail.

Perhaps the "old" NV50 was good enough to compete with R500, but what if they just aren't satisfied with competing, but want to become the leader again?

If car companies can take a concept car and turn it into a production model in 10-12 months, a videocard company can certainly design a full gpu from scratch in less than 6 months.

I call BS on that last statement.
Can't think where to find info, but I don't think it takes 6mo to produce a GPU.
Hell, ATi were working on the R5xx before the R4xx was released, so that's 12mo of a cycle already, if not more.

I never said companies make GPUs in 6 months. I said they are capable of making GPUs in 6 months. A car is 1000 times more complex to make than a GPU. I mean, think about it: the amount of electronics alone in a car makes the complexity of a GPU look like a joke. The fuel-injection system, a small component of an engine, is controlled by a computer, and so on. So if it takes car companies 10-12 months to make a CAR, you're telling me a dedicated graphics company can't make a GPU from scratch in 6 months? The only reason they take so long is that they want to make money on current tech, and it would make little sense to pump out new GPUs twice as fast, every 6 months or sooner, since no one would spend $500 knowing that in 6 months it's worth practically nothing because it's 2x slower. They also take time to perfect the yields and so on, whereas car companies have 5 years to work out the bugs and do recalls. But the actual design and architecture of a GPU shouldn't take more than 2-3 months (engineering-wise).

I just think the engineers in, say, the auto industry are much more stressed, since you have hundreds of competing models produced by 30 competing companies, not just Nvidia and ATI. Even if car designs borrow from past designs and ideas, you have thousands of parts that have to work flawlessly and come together. Quality control revolves around so many parts from different suppliers, making it harder to build a quality product. For a graphics card, you have Nvidia making PCBs and chips, Samsung or Micron supplying memory, and maybe Silicon Image providing DAC/DVI converters or something like that. I'm saying Intel had the P4 Northwood at 4.0GHz when the highest model on the market was only 1.8GHz. The technology is there; it's just that producing a large sample for distribution could be difficult due to yields. I agree with you guys that the "overall" time from design to distribution to customers might take 10-12 months. But I just don't think the GPU design alone should take 12 months (I mean, if you know in theory what a pipeline is, how hard is it to add an extra 4 pipelines to make 8? You just need the technology that will allow you to do so, but the theoretical understanding of what a pipeline is and what it does stays the same). The electrical engineers know their stuff, and it's not like the GPU is totally different every generation, since it also builds on past architectures.
 
Originally posted by: Falloutboy
nvidia should just die-shrink the card and add another 8-12 pipes, then put the work into a real next-gen part, since it doesn't look like the R520 will be anything to write home about.


How do you know this?

 
Originally posted by: Drayvn
Originally posted by: Falloutboy
nvidia should just die-shrink the card and add another 8-12 pipes, then put the work into a real next-gen part, since it doesn't look like the R520 will be anything to write home about.


How do you know this?

I would also like some basis for that post. R520 is the most anticipated GPU since R300, and is being designed by the same team. I can't see it being slower than what's currently available. Rumour has it being insanely powerful.
 
First of all, the Xbit article is just a link to the Inquirer with some additional filler. No additional source, so it certainly isn't corroborating evidence.

However, if NV was working on something to combat the X850 series, I would certainly understand if they scrapped that project, since the X850 is only marginally better than the X800. I wouldn't be surprised if they figured that some tweaking of the MFG process would allow them to put out higher clocked versions of the 6800 chips to compete and therefore decided to dedicate resources to something more important.

Frankly, I doubt we'll see significantly faster cards anytime soon from either company, since the current cards are so expensive and so hard to get, even 6 months after their release. I wouldn't expect to see another huge leap in performance until the 6800U's and X800XT's are closer to $250 or $300. I'm thinking that will be late 2005 for both companies, not just NV.

But, I guess we'll find out eventually.

-D'oh!
 
Originally posted by: SickBeast
Originally posted by: Drayvn
Originally posted by: Falloutboy
nvidia should just die-shrink the card and add another 8-12 pipes, then put the work into a real next-gen part, since it doesn't look like the R520 will be anything to write home about.


How do you know this?

I would also like some basis for that post. R520 is the most anticipated GPU since R300, and is being designed by the same team. I can't see it being slower than what's currently available. Rumour has it being insanely powerful.

I want no part of it if it's "insanely powerful" - my rec room wouldn't be safe! 😉
 
This is crap, Nvidia won't cancel a next-gen product. They know that unless they have something else up their sleeves they will lose; they will get massacred if the R5xx is all it's cracked up to be.

I think the Inquirer is full of crap. Remember how they "confirmed" that we were getting SoundStorm 2? Do we have it? Nope!

-Kevin
 
I don't think this will be such an issue unless ATI has their R500/R520 ready to roll by next summer. I tend to believe the report, since X-bit seemed to go with it. I'm wondering if nVidia is having trouble with the architecture, since they didn't put out perfect 6800 silicon (VPU) this last time around. Could the NV40 architecture be a product of weak engineering? Maybe they've been forced to rewrite the book.

They'll come around and be right up there with ATI, but their next gen, because of this delay, may just be a refresh of their current line. If this next gen doesn't have a fully functional VPU, then we'll know there are some inherent flaws in the NV40 design.

Does anyone know if the 6600 and 6800 series have the same core design? If so, then I'm probably completely wrong about these inherent and unfixable defects.
 
The 6600 is different... it's 143 million transistors, so there's no way to make it a 16-pipe card; it's only wired for a 128-bit memory bus, it's on a smaller fab process, and they fixed the video processor on it.
 