
Nvidia delays GTX Titan Z - 790 incoming?

LOL, right, you must be really interested in Titan Z because you started a thread about Titan Z being delayed (even though launch date had never been announced), and then you linked to a source who fabricated impressions of the card. And stop it with the constant flow of Sweclockers rumors already, my goodness. NVIDIA's CEO has just recently confirmed that Titan Z is not being canned and is not being delayed.

The only things remotely interesting about Titan Z are the power-balancing feature between the two GPUs, perf-per-watt relative to SLI, and perhaps quad-SLI scaling. There's not much else to talk about here.
 
Someone has to either be really swimming in money to be interested in something as overpriced as a Titan card, let alone a Titan Z, or be an idiot. I'm the latter.
 
If the results DgLee showed us are true, then Nvidia did a good job of locking this performance into the 375W TDP headroom.

Nvidia could make a Rev 2.0 of the card with a better PCB, so buyers could put Titan Zs on better cooling...
 
Someone has to either be really swimming in money to be interested in something as overpriced as a Titan card, let alone a Titan Z, or be an idiot. I'm the latter.

It's a steal for workstation-level CUDA development for PhD students, ISVs, etc. looking to integrate a compute card into their hardware and software stack. I'm interested in how this works out for NV as a pretty high-performance, yet introductory, AIB card.

Aside from that, there are some people who are swimming in money - so what.
 
It's a steal for workstation-level CUDA development for PhD students, ISVs, etc. looking to integrate a compute card into their hardware and software stack. I'm interested in how this works out for NV as a pretty high-performance, yet introductory, AIB card.

Aside from that, there are some people who are swimming in money - so what.

I don't know many PhD students who can afford a $3k GPU. A 50% stipend is around $1,900 a month. When I was in grad school we had access to plenty of computing power over the network.
 
It's a steal for workstation-level CUDA development for PhD students, ISVs, etc. looking to integrate a compute card into their hardware and software stack. I'm interested in how this works out for NV as a pretty high-performance, yet introductory, AIB card.

Aside from that, there are some people who are swimming in money - so what.

You mean a PhD student would have to steal in order to buy one. It's a rip-off, period.
 
I don't know many PhD students who can afford a $3k GPU. A 50% stipend is around $1,900 a month. When I was in grad school we had access to plenty of computing power over the network.

You mean a PhD student would have to steal in order to buy one. It's a rip-off, period.

They get grants, etc. Nvidia itself currently loans out Teslas for qualifying PhD theses. Obviously, this is a target market for them, not for profit, but to proliferate CUDA development skills.
 
It's a steal for workstation-level CUDA development for PhD students, ISVs, etc. looking to integrate a compute card into their hardware and software stack. I'm interested in how this works out for NV as a pretty high-performance, yet introductory, AIB card.

Aside from that, there are some people who are swimming in money - so what.

I only had gaming in mind, I should have been more specific.
 
I don't understand the statements about "why are you interested, it's not in your price range".

This is merely the last generation's dual-card successor (the 590) at +350% the price. Simply because they decided to go crazy with the price doesn't mean we can't talk about it. If it were still priced in line with last gen, it would be on the table and some of us would own it; now people (gamers, not the minuscule "CUDA" market) with common sense will avoid it and see it for the failure that it is. A nice piece of hardware blinded by overwhelming greed. The irony is that the card is being delayed because it was obviously slower than its 50%-cheaper competitor.
 
They get grants, etc. Nvidia itself currently loans out Teslas for qualifying PhD theses. Obviously, this is a target market for them, not for profit, but to proliferate CUDA development skills.

This... I don't get why people think this is for gaming. It's priced well above the cost of any gaming card (or 2x the cost of any NV gaming card in an SLI setup).
 
Well, Nvidia did market it as a gaming card, and that is on them. But the real issue is that some people just like to b%$ch and moan about anything... for reasons they don't even understand... probably because they just don't like themselves... who knows.
 
It's a steal for workstation-level CUDA development for PhD students, ISVs, etc. looking to integrate a compute card into their hardware and software stack. I'm interested in how this works out for NV as a pretty high-performance, yet introductory, AIB card.

Aside from that, there are some people who are swimming in money - so what.

Someone already pointed out that compute doesn't get any benefit from a dual-GPU setup.
 
I think that their marketing department had a moment of temporary stupidity and tried to classify a $3,000 workstation graphics card as an "Extreme Gaming!!!" card.

No. It is what it is. It's the most powerful gaming card Nvidia has ever built. Jen-Hsun even said so. You can't blame the marketing department for going off half-cocked when the CEO spreads the same message.

Saying it's not a $3,000 gaming card but a great-value compute card is damage control. Especially when, if for some reason you wanted to use GeForce cards and gaming-optimized drivers for scientific research, you could buy 3 Titan Blacks for the same price and have ~50% better performance.
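The "3 Titan Blacks for the same price" arithmetic can be checked directly. A quick sketch, assuming the commonly reported launch prices ($2,999 for Titan Z, $999 for Titan Black) and treating each GK110 GPU as roughly equal in throughput:

```python
# Rough price/performance check of the argument above.
# Prices are the commonly reported launch MSRPs (an assumption here);
# GPU counts stand in for raw compute, since every card uses GK110.
titan_z = {"price": 2999, "gpus": 2}      # one dual-GPU card
titan_black = {"price": 999, "gpus": 1}   # one single-GPU card

budget = titan_z["price"]                              # spend the same money
blacks_affordable = budget // titan_black["price"]     # cards you can buy instead
gpu_gain = blacks_affordable / titan_z["gpus"] - 1     # extra GPUs, as a fraction

print(blacks_affordable, f"{gpu_gain:.0%}")  # -> 3 50%
```

Three GPUs versus two for the same outlay is exactly the ~50% figure the post cites, before accounting for the Titan Z's lower clocks.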
 
http://www.cnet.com/news/nvidia-ceo-sees-future-in-cars-and-gaming-q-a/

In other gaming topics, there are reports you killed or delayed Titan Z, your new high-end GPU.

Huang: No, no, that's silliness.

So it's still on time?

Huang: Yeah.

$3,000 is a lot of money for a GPU. What do you do to make sure that for someone who buys it, it's not irrelevant two or three years down the road?

Huang: In fact, most of the customers that buy Titan Zs buy it every year.

Do you anticipate that happening even with the $3,000 pricing?

Huang: Yeah. And the reason for that is the people who buy Titans and Titan Zs have an insatiable need for computing capability, graphics computing capability. So either they got tired of using just a 1080p monitor and they just bought a 4K. My Titan all of a sudden's not enough. For a 4K monitor, a $3,000 to $5,000 monitor, I need something bigger to drive it. So that's Titan Z.
 
Huang: Yeah. And the reason for that is the people who buy Titans and Titan Zs have an insatiable need for computing capability, graphics computing capability. So either they got tired of using just a 1080p monitor and they just bought a 4K. My Titan all of a sudden's not enough. For a 4K monitor, a $3,000 to $5,000 monitor, I need something bigger to drive it. So that's Titan Z.

He has to be living in his own little world. 4K isn't even something Nvidia is dominating, after the buggy 4K drivers. He's just repeating what his PR department told him to say.
 
He has to be living in his own little world. 4K isn't even something Nvidia is dominating, after the buggy 4K drivers. He's just repeating what his PR department told him to say.

The real answer is here

"$3,000 is a lot of money for a GPU. What do you do to make sure that for someone who buys it, it's not irrelevant two or three years down the road?

Huang: In fact, most of the customers that buy Titan Zs buy it every year."

Nvidia is making something that people will buy. It's the reason we have Bentleys AND Hondas. Some people have money to buy things that we could never justify because of price/perf. Even if we had that money, we might not justify it, but some can and do. So who cares if a niche group buys something that others won't? It's not like offering something that wasn't previously offered hurts anyone.

If and when AMD releases something to compete, you will see either something better or something cheaper. AMD released the 295x2 and suddenly, the Z gets delayed.

Just sit back, and buy the things that you can justify. You may not be able to afford the top card anymore, but that's not the point.

I would much rather have a 2014 Honda Civic than a Rolls-Royce, and not just because of price.
 
The real answer is here

"$3,000 is a lot of money for a GPU. What do you do to make sure that for someone who buys it, it's not irrelevant two or three years down the road?

Huang: In fact, most of the customers that buy Titan Zs buy it every year."

Nvidia is making something that people will buy. It's the reason we have Bentleys AND Hondas. Some people have money to buy things that we could never justify because of price/perf. Even if we had that money, we might not justify it, but some can and do. So who cares if a niche group buys something that others won't? It's not like offering something that wasn't previously offered hurts anyone.

If and when AMD releases something to compete, you will see either something better or something cheaper. AMD released the 295x2 and suddenly, the Z gets delayed.

Just sit back, and buy the things that you can justify. You may not be able to afford the top card anymore, but that's not the point.

I would much rather have a 2014 Honda Civic than a Rolls-Royce, and not just because of price.

Is this a long-winded way of saying, "Nvidia fanboys will buy it to brag, even though it's weaker than 295X2 and it's possible to get 3 Titan Blacks for the same price"?
 
Someone already pointed out that compute doesn't get any benefit from a dual-GPU setup.

That's often the case in gaming (outside of SLI/PhysX setups). In the compute field, many GPUs are often used, from small 4-16 GPU development systems to tens of thousands in supercomputers.
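The reason compute scales across many GPUs where games often don't is that compute jobs are typically data-parallel: each device gets its own slice of the index space. A minimal sketch of that static partitioning (the function name and chunking scheme are illustrative, not from any specific framework):

```python
def partition(n_items: int, n_gpus: int) -> list[range]:
    """Split n_items of work as evenly as possible across n_gpus,
    the way a data-parallel compute job assigns index ranges to devices."""
    base, extra = divmod(n_items, n_gpus)
    ranges, start = [], 0
    for gpu in range(n_gpus):
        # the first `extra` GPUs each take one additional item
        size = base + (1 if gpu < extra else 0)
        ranges.append(range(start, start + size))
        start += size
    return ranges

# e.g. 10 work items over the Titan Z's 2 GPUs:
print(partition(10, 2))  # -> [range(0, 5), range(5, 10)]
```

Each GPU then processes its range independently, which is why adding devices keeps helping compute workloads while a game's single frame is much harder to split.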
 
Is this a long-winded way of saying, "Nvidia fanboys will buy it to brag, even though it's weaker than 295X2 and it's possible to get 3 Titan Blacks for the same price"?

Not necessarily fanbois, but gamers with money who buy boutique computers already. They may not really be hardware geeks, but they might be thrilled to buy the most expensive single GFX card available - along with a water-cooled 8-core Haswell-E and 64GB of DDR4 RAM (even though they'll probably never use more than 12GB).

These are buyers in a different club than most of us - and their friends have that same mentality as well. Heck, I don't care. If NV makes a profit on it, good for them. We can still make more sensible choices. Some people shop for a suit at Sears; others go to boutique shops where the cheapest suits start at $1,200. So what if only a few guys buy suits at over $1,200 - if that little shop is profitable, good for them.

I don't know why some people are going crazy over the price of this GFX card - buy something else if it's out of your price range (and as a tech geek, you know you can buy more perf for your dollar elsewhere).
 