How come Nvidia had nothing to show at Computex?


NTMBK

Lifer
Nov 14, 2011
10,464
5,849
136
Ah, good to see G-Sync displays on demo. Adaptive framerate is a sweet tech.
 

Mand

Senior member
Jan 13, 2014
664
0
0
The pictures of the Acer display do leave me struck by something, though. Just how...old...Watch Dogs looks.
 

TreVader

Platinum Member
Oct 28, 2013
2,057
2
0
Sweet, G-Sync. Not like I've ever seen that before.



Nvidia is taking a "none is more" approach to GPU design this year.
 

Mand

Senior member
Jan 13, 2014
664
0
0
Nvidia is taking a "none is more" approach to GPU design this year.

Or, more likely, a "none is more" approach to GPU press releases. I find it highly doubtful they're standing still on design.

Still expecting Maxwell this fall, and a competing line from AMD. Maybe I'm being overly optimistic, but I don't see any reason to doubt it yet.
 

Atreidin

Senior member
Mar 31, 2011
464
27
86
As far as I know, the Shuttle still used a radiation-hardened 80386 as its actual mission CPU, which was fast enough for basic control of the vehicle during takeoff and landing.

The thing to understand is that aircraft and spacecraft have an avionics bay. This can contain quite a lot of computers, all serving different functions. Some might manage the display output, others the engine control, and in the Shuttle's case there was a takeoff-and-landing autopilot mission computer that also calculated controlled burns for orbital intercepts. The black box is another example of one of these avionics boxes; the main difference is that instead of computing resources, it's mostly about the fireproof coating, the storage, and the accurate internal timing mechanisms.

So the fact that Nvidia has Tegra running the control screens is actually quite significant. That is a safety-critical aspect of the capsule, and it would be classed as a risk class 4 or SIL 4 component (the highest risk level, requiring the most stringent testing). Input and output control is not something that doesn't matter; it's absolutely critical, and it suggests that Tegra, with appropriate hardening, is a very reliable processor. It takes quite a lot of computing power to draw screens, and even more to do it in a safety-critical way.

Not very interesting for most people here, but I find the inclusion of such a processor in the capsule interesting.

If I had to guess, I would think the acceptability for use in space comes in large part from the manufacturing process used (different substrate, other materials, shielding), and maybe some design changes such as added redundancy and error correction.

If Nvidia had done anything new or interesting to get their design into space, I would expect it to have been bragged about. As it is, I doubt they have done anything special that a competitor couldn't do by studying radiation hardening and contracting companies for the necessary materials and packaging. Other than the novelty of "Nvidia in space," I don't see this as very intellectually stimulating.
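For anyone curious, the "added redundancy" speculated about above is often implemented as triple modular redundancy (TMR): three copies of a computation run in parallel and a voter takes the majority, masking a single radiation-induced upset. A toy sketch (the function name and values are invented purely for illustration):

```python
def majority_vote(a: int, b: int, c: int) -> int:
    """Return the value at least two of the three replicas agree on."""
    if a == b or a == c:
        return a
    return b  # b == c, or all three disagree (unrecoverable)

# One replica suffers a bit flip; the voter still returns the right value.
correct = 42
flipped = correct ^ (1 << 3)  # single-event upset flips bit 3
print(majority_vote(correct, flipped, correct))  # 42
```

Real rad-hard parts do this at the flip-flop or lockstep-core level in hardware, but the voting principle is the same.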
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
The launch looks to be a couple of years away still... It is interesting, I guess.

Two years is a long time in the hardware world so it's hard to say if it'll actually be used at that time.

It doesn't work that way in aerospace. Two years is nothing. The Shuttle ran on 8086s for decades, and most planes still run on the processors they were initially designed with, which would have been chosen many years before you even heard of them. The most advanced fighter jets in the world all use ancient kit.

This is the sort of deal that will persist for decades: they will either build a large stock of replacement units now or agree to supply them for potentially 20 years or more. It costs enormous amounts of money to qualify software on particular hardware, and the boards it goes in are completely custom, so the processor and vendor are not changed lightly; the cost of doing so is millions upon millions, plus years of development and qualification time.
 

SoulWager

Member
Jan 23, 2013
155
0
71
As far as I know, the Shuttle still used a radiation-hardened 80386 as its actual mission CPU, which was fast enough for basic control of the vehicle during takeoff and landing.

The thing to understand is that aircraft and spacecraft have an avionics bay. This can contain quite a lot of computers, all serving different functions. Some might manage the display output, others the engine control, and in the Shuttle's case there was a takeoff-and-landing autopilot mission computer that also calculated controlled burns for orbital intercepts. The black box is another example of one of these avionics boxes; the main difference is that instead of computing resources, it's mostly about the fireproof coating, the storage, and the accurate internal timing mechanisms.

So the fact that Nvidia has Tegra running the control screens is actually quite significant. That is a safety-critical aspect of the capsule, and it would be classed as a risk class 4 or SIL 4 component (the highest risk level, requiring the most stringent testing). Input and output control is not something that doesn't matter; it's absolutely critical, and it suggests that Tegra, with appropriate hardening, is a very reliable processor. It takes quite a lot of computing power to draw screens, and even more to do it in a safety-critical way.

Not very interesting for most people here, but I find the inclusion of such a processor in the capsule interesting.
I'm guessing they get safety through fault tolerance. Presumably any of the 4 touchscreens can display any information, and presumably you don't lose multiple screens if you lose 1 SoC. In the event all the screens fail, the autopilot is capable of flying the entire mission.
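The fault-tolerance argument above can be made concrete with a back-of-the-envelope calculation: if display failures are independent, the chance of losing all of them shrinks geometrically with the number of screens. The failure rate below is invented purely for illustration:

```python
def prob_all_fail(p_single: float, n: int) -> float:
    """Probability that all n independent displays fail."""
    return p_single ** n

p = 1e-3  # assumed per-mission failure probability of one display
print(prob_all_fail(p, 1))  # 0.001
print(prob_all_fail(p, 4))  # ~1e-12
```

The catch, of course, is the independence assumption: a shared power bus or a common software bug can take out all four screens at once, which is why the autopilot backstop matters.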
 

Mand

Senior member
Jan 13, 2014
664
0
0
Keep in mind, we sent people to the Moon and back using computers that would get utterly crushed by your average smartphone.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
The monitor partners are the ones presenting them, it seems. This is the Acer 4k w/ G-Sync.

http://www.techspot.com/news/56990-eyes-on-with-acers-g-sync-enabled-4k-monitor-at-computex.html

Cool! How much?

When the GPU was rendering a game at 40 frames per second, the low-ish frame rate was hardly noticeable with no stutter, lag, tearing or strobing; it looked just as good as if you were gaming at 60 frames per second.

Was it static @ 40FPS like the journalist said? (rhetorical)
 

Fallen Kell

Diamond Member
Oct 9, 1999
6,223
540
126
Yeah, I am still waiting on the new "next gen" cards from Nvidia and ATI. But they are both waiting on the next gen manufacturing process...
 

Mand

Senior member
Jan 13, 2014
664
0
0
Cool! How much?



Was it static @ 40FPS like the journalist said? (rhetorical)

No price yet on either of the 4K G-Sync models, from either Acer or Asus.

And I get that you said rhetorical, but it's not a rhetorical question.