The problem is that NVIDIA is getting squeezed out of its traditional markets.
With both Intel and AMD integrating GPUs into their CPUs, and both producing and promoting their own chipsets, NVIDIA's old cash cow-- producing chipsets and low-end integrated graphics-- was essentially destroyed about two years ago.
With the chipset business, both Intel and AMD can lower the prices of their CPUs while making their margins on the chipsets. This is especially true for Intel, whose chipsets are produced on last-generation processes in fabs that are already paid for (basically free production cost for Intel).
This has left NVIDIA with only one traditional market-- the discrete GPU market, the mid- and high-end range for gamers. The problem is that this market is still relatively small (a small percentage of revenue compared to the overall chipset/GPU market), and NVIDIA still faces fierce competition from ATI there.
Naturally, NVIDIA knows this and needs to do something to survive-- so they spawned several new businesses: ION for smartphones and embedded devices, and CUDA for the server market. The value of CUDA is not in selling it to home users for Photoshop; the value is in selling it to Wall Street hedge funds, oil & gas companies, and bio-medical corporations at 10x the margin.
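To make the CUDA pitch concrete, here is a minimal sketch (my own hypothetical example, not NVIDIA's code) of the kind of embarrassingly parallel floating-point workload those customers pay for: a Monte Carlo pricer for a European call option, where every GPU thread simulates its own batch of price paths.

```cuda
// Hypothetical sketch: Monte Carlo pricing of a European call option on the GPU.
// All parameters (spot, strike, rate, volatility) are made-up illustration values.
#include <cstdio>
#include <cmath>
#include <curand_kernel.h>

__global__ void monte_carlo_call(float S0, float K, float r, float sigma,
                                 float T, int pathsPerThread,
                                 float *partialSums, unsigned long long seed)
{
    int tid = blockIdx.x * blockDim.x + threadIdx.x;
    curandState state;
    curand_init(seed, tid, 0, &state);          // independent RNG stream per thread

    float drift = (r - 0.5f * sigma * sigma) * T;
    float vol   = sigma * sqrtf(T);
    float sum   = 0.0f;
    for (int i = 0; i < pathsPerThread; ++i) {
        float z  = curand_normal(&state);        // standard normal draw
        float ST = S0 * expf(drift + vol * z);   // terminal stock price
        sum += fmaxf(ST - K, 0.0f);              // call payoff
    }
    partialSums[tid] = sum;                      // per-thread partial sum
}

int main()
{
    const int threads = 256, blocks = 64, pathsPerThread = 1000;
    const int n = threads * blocks;
    float *d_sums;
    cudaMalloc(&d_sums, n * sizeof(float));

    monte_carlo_call<<<blocks, threads>>>(100.0f, 100.0f, 0.05f, 0.2f, 1.0f,
                                          pathsPerThread, d_sums, 1234ULL);

    float *h_sums = new float[n];
    cudaMemcpy(h_sums, d_sums, n * sizeof(float), cudaMemcpyDeviceToHost);

    double total = 0.0;
    for (int i = 0; i < n; ++i) total += h_sums[i];
    // Discount the average payoff back to today.
    double price = exp(-0.05 * 1.0) * total / ((double)n * pathsPerThread);
    printf("Estimated call price: %f\n", price);

    cudaFree(d_sums);
    delete[] h_sums;
    return 0;
}
```

Every thread runs independently, so this sort of code scales across thousands of cores with almost no effort-- which is exactly why finance, oil & gas, and bio-medical shops will pay server-class margins for it.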
On the smartphone side, they are pushing towards a high-volume, low-margin business-- their major advantage is graphics, but there are plenty of other companies providing graphics pipelines for handheld devices (e.g. PowerVR, Qualcomm).
On the server side, they have to exploit this market while they are still in the lead. ATI/AMD has more or less abandoned this market for now due to the cash crunch two years ago; they decided not to spend the design effort and die area required for GPGPU computing-- thus leaving the entire market to NVIDIA for now. However, Intel is very interested in pursuing this market. Larrabee was never meant to be a consumer product, but rather a massive floating-point computational array aimed at the enterprise server market. Luckily for NVIDIA, Intel has stumbled for now-- but Intel has a huge advantage over NVIDIA on this front: NVIDIA has to pay TSMC for its wafers, while Intel owns its own fabs.
I highly doubt NVIDIA will ever produce an x86 chip. The x86 decoder is highly inefficient and power hungry. The problem with NVIDIA developing an x86 chip from scratch is the cost and time required. They are still financially quite strong, but not strong enough to develop an x86 chip and have it potentially fail while still pumping research dollars into the rest of the business. Then there is the time: it would take at least 4-6 years to produce a first-generation chip, and in a rapidly changing market that is a very high risk to take. The other major problem with developing an x86 chip from scratch is the lack of engineers to hire-- there aren't that many great CPU architects out there, and the good ones are all hired by IBM, Intel, and AMD. You don't want to hire just any architect, you want to hire the best-- and you have to pay top dollar for that.
If NVIDIA does intend to enter the CPU market, I see them doing it with ARM, with their GPU integrated on the same die-- i.e. a system on chip, so basically ION again. The market has proved that netbooks don't need Windows; they can survive with Linux. And if you can survive with Ubuntu, why do you need x86, when Linux/Ubuntu runs just fine on ARM?
Having said that, will NVIDIA purchase VIA? If they were in a better financial situation with better cash flow, it would be possible-- but right now? I would have to say no. Even though the new GF104 Fermi chips are doing quite well, they need to replenish their war chest and still fund their CUDA business. Plus, the VIA architecture isn't that great or something to be proud of (yes, it's better than Atom, but only because Atom sucks ass and not because VIA is amazing).
I would really like to see NVIDIA move further towards the ARM architecture-- once you have an ARM license, you basically have the entire chip (i.e. no designing from scratch), and from there the possibilities are far broader, built on an architecture that is much less encumbered by legacy functionality.