CPU Core Count Mania. What do you need more cores for?


What is your most important use case(s) for more CPU cores?

  • Gaming

    Votes: 32 25.2%
  • Video Encoding

    Votes: 38 29.9%
  • 3D rendering

    Votes: 10 7.9%
  • Virtualization (VMware and similar)

    Votes: 30 23.6%
  • HPC and Scientific computing

    Votes: 18 14.2%
  • Other (detail below)

    Votes: 18 14.2%
  • Software Compilation

    Votes: 15 11.8%
  • e-peen

    Votes: 13 10.2%
  • I don't need more cores

    Votes: 17 13.4%

  • Total voters
    127

Headfoot

Diamond Member
Feb 28, 2008
4,380
27
126
While I hear this, the trend is towards a hybrid solution whereby you only put what you absolutely need into somebody else's data center: https://www.infoworld.com/article/3...ng-why-some-are-exiting-the-public-cloud.html

This is far more sensible and will eventually hit a tipping point back to private instances. My sentiments, like those of the people who left the cloud, come down to security and a whole host of other problems with leasing someone else's equipment. There are a number of cases where doing so is far more expensive over the lifetime of the equipment than purchasing it yourself. I also see a huge chasm in understanding forming because people choose this EZ-bake-oven approach to performance. A lot of runtime performance in compute solutions has actually gone into the gutter because nobody knows how to deal with the underlying hardware anymore; they're just slapping things into containers or provisioning software.

Hardware is becoming more complex, not less. If I asked whether a server instance in the cloud was using NVMe drives, and if so which version, what do you think the answer would be? Do people know how to configure a server to optimally handle their workload? GPU or CPU? Memory-bound or CPU-bound? Yes, a lot of people love the idea of not having to know any of these details, and they pay for that luxury in many ways. A lot wouldn't know the first thing about distributed computing or load balancing beyond clicking a configuration option in software. This leads to a lot of crappy software and security issues. You can't have your cake and eat it too. I imagine you could cut compute requirements in half if people really understood the principles behind building such software themselves and speccing the hardware to their specific requirements.

Instead, we exist in an age where people would rather run 800 cores' worth of machines at full tilt, brute-forcing solutions with meme learning. There's a significant cost associated with this practice, which is why it's cyclical. Hardware is becoming complicated again and increasing substantially in capability. The mainframe/cloud pendulum goes through a flux during such periods, namely because innovators begin writing new approaches to software now that they have an Amazon AWS rack's worth of computing in their room. New and widely varied hardware starts raining on the scale parade. Enterprise-level features begin trickling down to the consumer.
You can provision instances with SSD vs. without, with GPU compute vs. without; they do give you quite a bit of granularity. At least with Amazon, you can ask people those questions and get specific answers. You pay for complexity and scale whether you own it or Amazon owns it, though; it's just a matter of whether the $$ goes to Amazon or to payroll/tooling (VMware ain't cheap). Increasingly it doesn't make sense for general-purpose software shops to run their own server farms. For some specific uses it would matter, I agree, like video-related services where the exact codecs and the hardware that runs them matter a lot. I first used AWS in 2009, and the amount of new stuff since even then blew me away when I looked into it again in 2017.
 

ub4ty

Senior member
Jun 21, 2017
749
315
96
You can provision instances with SSD vs. without, with GPU compute vs. without; they do give you quite a bit of granularity. At least with Amazon, you can ask people those questions and get specific answers. You pay for complexity and scale whether you own it or Amazon owns it, though; it's just a matter of whether the $$ goes to Amazon or to payroll/tooling (VMware ain't cheap). Increasingly it doesn't make sense for general-purpose software shops to run their own server farms. For some specific uses it would matter, I agree, like video-related services where the exact codecs and the hardware that runs them matter a lot. I first used AWS in 2009, and the amount of new stuff since even then blew me away when I looked into it again in 2017.
The trend is going towards hybrid models (private compute for the serious stuff, with a tightly restricted and specialized uplink to the cloud where consumers need broader access). This is to improve security and cut the insane costs associated with specialized cloud instances. I've worked in this sector for a while, and everything has moved towards generic white-box hardware paired with a similar pool of open-source/free tooling. There really is nothing special about the cloud and its tooling, as it's the same stuff you can download and run in your own desktop computing environment. 8-core Xeons are no longer special. 16-core uber Xeons are no longer special. A person can buy a 16-core processor for their home now and have even more PCIe lanes than some of the Amazon enterprise racks. Now AMD has grown this to 24/32 cores. Meanwhile, you can download the same open-source packages that run their cloud and run your own. Computing goes through cycles: Centralized -> Distributed -> Centralized -> Distributed...
When the computing the big boys have in the enterprise becomes available to the consumer, things change drastically. People became obsessed with moving everything to the cloud and ignored all of the downsides of doing so. There was a huge hardware disparity between enterprise and desktop that allowed this to persist, and Intel ensured it persisted for some time. AMD blew it out of the water. I even hear that PLX PCIe switches will be going mainstream soon: a big differentiator between an enterprise board and a consumer board, and one that has been artificially sustained. "Fabric" is now mainstream. So the question becomes: if I can buy the same junk they use to run your data center off Newegg and download the same free open-source packages they use, why should I pay you perpetually to lease it? What you're seeing more of is people using the cloud for maybe that last mile of service to an end user. The crown jewels are moving back closer to home.
 

PeterScott

Platinum Member
Jul 7, 2017
2,538
125
96
Correct, and I think the survey isn't turning out as expected or desired.
Actually, the big three I wanted to see were Gaming, Video Compression, and 3D Rendering, because I thought Gaming and Video Compression would dominate while 3D Rendering would be far behind, despite it being so popular for benchmarking.

That relationship held and was even stronger than I thought, with 3D rendering being by far the least popular, well below even "I don't need more cores".

The biggest surprise for me was the number of people who chose Virtualization. Again, something I primarily associate with work, where I have used it heavily. I have dabbled in it at home for trying out Linux distros, but it is usually short-lived, back-burnered, then deleted.
 

moinmoin

Senior member
Jun 1, 2017
631
155
96
I privately use virtualization for pretty much everything Windows: being used to software management under Unix, I hate reinstalling Windows and its software whenever changing or switching hardware. VM containers and snapshots help with optimizing and let me ignore the more annoying parts. Everything approaching legacy gets virtualized to reduce future hassle.
 

TheELF

Platinum Member
Dec 22, 2012
2,639
53
106
I privately use virtualization for pretty much everything Windows: being used to software management under Unix, I hate reinstalling Windows and its software whenever changing or switching hardware. VM containers and snapshots help with optimizing and let me ignore the more annoying parts. Everything approaching legacy gets virtualized to reduce future hassle.
So this is more of a serial thing, not a parallel thing that would need lots of cores...
Restoring an image into a VM and running that one image doesn't need tons of cores.
As far as legacy goes, that would be one additional VM running XP (whenever needed); is there anything that needs Win 7 or 8 and doesn't run on Win 10 at all?
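The serial-vs-parallel point above is basically Amdahl's law. A quick sketch (the 20% parallel fraction is a made-up figure for an image-restore style workload, not a measurement):

```python
# Amdahl's law: best-case speedup on n cores when only a fraction p
# of the work can run in parallel.
def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

# A mostly serial task (p = 0.2) barely benefits from more cores:
for cores in (2, 8, 32):
    print(cores, "cores ->", round(amdahl_speedup(0.2, cores), 2), "x")
# 2 cores -> 1.11x, 8 cores -> 1.21x, 32 cores -> 1.24x
```

Even infinite cores cap out at 1/(1-p) = 1.25x here, which is why a single restored VM image doesn't justify a big core count on its own.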
 

moinmoin

Senior member
Jun 1, 2017
631
155
96
So this is more of a serial thing, not a parallel thing that would need lots of cores...
Restoring an image into a VM and running that one image doesn't need tons of cores.
As far as legacy goes, that would be one additional VM running XP (whenever needed); is there anything that needs Win 7 or 8 and doesn't run on Win 10 at all?
You read my motivation the wrong way: I explained why I use virtualization. Essentially I have plenty of old software frozen in its own hassle-free, outdated, isolated environments (again the motivation: why change what works fine, why go through the hassle of updates and re-installations for minor things that aren't always needed).

More cores allow me to concurrently run more such VMs without having to worry about running short on CPU resources.
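The sizing logic behind that is simple division, with an overcommit ratio as the only knob. The VM sizes and ratios below are hypothetical, just to show the shape of it:

```python
# How many VMs fit on a host, given an allowed vCPU:pCPU overcommit ratio.
def max_concurrent_vms(physical_cores, vcpus_per_vm, overcommit=1.0):
    return int(physical_cores * overcommit) // vcpus_per_vm

# 16 physical cores, 4 vCPUs per legacy VM:
print(max_concurrent_vms(16, 4))       # strict 1:1 pinning -> 4 VMs
print(max_concurrent_vms(16, 4, 3.0))  # 3:1 overcommit for mostly idle VMs -> 12 VMs
```

Mostly idle legacy VMs tolerate heavy overcommit, but doubling the physical core count still doubles both numbers, which is the appeal of more cores for this use case.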
 

PeterScott

Platinum Member
Jul 7, 2017
2,538
125
96
I see the innovation in people taking the hardware and applying it in new ways. I don't see it in doing the same ol' crusty things they've been doing since the start of the cloud computing era... Cloud computing, btw, was a big scheme to achieve the holy grail of recurring revenue (aka leasing the same stuff to you over and over, beyond the price it would have cost to purchase it). I hear people say: "But I can run my software on 500 cores in the blink of an eye." Yeah, you also could write better software so you only need 10 cores. The innovation is going to occur with less, not more, IMO. We have some really crappy software out there due to how much compute power we have. We have some really crappy algorithms dominating computing due to insane compute power (meme learning).

Time for a change. The cloud era is over
LOL!

In our constantly connected, living-online world, everything is migrating to the cloud: people are abandoning the ownership model for the service model (Netflix, Spotify) and backing up (or even exclusively keeping) their personal data in the cloud.

That applies to computing resources as well. Cloud computing excels at many big number-crunching jobs. If your CPU load is constant and consistent, then you may be able to justify and amortize a large in-house data-center investment. But like many things, usage may be intermittent or inconsistent, and then it makes more sense to rent cloud computing resources.
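The amortization argument can be sketched as a break-even calculation. Every figure below is a hypothetical placeholder (server price, fixed costs, cloud rate, 36-month service life), not a real quote:

```python
# Monthly hours of use above which owning a server beats renting one,
# amortizing the purchase over an assumed service life.
def breakeven_hours(purchase_cost, monthly_fixed, cloud_rate_per_hour,
                    life_months=36):
    owned_monthly = purchase_cost / life_months + monthly_fixed
    return owned_monthly / cloud_rate_per_hour

# Hypothetical: $20k server, $300/mo power+space, $3.50/hr cloud instance.
print(round(breakeven_hours(20_000, 300, 3.50)))  # -> 244 hours/month
```

Below roughly a third of full-time utilization in this toy example, renting wins; a constant 24/7 load flips the answer.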

OpenAI trains its AI on cloud computing platforms. It recently scored another victory, this time in team DOTA play, training on 128,000 CPU cores and 256 NVidia P100 GPUs via Google Cloud Platform. Last time, for individual play, it trained on 60,000 CPU cores on Azure.

NVidia PR about the HW OpenAI used via Google Cloud Platform to train "OpenAI Five":
https://news.developer.nvidia.com/ai-learns-to-play-dota-2-with-human-precision/

Note they can have massive CPU/GPU resources at their disposal, easily changing the load-out as needed, without having to invest in a massive data center. Usage of large-scale cloud computing is only going to increase for this reason.

On a smaller, more personal scale, all the digital assistants (Google, Siri, Alexa, etc.) are cloud computing resources; practically none of the voice recognition or AI processing is done locally, it is just passed on to the cloud. Rumor is that the next-generation Xbox has two versions: one where your game runs locally, and a cheaper streaming model that just runs games on Microsoft's cloud computing resources.

The cloud is permeating anything and everything, while decentralization is becoming an anachronism.
 

Headfoot

Diamond Member
Feb 28, 2008
4,380
27
126
I don't understand why we're acting like the admin for running your own hardware stack is somehow free. Sure, you lease from Amazon, but it is very expensive to maintain your own datacenter unless you have significant scale. That's why people go to the cloud. Even a 200-person company running a reasonable number of users on a typical, run-of-the-mill e-commerce-style site will still require somewhere between 5 and 10 full-time IT people with real tech chops and experience. Good IT people are worth their weight in gold and get paid as such; even if you say it's 5 full-time IT people, that's $500k in salary, plus roughly half again in benefits and overhead. So all in, $750k. If you're a SaaS startup where the goal is to get to $1M ARR, that's nearly the entire budget. And that's assuming you're on a 100% open-source stack with no licensing, and ignoring the cost of hardware. Even just one IT guy with 10 years' experience at a $120k salary, plus hardware and licensing, will eat a huge percentage of a $1M ARR SaaS shop's budget.

That is why, increasingly, people go cloud. It's more an economic decision than a technical one. If you want a tight, well-managed operations stack, you will pay for it one way or the other.
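The staffing arithmetic works out as follows; headcount and salary follow the post, and the 50% overhead load is the assumption that makes $500k come out to $750k all-in:

```python
# Fully loaded staffing cost: salaries plus a benefits/overhead multiplier.
def all_in_cost(headcount, avg_salary, overhead_rate=0.5):
    return headcount * avg_salary * (1 + overhead_rate)

budget = 1_000_000                   # target ARR for the SaaS startup
cost = all_in_cost(5, 100_000)       # 5 IT staff at $100k each
print(cost, f"{cost / budget:.0%}")  # 750000.0 75%
```

Even the single-admin scenario lands at $180k fully loaded before hardware and licensing, which is the point: ops expertise is a major line item either way.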
 

thecoolnessrune

Diamond Member
Jun 8, 2005
9,283
25
126
I don't understand why we're acting like the admin for running your own hardware stack is somehow free. Sure, you lease from Amazon, but it is very expensive to maintain your own datacenter unless you have significant scale. That's why people go to the cloud. Even a 200-person company running a reasonable number of users on a typical, run-of-the-mill e-commerce-style site will still require somewhere between 5 and 10 full-time IT people with real tech chops and experience. Good IT people are worth their weight in gold and get paid as such; even if you say it's 5 full-time IT people, that's $500k in salary, plus roughly half again in benefits and overhead. So all in, $750k. If you're a SaaS startup where the goal is to get to $1M ARR, that's nearly the entire budget. And that's assuming you're on a 100% open-source stack with no licensing, and ignoring the cost of hardware. Even just one IT guy with 10 years' experience at a $120k salary, plus hardware and licensing, will eat a huge percentage of a $1M ARR SaaS shop's budget.

That is why, increasingly, people go cloud. It's more an economic decision than a technical one. If you want a tight, well-managed operations stack, you will pay for it one way or the other.
I don't disagree that cloud is a major force, and only increasing, but your cloud-or-in-house-IT framing completely ignores MSPs, an industry also seeing growth every year and definitely the cloud's largest competitor (and ally, depending on the market).
 

Headfoot

Diamond Member
Feb 28, 2008
4,380
27
126
Very true, and they overlap a lot too. I've worked with the federal government on some IT projects as a contractor, and they typically need both: an MSP and somewhere to provide the infrastructure. So it's common to run into situations where you're both in the cloud and paying someone to manage that.
 

ub4ty

Senior member
Jun 21, 2017
749
315
96
I don't understand why we're acting like the admin for running your own hardware stack is somehow free. Sure, you lease from Amazon, but it is very expensive to maintain your own datacenter unless you have significant scale. That's why people go to the cloud. Even a 200-person company running a reasonable number of users on a typical, run-of-the-mill e-commerce-style site will still require somewhere between 5 and 10 full-time IT people with real tech chops and experience. Good IT people are worth their weight in gold and get paid as such; even if you say it's 5 full-time IT people, that's $500k in salary, plus roughly half again in benefits and overhead. So all in, $750k. If you're a SaaS startup where the goal is to get to $1M ARR, that's nearly the entire budget. And that's assuming you're on a 100% open-source stack with no licensing, and ignoring the cost of hardware. Even just one IT guy with 10 years' experience at a $120k salary, plus hardware and licensing, will eat a huge percentage of a $1M ARR SaaS shop's budget.

That is why, increasingly, people go cloud. It's more an economic decision than a technical one. If you want a tight, well-managed operations stack, you will pay for it one way or the other.
There is an alphabet soup of solution models that don't revolve around the marketing term known as "the cloud":
managed hosting, co-location, CDN, hybrid cloud.

None of it is free. You need experts to manage it no matter what. The level of expertise required has fallen across the board due to open-source tooling.

The IT savings over the years have come mostly from software tooling, which is largely open source and ubiquitous.
I know why people go to the cloud... I'm reminded of the idiocy surrounding the whole process on a daily basis:
https://www.zdnet.com/article/aws-error-exposed-godaddy-server-secrets/
"From operations as large as GoDaddy and Amazon, to small and medium organizations, anyone who uses cloud technology is subject to the risk of unintentional exposure, if the operational awareness and processes aren't there to catch and fix misconfigurations when they occur."

All is amazing in IT savings, until your bonehead reduced-cost cloud computing group misconfigures the company's crown jewels and exposes customers on the public web.

Nothing is free in life. If you keep trying to cut every little bit of expenditure out of something, eventually it's you who's going to lose, one way or another. Business models don't last forever. Computing trends don't either. Cloud computing is a trend, and it's not a new one: there was a mainframe era in which people rented resources. Then the PC was created and brought the same power to the consumer. Are you too young to remember this? The same thing will happen, and is happening, to "cloud computing". A teenager now has a 16-core machine running VMs and containers, doing renders, machine learning, and video editing...

Gotta get with the times, man. It's like boomer vs. millennial thinking.
You start believing some trend set about in your time is going to last forever. It's not.
Trends last for about 10 years at their peak in tech... then something no one saw coming absolutely destroys them.

As for this constant talk about $$$ and putting people out of jobs through automation: it's simple in my mind. If you don't need an expert or multiple experts who command $100k+ to set up your compute stack, then what exactly are you doing that's worth so much money? If you're able to 'outsource' all of the challenging work, then what value are you providing? America broadly is learning this hard lesson after having outsourced everything that isn't nailed down. One day you wake up and find out that the people you outsourced your stuff to decided, with the expertise they gained, to get directly into your business. There are sound reasons, from a security and preservation standpoint, to keep the gold in your own vault. Hybrid cloud is evolving for this reason; edge services are where you wire into your distributed clients. There are sound reasons why everyone shouldn't idiotically dump highly valuable compute tasks into a handful of service providers. While the data centers might be distributed, the hardware/software stack is all the same, for 'good' reasons. God help everyone when a hardware vulnerability exposes such a gigantic homogeneous operation... Oh wait.

Good IT people are worth their weight in gold and get paid as such; even if you say it's 5 full-time IT people, that's $500k in salary, plus roughly half again in benefits and overhead. So all in, $750k. If you're a SaaS startup where the goal is to get to $1M ARR, that's nearly the entire budget. And that's assuming you're on a 100% open-source stack with no licensing, and ignoring the cost of hardware. Even just one IT guy with 10 years' experience at a $120k salary, plus hardware and licensing, will eat a huge percentage of a $1M ARR SaaS shop's budget.
A $1M ARR SaaS startup... I'd hope the founders/staff have full-stack capability and an understanding of how to set up SaaS if that's your annual revenue.
In a company that small, your founders and engineers had better be IT experts as well. Setting up SaaS in the cloud still commands a serious amount of technical know-how, and compensation to go along with it. That being said, I know a number of teenagers who do it daily...
Again, the tooling has cut the majority of IT costs and expertise. Eventually a generation of kids comes to understand and integrate the things a previous generation of adults made six figures for.
The 'cloud' is built upon software tooling that evolved in an open-source environment.
This is where the savings came from.

As for the hardware: this is how the cloud provider makes money. You're paying for the hardware over and over again at leased pricing.

You're talking as if there were no substantial profits in cloud computing. The profits come from you.

We're not in the '90s.
Get with the times. People are computing atop open-source VM/container/load-balancing stacks in their homes. There's nothing substantially different happening in the cloud. Ubiquitous fiber to the home is coming, and so is 5G.

The cloud for this reason has become a meme. 5G hits and there are super-high-speed access points everywhere. Then what? You think the cloud model will remain the same?
If you want to start a successful tech startup, you'd better have some technical competency yourself. The age of MBAs slave-driving engineers ended some time ago. Nowadays a successful founding team is technical and can wear multiple hats. Business-related ops are handled by open-source software and SaaS. Kids are growing up with VMs, containerization, load balancing, and high-level languages that let them configure hardware resources with ease, so many of them can do the IT work themselves; it's like second nature. When they come of age, they'll laugh at the idea of leasing something from someone that they can run themselves, reliably wired up to the internet. The complexity of it has decreased substantially. Everyone's using the same open-source packages. So, in the end, you're just paying for hardware over and over, which is how cloud providers get rich.

I work in the industry. I know the business model very well.
I'm a chef who doesn't eat his own cooking because I know what's in it.
I'm not an outsider with no clue about trends in enterprise computing.

It's the reversal of heavily invested trends that scares the @#*! out of common thinking.
Reversals make me salivate; the fewer people who see them coming, the better.
I'm speaking about something beyond current trends.
Don't confuse this with a lack of understanding of what is currently trendy.

I don't understand why we're acting like the admin for running your own hardware stack is somehow free.
Oh... and it is free, because I have the expertise and know-how to do it, and it has become greatly simplified via the open-source software tooling that is the very reason Amazon can do cloud computing at scale. Software's doing all the complex work. When teenagers grow up with these skills, you'd better start looking for a new job.
 

dark zero

Platinum Member
Jun 2, 2015
2,493
2
61
Compiling: the more cores, the better.
Virtual machines: more cores are needed for them.
 

TheELF

Platinum Member
Dec 22, 2012
2,639
53
106
There is an alphabet soup of solution models that don't revolve around the marketing term known as "the cloud":
managed hosting, co-location, CDN, hybrid cloud.

None of it is free. You need experts to manage it no matter what. The level of expertise required has fallen across the board due to open-source tooling.

The IT savings over the years have come mostly from software tooling, which is largely open source and ubiquitous.
I know why people go to the cloud... I'm reminded of the idiocy surrounding the whole process on a daily basis:
https://www.zdnet.com/article/aws-error-exposed-godaddy-server-secrets/
"From operations as large as GoDaddy and Amazon, to small and medium organizations, anyone who uses cloud technology is subject to the risk of unintentional exposure, if the operational awareness and processes aren't there to catch and fix misconfigurations when they occur."

All is amazing in IT savings, until your bonehead reduced-cost cloud computing group misconfigures the company's crown jewels and exposes customers on the public web.

Nothing is free in life. If you keep trying to cut every little bit of expenditure out of something, eventually it's you who's going to lose, one way or another. Business models don't last forever. Computing trends don't either. Cloud computing is a trend, and it's not a new one: there was a mainframe era in which people rented resources. Then the PC was created and brought the same power to the consumer. Are you too young to remember this? The same thing will happen, and is happening, to "cloud computing". A teenager now has a 16-core machine running VMs and containers, doing renders, machine learning, and video editing...

Gotta get with the times, man. It's like boomer vs. millennial thinking.
You start believing some trend set about in your time is going to last forever. It's not.
Trends last for about 10 years at their peak in tech... then something no one saw coming absolutely destroys them.

As for this constant talk about $$$ and putting people out of jobs through automation: it's simple in my mind. If you don't need an expert or multiple experts who command $100k+ to set up your compute stack, then what exactly are you doing that's worth so much money? If you're able to 'outsource' all of the challenging work, then what value are you providing? America broadly is learning this hard lesson after having outsourced everything that isn't nailed down. One day you wake up and find out that the people you outsourced your stuff to decided, with the expertise they gained, to get directly into your business. There are sound reasons, from a security and preservation standpoint, to keep the gold in your own vault. Hybrid cloud is evolving for this reason; edge services are where you wire into your distributed clients. There are sound reasons why everyone shouldn't idiotically dump highly valuable compute tasks into a handful of service providers. While the data centers might be distributed, the hardware/software stack is all the same, for 'good' reasons. God help everyone when a hardware vulnerability exposes such a gigantic homogeneous operation... Oh wait.


A $1M ARR SaaS startup... I'd hope the founders/staff have full-stack capability and an understanding of how to set up SaaS if that's your annual revenue.
In a company that small, your founders and engineers had better be IT experts as well. Setting up SaaS in the cloud still commands a serious amount of technical know-how, and compensation to go along with it. That being said, I know a number of teenagers who do it daily...
Again, the tooling has cut the majority of IT costs and expertise. Eventually a generation of kids comes to understand and integrate the things a previous generation of adults made six figures for.
The 'cloud' is built upon software tooling that evolved in an open-source environment.
This is where the savings came from.

As for the hardware: this is how the cloud provider makes money. You're paying for the hardware over and over again at leased pricing.

You're talking as if there were no substantial profits in cloud computing. The profits come from you.

We're not in the '90s.
Get with the times. People are computing atop open-source VM/container/load-balancing stacks in their homes. There's nothing substantially different happening in the cloud. Ubiquitous fiber to the home is coming, and so is 5G.

The cloud for this reason has become a meme. 5G hits and there are super-high-speed access points everywhere. Then what? You think the cloud model will remain the same?
If you want to start a successful tech startup, you'd better have some technical competency yourself. The age of MBAs slave-driving engineers ended some time ago. Nowadays a successful founding team is technical and can wear multiple hats. Business-related ops are handled by open-source software and SaaS. Kids are growing up with VMs, containerization, load balancing, and high-level languages that let them configure hardware resources with ease, so many of them can do the IT work themselves; it's like second nature. When they come of age, they'll laugh at the idea of leasing something from someone that they can run themselves, reliably wired up to the internet. The complexity of it has decreased substantially. Everyone's using the same open-source packages. So, in the end, you're just paying for hardware over and over, which is how cloud providers get rich.

I work in the industry. I know the business model very well.
I'm a chef who doesn't eat his own cooking because I know what's in it.
I'm not an outsider with no clue about trends in enterprise computing.

It's the reversal of heavily invested trends that scares the @#*! out of common thinking.
Reversals make me salivate; the fewer people who see them coming, the better.
I'm speaking about something beyond current trends.
Don't confuse this with a lack of understanding of what is currently trendy.


Oh... and it is free, because I have the expertise and know-how to do it, and it has become greatly simplified via the open-source software tooling that is the very reason Amazon can do cloud computing at scale. Software's doing all the complex work. When teenagers grow up with these skills, you'd better start looking for a new job.
So what's the difference?
If you need a server "room" (flat, building), you are paying for the system over and over again: you pay more rent, you need new machines every few years, you pay for power every month, and you pay for cooling over and over. Then at some point you want to expand, and you have to pay for a whole new building because you can't fit enough compute into your existing rooms.
Then if you have some downtime, you pay all of this for no return at all.
It's all a game of costs, like everything else; if the cloud is cheaper, anybody will use the cloud.
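Those recurring costs can be folded into one rough per-useful-hour number; the catch on the ownership side is utilization, since rent, power, and amortization accrue whether the machines are busy or idle. Every figure below is hypothetical:

```python
# Effective cost per *useful* hour of owned hardware: fixed monthly costs
# divided by the hours the gear actually does paying work.
def onprem_cost_per_hour(capex, life_months, rent, power_cooling,
                         utilization=0.4):
    monthly = capex / life_months + rent + power_cooling
    useful_hours = 730 * utilization     # ~730 hours in a month
    return monthly / useful_hours

# Hypothetical: $15k of gear over 36 months, $200 rent share, $150 power/cooling.
print(round(onprem_cost_per_hour(15_000, 36, 200, 150), 2))       # 2.63 at 40% busy
print(round(onprem_cost_per_hour(15_000, 36, 200, 150, 0.9), 2))  # 1.17 near-constant load
```

Which side wins is exactly this game of costs: near-constant load favors owning, spiky load favors renting.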


I'm a chef who doesn't eat his own cooking because I know what's in it.
Yeah, maybe that's the problem here: you being that kind of chef, unable to see anybody else's view.
 

ub4ty

Senior member
Jun 21, 2017
749
315
96
So what's the difference?
If you need a server "room" (flat, building), you are paying for the system over and over again: you pay more rent, you need new machines every few years, you pay for power every month, and you pay for cooling over and over. Then at some point you want to expand, and you have to pay for a whole new building because you can't fit enough compute into your existing rooms.
Then if you have some downtime, you pay all of this for no return at all.
It's all a game of costs, like everything else; if the cloud is cheaper, anybody will use the cloud.
What's the difference? The difference is a large amount of $$ for things that variably people do or don't need. The difference is calculated, published, and known among consultants who help guide people on making these decisions. Thus why, no one with serious understanding of things, refers to the whole ecosystem as the cloud. "The cloud" is a meme. Highly paid consultants and specialists guide them through the many different possibilities...
Will you have a ton of egress traffic?
Do you really need high uptime?
Can you deal with variable completion time?


There was a dearth of small business for some time. Now you have an explosion via 'the gig economy' and enablement platforms. More and more people are doing video production, hosting their own media content channels, and so on. A lot of people do this from home using hardware that used to be available only in the cloud. A particular company's closed ecosystem is a data center. Producers connect to such a platform to reach people through a company's proprietary platform. Leasing isolated compute capacity for your personal use is a completely different use case and warrants being distinguished.

So, for all intents and purposes, the cloud is a meme. Most people aren't running supercomputer workloads. Most people are connecting to a proprietary platform. A modern 8-core desktop processor costing a couple hundred dollars outperforms a Xeon from some years ago that cost thousands. When it is possible to go to Walmart and buy a computer for $500 to solve my use case, why should someone lease the resource from you? If I have fiber with an LTE backup (headed towards ubiquitous 5G) to the home, why do I need your silly enterprise connectivity solution?

When the average person has an enterprise level of compute in their desktop, why do they need your meme cloud? Backup? My 70-year-old granny clicks a button if she wants a backup done; otherwise it is automated. Uptime? UPS and redundancy are available to consumers. Worst-case scenario? I fire up a remote backup instance in someone else's data center in an emergency.

Yeah maybe that's the problem here,you being that kind of cook being unable to see anybody else's view.
Or maybe you're looking at things from the outside, or at a narrow slice of the picture. On the ground, everyone's talking about the opposite of what the trend seems to be to an outsider... and they have been for years.

I've detailed and explained myself. If you don't grasp what I'm saying or laying out, that's too bad.
When the change comes, there will be people caught with their pants down, and that's good for the people (the cooks) who saw the future before they did.

No trend lasts forever in computing, most certainly not one dependent on hardware disparities. If you argue otherwise, it's time to question who is who.
 
Last edited:

TheELF

Platinum Member
Dec 22, 2012
2,639
53
106
Lol, you were ranting about how paying for the cloud is stoopid because you pay for the same thing over and over again; I responded that you do the same if you pay for real systems in your own space, and now you rant about how the cloud is a scam.
Yeah, I know that people create more and more media at home, because I am one of them.
I hit 44 FPS converting to 1080p/60fps/x265 and about 400 FPS converting to x264 with a cheap-ass 1050 Ti; try matching that with your $500 worth of 8 cores from Walmart.
A CPU whose only highlight is good video rendering speed is doomed in today's market. Everybody has been going hardware-accelerated for years now; you are trailing years behind the current trends.
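As a sanity check, those throughput claims can be turned into wall-clock encode times. The 44 FPS and 400 FPS figures are the ones quoted above; the software-encode figure is an invented stand-in for comparison:

```python
# Convert encode throughput (frames per second) into wall-clock minutes
# for a one-hour, 60 fps source clip. The GPU figures are the ones from
# the post; the software-x265 figure is a hypothetical stand-in.

SOURCE_FPS = 60
CLIP_SECONDS = 60 * 60               # one hour of footage
FRAMES = SOURCE_FPS * CLIP_SECONDS   # 216,000 frames

def encode_minutes(throughput_fps):
    """Wall-clock minutes to encode the whole clip at a given rate."""
    return FRAMES / throughput_fps / 60

print(f"GPU x265 (44 fps):  {encode_minutes(44):.0f} min")
print(f"GPU x264 (400 fps): {encode_minutes(400):.0f} min")
print(f"CPU x265 (10 fps, hypothetical): {encode_minutes(10):.0f} min")
```

At the quoted rates, an hour of 60 fps footage takes roughly 82 minutes in x265 and 9 minutes in x264 on the GPU; a software encoder at a hypothetical 10 FPS would need six hours.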
 

ub4ty

Senior member
Jun 21, 2017
749
315
96
Lol, you were ranting about how paying for the cloud is stoopid because you pay for the same thing over and over again; I responded that you do the same if you pay for real systems in your own space, and now you rant about how the cloud is a scam.
Yeah, I know that people create more and more media at home, because I am one of them.
I hit 44 FPS converting to 1080p/60fps/x265 and about 400 FPS converting to x264 with a cheap-ass 1050 Ti; try matching that with your $500 worth of 8 cores from Walmart.
A CPU whose only highlight is good video rendering speed is doomed in today's market. Everybody has been going hardware-accelerated for years now; you are trailing years behind the current trends.
I am speaking about industry knowledge and perspective derived from actually working in this domain.
That you keep claiming I'm ranting tells me more about you than about me.

As for my home, I have a range of enterprise-grade and consumer hardware up to the HEDT level, linked with consumer/enterprise-grade solutions, which relates to some of the work I do.

I would ask what perspective you're speaking from, but I've already derived a bit of it.

...going hardware accelerated for years now - The only sound thing stated.

Yes, computing requirements have been shrinking due to hardware acceleration. When there are big bumps in computing at the desktop level, enterprise suffers, not flourishes. Cloud computing evolved in a cyclically stagnant period for the desktop and domestic telecom. The future is an active desktop computing market, hardware acceleration, and boons in domestic telecom.

Where your perspective lies on this, and whether and how deeply you are personally involved in this domain, will determine whether you flourish or go bankrupt. To an outsider, such conversation seems like a cook-level rant... It's simply beyond your purview, and I'm not going to argue endlessly with someone who has no clue what I'm talking about.
 
Last edited:

ub4ty

Senior member
Jun 21, 2017
749
315
96
Jensen is currently on stage finally putting to rest the CPU meme rendering paradigm and beyond.
The hardware era is alive. Competition will be coming from all sides. What a time to be alive



Cut my life into pieces... this is the hardware era
 
Last edited:

BigDH01

Golden Member
Jul 8, 2005
1,633
0
91
When computing that the big boys have in the enterprise becomes available to the consumer, things change drastically. People became obsessed with moving everything to the cloud and ignored all of the downsides to doing so.
I don't think anyone *ignored* the downsides but the business case makes sense. And I say that as someone who moved a pretty security-sensitive industry *to the cloud*.

So, the question becomes, if I can buy the same junk they use to run your data center off newegg and download the same free open source packages they use why should I pay you perpetually to lease it? What you're seeing more of is people using the cloud for maybe that last mile of service to an end user. The crown jewels are moving back closer to home.
Because what I can't buy on Newegg is the knowledge and expertise to properly manage a NoSQL distributed database that seamlessly grows with my usage. I can't cheaply buy the kind of expertise required to replicate my SQL database across 3 different isolated zones with low-latency read replicas that transparently scales up to 64 TB. How about limitless object storage that easily integrates with Spark and Presto to run big-data analytics? Or how about a nearly limitless at-least-once message delivery system? Don't forget that this is all backed with KMS encryption. I've had to set up local analogs for nearly *all* of these services (Cassandra -> DynamoDB, MySQL -> Aurora, HDFS -> S3, Presto -> Athena, Spark -> Glue, NATS -> SQS, Kafka -> Kinesis) and I'm sure as shit glad we can outsource the required expertise so the most expensive resource (development) can concentrate on developing. And I'd still rather use Lambda or Elastic Beanstalk for most compute scaling as opposed to trying to set up Kubernetes. Hell, you could damn near get a startup up and running (at least through proof of concept) using AWS free tiers (1 million requests per month on Lambda, 200 million requests per month on Dynamo, 1 million requests per month on SQS, 5 GB of S3 storage). It's crazy.
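Those free-tier numbers can be framed as a monthly request budget. A rough sizing sketch, where the tier limits are the figures quoted in the post and the workload shape (fan-out per request) is entirely invented:

```python
# Check whether a hypothetical proof-of-concept workload fits inside the
# free-tier limits quoted in the post. The workload numbers are invented.

FREE_TIER = {                     # requests per month, as quoted above
    "lambda": 1_000_000,
    "dynamodb": 200_000_000,
    "sqs": 1_000_000,
}

def fits_free_tier(daily_requests, dynamo_calls_per_request=3,
                   queue_msgs_per_request=1):
    """Each front-end request fans out into one Lambda invocation, a few
    DynamoDB calls, and some queue messages; return per-service fit."""
    monthly = daily_requests * 30
    usage = {
        "lambda": monthly,
        "dynamodb": monthly * dynamo_calls_per_request,
        "sqs": monthly * queue_msgs_per_request,
    }
    return {svc: usage[svc] <= FREE_TIER[svc] for svc in usage}

print(fits_free_tier(10_000))   # ~300k requests/month: fits everywhere
print(fits_free_tier(50_000))   # ~1.5M/month: Lambda and SQS exceed the tier
```

Under these made-up assumptions a proof of concept serving ~10k requests a day stays inside every quoted limit, which is the "damn near free startup" point above.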
 

ub4ty

Senior member
Jun 21, 2017
749
315
96
I don't think anyone *ignored* the downsides but the business case makes sense. And I say that as someone who moved a pretty security-sensitive industry *to the cloud*.
I am well aware of the big contract Amazon secured regarding *a pretty security sensitive* industry. I am also aware of enough about *security classifications* and the subsequent requirements to know that it's nothing like their public data centers. Do you agree or disagree? As for downsides and cost, that's evaluated on a per-use-case and per-customer basis. Nvidia just cut a 144 kW server garden down to a single 13 kW rack install. This was just announced moments ago. In a hardware boom, things change, and they change quickly. Yesterday's mainframe becomes tomorrow's iWatch.

Hardware booms are cyclical. We were in a lull. The cloud computing model developed in it. Now we're in a hardware boom. Yet I see a lot of 'late entrants' to the cloud. You used to need a storage SAN for 8 TB; now you can run out and buy one at Best Buy for $130.

Because what I can't buy on Newegg is the knowledge and expertise to properly manage a nosql distributed database that seamlessly grows with my usage. I can't cheaply buy the kind of expertise required to replicate my sql database across 3 different isolated zones with low latency read replicas that transparently scales up to 64 TBs. How about limitless object storage with integration that easily integrates with spark and presto to run big data analytics? Or how about a nearly limitless at-least-once message delivery system? Don't forget that this is all backed with KMS encryption.
And what do the trends of computing have to say about this? What happened to the IT staff of the past? What do modern data center footprints look like? A bunch of empty floor tiles with a quarter of the racks the data center previously had, with the dusty footprint remnants to remind you of the fact. Software stacks are an absolute mess, tbqh. The many ridiculous high-level languages, database technologies, middleware renamings, and distributed computing software packages are indicative of a software explosion cycle. Eventually it gets consolidated and standardized. Web 3.0 hasn't hit yet. What you have currently is a Frankenstein mess of software.

They get sorted and consolidated in cycles too. The SME you previously needed to do all of this gets packaged into a software helper app. You fire the IT guys of the previous era with automation and consolidate an industry into the cloud... Then some enterprising person comes along and disrupts your cloud business model. Nothing stays still in computing. There's not a single thing you can list that can't be automated or simplified. Mainframes used to rule, and leasing them was the route people had to take. Then the PC era hit... then the web era... then the mainframe era again...

People believe this, but the reality is that there's a big arrow drawn from Wave 3 back to Wave 1, and it's cyclical. We're in Wave 3 and have been for a while... The next wave is ________.


I've had to set up local analogs for nearly *all* of these services (Cassandra -> DynamoDB, MySQL -> Aurora, HDFS -> S3, Presto -> Athena, Spark -> Glue, NATS -> SQS, Kafka -> Kinesis) and I'm sure as shit glad we can outsource the required expertise so the most expensive resource (development) can concentrate on developing. And I'd still rather use Lambda or Elastic Beanstalk for most compute scaling as opposed to trying to set up Kubernetes. Hell, you could damn near get a startup up and running (at least through proof of concept) using AWS free tiers (1 million requests per month on Lambda, 200 million requests per month on Dynamo, 1 million requests per month on SQS, 5 GB of S3 storage). It's crazy.
Here's the thing, and I'll be quite honest and frank with you. A good ton of the garbage (not referring to you but to the overall industry) that runs atop these elaborate installs isn't worth the electrons it rides atop. What you just described is the reason there are so many trash startups: because there is little to no barrier to scaling operations. This is not a new phenomenon. It has been going on for some time, and just about everyone in Silicon Valley is wondering when it's going to crash. When does it all come crashing down in tech? When there's finally sound technology and use cases for compute, and when compute at the desktop level begins matching enterprise. It ends up being a wake-up moment where people realize what's of value and what's not.

Big Data / cloud computing have enjoyed almost a decade of success. It's long in the tooth and ripe for disruption. I'm not making my commentary against what is today; I'm highlighting what is to come... And if it hasn't been clear yet: a mass of fools always rushes in at the end of a cycle, which is what makes the crashes so significant. Data is being replaced by intelligence. Consumer hardware is becoming on par with enterprise. The unintelligent Big Data/cloud era is over.
 
Last edited:

BigDH01

Golden Member
Jul 8, 2005
1,633
0
91
I am well aware of the big contract Amazon secured regarding *a pretty security sensitive* industry. I am also aware of enough about *security classifications* and the subsequent requirements to know that it's nothing like their public data centers. Do you agree or disagree?
Not sure what requirements that contract entailed, but we just got FedRAMP running on AWS (which is just our latest of many). Not sure specifically what your qualms are.

becomes tomorrow's iWatch. Hardware booms are cyclical. We were in a lull. The cloud computing model developed in it. Now we're in a hardware boom. Yet I see a lot of 'late entrants' to the cloud. You used to need a storage SAN for 8 TB; now you can run out and buy one at Best Buy for $130.
Does that include the hardware RAID 60 card I need, with enterprise-level drives? I've been out of that space for a while and really only ever did it for shits and giggles, but even relatively recently it was still a PITA to replace a dead drive, and there was a definite pucker factor while the array was rebuilding.
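For reference, the capacity and fault-tolerance math behind RAID 60 is simple: it stripes (RAID 0) across RAID 6 groups, and each group gives up two drives' worth of capacity to parity while tolerating two failed drives. A small sketch, with drive counts and sizes as hypothetical examples:

```python
# RAID 60 = RAID 0 stripe across RAID 6 groups. Each RAID 6 group loses
# two drives' capacity to parity and survives two failed drives.
# Drive counts and sizes below are hypothetical examples.

def raid60(groups, drives_per_group, drive_tb):
    """Return (usable TB, worst-case failures survived, best case)."""
    if drives_per_group < 4:
        raise ValueError("RAID 6 needs at least 4 drives per group")
    usable_tb = groups * (drives_per_group - 2) * drive_tb
    # Worst case: both tolerated failures land in the same group.
    min_failures = 2
    # Best case: two failures absorbed in every group.
    max_failures = 2 * groups
    return usable_tb, min_failures, max_failures

usable, worst, best = raid60(groups=2, drives_per_group=8, drive_tb=4)
print(usable, worst, best)   # 48 TB usable; survives 2-4 drive failures
```

Rebuild risk is the "pucker factor" above: while a group rebuilds it is running with reduced parity, which is why a third failure in the same group during the rebuild window loses the array.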

And what do the trends of computing have to say about this? What happened to IT staff of the past? What do modern data center foot prints look like? A bunch of empty floor tiles with a 1/4th the racks the data center previously had with the dusty footprint remnants to remind you of the fact. Software stacks are an absolute mess tbqh. The many ridiculous high level languages, database technologies, middleware renaming, and distributed computing software packages is indicative of an software explosion cycle. Eventually it gets consolidated and standardized. Web 3.0 hasn't hit yet. What you have currently is a frankenstein mess of software.
That's kind of my point. Google, Azure, AWS are standardizing a lot of this tech by making managed versions of them available in the cloud. This means that random IT admin 1 doesn't have to tune JVM garbage collection to get maximum performance out of Cassandra. This is one of the reasons going to the cloud is an advantage.

Here's the thing and I'll be quite honest and frank with you. A good ton of the garbage (not referring to you but the overall industry) that runs atop these elaborate installs isn't worth the electrons it rides atop. What you just defined was the reason why there are so many trash startups : Because there is little to no barrier to scaling operations.
And with that, I stop reading. I'm not sure why you think eliminating barriers to scaling is a *bad* thing but there's little point in discussing further.
 

ub4ty

Senior member
Jun 21, 2017
749
315
96
Not sure what requirements that contract entailed, but we just got FedRAMP running on AWS (which is just our latest of many). Not sure specifically what your qualms are.
The qualms are typically the things people ignore and claim to know nothing about until there's an oh-shit moment... like a high-level data breach that exposes your whole 'cloud', or worse.


Does that include the hardware RAID 60 card I need with enterprise level drives? I've been out of that space for awhile and really only ever did it for shits and giggles, but even relatively recently it was still a PITA to replace a dead drive and there was a definite pucker factor while the array was rebuilding.
People play with such configs on /r/datahoarder. Again, it's the kiddies tinkering and playing with things in the background who come of age and disrupt business models.

That's kind of my point. Google, Azure, AWS are standardizing a lot of this tech by making managed versions of them available in the cloud. This means that random IT admin 1 doesn't have to tune JVM garbage collection to get maximum performance out of Cassandra. This is one of the reasons going to the cloud is an advantage.
The open source community did, and groups like the aforementioned participated, steered, and helped in various ways. The government also has several initiatives that will likely become standards, much like the creation of the internet itself, and like with self-driving cars, etc. You no longer need a highly specialized IT admin, and you eventually won't need a cloud instance.

And with that, I stop reading.
And that's where my work begins.
I'm not sure why you think eliminating barriers to scaling is a *bad* thing but there's little point in discussing further.
Eliminating barriers isn't a bad thing. I never said this, so don't put words in my mouth. It's the natural order of technology, which I detailed and which you seem to disagree with, as you feel the future is the cloud. When there is no barrier to entry between the cloud hardware and the home, what becomes of the cloud? Eliminated.

What is observable all throughout the universe is that when there are no barriers/filters, large volumes of junk arrive. Eventually there is a consolidation and cleansing. In the history of tech, there was the dot-com bubble: https://en.wikipedia.org/wiki/Dot-com_bubble. Silicon Valley was a ghost town when it hit. Companies that were claimed to be the future of tech were shuttered, with Aeron chairs stacked in bins outside.

Just about everyone in Silicon Valley, including those who work at all of the companies you just mentioned (including the cloud service providers), has been awaiting the next big crash, and it centers on Cloud/Big Data and the loads of 'low barrier' startups built atop it. Why? Because lots of junk arrived with there being little to no barrier to entry, and that only has a certain lifespan. People revel in what happened to the IT industry but don't seem to realize this cycle visits all sectors of tech.

Catch everyone on the flip side of this. I've seen what I had to on the hardware side. What I've been waiting for has arrived.

Oh and in the history of my posts, you'll find a gem about what will eventually befall memeCoin even as everyone touted it as the future. There's so much more to come
 
Mar 11, 2004
17,598
152
126
Wait, are people seriously arguing that average consumers are now going to be installing and running servers as cloud setups just because they can get a lot more cores? I would actually love it (especially as part of an attempt to subvert commercial network setups as they are now, say in an effort to create an open wireless mesh network), but sorry, I just don't see that at all.

Plus, it's not like the cloud companies aren't getting the exact same hardware shift, and they have the money to be able to transition ASAP. Average people are not going to drop $5000 on a home server setup, especially one that they have to manage themselves. Even with tools that make it muuuch easier than ever before, it's just not gonna happen. Cloud companies aren't going to just buy new stuff and reduce their footprint; they're going to fill that footprint, and that will drive costs down further.

If anything, I think we're transitioning even more quickly to the commercial cloud. People have shifted to streaming music and video. They're not far from shifting to streaming gaming (they're already halfway there). Content is shifting that direction more and more, and it won't be long before people won't have the option of owning a physical piece of media. Not that they'll care, as access to services will be cheap enough that they'll be happy to give it up, especially since it'll be like getting access to a much larger library.

I don't know whether to laugh at this impending doom/crash prediction either (if they'd really been waiting for and/or predicting it, they'd be speaking with their money...). The dot-com bubble didn't exactly make things a ghost town. In fact, it pretty much did exactly what you're arguing against: it led to the cloud being set up, as the giants rose from the early situation. That you think there's some natural evolution back to the simpler setup, when the giants are leaner and more efficient, baffles me to no end.

The whole argument is just absurd. None of the hardware companies have been doing this as an attempt at wooing consumers back; they've been blatantly targeting the major players and their cloud setups. The hardware being cheaper enables some fun things, but average consumers don't care about that at all. Most couldn't even set up network management of an iOS device. That you think they're going to be setting up and installing their own home servers and managing personal clouds is hilariously out of touch with average society. You've been spending way too much time on niche boards.

Another aspect that you're ignoring is that average people want access to the wider data that stuff like social media has. You cannot move that to your personal cloud setup.

And while this open collaboration is a nice idea, it's up against much more than the cost of hardware. The FCC will likely clamp down hard on people trying to set up an open wireless network. Even data scientists are going "WTF?" at the insistence that blockchain is going to solve any of the things it claims it will, and the decentralized nature has already shown serious problems (and if you think it enhances security then... just wow).
 
Last edited:

CHADBOGA

Golden Member
Mar 31, 2009
1,763
24
126
If Pepe is against the Cloud, then that's good enough for me.
 

TheGiant

Senior member
Jun 12, 2017
261
16
76
Another discussion where we think the average user needs to understand how a computer works before using it...

Reminds me of why Unix lost the desktop...
 

