Another example of "Cloud computing" bringing us back to the 70s

Hugo Drax

Diamond Member
Nov 20, 2011
5,647
47
91
1970s-style computing, that is. Back then, before newfangled personal computing made its impact, people had to work on dumb terminals attached to "the computer".

In the 1970s these dumb terminals, e.g. Hazeltines or ASR-33s, were useless without a working connection to the computer.


Then the newfangled PC came along and the world was freed from the umbilical cord: people could actually compute at home, save files at home, and so on.


Now marketing firms and big corporations have figured out that Millennials have no clue about the old history of the dark ages of computing, when working on your data required centralized storage and CPU.

These firms now call it "Cloud computing" and the masses are biting to get into this new cloud as quickly as possible.


Well, the first victims have suffered a bad outage: the "Adobe Creative Cloud" people.

http://arstechnica.com/information-...loud-more-than-a-day-old-locks-out-app-users/

Unable to work on their data or even run the software they purchased, because the umbilical cord broke for 24 hours.

This brings flashbacks of the '70s: people staring at blank Hazeltines and silent ASR-33s while everyone waited for the computer priesthood to bring the beast back to life.

Welcome, Millennials, to the 1970s. Put on your bellbottoms and sideburns and crack open a PBR while you sit on your hands, watching the deadline pass on that critical work that was due yesterday :)
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
We're going BYOD and VDI with everything we can at work. They want to save money on hardware purchases and support, so they're spending crazy money on servers that always choke and need more drive I/O or even more racks (you can buy a lot of laptops for the cost of that hardware). We also have outages from time to time, sometimes lasting several hours, during which no one can do anything.

Recently I traveled for a sales meeting: some ~400 sales people all at one hotel, joining the network and connecting. The hotel's bandwidth choked quite often. Half our sales people sell in hospitals and now have to be connected to show our products. As you can imagine, cell coverage can be quite spotty in a large concrete building. Most of these people are not really tech savvy and need to connect to the hospital's network, then hope they can get through to their needed system.

I'm not sold on it.
 

brianmanahan

Lifer
Sep 2, 2006
24,572
5,979
136
i predict we will switch back and forth from centralized computing to distributed computing every 20 years or so. throw buzzwords in there so the business people will buy it, and we will keep the rework coming long enough to make all of us rich! :awe:
 

Hugo Drax

Diamond Member
Nov 20, 2011
5,647
47
91
brianmanahan said:
i predict we will switch back and forth from centralized computing to distributed computing every 20 years or so. throw buzzwords in there so the business people will buy it, and we will keep the rework coming long enough to make all of us rich! :awe:

Sounds about right. In 20 years someone will discover the amazing power of distributed computing.

They will call it "Galactic computing".

"Get beyond the cloud with the power of galactic computing"
 

_Rick_

Diamond Member
Apr 20, 2012
3,951
70
91
There's a simple economic analysis behind it.
In the '70s, computers were expensive, so personal computers didn't make any economic sense.
In the '90s, personal computers were cheap, while server/mainframe hardware remained expensive, especially compared to the performance a desktop user needed.
In the '10s, powerful and efficient multicore CPUs and ubiquitous high-speed/low-latency interconnects make server-based computation more interesting, especially since low-end hardware is available cheaply and in bulk to build low-cost terminals.

Virtualised desktops are now becoming cost effective for most large organizations, since the available computer power of a single CPU is enough to serve a dozen users who only do terminal work. This makes centralization cost effective again, while previously getting that kind of power centralized was too expensive.

Internet based cloud-computing on the other hand makes much less sense in most cases, because the interconnect sucks.
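The per-seat reasoning above can be sketched as back-of-envelope arithmetic. All the dollar figures below are hypothetical placeholders, not real pricing; the point is only the shape of the calculation (one server amortized over a dozen terminal users vs. one desktop per user):

```python
# Back-of-envelope per-seat cost comparison: shared VDI server vs.
# individual desktops. All prices are hypothetical placeholders.

def per_seat_cost(hardware_cost, seats):
    """Hardware cost amortized across the users it serves."""
    return hardware_cost / seats

desktop = per_seat_cost(800, 1)        # one cheap desktop per user
vdi_server = per_seat_cost(6000, 12)   # one server CPU serving a dozen terminal users
thin_client = per_seat_cost(150, 1)    # low-end terminal hardware, bought in bulk

print(f"desktop per seat: ${desktop:.0f}")                   # $800
print(f"VDI per seat:     ${vdi_server + thin_client:.0f}")  # $650
```

With these made-up numbers the VDI seat comes out cheaper; flip the server price or the users-per-CPU ratio and the conclusion flips too, which is exactly the historical swing the post describes.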
 

jpiniero

Lifer
Oct 1, 2010
16,392
6,866
136
You're forgetting one big thing, though: with the cloud, the company's workers can be anywhere. Including India.
 

Hugo Drax

Diamond Member
Nov 20, 2011
5,647
47
91
My biggest issue with cloud computing is that software I used to run on my own personal computer now requires an internet connection to some server in order to function.

Somehow people are sold on having to run Photoshop in the cloud in order to work on images and such, when that software used to run on the desktop.

Now you're at the mercy of the internet and of servers hosted elsewhere when it comes to doing work that today's computers are more than capable of doing on their own.


The new SimCity is another example of cloud-computing fail.
 

mrjminer

Platinum Member
Dec 2, 2005
2,739
16
76
Yea, it's silly.

It makes sense for virtualizing on-site or as a supplement to existing independent systems (i.e., where your company can control pretty much everything), but relying on it 100% or for core business introduces a lot of factors outside of your control.

So many people fall for these marketing ploys, though, it is unbelievable.
 

her209

No Lifer
Oct 11, 2000
56,336
11
0
brianmanahan said:
i predict we will switch back and forth from centralized computing to distributed computing every 20 years or so. throw buzzwords in there so the business people will buy it, and we will keep the rework coming long enough to make all of us rich! :awe:

It'll be a hybrid in about 10 years.

 

BUTCH1

Lifer
Jul 15, 2000
20,433
1,769
126
_Rick_ said:
There's simple economic analysis behind it.
In the 70s, computers were expensive, therefore personal computers didn't make any economic sense.
In the 90's, personal computers were cheap, with server/mainframe hardware remaining expensive, especially compared to the performance a desktop user needed.
in the 10's powerful and efficient multicore CPUs and ubiquitous high-speed/low-latency interconnects make server-based calculations more interesting, especially since low-end hardware is available for cheap and in bulk, to create low cost terminals.

Virtualised desktops are now becoming cost effective for most large organizations, since the available computer power of a single CPU is enough to serve a dozen users who only do terminal work. This makes centralization cost effective again, while previously getting that kind of power centralized was too expensive.

Internet based cloud-computing on the other hand makes much less sense in most cases, because the interconnect sucks.

Yea, waaay back in my HS days (early '70s) there wasn't ANY type of computer on the school premises; everything was done through a terminal, and there was no display. You could write simple BASIC programs and such, but you had to be a "super-geek" to get into computing in this form.
 

JManInPhoenix

Golden Member
Sep 25, 2013
1,500
1
81
The only thing I use the cloud for is extra backup for my music and photos (Amazon). I already have both backed up in at least three other places in my physical proximity. I will never put anything sensitive (e.g., tax info) in the cloud.
 

Jeff7

Lifer
Jan 4, 2001
41,596
19
81
And you'd better hope that your cloud service company doesn't go out of business, or decide to change their business model and pursue a different market, or take advantage of their fully captive audience and start jacking up their prices.

Cloud services give your business a kill switch, but that kill switch is owned and operated by someone outside of your business.





brianmanahan said:
i predict we will switch back and forth from centralized computing to distributed computing every 20 years or so. throw buzzwords in there so the business people will buy it, and we will keep the rework coming long enough to make all of us rich! :awe:
Kind of like serial vs parallel.

- This serial connection is too slow.
- Let's put several of them together!
- I can't push this any faster than this, because it's tough to keep them synchronized.
- Someone just found a way of making a really fast serial connection!
- Switch back to serial.
- This serial connection is too slow.
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
webmail
steam
synchronized web browsers

most of us are already fundamentally tied to "the cloud" in some form

webmail not so objectionable because it's been around long enough to pretty much have 100% uptime (which is why this Adobe situation is such a failure; there's just no way they should have let that happen)

steam not so bad when we can still, for the most part, use the software we've downloaded even if the Steam service is down (which is far too often)

web browsers will still work even if sync is broken (at least for now)

the cloud does appear to be inevitable, and a lot of its pitfalls will be mitigated by technologies such as gigabit internet, which can provide bandwidth rivaling the speeds of all but the most modern HDDs. And if there's one thing that's truly holding back cloud computing in NA, it's the tyranny of our shitty ISPs. So it would be pretty ironic if we upset the 'balance' and finally got ISPs to play ball and innovate, only to then have cloud computing forced down our throats (even more so than already)
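The gigabit-vs-HDD comparison checks out as rough arithmetic. The link figure is the theoretical line rate (real-world throughput is lower), and the HDD figure is an assumed typical sequential speed, not a measured one:

```python
# Rough throughput comparison: gigabit link vs. a typical spinning HDD.
# 1 gigabit/s = 1e9 bits/s; divide by 8 for bytes, by 1e6 for megabytes.

GIGABIT_MB_PER_S = 1_000_000_000 / 8 / 1_000_000  # 125 MB/s line rate
HDD_MB_PER_S = 150  # assumed typical sequential HDD throughput

print(f"gigabit link: {GIGABIT_MB_PER_S:.0f} MB/s")  # 125 MB/s
print(f"typical HDD:  {HDD_MB_PER_S} MB/s")          # 150 MB/s
```

So a saturated gigabit link lands in the same ballpark as a consumer hard drive, which is what makes the "cloud as local disk" pitch plausible on paper, latency aside.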
 

lxskllr

No Lifer
Nov 30, 2004
59,256
9,759
126
Eben Moglen said:
I haven’t mentioned the word “cloud” because the word “cloud” doesn’t really mean anything very much. In other words, the disaster we are having is not the catastrophe of the cloud. The disaster we are having is the catastrophe of the way we misunderstood the Net under the assistance of the un-free software that helped us to understand it. What “cloud” means is that servers have ceased to be made of iron. “Cloud” means virtualization of servers has occurred.

So, out here in the dusty edges of the galaxy where we live in dis-empowered clienthood, nothing very much has changed. As you walk inward towards the center of the galaxy, it gets more fuzzy than it used to. We resolve now halo where we used to see actual stars. Servers with switches and buttons you can push and such. Instead, what has happened is that iron no longer represents a single server. Iron is merely a place where servers could be. So “cloud” means servers have gained freedom, freedom to move, freedom to dance, freedom to combine and separate and re-aggregate and do all kinds of tricks. Servers have gained freedom. Clients have gained nothing. Welcome to the cloud.

https://www.youtube.com/watch?v=QOEMv0S8AcA
 

Red Squirrel

No Lifer
May 24, 2003
70,003
13,488
126
www.anyf.ca
I'll stick to having my own "cloud" and servers that I control. Centralization can be a good thing, but only if you are the one in control.
 

mmntech

Lifer
Sep 20, 2007
17,501
12
0
I think Adobe Cloud was a conspiracy by Apple to get everyone using Aperture and Final Cut X again. :D
 

Imaginer

Diamond Member
Oct 15, 1999
8,076
1
0
mmntech said:
I think Adobe Cloud was a conspiracy by Apple to get everyone using Aperture and Final Cut X again. :D

I wouldn't be surprised if killing Flash is a part of that process as well.

But really, I'd much rather have computing power as local as possible, programs usable fully offline with full features, and the internet as a means of communication, reference, and information exchange. Personal data should at the very least have a redundant backup, or you should be able to pull your backup from your own dedicated backup machines over the internet through a virtual private connection.
 

shortylickens

No Lifer
Jul 15, 2003
80,287
17,080
136
The difference is intercomputer connections (network, internet) are about a billion times faster.

Terminals are a lot more realistic nowadays.
 

Jeff7

Lifer
Jan 4, 2001
41,596
19
81
shortylickens said:
The difference is intercomputer connections (network, internet) are about a billion times faster.

Terminals are a lot more realistic nowadays.
And yet, just as an example, skipping around in a streamed video is still quite sluggish (understatement) compared to locally stored content. Or using a remote desktop service to control another PC: whether through TeamViewer, or back in college using the remote lab PC login, it's just maddeningly slow due to the latency of the internet. My expectation is that a PC had better be damned snappy in what it does.


Or at work, for a few years we had two locations in the same city, and access to the main server in the other facility was done over the Internet, a T1 line I believe.
I kept most of my content local. It was very slow to send data to or from the remote server, at least when compared to something stored on our own server or on my PC.
Now we're all combined in the same building, and I do use the server most of the time - it's in the same building, and it's a gigabit network.
(I would still love to have a program that can index the filenames of a network drive. Searching for files there can take a long time.)
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
My boss and I talk about this about once a month. How we are evolving back to something that resembles the 1970s.
 

MaxDepth

Diamond Member
Jun 12, 2001
8,757
43
91
It'll be your personal cloud on a storage device you own! (Us old folks will still call it a 'hard drive.')

brianmanahan said:
i predict we will switch back and forth from centralized computing to distributed computing every 20 years or so. throw buzzwords in there so the business people will buy it, and we will keep the rework coming long enough to make all of us rich! :awe: