All you need for context is ‘unable to locate’ something.
Tim has a piece up at Forbes talking about the relative costs of local storage and the cloud. I think it and the comments that follow at the time of writing miss an important point about the cloud.
We’ve had remote servers in data centres for decades. We’ve had at least some integration between different types of client platforms for decades, though Microsoft has done its best to inhibit this interoperability. Neither of these things is cloud computing.
If it means anything, a computer ‘cloud’ is a network with more than one physical computer and more than one storage device, an integrated control system and a high degree of virtualisation and redundancy. You can string together pieces of hardware so they look and behave like a single logical system; you can operate multiple virtual machines on one hardware system (Amazon’s cloud uses Xen, for example); and if you’re feeling ambitious you can combine the two approaches. And you can often do these things using a nice control interface. The physical reality of the hardware and the logical structure of the system have been separated. Adding new hardware adds to the pool from which the virtual, logical units are constructed.
This means that cloud computing can’t be directly compared with a (more primitive) local computer and its hard drives. Cloud computing is intrinsically more robust and more expensive. It’s also far more flexible because you can add new nodes (computing units, storage units) or remove them as demand fluctuates. Many cloud services charge by the hour to reflect this flexibility.
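The elasticity described above can be sketched in a few lines. This is a purely hypothetical toy, not any provider’s API: a logical service backed by a pool of interchangeable nodes that grows and shrinks with demand, which is exactly what hourly billing is pricing.

```python
class NodePool:
    """A pool of interchangeable virtual nodes backing one logical service.

    Hypothetical illustration: real cloud providers expose this kind of
    scaling through their own APIs and billing models.
    """

    def __init__(self, capacity_per_node=100):
        self.capacity_per_node = capacity_per_node
        self.nodes = 1  # always keep at least one node running

    def scale_to(self, demand):
        """Resize the pool so total capacity covers current demand."""
        # Ceiling division: enough whole nodes to cover the demand.
        needed = max(1, -(-demand // self.capacity_per_node))
        self.nodes = needed
        return self.nodes

pool = NodePool(capacity_per_node=100)
print(pool.scale_to(250))  # demand spike: pool grows to 3 nodes
print(pool.scale_to(80))   # demand falls: pool shrinks back to 1 node
```

With local hardware, the second call is impossible: once you’ve bought the extra machines for the spike, you own them.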
These techniques of high redundancy and virtualisation have been around for years. I was hosting on a network of FreeBSD servers using jails for virtualisation for years before any marketing executive dreamed up the label ‘cloud computing’. Like ‘data mining’ before it, this is more a marketing term than a technical one; virtualisation and redundancy have long been found in well-designed systems. As marketing terms tend to do, ‘cloud’ has now stretched to the point where some smaller IT businesses offer their own ‘cloud’ services that are actually based on single servers and not cloud computing at all. They are simply services housed in a data centre rather than onsite.
Tim’s point is that local storage has been getting cheaper at a faster rate than bandwidth. But then, these storage savings are also available to cloud providers. ‘Local’ storage can be accessible from any connected device – all you need is a static IP address or a dynamic DNS service and you can host it from your bedroom. But it’s not offsite, which matters when you’re burgled or your bedroom catches fire. Expanding the system to meet a temporary upsurge in demand means buying new hardware and being stuck with it when demand falls back again.
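The dynamic-DNS trick mentioned above amounts to very little code: poll your current public address and push an update to your DNS provider only when it changes. The `update_record` callback and the example addresses here are placeholders – every DDNS provider has its own update API – so this is just the control flow.

```python
def sync_dns(last_ip, current_ip, update_record):
    """Push a DNS update only when the public IP has changed.

    update_record is a placeholder for whatever your DDNS provider's
    update call looks like (typically a single HTTP request).
    Returns the IP now on record.
    """
    if current_ip != last_ip:
        update_record(current_ip)
        return current_ip
    return last_ip

# Demonstration with a no-op updater that just logs the calls.
updates = []
ip = sync_dns("203.0.113.5", "203.0.113.9", updates.append)
print(ip)       # 203.0.113.9
print(updates)  # ['203.0.113.9']
```

Run something like this from cron on the bedroom server and the hostname follows your ISP-assigned address around.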
Google can offer very cheap access to highly redundant cloud-based services like GMail because they monetise in ways other than direct charging (though GMail is also available as a chargeable service). But if you want to run your own cloud-based system, hiring the components from a cloud provider, it will be more expensive than operating a local workstation. This says nothing about the future direction of computing. A proper cloud system simply isn’t comparable with a single computer.
For what it’s worth, my view is that the separation between physical hardware and logical systems will continue to increase.
For Android only, so far. But this is a good initiative:
To mitigate the risks of misappropriation of the user’s data by today’s Android applications, the researchers of the study have developed a system, called AppFence, that implements two privacy controls that (1) covertly substitute shadow data in place of data that the user wants to keep private and (2) block network transmissions that contain data the user made available to the application for on-device use only.
Your next computer but one will look a bit like this (I need to widen this template…):
That’s a smartphone in a dock.
Combine that idea with this research:
Apple share falls less quickly as Google operating system [Android] takes over – but Windows Phone has barely sold half of the 2m handsets shipped, say new figures
And it becomes less insane than you’d think to suggest that Microsoft is in the process of experiencing the fastest and deepest collapse of market share in history, wars and catastrophes aside.
See also esr’s analysis.
Worth noting that Google was built on not one but two successive code bases that were very poor: first in Java(!), then in Python (which, together with a Django-like application development framework, remains the language for the Google Apps platform).
Moral: implementing an idea with quick and dirty code is fine – if the idea is good, you can polish or re-write later, if it isn’t you’ve saved time.
Excellent essay by Bruce Schneier:
In the next 10 years, the traditional definition of IT security—that it protects you from hackers, criminals, and other bad guys—will undergo a radical shift. Instead of protecting you from the bad guys, it will increasingly protect businesses and their business models from you.
In the field of patent law:
Intellectual Ventures, which is based in a Seattle suburb and claims 30,000 patents and patent applications, is believed to have the largest portfolio among firms that don’t make or sell products. It claims to have earned nearly $2 billion from licensing its patents.
The threat posed by Intellectual Ventures helped prompt the rise of firms like RPX Corp. It is paid by companies to buy up potentially threatening patents; the companies receive licenses to those patents, and RPX pledges never to sue over them.
The Telegraph today:
Google has announced a new operating system that will pit the search giant against Microsoft and Linux.
Google’s operating system is Linux. Twerps.
My shiny new Android phone is due to arrive today. Android – also a version of Linux promoted by Google – has apparently been outselling even the iPhone in the USA.
Which raises the interesting possibility that Microsoft may already have fallen below 50% of all computer operating systems – because of course my new “phone” is a computer – and that their crash in market share is almost without parallel in commercial history.
I’ve had to handle a number of computer security breaches. Causes vary, but it’s rarely the hardware. Often software is at fault, containing security holes that have gone unpatched – it’s amazing how often people forget that you’re only as secure as your last update. But a lot of the time, it’s the wetware at fault: human beings. Passwords simple enough to fall to a dictionary attack, which every public server gets subjected to daily, or stuck on a post-it note somewhere; shortcuts that deliberately circumvent security provisions to speed up certain processes; security by obscurity being less obscure than hoped.
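To see why weak passwords fall so fast, here is a minimal dictionary attack against stored password hashes. The five-word list is a stand-in for the multi-million-entry wordlists real attackers run; everything else here is illustrative, not any particular tool.

```python
import hashlib

# Stand-in for the huge wordlists attackers actually use.
WORDLIST = ["password", "letmein", "qwerty", "dragon", "monkey"]

def crack(target_hash):
    """Try every wordlist entry against a stored SHA-256 hash.

    Returns the recovered password, or None if it isn't in the list.
    (Real systems should use salted, slow hashes precisely to blunt this.)
    """
    for candidate in WORDLIST:
        if hashlib.sha256(candidate.encode()).hexdigest() == target_hash:
            return candidate
    return None

weak = hashlib.sha256(b"letmein").hexdigest()
strong = hashlib.sha256(b"correct horse battery staple").hexdigest()
print(crack(weak))    # letmein -- found instantly
print(crack(strong))  # None -- not in the wordlist
```

Scale the loop up to millions of candidates per second and you have the attack every public server sees daily.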
The release of a zip file from a server at the University of East Anglia was a security breach. What was the cause? There have been attempts to examine this, analyse email headers and so forth. I’ve never known how that was supposed to help – the mails had all been rolled up into a zip file before they were downloaded. So did someone gain unauthorised access to grab this file? I doubt it.
No explanation of how anyone gained access of some kind (telnet, ssh, ftp etc) has been released. If there had been a hole of this nature, I’d have expected it to have become known by now. But there’s a good reason to think there was no such access and that, from the point of view of the CRU, the problem lay with the wetware.
The proposition that there was a hack boils down to this: someone managed to gain access to a CRU server and lo! There was a fat zip file containing all these files. It’s wholly implausible.
The files were leaked.