Clouds no longer pass by Windows.

Amazon today announced that later this year, Windows Server would be available on EC2. No details yet on cost, licensing, etc., but this is major. Up until now, that portion of the business world who are pure MS shops (a very large percentage, especially amongst SMEs) were excluded from taking advantage of Amazon’s amazing (and getting more amazing every day) EC2 platform.

From my point of view, as with Oracle’s announcement last week, this releases yet more of my “legacy” skillset for deployment in the clouds. Although I’ve been involved with *nix servers for 20 years or so, as corporate servers became more locked-down (and removed to the control of 3rd-party data centres) I lost day-to-day experience of using them; in latter years my main ‘hands-on’ platform was Windows, either my own PC or local departmental NT servers. Windows on EC2 will allow me to use a whole new set of Windows-only software (e.g. RSSBus or XLsgen) and of course SQLServer.

The lack of SQLServer on EC2 has been a major problem for me as a datasmith; there’s an awful lot of data out there sitting in SQLServer databases, but currently if I need to “cloud burst” such datasets I have to first extract the data to, say, CSV files and then load it into a Linux-compatible database. But with a SQLServer instance running in the cloud, I could simply use SQLServer’s native backup/replication tools. No more need to download data to my “ground-based” PCs, resulting in quicker turnaround and fewer data security risks.
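To make the contrast concrete, here’s a minimal sketch of the current “ground-based” workaround in Python, assuming pyodbc is installed and that the server, database and table names (SalesDW, dbo.Orders) are placeholders rather than anything from a real project:

```python
# Sketch of the current workaround: pull a SQL Server table out to CSV
# so it can be re-loaded into a Linux-friendly database on EC2.
import csv
import pyodbc

conn = pyodbc.connect(
    "DRIVER={SQL Server};SERVER=myserver;DATABASE=SalesDW;Trusted_Connection=yes;"
)
cursor = conn.cursor()
cursor.execute("SELECT * FROM dbo.Orders")

with open("orders.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow([col[0] for col in cursor.description])  # header row
    for row in cursor:
        writer.writerow(row)

conn.close()
```

A cloud-resident SQLServer instance would make this whole extract-and-reload step unnecessary.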

On the licensing front, I’m presuming that the OS licence will be on a pay-as-you-go basis, but what about SQLServer and other server products? Will MS do an Oracle on it, i.e. require a traditional upfront use-it-or-lose-it payment, or will they go the radical (but I think inevitable) path of a licence-by-the-hour?

First RedHat, then Sun, then Oracle and now Microsoft; the mighty beasts of our industry have acknowledged there’s a new mighty beast on the prowl, dressed as a humble bookseller no less!

Oracle embraces the cloud.

In a previous post I wished for Oracle to clarify its position regarding the use of its databases on a cloud platform; well, it looks like they have!

They have officially certified Amazon EC2 as a supported platform on which to run their software. Not only that, they appear to be embracing the cloud big time, providing pre-configured AMIs and management tools.

For someone like me who has Oracle in the blood (since Version 5 in the 1980s) this is very good news. As I’ve said before…

As for using Oracle on EC2, yes please. Most of my datasmithing career has been spent behind the wheel of an Oracle database, the front-ends might have been Excel or some BI package, the end results might have been SAP master data take-ons or an Essbase cube, but the blood and guts were always Oracle. And this was before Oracle Apex – think what wonders could have been achieved if I had access to such a product in the past.

Although the licensing is not a pay-as-you-go model, it’s a start; who knows, some enterprising firm of DBAs might purchase enterprise licences and repackage access for those wishing to use it for “cloud bursting” (adding utility resources to scale out / scale up). Also, there’s Oracle’s free XE edition for low-volume datasets, and for developers who need access to the enterprise editions, the usual “free to develop on” OTN licences apply, except now there’s no need to first source a suitable spare machine or download a multi-gigabyte install package, and of course no more installation headaches; just fire up an Amazon EC2 AMI, easy peasy.
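For what “fire up an AMI” looks like in practice, here’s a minimal sketch using Python and boto3, assuming AWS credentials are configured and that the AMI id, instance type and key pair name are illustrative placeholders rather than Oracle’s actual published values:

```python
# Launch a single EC2 instance from a pre-configured AMI (placeholder id).
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-12345678",    # pre-configured Oracle AMI (placeholder)
    InstanceType="m1.large",   # pick a size to suit the dataset
    MinCount=1,
    MaxCount=1,
    KeyName="my-keypair",      # existing EC2 key pair for SSH access
)
print("Launched instance:", response["Instances"][0]["InstanceId"])
```

Compare that with sourcing, racking and patching a spare departmental server.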

Oracle is also providing an Oracle Secure Backup Cloud tool which brings the power of Oracle backup and restore technology to S3. This, combined with Amazon’s Elastic Block Store, makes the EC2 platform an ideal home for many Oracle database applications.
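This isn’t the Secure Backup tool itself, but the underlying idea is simple enough to sketch: copy a backup piece produced on the EC2 instance into an S3 bucket. The file path and bucket name below are placeholders:

```python
# Push a database backup piece from the EC2 instance into an S3 bucket.
import boto3

s3 = boto3.client("s3")
s3.upload_file(
    Filename="/backup/orcl_full.bkp",        # backup piece written by RMAN (placeholder path)
    Bucket="my-oracle-backups",               # placeholder bucket name
    Key="orcl/full/orcl_full.bkp",
)
print("Backup uploaded to S3")
```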

The major attractions to me of Oracle as a datasmithing tool (besides my 20+ years’ experience of using it) are…

  • Oracle Application Express (aka APEX, previously known as HTML DB). For fast, robust data-centric web apps deployed within the firewall (or via VPN), it’s hard to beat (but also see WaveMaker). In a micro ETL environment, it provides a quick and easy means of distributing data cleansing tasks such as adding additional attributes or assigning hierarchies to dimensional data.
  • Oracle SQL engine/optimizer technology is fast, powerful and can handle anything you throw at it (as long as it’s valid SQL).
  • PL/SQL, the best DSL for data handling and data cleansing (a small sketch of this sort of cleansing step follows this list).
  • Oracle’s market position as a “safe and respectable” home for corporate data.
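As promised above, here’s a minimal sketch of the kind of data cleansing step I have in mind, driven from Python via cx_Oracle; the connection details and the table and column names (dim_product, category) are hypothetical:

```python
# Run a small anonymous PL/SQL block that tidies up a dimension table --
# the kind of micro-ETL cleansing step mentioned in the list above.
import cx_Oracle

conn = cx_Oracle.connect("scott", "tiger", "ec2-host/XE")  # placeholder credentials/DSN
cursor = conn.cursor()

cursor.execute("""
BEGIN
  -- default any unassigned categories before the data is passed onward
  UPDATE dim_product
     SET category = 'UNCLASSIFIED'
   WHERE category IS NULL;
  COMMIT;
END;
""")

conn.close()
```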

While I still have reservations about Oracle’s commitment to further develop (and patch) XE, at least its appearance at the heart of their cloud initiative reassures me that they are unlikely to abandon it totally.

The WAN is the new LAN

While discussing SimpleDB, Nick Carr points to the polar opposite views that the two computing behemoths, Google and Microsoft, hold as to the future direction of cloud computing. Google’s Schmidt sees an eventual 90/10 split, with the cloud being the home of most data and processes, while, as expected, Microsoft’s Raikes points to the current reality and insists that the trend will continue to favour a PC-centric view.

I’m not sure who’s right, but my instinct (or is that my prejudice?) leans towards the Google view. But one thing I am sure of is that as the cloud (aka the Internet) and “personal computing devices” (aka desktops, laptops, PDAs, mobile phones) fight it out for dominance, the future of the business LAN as the prime computing backbone is looking increasingly untenable. For SMEs and consumers at least, the WAN (in the form of the Internet) is the new LAN.

Not that LANs will disappear totally; the necessity to provide local wireless access and the address limitations of IPv4, plus the need to share printers etc., will see to that (at least in the short term, though mobile 3G networks, IPv6 and services such as PrinterAnywhere may eventually address these issues). Also, the ability to act as a local cache for backups and data access will ensure the LAN’s continued existence, at least until Korean levels of broadband speed/availability become the norm in the rest of the developed world.

But what about shared private data, email/calendars, backups, security and, last but not least, business applications: the big five “business” reasons that lie behind the justification for most organisations’ (and some families’) LAN setups?

Shared Private Data

Fast ubiquitous broadband and online data stores such as S3, SimpleDB, Microsoft Live Workspace and eventually GDrive will mean that for many small and medium companies the cost of maintaining in-house data servers will no longer make economic sense. Even large organisations, who have in many cases already out-sourced their data centres to the likes of IBM and are already operating VPNs over private and public WANs, may also move parts of their data infrastructure to the internet cloud. Added-value online storage services such as those provided by Google’s Docs and Spreadsheets will also drive individuals and organisations in this direction.

Email / Shared Calendars

One word: Google Apps. Okay, that’s two words and a bit simplistic, but GMail and Google Calendar, and particularly the premium Google Apps versions, represent the future shape of business communication systems. Add in wiki-like collaborative tools such as Google Docs and Spreadsheets (and the long-awaited Googlified JotSpot) and suddenly the idea of any SME running its own Exchange servers becomes harder to justify.

Data Backups

Even in current setups, an effective backup policy requires that data be moved off-site, so online backup services are a natural progression. In essence the LAN works as a local cache to quickly assemble the backup and prepare it for transportation to another location (the boss’s home study, most likely!). Online backup will probably be the first cloud service that businesses adopt. But as transactional data increasingly gets recorded off-site, most of an organisation’s data will already be “backed up”; so future backup services will be of the intra-cloud, belt’n’braces type, e.g. a service that makes encrypted copies of your data stored on one service and either stores them in another online location or maybe burns the data to DVD and deposits it in a physically secure store.
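A minimal sketch of that belt’n’braces idea, assuming the cryptography and boto3 packages and using placeholder file and bucket names; in practice the encryption key would of course live somewhere safer than a local variable:

```python
# Take data already exported from one provider, encrypt a copy,
# and park it with a second, independent online location.
import boto3
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # keep this safe -- it is the only way back in
fernet = Fernet(key)

with open("export_from_primary_service.dump", "rb") as f:
    encrypted = fernet.encrypt(f.read())

with open("backup.enc", "wb") as f:
    f.write(encrypted)

# copy the encrypted blob to a second online store (placeholder bucket)
boto3.client("s3").upload_file("backup.enc", "secondary-backup-bucket", "backup.enc")
```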

Security

LANs are seen as the modern data equivalent of a medieval town, with the firewall playing the role of the town fortifications. But just as increased mobility, collaboration and newer technology put an end to the justification and utility of walled towns, a similar fate awaits the firewalled LAN.

The explosion in the number of workers (especially knowledge workers, free agents and senior executives) operating outside the local network means that companies must already address data security in the context of public networks. VPNs can of course bring the LAN environment to the mobile worker (even a home/tiny business can use something like Hamachi VPN). But VPNs will not extend the LAN so much as replace it, increasingly used as “private pipes” between trusted peers and cloud servers.

For example, I use Hamachi to communicate with my EC2 instances and to transfer data between my laptop and my main desktop PC; something I can do securely and effortlessly from my laptop using any private or public network. As such, the firewall that really keeps my data secure is the one on my laptop not the one built into my LAN router.

You might look at the recent spate of data losses as evidence that companies should batten down the hatches and throw away the key, but I’d argue that it’s a failure to face up to and manage the risks (and opportunities) of mobile data that has caused most if not all of these breaches. The first step is to treat the “Wifi-enabled, easily-stolen laptop connected to a dodgy airport public network” as the “standard” against which your firm’s (and family’s) data security will be judged and eventually tested.

Applications

For many small businesses, the business applications they use tend to be either single-user packaged apps or, even more likely, Excel. Having a shareable cloud-based data store is all they require to abandon their LAN. But for those businesses that rely on sophisticated multi-user systems, replacing in-house servers will be more difficult. There are three options as I see it:

  • Keep servers in-house but purchase or lease them as pre-configured “black boxes”. When a new version or bug fix is required, the vendor remotely updates the software; no on-site technical expertise required. Likewise, the vendor remotely monitors the hardware and slots in a new pre-configured box as required. You may argue that the LAN remains and yes it does, but this sort of setup would only be required where high-speed and reliable broadband is not yet available or where any interruption in server connection is not an option.
  • Use remote pay-as-you-go, invoke-as-you-need virtual servers such as Amazon’s EC2 or Scotland’s Flexiscale. Again, using pre-configured virtual machines that can be either purchased or leased from software vendors removes the need to have in-house server or application expertise.
  • And finally, the ideal for most companies: SaaS, Software as a Service, pioneered by Salesforce.com and now starting to gain traction across not just CRM, but accounting and even full-scale ERP. Even the mighty Sage is starting to feel the winds of change! Very small businesses are also well catered for, e.g. FreeAgentCentral for UK-based freelancers.

Times they are a-changin’. Migration of some or all data to the internet cloud is inevitable; large organisations will most likely build their own clouds, while smaller businesses will need to adapt to the cloud-as-a-service model. Organisations need to start thinking about this now, as all future IT investments will need to factor this phenomenon in, even if the reaction is to reject it!