Welcome back to the series Cloud Coverage. Even now, after years of success, cloud computing is awash in misconceptions and erroneous assumptions about its impact on businesses, software development, and software management. This blog entry and the one next week will discuss some of the most common cloud misconceptions. There is a lot to share, so I am breaking the content into two separate posts. This week’s focus is on deployment and costs.
1. Myth: The Cloud is just hosting in the sky
This can be true, but it should not be! Many organizations, especially ones just migrating software to a cloud provider, treat the cloud as just another hosting company. That is a valid strategy during a transition phase: if the software was developed before the migration, around traditional hosting principles of finite, pre-allocated resources, it can pay to move to the cloud first by using it as just another host. The problem arises when companies get ‘stuck’ in this phase and never continue the migration to take advantage of the many cloud-native services available. Using the cloud as a simple hosting company is like using a Porsche to go grocery shopping. You can, but why would you?
2. Myth: The Cloud is always more expensive (or always cheaper) than hosting companies
Cloud computing can be both more expensive and cheaper than hosting companies; the cost depends directly on the usage model. To understand the cost structure of cloud services versus standard hosting, we need to look at the charging model that underlies each business. Standard hosting requires companies to pre-provision hardware and bandwidth, and flat monthly charges apply based on the equipment provisioned. Since companies have a difficult time estimating usage patterns months or years into the future, IT departments tend to provision for the maximum foreseeable demand. If those estimates prove wrong, either too high or too low, we get the classic over/under-provisioning problem: companies either pay far too much for underutilized resources or scramble to add capacity to meet unanticipated demand.
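To make the over-provisioning side of this concrete, here is a minimal back-of-the-envelope sketch in Python. Every price and usage figure in it is hypothetical, invented purely for illustration, and is not tied to any real hosting provider.

```python
# Hypothetical comparison of pre-provisioned hosting capacity vs. actual need.
# All prices and usage figures below are made up for illustration only.

MONTHLY_COST_PER_SERVER = 400          # flat fee per provisioned server ($/month, hypothetical)
PEAK_SERVERS_PROVISIONED = 20          # capacity sized for the worst-case forecast
AVERAGE_SERVERS_ACTUALLY_NEEDED = 6    # what the workload really used on average

provisioned_cost = PEAK_SERVERS_PROVISIONED * MONTHLY_COST_PER_SERVER
needed_cost = AVERAGE_SERVERS_ACTUALLY_NEEDED * MONTHLY_COST_PER_SERVER

print(f"Paid for provisioned capacity:  ${provisioned_cost:,}/month")
print(f"Cost of capacity actually used: ${needed_cost:,}/month")
print(f"Wasted on idle capacity:        ${provisioned_cost - needed_cost:,}/month")
```

Flip the assumption, with actual demand higher than the provisioned peak, and the same arithmetic turns into the under-provisioning scramble described above.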
The cloud works on a radically different provisioning and charging model. Users are charged by usage, often by the hour for computing resources and by the gigabyte for storage and network traffic. Cloud providers also offer numerous tools to control costs, such as auto-scaling based on demand or pre-allocating (reserving) capacity in advance. They present what appears to be an infinite pool of resources that can literally be allocated in minutes. New adopters often get carried away with this newfound power and allocate resources without thinking about efficiency or cost, but over time they become more sophisticated consumers of the services on offer. Cloud computing also offers so many advantages over standard hosting that cost alone does not tell the whole story. Because it can respond almost instantly to need, either by freeing resources or by allocating more, a direct cost comparison is difficult. Bottom line: the cost of cloud computing depends on the nature of the use case and the cloud sophistication of the users. It’s this adaptability that makes cloud computing so powerful, and it’s this adaptability that lets users control costs based on need. Smart cloud users can leverage various tools to limit cost and ensure they pay only for the computing resources they need to meet their business goals.
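Here is an equally rough sketch of the pay-per-use side, again with entirely hypothetical hourly rates and a made-up demand curve, to show why the answer depends so heavily on the usage pattern.

```python
# Hypothetical pay-per-use cost for a spiky workload billed by the hour.
# Rates and demand numbers are invented for illustration only.

HOURLY_RATE_PER_INSTANCE = 0.10   # $/instance-hour (hypothetical)
HOURS_PER_MONTH = 730

def instances_needed(hour: int) -> int:
    """A spiky demand profile: busy in the evening, quiet the rest of the day."""
    hour_of_day = hour % 24
    return 20 if 18 <= hour_of_day <= 21 else 3

# Pay-per-use: you are billed only for the instances running each hour.
pay_per_use = sum(
    instances_needed(h) * HOURLY_RATE_PER_INSTANCE for h in range(HOURS_PER_MONTH)
)

# Traditional hosting must keep peak capacity provisioned around the clock.
fixed_for_peak = 20 * HOURLY_RATE_PER_INSTANCE * HOURS_PER_MONTH

print(f"Pay-per-use (scales with demand): ${pay_per_use:,.2f}/month")
print(f"Fixed capacity sized for peak:    ${fixed_for_peak:,.2f}/month")
```

With a flat demand curve the two numbers converge; with a spiky one, pay-per-use wins. That is exactly why the honest answer to “is the cloud cheaper?” is “it depends on how you use it.”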
Check back next week for Part 2 of this post, where I’ll cover three more common myths about cloud technology.