Benjamin Franklin coined the enduring phrase "Time is money." That was more than two centuries ago, and it is particularly apt when describing cloud computing today.
A defining characteristic of cloud computing—whether public, private, or hybrid—is its self-service nature. Computing resources are delivered on demand and end users pay based on their consumption levels, which is where the ‘time is money’ premise comes in.
The industry standard
When it comes to pricing, the prevailing standard in the cloud is dollars per hour of usage. This means that as a cloud consumer, you are charged for a full hour of consumption even if your machine ran for only one minute. That's quite a price to pay for the privilege of running a virtual machine in the cloud, and it can be of immense consequence if you are in the business of developing, testing, and delivering software. And while "1 minute billed as 1 hour" may sound like a worst-case scenario, let me illustrate just how easily, and how frequently, it can happen.
For example, let's assume that you have a basic continuous integration (CI) setup for your team on a cloud platform. A check-in from anyone on the team triggers a build and a run of the unit tests. Clean CI servers are automatically spun up in the cloud on demand, builds are done, tests are run, and results are reported. At the end of this process, the servers are discarded so that those resources can be used elsewhere when needed. Depending on the size and complexity of your code base, you may have any number of check-ins per hour, and each check-in may keep a CI server online for only a few minutes.
In this scenario, you would be charged for a full hour of usage each time one of your CI servers runs, even if it runs for only 60 seconds. As you can imagine, this pricing model can quickly rack up costs and blow a development and test budget out of the water. It's not a model that supports the dynamism and flexibility needed in the development and testing of an application.
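To make the arithmetic concrete, here is a minimal sketch of what a day of short CI runs costs under each model. The hourly rate and run lengths are hypothetical assumptions for illustration only, not any provider's actual pricing:

```python
# Hypothetical illustration: cost of short CI runs under per-hour
# vs. per-minute billing. The rate and run lengths are assumptions.
import math

RATE_PER_HOUR = 0.50  # hypothetical $/hour for one CI server


def hourly_billed_cost(run_minutes, rate_per_hour=RATE_PER_HOUR):
    """Per-hour billing: every run is rounded up to whole hours."""
    return math.ceil(run_minutes / 60) * rate_per_hour


def per_minute_billed_cost(run_minutes, rate_per_hour=RATE_PER_HOUR):
    """Per-minute billing: pay only for the minutes actually used."""
    return run_minutes * rate_per_hour / 60


# A day with 40 check-ins, each spinning up a CI server for ~5 minutes:
runs = [5] * 40
hourly = sum(hourly_billed_cost(m) for m in runs)
minute = sum(per_minute_billed_cost(m) for m in runs)
print(f"per-hour billing:   ${hourly:.2f}")   # 40 runs x 1 billed hour = $20.00
print(f"per-minute billing: ${minute:.2f}")   # 200 actual minutes      = $1.67
```

With these assumed numbers, the team pays roughly twelve times more under hourly billing for exactly the same 200 minutes of compute, and the gap widens as runs get shorter or more frequent.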
The alternative
Skytap Cloud was designed to meter, report, and bill compute usage by the minute—never by the hour. Users don’t pay for more than their exact usage. Period.
Only very recently have we seen a shift in the industry toward this pricing model. Google announced that Google Compute will charge users by the minute, and Microsoft announced per-minute pricing for Windows Azure targeted at developers. 451 Research also analyzed the benefits of per-minute pricing in a report titled: Azure and Google buck market trend with per minute billing – but is it better value?
Cloud computing has the potential to supercharge development and test teams, but it doesn't have to threaten your budget.
You may be inclined to focus on the hardcore technical merits of the cloud platform you are evaluating; however, non-technical aspects like per-minute vs. per-hour pricing will be a big enabler, or a big hindrance, in day-to-day usage. When every minute counts, the pricing model can be as important as technical capability when choosing the best cloud provider for your team.