What do you think about when you hear the word “scale”?
The scales on a fish? A musical scale? Scaling Mount Everest?
There are a lot of definitions for the word. But as more and more companies migrate to the cloud, they need to be thinking about a very specific one.
In the technology world, scalability describes a solution's ability to process very large volumes of data very quickly without slowing down or breaking the system. An inability to handle these volumes can cause system performance and throughput to collapse, leading to end-user frustration and problems completing tasks in a timely manner.
When it comes to the cloud, scale matters. A lot.
Scale is sometimes forgotten—or maybe just taken for granted—when evaluating cloud solutions. It is especially important when an enterprise must process huge volumes of transactions—for example, millions of credit card transactions every few minutes.
Your company may or may not be handling millions of transactions, but nonetheless cloud scalability is an important consideration for any enterprise. Here are three reasons why.
It seems obvious: the more your company grows, the more transactions you will process. Yet when selecting a new cloud solution, selection teams often ignore this principle. All SMB owners want their companies to grow spectacularly, yet they rarely think about the technology they will need 3 to 5 years down the road.
Picking a cloud application should include an analysis of its proven processing capabilities, if for no other reason than to ensure your company will never have to worry about outgrowing its software.
While the term “transaction” is singular, most transactions are not a singular event. Take a typical example: the payroll check.
Consider all the payroll deductions associated with each check. There are deductions for taxes, savings plans, investment options, healthcare and insurance. Additional deductions may tie to vacation plans and withholding for other scenarios.
Dig deeper, and each of these transactions can spawn additional activity. For example, a deduction from the employee paycheck for a retirement plan contribution might launch a transaction for a company match—which in turn generates multiple transactions to a third-party financial services company that manages the retirement plan.
Tracing how one transaction fans out into many downstream transactions is a mind-numbing exercise, even for a small payroll run. Issuing payroll for a 500-person company can easily generate 15,000 or more transactions every payday.
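That 15,000 figure is easy to sanity-check with a back-of-envelope calculation. The per-check counts below are illustrative assumptions for a typical paycheck (they are not figures from the article), chosen only to show how quickly the fan-out multiplies:

```python
# Back-of-envelope estimate of how one payroll run fans out into
# many downstream transactions. All per-check counts are assumed,
# illustrative values -- not measured data.

employees = 500

per_check = {
    "gross pay / net pay": 2,
    "tax withholdings": 8,        # e.g. federal, state, local, FICA
    "retirement": 6,              # contribution, company match, third-party postings
    "healthcare & insurance": 8,  # medical, dental, vision, life, etc.
    "other deductions": 6,        # savings plans, vacation, withholding scenarios
}

per_check_total = sum(per_check.values())   # downstream transactions per paycheck
total = employees * per_check_total         # downstream transactions per payday

print(f"{per_check_total} transactions per check, {total} per payday")
```

With roughly 30 downstream transactions per check, a 500-person payroll lands right at the 15,000-transaction mark, and richer benefit structures push it higher.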
Volume analyses often overlook periodic events that sharply increase data throughput needs, even though these regular events are central to an enterprise's operations.
Building on the payroll example, consider that this core activity occurs only two or three times a month. Each run is a large spike in data volume, and it must be handled in a timely manner to meet your employees' expectations.
Now add in other periodic events like month-end close or billing cycles for customer invoices or statements. Each event in itself delivers a significant increase in data volume, but what happens when multiple periodic events fall on the same day? What if payroll processed on the last day of the month coincides with month-end close and tax reporting to local authorities?
Such a spike will test and challenge systems that can’t handle these data volumes as they surge through your cloud applications.
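The stacking effect of coinciding events can be made concrete with a quick estimate. The volumes below are hypothetical, chosen only to show how a single bad calendar day multiplies baseline load:

```python
# Illustrative daily-volume spike when periodic events coincide.
# All volumes are hypothetical assumptions for a mid-size enterprise.

baseline = 50_000            # assumed typical daily transaction volume

coinciding_events = {
    "payroll run": 15_000,       # from the 500-employee example above
    "month-end close": 40_000,   # assumed close-process volume
    "tax reporting": 10_000,     # assumed filing-related volume
}

peak = baseline + sum(coinciding_events.values())
print(f"Peak day: {peak} transactions ({peak / baseline:.1f}x baseline)")
```

Even with modest assumptions, the peak day runs at more than double the everyday load, which is exactly the surge a cloud application must absorb without degrading.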
A few months ago, Oracle ran a series of financial transaction volume tests to measure the current capabilities of our cloud architecture and its supporting infrastructure. Using a model based on a large global corporation with a diverse mix of thousands of accounts, vendors and customers, the key result was astonishing.
Financial transaction throughput: 6 million per minute.
We recently published a narrative report and infographic about this extensive test. The paper presents several scenarios that show how a company can find itself reaching millions of transactions every minute (one example: processing credit cards at gasoline station pumps) and clearly outlines why this capability is important for any enterprise using cloud applications.
Companies tend to think of the cloud as "out there" and therefore infinitely scalable. But not all cloud providers are created equal. Some cloud applications work with flat files (instead of relational databases), which have inherent performance issues. And since most Software-as-a-Service (SaaS) companies rent their hardware from third parties (like Amazon or Rackspace), their scalability is limited: if too many customers run the application at once, they must purchase extra processing power.
Woe betide the company that outgrows its cloud solution. When the system goes down, customers will flee.
When evaluating any cloud solution, high-volume transaction throughput must be a key consideration. Ignoring this consideration and then picking an inferior or limited solution will create downstream issues across your company. You might even have to abandon your choice and restart your entire cloud effort.
Look closely at scalability. After all, there are plenty of things to worry about in your business. Using a tested, high-volume, high-velocity cloud means one less worry for you and your colleagues.