In May 1999 Enron announced to the world that it was creating a new market for trading bandwidth. A Wired article from 2001 noted that it seemed to many that the then energy giant had found a new pot of virtual gold. Enron and a broader group of its experienced traders believed it was only a matter of time before bandwidth (along with other virtual resources) would be bought and sold in much the same way that commodities markets trade everything from petroleum to pork bellies. Now, more than ten years later, this transition has yet to occur. In this post I will examine why the idea of trading bandwidth never took off, and ask whether today might be the ideal time to try again.

Looking back at previous attempts to create virtual commodities exchanges, including Enron's failed one, it now appears that it was indeed a great opportunity, just about a decade too early. Enron had the right vision, but suffered from the now obvious fact that it was born of overwhelming greed. In other words: the right idea, but the wrong people at the wrong time. Today, with the emergence of cloud computing, these past failures, such as the collapsed bandwidth markets, together with the successes of the energy markets of the 1990s, may offer a case study in how we might go about creating a successful commodity compute marketplace.

One of the first problems in getting bandwidth trading off the ground was timing. The bursting of the dotcom bubble created a significant disconnect between an oversupply of bandwidth and the demand for it. Basically, there weren't enough companies who wanted to buy and too many trying to sell, which pushed the market in only one direction: down. This discouraged both buyers and sellers from getting involved. The key to an active market and ecosystem is growth.

Secondly, as the Wired article points out, the telecom firms that owned the fiber optic networks didn't like the idea of selling their services as a commodity.
Some made the case that "not all networks perform equally well." There were no measurement standards, and therefore no easy way to tell the good from the bad. In addition, most telecom firms preferred to negotiate prices with customers rather than be stuck with a one-size-fits-all pricing scheme. In a sense, they would rather lose on the excess capacity and make up the difference by charging more for the capacity that was actually used. The benefit of a commodity-style approach, in contrast, is that you may charge less per unit but make more money overall because you achieve higher utilization of your resources (volume).

Another major problem was that the adoption of broadband was in its infancy. Most Internet users in 1999 were still on dial-up connections. Compounding things, with a few notable exceptions (Napster), the majority of web applications were static and lightweight. Mobile apps, streaming media, social web applications, the realtime web and cloud computing (Internet-centric computing) had yet to be widely accepted. Fast forward to today, and these applications have become the key drivers of a recent explosion of rich user-generated content and an ever increasing need for realtime compute capacity to process it all.

Thanks in part to the increasing popularity of cloud computing, the idea of just-in-time compute capacity has helped lower some of the barriers that kept the previous bandwidth markets from flourishing. For many, distributed batch processing and compute elasticity have become critical parts of modern business IT strategies. These kinds of flexible, elastic compute usage models are ideally suited to a spot market for commodity compute capacity (capacity quoted for immediate, or "spot", settlement of both payment and delivery).
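The volume argument above can be made concrete with some numbers. These figures are purely illustrative assumptions, not industry data, but they show how a lower commodity price can still out-earn premium negotiated pricing once utilization rises:

```python
# Purely illustrative numbers (assumptions, not industry figures) for the
# negotiated-pricing vs. commodity-pricing trade-off described above.

def revenue(price_per_unit, capacity, utilization):
    """Revenue = unit price * the portion of capacity actually sold."""
    return price_per_unit * capacity * utilization

capacity = 100  # units of sellable capacity

# Negotiated model: premium pricing, but much of the capacity sits idle.
negotiated = revenue(price_per_unit=10.0, capacity=capacity, utilization=0.40)

# Commodity model: lower unit price, but the market absorbs the excess.
commodity = revenue(price_per_unit=5.0, capacity=capacity, utilization=0.90)

print(negotiated, commodity)  # 400.0 450.0 -- volume wins despite the lower price
```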
Last month's announcement that Amazon Web Services would start offering excess EC2 capacity through a spot market has also helped legitimize the concept. The notion of selling your excess compute capacity now has a poster child (AWS), which may lead to broader acceptance of selling excess compute resources using a commodities approach, in much the same way that Amazon EC2 has encouraged companies to apply cloud-like strategies to their internal systems (private clouds). In a very real way, AWS is blazing a path for the broader industry.

I see tremendous opportunities in the trading of excess cloud computing resources or compute capacity, and I believe the most viable model may be the energy marketplace. The energy market is similar to bandwidth and compute capacity in that the commodities are variable, transient and don't store well. Selling excess capacity from cloud-centric data centers also makes sense because cloud providers must keep significant additional capacity on hand in case of demand spikes. As I've said before, unused compute capacity = lost revenue. It's better to sell your excess than to have it disappear. For larger players such as telecoms and large content providers, un-utilized compute capacity is making you nothing; a public spot market may help address this problem.

The best template for a compute-centric market may be the electricity wholesale markets. Like compute capacity, electricity is difficult to store because of its transient nature: it needs to be available on demand, and unpredictable demand spikes may occur. Using the energy trading market as a model provides an existing, proven context that may translate well into compute-centric environments, not to mention that there is a wide variety of trading platforms already built that might be easily modified to meet the needs of a compute exchange market.
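One way such energy-style markets clear is to match supply offers against demand bids at a single uniform price. The sketch below is a toy illustration of that idea applied to compute capacity; the function name, the prices and quantities, and the midpoint-pricing convention are all my own assumptions, not how any real exchange works:

```python
# Toy sketch of uniform-price market clearing, loosely modeled on electricity
# wholesale markets. All names, prices and quantities are illustrative
# assumptions; a real exchange would add bid validation, time slots, settlement.

def clear_market(offers, bids):
    """Match supply offers against demand bids at a single clearing price.

    offers: list of (ask_price, quantity) tuples sellers will accept.
    bids:   list of (bid_price, quantity) tuples buyers will pay.
    Returns (clearing_price, cleared_quantity).
    """
    offers = sorted(offers)             # cheapest supply first
    bids = sorted(bids, reverse=True)   # highest-paying demand first
    oi = bi = 0
    o_rem = offers[0][1] if offers else 0
    b_rem = bids[0][1] if bids else 0
    price, cleared = None, 0
    while oi < len(offers) and bi < len(bids):
        ask, bid = offers[oi][0], bids[bi][0]
        if ask > bid:                   # supply and demand no longer cross
            break
        qty = min(o_rem, b_rem)
        cleared += qty
        price = (ask + bid) / 2         # one convention: split the marginal surplus
        o_rem -= qty
        b_rem -= qty
        if o_rem == 0:
            oi += 1
            o_rem = offers[oi][1] if oi < len(offers) else 0
        if b_rem == 0:
            bi += 1
            b_rem = bids[bi][1] if bi < len(bids) else 0
    return price, cleared

# Cheap supply meets strong demand: 50 units clear at the marginal midpoint.
print(clear_market([(2, 50), (5, 50)], [(8, 40), (4, 40)]))  # (3.0, 50)
```

An auction-based market, by contrast, would run this kind of clearing at fixed intervals (yearly, daily, or minute-by-minute for a spot tier) rather than continuously.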
One of the more common energy trading models uses an automated central scheduler to balance supply and demand and calculate the market price. Another model conducts auctions on various time scales, i.e. auctions for yearly and daily provision of power, with an additional spot market to accommodate short-term demand spikes.

Before a widely accepted commodity compute trading market can form and begin trading, governments may also need to provide a common regulatory framework along with standards and liability controls. Otherwise the market will be doomed to serve as a novelty or, worse yet, be limited to academic use only.

So what's next? First, a trading organization must form, preferably in a transparent, not-for-profit context, so as to help avoid future Enron-type scenarios. We would also need the capital to develop such a trading platform, the will of the industry to make it happen, and some standard processes for measuring the cloud capacity itself. So will this happen? Certainly, but the question of when is still up for debate.
[#SmartGrid #スマートグリッド] The first genuinely substantial article I've read in a while: a vision for turning the cloud into a utility and then building a public trading market on top of it. Comments welcome!
The spot pricing scheme announced by Amazon Web Services can be seen as confirming the feasibility of exactly this concept: computing power that would otherwise go to waste if unsold is auctioned off at whatever price customers are willing to pay. It is remarkably efficient, and a remarkably green solution as well.
Surprisingly, I suspect the chances of this being realized in Japan are quite high. Japan's "closed" market might actually work to its advantage here. Though it would sting a little if the infrastructure ended up entirely US-made.