Archive for March, 2009

Cloud Ambition: Standardization Later!

If you have any interest in IT at all, you may have come across the history of the evolution of computers from humongous building complexes to today’s array of gizmos and gadgets. The forward march of computing has often had moments where the innovation in question was an island: isolated from sharing data, peripherals and perhaps even the experiences of technicians. Within the Personal Computer (PC) industry, standards bodies have become the norm, defining both hardware specifications and the protocols that enable exchange of data between hardware and software made by different developers. Some have argued that too much concern for standards and standardization leads to stagnation in innovation, as each vendor does only what is necessary to meet the standard. Yet breaking away from the cover of standards can isolate a vendor’s user base, leaving it unable to work effectively in an almost always heterogeneous information systems environment.

As an example of the detrimental effects of standards and standardization on software development, consider the pace of innovation at the height of the first browser wars between Netscape and Microsoft. These two players pushed the envelope by not waiting for the slow and arduous standards-making process; instead they made extensions to existing standards and released products into the market. The standards bodies and their standardization processes eventually caught up with what was happening in the real world, but more often than not the drive to push the envelope was never reduced to mere strategic maneuvers and machinations. Those who are familiar with this subject will quickly espouse the virtues of standards: the harmony they provide, the freedom from vendor lock-in and the rest of the good stuff. Perhaps in support of the overall benefits of standards and standardization, Microsoft (the victor in the first browser wars) is now saddled with the responsibility of making its browser standards-compliant without breaking web sites and applications that have come to rely on its extensions.

The story of standards and standardization does not, however, end there: it is indeed very rare to find new, paradigm-shifting products and services created by standards bodies. The main goal of a standards body is to provide the venue and mechanism to build and maintain consensus among parties that should ideally have similar, or at least comparable, end results in mind. Such an environment rarely lends itself to revolutionary ideas; the responsibility of pushing forward the state of the art falls on visionary and insightful individuals and corporations.

In this day and age, the cloud has become the new buzzword heard all across the internet and the tech news media. The cloud does present a unique architectural challenge but, at the same time, it is perhaps the shape of things to come. Earlier today I was reading an article that posits the demise of relational database management systems (RDBMS) because of the rise of the internet and its demand to scale up and down at will. We are talking about a technology, the RDBMS, that countless enterprises and applications rely on in some way, but of perhaps greater concern is the amount of data that is locked up in these databases. Don’t get me wrong, I am not raising an alarm over the data stored in RDBMS, because when it comes down to it, a method of bridging the RDBMS with whatever eventually succeeds it can be provided. Key aspects of the relational model have been standardized, which means the degree of interoperability is much higher as vendors strive to implement at least the basic specifications.

With the incessant push towards the cloud, a data storage paradigm that suits the demands of cloud computing could emerge. Amazon already provides cloud services through EC2 and S3, with additional services from other vendors at various stages of development and deployment. Of particular interest in these cloud storage services is their use of APIs to interact with data. If you have looked at the evolution of data storage, you will remember that once upon a time data and the application processing it were one and the same. The obvious advantage was that such an application performed quite well, but the value of data locked up in application code is much diminished.

If anything, it has always been claimed that the data an application produces will outlast the application that created it. Thanks to the relational model, data became independent of applications. The push towards cloud computing, with its numerous APIs at this point, would perhaps seem like a step back from the uniformity and sanity that the relational model brought to application development.

Microsoft’s push towards the cloud currently goes by the name of the Azure Services Platform. Architecturally, it is an interesting piece of infrastructure to put in place, and it represents the core of what Microsoft eventually intends to hoist upon the cloud in the hope of giving its desktop dominance a new lease on life. Of greater significance is Microsoft’s approach to the cloud: Software + Services. Windows and Office represent a lucrative install base for Microsoft, and from that perspective it makes sense for them to attempt to migrate onto the cloud some of the technologies that have proven invaluable on the desktop. First of all, the revenue opportunity of enabling some sort of backward compatibility with the existing Windows and Office install base is too big to ignore. Keep in mind that data is immortal (well, almost immortal). The more interesting perspective is that transferring desktop standards onto the web would also give the legions of Windows and Office developers the opportunity to work with tools they are already familiar with.

The aforementioned reasons are compelling enough to have Microsoft push for a relational implementation of some kind of cloud storage, and that is already happening with their SQL Server Data Services push. Whether this works out in the long run is a matter of wait and see. From a strategic perspective, I really do hope that Microsoft sees the cloud as different from the desktop market, because the rules are different. It will be interesting to find out how well the relational model adapts to future applications and demands: being able to implement a cloud-aware relational model may not be enough, as it needs to address future demands rather than merely serve as a transition bridge from one era of computing to the coming wave.

Google and Amazon are not tied down by existing user bases or a need to protect and leverage existing market shares in the desktop space; these two companies are versed in the ways of running internet businesses. Google has the leading search engine, which Microsoft is yet to challenge successfully in the market. Amazon is well known for its online ecommerce store, but it is also branching out in other directions. Beyond search and advertising, Google has assembled a veritable collection of online services and infrastructure, such as the Google App Engine, which sits on top of a proprietary BigTable implementation. Access to such storage services is through Google-defined APIs and, at this point in the game, interoperability between these cloud services is the last question on any of the players’ collective and individual minds.

It is not only the commercial world that is having cloud dreams; an upcoming release of the popular Ubuntu Linux, called Karmic Koala, will also include technologies that allow it to participate in the cloud, with hooks into Amazon’s cloud services.




java.lang.NoSuchMethodError: net.sf.ehcache.Cache.<init>

This entry is about the aforementioned exception, which was thrown in an application I am working on.


The application uses the following libraries:

  • Spring framework (version 2.5)
  • Hibernate JPA (version 3.2.x)
  • Acegi Security


As the exception indicates, the error is thrown by the ehcache library, in my case version 1.2.3. In simple terms, a constructor (the <init> method) could not be found in that version of ehcache. The stack trace of the exception shows that the absence of the constructor broke the creation of a cache for use by Acegi. The cache bean had been configured in Spring so that it could be injected into the Acegi authentication & authorization beans.
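A quick way to confirm whether the class that actually got loaded declares a given constructor is reflection. The helper below is my own diagnostic sketch (the class and method names are mine, not from ehcache or Spring); it is demonstrated with JDK classes, but with ehcache on the class path you would probe net.sf.ehcache.Cache the same way.

```java
import java.lang.reflect.Constructor;

// Diagnostic sketch (not from any of the libraries above): checks whether a
// class visible on the class path declares a constructor with a given
// signature. A NoSuchMethodError for <init> at runtime usually means the
// class you compiled against declared that constructor, but the version of
// the class that was loaded does not.
public class ConstructorCheck {

    // Returns true if className declares a constructor taking exactly paramTypes.
    static boolean hasConstructor(String className, Class<?>... paramTypes) {
        try {
            Constructor<?> c = Class.forName(className).getDeclaredConstructor(paramTypes);
            return c != null;
        } catch (ClassNotFoundException | NoSuchMethodException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // StringBuilder declares a (int capacity) constructor but no (float) one.
        System.out.println(hasConstructor("java.lang.StringBuilder", int.class));   // true
        System.out.println(hasConstructor("java.lang.StringBuilder", float.class)); // false
    }
}
```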

How did version 1.2.3 of ehcache end up on the application’s class path? Well, that boils down to my recent decision to switch from TopLink to Hibernate JPA as the JPA library. The Hibernate JPA bundle in my IDE uses version 1.2.3 of ehcache instead of version 1.3.0, which I also have on my class path.
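When two versions of a jar are on the class path, it also helps to ask the JVM which one actually supplied the class. The sketch below is a hypothetical helper of my own (not part of ehcache, Spring or Acegi): it prints a class’s code source location, and pointing it at net.sf.ehcache.Cache would show whether the 1.2.3 or the 1.3.0 jar won.

```java
import java.security.CodeSource;

// Hypothetical class-path diagnostic: reports which jar or directory a
// class was loaded from, which is how a stale jar hiding on the class path
// can be spotted.
public class JarLocator {

    // Returns the code source location of the class. Bootstrap/platform
    // classes have a null code source, so a note is returned instead.
    static String locate(Class<?> cls) {
        CodeSource src = cls.getProtectionDomain().getCodeSource();
        return (src != null) ? src.getLocation().toString()
                             : "(platform class - no code source)";
    }

    public static void main(String[] args) {
        // With ehcache on the class path you would use, e.g.:
        //   locate(Class.forName("net.sf.ehcache.Cache"))
        System.out.println(locate(JarLocator.class));
    }
}
```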


The solution that worked for me (suggested in the reference below) was to remove the old version of ehcache and use ehcache version 1.3.0 instead.

