
Sunday, October 29, 2006

Jon Udell of InfoWorld has a podcast with John Schneider, the CTO of AgileDelta, who is "...evangelizing Efficient XML, an alternate binary syntax for XML." In the podcast they "...discuss the motivations for this proposed W3C standard, its theoretical foundations, and its uses." It is an enjoyable podcast, and John is an articulate proponent of the need for an alternate binary representation of XML.

The intent with this approach is to use a binary encoding of the XML Infoset, rather than the standard textual XML serialization, as the transfer format (which, among other benefits, reduces the size of the message on the wire). The Efficient XML Interchange Working Group at the W3C has been tasked with gathering use cases and evaluating approaches to binary encodings. As of March 15, 2006, the following proposals had been submitted to this working group:
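To make the size argument concrete, here is a toy dictionary-based encoder. It is NOT the actual Efficient XML format (or any of the submitted proposals); it just illustrates the core idea these schemes share: replacing repeated tag strings and markup characters with small integer tokens.

```python
# Illustrative only: a toy "binary XML" encoder, not Efficient XML.
# Repeated element names are replaced by small integer tokens from a
# string table, so each start/end tag costs 2 bytes instead of a full
# "<name>" string.
import struct

def toy_binary_encode(events):
    """events: list of (kind, value) tuples, kind in {'start', 'end', 'text'}."""
    table = {}          # element name -> small integer token
    out = bytearray()
    for kind, value in events:
        code = {'start': 0, 'end': 1, 'text': 2}[kind]
        if kind == 'text':
            data = value.encode('utf-8')
            out += struct.pack('BB', code, len(data)) + data
        else:
            token = table.setdefault(value, len(table))
            out += struct.pack('BB', code, token)
    return bytes(out)

# The same small document in both forms:
events = [('start', 'order'), ('start', 'item'), ('text', '42'),
          ('end', 'item'), ('start', 'item'), ('text', '17'),
          ('end', 'item'), ('end', 'order')]
textual = "<order><item>42</item><item>17</item></order>"
binary = toy_binary_encode(events)
print(len(textual.encode('utf-8')), len(binary))
```

Even on this tiny document the tokenized form is less than half the size of the textual one, and the gap widens as tag names repeat, which is exactly the limited-bandwidth case these proposals target.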

As a technologist, I have to admit that this approach offers a technically elegant solution to the content bloat that can be associated with XML and to the need to transfer data over limited bandwidth.

But I do have concerns…

To me, interoperability is achieved through a combination of open standards AND the adoption and implementation of those standards in vendor tooling, such that by following accepted and adopted practices you have a means of exchanging information across platforms, languages and toolsets in a seamless manner. I do not believe that Binary XML meets this criterion, in the current timeframe, as:

  1. It is not an accepted standard for content encoding in the web services world (as evidenced by its lack of adoption in the vendor toolkits from IBM, Microsoft, BEA and a host of others).
  2. Binary encoding support is not on the technology roadmap of any vendor (other than Sun, perhaps, for Fast Infoset), and
  3. In the current world, for a binary-encoding-based exchange to happen, a custom encoder/decoder is required on both ends of the conversation, i.e. there is no out-of-the-box support for it in the current web services stacks.

The way these approaches deal with interoperability is to use the binary encoding, instead of the regular textual XML encoding, when an endpoint detects that the other endpoint supports it, which of course requires the remote endpoint to support that particular binary encoder/decoder to get the benefits. Otherwise you are just falling back, from a performance or size perspective, to regular XML web services. Some binary XML folks define this ability to dynamically switch between binary encoding and XML encoding as being interoperable. I do not, as in my view you do NOT get both interoperability AND performance when you choose this approach. You get one or the other.
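The negotiation pattern I am describing can be sketched in a few lines. The media type name here is hypothetical, not from any real stack or registry; the point is only that the fast path exists when both ends happen to agree, and the interoperable path is plain XML.

```python
# Sketch of the encoding-negotiation pattern described above. The binary
# media type name is hypothetical (illustrative only).
def choose_encoding(remote_accepts):
    """remote_accepts: set of media types the remote endpoint has advertised."""
    if "application/x-binary-xml" in remote_accepts:  # hypothetical media type
        return "application/x-binary-xml"             # both ends agree: fast path
    return "text/xml"                                 # interoperable fallback

print(choose_encoding({"text/xml"}))
print(choose_encoding({"text/xml", "application/x-binary-xml"}))
```

Note that the first call gets interoperability and the second gets performance; no single call gets both, which is exactly the trade-off described above.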

The only way you would get BOTH interoperability AND performance is if one or more of these binary encoding schemes actually became a publicly accepted web service standard and was adopted by the platform vendors like IBM, Microsoft, BEA, and Sun into their web service implementations. We are not even close on that one, as of yet.

There is an even more critical issue that comes into play when choosing to use binary encodings. A SOA implemented using web services is not just about point-to-point web services. It is about web services that pass messages via intermediaries that may perform various actions on those messages (e.g. Content Mediation, which allows a message to be transformed from one format to another; Security Mediation, which allows enforcement of security policies; etc.). The caveat in this case is that these intermediaries need to understand the data format in order to act on and process these messages. They can do so in the current state of technology because these intermediaries (ESBs, XML Gateways, Orchestration Engines, etc.) are built to publicly supported standards. Binary XML is currently NOT standardized, which means that these types of technologies have no visibility into, and cannot act on or enforce policies on, messages that use binary encodings!
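A minimal sketch of that intermediary problem, with an illustrative policy and hypothetical names: a gateway can only enforce a content-based policy on formats it can actually parse. Hand it an opaque binary payload and the best it can do is refuse (or blindly forward, which defeats the policy).

```python
# Sketch of the intermediary-visibility problem. The gateway, the policy,
# and the binary media type name are all illustrative assumptions.
import xml.etree.ElementTree as ET

def gateway_enforce(payload, content_type):
    if content_type != "text/xml":
        # Opaque binary encoding: no standard parser available, so the
        # gateway cannot inspect the message or apply its policy.
        raise ValueError("cannot enforce policy on opaque encoding: " + content_type)
    root = ET.fromstring(payload)
    # Illustrative policy: block messages carrying a <creditCard> element.
    if root.find(".//creditCard") is not None:
        raise PermissionError("policy violation: creditCard must not pass through")
    return "forwarded"

print(gateway_enforce("<order><item>42</item></order>", "text/xml"))
```

With a standardized, widely implemented binary format, the gateway could ship a decoder for it and the first check would go away; until then the binary path is invisible to this whole class of infrastructure.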

Does this mean that this will always be the case? No! But unless and until there is a standard around this, and that standard is widely supported by the vendors who build the tools that allow us to build services, this will at best be a proprietary point-to-point solution to a very specific problem, one that requires you to step away from the standards that promote interoperability.

In all fairness, driving towards a standard is precisely what folks like John Schneider are working on at the W3C, and he notes in the podcast that we are, optimistically, more than two years away from any type of standard. Then, of course, we have to wait for the standard to be implemented by the various vendors.

That said, I do look forward to the time when there is indeed a binary XML standard that is baked into all the web service stacks, because the limited-bandwidth use case that the solution targets is a very real one.

UPDATE (April 2007): Per the W3C EXI Status Page, as of mid-November 2006 the group has selected Efficient XML as the basis for the proposed encoding specification. Present work centers on integrating features from the other measured format technologies into Efficient XML, particularly variations for more efficient structural and value encodings. The first public working draft of the format specification is expected in early May 2007.
