Communication standards: Götterdämmerung?

Leonardo Chiariglione - CSELT, Italy

 

Abstract

This paper addresses the role of standards as the enablers of communication and how that role can be preserved in the digital age. It is divided into three parts. In the first part, the main communication systems that human ingenuity has invented, the standards that made them possible, and the development of standards bodies and their working practices are revisited. In the second part, the effects of the digital revolution are analysed, in particular the new role of intellectual property, and a dispassionate assessment of the current situation of standards bodies is made. Lastly, a radical framework proposal to preserve the role of standardisation is made. Those in a real hurry can simply go to the conclusions.

 


Table of contents

  1. Introduction
  2. Communication systems
  3. Communication standards
  4. The standards bodies
  5. The standards bodies at work
  6. The digital communication age
  7. Intellectual Property
  8. Not everything that shines is gold
  9. A way out
  10. Conclusions
  11. Acknowledgements

 


1. Introduction

Communication standards are at the basis of civilised life. Human beings can achieve collective goals through sharing a common understanding that certain utterances are associated with certain objects and concepts, all the way up to certain intellectual values. Civilisation is preserved and enhanced from generation to generation because there is an agreed mapping between certain utterances and certain signs on paper that enables a human being to leave messages to posterity and posterity to revisit the experience of people who departed long ago.

Over the centuries, the simplest communication means that have existed since the remotest antiquity have been supplemented by an endless series of new ones: printing, photography, telegraphy, telephony, television, and the new communication means like electronic mail and the World Wide Web.

New inventions made possible new communication means, but before these could actually be deployed some agreement about the meaning of the "symbols" used by the communication means was necessary. Telegraphy is a working communication means only because there is an agreement on the correspondence between certain combinations of dots and dashes and characters, and so is television because there is an agreed procedure for converting certain waveforms into visible and audible information. The ratification and sometimes the development of these agreements - called standards - are what standards bodies are about. Standards bodies exist today at the international and national levels, industry-specific or across industries, tightly overseen by governments or largely independent.

Many communication industries, among these the telecommunication and broadcasting industries, operate and prosper thanks to the existence of widely accepted standards. They have traditionally valued the role of standards bodies and have often provided their best personnel to help them achieve their goal of setting uniform standards on behalf of their industries. In doing so, they were driven by their role of "public service" providers, a role legally sanctioned in most countries until very recently. Other industries, particularly the consumer electronics and computer industry, have taken a different attitude. They have "defined" communication standards either as individual companies or as groups of companies and then tried to impose their solution on the marketplace. In the case of a successful outcome, they (particularly the consumer electronics industry) eventually went to a standards body for ratification.

The two approaches have been in operation for enough time to allow some comparisons to be drawn. The former has given stability and constant growth to its industries and universal service to the general citizenship, at the price of a reduced ability to innovate: the telephone service is ubiquitous but has hardly changed in the past 100 years; television is enjoyed by billions of people around the world but is almost unchanged since its first deployment 60 years ago. The latter, instead, has provided a vibrant, innovative industry. Two examples are provided by the Personal Computer (PC) and the compact disc (CD). Both barely existed 15 years ago, and now the former is changing the world and the latter has brought spotless sound to hundreds of millions of homes. The other side of the coin is the fact that the costs of innovation have been borne by the end users, who have constantly struggled with incompatibilities between different pieces of equipment or software ("I cannot open your file") or have been forced to switch from one generation of equipment to the next simply because some dominant industry decreed that such a switch was necessary.

Privatisation of telecommunication and media companies in many countries with renewed attention to the cost/benefit bottom line, the failure of some important standardisation projects, the missing sense of direction in standards, and the lure that every company can become "the new Microsoft" in a business are changing the standardisation landscape. Even old supporters of formal standardisation are now questioning, if not the very existence of those bodies, at least the degree of commitment that was traditionally made to standards development.

The author of this paper is a strong critic of the old ways of formal standardisation that have led to the current diminished perception of its role. Having struggled for years with incompatibilities in computers and consumer electronics equipment, he is equally averse to the development of communication standards in the marketplace. He thinks the time has come to blend the good sides of both approaches. He would like to bring his track record as evidence that a Darwinian process of selection of the fittest can and should be applied to standards making, and that having standards is good for expanding existing businesses as well as for creating new ones. All this without favouring any particular industry, but working for all industries having a stake in the business.

This paper revisits the foundations of communication standards, analyses the reasons for the decline of standards bodies and proposes a framework within which a reconstruction of standardisation on new foundations should be made.

 

2. Communication systems

Since the remotest antiquity, language has been a powerful communication system capable of conveying from one mind to another simple and straightforward as well as complex and abstract concepts. Language has not been the only communication means to have accompanied human evolution: body gesture, dance, sculpture, drawing, painting, etc. have all been invented to make communication a richer experience.

Writing evolved from the last two communication means. Originally used for point-to-point communication, it was transformed into a point-to-multipoint communication means by amanuenses. Libraries, starting with the Great Library of Alexandria in Egypt, were used to store books and enable access to written works.

The use of printing in ancient China and, in the West, Gutenberg's invention brought the advantage of making the reproduction of written works cheaper. The original simple system of book distribution eventually evolved to a two-tier distribution system: a network of shops where end users could buy books. The same distribution system was applied for newspapers and other periodicals.

Photography enabled the automatic reproduction of a natural scene, without the need to hire a painter. From the early times when photographers built everything from cameras to light-sensitive emulsions, this communication means has evolved to a system where films can be purchased at shops that also collect the exposed films, process them, and provide the printed photographs.

Postal systems existed for centuries but their use was often restricted to kings or the higher classes. In the first half of the 19th century different systems developed in Europe that were for general correspondence use. The clumsy operational rules of these systems were harmonised in the second half of that century so that prepaid letters could be sent to all countries of the Universal Postal Union (UPU).

The exploitation of the telegraph (started in 1844) allowed the instant transmission of a message composed of Latin characters to a distant point. This communication system required the deployment of an infrastructure - again two-tier - consisting of a network of wires and of telegraph offices where people could send and receive messages. The invention of facsimile, a device enabling the transmission of the information on a piece of paper to a distant point, dates from about the same time (1850), even though its practical exploitation had to wait another 100 years for effective scanning and reproduction techniques. The infrastructure needed by this communication system was the same as telephony's.

Thomas A. Edison's phonograph (1877) was another communication means that enabled the recording of sound for later playback. Creation of the master and printing of discs required fairly sophisticated equipment, but the reproduction equipment was relatively inexpensive. Therefore the distribution channel developed in a very similar way as for books and magazines.

If the phonograph had allowed sound to cross the barriers of time and space, telephony enabled sound to overcome the barriers of space in virtually no time. The simple point-to-point model of the early years gave rise to an extremely complex hierarchical system. Today any point in the network can be connected with any other point.

Cinematography (1895) made it possible for the first time to capture not just a snapshot of the real world but a series of snapshots that, when displayed in rapid succession, appeared to the eye to reproduce something very similar to real movement. The original motion pictures were later supplemented by sound to give a complete reproduction satisfying both the aural and visual senses.

The exploitation of the discovery that electromagnetic waves could propagate in the air over long distances produced wireless telegraphy (1896) and sound broadcasting (1920). The frequencies used at the beginning of sound broadcasting were such that a single transmitter could, in principle, reach every point on the globe by suitably exploiting propagation in the higher layers of the atmosphere. Later, with the use of higher frequencies, only more geographically restricted areas, such as a continent, could be reached. Eventually, with the use of Very High Frequencies (VHF), sound broadcasting became a more local business where again a two-tier distribution system usually had to be put in place.

The discovery of the capability of some materials to generate current when exposed to light, coupled with the older cathode ray tube (CRT), capable of generating light from an electron beam driven by an applied voltage, gave rise to the first communication system that enabled the real-time capture of a visual scene, simultaneous transmission to a distant point and regeneration of a moving picture on a CRT screen. This technology, even though demonstrated early on for person-to-person communication, found wide use in television broadcasting. From the late 1930s in the United Kingdom, television provided a powerful communication means with which both the aural and visual information generated at some central point could reach distant places in no time. Because of the high (VHF) frequencies involved, television was a national communication system based on a two-tier infrastructure. The erratic propagation characteristics of VHF in some areas prompted the development of alternative distribution systems: at first by cable, referred to as CATV (community antenna television), and later by satellite. The latter expanded television from a national business to at least a continental scale.

The transformation of the aural and visual information into electric signals made possible by the microphone and the television pick-up tube prompted the development of systems to record audio and video information in real time. Eventually magnetic tapes contained in cassettes provided consumer-grade systems, first for audio and later for video.

Automatic Latin character transmission, either generated in real time or read from a perforated paper band, started at the beginning of this century with the teletypewriter. This evolved to become the telex machine, until 10 years ago a ubiquitous character-based communication tool for businesses.

The teletypewriter was also one of the first machines used by humans to communicate with a computer, originally via a perforated paper band and, later, via perforated cards. Communication was originally carried out using a sequence of coded instructions (machine language instructions) specific to the computer make that the machine would execute to carry out operations on some input data. Later, human-friendlier programming (i.e., communication) languages were introduced. Machine native code could be generated from the high-level language program by using a machine-specific converter called a compiler.

With the growing amount of information processed by computers, it became necessary to develop systems to store digital information. The preferred storage technology was magnetic, on tapes and disks. Whereas with audio and video recorders the information was already analogue and a suitable transducer would convert a current or voltage into a magnetic field, information in digital form required systems called modulation schemes to store the data in an effective way. A basic requirement was that the information had to be "formatted".

Transmitting digital data over telephone lines posed a similar problem, with the added difficulty of the very variable characteristics of telephone lines. Just as the information stored on a disk or tape was formatted, the information sent across a telephone line was organised in packets.

In the 1960s the digital information processing proper to the computer was introduced into the telephone and other networks. At the beginning this was for the purpose of processing signaling and operating switches, to cope with the growing complexity of the telephone network and to provide interesting new services made possible by the flexibility of electronic computing machines.

Far-reaching was the exploitation of a discovery of the 1930s (the so-called Nyquist sampling theorem): a bandwidth-limited signal can be reproduced faithfully if sampled at a frequency greater than twice its bandwidth. At the transmitting side the signal was sampled and quantised, and the output represented by a set of bits. At the receiving side the opposite operation was performed. At the beginning this was applied only to telephone signals, but progress in microelectronics, with its ability to perform sophisticated digital signal processing using silicon chips of increasing complexity, later allowed the handling in digital form of such wideband signals as television.
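
As a minimal sketch of this transmit/receive chain (an illustration added here, not part of the original text; the parameter values are typical telephone figures assumed for the example), the following Python fragment samples a band-limited test tone above the Nyquist rate, quantises each sample to 8 bits, and maps the integer codes back to amplitudes:

    import math

    # Assumed illustrative parameters: a tone within a 3.4 kHz "telephone" band,
    # sampled at 8 kHz (> 2 x bandwidth, satisfying Nyquist), 8 bits per sample.
    SAMPLE_RATE_HZ = 8000
    BITS_PER_SAMPLE = 8
    LEVELS = 2 ** BITS_PER_SAMPLE

    def encode(duration_s=0.01, tone_hz=1000):
        """Transmitting side: sample and uniformly quantise a test tone."""
        n_samples = int(duration_s * SAMPLE_RATE_HZ)
        codes = []
        for n in range(n_samples):
            x = math.sin(2 * math.pi * tone_hz * n / SAMPLE_RATE_HZ)   # amplitude in [-1, 1]
            codes.append(int(round((x + 1) / 2 * (LEVELS - 1))))       # integer code 0..255
        return codes

    def decode(codes):
        """Receiving side: map integer codes back to amplitudes in [-1, 1]."""
        return [2 * q / (LEVELS - 1) - 1 for q in codes]

    samples = decode(encode())
    print(len(samples), "samples,", len(samples) * BITS_PER_SAMPLE, "bits")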

As the number of bits needed to represent sampled and quantised signals was unnecessarily large, algorithms were devised to reduce the number of bits by removing redundancy without affecting the quality of the signal too much (or, in the case of facsimile, at all). The conversion of heretofore analogue signals into binary digits and the existence of a multiplicity of analogue delivery media prompted the development of sophisticated modulation schemes. A design parameter for these schemes was the ability to pack as many bits per second as possible into a given frequency band without affecting the reliability of the transmitted information.
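
To put numbers on that design parameter (an illustrative calculation added here, with textbook values assumed rather than taken from the paper): a modulation scheme that maps log2(M) bits onto each of M possible signal states delivers a bit rate equal to the symbol rate times the bits per symbol, so richer constellations pack more bits into the same band at the cost of a reduced noise margin.

    import math

    # Assumed illustrative values for a voice-grade line.
    symbol_rate_baud = 2400            # symbols per second
    constellation_sizes = [4, 16, 64]  # e.g. QPSK, 16-QAM, 64-QAM

    for m in constellation_sizes:
        bits_per_symbol = math.log2(m)
        print(f"{m:>3} states: {symbol_rate_baud * bits_per_symbol:.0f} bit/s")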

The conversion of different media in digital form triggered the development of receivers - called decoders - capable of understanding the sequences of bits and converting them into audible and/or visible information. A similar process also took place with "pages" of formatted character information. The receivers in this case were called "browsers" because they could also move across the network using addressing information embedded in the coded page.

The growing complexity of computer programs started breaking up what used to be monolithic software packages. It became necessary to define interfaces between layers of software so that software packages from different sources could interoperate. This need gave rise to the standardisation of APIs (application programming interfaces) and the advent of "object-oriented" software technology.
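
To make the notion of an API concrete (a hypothetical sketch; the names below are invented for illustration and do not refer to any of the standards mentioned in this paper), the Python fragment defines a small interface that an application layer programs against, so that modules from different sources can be swapped without touching the layer above:

    from abc import ABC, abstractmethod

    class AudioDecoder(ABC):
        """Hypothetical API: the contract the application layer relies on."""

        @abstractmethod
        def decode(self, bitstream: bytes) -> list:
            """Turn a coded bitstream into a list of audio samples."""

    class VendorADecoder(AudioDecoder):
        def decode(self, bitstream: bytes) -> list:
            # Placeholder: a real product would implement a coding standard here.
            return [b / 255.0 for b in bitstream]

    def play(decoder: AudioDecoder, bitstream: bytes) -> None:
        # The application layer only knows the interface, not the implementation.
        samples = decoder.decode(bitstream)
        print(f"decoded {len(samples)} samples")

    play(VendorADecoder(), bytes(range(16)))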

 

3. Communication standards

For any of the manifold ways of communication described in the preceding section, it is clear that there must be an agreement about the way information is represented at the point where information is exchanged between communicating systems. This is true for language, which is a communication means because there exists an agreement by members of a group that certain sounds correspond to certain objects or concepts. For languages such as Chinese, writing can be defined as the agreement by members of a group that some graphic symbols, isolated or in groups, correspond to particular objects or concepts. For languages such as English, writing can be defined as the agreement by members of a group that some graphic symbols, in certain combinations and subject to certain dependencies, correspond to certain basic sounds that can be assembled into compound sounds and traced back to particular objects or concepts. In all cases mentioned an agreement - a standard - about the meaning is needed if communication is to take place.

Printing offers another meaning of the word "standard". Originally, all pieces needed in a print shop were made by the people in the print shop itself or in some related shop. As the technology grew in complexity, however, it became convenient to agree - i.e., to set standards - on a set of character sizes so that one shop could produce the press while another could produce the characters. This was obviously beneficial because the print shop could concentrate on what it was supposed to do best, print books. This is the manufacturing-oriented definition of standardisation that is found in the Encyclopaedia Britannica: "Standardisation, in industry: imposition of standards that permit large production runs of component parts that are readily fitted to other parts without adjustment". Of course, communication between the author of a book and the reader is usually not hampered if a print shop decides to use characters of a non-standard size or a different font. However, the shop may have a hard time finding them or may even have to make them itself.

The same applies to photography. Cameras were originally produced by a single individual or shop and so were the films, but later it became convenient to standardise the film size so that different companies could specialise in either cameras or films. Again, communication between the person taking the picture and the person to whom the picture is sent is not hampered if pictures are taken with a camera using a non-standard film size. However, it may be harder to find the film and get it processed.

Telegraphy was the first example of a new communication system, based on a new technology, that required agreement between the parties if the sequence of dots and dashes was to be understood by the recipient. Interestingly, this was also a communication standard imposed on users by its inventor. Samuel Morse himself developed what is now called the Morse alphabet, and the use of the alphabet bearing his name continues to this day.

The phonograph also required standards, namely the amplitude corresponding to a given intensity and the speed of the disc, so that the sound could be reproduced without intensity and frequency distortions. As with telegraphy the standard elements were basically imposed by the inventor. The analogue nature of this standard makes the standard apparently less constraining, because small departures from the standard are not critical. The rotation speed of the turntable may increase but meaningful sound can still be obtained, even though the frequency spectrum of the reproduced signal is distorted.

Originally, telephony required only standardisation of the amplitude and frequency characteristic of the carbon microphone. However, with the growing complexity of the telephone system, other elements of the system, such as the line impedance and the duration of the pulse generated by the rotary dial, required standardisation. As with the phonograph, small departures from the standard values did not prevent the system from providing the ability to carry speech to distant places, with increasing distortions for increasing departures from the standard values.

Cinematography, basically a sequence of photographs each displayed for a brief moment - originally 16 and later 24 times a second - also required standards: the film size and the display rate. Today, visual rendition is improved by flashing 72 pictures per second on the screen by shuttering each still three times. This is one example of how it is possible to have different communication qualities while using the same communication standard. The addition of sound to the motion pictures, for a long time in the form of a trace on a side of the film, also required standards.

Sound broadcasting required standards: in addition to the baseband characteristics of the sound there was also a need to standardise the modulation scheme (amplitude and later frequency modulation), the frequency bands allocated to the different transmissions, etc.

Television broadcasting required a complex standard related to the way a television camera scans a given scene. The standard specifies how many times per second a picture is taken, how many scan lines per picture are taken, how the signal is normalised, how the beginning of a picture and of a scan line is signaled, how the sound information is multiplexed, etc. The modulation scheme utilised at radio frequency (vestigial sideband) was also standardised.

Magnetic recording of audio and video also requires standards, simpler for audio (magnetisation intensity, compensation characteristics of the non-linear frequency response of the inductive playback head, and tape speed), more complex for video because of the structure of the signal and its bandwidth.

Character coding standards were also needed for the teletypewriter. Starting from the Baudot code a long series of character coding standards were produced that continues today with the 2- and 4-byte character coding of ISO/IEC 10646 (Unicode).

Character coding provides a link to a domain that was not originally considered to be strictly part of "communication": the electronic computer. This was originally a stand-alone machine that received some input data, processed them, and produced some output data. The first data input to a computer were numbers, but soon characters were used. Different manufacturers developed different ways to encode numbers and characters, and different ways of carrying out operations on the data, to suit the internal architecture of their computers. Therefore each type of computing machine required its own "communication standard". Later on, high-level programming languages such as COBOL, FORTRAN, C and C++ were standardised in a machine-independent fashion. Perforations of paper cards and tapes as well as systems for storing binary data on tapes and disks also required standards.
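
A small illustration of the point (added here; Python's standard codecs are used only as a convenient stand-in for the coding agreements named above): the same character is a different sequence of bits under different character-coding standards, which is precisely why the agreements are needed.

    # The character 'A' shares code 65 in ASCII (ISO/IEC 646) and in ISO/IEC 10646.
    print(ord("A"), "A".encode("ascii"))    # 65  b'A'          (7-bit code, one byte)
    # A character outside ASCII, under two encoding forms of ISO/IEC 10646 / Unicode.
    print("€".encode("utf-16-be"))          # b'\x20\xac'       (2-byte form)
    print("€".encode("utf-8"))              # b'\xe2\x82\xac'   (variable-length form)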

With the introduction of digital technologies in the telecommunication sector in the 1960s, standards were required for different aspects such as the sampling frequency of telephone speech (8 kHz), the number of bits per sample (seven or eight for speech), the quantisation characteristics (A-law, µ-law), etc. Other areas that required standardisation were signaling between switches (several CCITT "alphabets") and the way different sequences of bits, each representing a telephone channel, could be assembled (multiplexed). Another important area of standardisation was the way to modulate transmission lines so that they could carry sequences of bits (bit/s) instead of analogue signals (Hertz).
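
The arithmetic behind these figures is simple (a worked example added for illustration; the E1/T1 rates are the standard values of the two incompatible multiplexing hierarchies discussed further on in this paper):

    # Basic PCM telephony arithmetic.
    sample_rate_hz = 8000
    bits_per_sample = 8
    channel_rate = sample_rate_hz * bits_per_sample   # 64 000 bit/s per speech channel

    e1_rate = 32 * channel_rate        # 2.048 Mbit/s: European E1 (30 speech + 2 overhead slots)
    t1_rate = 24 * channel_rate + 8000 # 1.544 Mbit/s: North American T1 (24 channels + framing)
    print(channel_rate, e1_rate, t1_rate)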

The transmission of digital data across a network required the standardisation of addressing information, the packet length, the flow control, etc. Numerous standards were produced: X.25, I.311, and the most successful of all, the Internet Protocol (IP).
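
As a toy illustration of what such a standard must pin down (the header layout below is entirely hypothetical and is not that of X.25 or IP), both ends have to agree on field order, field sizes and byte order before a single payload byte can be interpreted:

    import struct

    # Hypothetical packet format: 4-byte source address, 4-byte destination
    # address, 2-byte payload length, all big-endian.
    HEADER_FMT = "!IIH"

    def build_packet(src, dst, payload):
        return struct.pack(HEADER_FMT, src, dst, len(payload)) + payload

    def parse_packet(packet):
        src, dst, length = struct.unpack_from(HEADER_FMT, packet)
        payload = packet[struct.calcsize(HEADER_FMT):][:length]
        return src, dst, payload

    print(parse_packet(build_packet(0x0A000001, 0x0A000002, b"hello")))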

The compact disc, a system that stored sampled music in digital form, with a laser beam to detect the value of a bit, was a notable example of standardisation: the sampling frequency (44.1 kHz), the number of bits/sample (16), the quantisation characteristics (linear), the distance between holes on the disc surface, the rotation speed, the packing of bits in frames, etc.
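
These parameters fix the raw data rate of the format; a quick check of the arithmetic (added for illustration; stereo, i.e. two channels, is assumed as on the audio CD):

    # Raw (uncompressed) audio data rate of the compact disc.
    sampling_hz = 44_100
    bits_per_sample = 16
    channels = 2
    print(sampling_hz * bits_per_sample * channels)   # 1 411 200 bit/s, about 1.41 Mbit/s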

Systems to reduce the number of bits necessary to represent speech, facsimile, music, and video information utilised exceedingly complex algorithms, all requiring standardisation. Some of them, e.g. the MPEG-1 and MPEG-2 coding algorithms of the Moving Picture Experts Group, have achieved wide fame even with the general public. The latter is used in digital television receivers (set top boxes). Hypertext markup language (HTML), a standard to represent formatted pages, has given rise to the ubiquitous Web browser, actually a "decoder" of HTML pages.

The software world has produced a large number of software standards. In newspaper headlines today is Win32, a set of APIs providing high-level functionalities abstracted from the specifics of the hardware processing unit, which programmers wishing to develop applications on top of the Windows operating system have to follow. This is the most extreme, albeit not unique, case of a standard, as it is fully owned by a single company. The Win32 APIs are constantly enriched with more and more functionalities. One such functionality, again in newspaper headlines these days, is the HTML decoder, alias Web browser. Another is the MPEG-1 software decoder.

 

4. The standards bodies

It is likely that human languages developed in a spontaneous way, but in most societies the development of writing was probably driven by the priesthood. In modern times special bodies were established, often at the instigation of public authorities (PA), with the goal of taking care of the precise definition and maintenance of language and writing. In Italy the Accademia della Crusca (established 1583) took on the goal of preserving the Florentine language of Dante. In France the Académie Française (established 1635) is to this day the official body in charge of the definition of the French Language. Recently the German Bundestag approved a law that amends the way the German language should be written.

The role of PAs in the area of language and writing, admittedly a rather extreme case, is well represented by the following sentence: "La langue est donc un élément clé de la politique culturelle d'un pays car elle n'est pas seulement un instrument de communication ... mais aussi un outil d'identification, un signe d'appartenance à une communauté linguistique, un élément du patrimoine national que l'État entend défendre contre les atteintes qui y sont portées" (language is therefore a key element of the cultural policy of a country because it is not just a communication tool ... but also an identification means, a sign of membership of a language community, an element of the national assets that the State intends to defend against the attacks that are waged against it).

Other forms of communication, however, are of more international concern, or became so fairly soon after their invention. These have invariably seen governments as the major actors. This is the case of telegraphy, post, telephone, radio and television.

The mail service developed quickly after the introduction of prepaid letters in the United Kingdom in 1840. A uniform rate in the domestic service for all letters of a certain weight, regardless of the distance involved, was introduced. At the international level, however, the mail service was bound by a conflicting web of postal services and regulations with up to 1200 rates. The General Postal Union (established 1874 and renamed Universal Postal Union in 1878), defined a single postal territory where the reciprocal exchange of letter-post items was possible with a single rate for all and with the principle of freedom of transit for letter-post items.

A similar process took place for telegraphy. Less than 10 years after the first transmission, telegraphy had become available to the general public in developed countries. At the beginning, telegraph lines did not cross national frontiers because each country used a different system and each had its own telegraph code to safeguard the secrecy of its military and political telegraph messages. Messages had to be transcribed, translated, and handed over at frontiers before being retransmitted over the telegraph network of the neighbouring country. The first International Telegraph Convention was signed in 1865 and harmonised the different systems in use. This was an important step in telecommunication, as it was clearly attractive for the general public to be able to send telegraph messages to every place where there was a telegraph network.

Following the invention of the telephone and the subsequent expansion of telephony, the Telegraph Union began, in 1885, to draw up international rules for telephony. In 1906 the first International Radiotelegraph Convention was signed. The International Telephone Consultative Committee (CCIF) set up in 1924, the International Telegraph Consultative Committee (CCIT) set up in 1925, and the International Radio Consultative Committee (CCIR) set up in 1927 were made responsible for drawing up international standards. In 1927, the Union allocated frequency bands to the various radio services existing at the time (fixed, maritime and aeronautical mobile, broadcasting, amateur, and experimental). In 1934 the International Telegraph Convention of 1865 and the International Radiotelegraph Convention of 1906 were merged to become the International Telecommunication Union (ITU). In 1956, the CCIT and the CCIF were amalgamated to give rise to the International Telephone and Telegraph Consultative Committee (CCITT). Today the CCITT is called ITU-T and the CCIR is called ITU-R.

Other communication means developed without the explicit intervention of governments; they were often the result of a clever invention by an individual or a company that successfully made its way into the market and became an industrial standard. This was the case for photography, cinematography, and recording. Industries in the same business found it convenient to establish industry associations, actually a continuation of a process that had started centuries before with the medieval guilds. Some governments then decided to create umbrella organisations – called national standards bodies – of which all separate associations were members, with the obvious exception of matters related to post, telecommunication and broadcasting that were already firmly in the hands of governments. The first country to do so was, apparently, the United Kingdom with the establishment in 1901 of an Engineering Standards Committee that became the British Standards Institution in 1931. In addition to developing standards, whose use is often made compulsory in public procurement, these national standards bodies often take care of assessing the conformity of implementations to a standard. This aspect, obviously associated in people’s minds with "quality", explains why "quality" often appears in the titles of these bodies, as is the case for the Portuguese standards body IPQ (Instituto Português da Qualidade).

The need to establish international standards developed with the growth of trade. The International Electrotechnical Commission (IEC) was founded in 1906 to prepare and publish international standards for all electrical, electronic, and related technologies. The IEC is currently responsible for standards for such communication means as receivers, audio and video recording systems, and audio-visual equipment, all now grouped in TC 100 (Audio, Video and Multimedia Systems and Equipment). International standardisation in other fields, particularly mechanical engineering, was the concern of the International Federation of the National Standardizing Associations (ISA), set up in 1926. ISA's activities ceased in 1942, but a new international organisation called ISO (International Organisation for Standardisation) began to operate in 1947 with the objective "to facilitate the international coordination and unification of industrial standards". All computer-related activities are currently in the Joint ISO/IEC Technical Committee 1 (JTC 1) on Information Technology. This Technical Committee (TC) has grown to a very large size: about one third of all ISO and IEC standards work is done in JTC 1.

Whereas ITU and UPU are Treaty Organisations (i.e., they have been established by treaties signed by government representatives) and the former has been an agency of the United Nations since 1947, ISO and IEC have the status of private not-for-profit companies established according to the Swiss Civil Code.

 

5. The standards bodies at work

Because "communication", as defined in this paper, is such a wide concept and so many different constituencies with such different backgrounds have a stake in it, there is no such thing as a single way to develop standards. There are, however, some common patterns that are followed by industries of the same kind.

The first industry considered here is the telecommunication industry, meant here to include telegraphy, telephony, and their derivatives. As discussed earlier, this industry had a global approach to communication from the very beginning. Early technical differences, justified by the absence of a need to send or receive telegraph messages between different countries, were soon ironed out, and the same happened to telephony, which could make use of the international body set up in 1865 for telegraphy to promote international telecommunication. In the 130-plus years of its history, what is now ITU-T has gone through various technological phases. Today a huge body of "Study Groups" takes care of standardisation needs: SG 3 (Tariffs), SG 7 (Data Networks), SG 11 (Signaling), SG 13 (Network aspects), SG 16 (Multimedia), etc.

The vast majority of the technical standards at the basis of the telecommunication system have their correspondence in an ITU-T standard. At the regional level, basically in Europe and North America and, to some extent, in Japan, there has always been a strong focus on developing technical standards for matters of regional interest and preparing technical work to be fed into ITU-T. A big departure from the traditional approach of standards of world-wide applicability began in the 1960s with the digital representation of speech: 7 bits/sample advocated by the United States and Japan, 8 bits/sample advocated by Europe. This led to several different transmission hierarchies, because each was based on a different building block, the digitised speech channel. This rift was eventually mended by standards for bitrate-reduced speech, but the hundreds of billions of dollars invested by telecommunication operators in incompatible digital transmission hierarchies could not be recovered. The ATM (asynchronous transfer mode) project gave the ITU-T an opportunity to overcome the differences in digital transmission hierarchies and provide international standards for digital transmission of data. Another departure from the old philosophy was made with mobile telephony: in the United States there is not even a national mobile telephony standard, as individual operators are free to choose a standard of their own liking. This contrasts with the approach adopted in Europe, where the Global System for Mobile communications (GSM) standard is so successful that it is expanding all over the world, United States included. With the Universal Mobile Telecommunication System (UMTS), the so-called 3rd generation of mobile telephony, the ITU-T is reclaiming its original role of developer of global mobile telecommunication standards.

The ITU-R comes from a similar background but had a completely different evolution. The development of standards for sound broadcasting had to take into account the fact that, with the frequencies used at that time, the radio signal could potentially reach any point on the earth. Global sound broadcasting standards became imperative. This approach was continued when the use of VHF for frequency-modulated (FM) sound programs began: FM radio is a broadcasting standard used throughout the world. The case of television was different. A first monochrome television system was deployed in the United Kingdom in the late 1930s, a different one in the United States in the 1940s and yet a different one in Europe in the 1950s. In the 1960s the compatible addition of colour information to the television signal led to a proliferation of regional and national variants of television that continues to this day. The ITU-R was also unable to define a single system for teletext (characters carried in unused television lines to be displayed on the television screen). Yet another failure was the attempt to define a single standard for High Definition Television.

The field of consumer electronics, represented by the IEC, is characterised by an individualistic approach to standards. Companies develop new communication means based on their own ideas and try to impose their products on the market. Applied to audio-based communication means, this has so far led to a single standard generally being adopted by industry soon after the launch of a new product, possibly after a short battle between competing solutions. This was the case with the audio tape recorder, the compact cassette and the compact disc. Other cases have been less successful: for a few years there was competition between two different compressed digital audio products, one using a compact cassette and the other a recordable minidisc. The result has been the demise of one and very slow progress of the other. More battles of this type loom ahead. Video-based products have been less lucky. For more than 10 years a standards battle raged between Betamax and VHS, two different types of videocassette recorder. Contrary to the often-made claim that competition in the marketplace brings better products to consumers, some maintain that the type of videocassette that eventually prevailed is technically inferior to the type that lost the war.

The fields of photography and cinematography (whose standardisation is currently housed, at the international level, in ISO) have adopted a truly international approach. Photographic cameras are produced to make use of one out of a restricted number of film sizes. Cinematography has settled on a small number of formats, each characterised by a certain level of performance.

The computer world has adopted the most individualistic approach of all industries. Computing machines developed by different manufacturers had different Central Processing Unit (CPU) architectures, programming languages, and peripherals. Standardisation took a long time to penetrate this world. The first examples were communication ports (EIA RS 232), character coding (the American Standard Code for Information Interchange - ASCII - later to become ISO/IEC 646) and programming languages (e.g., FORTRAN, later to become ISO/IEC 1539). The hype of computer and telecommunication convergence of the early 1980s prompted the launching of an ambitious project to define a set of standards that would enable communication between a computer of any make and another computer of any make across any network. For obvious reasons, the project, called OSI (Open Systems Interconnection), was jointly executed with ITU-T. In retrospect, it is clear that the idea of having a standard allowing a computer of any make (and at that time there were tens and tens of computers of different makes) to connect to any kind of network, talk to a computer of any make, execute applications on the other computer, etc., no matter how fascinating, had very little prospect of success. And so it turned out, but only after 15 years of effort and thousands of person-years had been spent, when the project was all but discontinued.

For the rest, ISO/IEC JTC 1, as mentioned before, has become a huge standards body. This should be no surprise, as JTC 1 defines information technology to "include the specification, design and development of systems and tools dealing with the capture, representation, processing, security, transfer, interchange, presentation, management, organization, storage and retrieval of information". Just that!

While ISO and ITU were tinkering with their OSI dream, setting out first to design how the world should be and then trying to build it, in a typical top-down fashion, a group of academics (admittedly well funded by their government) were practically building the same world bottom up. Their idea was that once you had defined a protocol for transporting packets of data and, possibly, a flow-control protocol, you could develop all sorts of protocols, such as SMTP (Simple Mail Transfer Protocol), FTP (File Transfer Protocol), HTTP (HyperText Transfer Protocol), etc. This would immediately enable the provision of very appealing applications. In other words, Goliath (ISO and ITU) has been beaten by David (the Internet). Formal standards bodies no longer set the pace of telecommunication standards development.

The need for other communication standards – for computers – was simply overlooked by JTC 1. The result has been the establishment of a de facto standard, owned by a single company, in one of the most crucial areas of communication: the Win32 APIs. Another case - Java, again owned by a single company - may be next in line.

 

6. The digital communication age

During its history humankind has developed manifold means of communication. The most diverse technologies were assembled at different times and places to provide more effective ways to communicate between humans, between humans and computers, and between computers, overcoming the barriers of time and space. The range of technologies involved spans the mechanical, chemical, electrical, magnetic, electronic and, most recently, digital domains.

The history of communication standards can be roughly divided into three periods. The first covers a time when all enabling technologies were diverse: mechanical, chemical, electrical, and magnetic. Because of the diversity of the underlying technologies, it was more than natural that different industries would take care of their standardisation needs without much interaction among them.

In the second period, the common electromagnetic nature of the technologies provided a common theoretical unifying framework. However, even though a microphone could be used by the telephone and radio broadcasting communities or a television camera by the television broadcasting, CATV, consumer electronic (recording) or telecommunication (videoconference) communities, it happened that either the communities had different quality targets or there was an industry that had been the first developer of the technology and therefore had a recognised leading role in a particular field. In this technology phase, too, industries could accommodate their standardisation needs without much interaction among them.

Digital technologies create a different challenge, because the only part that differentiates the technologies of the industries is the delivery layer. Information can be represented and processed using the same digital technologies, while applications sitting on top tend to be even less dependent on the specific environment.

In the 1980s a superficial reading of the implications of this technological convergence made IBM and AT&T think they were competitors. So AT&T tried to create a computer company inside the group and, when it failed, invested billions of dollars to acquire the highly successful NCR, just to transform it in no time into a money loser. The end of the story, a few years later, was that AT&T decided to spin off its newly acquired computer company along with its entire manufacturing arm. In parallel IBM developed a global network to connect its dispersed business units and started selling communication services to other companies. Now IBM has decided to shed the business because it is "non-core". To whom? AT&T!

The lesson, if there is a need to be reminded of it, is that technology is just one component of a business, and not necessarily the most important. That lesson notwithstanding, in the 1990s we are hearing another siren song, the convergence of computers, entertainment and telecommunications. Other bloodbaths are looming.

Convergence hype apart, the fact that a single technology is shared by almost all industries in the communication business is relevant to the problem this paper addresses, namely why the perceived importance of standardisation is rapidly decreasing, whether there is still a need for the standardisation function, and, if so, how it must be managed.

This is because digital technologies bring together industries with completely different backgrounds in terms of their attitudes vis-à-vis public authorities and end users, standardisation, business practices, technology progress, and handling of Intellectual Property Rights (IPR). Let us consider the last item.

 

7. Intellectual Property

The recognition of the ingenuity of an individual who invented a technology enabling a new form of communication is a potent incentive to produce innovation. Patents have existed since the 15th century but it is the U. S. Constitution of 1787 that explicitly links private incentive to overall progress by giving the Congress the power "to promote the progress of … the useful arts, by securing for limited times to … inventors the exclusive rights to their … discoveries". If the early years of printing are somewhat shrouded in a cloud of uncertainty about who was the true inventor of printing and how much he contributed to it, subsequent inventions like telegraphy, photography, telephony etc. were duly registered at the patent office and sometimes their inventors, business associates and heirs enjoyed considerable economic benefits.

Standardisation, a process of defining a single effective way to do things out of a number of alternatives, is clearly closely connected to the process that motivates individuals to provide better communication means today than existed yesterday or to provide communication means that did not exist before.

Gutenberg's invention, if filed today, would probably deserve several patents or at least multiple claims because of the diverse technologies that he is credited with having invented. Today's systems are several orders of magnitude more complex than printing. As an example, the number of patents needed to build a compact disc audio player is counted in the hundreds. This is why what is known as "intellectual property" has come to play an increasingly important role in communication.

Standards bodies such as IEC, ISO, and ITU have developed a consistent and uniform policy vis-à-vis intellectual property. In simple words, the policy tolerates the existence of necessary patents in international standards provided the owner of the corresponding rights is ready to give licenses on fair and reasonable terms and on a non-discriminatory basis.

This simple principle is finding several challenges.

 

A. Patents, a tool for business

Over the years patents have become a tool for conducting business. Companies are forced to file patents not so much because they have something valuable and they want to protect it but because patents become the merchandise to be traded at a negotiating table when new products are discussed or conflicts are resolved. On these occasions it is not so much the value of the patents that counts but the number and thickness of the piles of patent files. This is all the more strange when one considers that very often a patented innovation has a lifetime of a couple of years so that in many cases the patent is already obsolete at the time it is granted. In the words of one industry representative, the patenting folly now mainly costs money and does not do any good for the end products.

 

B. A patent may emerge too late

Another challenge is caused by the widely different procedures that countries have in place to deal with the processing of patent filings. A patent may stay under examination for many years (10 or even more) and lawfully remain undisclosed. At the end of this long period, when the patent is published, the rights holder can lawfully start enforcing the rights. However, if the patent is used in a non-IEC/ISO/ITU standard or if, even in that case, the rights holder has decided not to conform to that policy, the rights holder is not bound by the fair and reasonable terms and conditions and may conceivably request any amount of money. At that time, however, the technology may have been deployed by millions and the companies involved may have unknowingly built up enormous liabilities. Far from promoting progress, as intended by the U. S. Constitution of 1787, this practice actually hampers it, because companies are alarmed by the liabilities they take on board when launching products where gray zones exist concerning patents.

 

C. Too many patents may be needed

The third challenge is provided by the complexity of modern communication systems, where a large number of patents may be needed. If the necessary patents are owned by a restricted number of companies, they may decide to team up and develop a product by cross-licensing the necessary patents. If the product, as in the case of the MPEG-2 standard, requires patents whose rights are owned by a large number of companies (reportedly about 40 patents are needed to implement MPEG-2) and each company applies the fair and reasonable terms clause of the IEC/ISO/ITU patent policy, the sum of 40 fair and reasonable terms may no longer be fair and reasonable. The MPEG-2 case has been resolved by establishing a "patent pool", which reportedly provides a one-stop license office for most MPEG-2 patents. The general applicability of the patent pool solution, however, is far from certain. The current patent arrangement, a reasonable one years ago when it was first adopted, is no longer able to cope with the changed conditions.

 

D. Different models to license patents

The fourth challenge is provided by the new nature of the standards made possible by information technology. Whereas traditional communication standards had a clear physical embodiment, with digital technologies a standard is likely to be a processing algorithm that runs on a programmable device. Actually, the standard may cease to be covered by a patent and become instead a piece of computer code whose protection is achieved through the copyright on that code; alternatively, both the patent and the copyright are secured. But because digital networks have become pervasive, it is possible for a programmable device to run a multiplicity of algorithms downloaded from the network while not being, except at certain times, one of the physical embodiments the standards were traditionally associated with. The problem is that traditional patent licensing has been applied assuming that there is a single piece of hardware with which a patent is associated. Following the old pattern, a patent holder may grant terms that are fair and reasonable in his opinion and according to his business model, but he is actually discriminating against the licensee, because the former's business model assumes the existence of a hardware device, whereas the latter's is completely different and assumes only the existence of a programmable device.

 

E. All IPR together

The fifth challenge is provided by yet another convergence caused by digital technologies. In the analogue domain there is a clear separation between the device that makes communication possible and the message. When a rented videocassette is played back on a player, what is paid is a remuneration to the holders of the copyright on the movie and a remuneration to the holders of patent rights on the video recording system, the latter made at the time the player was purchased. In the digital domain an application may be composed of some digitally encoded pieces of audio and video, some text and drawings, some computer code that manages user interaction, access to the different components of the application, etc. If the device used to run the application is of the programmable type, the intellectual property can only be associated with the bits – content and executable code – downloaded from the network.

 

F. Mounting role of content

The last challenge in this list is provided by the increasingly important role of content in the digital era. Restricted access to content is not unknown in the analogue world, where it is used to offer selected content to closed groups of subscribers. Direct use of digital technologies, with their high quality and ease of duplication, however, may mean the immediate loss of content value unless suitable mechanisms are in place to restrict access to those who have acquired the appropriate level of rights. Overlooking this aspect has meant a protracted delay in the introduction of the Digital Versatile Disc (DVD), the new generation of compact disc capable of providing high-quality MPEG-2-encoded movies.

The conclusion is the increased role of IPR in communication, its interaction with technological choices, and the advancing merger of the two components – patents and copyright – caused by digital technologies. This brings in the World Intellectual Property Organisation (WIPO), another treaty organisation, which is delegated to deal with IPR matters.

 

8. Not everything that shines is gold

The challenges exposed in the preceding sections do not find standardisation in the best shape, as will be shown in the following.

 

A. Too slow

The structure of standards bodies was defined at a time when the pace of technological evolution was slow. Standards committees had plenty of time to consider new technologies and for members to report back to their companies or governments, layers of bureaucracy had time to consider the implications of new technologies, and committees could then reconsider the issues over and over until "consensus" (the magic word of ISO and IEC) or unanimity (the equivalent in ITU) was achieved. In other words, standardisation could afford to operate in a well organised manner, slowly and bureaucratically; it could afford to be "nice" to everybody.

An example of the success of the old way of developing standards is the integrated services digital network (ISDN). This was an ITU project started at the beginning of the 1970s. The project deliberately set the threshold high by targeting the transmission of two 64 kbit/s streams when one would have amply sufficed. Although the specifications were completed in the mid 1980s, it took several years before interoperable equipment could be deployed in the network. Only now is ISDN taking off thanks to a technology completely unforeseen at the time the project was started – the Internet. An example of failure has been the joint JTC1 and ITU-T OSI project. Too many years passed between the drawing board and the actual specification effort. By the time OSI solutions had become ready to be deployed, the market had already been invaded by the simpler Internet solution. A mixed success has been ATM standardisation. The original assumption was that ATM would be used on optical fibres operating at 155 Mbit/s, but today the optical fibre to the end user is still a promise for the future. It is only thanks to the ATM Forum specifications that ATM can be found today on twisted pair at 25 Mbit/s. For years the CCITT discussed the 32- versus 64-byte cell length issue. Eventually, a decision of 48 bytes was made; however, in the meantime precious years had been lost and now ATM, instead of being the pervasive infrastructure of the digital network of the future, is relegated to being a basic transmission technology.

In the past a single industry, e.g., the government-protected monopoly of telecommunication, could set the pace of development of new technology. In the digital era the number of players, none of them pampered by public authorities, is large and increasing. As a consequence, standardisation can no longer afford to move at a slow pace. The old approach of providing well-thought-over, comprehensive, nice-to-everybody solutions has to contend with the nimbler and faster solutions coming from industry consortia or even individual companies.

 

B. Too many options

In abstract terms, everybody agrees that a standard should specify a single way of doing things. In practice, the people attending a standards committee work for companies that have a definite interest in getting one of their technologies into the standard. It is not unusual that the very people attending are absolutely determined to have their pet ideas in the standard, and the rest of the committee is unable or unwilling to oppose them because of the need to be "fair" to everybody. The usual outcome of a dialectic battle lasting anywhere from 1 hour to 10 years is that the intellectually accepted principle of a single standard is compromised, while the name is retained. This is how "options" creep in.

In the past, this did not matter too much, because the transformation of a standard into products or services was in many cases a process driven by infrastructure investments, in which manufacturers had to wait for big orders from telecom operators. These operators eventually bore the cost of the options that, in many cases, their own people had stuffed into the standards and that they were now asking manufacturers to implement. Because of too many signalling options, it took many years for European ISDN to achieve a decent level of interoperability between different telecommunication operators and, within the same operator, between equipment from different manufacturers. But this was the time when telecommunication operators were still the drivers of development.

The case of ATM is enlightening. In spite of several ITU-T recommendations having been produced in the early 1990s, industry was still not producing any equipment conforming to them. Members of the ATM Forum like to boast that their first specification was developed in just 4 months, with no technical work other than the removal of some options from existing ITU-T recommendations. Once the heavy ITU-T documents that industry, without the backing of fat orders from telecom operators, had not dared to implement became slim ATM Forum specifications, ATM products became commercially available at the initiative of manufacturers, at attractive prices and in a matter of months.

 

C. No change

When the technologies used by the different industries were specific to the individual industries, it made a lot of sense to have different standards bodies taking care of individual standardisation needs. The few overlaps that happened from time to time were dealt with in an ad hoc fashion. This was the case of the establishment of the Comité Mixte pour les Transmissions Télévisuelles et Sonores (CMTT), a joint CCITT-CCIR committee for the long-distance transmission of audio and television signals and the meeting point of broadcasting and telecommunication, or of the OSI activity, the meeting point of telecommunication and computers. With the sweeping advances of digital technologies, many of the issues that are separately considered in different committees of the different standards bodies are becoming common issues.

A typical case is that of the compression of audio and video, a technology common to ITU-T, ITU-R, JTC1 and now also the World Wide Web Consortium (W3C). Instead of agreeing to develop such standards once and in a single place, these standards bodies are running independent standards projects. This attitude not only wastes resources but also delays the acceptance of standards, because it makes it more difficult to reach the critical mass that justifies investments. Further, it creates confusion because of multiple standard solutions for similar problems.

Within the same ITU it has been impossible to rationalise the activities of its "R" and "T" branches. A high-level committee appointed a few years ago to restructure the ITU came up with the momentous recommendations of

  1. renaming CCITT and CCIR as ITU-T and ITU-R;
  2. replacing the Roman numerals of CCITT study groups with the Arabic numerals of ITU-T study groups (those of CCIR were already Arabic);
  3. moving the responsibility for the administration of the CMTT from CCIR to ITU-T while renaming it Study Group 9.

"Minor" technical details such as "who does what" went untouched, so video services are still the responsibility of ITU-R SG 11 if delivered by radio, ITU-T SG 9 if delivered by cable, and ITU-T SG 16 if delivered by wires or optical fibres. For a long time mobile communication used to be in limbo because the delivery medium is radio, hence competence of ITU-R, but the service is (largely) conversational, hence the competence of ITU-T.

 

D. Lagging, not leading

During its history the CCITT went through a series of reorganisations to cope with the evolution of technology. With a series of enlightened decisions the CCITT adapted itself to the gradual introduction of digital technologies, first in the infrastructure and later in the end systems. For years the telecommunication industry waited for the CCITT to produce its recommendations before starting any production runs.

In 1987 an enlightened decision was made by ISO and IEC to combine all computer-related activities of the two bodies in a single technical committee, called ISO/IEC JTC1. Unlike in the ITU, in the IEC and in many areas of JTC1 the usual approach has been one of endorsing de facto standards that had already been successful in the marketplace.

In the last 10 years, however, standards bodies have lost most of the momentum that kept them abreast of technology innovations. Telephony modem standards had traditionally been the purview of ITU-T, but in spite of more than 10 years of evidence that the local loop could be digitised to carry several Mbit/s downstream, no ITU-T standards exist today for the Asymmetric Digital Subscriber Line (ADSL), even though ADSL modems are being deployed by the hundreds of thousands without any backing of ITU-T recommendations. Standardisation has fared no better for the digitisation of broadcast-related delivery media such as satellite or terrestrial channels: too many ITU-R standards exist for broadcast modems. A standard exists for digitising cable for CATV services, but in the typical fashion of recommending three standards: one for Europe, one for the United States and one for Japan. In JTC1, supposedly the home for everything software, no object-oriented technology standardisation was ever attempted and, in spite of its maturity, no standardisation of intelligent agents was even considered. In no body were effective security standards, the fundamental technology for business in the digital world, ever produced.

 

E. The flourishing of consortia

The response of industry to this eroding role of standards bodies has been to establish consortia dealing with specific areas of interest. In addition to the already mentioned Internet Society, whose Internet Engineering Task Force (IETF) is a large open international community of network designers, operators, vendors, and researchers concerned with the evolution of the Internet architecture and the smooth operation of the Internet, and the ATM Forum, established with the purpose of accelerating the use of ATM products and services, there are the Object Management Group (OMG), whose mission is to promote the theory and practice of object technology for the development of distributed computing systems; the Digital Audio-Visual Council (DAVIC), established with the purpose of promoting end-to-end interoperable digital audio-visual products, services and applications; the W3C, established to lead the World Wide Web to its full potential by developing common protocols that promote its evolution and ensure its interoperability; the Foundation for Intelligent Physical Agents (FIPA), established to promote the development of specifications of generic agent technologies that maximise interoperability within and across agent-based applications; Digital Video Broadcasting (DVB), committed to designing a global family of standards for the delivery of digital television; and many others.

Each of these groups, in most cases with a precise industry connotation, is busy developing its own specifications. The formal standards bodies just sit there, watching their membership and their relevance erode by the day. Instead of asking themselves why this is happening and taking the appropriate measures, they have put in place a new mechanism whereby Publicly Available Specifications (PAS) can easily be converted into International Standards following a simple procedure. Just a declaration of surrender!

 

9. A way out

The process called standardisation, the enabler of communication, is in a stalemate. Unless vigorous actions are taken, the whole process is bound to collapse in the near future. In the following, some actions are proposed to restore the process so that it serves the purpose for which it was established.

 

A. Break the standardisation-regulation ties

Since the second half of the 19th century public authorities have seen the general provision of communication means to all citizens as one of their roles. From the 1840s public authorities, directly or through entities under their direct supervision, started providing the postal service, from the 1850s the telegraph service, and from the 1870s compulsory elementary education, through which children acquired oral and paper-based communication capabilities. In the same period the newly invented telephony, with its ability to put citizens in touch with one another, attracted the attention of public authorities, as did wireless telegraphy at the turn of the century, broadcasting in the 1920s, and television in the 1930s. All bodies in charge of the standardisation of these communication means, at both national and international level, see public authorities as prime actors.

Whatever the past justifications for public authorities to play this leading role in setting communication standards and running the corresponding businesses on behalf of the general public, they no longer apply today. The postal service is being privatised in most countries, and the telegraph service has all but disappeared because telephony is ubiquitous and no longer fixed, as more and more types of mobile telephony are within everybody's reach. The number of radio and television channels in every country is counted in the tens and will soon be counted in the hundreds. The Internet is providing cheap access to information to a growing share of the general public in every country. Only compulsory education stubbornly stays within the purview of the state.

So why should the ITU still be a treaty organisation? What is the purpose of governments still being involved in setting telecommunication and broadcasting standards? Why, if all countries are privatising their post, telecommunication, and media companies, should governments still have a say in the standards at the basis of those businesses?

The ITU should be converted to the same status as IEC and ISO, i.e., a private not-for-profit organisation established according to the Swiss Civil Code. The sooner technical standards are removed from the purview of public authorities, the sooner the essence of regulation will be clarified.

 

B. Standards bodies as companies

A state-owned company does not automatically become a swift market player simply because it has been privatised. What is important is that an entrepreneurial spirit drives its activity. For a standards body this starts with the identification of its mission, i.e., the proactive development of standards serving the needs of a defined multiplicity of industries, which I call "shareholders". This requires the existence of a function that I call "strategic planning" with the task of identifying the needs for standards; of a function that I call "product development", the actual development of standards; and of a function that I call "customer care", the follow-up of the use of standards with the customers, i.e., the companies that are the target users of the standards.

A radical change of mentality is needed. Standards committees have to abandon the attitude of being around merely to cover a certain technical area. Standards are the goods that standards committees sell to their customers, and their development has to be managed with much the same management tools that are used for product development. As with a company, the goods have to be of high quality and have to conform to the specification agreed upon with the customers but, foremost, they have to be delivered by the agreed date. This leads to the first precept for standards development: Stick to the deadline.

The need to manage standards development as product development also implies that the right amount and quality of human resources must be in place. Too often companies send their newly recruited personnel to standards committees, with the idea that some international exposure is good for their education, instead of sending their best people. Too often the selection of leadership is based on "balance of power" criteria rather than on management capabilities.

 

C. A new standard-making process

The following is a list of precepts, to be strictly followed, concerning the features that standards must have.

A priori standardisation. If a standards body is to serve the needs of a community of industries, it must start the development of standards well ahead of the time the need for them appears. This requires a dedicated and fully functioning strategic planning function, aware of the evolution of technology and of the state of research.

Not systems but tools. The industry-specific nature of many standards bodies is one of the causes of the current decline of standardisation. Standards bodies should instead bring together different industries, each needing standards based on the same technology but possibly with different products in mind. Therefore only the components of a standard, the "tools", can be the object of standardisation. The following process has been found effective (a minimal sketch of the first steps is given after the list):

  1. select a number of target applications for which the generic technology is intended to be specified;
  2. list the functionalities needed by each application;
  3. break down the functionalities into components of sufficiently reduced complexity that they can be identified in the different applications;
  4. identify the functionality components that are common across the systems of interest;
  5. specify the tools that support the identified functionality components, particularly those common to different applications;
  6. verify that the tools specified can actually be used to assemble the target systems and provide the desired functionalities.
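As a purely illustrative sketch of steps 1 to 4, not taken from the paper and using hypothetical application and functionality names, the identification of common "tools" can be pictured as a simple intersection of the functionality components needed by the target applications:

```python
# Minimal, hypothetical sketch of steps 1-4 above: list the functionality
# components needed by each target application, then identify the components
# that are common across applications; these become candidate "tools".
applications = {
    "digital_tv":      {"video_compression", "audio_compression", "conditional_access"},
    "video_on_demand": {"video_compression", "audio_compression", "navigation"},
    "videophone":      {"video_compression", "audio_compression", "call_control"},
}

# Step 4: functionality components common to all target applications.
common_tools = set.intersection(*applications.values())

# Steps 5 and 6 would then specify one tool per component ("one functionality -
# one tool") and verify that each target system can be assembled from the tools.
print(sorted(common_tools))  # -> ['audio_compression', 'video_compression']
```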

Specify the minimum. When a standards body is made up of a single industry, it is very convenient to add to a standard those nice little things that bring it nearer to a product specification, as in the case of industry standards, or to use the standard to enforce the concept of "guaranteed quality" so dear to broadcasters and telecommunication operators because of their "public service" nature. This practice must be abandoned: only the minimum necessary for interoperability should be specified. The extra that is desirable for one industry may be unneeded by, or may even alienate, another.

One functionality - one tool. More than a rule, this is good common sense. Too many failures in standards are known to have been caused by too many options.

Relocation of tools. When a standard is defined by a single industry, there is generally agreement about where a given functionality resides in the system. In a multi-industry environment this is usually not the case, because the location of a function in the communication chain is often associated with the value added by a certain industry. The technology must therefore be defined not only in a generic way but also in such a way that it can be located at different points in the system.

Verification of the standard. It is not enough to produce a standard. Evidence must be given that the work done indeed satisfies the requirements ("product specification") originally agreed upon. This is obviously also an important promotional tool for the acceptance of the standard in the marketplace.

 

D. Dealing with accelerating technology cycles

What is proposed in the preceding paragraphs would, in some cases, have solved the problems of standardisation that started to become acute several years ago. Unfortunately, these measures are not by themselves sufficient to cope with the current trend of accelerating technology cycles.

On the one hand, this forces the standardisation function to become even more anticipative along the lines of the "a priori standardisation" principle. Standards bodies must be able to make good guesses about the next wave of technologies and appropriately invest in standardising the relevant aspects. On the other, there is a growing inability to predict the exact evolution of a technology, so that standardisation makes sense, at least in the initial phases, only if it is restricted to the "framework" or the "platform" and if it contains enough room to accommodate evolution.

The challenge then is to change the standards culture: to stress time to market, to reduce prescriptive scope, to provide frameworks that create a solution space, and to populate the framework with concrete (default) instances. Last, and possibly most important, there is a need to refine the standard in response to its success or failure in the market. The concept contains an apparent contradiction: the standard, which people might expect to be prescriptive, is instead an understated framework, and the standard, which people might expect to be static, anticipates evolution.
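To make the "framework plus default instances" idea more concrete, the following sketch, purely illustrative and not drawn from any actual standard, shows a specification that normatively defines only an abstract interface and a registration mechanism, populates it with one baseline instance, and leaves the rest of the solution space open to later extensions:

```python
# Hypothetical sketch: the "standard" specifies only the framework (an abstract
# interface plus a registry) and one concrete default instance; further
# instances can be registered later as the technology evolves.
from abc import ABC, abstractmethod


class Codec(ABC):
    """The framework: only this interface is normative."""

    @abstractmethod
    def encode(self, data: bytes) -> bytes: ...

    @abstractmethod
    def decode(self, data: bytes) -> bytes: ...


class IdentityCodec(Codec):
    """The concrete default instance populating the framework."""

    def encode(self, data: bytes) -> bytes:
        return data

    def decode(self, data: bytes) -> bytes:
        return data


# The registry is the "solution space": it anticipates evolution without
# requiring the framework itself to be reopened.
REGISTRY = {"default": IdentityCodec()}


def register(name, codec):
    """Add a new instance, e.g. one that proved successful in the market."""
    REGISTRY[name] = codec
```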

 

E. Not just process, real restructuring is needed

Innovating the standards-making process is important but pointless if the organisation is left untouched. As stated before, the only thing that digital technologies leave as specific to the individual industries is the delivery layer; the higher one goes, the less industry-specific standardisation becomes.

The organisation of standards bodies is currently vertical, and this should be changed to a horizontal one. There should be one body addressing the delivery layer issues, possibly structured along different delivery media, one body for the application layer, and one body for middleware.

This is no revolution. It is the shape the computer business naturally acquired when the many incompatible vertical computer systems started converging. It is also the organisation the Internet world has given to itself. There is no body corresponding to the delivery layer, given that the Internet sits on top of it, but IETF takes care of middleware and W3C of the application layer.

 

10. Conclusions

Standards make communication possible, but standards making has not kept pace with technology evolution, and much less is it equipped to deal with the challenges lying ahead that this paper has summarily highlighted. Radical measures are needed to preserve the standardisation function, lest progress and innovation be replaced by stagnation and chaos. This paper advocates the preservation of the major international standards bodies after a thorough restructuring from a vertical industry-oriented organisation to a horizontal function-oriented organisation.

 

11. Acknowledgements

This paper is the result of the experience of the author over the past 10 years of activity in standardisation. In that time frame, he has benefited from the advice and collaboration of a large number of individuals in the different bodies he has operated in: MPEG, DAVIC, FIPA, and OPIMA. Their contributions are gratefully acknowledged.

Special thanks go to the following individuals, who have reviewed the paper and provided the author with their advice: James Brailean, Pentti Haikonen (Nokia), Barry Haskell (AT&T Research), Keith Hill (MCPS Ltd), Rob Koenen (KPN Research), Murat Kunt (EPFL), Geoffrey Morrison (BTLabs), Fernando Pereira (Instituto Superior Técnico), Peter Schirling (IBM), Ali Tabatabai (Tektronix), James VanLoo (Sun Microsystems), Liam Ward (Teltec Ireland), and David Wood (EBU).

The opinions expressed in this paper are those of the author only and they are not necessarily shared by those who have reviewed the paper.

 


Götterdämmerung: Twilight of the Gods. See, e.g. http://walhall.com/

 
