First Metre - only connect... the people with the purpose
The PC @ 25 (12 August 2006)
A quarter of a century ago, on 12 August 1981, IBM launched its ground-breaking personal computer, the IBM 5150, the first in what it called its PC range. Today, as the latest bundle of security patches and upgrades is downloaded to protect its successors, it is sometimes difficult to conclude that the promise offered by that event has been fulfilled. Significantly, our longing for trouble-free computing, at the individual as well as the corporate level, has over the past ten years stimulated Microsoft to develop totally new foundations that are about to transform the way in which computers are used. 


Microsoft’s future survival depends upon these developments, which underpin its new Vista operating system and .NET framework. Combined, they will free business computing from many of the constraints imposed by traditional IT attitudes – in-house or outsourced. The guiding light behind this innovative work has been Anders Hejlsberg, developer of Borland Delphi, who was headhunted by Bill Gates ten years ago to shape Microsoft’s new Windows .NET development platform, which offers customised business computing with new levels of responsiveness and agility. 


In the same way that the notaries and scribes of the Middle Ages held sway over the circulation of information, so today IT managers seem to feel obliged to lock down their systems and selectively control how, and what, data and information flows around ‘their’ networks. Ironically, much software – intended to liberate businesses – actually tends to ossify innovation and attenuate, rather than amplify, competitive advantage. 


Just as the scribes steadily lost their power to rising literacy and the disruptive innovations of Gutenberg and Caxton, so we are now about to witness a steady loss of influence by large-scale software vendors as workflow management ceases to be a function controlled at the level of the enterprise and becomes a locally integrated function at the level of process ownership. Instead of major corporations all subscribing to a shrinking set of less-than-agile company-wide management systems, small and medium-sized enterprises – long the dynamo of economic activity and true innovation – will be able to enter the workflow market and tailor-make their own process workflows. Success will flow towards businesses that can fully mobilise the intellectual capital of their employees more economically than ever before. 


THE IBM QUALITY INSTITUTE
 

To understand how the PC has fallen short of its potential we need to take note of a seemingly unrelated event that took place in the same year that the PC was launched. Worried about improving the quality of its products, IBM’s top management established the IBM Quality Institute and charged it with the responsibility ‘to train managers and professionals in the techniques to sustain company leadership in quality’. Such was the concern about quality within IBM at the time the PC project started that the company adopted an aerospace-industry ‘skunk works’ approach to its development. This involved creating a multidisciplinary development team that bypassed the usual management channels and reported directly to head office. 


Working from a secret location in Boca Raton, Florida, the team used off-the-shelf parts at will and completed the project within a year. With the announcement of both its new PC and the Quality Institute, the bellwether of American business technology signalled clearly how much it recognised the importance to management of data processing on the one hand and process improvement on the other. Unfortunately no one in IBM – let alone elsewhere in mainstream American business – realised that the failure to integrate the basic philosophies of these two disciplines would be the primary cause of America’s loss of competitiveness over the coming decades: first to the wiser minds in Japan’s electronics and car industries, then to the other Asian ‘tigers’ of Singapore, Malaysia and South Korea, and finally to the low-wage economies of India and China. 


The reason why these two golden threads of business practice were not neatly entwined from the outset, despite the theoretical foundations set down by visionaries such as John von Neumann, Claude Shannon and Alan Turing, may be understood against the backdrop of the Anglo-Saxon world’s love of specialisation and classification, which ultimately abstracts itself to the level of deceptive simplicity and practical convenience ordained by the rule of binary outcomes. 


Since management simply failed to realise that success in quality was primarily dependent upon the concept of feedback and that the computer offered the best means of handling feedback, the biggest opportunity available for productivity improvement in the second half of the 20th century was ignored as the business disciplines of data and process management followed their separate tracks. 


WORK GETS DONE IN THE GEMBA
 

It has been observed that “if work were such a good idea the rich would have monopolised it long ago”. Since the direct daily contribution to value-adding work in most businesses is inversely proportional to the status of the employee, it is in the workplace – what the Japanese refer to as the gemba (i.e. where the truth resides) – that feedback signals are most needed in real time. And yet the most useful machine in this regard is on the desks of all managers as they pursue the Taylorist traditions of planning, instructing and inspecting the work of others less elevated in the organisation’s socio-technical hierarchy.
 

The fundamentals of any business are the competent mastery of how the work gets done and how time is used – more simply, how the workflow progresses. Fault-free workflow systems mean the lowest costs and more contented customers. But while perfection is not a sustainable human condition, the use of the computer – correctly programmed – can and does significantly increase reliability and productivity. The trick, and often the trap, is in defining the correct programme.
 

Programming is a human act and as such is prone to error. The bigger the software programme, the greater the risk of bugs and errors of logical detail. The greatest source of error in such projects is the old-style command-and-control attitude applied to software design, which routinely underestimates the primary value of the rich seams of know-how freely available from the skilled process workers in the gemba. 


PEOPLE MAKE COMMITMENTS, COMPUTERS DON’T
 

People go to extreme lengths to avoid letting their colleagues down. Computers have no such compunction. The concept of commitment, which is central to the way people do business with each other, cannot be encoded for a machine. The commercial tradition of commitments being sealed by word, sign or handshake first found favour in the early coffeehouses of mid-17th-century London. Today, despite the role of technology, no deal is done without an implicit commitment between individuals. With the help of carefully designed computer programmes, fewer people can conduct more business than ever before, and the transactional bond is often only broken when the technology fails. But the consequences of such failures are usually far more extensive than those of the rarer instances of human forgetfulness, as countless IT projects have attested over the past decades. 


The development of low-cost, high-reliability computer hardware has raced ahead of advances in software, not least because of the difficulty of encoding the social aspects of business transactions compared with pro-forma activities. Technologists tend to lack a natural enthusiasm for the persistent social interaction needed to elicit detailed process know-how from people in the gemba, remote from the more familiar surroundings of their coding cubicles. Making explicit the full details of processes represents, metaphorically, the ‘first metre’ of the process journey towards making customers happy. The requisite disciplines of the ‘first metre’ are, however, too often overlooked in the race for early results. 


Equally important in workflow design is the recognition that user requirements will themselves change, and provision needs to be made to handle this reality. Solutions need to be both well informed from the outset and easy to change thereafter. Software design cannot be seen as an engineering project following a rigid design-build-use model (the traditional ‘waterfall’ approach of old-style software designers). Instead it needs to be based on an agile new style of concurrent modelling in which iterative design-build-test cycles provide the fluid feedback that guides desirable changes as the overall project progresses.
 

DATA IS THE SHADOW OF THE PROCESS
 

Most computer applications rely on cleverly designed databases where all the facts and figures of countless activities can be stored pending retrieval for invoicing, settlement, marketing, soliciting, sorting or other commercial activities. The database may thus be seen as the pivotal element of all computing. And herein lies a clue to the problem of inadequate IT implementations. Databases are idiosyncratic creations. Each is the unique interpretation of its designer. It is very unlikely that, for a given business process, two database designers would produce identical designs – or even feel comfortable with each other’s work. Changing a database to better suit changed circumstances is not easy, even if the original designer is still working just down the corridor. 
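
As a purely illustrative sketch – the process, table and column names below are invented for the purpose – here is how two designers might capture the same simple order-taking process in equally workable yet incompatible ways:

import sqlite3

# Designer A: one wide table, with order lines embedded as repeated columns.
schema_a = """
CREATE TABLE orders (
    order_id INTEGER PRIMARY KEY,
    customer TEXT,
    item_1 TEXT, qty_1 INTEGER,
    item_2 TEXT, qty_2 INTEGER
);
"""

# Designer B: normalised customer, order and order-line tables.
schema_b = """
CREATE TABLE customers (customer_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE orders (order_id INTEGER PRIMARY KEY,
                     customer_id INTEGER REFERENCES customers);
CREATE TABLE order_lines (order_id INTEGER REFERENCES orders,
                          item TEXT, qty INTEGER);
"""

# Both schemas record the same process, yet every query, report and later
# change (say, allowing a third item per order) diverges from here on.
for schema in (schema_a, schema_b):
    connection = sqlite3.connect(":memory:")
    connection.executescript(schema)
    connection.close()

Neither design is wrong; the point is simply that the structure reflects one designer’s interpretation of the process, and everything built on top of it inherits that interpretation.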


The most used type of database – the relational model – was first described by Edgar Codd in 1970 and became widely used from the early 1980s. The entity-relationship model, an intermediary graphical device that relates the data structure to the entities within the process, was devised by Peter Chen in 1976. In both cases little has changed since in the way most databases are designed and built today. Databases tend to be cast in stone and are rarely responsive to change, and when change does arise, handling it is not generally characterised by either agility or economy. 
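
For readers unfamiliar with Chen’s device, the toy model below – its entities, attributes and relationships are invented purely for illustration – sketches the essence of the entity-relationship approach: entities carry attributes, named relationships connect them, and the relational tables then follow more or less mechanically.

# A toy entity-relationship model in the spirit of Chen (1976):
# entities carry attributes; relationships connect entities with a cardinality.
entities = {
    "Customer": ["customer_id", "name"],
    "Order": ["order_id", "order_date"],
    "Product": ["product_id", "description"],
}

relationships = [
    ("Customer", "places", "Order", "1:N"),     # one customer, many orders
    ("Order", "contains", "Product", "M:N"),    # many orders, many products
]

# The mapping to relational tables is largely mechanical: each entity becomes
# a table, each many-to-many relationship becomes a join table, and each
# one-to-many relationship becomes a foreign key on the 'many' side.
for name, attributes in entities.items():
    print(f"table {name.lower()}({', '.join(attributes)})")
for left, verb, right, cardinality in relationships:
    if cardinality == "M:N":
        print(f"table {verb}({left.lower()}_id, {right.lower()}_id)")
    else:
        print(f"-- add {left.lower()}_id as a foreign key on {right.lower()}")

The mechanics are simple; the difficulty described above lies in the fact that once such a model has been agreed and built upon, revisiting it is neither quick nor cheap.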


Lethargy in the development of innovative new database models over the past 30 years has a lot to answer for in the way that computer software has failed to respond economically to the needs of its users. With rapid advances in computing power (doubling every two years) and ever lower memory costs per megabyte (from $350 in 1981 to 1¢ by 2000), computer performance characteristics have risen exponentially. While this should have prompted fresh thinking about agile methodologies, the waterfall model has in fact remained centre stage, as management indulges itself, some say excessively, in processing data seemingly for its own sake, generally on the same financial principles as prevailed before the advent of the computer itself. 
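
A back-of-the-envelope check, taking the figures quoted above at face value, gives a feel for the scale of that change:

import math

cost_1981 = 350.0   # dollars per megabyte of memory in 1981, as quoted above
cost_2000 = 0.01    # one cent per megabyte by 2000, as quoted above
years = 2000 - 1981

fold = cost_1981 / cost_2000                 # a 35,000-fold reduction
halvings = math.log2(fold)                   # about 15 successive halvings
months_per_halving = years * 12 / halvings   # roughly one halving every 15 months

print(f"{fold:,.0f}-fold cheaper: cost halved roughly every {months_per_halving:.0f} months")

A 35,000-fold fall in the cost of memory over 19 years, alongside processing power doubling every two years, is the exponential backdrop against which software practice stood still.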


Data is meaningless unless understood as the shadow of the process that generates it. When data processing effectively becomes an end in itself, it is timely to look upstream to the processes themselves and ask some fundamental questions about just how much data is needed to inform operators and management about process performance, reliability and capability in real time.
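
Often the answer is: far less than the databases hold. As a minimal sketch – the cycle-time figures are made up, and the three-sigma limits are just one simple convention borrowed from statistical process control – a handful of measurements and a pair of limits can already give the gemba an immediate feedback signal:

from statistics import mean, stdev

# Hypothetical cycle-time measurements (minutes) recorded at the workplace.
baseline = [4.1, 3.9, 4.0, 4.2, 3.8, 4.1, 4.0, 3.9]
centre, spread = mean(baseline), stdev(baseline)
upper, lower = centre + 3 * spread, centre - 3 * spread   # Shewhart-style limits

def feedback(measurement: float) -> str:
    """Return an immediate signal for one new measurement from the process."""
    if not lower <= measurement <= upper:
        return f"{measurement:.1f} min: outside expected variation - investigate the process"
    return f"{measurement:.1f} min: within expected variation"

for m in (4.0, 4.3, 5.6):   # the last reading drifts well outside the limits
    print(feedback(m))

Nothing in this sketch requires an enterprise database; a small amount of the right data, seen at the right moment, is what the process owner needs.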
 

THE SILENT ADVENT OF CYBERNETICS
 

At around the time Edgar Codd was setting down his ideas about database design in 1970 at IBM's San Jose Research Laboratory in California, a British management consultant named Stafford Beer was travelling to Chile at the invitation of President Allende’s democratically elected socialist government to create a unique national real-time system – with modest computer support – to help improve the management of the entire Chilean economy. Beer had been selected on the strength of the practitioner-based approach to management he had been developing since the mid-1950s, using cybernetic principles (the ‘science of organisation’, as he described it). Over much the same period, and independently, Norbert Wiener had been writing up in the US his pioneering theoretical work on general cybernetics, following the end of the Second World War and his wartime work on anti-aircraft predictor systems.
 

Within three years of Beer’s project starting, General Augusto Pinochet, with CIA help, had seized power in a violent military coup and the Chilean project had to be abandoned just as it was on the brink of first operational use. With hindsight it is not possible to overestimate the negative effect that this coup had – not only locally on the people of Chile but also on the wider world of business management methods and global prosperity.
 

CODD & BEER – A RECIPE IGNORED
 

Just as Codd believed that computer users should be able to work at a natural-language level, unconcerned with the details of where or how the data was stored, so Beer believed that business managers should be able to work with real-time process information in recursive patterns, dealing with the changes their companies experienced daily and thus optimising their viability both physically and financially.
 

Both Codd and Beer had shown the way to more cost-effective business management two decades before the availability of low-cost, high-powered computing. However, Beer’s viable system model had essentially faded from view after the Chilean experience, as it could not be commoditised like a database model and sold into an IT-hungry marketplace. Management cybernetics, by contrast, required a totally different worldview, one rooted in intellectual curiosity and not amenable to a simple sales proposition promising more of the same but quicker, cheaper and easier. 


However, as the world changes ever more rapidly and managers recognise the importance of continual improvement and innovation, Beer’s message of the viable organisation is set to become more attractive to questioning managers alert to the repeated failings of past hi-tech solutions. Attitudes towards information processing will change with the advent of new software technologies. The reality that managers are progressively less dependent upon costly technical intermediaries, who add no process-related benefit to the bottom line, will add to the attractiveness of the new technology.
 

The view that matters when it comes to reducing costs and increasing added value is the one from the gemba. As the new technologies from Microsoft allow business process models to be converted directly into code (using a drag-and-drop interface), customisable, web-enabled and economical workflow becomes a reality. Even better, viable system models will be coded directly in association with viable databases that are in turn automatically modified as and when the process is improved.
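
The fragment below is not Microsoft’s tooling, nor anyone else’s; it is a deliberately simplified sketch, built around an invented order-fulfilment process, of the underlying idea: hold the process model as data, derive the supporting table definitions from it, and let an improvement to the process regenerate them.

# A deliberately simplified sketch (invented process and names): the process
# model is held as plain data and the supporting schema is derived from it.
process_model = {
    "name": "order_fulfilment",
    "steps": [
        {"step": "take_order", "captures": ["customer", "items"]},
        {"step": "pick_stock", "captures": ["picker", "picked_at"]},
        {"step": "despatch", "captures": ["carrier", "despatched_at"]},
    ],
}

def schema_for(model: dict) -> str:
    """Derive one table definition per process step from the model."""
    tables = []
    for step in model["steps"]:
        columns = ["id INTEGER PRIMARY KEY"] + [f"{c} TEXT" for c in step["captures"]]
        tables.append(f"CREATE TABLE {model['name']}_{step['step']} ({', '.join(columns)});")
    return "\n".join(tables)

print(schema_for(process_model))

# Kaizen in miniature: improve the process by adding a quality check,
# and the database definition follows the process rather than resisting it.
process_model["steps"].insert(2, {"step": "quality_check", "captures": ["inspector", "result"]})
print(schema_for(process_model))

Whether the generation is done by a drag-and-drop designer or by hand-written code, the significant point is the direction of dependency: the process owns the data structure, not the other way round.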
 

Kaizen, or continual improvement, which has been so successful in guiding Toyota and Honda in their drive to displace the American car giants, Ford and GM, from their dominant position, will slowly become the norm for successful organisations. The lure of the computer will have been replaced by the elegant simplicity of seamlessly agile processes that adjust themselves in response to changes in the marketplace. And thus personal computing can now become truly personal – not so much to the user as, more importantly, to the customers, who want to be as blissfully unaware of what is happening behind the counter as they are of what goes on in the engine compartment of their car. 


