The first edition of this textbook on software engineering was published more than
twenty years ago. That edition was written using a dumb terminal attached to an early
minicomputer (a PDP-11) that probably cost about $50,000. I wrote this edition on
a wireless laptop that cost less than $2,000 and is many times more powerful than
that PDP-11. Software then was mostly mainframe software, but personal computers
were just becoming available. None of us then realised how pervasive these would
become and how much they would change the world.
Changes in hardware over the past twenty or so years have been absolutely remarkable,
and it may appear that changes in software have been equally significant.
Certainly, our ability to build large and complex systems has improved dramatically.
Our national utilities and infrastructure (energy, communications and transport) rely
on very complex and, largely, very reliable computer systems. For building
business systems, there is an alphabet soup of technologies (J2EE, .NET, EJB, SAP,
BPEL4WS, SOAP, CBSE) that allow large web-based applications to be deployed
much more quickly than was possible in the past.
However, although much appears to have changed in the last two decades, when
we look beyond the specific technologies to the fundamental processes of software
engineering, much has stayed the same. We recognised twenty years ago that
the waterfall model of the software process had serious problems, yet a survey
published in December 2003 in IEEE Software showed that more than 40% of
companies are still using this approach. Testing is still the dominant program
validation technique, although other techniques such as inspections have been used
more effectively since the mid-1970s. CASE tools, although now based around the
UML, are still essentially diagram editors with some checking and code-generation
functionality.