Brian Randell talked of the NATO software engineering meetings of 1968, documented in the 1968 Garmisch report written at the first NATO software engineering conference. He also reported on how the proposed NATO Software Engineering Institute was skilfully white-anted (undermined from within) by Tom Simpson. Brian claims there are three topics that were in the air in 1968 and are still around today:
- Software components
- Development tools and environments
- Multiprocessor system design
Brian pointed out that in 1968 people thought that within a few years everyone would have access to large libraries of well-developed and parameterised components (Doug McIlroy ran a workshop on this in 1968 – photo). He claims that this has not come into being – maybe he hasn’t heard of Java/COM/.Net?
On tools and environments he points out that the HOPL website lists 8512 programming languages. He claims, rightly, that there are too many languages and methodologies, and that there are also too many conferences and communities that do not talk to each other.
On multiprocessor design he talks of modern multi-core machines and cites Amdahl’s law, which states that if P is the proportion of a program that can be made parallel (i.e. benefit from parallelization), and (1 − P) is the proportion that cannot be parallelized (remains serial), then the maximum speedup that can be achieved by using N processors is S(N) = 1 / ((1 − P) + P/N).
(from Wikipedia: http://en.wikipedia.org/wiki/Amdahl's_law)
Brian claims that grants, proposals, white papers, etc. for multi-core do not contain new ideas to address this fundamental problem and that multi-core systems only really help in cases in which we have multiple processes which do not interact.
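To see why Brian finds this so limiting, here is a short sketch (my illustration, not from the talk) evaluating the Amdahl formula above: even a program that is 95% parallelizable can never exceed a 20× speedup, no matter how many cores you throw at it.

```python
def amdahl_speedup(p, n):
    """Maximum speedup from Amdahl's law: parallel fraction p, n processors."""
    return 1.0 / ((1.0 - p) + p / n)

# A 95%-parallel program: the serial 5% dominates as cores grow.
for n in (2, 8, 64, 1024):
    print(n, round(amdahl_speedup(0.95, n), 2))
# 2    -> ~1.9
# 64   -> ~15.42
# 1024 -> ~19.64, crawling toward the hard limit of 1/0.05 = 20
```

The limiting value 1/(1 − P) is exactly the "fundamental problem" Brian refers to: adding processors only shrinks the P/N term, never the serial one.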
The general thesis of this talk was that the progress in these three areas in the last 40 years has been quite disappointing.
Jean-Claude Laprie was more upbeat: in safety-critical systems, traditional software engineering has pushed mean time to failure from a few hours to several years. He also mentioned formal methods and talked of the much-cited Paris subway system. He said there is much promise and excitement in the future as we move away from static software engineering towards Dynamic Software Engineering, where systems are constructed by discovering and assembling independent services. This has many names, including Service Oriented Architecture. It also introduces more problems – evolutionary mismatches and emergent behaviours. He cautions that such systems have difficulties with fault containment, because a system becomes dependent on its component services.
He has many expectations of advances in dependability. He quotes Barry Boehm, who says that improving agility and dependability will be one of the challenges for 21st-century software engineers.
Michael Fagan was also more upbeat, asking "what have we done to advance ourselves?". Languages are much more high-level, terminology has changed, and methodologies have changed – or have they? – agile methodology, for example. He described extreme methodologies and claimed they had been used by Harlan Mills many years ago (though not called Agile). He also advocated that we all keep things simple, especially in our use of vocabulary – instead of using big words, use simple ones – Occam’s razor again. He cites his experience of managing an OS project with several hundred software engineers (Michael Fagan introduced Inspections at IBM in the 1970s). He also talked of how attitudes and discipline have changed. However, the big remaining problem in software engineering is getting requirements right: who do we need to involve to ensure that the user gets what the user wants? Interestingly, he asked if we could learn from the makers of games, and talked about World of Warcraft and how rapidly they turn around versions of the game.