The West Coast issue seems to be related to spreadsheet use, and past posts give my view on that, so I won't bang on about it any more here. However, this week a couple of other queries have also raised alarm bells, and these relate to mainstream software projects on critical infrastructure!
Both questions effectively came down to how clients can assure themselves that the right level of progress is being made on complex software development. One concerned ensuring the required level of reliability in a particular codebase; the other concerned assurances that delivery timescales would be met.
How do you measure software development 'progress' in a consistent manner, particularly given the variety of development methods that can be called up? For example, a list taken from Wikipedia states:
"..., a software development methodology is an approach used by organizations and project teams to apply a software development framework. Specific software development methodologies include:
- 1970s
- Structured programming since 1969
- Cap Gemini SDM, originally from PANDATA, the first English translation was published in 1974. SDM stands for System Development Methodology
- 1980s
- Structured systems analysis and design method (SSADM) from 1980 onwards
- Information Requirement Analysis/Soft systems methodology
- 1990s
- Object-oriented programming (OOP) developed in the early 1960s, and became a dominant programming approach during the mid-1990s
- Rapid application development (RAD), since 1991
- Dynamic systems development method (DSDM), since 1994
- Scrum, since 1995
- Team software process, since 1998
- Extreme programming, since 1999"
Or some homemade variant thereof!
So, just how do you visualise progress on a software development activity? Through verification and validation monitoring, tracking of timelined activities, developer timesheets, or a variety of other metrics? So many processes, with so many variations of 'progress'.
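To make the point concrete, here is a minimal, hypothetical sketch (all names and figures are invented for illustration, not taken from any real project) showing how two common views of 'progress' can disagree for the very same project snapshot:

```python
from dataclasses import dataclass

@dataclass
class ProjectSnapshot:
    tasks_planned: int    # tasks scheduled to be complete by today
    tasks_done: int       # tasks actually completed
    tests_written: int    # verification activities defined
    tests_passing: int    # verification activities currently passing

def progress_views(s: ProjectSnapshot) -> dict:
    """Return two different 'progress' figures for the same snapshot.

    Each metric answers a different question - schedule adherence
    versus demonstrated reliability - which is why a single
    percentage is rarely meaningful on its own.
    """
    return {
        "schedule_progress": s.tasks_done / s.tasks_planned,
        "verification_progress": s.tests_passing / s.tests_written,
    }

snap = ProjectSnapshot(tasks_planned=40, tasks_done=30,
                       tests_written=120, tests_passing=60)
print(progress_views(snap))
# The schedule view says 75% complete; the verification view says 50%.
```

Neither figure is 'wrong'; they simply measure different things, which is exactly why a client asking "how far along are you?" needs to agree with the developer on which view of progress the answer refers to.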
to be continued.....