Saturday, 24 November 2012

Management of requirements management!

Quote this week from Bambofy - "you deal with the boring end of software development".

I think I agree.

This week has taken a bizarre twist in that it's been a week of 'requirements management' (RQM) issues. Two areas emerged: the first around how to specify requirements appropriately, and the second around reuse of requirements. You have to admit that sounds pretty boring, doesn't it!

But when you try to get your head round these things, the situation rapidly gets complicated. A problem emerges around the sheer number of 'requirements' that can be generated if you don't have a strategy for RQM. Let me try and illustrate.

Even for a simple system there is an exponential increase in the number of requirements the more you need to partition things. Let's not use a software example, as they tend to be a bit obtuse, but take a house build instead - hopefully we can all relate to that a bit better. I'm assuming in all this that everyone is signed up to undertaking some form of RQM as part of the design, of course!

The first decision is how you are going to represent the 'systems' involved, as you will need to be able to allocate the requirements throughout the house in some manner. If you don't get this bit correct you have already increased the gradient of the requirements growth curve. In our house example you could take each room as a 'system', or each major element of infrastructure, or one of many other variations. Let's take the infrastructure view, as this is more akin to what you would do for more complex assets: railways, oil platforms, power plants, etc.

So off we go doing our requirements capture exercise - don't worry I'm not going to do the whole thing - even I'm not that sad!

There are at least, say, 10 major areas to consider, e.g. 1 water, 2 electrical, 3 heating, 4 lighting, 5 civil structure, 6 plumbing, 7 waste treatment, 8 accessibility, 9 safety, 10 usability, etc.

Each of these areas breaks down into at least 10 further sub-areas; e.g. for 1 water these could be 1.1 sinks, 1.2 baths, 1.3 toilets, 1.4 hot water, etc.

Even for this relatively simple example we already have 10 x 10, or 100, sub-areas to allocate requirements to. We could then easily envisage coming up with, say, 10 main requirements for each of these sub-areas, and at least a further 10 sub-requirements for each main requirement. You can see where this is going: we now have 100 (sub-areas) x 10 (main) x 10 (sub-main), or 10,000, requirements to allocate and track. On top of this, it is likely that we would need to allocate a set of 'attributes' to each requirement so that we could also track certain types of requirement rather than just the area they are allocated to - attributes like environment, performance, safety, quality and so on, which could again easily add up to 10 as a minimum. So - you still awake? - in total, without even trying, we have got ourselves into a situation where we are reporting and tracking 100,000 items. Just for a house!
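
The arithmetic is trivial, which is rather the point. Here is a back-of-the-envelope sketch in Python, where every '10' is just the illustrative assumption from above:

```python
# Requirements explosion for the house example.
# Every count here is an illustrative assumption, not a measured figure.
areas      = 10  # water, electrical, heating, lighting, ...
sub_areas  = 10  # e.g. water -> sinks, baths, toilets, hot water, ...
main_reqs  = 10  # main requirements per sub-area
sub_reqs   = 10  # sub-requirements per main requirement
attributes = 10  # environment, performance, safety, quality, ...

requirements  = areas * sub_areas * main_reqs * sub_reqs  # 10,000
tracked_items = requirements * attributes                 # 100,000

print(f"Requirements to allocate and track: {requirements:,}")
print(f"Items once attributes are added:    {tracked_items:,}")

# Swap areas = 10 for 100 poorly specified top-level areas and the same
# sums give 1,000,000 items - the case mentioned further down.
```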

Serious problem eh - if you are not careful this is also serious job creation!

This number also assumes that you can clearly specify your requirements in the first place - if not, you could easily start with (I have seen this) 100 top-level requirements, leading to 1,000,000 items to manage. Good luck with that one.

That is why it is imperative that you have a rationale for the management of your requirements management. And no, you don't just have to purchase a requirements management software package.
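
None of this demands a heavyweight tool, though. A minimal sketch of the 'attributes' idea in Python - every reference, name and field below is invented for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class Requirement:
    """Minimal requirement record - fields are illustrative only."""
    ref: str                  # e.g. "1.4.1" - water / hot water / req 1
    text: str                 # the requirement statement itself
    area: str                 # which 'system' it is allocated to
    attributes: set[str] = field(default_factory=set)

reqs = [
    Requirement("1.4.1", "Hot water available within 10 s at every tap",
                area="water", attributes={"performance", "usability"}),
    Requirement("2.3.1", "All socket circuits protected by an RCD",
                area="electrical", attributes={"safety"}),
]

# Slice by attribute rather than by area - the whole point of tagging.
safety_refs = [r.ref for r in reqs if "safety" in r.attributes]
print(safety_refs)
```

The design point is that the area and the attributes are first-class data, so "show me every safety requirement across the whole house" stays a one-liner however large the count grows.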

You then have to ask yourself: if you tick all the requirement boxes, is the built system the one you wanted? Would you want a builder to manage the build of your house in this way, or would you rather have the build project overseen by a civil engineer?

In the overall scheme of things it's still pretty boring - but critical to get right!

Now, some of these requirements can surely be reused on the next house - but which ones? ;)

Saturday, 17 November 2012

Analytical taxonomies - appropriate analysis

Having had a pop at approaches to 'Big Data Analytics' based around spreadsheets in the last post, the question has to be: "so what does appropriate analysis look like?"

In my various internet wanderings this week I came across a couple of articles that for me give a glimpse into what the future should look like.

The first is by Jim Sinur, in an entry on applying analytics to processes and not just data; follow the link for more detail:

http://blogs.gartner.com/jim_sinur/2012/11/13/is-unmanned-process-management-a-pipe-dream/

In fact, thinking through exactly what you are expecting your 'processes' to deliver, rather than simply feeding the process, is key - as is 'unmanned' optimisation and management of the interactions between them!

The figure below illustrates some of the analytical taxonomy that could be used.


As well as the process analytics elements outlined above, the sheer volume of data to work through will also require new computing techniques. The second article I came across, by Rick Merritt in EETimes, illustrates the type of computing power that will be available:

http://www.eetimes.com/electronics-news/4401143/Startup-demos-new-road-to-computing-at-SC-12

which, by the sounds of it, is 40,000 processors working in a parallel configuration using neural net and fuzzy logic techniques to crank out 5.2 tera-operations per second!


So the Big Data Analytics future, for me, contains complexity in both the analysis techniques and the computational systems. A bit more than a few unconnected spreadsheets.

Looks exciting eh!!

Sunday, 11 November 2012

Big Data Analytics - inappropriate analysis

I thought I wasn't going to rant about this again for a while but three encounters this week have fanned the flames again.

I don't know how many Twitter and LinkedIn posts I have made on Big Data + Analytics over recent months, but it's definitely an area on an increasing trend in the business world. However, the reality is that most of the business world struggles to extract any meaningful 'knowledge' from all of the 'data' that is 'collated' from business activities.

Why is that? Because the main analysis tools used are spreadsheets - and in particular, Excel. Now don't get me wrong, Excel is a phenomenal software package - but in my view, in some instances it is being used to produce models that are way outside its domain of appropriateness.

What do I mean by that? Well - three events this week have highlighted the tip of the iceberg for me. All of these are being addressed, I hasten to add, but I don't think I am alone in my experiences.

1 The first was when I was sat in a meeting looking at the projected image of some analysis in Excel, upon which we were making decisions that affected the direction of the business. One of a myriad of cells was being concentrated on - and the value in that cell was 'zero'. Everyone in the room knew that wasn't right, so we all sat there for five minutes discussing why. Now, this could have been a simple mistake somewhere on one of the supporting sheets, but the effect it had was to throw the whole of the analysis into question - how could we then believe any of the other numbers? Therein lies the first 'rant fact': it is difficult to manage traceability in these sorts of tools.

2 The second was when I was asked to comment on and add to a sheet of supporting data being input into a model. Someone was collating data to help build up a spreadsheet model and was emailing around for changes to the data sheet. Of course, no one person holds all of this in their head, so people were passing the sheet on for refinement. The version that came to me for input was 'blah filename - Copy - Copy - Copy'. Therein lies the second 'rant fact': if it is not part of some wider process, configuration and version control can get out of hand pretty quickly.

3 The third, and for me the most serious, came from checking through to try to understand a part of a model that didn't appear to be functioning as expected (see 'rant fact' 1). When I looked into the sheets in question - without even going into the equation set being used - I found one sheet with 100 columns and 600 rows of manually entered data. That's 60,000 places for making an input error on that sheet alone, and there were more sheets! Therein lies the third 'rant fact': data quality within this environment is difficult to control.
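
Even a crude automated sweep beats eyeballing 60,000 cells. A minimal sketch, assuming the sheet has been exported to CSV - the filename and the 'rules' are invented for illustration:

```python
import csv

# Flag the obvious suspects: blanks, non-numeric entries, and zeros
# (see 'rant fact' 1 for why zeros deserve a second look).
suspects = []
with open("inputs_sheet.csv", newline="") as f:
    for row_no, row in enumerate(csv.reader(f), start=1):
        for col_no, cell in enumerate(row, start=1):
            if cell.strip() == "":
                suspects.append((row_no, col_no, "blank"))
                continue
            try:
                value = float(cell)
            except ValueError:
                suspects.append((row_no, col_no, f"not numeric: {cell!r}"))
            else:
                if value == 0:
                    suspects.append((row_no, col_no, "zero - check intent"))

print(f"{len(suspects)} cells flagged")
for row_no, col_no, reason in suspects[:10]:
    print(f"  row {row_no}, col {col_no}: {reason}")
```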

The issue is that Excel in particular is so easy to fire up and start bashing away at that we forget we are, in some cases, building complex calculation engines. In some instances these engines are not built using any 'design' process at all. There is no overarching systems design process, and even at a simplistic level there is no recognition of the fundamental modelling techniques that would improve the modelling, and therefore the output quality - namely, consideration of the following:

1 Functional model development - what is the sheet set up to do? Even a simple flowchart would help, never mind some functional breakdown of the calculation set.

2 Data model development - what data, from where, and in what format? The types of view that force thinking about quality control of the data. A database, maybe (see the sketch after this list)!

3 Physical model of the hardware - how does the overall 'system', including data input, connect, store and process the information? Maybe using email and collating feedback on a laptop is not the best system configuration.
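
On point 2, 'a database, maybe' needn't be a big leap. A minimal sketch using Python's built-in sqlite3 - the table and column names are made up - showing the schema rejecting a bad value at the door instead of letting it propagate through a calculation chain:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE inputs (
        item    TEXT NOT NULL,
        value   REAL NOT NULL CHECK (value > 0),
        source  TEXT NOT NULL,       -- who supplied it, and when
        entered TEXT DEFAULT CURRENT_TIMESTAMP
    )
""")

con.execute("INSERT INTO inputs (item, value, source) VALUES (?, ?, ?)",
            ("pump_flow_rate", 42.5, "JB email"))

try:
    # A rogue zero - the constraint rejects it rather than silently
    # feeding it into the model (see 'rant fact' 1).
    con.execute("INSERT INTO inputs (item, value, source) VALUES (?, ?, ?)",
                ("pump_flow_rate", 0.0, "unknown"))
except sqlite3.IntegrityError as err:
    print("Rejected at the door:", err)
```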

All these activities add time and cost to model development, and because their results are intangible and difficult to measure they can get left out in the rush to get the answer out. However, the question is: would you put your own money at risk on the basis of this 'answer'?

What is the solution? Well, certainly don't let learner drivers loose in the F1 racing car for a start - but there must also be some way of providing an easily accessed development environment that can be used to translate formulae into readable and understandable code - formula translation - now that could catch on (sorry, couldn't resist!).
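
To labour the joke a little, here is what 'formula translation' might look like - a hypothetical anonymous cell formula recast as a named, testable function (sketched in Python rather than the obvious language):

```python
# Hypothetical cell formula:  =IF(B2>0, C2*D2/B2, 0)
# Translated into something readable, documented and testable.
def unit_margin(units_sold: float, revenue: float, margin_rate: float) -> float:
    """Margin per unit sold; zero units means no margin to report."""
    if units_sold <= 0:
        return 0.0
    return revenue * margin_rate / units_sold

# The tests the spreadsheet never had.
assert unit_margin(100.0, 5000.0, 0.25) == 12.5
assert unit_margin(0.0, 5000.0, 0.25) == 0.0
```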

Saturday, 3 November 2012

To blog or to curate - that is the question?

More a thought for the day this one.

You definitely need a strategic approach to get the most out of all of this social media capability. There is so much to go at that you can quite easily become social-app weary - not to mention spending your whole life trawling through the various information feeds!

Check out Guy Kawasaki's view at the following link for a more 'rounded' assessment:

http://www.socialmediaexaminer.com/blogs-books-and-social-how-the-world-has-changed/

Which is great, but what are you going to do with all this 'networking' data and information - just leave it all hanging out there?

That is why I believe you need some sort of strategic goal - something that all of the collating and curating works towards. Currently, one of my trusted advisers (JB, you know who you are) and I are having a go at feeding and filtering information related to setting up a consultancy business. This is something I have lived through, so I can do the curating, and something JB is interested in, so JB can do the editing. The ultimate goal is to produce a book that has effectively been filtered through the process we are following.

The process at the moment goes like this:

  1. Curate the material on Scoop.it
  2. Select relevant entries for inclusion in the book
  3. Do an initial organisation of the information based upon business content
  4. Enter these into Storify story repository
  5. Arrange the content in Storify
  6. Produce the narrative around this content
  7. Flesh out the narrative to produce the book
  8. Publish as an eBook

Simples!

Who knows what it will produce - then the question is - can you automate this and produce a book automatically - what?! (There's a sketch of that thought below the links.)

Scoop.it process link - http://www.scoop.it/t/how-to-set-up-a-consulting-services-business

Storify process link - work in progress don't forget - http://storify.com/grandwizz/for-those-who-want-to-quickly-set-up-in-business
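
On the automation question: steps 1 to 3 could plausibly be scripted. A thought-experiment sketch in Python that pulls the curated items into a chapter skeleton - note the /rss.xml feed location is an assumption, so check what the Scoop.it topic above actually exposes:

```python
import urllib.request
import xml.etree.ElementTree as ET

# Assumed feed location for the Scoop.it topic linked above.
FEED = "http://www.scoop.it/t/how-to-set-up-a-consulting-services-business/rss.xml"

with urllib.request.urlopen(FEED) as resp:
    tree = ET.parse(resp)

print("How to set up a consulting services business - draft skeleton\n")
for item in tree.iter("item"):
    title = item.findtext("title", default="(untitled)")
    link = item.findtext("link", default="")
    print(f"* {title}\n  {link}\n")
```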

Or is life too short ..... at least it's fun trying?

Saturday, 27 October 2012

Quality of the software quality standard!

Just for completeness, below is the current set of 'metrics' recommended by the software quality standard.

You can try and demonstrate 'progress' using these and some smoke and mirrors (sorry - project review document), but most folk would start to glaze over.

ISO 9126-3:2003 (factors/criteria) properties/sub-properties, referenced from:

http://www.arisa.se/compendium/node6.html

You are probably thinking I haven't put any of these into place for my coding - join the club - we have probably been distracted by actually trying to solve the problem ;)

End of software progress rant.....

Friday, 19 October 2012

One year on.

Can't believe it but have been doing this for a year!

What I thought would be a pretty old-school re-introduction to Fortran computing has turned into a fantastic kaleidoscope of programming, networking and cloud computing activities. It's not exactly been the structured journey originally planned, but rather a meander around various avenues as and when I came across them.

One frightening realisation was that it didn't take long to get back into a Fortran 'flow' - very enjoyable - mathematical routines, the lot. Which was a great re-starter! However, the most exciting part of the past year has been the introduction to new media for sharing and networking 'knowledge'. The brilliant thing is that this part of the revival has cost absolutely nothing in application costs; the biggest cost was the time invested in developing the content and the network. That, however, is part of the attraction of course!

Quick summary of online things, including an estimate of the 'value' of each:


  1. LinkedIn - essential, nuff said - 10/10
  2. Storify - excellent repository and for thread building - 7/10
  3. Scoop.it - very easy for collation of things - 7/10
  4. Google+ - I like this as it connects me to all the other G-things - 9/10
  5. Google Sites - brilliant free website builder - free to anyone - 10/10
  6. Google Reader - great collator of news feeds - 6/10
  7. Google Analytics - great for monitoring website activity - 6/10
  8. Twitter - originally thought this would be rubbish, but it turned out to be fantastic for current news - 10/10
  9. Tweetdeck - what a dashboard - 9/10
  10. Trello - came out of the blue - project/activity management tool - now essential - 9/10
  11. Corkboard - the one that fell by the wayside - a reminder/post-it-note site - 4/10
  12. Blogger - without which you wouldn't be reading this year's worth of diary - 10/10
  13. GitHub - code repository - essential - and one with great potential for the future - 8/10
  14. Dropbox - absolutely brilliant - 10/10
  15. Photobucket - saved me hours copying photos between machines - 6/10


Wow - that's a pretty exhaustive list, but it's still only scratching the surface. You definitely need a 'strategy' for managing all this, otherwise burnout will ensue. In my view the trick is to have an 'approach' for each of these and for how they fit together - more on that in future posts.

With thanks to all - named in past posts - for pointing me in the direction of these new worlds!!

Saturday, 13 October 2012

A million dollar idea!

Cont' ..... from last blog post.

Having mulled the problem of measuring software development over for a week, and having performed extensive literature searches (that's Google and Wikipedia), I think I've got one - a million dollar idea, that is!

Maybe I shouldn't tell .... hey ho - won't be the first time.

So the best that the extensive research could come up with was essentially that you need to define some 'metrics' that you then monitor, for example:

  1. metric - number of lines of code written; issue: doesn't this just measure the efficiency of the coding team?
  2. metric - number of bugs corrected; issue: same as 1 above?
  3. metric - structured processes in place; issue: is anyone following them?
  4. metric - verification and validation testing; issue: not bad, but does it measure progress?
  5. metric - timelines/project planning; issue: will you (or maybe you will) plateau at 90% complete?
  6. metric [for 'agile' development processes such as RAD, DSDM and other HP (Hackers' Paradise ;) processes] - small programme elements, and the novel concept of speaking to the project team!; issue: again not bad, but am I going to tell you I have a stuck bolt and it's going to take me at least a week to fix it?
  7. etc.
Note to self: must remember the 'agile' terminology for future reference and use on spreadsheet development projects!


All this is still pretty intangible for the 'manager' who just wants to know if it's going OK and whether things will be finished on time - not an unreasonable question. So there must be something better and more user-friendly than all this metric stuff (which is useful/essential, don't get me wrong). Being software geeks, surely there is some software available that does this for you? I've not come across anything more than apps that mechanise the above 'metrics', which is not the right answer in my view.
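
And mechanising them really is trivial, which is rather the point. Here is metric 1 in a dozen lines of Python - the source directory and file suffixes are made-up examples - and note how little it actually tells you:

```python
from pathlib import Path

SRC = Path("src")                    # hypothetical project source tree
SUFFIXES = {".f90", ".f", ".py"}     # whatever your project uses

total = 0
for path in sorted(SRC.rglob("*")):
    if path.suffix in SUFFIXES:
        # Count non-blank lines only - the crudest possible 'progress'.
        lines = sum(1 for line in path.read_text(errors="ignore").splitlines()
                    if line.strip())
        print(f"{path}: {lines} non-blank lines")
        total += lines

print(f"Total: {total} lines - but is the roof on yet?")
```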

So I started thinking about how a civil engineering project goes about doing this - not that I am a civil engineer, but that's the point! Say the project was the construction of a house: even I could walk along to the site and have a look. Is it still a hole in the ground? Have the foundations been put in place? Is the roof on? These are all tangible milestones that can be viewed by anyone. You could look at the project plan and progress charts, measure time on the job, ensure the contractor has processes in place for delivery - all metrics like the software ones. However, there is no substitute for going and having a look at the site! So what is the software equivalent of 'viewing the site'?

Taking the civil engineering analogy a step further - and probably stretching it a bit - could we not present things a little differently? Say:

  • hole in the ground - SW equivalent: project plan and development team in place,
  • foundations in place - SW equivalent: detailed design completed,
  • topping out (party time) - SW equivalent: major functional components completed and VnV'd,
  • final fit-out - SW equivalent: all input/output routines completed,
  • etc.

What would then be needed is a slick piece of software that puts this into some 'house build' type view of the project. The metrics would then be linked to something more tangible - something that anyone can understand. If the roof isn't on, for example - that is, the main functional components haven't been coded - then you could be facing major delays. There is no point in fiddling about with the fit-out if the roof still needs completing (the critical path analogy).
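
A toy version of that view, with the milestone names taken from the list above and the completion figures invented for illustration, might be no more than this:

```python
# 'Viewing the site' for a software project: tangible milestones with a
# completion bar anyone can read at a glance. Figures are made up.
milestones = [
    ("Hole in the ground (plan and team in place)",        1.0),
    ("Foundations (detailed design completed)",            1.0),
    ("Topping out (major functions completed and VnV'd)",  0.6),
    ("Final fit-out (input/output routines completed)",    0.0),
]

for name, done in milestones:
    bar = "#" * int(done * 20)
    print(f"{name:52s} [{bar:<20s}] {done:4.0%}")
```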

Et voila - you have a visualisation that anyone can view and understand.

Well something like that anyway. Now, where do I collect my money .....