Saturday, 1 December 2012

An emergent property!

Over the past few weeks - through my various meanderings on Twitter and LinkedIn - something that I have been seeking has emerged! 

A topic that I came across in a tweet by one of my GURUs (that's Grandwizz Useful Research Unit) was taken and re-posted into one of the company 'thought leadership' groups on LinkedIn. Nothing earth-shattering in that process; however, it has been fascinating to see how this topic has sparked interest among a 'self-organising' group of people. No need to send emails around the various operating units across the globe to canvass for support - which usually results in getting someone nominated who is not fully engaged - the Diamond Dogs have formed.

Comprising me, plus:

  • Ben
  • Cam
  • Eric
  • Ian
  • Marcelo
  • Paul
  • Tom - who set the challenge!
Many thanks, by the way, to all - you know who you are - for the input so far.

The topic we are thinking through is the use of 'gamification' to help support and grow staff development - which sounds pretty boring when you say it like that. Essentially it is the use of game mechanics and token incentives - like collecting game points to show your 'power' to others. Or possibly hotel points in my case, given the number of schemes I seem to be enrolled in! Given that this was completely new to me a couple of weeks ago, I am now seeing these sorts of things everywhere: LinkedIn 'profile % complete', 'endorsements', Twitter followers, etc.

The challenge resulted from Tom's use of Foursquare (I'd not used that either - just to let you know how far behind the drag curve I am on these things), where 'badges' can be gained for visiting certain places - badge collection resulting in gamification of travel. I'm still struggling with Foursquare if truth be told, but I can understand the concept of incentives for visiting places - still feels a bit boy-scout-ish to me though. The concept of using badges for training and development purposes is well established - a few sites are listed below for those interested - and widely used in the education field. Our challenge was: could we apply these concepts to our internal activities? Seems like a very reasonable task - how to use non-monetary public recognition awards within the business to help raise staff engagement.

Some of the key requirements - given the topics of previous posts, I had to put a few of these down ;)
  1. must be easy for staff to 'sign-up' to the scheme
  2. must be accessible to all staff - no inner-circles
  3. must be recognised across the business
  4. must be easy to implement
  5. must be publicly viewable
  6. must be linked to tangible benefits (e.g. enhanced peer recognition)
  7. must be cheap (if not free)!
See how we get on in future posts ;) .....
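
As a thought experiment, here is a minimal sketch of what a badge record honouring some of these requirements might look like. It is purely illustrative Python - the names, fields and examples are all invented, not a real scheme:

    # A minimal, hypothetical sketch of a badge scheme honouring requirements
    # 2, 5 and 6 above: open to all staff, publicly viewable, and tied to a
    # tangible peer-recognition benefit. All names here are invented.
    from dataclasses import dataclass, field
    from datetime import date

    @dataclass
    class Badge:
        name: str             # e.g. "Mentor" - an invented example
        criteria: str         # plain-English statement of how it is earned
        public: bool = True   # requirement 5: publicly viewable by default

    @dataclass
    class Award:
        badge: Badge
        staff_id: str         # requirement 2: any member of staff can hold one
        awarded_on: date = field(default_factory=date.today)
        endorsements: int = 0 # requirement 6: peers can endorse the award

    # Requirement 1 ('easy to sign up') then amounts to creating an Award:
    mentor = Badge("Mentor", "Coached a colleague through their first project")
    award = Award(mentor, staff_id="E1234")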



Badge collection site links

A few links if you want to explore further:

http://gamifyforthewin.com/2012/10/big-news-the-book-is-out/

http://duolingo.com/

http://www.freetech4teachers.com/


http://www.openbadges.org/en-US/ which provides some basic open-source tools for putting badge reward schemes in place.

Saturday, 24 November 2012

Management of requirements management!

Quote this week from Bambofy - "you deal with the boring end of software development".

I think I agree.

This week has taken a bizarre twist in that it's been a week of 'requirements management' (RQM) issues. Two areas emerged: the first around how to specify requirements appropriately, and the second around reusing them. You have to admit that sounds pretty boring, doesn't it!

But when you try to get your head round these things, the situation rapidly gets complicated. A problem emerges around the sheer number of 'requirements' that can be generated if you don't have a strategy for RQM. Let me try and illustrate.

Even for a simple system there is an exponential increase in the number of requirements the more you need to partition things. Let's not use a software example, as those tend to be a bit obtuse, but take a house build instead - hopefully we can all relate to that a bit better. I'm assuming in all this that everyone is signed up to undertaking some form of RQM as part of the design, of course! The first decision is how you are going to represent the 'systems' involved, as you will need to be able to allocate the requirements throughout the house in some manner. If you don't get this bit right you have already increased the gradient of the requirements growth curve. In our house example you could take each room as a 'system', or each major element of infrastructure as a 'system', or one of many other variations. Let's take the infrastructure view, as this is more akin to what you would do for more complex assets: railways, oil platforms, power plants, etc.

So off we go doing our requirements capture exercise - don't worry I'm not going to do the whole thing - even I'm not that sad!

There are, say, at least 10 major areas to consider, e.g. 1 water, 2 electrical, 3 heating, 4 lighting, 5 civil structure, 6 plumbing, 7 waste treatment, 8 accessibility, 9 safety, 10 usability ... etc.

Each of these areas breaks down into at least 10 further sub-areas, e.g. for 1 water these could be 1.1 sinks, 1.2 baths, 1.3 toilets, 1.4 hot water ... etc.

Even for this relatively simple example we already have 10 x 10 = 100 sub-areas to allocate requirements to. We could then easily envisage coming up with, say, 10 main requirements for each of these sub-areas, and at least a further 10 sub-requirements for each main requirement. You can see where this is going - we now have 100 (sub-areas) x 10 (main) x 10 (sub-main) = 10,000 requirements to allocate and track. On top of this, it is likely that we would need to allocate a set of 'attributes' to each requirement so that we could also track certain types of requirement rather than just which area they are allocated to - attributes like environment, performance, safety, quality, etc. - which could again easily add up to 10 at minimum. So - are you still awake? - in total, without even trying, we have got ourselves into a situation where we are reporting and tracking 100,000 items - just for a house!

Serious problem eh - if you are not careful this is also serious job creation!

This also assumes that you can clearly specify your requirements in the first place - if not, you could easily start with 100 top-level requirements (I have seen this), leading to 1,000,000 items to manage - good luck with that one.
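
The arithmetic is worth making explicit. A few lines of code (Python, purely for illustration) reproduce both figures:

    # Reproducing the back-of-envelope requirements explosion above.
    areas = 10        # water, electrical, heating, lighting, ...
    sub_areas = 10    # e.g. water -> sinks, baths, toilets, hot water, ...
    main_reqs = 10    # main requirements per sub-area
    sub_reqs = 10     # sub-requirements per main requirement
    attributes = 10   # environment, performance, safety, quality, ...

    requirements = areas * sub_areas * main_reqs * sub_reqs
    print(requirements)               # 10000 requirements
    print(requirements * attributes)  # 100000 items to report and track

    # The pathological case: a factor of 10 more at the top level.
    print(10 * requirements * attributes)  # 1000000 items to manage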

That is why it is imperative that you have a rationale for management of your requirements management. And, no, you don't just have to purchase a requirements management software package.

You then have to ask yourself, if you tick all the requirement boxes, is your built system the one you wanted - would you want a builder to manage the build of your house in this way - or would you rather have the build project overseen by a civil engineer?

In the overall scheme of things it's still pretty boring - but critical to get right!

Now some of these requirements can surely be reused on the next house - but which ones ;)

Saturday, 17 November 2012

Analytical taxonomies - appropriate analysis

Having had a pop at approaches to 'Big Data Analytics' based around spreadsheets in the last post, the question has to be: "so what does appropriate analysis look like?"

In my various internet wanderings this week I came across a couple of articles that for me give a glimpse into what the future should look like.

The first is by Jim Sinur, in an entry on applying analytics to processes and not just data; follow the link for more detail:

http://blogs.gartner.com/jim_sinur/2012/11/13/is-unmanned-process-management-a-pipe-dream/

In fact, thinking through exactly what you are expecting your 'processes' to deliver, rather than simply feeding the process, is key - as is 'unmanned' optimisation and management of the interactions between them!

The figure below illustrates some of the analytical taxonomy that could be used.


As well as the process analytics elements outlined above, the sheer volume of data to work through will also require new computing techniques. The second article I came across, by Rick Merritt in EE Times, illustrates the type of computing power that will be available:

http://www.eetimes.com/electronics-news/4401143/Startup-demos-new-road-to-computing-at-SC-12

which, by the sounds of it, is 40,000 processors working in a parallel configuration using neural net and fuzzy logic techniques to crank out 5.2 tera-operations per second!


So the Big Data Analytics future, for me, contains complexity in both analysis techniques and computational systems. A bit more than a few unconnected spreadsheets.

Looks exciting eh!!

Sunday, 11 November 2012

Big Data Analytics - inappropriate analysis

I thought I wasn't going to rant about this again for a while but three encounters this week have fanned the flames again.

I don't know how many Twitter and LinkedIn posts I have made on Big Data + Analytics over recent months, but it's definitely an area on an increasing trend in the business world. However, the reality is that most of the business world struggles to extract any meaningful 'knowledge' from all of the 'data' that is 'collated' from business activities.

Why is that? Because the main analysis tools used are spreadsheets - and in particular, Excel. Now don't get me wrong, Excel is a phenomenal software package - but in my view it is, in some instances, being used to produce models that are way outside its domain of appropriateness.

What do I mean by that? Well - three events this week have highlighted the tip of the iceberg for me. All of these are being addressed, I hasten to add, but I don't think I am alone in my experiences.

1 The first was when I was sitting in a meeting looking at the projected image of some analysis done in Excel, upon which we were making decisions that affected the direction of the business. One cell among a myriad was being concentrated on - and the value in that cell was zero. Everyone in the room knew that wasn't right, so we all sat there for five minutes discussing why it was so. Now, this could have been a simple mistake somewhere on one of the supporting sheets, but the effect was to throw the whole of the analysis into question. How could we then believe any of the other numbers? Therein lies the first 'rant fact' - it is difficult to manage traceability in these sorts of tools.

2 The second was when I was asked to comment on and add to a sheet of supporting data for input into a model. Someone was collating data to help build up a spreadsheet model and was emailing around for changes to the data sheet. Of course, no one person holds all of this in their head, so people were passing the sheet on for refinement. The version that came to me for input was 'blah filename - Copy - Copy - Copy'. Therein lies the second 'rant fact' - if not part of some wider process, configuration and version control can get out of hand pretty quickly.

3 The third, and for me the most serious, came from checking through a part of a model that didn't appear to be functioning as expected (see 'rant fact' 1). When I looked into the sheets in question - without even going into the equation set being used - I found one sheet with 100 columns and 600 rows of manually entered data. That's 60,000 places for making an input error on that sheet alone, and there were more sheets! Therein lies the third 'rant fact' - data quality within this environment is difficult to control (a quick sketch of what a scripted check might look like follows below).
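
For what it's worth, even a small scripted validation pass would catch a lot of this. A hypothetical sketch in Python - the file name, column name and valid range are all invented, and it assumes the sheet has been exported to CSV:

    # A minimal data-quality gate over a sheet exported to CSV.
    # File name, column name and valid range are invented examples.
    import csv

    def validate(path):
        errors = []
        with open(path, newline="") as f:
            # start=2 because row 1 of the sheet holds the headers
            for row_num, row in enumerate(csv.DictReader(f), start=2):
                try:
                    flow = float(row["flow_rate"])
                except (KeyError, TypeError, ValueError):
                    errors.append(f"row {row_num}: flow_rate missing or non-numeric")
                    continue
                if not 0.0 <= flow <= 100.0:
                    errors.append(f"row {row_num}: flow_rate {flow} out of range")
        return errors

    for problem in validate("model_inputs.csv"):
        print(problem)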

The issue is that Excel in particular is so easy to fire up and start bashing away at that we forget we are, in some cases, building complex calculation engines. In some instances these engines are built with no 'design' process at all. There is no overarching systems design process, and even at a simplistic level there is no recognition of fundamental modelling techniques that would improve modelling, and therefore output, quality - namely, consideration of the following:

1 Functional model development - what is the sheet set up to do? Even a simple flowchart would help, never mind a proper functional breakdown of the calculation set.

2 Data model development - what data, from where, in what format? The sort of views that force thinking about quality control of the data. A database, maybe! (A tiny typed-schema sketch follows after this list.)

3 Physical model of the hardware - how does the overall 'system', including data input, connect, store and process the information? Maybe using email and collating feedback on a laptop is not the best system configuration.
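
To illustrate point 2: a typed record makes the 'what data, from where, in what format' questions explicit, and makes conversion failures loud. A purely illustrative sketch - the field names, units and sources are invented:

    # Forcing the 'what data, where from, what format' questions into code.
    # Field names, units and sources are invented for illustration.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class PumpReading:
        site: str         # where from: which site the reading belongs to
        flow_rate: float  # what format: litres/second, a number not text
        source: str       # provenance: 'SCADA export', 'manual entry', ...

    def load_reading(raw: dict) -> PumpReading:
        # A bad value fails loudly here, not silently in a cell three
        # sheets away from where anyone is looking.
        return PumpReading(site=str(raw["site"]),
                           flow_rate=float(raw["flow_rate"]),
                           source=str(raw.get("source", "manual entry")))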

All these activities add time and cost to model development and, because their results are intangible and difficult to measure, they can get left out in the rush to get the answer out. However, the question is: would you put your own money at risk on the basis of this 'answer'?

What is the solution? Well, certainly don't let learner drivers loose in the F1 racing car for a start - but there must also be some way of providing an easily accessed development environment that can be used to translate formulae into readable and understandable code - formula translation - now that could catch on (sorry, couldn't resist!).
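
To make the 'formula translation' point concrete: take an invented cell formula like =IF(B2>0, C2/B2, 0) and give the logic a name, a home and a test:

    # The same logic as the (invented) cell formula =IF(B2>0, C2/B2, 0),
    # but named, documented and testable.
    def unit_cost(total_cost: float, units: float) -> float:
        """Cost per unit; zero when no units were produced."""
        return total_cost / units if units > 0 else 0.0

    assert unit_cost(120.0, 40.0) == 3.0
    assert unit_cost(120.0, 0.0) == 0.0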

Saturday, 3 November 2012

To blog or to curate - that is the question?

More a thought for the day this one.

You definitely need a strategic approach to get the most out of all of this social media capability. There is so much to go at that you can quite easily become social-app weary. Not to mention spending your whole life trawling through the various information feeds!

Check out Guy Kawasaki's view at the following link for a more 'rounded' assessment:

http://www.socialmediaexaminer.com/blogs-books-and-social-how-the-world-has-changed/

Which is great, but what are you going to do with all this 'networking' data and information, just leave it all hanging out there?

That is why I believe you need some sort of strategic goal - something that all of the collating and curating works towards. Currently, one of my trusted advisors (JB, you know who you are) and I are having a go at feeding and filtering information related to setting up a consultancy business - something I have lived through, so I can do the curating, and something JB is interested in, so he can do the editing. The ultimate goal is to produce a book that has effectively been filtered through the process we are following.

The process at the moment goes like this:

  1. Curate the material on Scoop.it
  2. Select relevant entries for inclusion in the book
  3. Do an initial organisation of the information based upon business content
  4. Enter these into the Storify story repository
  5. Arrange the content in Storify
  6. Produce the narrative around this content
  7. Flesh out the narrative to produce the book
  8. Publish as an eBook

Simples!

Who knows what it will produce - then the question is - can you automate this and produce a book automatically - what!!
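
On that automation question - a very rough sketch of how steps 1 to 3 might be scripted. It assumes the curated topic exposes an RSS feed (the URL below is a placeholder, not a real endpoint) and uses the feedparser library:

    # A rough sketch of automating steps 1-3: pull curated items from an
    # RSS feed and dump them into a draft outline for manual editing.
    # The feed URL is a placeholder, not a real endpoint.
    import feedparser  # pip install feedparser

    feed = feedparser.parse("https://example.com/curated-topic/rss")

    with open("draft_outline.txt", "w") as draft:
        for entry in feed.entries:
            draft.write(entry.title + "\n")
            draft.write(entry.link + "\n")
            draft.write(entry.get("summary", "") + "\n\n")

Steps 6 and 7 - the narrative - would, of course, remain stubbornly manual.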

Scoop.it process link - http://www.scoop.it/t/how-to-set-up-a-consulting-services-business

Storify process link - work in progress, don't forget - http://storify.com/grandwizz/for-those-who-want-to-quickly-set-up-in-business

Or is life too short ..... at least it's fun trying?

Saturday, 27 October 2012

Quality of the software quality standard!

Just for completeness, below is the current set of 'metrics' recommended by the software quality standard.

You can try and demonstrate 'progress' using these and some smoke and mirrors (sorry - project review document) but most folk would start to glaze over.

ISO 9126-3:2003 (factors/criteria) properties/sub-properties, referenced from http://www.arisa.se/compendium/node6.html

You are probably thinking I haven't put any of these in place for my own coding - join the club - we have probably been distracted by actually trying to solve the problem ;)

End of software progress rant.....

Friday, 19 October 2012

One year on.

Can't believe it but have been doing this for a year!

What I thought would be a pretty old-school re-introduction to Fortran computing has turned into a fantastic kaleidoscope of programming, networking and cloud computing activities. It's not exactly been the structured journey originally planned, but rather a meander around various avenues as and when I came across them.

One frightening realisation was that it didn't take long to get back into a Fortran 'flow' - very enjoyable - mathematical routines, the lot. Which was a great re-starter! However, the most exciting part of the past year has been the introduction to new media for the sharing and networking of 'knowledge'. The brilliant thing is that this part of the revival has cost absolutely nothing in terms of application costs. The biggest cost was the time invested in developing the content and the network. That, however, is part of the attraction, of course!

A quick summary of online things, including an estimate of the 'value' of each:


  1. LinkedIn - essential, nuff said - 10/10
  2. Storify - excellent repository and thread-building tool - 7/10
  3. Scoop.it - very easy for collation of things - 7/10
  4. Google+ - I like this as it connects me to all other G-things - 9/10
  5. Google Sites - brilliant web-site builder - free to anyone - 10/10
  6. Google Reader - great collator of news feeds - 6/10
  7. Google Analytics - great for monitoring web-site activity - 6/10
  8. Twitter - originally thought this would be rubbish, but it turned out to be fantastic for current news - 10/10
  9. Tweetdeck - what a dashboard - 9/10
  10. Trello - came out of the blue - project/activity management tool - now essential - 9/10
  11. Corkboard - the one that fell by the wayside - reminder/post-it-note site - 4/10
  12. Blogger - without which you wouldn't be reading this year's worth of diary - 10/10
  13. GitHub - code repository - essential - and one with great potential for the future - 8/10
  14. Dropbox - absolutely brilliant - 10/10
  15. Photobucket - saved me hours copying photos between machines - 6/10


Wow - that's a pretty exhaustive list, but it's still only scratching the surface - you definitely need a 'strategy' for managing all this, otherwise burnout will ensue. In my view the trick is to have an 'approach' for each of these, and for how they fit together - more of that in future posts.

With thanks to all - named in past posts - for pointing me in the direction of these new worlds!!