Saturday 27 April 2013

Herding dogs!

I was going to say this week has seen the herding of cats, but it's been more like dogs - a bit easier to herd than cats, but if you get it wrong you get eaten!

It's all been about a senior management workshop, the focus being to pull together the structure and detail that will go into the contract wording specification. I wasn't running the thing - lucked out there - just a participant. Everyone else (if you read this - you know who you are ;) had run for the hills, as the unleashed pack can be pretty ferocious! Anyway, I thought this could be fun, in a voyeuristic sort of way, as running these events is not easy at the best of times. However, shock, horror, it all went pretty well - I even managed to contribute!

What made the difference? Well, it was the use of a technique I came across 20 years ago as part of the now old-looking SSADM toolset, since reinvented as a six-sigma activity called SIPOC. What are all these acronyms doing in here? SSADM, in case you don't know as it was a long time ago, is Structured Systems Analysis and Design Method - probably why it never really caught the imagination, come to think of it. SIPOC stands for Supplier, Input, Process, Output, Customer - i.e. how you define an activity in the SSADM process modelling world.

The following link has a good description of what you do to fill out a SIPOC:

http://www.isixsigma.com/tools-templates/sipoc-copis/sipoc-diagram/

or check Wikipedia - there's bound to be something on it there.

In fact, at the workshop, this process was followed very closely, but with the addition of discussing the 'principles' of the contract area before launching into the detail of the 'process' part. That provided a 10,000 ft introduction to what we were talking about and was a good feature to add.
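For anyone who hasn't met SIPOC before, here's what a filled-out one might look like if you jotted it down as a simple data structure - a minimal sketch, with all the names invented for illustration rather than taken from the actual workshop:

```python
# A minimal SIPOC sketch for one hypothetical contract activity.
# All names here are illustrative, not from the real workshop.
sipoc = {
    "process": "Review monthly performance report",
    "suppliers": ["Service provider", "Asset data team"],
    "inputs": ["Raw performance data", "Contract KPIs"],
    "process_steps": [
        "Collate data",
        "Compare against KPI thresholds",
        "Flag exceptions for discussion",
    ],
    "outputs": ["Performance summary", "Exception list"],
    "customers": ["Contract manager", "Senior management"],
}

# Print the table one row at a time, SIPOC order.
for key in ("suppliers", "inputs", "process_steps", "outputs", "customers"):
    print(f"{key}: {', '.join(sipoc[key])}")
```

The point of the exercise is that every activity gets pinned down from both ends - who feeds it and who consumes it - before anyone argues about the middle.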

So 10 out of 10 for SIPOC dog herding, and rock on, SSADM - there's life in there still ;)


Saturday 20 April 2013

Whole greater than the sum of the parts!

Realised I've been living a new word over the past few weeks and didn't even know it!

The word is 'equifinality' - what? Well, here is the Wikipedia definition:

"Equifinality is the principle that in open systems a given end state can be reached by many potential means. The term is due to Ludwig von Bertalanffy, the founder of General Systems Theory. He prefers this term, in contrast to "goal", in describing complex systems' similar or convergent behavior. It emphasizes that the same end state may be achieved via many different paths or trajectories. In closed systems, a direct cause-and-effect relationship exists between the initial condition and the final state of the system: When a computer's 'on' switch is pushed, the system powers up. Open systems (such as biological and social systems), however, operate quite differently. The idea of equifinality suggests that similar results may be achieved with different initial conditions and in many different ways. This phenomenon has also been referred to as isotelesis (Greek: ἴσος /isos/ "equal", τέλεσις /telesis/ "the intelligent direction of effort toward the achievement of an end.") when in games involving superrationality."

The previous post raised the issue of how you start to define the requirements and projects for a complex system - essentially, how can you make sure you have captured them all?

What you need, of course, is a very large dose of equifinality - you need to travel as many different paths as possible through whatever it is the system is being designed to do. These paths need to be both top-down and bottom-up and cover as many viewpoints of the system as possible.
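If you want equifinality in miniature, here's a toy sketch (everything in it invented): three independent 'viewpoint' steps applied in every possible order all converge on the same end state - many paths, one destination:

```python
# Equifinality in miniature: different sequences of operations
# (paths) reach the same end state. Purely illustrative.
from itertools import permutations

def apply_path(state, path):
    """Run a sequence of operations over a starting state."""
    for op in path:
        state = op(state)
    return state

# Three independent, order-insensitive steps (hypothetical viewpoints).
ops = [
    lambda s: s | {"physical"},
    lambda s: s | {"data"},
    lambda s: s | {"process"},
]

# Try all 6 orderings of the steps and collect the distinct end states.
end_states = {frozenset(apply_path(set(), p)) for p in permutations(ops)}
print(len(end_states))  # all 6 orderings converge on one end state
```

Real systems aren't this tidy, of course - which is rather the point of walking several paths: where they *don't* converge is where the interesting requirements hide.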

So what sort of viewpoints are we talking about?

What about;

  • Physical architecture
  • Data - Information architecture
  • Business Process architecture
  • Security architecture
  • Enterprise architecture
  • Functional architecture 
  • User architecture

However, you may need a degree in architecture to complete them all ;)

A final thought from the complexity course I'm doing: even if you take all these paths, you will still have some emergent property that takes you by surprise!






Saturday 13 April 2013

Bottom-up and Top-down

No ..... not that sort .....

Validation and verification related of course!

The Question:
You have a brave new view of the future for your operations - big data related to big assets and all that new stuff is banging on your door. Why aren't you using this to improve efficiencies in the business? Everyone else is - just read about what you can do.

Issue:
You have an operation that currently runs not that badly, is very complex, and has lots of fragmented data. How can you start to introduce a new big-data type system into what you do?

The Solution:
You need to start by gathering requirements for the new system, have a look through these, and then see which can be implemented and on what timescales - of course - simples!

Well, that's all very good from the 10,000 ft management helicopter view of the problem. The next step in this world is a bit of Star Trek Management (STM),

'Make it so', Number One.

and off we go.

Meanwhile, in a universe near you, requirements gathering has started, as the 'make it so' at this level doesn't involve much thinking, just a bit of organising of meetings. This usually goes well - everyone wants to get their 'issues' out on the table, "and I want a yellow button in the top corner of the screen" type stuff, along with "we would like to manage risk at an enterprise level". Result - one big bucket full of requirements!

Yes, yes, I can hear you requirements management types - structured approach, attributes, blah, blah... Unfortunately, here in the real world the Captain wants progress, and NOW! So things happen, and the feedback is good, everyone is venting - carry on, Number One! More workshops - they work. The bucket gets fuller and fuller - big data gone mad. We need a management tool for all this, so roll out some requirements software to manage it all. Phew, thank goodness that existed - now we can relax, can't we? But no - it's just a fancy bucket - we shall have to engage (STM) brain to figure out what to do with all this data (sorry - poor STM jokes).

Number One, "where are we?" - "We have a bucket full, sir."

So, the problem with the bottom-up set of activities is that you end up in a position where you can't see the wood for the trees. The STM top-down view of the world, meanwhile, ends up launching a raft of projects, but you are never sure if they will connect with the real world. The conclusion so far is that, unless you do both BU and TD, you will never figure out if your big data related initiatives will be viable and add value to the business.
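To be fair to the requirements management types, the 'structured approach, attributes' jibe does hide a real point: tag each bottom-up item with a top-down theme, and the bucket becomes sliceable in both directions. A toy sketch (all requirements and theme names invented):

```python
# A sketch of the "structured approach, attributes" idea: tag each
# bottom-up requirement with a top-down theme so the bucket can be
# sliced both ways. Requirements and themes are invented examples.
requirements = [
    {"id": "R1", "text": "Yellow button in the top corner", "theme": "UI"},
    {"id": "R2", "text": "Manage risk at an enterprise level", "theme": "Risk"},
    {"id": "R3", "text": "Consolidate fragmented asset data", "theme": "Data"},
    {"id": "R4", "text": "Risk dashboard for operations", "theme": "Risk"},
]

def by_theme(reqs):
    """Group requirement ids under their top-down theme."""
    grouped = {}
    for r in reqs:
        grouped.setdefault(r["theme"], []).append(r["id"])
    return grouped

print(by_theme(requirements))  # {'UI': ['R1'], 'Risk': ['R2', 'R4'], 'Data': ['R3']}
```

The tagging is the boring bit, which is exactly why it gets skipped when the Captain wants progress NOW - but without it the fancy bucket stays a bucket.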

Not thought further than this yet ..... sits down and puts fist on chin .....



Saturday 6 April 2013

3D's of computing (Data, Devil, Detail)

This week has seen a flurry of activity under the banner of Big Data!

The finale was Friday evening, watching a recording of this week's Horizon programme on 'Big Data' - which I watched with 3 of my advisors - sounded like geek heaven to us. Anyway, the programme unfolded: blah, blah, big data, lots of 1's and 0's flashing over the screen to show you where the big data was coming from and going to. As it went on, though, I personally was having trouble keeping my face straight - to the annoyance of one of my advisors, who kept telling me to shut up. Having slept on it, and having been immersed in a real live project for the past month or so directly dealing with Very Big Data (maybe that will catch on - VBD ;), the things that were bothering me boil down to the following:

  1. there's a 'smoke and mirrors' feel about a lot of this big data talk. Certainly there is vast potential for mining data but, from what I've seen, 'ordinary' companies are miles away from being in a position to exploit it fully. Enter the big data repository suppliers, who will solve all your big data consolidation and mining problems for you. Off you go....
  2. enter the mythical 'algorithm' - is having this central repository going to work? As in the Horizon programme, when you need to access the data all you do is create the algorithm to do what you need - simples! You have your data, you can access it from anywhere at any time (oh yes you can), so what are you going to do with it? (In my world you should have thought of that beforehand, but that's another story.) You have your bucket of data and want to fish out some 'benefit'. What do you do? You write an algorithm - and most of this algorithm is just searching, filtering and displaying - not much algorithm about that. However, there could be an analysis element in this algorithm too - sounds like you need to dust off the old Fortran compiler to me! What's the problem? The problem is spreadsheets: everyone wants to run their own personal 'algorithm' dealing with their own specific needs - and quite rightly too! They take an extract of the big data, do some work on it, write the report and off they go. Well, probably a bit more than that, but you get the idea! All this leads to fragmentation (again) of the data set, as it is difficult to re-upload your work back into the mother ship.
  3. what's needed, of course, is a managed way of allowing access to the big data and development of local 'algorithms' - sounds like app development to me! These can use and refresh the big data appropriately. Sorry, seem to have entered the smoke and mirrors zone again. Great aspiration, but do 'ordinary' companies really have the quality of data to allow meaningful apps to be developed?
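Just to show how little 'algorithm' there often is in the algorithm, here's the searching-filtering-displaying part in a few lines. Toy readings, with field names and the threshold all invented for illustration:

```python
# What most of the mythical big-data "algorithm" actually is:
# search, filter, and display. Toy data, invented field names.
readings = [
    {"asset": "pump-1", "metric": "vibration", "value": 0.8},
    {"asset": "pump-2", "metric": "vibration", "value": 2.4},
    {"asset": "pump-3", "metric": "temperature", "value": 71.0},
    {"asset": "pump-4", "metric": "vibration", "value": 3.1},
]

THRESHOLD = 2.0  # illustrative alert level

# "Search and filter": pick out the vibration readings over threshold.
alerts = [r for r in readings
          if r["metric"] == "vibration" and r["value"] > THRESHOLD]

# "Display": the report everyone then extracts into their own spreadsheet.
for r in sorted(alerts, key=lambda r: r["value"], reverse=True):
    print(f"{r['asset']}: {r['value']}")
```

The genuinely hard part - the analysis element, and getting the results back into the mother ship instead of a thousand private spreadsheets - is exactly what this sketch leaves out.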
The thoughts continue, keep smiling......