All things computing, software engineering and physics related! Now also including retirement stuff.
Friday, 3 August 2012
Not by me - Prof Jefferies has appointed Bambofy to help sort out the analysis of the swimming data.
First up is the data extraction, so that we can pick out the waveforms associated with each hand. Given the way the data is stored, even doing that is turning out to be a bit of a trauma. It is being done in Python - so that's the end of me - PJ was going to do it in Fortran (same code era as me, you see) but we have been overruled.
The monkey is now on Bambofy's back to split the waveforms into left- and right-hand sets - then the fun starts.....
Sunday, 29 July 2012
Holiday computing!
All a bit sporadic for the next few weeks while holidaying takes precedence.
Think I may have a new application for Robfit fittery - nothing to do with gamma rays this time but swimming related!
The essence of the fitting is the analysis of swimmer force profiles, measured using 'gloves' worn by the swimmers. Analysing the waveforms from these gloves is being used to improve swimming technique. A better explanation of the physics behind all this is provided in an article by friends Stuart and Colleen in the Journal of International Society of Swimming Coaching:
Full reference: http://isosc.org/index.php/journal/international-journal-of-swimming-coaching/261-journal-volume-2-issue-2
The Effect of Real-Time Feedback on Swimming Technique (page 41)
"We examine a new approach for accelerating the learning of efficient stroke mechanics: using a flume equipped to deliver multi-perspective live video footage and force analysis data simultaneously to the swimmer and the coach. A preliminary study of the effectiveness of this approach with a small group of age group swimmers shows gains in ability to generate force of around 20% and to improve swim velocity with only two hours of application."
where you can see the profiles produced by the gloves.
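No sight of the glove data format yet, but as a rough sketch of where the analysis might start - assuming the force trace is just a list of samples, and with all names my own invention - individual stroke segments could be cut out at threshold crossings:

```python
def split_strokes(force, threshold=0.5):
    """Return (start, end) sample indices of segments where the force
    trace rises above the threshold - roughly one segment per stroke."""
    segments, start = [], None
    for i, f in enumerate(force):
        if f > threshold and start is None:
            start = i                      # upward crossing: stroke begins
        elif f <= threshold and start is not None:
            segments.append((start, i))    # downward crossing: stroke ends
            start = None
    if start is not None:                  # trace ended mid-stroke
        segments.append((start, len(force)))
    return segments

# Toy trace: two bursts of force separated by a glide.
trace = [0, 0, 1, 2, 1, 0, 0, 0, 2, 3, 2, 0]
print(split_strokes(trace))  # → [(2, 5), (8, 11)]
```

Splitting the left- and right-hand sets would then just be a matter of doing this per glove channel - but that is for Bambofy to worry about.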
May take a few beers to come up with the optimal way of analysing the waveforms, mind...;)
Saturday, 21 July 2012
ROBFIT operation.....how it works!
So how does it work - just realised I launched into GitHub loading without setting out how the code operates.
ROBFIT - that's the Fortran code used to find peaks within a complex spectrum, such as a gamma ray spectrum where the code originated. The idea behind the code is that it is designed to find the very smallest peaks (signals) in a spectrum and it does that by employing a ROBust FITing technique.
There are many spectral analysis packages on the market; however, these tend to require the spectrum to be broken into small sections, each of which is then fitted separately. This creates a couple of major problems. One is that, for a large spectrum, substantial user intervention is required and the fitting takes longer to complete. Secondly, and more importantly, splitting the background into sections may misrepresent the background continuum, so small details in the spectrum could be missed.
What's the solution?
Use ROBFIT - sorry - but yes, do - the code gets round these problems by separating spectra into two functions: background and foreground. The background contains the slowly varying features and the foreground contains the high-frequency content (peaks). Accurate separation of these functions allows the code to detect small peaks and decompose multiple-peak structures. ROBFIT iterates on the background and foreground fitting to move smaller peaks from the background to the foreground.
A critical feature is that the code fits the background over the entire spectrum as a set of cubic splines with adjustable knots - a knot being a place where two cubic splines meet. More on this in a later post. Fitting over the whole spectrum range allows the background features to be fitted with fewer constants, resulting in a more accurate representation than is possible when fitting in small sections.
Two algorithms make this operation possible. The first is a data compression algorithm which uses a robust averaging technique to reduce the contributions to the background from peaks and spurious high points. The second is a minimisation algorithm (the SMSQ routine) that minimises chi-square with respect to the constants of the background and foreground. With the background represented as a smoothly varying function, peaks can be identified as regions of the spectrum that lie above the background curve - simples!
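Not the ROBFIT machinery itself, but the background/peak separation idea can be sketched in a few lines of Python - a toy illustration only, with a moving median standing in for the robust averaging, a simple sqrt(N) cutoff standing in for the full chi-square fit, and all names my own:

```python
from statistics import median

def robust_background(counts, window=25):
    """Estimate a slowly varying background with a moving median.
    A median is insensitive to narrow peaks sitting on the continuum,
    so the peaks are largely excluded from the estimate."""
    half = window // 2
    bg = []
    for i in range(len(counts)):
        lo, hi = max(0, i - half), min(len(counts), i + half + 1)
        bg.append(median(counts[lo:hi]))
    return bg

def peak_channels(counts, background, nsigma=3.0):
    """Flag channels lying more than nsigma * sqrt(bg) above the
    background (Poisson counting statistics assumed)."""
    return [i for i, (c, b) in enumerate(zip(counts, background))
            if c - b > nsigma * max(b, 1.0) ** 0.5]

# Synthetic spectrum: flat continuum of 100 counts, narrow peak at channel 50.
spectrum = [100.0] * 100
for offset, extra in [(-1, 40.0), (0, 80.0), (1, 40.0)]:
    spectrum[50 + offset] += extra

bg = robust_background(spectrum)
print(peak_channels(spectrum, bg))  # → [49, 50, 51]
```

The real code replaces the median with its robust averaging compression and the flat cutoff with the spline background, but the division of labour is the same.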
So now you know.....
Saturday, 14 July 2012
Momentum ..... increasing.
Update of the Twitter social universe side of the revival.
Approaching 100 Twitter followers has been an interesting journey. The 100, which is 'pulsating' all the time, is homing in on an excellent set of communities. Having been directed to the use of Lists by @sheffters, I now have a way through the noise of Twitter land. Though I thought I was listing these contacts for my own use, I found out this week that the person I have listed also gets informed - not a problem really, in fact I've had a few nice messages from people as a result!
These contacts have taken a bit of a tortured route though, which makes me a bit suspicious that I am being spoon-fed by the Twitter machine. I started off with lots of 'follows' from nice young ladies - at least that's what they looked like - which I didn't follow back, I will have you know! Welcome to Twitter. However, these soon 'unfollow'. If (like me) you have a plan for the use of Twitter then things start to get organised pretty quickly - well, it's taken a few months to get to this stage. Soon the follows become more relevant, subject-area-wise that is. If you then have a rationale for who you do follow back, you end up with a pretty focussed set of information feeds. I'm probably driving my contacts on LinkedIn mad by posting links that I come across there too, but that's all part of that sharing thing.
The question now is what happens next: what happens with the focussed group, and how can I ask a question of this set of individuals without being lost in the noise?
Onward and upward as they say.....
Saturday, 7 July 2012
Next subroutine.....
I've almost forgotten what 'subroutines' do!
However, I've managed to load onto GitHub the first routine called by the ROBFIT background fitting code BKGFIT, detailed in previous posts.
The code BKLINK is now available for viewing. Though I think I should have put more comments into the code!
Anyway - this is another of the routines used in the background fitting process - it is essentially an input routine that reads user-defined values from a file called BKGFIT.MNU - which I still need to find!
Why bother fitting the background when the idea is to be looking for small peaks?
The code has mainly been developed around the fitting of gamma-ray spectra but can be used on any data set which requires the identification of peaks among significant background 'noise'. Once you have identified what the background looks like and have represented it mathematically, the search for small variations from this representation is made easier. Exactly like the identification of the signal for the Higgs Boson reported upon this week. Blimey, I am topical - it wasn't planned!
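As a toy illustration of why this helps (plain sqrt(N) counting statistics here, nothing like ROBFIT's full chi-square treatment): once the background level in a window is represented mathematically, any excess over it can be quoted in standard deviations -

```python
def significance(observed, expected):
    """Excess over the fitted background in standard deviations,
    using the sqrt(N) Poisson counting error on the background."""
    return (observed - expected) / expected ** 0.5

# 5000 background counts expected in a window, 5400 observed:
print(round(significance(5400, 5000), 2))  # → 5.66
```

Which is, give or take an awful lot of rigour, the same game as this week's 5-sigma Higgs announcement.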
Essentially the operation of the ROBFIT code follows a sequence;
- Read in data required to be analysed
- Fit the background (this can be a separate file or the code can be run 'all-up' with it fitting background and peaks)
- Search for 'channels' above a cutoff level
- Search for peak regions
- Identify peaks in these regions
- Refit all peaks in the regions
- Update the peak list
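The 'search for peak regions' step in that sequence, for instance, is just a matter of grouping the above-cutoff channels into contiguous runs. A sketch (illustrative names only, not the actual ROBFIT routines):

```python
def find_regions(channels, gap=2):
    """Group above-cutoff channel indices into contiguous peak regions,
    merging runs separated by no more than `gap` channels."""
    regions = []
    for ch in channels:
        if regions and ch - regions[-1][-1] <= gap:
            regions[-1].append(ch)   # extends the current region
        else:
            regions.append([ch])     # starts a new region
    return [(r[0], r[-1]) for r in regions]

# Channels previously flagged as above the background cutoff:
above_cutoff = [10, 11, 12, 40, 41, 55]
print(find_regions(above_cutoff))  # → [(10, 12), (40, 41), (55, 55)]
```

Each region is then handed on to the peak identification and refitting steps.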
Except it gets a bit more complicated......more on that later!
Saturday, 30 June 2012
Tracking the good stuff..
Bit of a diversion from programming but worth it.
Thanks to a bizarre combination of watching a programme on Caledonian Road (N9 London) on the BBC (excellent viewing for me), Tweeting that I had lived there for a few years, and then Eleonora http://eleonoraschinella.wordpress.com replying with a Tweet about a way of tracking tweets and storing them as a story - I discovered an excellent new online tool for corralling all this fleeting internet information.
It's called Storify http://storify.com/ - absolutely worth checking out! So I'm using it to pull together the threads of 'knowledge' that I pick up from various sources such as Twitter, Google Reader etc. So far I have gone mad and have 4 'stories' running - 2 can be found on http://storify.com/grandwizz - one is an Alan Turing tribute and the other relates to the collation of Innovation ideas! The other 2 stories are in draft and are just 'knowledge' dumps - I'll probably post them anyway this week sometime.
So the online tools are shaping up like this for me;
- Twitter - full of junk until you can filter what you want through a trusted network then excellent access to current thinking!
- LinkedIn - interfacing with Twitter to share knowledge and for deeper discussions - the negative is it's a bit glacial in response time - the positive is that the feedback you get from a network is fantastic, a living encyclopaedia.
- Github - repo for all coding - best bar none for this type of thing!
- Blogger - this - effectively my diary.
- Storify - repository for pulling things together and building themes.
- My website (yes, I am calling it that, much to the disdain of my lads) https://sites.google.com/site/oldbam/ for keeping track of where all this is located!
Now just need a good Fortran compiler - the one I have (must be free you see) is a bit too retro even for me;)
Friday, 22 June 2012
First module loaded!
Result - the first ROBFIT module has been loaded onto Github!
It only took me 4 hours to do that mind!
Setting up the ROBFIT repo (got to do the jargon right - that's short for repository on GitHub) was more of a lesson in retro. It's all command-line control once you have downloaded git,
mkdir robfit
cd robfit
git init
type stuff - not sure whether to be pleased that I still follow this stuff or not - thought there would be some slick modern version where you clickety click and Bob's your uncle, it's all done for you. But no - maybe the advanced version comes with a green screen too ;)
Anyway;
git add BKGFIT
git commit -m "add BKGFIT"
git remote add origin https://github.com/grandwizz/robfit.git
git push origin master
seemed to load the background fitting programme BKGFIT onto the site.
GitHub - looks very, very useful by the way, once you have got up the learning curve!
Check it out.......more to come.