We’ve got a nice project that went live today - we’ve been providing the final performance analysis for the £17m government-funded Retrofit for the Future programme, which has been running since 2009. The programme retrofitted 120 homes in the UK with multiple energy-saving and energy-efficiency technologies, and collected data through over 1300 sensors and meters taking readings every 5 minutes. The aim of the programme was to inform the planning and development of retrofit and new-build homes to help the UK meet its energy goals. The final results were launched at Ecobuild today, and they’re also up on this website.
More recent and upcoming talks from Team Mastodon:
Prescribing Analytics got an awful lot of positive press coverage over the last few weeks.
We’ve finally published the prescribing analytics portal here.
Don’t tell anyone yet, but we’ve got a big project going live soon which aims to help the NHS save an awful lot of money. It turns out that by altering GPs’ prescribing behaviour for a few drugs - swapping proprietary forms for generic ones where appropriate - it’s possible to save hundreds of millions of pounds a year. Working with the doctors at Open Healthcare UK, we’ve clarified how prescriptions can and should safely be changed, done the detailed financial analysis, and created maps and rankings of exactly which GPs are spending what.
From the Bethnal Green Ventures Demo Day - my 5-minute version of what we do, told mostly through the medium of Muppets.
(if you now have an earworm, congratulations, we are of the same generation)
A friend asked me this week what the difference is between using Hadoop and its related ecosystem for data storage and analysis, and using a traditional Data Warehouse.
Hive is a SQL-like interface onto MapReduce. It feels nice and familiar to analysts who are used to thinking in a SQL paradigm, but it has some nasty gotchas that can make jobs verrrrrry slow or make them fail altogether. Either way, you waste a lot of time, blood pressure, and machine hours.
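As a sketch of the kind of gotcha meant here (table and column names are hypothetical): in classic Hive, a global ORDER BY funnels every row through a single reducer, which can turn an otherwise parallel job into a crawl on a big table. When a total ordering isn’t actually needed, SORT BY (which orders within each reducer) keeps the job parallel:

```sql
-- Hypothetical table of sensor readings.
-- ORDER BY forces all output through ONE reducer - fine on small
-- results, painfully slow on billions of rows:
SELECT sensor_id, reading
FROM sensor_readings
ORDER BY reading DESC;

-- SORT BY orders rows within each reducer only, so the job stays
-- parallel; DISTRIBUTE BY controls which rows land on which reducer:
SELECT sensor_id, reading
FROM sensor_readings
DISTRIBUTE BY sensor_id
SORT BY reading DESC;
```

The second query doesn’t give a single globally sorted output file, but for per-sensor analysis that’s usually what you wanted anyway.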
We added Rackspace calculations to our footprint data this week, as we’re keen to start running our Hadoop clusters over more infrastructure (in the greenest possible way) and we’re sure you are too.