BBC Trust visit to R&D - 3D Demo
Post categories: Production, events, media, section updates, video
Graham Thomas | 15:00 UK time, Monday, 28 December 2009
Post categories: Christmas Lectures, Production, Quentin Cooper, interview, media, project updates, video
Graham Thomas | 09:00 UK time, Saturday, 26 December 2009
Post categories: Audience Experience, Christmas Lectures, Quentin Cooper, interview, media, project updates, video
Alia Sheikh | 14:21 UK time, Thursday, 24 December 2009
Post categories: Automated Production and Media Management, Ingex, Production, events, media, partnerships, video
Ant Miller | 10:00 UK time, Tuesday, 22 December 2009
Post categories: Quentin Cooper, Roundup, facilities, media, video
Matthew Postgate | 12:00 UK time, Monday, 21 December 2009
Post categories: Audience Experience, Automated Production and Media Management, project updates
Alia Sheikh | 12:00 UK time, Thursday, 17 December 2009
Post categories: Quentin Cooper, Roundup, interview, media, video
Matthew Postgate | 09:00 UK time, Wednesday, 16 December 2009
Post categories: Archives, events
Richard Wright | 12:00 UK time, Friday, 11 December 2009
Post categories: Audience Experience, Prototyping, Research Case Study, project updates, section updates
Tristan Ferne | 12:47 UK time, Thursday, 10 December 2009
The R&D Prototyping team, in association with Radio Labs, recently launched a new public prototype called Mooso. Mooso is a game to play while listening to BBC 6 Music. Sign up at https://www.mooso.fm, then listen to 6 Music and tag the songs that are played. If you match other players' tags then you score points. You can read more about it on the BBC Radio Labs blog. In this rather long post we explain some of the development process for Mooso and some of the technology behind it, and we look at the problems and choices we had to make.
The core idea for Mooso came out of an ideas session a couple of years ago. We wanted to build a web application for discovering new music that was inspired by games and fun.
Mooso is a Game With A Purpose, a GWAP. GWAPs are a concept first developed by Luis von Ahn of Carnegie Mellon University and the first mainstream example was the ESP Game, which was subsequently licensed by Google as the Google Image Labeler. The principle behind these games is to create enjoyable and fun experiences for people that also do useful work as a by-product of their design. You can play several examples of GWAPs at gwap.com including image-based, text-based and music-based games.
Typically these games match up a number of players over the internet and ask them to independently describe some aspect of an image, text or music that is provided; if the players' descriptions match, then they get points. Those bits of independently entered and matched data can then be collated and used to aid search and navigation of the things that were described. This concept inspired us and we spent some time thinking about how we could use it for music discovery, but came to a bit of a dead end. We wanted to build a music-based GWAP as a way of gathering tags and metadata, but at the time we didn't have any on-demand music on bbc.co.uk, not even clips, so we couldn't build a completely analogous game to the Image Labeler. What we do have, however, is live radio, and the eureka moment came when we realised we could develop a game of this type which ran synchronously with a radio station, using each song played on the radio as the basis for a round in the game.
So we took that idea and developed it, drafting some possible rules and then playing the game on paper to see if it would work - you just need a few people, a radio, a countdown timer and some paper and pens. When the radio starts playing a song, everyone starts writing down tags while keeping them hidden. After two minutes the round ends and the scoring starts: go round the table in turn, each player reading out their tags; if anyone else wrote down your tag then you and they get a point, and either way you cross it off. When everyone has either matched or not matched all their tags, you total up the scores for that round. Anyway, it was pretty fun to play and the idea looked good. We put together a small team and built a quick Rails prototype over a couple of weeks. This worked OK and we played the game internally a bit, but it wasn't in any way scalable or robust enough to put on the web in public. So the idea was shelved temporarily while other things got developed and time passed.
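The paper version of the scoring can be sketched as a short Ruby function (Ruby being what Mooso itself is built in; the function and data names here are illustrative, not from the Mooso codebase):

```ruby
# Score one paper round: each tag a player wrote scores a point
# if at least one *other* player wrote the same tag.
def score_paper_round(sessions)
  # sessions maps player name => array of tags written this round
  scores = Hash.new(0)
  sessions.each do |player, tags|
    tags.uniq.each do |tag|
      others = sessions.reject { |name, _| name == player }
      scores[player] += 1 if others.any? { |_, t| t.include?(tag) }
    end
  end
  scores
end

round = {
  "alice" => ["rock", "guitar", "90s"],
  "bob"   => ["rock", "britpop"],
  "carol" => ["guitar", "britpop"]
}
score_paper_round(round)
# alice matches "rock" and "guitar", bob matches "rock" and "britpop",
# carol matches "guitar" and "britpop" - so everyone scores 2
```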
Then earlier this year we picked the project up again and turned it into a beta-quality prototype for public release.
Mooso is built in Ruby on Rails, one of the technologies our team uses for prototypes; in this case it provides enough robustness and performance combined with speed of development. The main challenge in building the game is its real-time nature, which is an interesting trend on the web in itself. We decided to use XMPP/Jabber, a standardised Instant Messaging (IM) protocol, to power the game, as it would give us a scalable way to receive real-time submissions from players and send them instant updates as they play. We use the eJabberd Jabber server because it is very stable and we have some experience of using it in the BBC.
As Jabber is an IM protocol, we need some way of integrating it into the game's website. If you're playing on the website then the pop-up Play page (shown above) communicates with our eJabberd server using a Flash bridge (based on the XIFF Flex library), which allows a direct connection to be made over the standard Jabber ports. If you're behind a firewall (e.g. if you're playing at work, which we wouldn't condone obviously) we provide a Javascript fallback (based on the Strophe library) which connects to the eJabberd BOSH server using HTTP binding or long polling over the standard port 80. Around 8 out of 10 people connect using the Flash bridge. Using Jabber also meant that we could let people play over instant messenger (if they have a Jabber or Google Mail account) with very little extra code - see the Mooso IM help for more detail.
The game is run by a number of Ruby scripts which communicate with each other internally over Jabber as well, using the XMPP4R library. One script manages player communication and the start and end of each round. Another script manages timing information and sends out timer messages, like when there are 30 seconds left in the round. A third script listens for new tracks to be played from the 6 Music hard disk playout system. This is what triggers the start of a new round, although not all tracks that 6 Music plays are played out like this. So live sessions, for instance, do not trigger a new round. And another script handles scoring, which is the first point at which we have any communication between the game and the database - deliberately done in order to de-couple the real-time gameplay from the website serving. Once scoring is done the database and site are updated to reflect the players’ new scores and the round which was just played. This architecture means the game mechanics are modular and should Mooso become very popular we can move processor-intensive functions like scoring onto dedicated hardware.
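As a rough illustration of that decoupling, here is a minimal, hypothetical round object in plain Ruby: tag submissions during a round touch only memory, and nothing reaches the database until the round is closed and handed off to scoring. (In the real game this happens across separate scripts communicating over Jabber; this single class is just a sketch.)

```ruby
# Illustrative round lifecycle: the playout listener would create a
# Round when a track starts, players submit tags in real time (memory
# only), and scoring - the first database touch - runs after close!.
class Round
  attr_reader :track, :submissions

  def initialize(track)
    @track = track
    @submissions = Hash.new { |h, k| h[k] = [] }
    @open = true
  end

  def submit(player, tag)
    @submissions[player] << tag if @open   # real-time path: memory only
  end

  def close!
    @open = false                          # after this, hand off to scoring
  end
end

round = Round.new("Blur - Song 2")
round.submit("alice", "britpop")
round.close!
round.submit("alice", "too late")          # ignored: round already closed
round.submissions["alice"]                 # => ["britpop"]
```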
Fairly obviously, the game requires several people to be playing simultaneously for it to be any fun, which could be a problem. We try to remedy this by generating "ghost" sessions when there are few players online. We can only do this when a round involves a song that has been played in a previous round. At the end of the round, before scoring, we select several previous sessions for that song where available (a session being the set of tags one player entered during a round) and add them to the pool of real players' sessions to be matched and scored. There's a caveat that these ghost sessions can't be from any of the real players in that round, so you can't end up playing yourself! A player also doesn't get any post-hoc points if they happen to be one of the ghost sessions, as that becomes too complex; only the current "real" players in a round score any points.
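The ghost-session selection could look something like this in Ruby (a hypothetical sketch; the real selection logic may differ):

```ruby
# Pick ghost sessions for a song: previous sessions for the same song,
# excluding anyone actually playing in the current round.
def ghost_sessions(previous, current_players, limit = 3)
  previous.reject { |s| current_players.include?(s[:player]) }
          .sample(limit)
end

history = [
  { player: "dave",  tags: ["grunge"] },
  { player: "alice", tags: ["rock"] },
  { player: "erin",  tags: ["seattle"] }
]
ghost_sessions(history, ["alice", "bob"], 2)
# at most 2 ghosts, never from alice or bob - so alice can't match herself
```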
The rules we currently have in place mean that you get a single point if you enter any tags at all in a round, 5 points for every tag you match with other players and 10 points for every artist/band you match. We determine whether a tag is an artist by matching it against our MusicBrainz-powered database of artists. Only by people playing the game will we discover if these rules work, and we imagine they may need to evolve over time to fine-tune the game for maximum enjoyment and usefulness. The tag matching is done with a varying threshold: below a certain number of players you only have to match one other person, but as the number of simultaneous players increases, that threshold increases, so you might have to match 2, 3 or more players if there are many people playing.
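Putting those rules together as a Ruby sketch - only the 1/5/10 point values come from the rules above; the threshold schedule shown here is invented purely for illustration:

```ruby
# Illustrative threshold: match 1 other player in small games, more
# as the number of simultaneous players grows (made-up scaling).
def match_threshold(player_count)
  player_count < 10 ? 1 : 1 + player_count / 10
end

# 1 point for entering any tags, 5 per matched tag, 10 per matched
# artist (artist names come from a MusicBrainz-style lookup).
def score_session(my_tags, other_sessions, artists)
  return 0 if my_tags.empty?
  threshold = match_threshold(other_sessions.size + 1)
  points = 1                                       # participation point
  my_tags.uniq.each do |tag|
    matches = other_sessions.count { |tags| tags.include?(tag) }
    next if matches < threshold
    points += artists.include?(tag.downcase) ? 10 : 5
  end
  points
end

artists = ["blur", "oasis"]
others  = [["britpop", "blur"], ["blur"], ["indie"]]
score_session(["blur", "britpop", "obscure"], others, artists)
# => 16  (1 participation + 10 for the artist "blur" + 5 for "britpop")
```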
This brings up some interesting questions around what kind of tags we will get from the game. By requiring players to match tags we are, hopefully, validating that those tags are in some way relevant or descriptive to the song. And we also want to avoid people spamming or abusing the game, hence the varying threshold. But by setting this threshold, are we penalising those players who have specialist knowledge and enter obscure but valuable tags? Are we just going to get the lowest-common-denominator tags? We could start banning tags which have been entered too often (e.g. rock and indie) and we could also consider awarding points based on the entropy of each tag - i.e. you get more points for tags that are rarer and potentially more interesting.
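That entropy idea could be sketched like this (entirely hypothetical - Mooso doesn't do this): score a tag by its self-information, so the rarer the tag across all rounds, the more points it earns.

```ruby
# Hypothetical entropy-weighted scoring: points grow with a tag's
# rarity, using its self-information (-log2 of its observed frequency).
def entropy_points(tag, tag_counts, base = 5)
  total = tag_counts.values.sum.to_f
  freq  = tag_counts.fetch(tag, 1) / total   # unseen tags treated as count 1
  (base * Math.log2(1 / freq)).round
end

counts = { "rock" => 500, "indie" => 300, "shoegaze" => 4 }
entropy_points("rock", counts)       # very common tag: few points
entropy_points("shoegaze", counts)   # rare tag: many more points
```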
It’s also worth thinking about whether this kind of tagging is different to other music sites that use tagging. I guess the motivations for people tagging music on places like last.fm are partly for their own use (which is why you often see the tag “seenlive”) and partly for the community. Mooso is potentially different as it has a particular audience (the 6 Music listeners), the players aren’t entering tags directly for their benefit, and they don’t even necessarily like the song that is being played. But they are, hopefully, having fun. Will these motivations make a difference to the qualities of the tags? We don't know but we hope to investigate.
The tagging data that we capture from the game is used to create links inside the Mooso site: matching tags and artists link tag pages and artist pages together, creating an interconnecting web of music, songs, tags and bands. Internally we are actually storing all the entered tags, whether they match or not, and we hope to analyse these in more depth at some point. We are also planning to release the dataset of matching tags and artists under the BBC Backstage licence, which will allow other people and organisations to benefit from this data.
Post categories: Quentin Cooper, Roundup, interview, video
Matthew Postgate | 12:00 UK time, Wednesday, 9 December 2009
Post categories: events, interview, media, podcast
Ant Miller | 17:00 UK time, Tuesday, 8 December 2009
There have been a few events and developments in the last week that warrant a bit of a note in this blog, but we haven't had the time to pull together their own posts. So by way of a catch-up, here are the highlights of this week and a bit:
We're Hiring!
R&D is currently in the process of recruiting graduate or equivalently experienced engineers for our graduate training programme. Details are available on our careers page, and applications will be accepted up to the 18th of January. It is a cracking opportunity to forge a career in the North West or London area right at the heart of media technology. Competition is strong, and you may be surprised at the backgrounds of previous successful candidates: we need skills in a wide range of disciplines, and a blend of rigorous analytical capabilities along with innovative creativity. Engineering is not the only fruit!
Archive Workshop
On Monday the 30th of November, Kingswood Warren hosted an invite-only Archive Workshop, with presentations from academia and industry. To name but a few of the contributors: Paul McConkey of Cambridge Imaging (they worked on the ITN/Pathe preservation project), Andrew Zisserman of the Visual Geometry Group at Oxford University, Simon Factor of Moving Media, and Daniel Teruggi of INA. We hope to get a more detailed post from the organisers shortly.
Post categories: Prototyping, section updates
George Wright | 09:10 UK time, Friday, 4 December 2009
Post categories: distribution, project updates, section updates
Andrew Murphy | 18:00 UK time, Thursday, 3 December 2009
Hello. I'm Andrew Murphy, a Senior Research Engineer in the "Distribution Core Technologies" section here at BBC R&D. Our section carries out research into the technologies that underpin the distribution of the BBC's TV, radio and interactive services both now and in the future.
Over this and subsequent posts I'll aim to tell you more about the work of the various teams in the section in areas such as high definition radio cameras, video coding, white spaces and the future of radio. I'm going to concentrate on DVB-T2 now, however, as yesterday marked the technical launch of Freeview HD which uses DVB-T2 as its physical layer.
You may have seen from Ant's post that our section's work on DVB-T2 was recognised by an RTS Innovation Award. As part of the T2 team, I was lucky enough to be at the awards ceremony and thought I'd give some background into the work that was recognised by the award.
DVB-T2 is a new transmission standard (so you will need a new T2-compatible receiver to decode it) that typically gives 50% more capacity than DVB-T, the current digital terrestrial television standard used in the UK.
The timescale for the development of DVB-T2 has been incredibly tight. In just three years it's gone from an initial study mission and call for technologies through to a published system specification and the manufacture of silicon and imminent launch of consumer set top boxes. This process has been backed up by countless simulations, verification work, testing and detailed field trials.
That short summary of course doesn't do justice to the huge amount of effort put in by us and numerous other companies from all around the world who have worked to make DVB-T2 a reality.
Our section at R&D is split into two teams overseen by section lead, Nick Wells, who is also chair of the T2 group within the DVB Technical Module.
The hardware development team is led by Justin Mitchell. Justin posted last year about the development of the world's first end-to-end DVB-T2 modulator/demodulator chain. Justin and his team are focussing on adding features to our modulator and demodulator both of which have been licensed to manufacturers as a way of encouraging the availability of T2 equipment and getting value back from work we would have needed to carry out anyway.
The T2 specification team (of which I'm a member) is led by Chris Nokes. This covers our contributions to the DVB-T2 working group, inputs to the T2-related specifications, T2 field trials and work within the Digital Television Group (DTG). Our team is also working closely with manufacturers to provide feedback on the performance of their DVB-T2 receivers which we test in our labs.
Personally, I've been helping to develop realistic test streams for T2 Multiple Physical Layer Pipes (PLPs). Multiple PLPs are an advanced feature of DVB-T2 that enable service-specific robustness. So, for example, a single T2 transmission could contain a mixture of high definition services aimed at household TVs fed by roof-top aerials as well as some low-bit rate, more rugged, services aimed at portable receivers.
There are no plans to use multiple PLPs in the UK at present.
Alongside this I've also been chairing a sub-group within DVB-T2 to verify the T2 Modulator Interface standard (T2-MI). Whereas DVB-T transmissions are completely defined by the MPEG transport stream that they carry, the flexibility of T2 and the inclusion of multiple PLPs in the standard means that a transport stream is not necessarily an unambiguous description of the on-air T2 signal. Why does this matter? Well, for synchronised Single Frequency Network (SFN) operation, every transmitter must output the same signal at a precisely defined time instant. The T2-MI allows this to happen by telling the modulator exactly what to transmit when. The interoperability of T2-MI was successfully demonstrated at a plug-fest at the end of October where the first T2 SFN on a wire was created using equipment from different manufacturers fed by a single source of T2-MI.
The RTS Innovation Award recognised the hard work of everyone in my section working on DVB-T2 but there are many more people in other sections at R&D and around the BBC and in industry working hard to make sure that the launch of Freeview HD goes as planned and that the new transmissions integrate correctly with the existing DVB-T transmissions.
Watch this space...
Post categories: section updates
Richard Wright | 18:00 UK time, Wednesday, 2 December 2009