BBC R&D at IBC 2017

IP production, object-based audio, atomised media, speech to text, Virtual and Augmented Reality all on our stand at this year's show.

Published: 14 August 2017
  • Scott Cawley, Technology Engagement Advisor

BBC Research and Development returns to IBC to show our IP production, object-based audio, atomised media, speech to text and Virtual Reality and Augmented Reality work on our own stand. Plus we'll be presenting papers in conference sessions on subjects including voice UI and High Dynamic Range, and our work will feature on several of our partners' stands too.

 

IP Production in the Cloud / Protocols for Creating Object-Based Media at Scale

BBC R&D is extending its IP Studio technology into the cloud, moving production tools into the web browser and using the public internet to transport content. The core of our system is powered by an implementation of AMWA’s Networked Media Open Specifications (NMOS).

This year at IBC we are demonstrating a flexible production system which scales out as required. It enables multiple users to produce their own live multi-camera shoot ‘on demand’ using nothing but commodity tools and the internet – perfect for rapidly breaking news stories! Each user receives their own set of browser-based configuration and production interfaces, whilst the cloud takes care of handling content storage and high-quality rendering.
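As an illustration of the discovery step this kind of system relies on, here is a minimal, hypothetical sketch of the sort of resource a node might register with an NMOS IS-04 registry. The field names follow the IS-04 schemas, but the IDs, labels and URLs are invented for illustration:

```python
import json
import uuid

def make_node_resource(label, href):
    """Build a minimal IS-04-style node resource for registration.

    Field names follow the NMOS IS-04 schemas; the values here are
    illustrative only.
    """
    return {
        "id": str(uuid.uuid4()),
        "version": "1505471820:000000000",  # <seconds>:<nanoseconds> TAI timestamp
        "label": label,
        "href": href,   # where this node's Node API can be reached
        "caps": {},
        "services": [],
    }

node = make_node_resource("cloud-render-node", "http://203.0.113.10:1080/")

# A registration would POST a wrapper like this to the registry's
# Registration API; the exact path is version-dependent.
payload = {"type": "node", "data": node}
print(json.dumps(payload, indent=2))
```

In a running system the registry would then answer Query API requests from browser-based control interfaces, so each user's production tools can find the senders and receivers available to them.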

Atomised News: Piecing Together the News for Young People

Newsbeat Explains is a mobile-first prototype aimed at younger audiences whom broadcasters typically find harder to reach. The prototype is part of our research on Atomised News - breaking down key aspects of a story into smaller, reusable, self-contained pieces of information loosely linked together by rich metadata. Atomising content will enable non-linear and contextual content experiences.

 

The pilot was built using existing news production systems, with stories written by BBC journalists. It proved such a success that the atomised format was used in the BBC's recent UK General Election coverage.

Our second prototype looks at ways we can create atomised-style videos in multiple lengths from existing content sources and articles not written with atomisation in mind. Our concept takes headlines, dates, standfirsts, summaries, article images and other sources of structured data and reassembles them to make adaptable video summaries so they fit a user’s preference, context or situation.
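As a toy illustration of the reassembly idea, a summary of a chosen length can be built by picking atoms in priority order until the time budget is spent. The atom structure, durations and priorities below are invented for the sketch, not the BBC's actual format:

```python
def assemble_summary(atoms, target_seconds):
    """Pick atoms in priority order until the target duration is filled."""
    chosen, total = [], 0.0
    for atom in sorted(atoms, key=lambda a: a["priority"]):
        if total + atom["duration"] <= target_seconds:
            chosen.append(atom)
            total += atom["duration"]
    return chosen, total

# Hypothetical atoms derived from an article's structured data.
atoms = [
    {"kind": "headline",   "duration": 3.0, "priority": 0},
    {"kind": "image",      "duration": 4.0, "priority": 1},
    {"kind": "standfirst", "duration": 5.0, "priority": 2},
    {"kind": "summary",    "duration": 8.0, "priority": 3},
]

short, t = assemble_summary(atoms, 10)  # a 10-second cut
print([a["kind"] for a in short], t)    # → ['headline', 'image'] 7.0
```

A longer time budget would simply admit more atoms, which is the sense in which the same source material can yield videos in multiple lengths.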

Reality Labs: New Immersive Experiences for Audiences

 

BBC R&D is continuing the successful work of the last two years, looking into Virtual Reality, Augmented Reality and other emerging technologies in the immersive field.

As part of this we recently launched an interactive 360° audio and video application, BBC Taster VR. It allows production teams to use light interactivity in the form of hotspots, plus truly immersive audio, to tell stories that are impossible with linear 360° video.

We’re also investigating WebVR as a creation and delivery technology – we’re interested in how it can increase the audience for our VR work and the democratisation of the VR landscape, as well as building on previous BBC R&D work in immersive audio through Web Audio standards.

We are currently working on an Augmented Reality project to bring heritage and cultural artefacts into our audience’s homes and we'll be demonstrating cutting-edge AR at IBC 2017 too.

Speech to Text: Cheaper, better transcription built with our own content for the BBC's problems

Speech-to-text has recently moved from the lab to the newsroom as a production-ready technology for journalists inside the BBC, and we will be showcasing some of the many uses it can be put to in the broadcasting environment. These include helping journalists rapidly find footage in large archives, quickly subtitle ‘water cooler’ content so it can be shared more widely on social media, and quickly transcribe interviews.
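A minimal sketch of the subtitling step, assuming the recogniser returns per-word timings. The tuple format and grouping rule here are invented for illustration, not the BBC's pipeline:

```python
def srt_timestamp(seconds):
    """Format seconds as an SRT timestamp, HH:MM:SS,mmm."""
    ms = int(round(seconds * 1000))
    h, rem = divmod(ms, 3_600_000)
    m, rem = divmod(rem, 60_000)
    s, ms = divmod(rem, 1000)
    return f"{h:02d}:{m:02d}:{s:02d},{ms:03d}"

def words_to_srt(words, max_cue_seconds=4.0):
    """Group (word, start, end) tuples into SRT cues of bounded length."""
    cues, current = [], []
    for word, start, end in words:
        if current and end - current[0][1] > max_cue_seconds:
            cues.append(current)
            current = []
        current.append((word, start, end))
    if current:
        cues.append(current)
    lines = []
    for i, cue in enumerate(cues, 1):
        text = " ".join(w for w, _, _ in cue)
        lines += [str(i),
                  f"{srt_timestamp(cue[0][1])} --> {srt_timestamp(cue[-1][2])}",
                  text, ""]
    return "\n".join(lines)

words = [("Good", 0.0, 0.3), ("evening", 0.3, 0.8), ("and", 2.0, 2.2),
         ("welcome", 2.2, 2.8), ("to", 5.0, 5.1), ("the", 5.1, 5.2),
         ("news", 5.2, 5.6)]
srt = words_to_srt(words)
print(srt)
```

In practice a production tool would also let the journalist correct recognition errors before the subtitles are published, which is where most of the time saving over manual transcription comes from.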

We will also be able to explain how broadcasters with large subtitle archives can achieve similar results, and the work the BBC has done to help steer academic research in this area towards the sort of real-world problems that broadcasters face.

Object-Based Audio: The ORPHEUS Collaborative Project

ORPHEUS is a European collaborative project that aims to create an end-to-end object-based audio broadcast system. Using the BBC’s IP Studio platform, we are developing the tools, formats and protocols required to produce and deliver object-based content. This work will inform the design of future broadcast systems and our work on emerging standards.

On show at IBC will be The Mermaid’s Tears, our interactive audio drama where you can follow any one of three characters through the story and which we produced using our system. It was broadcast live, making it the first live interactive object-based broadcast. To produce this experience we developed a novel audio control interface to generate the ADM metadata that describes the mix for each character. Visit our stand to experience the drama and learn more about how we created it.
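For a rough flavour of the metadata involved, here is a heavily trimmed, illustrative sketch that generates one ADM-style audioObject per followable character. The element names follow ITU-R BS.2076, but a real ADM document needs pack and channel format references (and much more) that are omitted here, and the gain element is an assumption made purely for illustration:

```python
import xml.etree.ElementTree as ET

def character_object(obj_id, name, gain_db):
    """Build a trimmed ADM-style audioObject element (illustrative only)."""
    obj = ET.Element("audioObject",
                     audioObjectID=obj_id, audioObjectName=name)
    gain = ET.SubElement(obj, "gain")  # element assumed for illustration
    gain.text = f"{gain_db:.1f}"
    return obj

# One audio object per followable character (generic names; the drama's
# real character metadata is not reproduced here).
fmt = ET.Element("audioFormatExtended")
for i, name in enumerate(["Character A", "Character B", "Character C"], 1):
    fmt.append(character_object(f"AO_100{i}", name, 0.0))

xml_text = ET.tostring(fmt, encoding="unicode")
print(xml_text)
```

The production interface described above would then adjust this kind of metadata live, so that the renderer on each listener's device can mix the scene for whichever character they are following.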

BBC Research & Development Contributions on Partner Stands:

IP Showcase (Conference Room E.106)

We have been working with the Advanced Media Workflow Association (AMWA) to produce a new API specification for use in IP broadcast facilities. NMOS IS-05 Connection Management facilitates the creation of connections between pieces of IP broadcast equipment, and delivers the functionality which was provided by broadcast routers in traditional installations. The specification is compatible with AMWA’s NMOS IS-04 Discovery and Registration specification, which allows the automatic discovery of broadcast devices.

IS-05 is an open specification, intended to ensure interoperability between broadcast equipment vendors and to simplify system integration. A range of broadcast manufacturers have participated in workshops around the specification over the past year, to ensure it meets the needs of the market. Over 20 companies have implemented the specification so far, and many of those will come together for a demonstration of the specification as part of the IP Showcase at IBC.
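To give a flavour of the API, a controller might stage a connection by PATCHing JSON like the following to a receiver's staged endpoint. The field names follow the IS-05 schemas, but the sender ID and addresses below are invented for illustration:

```python
import json

def stage_connection(sender_id, multicast_ip, port):
    """Build an IS-05-style staged-connection body for a receiver.

    Field names follow the IS-05 schemas; values are hypothetical.
    """
    return {
        "sender_id": sender_id,
        "master_enable": True,
        # Take effect as soon as the PATCH is accepted.
        "activation": {"mode": "activate_immediate"},
        "transport_params": [
            {"multicast_ip": multicast_ip, "destination_port": port}
        ],
    }

body = stage_connection("6cbd0441-1111-4567-8901-aabbccddeeff",
                        "239.0.0.1", 5004)
print(json.dumps(body, indent=2))
```

The staged/activate split is the point of the design: a control system can prepare many connections and then activate them together, which is how IS-05 replicates the salvo-style switching that hardware routers provided.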

HbbTV2 Media Synchronisation and Companion Screen (Stand 14.A20 - Opera TV)

BBC R&D is showing what HbbTV 2 media synchronisation and companion screen features make possible. Local programmes or personalised trailers, delivered over IP, can be inserted into broadcast viewing. Apps on phones and tablets can become aware of what you are watching on your TV and provide personalised synchronised content, extra information about programmes, or help you find the next programme to watch.

HbbTV 2 technology is being adopted in the UK via the Freeview Play platform, and the BBC is committed to moving its TV services (such as iPlayer and connected red button) to HbbTV 2. BBC R&D has been a leading contributor to the standardisation and promotion of these technologies.

ORPHEUS - Object-Based Audio

The ORPHEUS project will have a major presence at IBC 2017; it will be strongly represented both in the conference and in the exhibition. As well as on the BBC R&D stand (details above), ORPHEUS will be present on the following stands:

Within the conference, ORPHEUS will present the paper The EU Project ORPHEUS: Object-Based Broadcasting for Next Generation Audio Experiences. In addition, more papers will be presented by ORPHEUS partners.

High-Dynamic Range (Stand 10.F20 - EBU)

BBC R&D will be using clips from the BBC’s award-winning Planet Earth II series to demonstrate the set of video conversions required in a complete high dynamic range television ecosystem. The conversions will allow video sources in HLG HDR, PQ HDR and SDR (standard dynamic range) formats to be used within the same production or television schedule.
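Conversions between these formats hinge on their transfer functions; as one concrete example, the HLG opto-electrical transfer function from ITU-R BT.2100 can be sketched as:

```python
import math

# HLG opto-electrical transfer function (OETF) per ITU-R BT.2100:
# normalised scene light E in [0, 1] -> non-linear signal E' in [0, 1].
A = 0.17883277
B = 0.28466892  # = 1 - 4*A
C = 0.55991073  # = 0.5 - A*ln(4*A)

def hlg_oetf(e):
    """Map normalised scene light to an HLG-coded signal value."""
    if e <= 1.0 / 12.0:
        return math.sqrt(3.0 * e)   # square-root segment, SDR-compatible
    return A * math.log(12.0 * e - B) + C  # logarithmic segment for highlights

# E = 1/12 maps to exactly 0.5; E = 1.0 maps to (almost exactly) 1.0.
print(hlg_oetf(1.0 / 12.0), hlg_oetf(1.0))
```

The square-root segment is what gives HLG its backwards compatibility with SDR displays, while the logarithmic segment extends the range for highlights; format conversion chains such as those on show apply this curve, its inverse, and the corresponding PQ and SDR curves in the appropriate order.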

We will also be displaying a real-time video brightness meter that can be used to maintain brightness consistency between HDR programmes.

COGNITUS (In the Ruby Lounge and on Stand 7.G23 - VITEC) COGNITUS is a research collaboration funded by the European Commission’s Horizon 2020 programme, aimed at enabling the convergence of broadcast and user generated content (UGC) for future interactive UHD services. COGNITUS specialises in enhancing UGC for broadcast productions, as well as tools to facilitate content authoring and efficient adaptive delivery for future media services. Having passed the halfway mark of the three-year project, at IBC 2017 we will present the key components of the COGNITUS workflow and the latest updates to the content processing components, including quality assessment, video enhancement, metadata enrichment and highly efficient adaptive delivery.

COGNITUS will be presented in two forms: a Multimedia Presentation in the Ruby Lounge, and a series of interactive presentation sessions twice daily at the VITEC stand (7.G23). The presentations will feature the details and results of the COGNITUS Edinburgh festival trial from August 2017.

BBC Research & Development Technical Papers and Presentations:

BBC R&D colleagues will be presenting technical papers based on their work all week at IBC. This is a great opportunity to get details on the work our teams do day to day - you can find all these sessions in the Emerald Room.

Talking with Machines: Prototyping Voice Interfaces for Media In the Paper Session: Intelligent Media Interfaces - Listening, Watching and Feeling You!

Sophisticated media interfaces have featured at IBC for many years, but interest in them has been heightened by the practicality of deep-learning algorithms and the publicity associated with voice interfaces such as Alexa. In this enlightening and entertaining session we shall explore the latest research across a range of media interface technologies.

Our leading experts will examine such questions as: How are voice interfaces designed and how are they actually used in the home? How can new combinations of AI and eye-tracking provide exciting interaction with AI worlds? What does a new survey tell us about the way users consume video and what implications will this have for interface design? Finally, a new technology which can recognise you in two seconds from the way you handle your remote control! Come along and be amazed at what today's latest interfaces can learn about you! With Henry Cooke, Thursday 14th September, 11:00-12:30

An architecture for cloud-based IP video production for non-live content In the Paper Session: IP Connectivity in Media Production

IP heralds the most fundamental advance in production since media went digital; it will revolutionise the art, the technology and the workflow of media creation. In this exciting and informative session we shall explore how IP interconnectivity and software IP implementations are contributing to the revolution. We shall discover: the challenges involved in creating a software version of SMPTE 2110, the standard which carries uncompressed media and metadata in IP; how to capitalise on IP's asynchronicity; variable bandwidth allocation in live multiservice links; and the benefits which IP can bring to object-based production. Come along and discover the power and the potential of IP. With Chris Northwood, Friday 15th September, 08:30-10:00

Experiments in Immersion: Lessons learnt from migrants to monsters In the Paper Session: VR and AR - the Production, the Potential and the Pitfalls

The boom in 360° media continues to strengthen, as every day the media industry collectively expands its knowledge of: production grammar, psychology, capture and processing technologies, user requirements and potential markets. From a large offering, we have chosen a selection of new and innovative presentations which demonstrate the diversity of this exciting topic. We're beginning with a fascinating look at lessons that have been learnt so far by a major broadcaster with a wide experience of immersive production. Then a glimpse of the exciting potential of live-action VR captured with camera arrays. What about combining VR and social media - the concept of being in a virtual room with your friends? And not forgetting potential adverse effects - can machine learning help us to recognise material which will induce sickness in users? Join us for what will be a remarkable session. With Oliver Spall, Friday 15th September, 11:30-13:00

A scalable IP-based distribution system In the Paper Session: Solving OTT's Achilles’ Heel - Scale

Whilst broadcasters' OTT "catch-up" services are well established, linear and particularly live TV presents a significant and costly audience-scaling challenge. In this session, we will hear from service providers with first-hand experience of tackling this issue, as well as potential technical solutions to this and the future challenges OTT delivery will face from immersive media. With Richard Bradbury, Friday 15th September, 15:30-17:30

Moving object-based media from one-off examples to scalable production workflows In the Paper Session: Object and Data-Based Media

Interest in object-based media - the idea that a service can be conveyed as a segmented collection of objects and reconstructed intelligently into a version which meets the specific requirements of the viewer - has been growing for several years. Object-based services offer unique flexibility, but at the cost of increased production effort and decoder processing. In this session we shall look at the very latest ideas, technologies and productions, notably those arising from a European collaborative research initiative. Discover the challenges faced in designing a pilot object-based radio service which delivers immersive and personalised experiences. Learn how new tools permit the production of object-based drama, cookery and interactive content. We shall also look at the use of factual data in programme content. In these days of fake news, how can we ensure that numerical facts are rigorously checked? Join us to explore the latest thinking. With Chris Northwood, Saturday 16th September, 10:15-11:45

A Brightness Measure for High Dynamic Range Television In the Paper Session: More Life-like Images with High Dynamic Range and Wide Colour Gamut

HDR and WCG are very exciting developments which are having far-reaching implications for production practice, delivery chains and domestic viewing. Their imagery matches human vision more closely and can produce a startlingly real-world rendition. In this fascinating session, the world's leading researchers in these technologies will explore the latest developments, for example: What colour tolerances need to be achieved for HDR and WCG displays? What are the acceptable limits to brightness changes on an HDR display during a TV production? Do HDR and WCG enhance the visibility of processing artefacts? How might tone mapping influence the bit-rate of coded video? Join us to discover all that is new in this fast-moving field. With Katy Noland, Sunday 17th September, 12:00-13:45

Speech to text for broadcasters - from research to implementation - better output for viewers In the Paper Session: Advances in Audio Production

Have you heard? Audio is a cornerstone technology of our industry, and it continues to impress with its innovation. This session explores the diversity of that innovation, covering: sound source extraction and localisation for use in Next Generation Audio; the implementation and effectiveness of large-scale Automated Speech Recognition; and the monitoring challenge - speaker versus headphone. With Andrew McParland and Alex Norton, Monday 18th September, 08:30-10:00

BBC R&D Presenting Elsewhere at IBC:

NMOS in the Real World

As broadcasters plan their next generation of facilities using IP, the "auto-provisioning" part of the JT-NM roadmap becomes more essential. Using examples from real projects such as BBC Cardiff, Peter describes how the AMWA NMOS family of specifications (IS-04 Discovery and Registration and the forthcoming IS-05 Connection Management) can make this happen in practice, and can pave the way for later "dematerialisation". In the IP Showcase (Conference Room E.106) with Peter Brightwell, Friday 15th September at 15:50-16:10 and Saturday 16th September 15:20-15:40

EBU Lightning Talk: Open Source Meet Up

BBC R&D's Simon Rankine will be talking about our newly open-sourced IS-05 reference implementation and development tools, which are an important part of our work to drive the adoption of the new NMOS IS-05 specification in industry. On the EBU stand (10.F20) with Simon Rankine, Saturday 16th September 16:00-17:00

Climbing the Beanstalk: IP Studio and NMOS in the Cloud

An overview of how BBC R&D is using NMOS and its own IP Studio work to develop a cloud-hosted production system. This talk will also explore what a 'dematerialised', cloud-fit production system might look like, and some tools associated with cloud deployment. In the IP Showcase (Conference Room E.106) with Alex Rawcliffe, Saturday 16th September 11:00-11:20 and Monday 18th September 10:20-10:40

Inventing The Future - Decoding The Unknown

It is predicted that over the next 20 years technology innovation will accelerate radically and the world will change faster than it has in the past 300 years. What are the latest developments in cutting-edge research on technologies and applications around the world that will have the biggest impact on humanity? We asked the experts to envision what mega-shifts are likely to happen in the near and distant future. The speakers will also zoom in on media and entertainment and share insights into transformative technologies that are reshaping the industry. What technical and creative solutions already offer new ways to turn bold ideas into practical applications? What is hype and what will really matter in the fast-changing landscape? What is next for tech? In the Conference Forum with George Wright, Sunday 17th September, 15:00-17:00

Hybrid Log-Gamma - Latest Update

The latest on our Hybrid Log-Gamma approach to High Dynamic Range content from our Principal Technologist Andrew Cotton. In the SES Balcony Suite with Andrew Cotton, Monday 18th September, 10:30-11:15
