Welcome to weeknotes from the IRFS team in BBC R&D, where this week we prepare for the Edinburgh festivals, welcome journalists back to our Atomised News project after their Brexit-tinged hiatus, and test 360 videos in the UCL "CAVE".
360 Video User Testing
Andrew and Maxine from the North Lab ran some more 360 video user testing.
Over two days, 16 participants were shown the same 360 film on two different viewing systems: a head-mounted display and UCL’s CAVE (an immersive four-wall projection system). The findings from this study and previous lab testing are now being written up into a white paper.
Talking With Machines
We started development of the Voice Interface Radio skill this sprint!
Ant has set us up on AWS, and Henry has started writing the skill and some associated tooling in Node.js. Andrew has been drawing up Voice User Interface (VUI) flows for the first few versions of the skill on our roadmap, and he and Thomas have been doing some desk research. We’re holding back on beginning the exploratory stream of the project proper until the other two projects are ready to begin their activity, in the interests of syncing up.
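Voice skills of this kind typically route each incoming request to a handler based on the recognised intent. As an illustration only — the intent names, slot names, and response shape below are hypothetical and not from the actual skill — a minimal intent-dispatch sketch might look like this (shown in Python for consistency with the other examples here, though the skill itself is being written in Node.js):

```python
# Minimal sketch of intent dispatch for an Alexa-style radio skill.
# All intent, slot and station names are invented for illustration.

def speech_response(text):
    # Shape loosely follows the Alexa Skills Kit response format.
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": text},
            "shouldEndSession": True,
        },
    }

def play_station(event):
    slots = event["request"]["intent"].get("slots", {})
    station = slots.get("Station", {}).get("value", "our default station")
    return speech_response(f"Playing {station}.")

def stop_playback(event):
    return speech_response("Stopping playback.")

def unknown_intent(event):
    return speech_response("Sorry, I didn't catch that.")

def handle_request(event):
    """Route an incoming voice request to a handler by intent name."""
    intent = event.get("request", {}).get("intent", {}).get("name")
    handlers = {
        "PlayStationIntent": play_station,
        "StopIntent": stop_playback,
    }
    return handlers.get(intent, unknown_intent)(event)
```

The dispatch-table pattern keeps each VUI flow in its own handler, which makes it easier to grow the skill version by version, as the roadmap describes.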
Editorial Algorithms
We started preparing our collaboration with the Edinburgh/Live digital team, using our technology to help them follow the coverage of the festivals from multiple online media sources. We've been creating some specific searches for them in our system, in preparation for the events. We will carry out research on-site to see how it's used, how we fit in the workflow, and what improvements we can make.
Meanwhile, we've made progress on our "smart topics" training system: its backend is now more stable, supports auto-updating of classifiers, and automatically builds and deploys to AWS when new code is pushed. We also plan to have 'explainer' pages for each classifier.
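The core idea behind a trainable, auto-updating topic classifier is that editors can add labelled examples over time and the model absorbs them incrementally. The real system's implementation isn't described here, so as a stand-in, here is a tiny incremental Naive Bayes sketch:

```python
# Illustrative stand-in for a "smart topics" classifier: a tiny
# multinomial Naive Bayes that can be updated with new labelled
# examples without a full rebuild. Not the real system's code.
from collections import Counter, defaultdict
import math

class TopicClassifier:
    def __init__(self):
        self.word_counts = defaultdict(Counter)  # topic -> word frequencies
        self.doc_counts = Counter()              # topic -> documents seen

    def train(self, text, topic):
        # Incremental update: new labelled examples are folded into
        # the counts, so retraining is just more calls to train().
        self.word_counts[topic].update(text.lower().split())
        self.doc_counts[topic] += 1

    def classify(self, text):
        words = text.lower().split()
        total_docs = sum(self.doc_counts.values())
        best, best_score = None, float("-inf")
        for topic, counts in self.word_counts.items():
            total, vocab = sum(counts.values()), len(counts)
            # Log prior plus add-one-smoothed log likelihoods.
            score = math.log(self.doc_counts[topic] / total_docs)
            for w in words:
                score += math.log((counts[w] + 1) / (total + vocab))
            if score > best_score:
                best, best_score = topic, score
        return best
```

Because the per-topic counts are all the state there is, a backend like the one described could retrain and redeploy a classifier automatically whenever new labelled data arrives.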
David and others in the team completed some design work on a more compact creative work metadata page, aiming to give an at-a-glance view of all the metadata our system extracts.
And finally, Michel has been using our platform to start the analysis of half a million programme transcripts, in order to build up a historic archive of content metadata.
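Building a historic metadata archive from transcripts at that scale is essentially a batch pipeline: extract per-programme metadata from each transcript, then stream the records out. As a hedged sketch (the field names and the simple keyword extraction are invented; the real platform's extraction is far richer):

```python
# Sketch of batch metadata extraction over programme transcripts.
# Field names and the keyword heuristic are illustrative only.
from collections import Counter
import re

STOPWORDS = {"the", "a", "an", "and", "of", "to", "in", "is", "it", "on"}

def extract_metadata(programme_id, transcript):
    """Derive simple content metadata from one transcript."""
    words = re.findall(r"[a-z']+", transcript.lower())
    keywords = [w for w in words if w not in STOPWORDS]
    return {
        "programme_id": programme_id,
        "word_count": len(words),
        "top_terms": [term for term, _ in Counter(keywords).most_common(5)],
    }

def build_archive(transcripts):
    """Yield one metadata record per (programme_id, transcript) pair.
    At half a million documents this would run as a streamed batch job
    rather than loading everything into memory at once."""
    for programme_id, text in transcripts:
        yield extract_metadata(programme_id, text)
```

Yielding records lazily keeps memory flat regardless of corpus size, which matters when the input is half a million transcripts.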
Atomised Media
For Atomised News, the journalists are back on board after a few... busy weeks. We are working with them to write our first Atomised News story, to be released as a ‘soft’ launch this week, promoted by Newsbeat on Twitter only.
We are also working with Taster on compliance and editorial matters; once we have at least a couple of stories we’ll do the official Taster launch. We have also been revisiting quantitative and qualitative measurements with the Audience Research team from News Online.
TellyBox
Joanne's written up our ethnographic study of how people watch TV, and it's fascinating. This week, Libby, Joanne, Tim, Ant and Chris started using Sacha's project canvas to thoroughly think through the project and start planning the next 6 to 12 months. We'll pick that process up again when we next meet, in a week or so.
Reflective Profiling
Michel scoped out the rest of his trainee project and identified remaining tasks to create a basic working prototype. Work on the prototype is progressing apace, and together with David, he sketched a design for the prototype front-end/website.
W3C
Chris has been working with members of the TV Control Working Group on requirements for radio support in the API, as well as capturing issues with the draft specification. The Second Screen Working Group has also published the first draft of the Remote Playback API, which enables web pages to play audio and video media on a remote device, and the Presentation API is now a Candidate Recommendation.
The Rest
Tim’s been in Salford, learning about the BBC’s work there, as well as chatting to our North Lab colleagues working on audio, object-based broadcasting and UX about interesting ways we can work together in future.