Widget Bash – what a difference two days make

“I got more done here in a day than I would have in three or four days in the office.” Just one of the comments during the wrap-up session at our widget bash (#cetiswb).

And judging from comments from the other delegates, having two days to work on developing “stuff” is one of the best ways to actually move past the “oh, that’s interesting, I might have a play with that one day” stage and get something up and running.

The widget bash was the latest in our series of “bash” events, which began many years ago with code bashes (back in the early days of IMS CP) and have evolved to cover learning design with our design bashes. This event was an opportunity to share, explore and extend practice around the use of widgets/apps/gadgets, and to allow delegates to work with the Apache Wookie (Incubating) widget server, which deploys widgets built to the W3C widget specification.

We started with a number of short presentations from most of the projects in the current JISC-funded DVLE programme. Starting with the rapid innovation projects, Jen Fuller and Alex Walker gave an overview of their Examview plugin, then Stephen Green from the WIDE project, University of Teesside, explained the user-centred design approach they took to developing widgets. (More information on all of the rapid innovation projects is available here.) We then moved to the institutionally focused projects, starting with Mark Stubbs from the W2C project, who took us through their “mega-mash-up” plans. The DOULS project was next, with Jason Platts sharing their mainly Google-based approach. Stephen Vickers from the ceLTIc project then outlined the work they have been doing around tools integration using the IMS LTI specification, and we also had a remote presentation around LTI implementation from the EILE project. Rounding off the DVLE presentations, Karsten Lundqvist from the Develop project shared the work they have been doing, primarily around building an embedded-video Blackboard building block. Mark Johnson (University of Bolton) then shared some very exciting developments from the iTEC project, where smartboard vendors have implemented Wookie and embedded widget functionality in their toolset, allowing teachers to literally drag and drop collaborative activities onto their smartboards at any point during a lesson. Our final presentation came from Alexander Mikroyannidis on the ROLE project, which is exploring the use of widgets and developing a widget store.

After lunch we moved from “presentation” to “doing” mode. Ross Gardler took everyone through a basic widget-building tutorial, and despite dodgy wifi connections and issues with downloading the correct version of Ant, most people were able to complete the basic “hello world” tutorial. We then split into two groups, with Ross continuing the tutorials and moving on to creating geo-location widgets, and Scott Wilson working with some of the more experienced widget builders in what almost became a troubleshooting surgery. His demo of repackaging a Pac-Man game as a W3C widget proved very popular.
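For anyone who wasn’t there and wants to try this at home, it’s worth noting just how little a basic W3C widget involves: a config.xml manifest and a start file, zipped up and renamed with a .wgt extension. The sketch below is illustrative rather than the actual tutorial code – the id and file names are my own placeholders:

    <!-- config.xml: the W3C widget manifest (placeholder values) -->
    <widget xmlns="http://www.w3.org/ns/widgets"
            id="http://example.org/hello-widget"
            version="1.0">
      <name>Hello World</name>
      <content src="index.html"/>
    </widget>

    <!-- index.html: the widget start file -->
    <html>
      <body>
        <p>Hello, world!</p>
      </body>
    </html>

Zip the two files together, rename the archive hello.wgt, and it can be deployed to a Wookie server.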

The sun shone again on day two, and with delegates now more familiar with Wookie, how to build widgets, and potential applications for their own contexts, the serious bashing began.

One of the great things about working with open source projects such as Apache Wookie (Incubating) is the community sharing of code and problem solving. We had a couple of really nice examples of this in action, starting with the MMU drop-in PC-location widget. The team had managed to work out some IE issues that the Wookie team were struggling with (see their blog post), and, inspired by the geo-location templates Ross showed on day one, managed to develop their widget to include geo-location data. Now, if users access the service from a geo-location-aware device, it returns a list of free computers nearest to their real-time location. The team were able to successfully test this on an iPad, a Galaxy Tab, an iPhone and an Android phone. For non-location-aware devices the service returns an alphabetical list. You can try it out here.
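I don’t have the MMU team’s actual code, but the general pattern is straightforward and worth sketching for anyone wanting to build something similar. This assumes the standard W3C Geolocation API in the browser; the service URL and helper names are mine, purely for illustration:

    // Illustrative sketch only – not the MMU team's actual code.
    // Location-aware devices get a distance-sorted list of free PCs;
    // everything else falls back to an alphabetical list.
    function loadFreePCs() {
      if (navigator.geolocation) {
        navigator.geolocation.getCurrentPosition(
          function (pos) {
            fetchList("/freepcs?lat=" + pos.coords.latitude +
                      "&lon=" + pos.coords.longitude); // hypothetical endpoint
          },
          function () { fetchList("/freepcs?sort=alpha"); } // lookup refused/failed
        );
      } else {
        fetchList("/freepcs?sort=alpha"); // no geolocation support
      }
    }

    // Hypothetical helper: fetch the list and render it in the widget body.
    function fetchList(url) {
      var xhr = new XMLHttpRequest();
      xhr.onload = function () {
        document.getElementById("pclist").innerHTML = xhr.responseText;
      };
      xhr.open("GET", url);
      xhr.send();
    }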

Sam Rowley and colleagues from Staffordshire University decided to work on some DOM and jQuery issues. Whilst downloading the Wookie software they noticed a couple of bugs, so they fixed them and submitted a patch to the Wookie community.

Other interesting developments emerged from discussions around ways of getting data out of VLEs. The team from Strathclyde realised that by using the properties settings in Wookie they could pass a lot of information fairly easily from Moodle to a widget. On day two they converted a Moodle reading-list block to a Wookie widget with an enhanced interface allowing users to specify parameters (such as course code). The team have promised to tidy up the code and submit it to both the Wookie and Moodle communities. Inspired by this, Stephen Vickers is going to have a look at developing a PowerLink for WebCT/BB with similar functionality.
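For anyone wondering what “using the properties settings” looks like from the widget side: Wookie implements the W3C widget interface, which gives each widget instance a set of key/value preferences that the hosting plugin (here, the Moodle block) can populate when it embeds the widget. A minimal sketch, with the property key and helper function invented for illustration:

    // Inside the widget – a sketch of reading a host-supplied property.
    // "courseCode" is a made-up key; the Moodle block would set it
    // when instantiating the widget.
    var courseCode = widget.preferences.getItem("courseCode");
    if (courseCode) {
      loadReadingList(courseCode); // hypothetical function to fetch and render the list
    } else {
      document.body.innerHTML = "<p>No course selected.</p>";
    }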

On a more pedagogical note, some of the members of the Coeducate project worked on developing a widget version of the 8LEM-inspired Hybrid Learning Model from the University of Ulster. By the end of the second day they were well on the way to developing a drag-and-drop sequencer, and were also exploring multi-user collaboration opportunities through the Google Wave API functionality which Wookie has adopted.
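The multi-user angle deserves a word of explanation: Wookie adopted the Google Wave Gadget API, under which widgets share a state object and every participant’s callback fires whenever anyone submits a change. A rough sketch of the pattern, assuming that API and using a state key of my own invention:

    // Sketch of the Wave Gadget API pattern Wookie adopted – illustrative only.
    function stateUpdated() {
      // Called for every participant whenever the shared state changes.
      var seq = wave.getState().get("sequence"); // "sequence" is a made-up key
      if (seq) {
        redrawSequence(JSON.parse(seq)); // hypothetical redraw function
      }
    }

    // Call init() once the widget has loaded (e.g. from body onload).
    function init() {
      wave.setStateCallback(stateUpdated); // register for shared-state updates
    }

    // When one user drops a card into place, everyone sees the new order.
    function onCardDropped(newSequence) {
      wave.getState().submitDelta({ "sequence": JSON.stringify(newSequence) });
    }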

Overall there seemed to be a real sense of accomplishment from delegates, who managed to do a huge amount despite having to fight with very temperamental wifi connections. Having two experts on hand proved really useful, as delegates were able to ask the “stupid” (and, more often than not, not-so-stupid) questions. Having the event run over two days also seemed to be very popular, as it allowed delegates to actually move from thinking about doing something to actually doing it. It also highlighted the positive side of contributing to an open-source community, and hopefully the Apache Wookie community will continue to see the benefit of increased users from the UK education sector. We also hope to run another similar event later in the year, so if you have any ideas or would like to contribute please let me know.

For another view, I’ve also created a Storify version of selected tweets from the event.

Using video to capture reflection and evidence

An emerging trend coming through from the JISC Curriculum Design programme is the use of video, particularly for capturing evidence of, and reflection on, processes and systems. Three of the projects (T-Sparc, SRC, OULDI) took part in an online session yesterday to share their experiences to date.

T-Sparc at Birmingham City University have been using video extensively with both staff and students as part of their baselining activities around the curriculum design process. As part of their evaluation processes, the SRC project at MMU have been using video (Flip cams) to get student feedback on their experiences of using e-portfolios to help develop competencies. And the OULDI project at the OU have been using video in a number of ways to get feedback from their user community around their experiences of course design and the tools being developed as part of the project.

There were a number of commonalities identified by each of the projects. On the plus side, the immediacy and authenticity of video was seen as a strength, allowing the SRC team, for example, to integrate student feedback much earlier. The students themselves also liked the ease of use of video for providing feedback. Andrew Charlton-Perez (a lecturer who is participating in one of the OULDI pilots) has been keeping a reflective diary of his experiences. This is not only a really useful, shareable resource in its own right, but Andrew himself pointed out that he has found it a really useful self-reflective tool, helping him to re-engage with the project after periods of non-involvement. The T-Sparc team have been particularly creative in using the video clips as part of their reporting process, both internally and with JISC. Hearing things straight from the horse’s mouth, so to speak, is very powerful and engaging. Speaking as someone who has to read quite a few reports, this type of multimedia reporting makes for a refreshing change from text-based reports.

Although hosting video is becoming relatively straightforward and commonplace through services such as YouTube and Vimeo, the projects have faced some perhaps unforeseen challenges around finding file formats which work both on external hosting sites and internally. For example, the version of Windows streaming used institutionally at BCU doesn’t support the native MP4 file format from the Flip cams the team were using. The team are currently working on getting a codec update, and they have also invested in additional storage capacity. At the OU, the team are working with a number of pilot institutions who are supplying video and audio feedback in a range of formats, from AVI to MP3 and almost everything in between, some of which need considerable time to encode into the systems the OU team are using for evaluation. So the teams have found that there are some additional, unforeseen resource implications (both human and hardware) when using video.

Another common issue to come through from the presentations and discussion was around data storage. The teams are generating considerable amounts of data, much of which they want to store permanently – particularly if it is being incorporated into project reports etc. How long should a project be expected to keep evaluative video evidence?

However, despite these issues there seemed to be a general consensus that the strengths of using video made up for some of the difficulties it brought with it. The teams are also developing experience and knowledge in using software such as Xtranormal and Overstream for creating anonymous content and subtitles. They are also creating a range of documentation around permissions for the use of video, which will be shared with the wider community.

A recording of the session is available from The Design Studio.

Personal publishing – effective use of networks or just noise?

If you follow my twitter stream you may have noticed that every day at about 9am, you’ll see a tweet with a link to my daily paper and a number of @mentions of people featured in it. You may even have been one of those @mentions.

I’ve actually had a paper.li account since last year, but it’s only recently that I’ve set the “automagic” tweet button live. This was partly because I’ve found it quite interesting following links to other paper.li sites where I’ve been mentioned, and partly as a bit of a social experiment to see (a) if anyone noticed and (b) what reactions, if any, it would elicit. In fact this post is a direct response to Tore Hoel’s tweet at the weekend asking if I was going to reflect on my use of it.

Well, here goes. Being one of those people who likes to play (and follows every link Stephen Fry tweets), I was intrigued when I first came across paper.li, and signed up. For those of you unfamiliar with the service, it basically pulls in links from your twitter feed, categorizes them and produces an online paper. Something, and I’m not sure what it was, prevented me from making the links public from the outset. On reflection, I think I wanted to see how the system worked, and whether it actually provided something useful.

There’s no editorial control with the system. It selects and classifies articles and links, randomly generates your online paper, and (if you choose) sends a daily tweet from your twitter account with a url and @mentions for selected contributors. Sometimes these are slightly odd – you might get mentioned because you tweeted a link to an article in a “proper” paper, a blog entry or a link to a flickr stream. It’s not like getting a by-line in a proper paper by any stretch of the imagination. The website itself has an archive of your paper, and there’s also the ability to embed a widget into other sites such as blogs. Other services I’ve used which utilise twitter (such as Storify) generate more relevant @mention tweets, i.e. only for those you actually quote in your story. You also have the option not to send an auto tweet – something I missed the first time I used it, and so ended up tweeting myself about my own story :-).

So, without editorial control, is this service useful? Well, like most things in life, it depends. Some people seem to find it irritating, as it doesn’t always link to things they have actually written, rather links they have shared; so for the active self-promoter it can detract from getting traffic to their own blog/website. Actually, that’s one of the things I like – it collates links that often I haven’t seen, and I can do a quick skim and scan and decide what I want to read more about. Sometimes they’re useful – sometimes not. But on the whole that’s the thing with twitter too – some days really useful, others a load of footballing nonsense. I don’t mind being quoted by other people using the service either. It doesn’t happen that often, I don’t follow too many people, and guess what – sometimes I don’t actually read everything in my twitter stream, and I don’t follow all the links people post – shocking confession I know! In any case, when you post to twitter it’s all publicly available, so why not collate it? If it’s a bit random, then so be it. But some others see it differently.

If you don’t like being included in these things then, like James Clay, you can get yourself removed from the system.

There have been a couple of other instances where I have found the service useful too. For the week after the CETIS10 conference last year, we published the CETIS10 daily via the JISC CETIS twitter account. As there was quite a lot of post-conference activity on blogs etc., it was another quite useful collation tool – but only for the short period when there was enough activity around the conference hashtag for the content to be nearly always conference-related. Due to the lack of editorial control, I don’t think a daily JISC CETIS paper.li would be appropriate: the randomness that I like in my personal paper isn’t really appropriate at an organisational communication level.

I recently took part in the LAAK11 course, and one of the other participants (Tony Searle) set up a paper.li using the course hashtag. I found this useful, as it quickly linked me to other students, articles etc. which I might not have seen or connected with, and vice versa. Again, the key here was having enough relevant activity. Tore asked if it would be useful for projects. I’m in two minds about that – on the one hand it might be, in terms of marketing and gaining followers; but again, the lack of editorial control might lead to the promotion of something that wasn’t as closely related to the project as you would like. If, however, you have an active project community, then it might work.

For the moment the Sheila MacNeill daily will continue but I’d be interested to hear other thoughts and experiences.

The University of Southampton opening up its data

The University of Southampton have just launched their Open Data Home site, providing open access to some of the University’s administrative data.

The Open Data Home site provides a number of RDF data sets, from teaching-room features to on-campus bus stops, a range of apps showing how the University itself is using the data, and its own SPARQL endpoint for querying the data. As well as links to presentations from linked data luminaries Tim Berners-Lee and Nigel Shadbolt, the site also contains a really useful FAQ section. This question in particular is one I’m sure lots of institutions will be asking, and what a great answer it gets:

“Aren’t you worried about the legal and social risks of publishing your data?
No, we are not worried. We will consider carefully the implications of what we are publishing and manage our risk accordingly. We have no intention of breaking the UK Data Protection Act or other laws. Much of what we publish is going to be data which was already available to the public, but just not as machine-readable data. There are risks involved, but as a university — it’s our role to try exciting new things!”
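To give a flavour of what having a SPARQL endpoint means in practice: anyone can send a query over HTTP and get structured results back. The sketch below is purely illustrative – the endpoint URL and the RDF vocabulary are placeholders, not Southampton’s actual terms:

    // Illustrative only: endpoint URL and RDF class are placeholders.
    var endpoint = "http://example.ac.uk/sparql"; // substitute the real endpoint
    var query =
      "SELECT ?stop ?label WHERE { " +
      "  ?stop a <http://example.org/ns/BusStop> ; " + // made-up class URI
      "        <http://www.w3.org/2000/01/rdf-schema#label> ?label . " +
      "} LIMIT 10";

    var xhr = new XMLHttpRequest();
    xhr.onload = function () {
      // Standard SPARQL JSON results format: head + results.bindings
      JSON.parse(xhr.responseText).results.bindings.forEach(function (row) {
        console.log(row.label.value);
      });
    };
    xhr.open("GET", endpoint + "?query=" + encodeURIComponent(query));
    xhr.setRequestHeader("Accept", "application/sparql-results+json");
    xhr.send();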

Let’s hope we see many more Universities following this example in the very near future.