Creating augmented display cases for Museon

A recent project, as part of the meSch project, was to create a number of augmented, networked display cases for Museon in The Hague. Museon, which is a museum partner in the meSch project, “aims at transferring knowledge about man and his relation with nature and culture and provides easily accessible information about topical themes and developments in science and society”.

Initial Brainstorming

This project began with a group of us from Sheffield Hallam University, including myself, Nick Dulake, Daniela Petrelli, Rob Barker and Ed Turner, working with Hub Kockelkorn from Museon in order to brainstorm ideas for an exhibit to place in Museon in time for the Ecsite annual conference. Hub has written an excellent blog post about this process which you can read on the meSch website.

Brainstorming the augmented cases at SHU.

Designing the content display.

Creating the Cases

From this brainstorming session we arrived at the idea of augmented showcases that would allow us to crowdsource a museum exhibition. Sensors mounted within each case would detect visitors approaching to look at the object inside. By monitoring how close visitors came and how long they stayed we could determine their level of interest in the object. A number of cases could then be set up together to create a kind of popularity contest: monitor levels of interest in each object over a set period of time and replace the least popular object at the end of that period.
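
As a rough illustration of that popularity logic, here is a minimal Python sketch of how a combined closeness-and-dwell-time score might be used to pick the least popular object. The weights, thresholds and object names are illustrative assumptions, not the values used in the actual deployment.

```python
from dataclasses import dataclass

@dataclass
class Visit:
    min_distance_cm: float  # closest the visitor came to the case
    dwell_seconds: float    # how long they stayed in front of it

def interest_score(visit: Visit) -> float:
    """Score one visit: coming closer and staying longer both count as interest."""
    closeness = max(0.0, 1.0 - visit.min_distance_cm / 200.0)  # zero beyond ~2 m
    dwell = min(visit.dwell_seconds / 60.0, 1.0)               # capped at one minute
    return 0.5 * closeness + 0.5 * dwell

def least_popular(visits_per_object: dict[str, list[Visit]]) -> str:
    """Return the object with the lowest total interest over the period."""
    totals = {obj: sum(interest_score(v) for v in visits)
              for obj, visits in visits_per_object.items()}
    return min(totals, key=totals.get)

# Example period: three objects, each with a handful of logged visits.
visits = {
    "object-a": [Visit(40, 30), Visit(60, 20)],
    "object-b": [Visit(150, 5)],
    "object-c": [Visit(80, 12), Visit(120, 8)],
}
print(least_popular(visits))  # the candidate for replacement at the end of the period
```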

To do this we created a new case design, based around the standard showcase used at Museon. Inside the case we placed three infrared distance sensors, which faced out through a slot in the case and monitored how close visitors came to it. Data from these sensors was gathered by an Arduino Mini and sent via a USB connection to a Raspberry Pi, which processed the data and logged it to a centralised server.
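
A minimal sketch of the Pi-side logging loop might look something like the following, assuming the Arduino prints one comma-separated line of three distance readings per sample and the server exposes a simple readings endpoint. Both of those details, along with the port and URL, are assumptions for illustration; the actual serial format and server API aren't described here.

```python
import serial    # pyserial, for the Arduino Mini's USB serial connection
import requests  # for logging readings to the central server

SERIAL_PORT = "/dev/ttyUSB0"                      # typical device name on the Pi
SERVER_URL = "http://example.org/mesch/readings"  # hypothetical logging endpoint
CASE_ID = "case-1"

def run() -> None:
    with serial.Serial(SERIAL_PORT, 9600, timeout=2) as arduino:
        while True:
            # Assumed format: one "d1,d2,d3" line of distances per sample.
            line = arduino.readline().decode("ascii", errors="ignore").strip()
            if not line:
                continue
            try:
                d1, d2, d3 = (int(v) for v in line.split(","))
            except ValueError:
                continue  # skip malformed or partial lines
            # Log the raw proximity readings to the centralised server.
            requests.post(SERVER_URL,
                          json={"case": CASE_ID, "distances_cm": [d1, d2, d3]},
                          timeout=5)

if __name__ == "__main__":
    run()
```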

Model of augmented showcase.

The Raspberry Pi also performed other useful functions for the cases. By placing a pico-projector within the case we could project content about the object onto the case itself. This allowed the museum to augment the object with content from their repository, including text, images and animations. The Raspberry Pi was used to read content for the current object from the central server and display this content through the projector onto the case. The content was stored in HTML format, making it easy for curators to add or update content in the system. The projector was embedded in a casing designed to stand in the showcase and ensure the correct alignment of content with the case.
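
As a sketch of how the display side might be driven, the snippet below assumes the server serves each object's HTML page at a URL keyed by object ID and that a kiosk-mode browser puts it on the projector output. Both the URL scheme and the browser approach are assumptions rather than a description of the actual implementation.

```python
import subprocess

SERVER_URL = "http://example.org/mesch"  # hypothetical content server

def show_object(object_id: str) -> subprocess.Popen:
    """Open the object's content page full-screen on the projector output."""
    url = f"{SERVER_URL}/object/{object_id}"
    # Chromium's kiosk mode fills the screen, so the HTML content lands
    # directly on the pico-projector's output.
    return subprocess.Popen(["chromium-browser", "--kiosk", url])

if __name__ == "__main__":
    show_object("object-a")
```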

To allow curators to easily inform the system when the object in a case was changed, and to ensure the correct content was being displayed, we interfaced an NFC tag reader with the Raspberry Pi. A small pocket was created on the side of the projector casing into which the curator could place a laminated card depicting the object now in the showcase. This card contained an NFC tag holding an object ID, which allowed the Raspberry Pi to detect the new object and display the relevant content.
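
The tag-handling loop could be sketched roughly as follows, using the nfcpy library as one possible way of talking to a USB NFC reader. Whether the object ID lived in the tag's UID or in an NDEF record isn't detailed here, so treating the UID as the ID is an assumption, and the content switch is just a placeholder.

```python
import time
import nfc  # nfcpy: one possible library for a USB NFC reader

def read_tag_uid(clf: nfc.ContactlessFrontend) -> str:
    """Block until a tag is presented, then return its UID as a hex string."""
    tag = clf.connect(rdwr={"on-connect": lambda tag: False})
    return tag.identifier.hex()

def switch_content(object_id: str) -> None:
    # Placeholder: in the real system this is where the display would be
    # told to load the content page for the newly placed object.
    print(f"new object card detected, showing content for {object_id}")

def main() -> None:
    current = None
    with nfc.ContactlessFrontend("usb") as clf:
        while True:
            uid = read_tag_uid(clf)
            if uid != current:
                current = uid
                switch_content(uid)
            time.sleep(1)

if __name__ == "__main__":
    main()
```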

Content Engine

The content engine for the showcases was implemented in PHP, jQuery and HTML. The Raspberry Pi requests content from the central server for the current object. This content is delivered as a dynamic HTML page. The page has three sections: content from the museum’s perspective, content from the object’s perspective and content from the visitors’ perspective.
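
The engine itself was written in PHP, but the shape of the page it delivered can be sketched in a few lines of Python; the section markup, function name and sample content below are purely illustrative.

```python
def render_page(museum_html: str, object_html: str, visitor_tweets: list[str]) -> str:
    """Assemble the three-section page: museum, object and visitor perspectives."""
    tweets = "\n".join(f"<li>{t}</li>" for t in visitor_tweets)
    return f"""<!DOCTYPE html>
<html>
  <body>
    <section id="museum">{museum_html}</section>
    <section id="object">{object_html}</section>
    <section id="visitors"><ul>{tweets}</ul></section>
  </body>
</html>"""

page = render_page(
    museum_html="<p>Known facts about the object from the museum's records.</p>",
    object_html="<p>The object 'speaking' about itself.</p>",
    visitor_tweets=["Love the detail on this one! #meschobject"],
)
```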

The museum content is static HTML that presents known information about the object itself. The object's perspective presents content created by the curators, but written as though the object is speaking about itself; this includes text, images and animations. The visitors' content is built by gathering visitors' Tweets about the object: each object has a unique hashtag, displayed on the projection, that visitors can use to offer opinions on the object. These Tweets also provide an additional measure of visitor interest.
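
Matching Tweets to objects by hashtag could be sketched as below; the hashtags are made-up examples and the Twitter fetching step is left as a stub, since the specific API used isn't covered here.

```python
# Hypothetical mapping from each object's unique hashtag to its ID.
HASHTAGS = {
    "#meschvase": "object-a",
    "#meschmask": "object-b",
}

def fetch_recent_tweets() -> list[str]:
    """Stub: the real system would pull recent Tweets here."""
    return ["Amazing detail on this! #meschvase", "Lunch break #museon"]

def tweets_by_object() -> dict[str, list[str]]:
    """Group Tweets under the object whose hashtag they mention."""
    matched: dict[str, list[str]] = {obj: [] for obj in HASHTAGS.values()}
    for tweet in fetch_recent_tweets():
        for tag, obj in HASHTAGS.items():
            if tag.lower() in tweet.lower():
                matched[obj].append(tweet)
    return matched

print(tweets_by_object())
```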

Screenshot of the showcase content display.

Deployment

We built four showcases for use in Museon during Ecsite. These showcases were set up side-by-side in a section of Museon's permanent exhibition before the Ecsite pre-conference workshops began and were left in place for the duration of the conference.

meSch showcases in situ at Museon.

Every three hours the least popular object was replaced with a new object from Museon's storage area. In total, 15 objects were selected, had content created for them and were rotated through the cases. Occasionally objects were also moved between showcases, in case position had an effect on their popularity. Data gathered during the conference showed that some objects were much more popular than others, irrespective of which case they were placed in. One object (actually a set of objects) maintained its place in the showcases for the entire duration of the event, consistently scoring as the first or second most popular.

What next?

In general I think these cases were a success. Museon’s curators liked them, a number of museum professionals attending the conference expressed an interest in them and we have had a request to build some more for an exhibition to take place next year.

The deployment was not without issues though. These cases were developed on a very short timeframe and as such there are some improvements to be made. In particular, the showcases need to be more fully integrated into Museon's existing network than was possible in the available time. Some other issues, regarding data gathering, system startup and shutdown, and network reliability, were also raised; although it was possible to deal with many of these in situ, more thought should be put into them for future iterations.

Overall though, I think we were pretty happy with how this went, as indicated by this photo of some of the system's creators, smiling alongside the cases in place at Museon:

Some of the system's creators with the showcases at Museon.