Improving Visitor Evaluation at The British Postal Museum & Archive

Sponsor: The British Postal Museum & Archive (BPMA)
Sponsor Liaisons: Hannah Clipson and Dominique Gardner.
Student Team: Angela-Marie Conklin, George Benda, Nysa Casha, Shuyang Sun.
Abstract: This project was conducted with the British Postal Museum and Archive (BPMA) in order to improve its methods of visitor experience evaluation. We evaluated the BPMA's past data, conducted research, interviewed staff and visitors, revised surveys, developed creative writing/drawing activities, and used social media to promote exhibitions. Through these methods we were able to identify current visitor demographics, evaluate visitor engagement, and collect visitor feedback on their experiences. Ultimately, we made recommendations to our sponsors on how to improve their exhibitions and their methods of evaluating visitor engagement.
Link: BPMA_LondonE14_IQP

Executive Summary

 

Introduction

The British Postal Museum and Archive (BPMA) is an organization based in London, England, founded in 2004 as a charitable trust. The organization strives to share British postal history and promote communication among its visitors by providing a meaningful experience through fun and educational exhibitions. Through paper surveys and online discussion groups pertaining to exhibition experience, the BPMA has made many efforts to discover visitors' specific enjoyments and preferences. It has developed temporary exhibitions and interactive sites that entice visitors to its many locations. While visitor motivation, engagement, and retained knowledge are high priorities for the organization, the BPMA lacks up-to-date, statistically robust, and in-depth data on these aspects of the visitor experience. The goal of this project was to improve the visitor experience evaluation process at the BPMA.

 

Literature Review

We synthesized relevant articles and assembled best practices for museum evaluation strategies. We investigated the BPMA's mission in order to identify which survey parameters best fit its purpose. In addition, we examined the structural components of the exhibition under study, the Last Post exhibition. We researched the site description, best practices in survey design, and methods of analysis. Case studies deemed successful suggested that such surveys focus on (1) demographic profiling; (2) visitor motivation; (3) visitor experience and ways of engagement; and (4) visitor response/takeaway.

In our research, we found that museums were not just tourist sites, but places where local residents could relax and spend time with family and friends. The literature review debunked several of our predictions and enabled us to reconsider the relevance of certain questions. We aimed to keep visitor "exploration" categorization, visitor motivation, visitor experience, and ultimate visitor takeaway in mind. The most important element of survey creation was keeping the museum's goals and vision at the forefront of our methodology.

 

Methodology

The goal of our project was to improve the process of evaluating visitor experience for exhibitions provided by the British Postal Museum and Archive (BPMA). To meet our goal, we followed these objectives:

  1. Evaluating the BPMA's current survey and other baseline strategies used to measure visitor engagement.
  2. Understanding and identifying site-specific needs, constraints, and parameters of museums and their exhibitions.
  3. Designing and testing tools that measure visitor experience.
  4. Determining an effective tool (device or software system) for data entry and analysis.

 

The key to our project was not only to find best practices for evaluation, but also to create innovative ways of evaluating visitors. We compiled a list of designs, each of which we judged had the potential to provide informative feedback. We developed a creative writing/drawing center at the end of each exhibition. We also used Twitter and Facebook, posting quotes from people we interviewed and submissions from our creative writing/drawing activities.

 

Results

The data we gathered indicated which methods of visitor evaluation produced the most informative conclusions at each site. Although the sample size was small, making these conclusions statistically insignificant, we were still able to gain some insight from them.

 

Objective 1: Evaluating the BPMA’s current survey and their baseline strategies.

In our review of the BPMA's paper survey, we identified several questions that needed to be reworked or replaced. The past survey contained confusing statements, an awkward mix of free-response and multiple-choice questions, and a lack of site-specific questions.

 

Objective 2: Determining site-specific needs, constraints, and parameters.

We visited and observed well-known museums in London to see what was and was not working for them. Evaluating these popular museums enabled us to identify the onsite needs of BPMA exhibitions. The more engaged the visitor, the more likely he or she was to share feedback. The Natural History Museum had a plethora of interactive games, videos, auditory telephones, and three-dimensional displays. The Victoria and Albert Museum was primarily a visual experience; visitors seemed to enjoy wandering and looking at a variety of historic displays. The Science Museum was a bit different from the two aforementioned museums. It had beautiful models and displays, and was interesting and exciting, but not as popular and captivating as the other two.

The Postal Maps event, a pay-upon-entry BPMA presentation, was our first opportunity to take note of visitors' reactions to the presented information and to our evaluation methods. At this event, which focused on the evolution of London postal codes, fifteen people attended and enjoyed both the provided refreshments and the displayed maps of London postal districts.

The Last Post exhibition, featured in Mansfield, was our first opportunity to evaluate visitors in an exhibition setting. The exhibition, consisting of eight panels, was located at the entrance to a children's museum. At this site we found that visitors did not want to take electronic surveys, nor did they enjoy being quizzed on the material. These findings allowed us to play to our strengths at the Last Post exhibition at Coalbrookdale, where we primarily used observations and surveys to gather data.

 

 

Objective 3: Designing and testing tools that measure the visitor experience.

We developed a quiz and a creative writing/drawing activity to see whether visitors were absorbing the information provided by the exhibition. Although we were only able to test these methods on a small number of visitors, we gathered some valuable information. At events that were not an appropriate setting for an activity, we used surveys to measure visitor experience.

At the one event and two exhibitions we attended, we found that the visitors who participated enjoyed the presented information. Interviews with staff were informative, since staff spend every day onsite and see firsthand how visitors react to the material. Based on the data gathered from our prototype survey, we found that paper surveys were preferred to electronic ones, and that children at the site did enjoy our creative writing/drawing activity. The tweets we posted received six retweets and four favorites. The sample gathered was small, but the data collected was helpful in determining which tools could be popular and informative.

 

Objective 4: Determining an effective tool for data entry and analysis.

At the Museum and Heritage show we were introduced to a wide variety of products and methods that could be a great asset to the growth of any museum. We were specifically looking for interactive activities the BPMA could use to increase visitor interaction with its sites, as well as devices for visitor evaluation that would make data collection more efficient. We created a spreadsheet describing each device we thought would meet the organization's needs. The sheet included company name, product purpose, cost, and duration of effective use. The products were grouped by type: computer devices, guided tours, and visitor aids.

 

Discussion

 

Visitor Engagement

Visitor engagement is the varying level of involvement one has with an exhibition. Visitors who pass by a display without looking at it are less likely to take part in exhibition evaluation methods. Visually enticing displays and interactive activities make people more likely to participate in evaluation. In places like Coalbrookdale, where the exhibition had its own room, 54 visitors had elected to take past surveys and were very willing to take the ones we presented to them.

 

Visitor Demographics

The data showed us that the majority of visitors are adults. Those who filled out the paper survey liked the format and indicated that they would not prefer an electronic survey. To continue gathering data, it would be advisable to keep using evaluation methods that visitors respond well to.

 

Visitor Feedback

The visitors we interviewed and surveyed allowed us to draw a few conclusions about what visitors generally thought of the exhibitions and events. Looking at the survey and interview results from the event and exhibitions, we inferred that visitors generally enjoyed the event and gave it an overall high rating. We recognized that the sample was too small to reach any statistically significant conclusion. What we can say about the exhibitions is that placement and layout are very important. Visitors enjoyed reading small amounts of text and looking at pictures and displays. Most visitors also did not know that the exhibition was produced by the British Postal Museum and Archive.

 

Recommendations

 

Evaluation Methods

The recommendations made in this section are based on interviews, data collected from surveys, interactive activities, and observations completed during this project. In future surveys and interviews, we recommend asking straightforward questions. The survey should have a balance of open-ended and multiple-choice questions. We recommend that interviews, when they take place, be semi-standardized. We found that having a conversation rather than asking formal questions made people feel more at ease. We tested this anecdotally by asking formal questions of some staff members and having conversations with others; we gained more useful feedback during the conversations.

 

Software Options

We identified many software options that could be useful and engaging for visitors. We encourage the use of an iPad paired with Survey Monkey to see how much attention it receives compared with the paper survey. Due to a lack of resources, we were unable to compare these two alternatives ourselves. Unfortunately, there are very few devices that can simply transfer data from a paper survey to a database. If the BPMA were to use mobile apps on phones or digital surveys, data could be input immediately. The display we think could engage the most people was the FAB (Family Activity Based), a family activity set up so that children and parents can take games, audio, or visuals with them as they explore an exhibition.
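To illustrate why digital entry could speed up analysis, the sketch below shows one way responses from a digital survey export might be tallied automatically. This is a minimal illustration only: the file name responses.csv and the columns age_group and rating are hypothetical, not the BPMA's actual survey fields or a confirmed Survey Monkey export format.

```python
# Minimal sketch: tallying a hypothetical digital survey export.
# "responses.csv", "age_group", and "rating" are illustrative names,
# not actual BPMA data or a confirmed Survey Monkey schema.
import csv
from collections import Counter
from statistics import mean

def summarise(path="responses.csv"):
    age_groups = Counter()
    ratings = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            age_groups[row["age_group"]] += 1       # demographic tally
            if row["rating"]:                       # skip blank answers
                ratings.append(int(row["rating"]))  # e.g. a 1-5 enjoyment score
    print("Visitors by age group:", dict(age_groups))
    if ratings:
        print("Average rating:", round(mean(ratings), 2))

if __name__ == "__main__":
    summarise()
```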

 

Additional Recommendations

The location and orientation of an exhibition are key to attracting attention. The exhibition in Mansfield, a site that is primarily a children's museum, seemed too tall and too complex for children to read. About 30 children came to the museum during the time we were there, and only two attempted to look at the material presented.

The more attention an exhibition gains, the more opportunities staff will have for evaluation. We therefore recommend changing the layout of the Mansfield site, which should result in more visitors taking part in the evaluation methods. We recommend that the BPMA take into account the room/space and the demographics of its visitors when setting up an exhibition. Larger text, additional pictures, or a more spread-out display would have attracted more attention. We determined this after observing that visitors who had come specifically to see the exhibition walked right past it several times before a staff member directed them towards the display.

 

Conclusion

Unfortunately, we were unable to evaluate a substantial number of visitors, so we could not draw concrete conclusions, but the research we gathered and the evaluations we conducted can still be considered informative. In the first stage of this project we expected that technological ways of collecting and analyzing data would be the most popular and efficient, yet we learned that site-specific needs were driven by the desires of the visitors, who favored paper surveys. In determining best practices for evaluating visitors, we therefore suggest that the BPMA focus on what its visitors enjoy: new, innovative methods are always worth testing, but if they prove unpopular with visitors, we suggest they not be used. Observation proved the most informative way of evaluating; by watching visitors engage with the material and reading their body language, we could determine what they enjoyed and preferred more reliably than from the vague comments left on a survey. We also found that staff members, who work at the exhibitions every day, can share insightful information that can be used to improve the exhibitions further.

Overall, the project succeeded in suggesting progressive changes for the BPMA. After analyzing the strengths and weaknesses of the current surveys, we established recommendations for the most feasible approaches for the organization.

The information we gathered not only benefits the BPMA, but will in turn assist other museums that face similar challenges with visitor feedback. The data helps organizations develop a baseline of information that they can build upon, and it provides evaluators with tested surveys and activities. The information this project provides can help the BPMA and similar organizations determine best practices for evaluating visitor engagement. Each innovation in visitor evaluation helps the BPMA maintain its position as one of London's great tourist attractions.