Automating Visitor Evaluation in the London Transport Museum

Sponsor: London Transport Museum
Sponsor Liaison: Elizabeth Poulter
Student Team: Lauren Francis, Sebastian Hamori, Andrew Robbertz, Kylie Sullivan
Abstract: The Learning Team at the London Transport Museum relied on time-consuming, resource-intensive pen-and-paper evaluations. Our project automated these evaluations by producing several digital surveys administered on iPads. We created tools to analyse survey results and produced an analysis report showcasing the Learning Team’s progress towards its target outcomes. To ensure continued success, we recommend that the Learning Team continue to update and improve its Microsoft Forms, protect and maintain its iPads, and conduct more frequent evaluations.

LO19-LTM_Final Report
LO19-LTM_Final Presentation

Executive Summary

The London Transport Museum is dedicated to preserving the history and importance of transport in London and to inspiring young people to pursue engineering careers. The Learning Team, a group of London Transport Museum staff, aims to empower and enable its participants to take the next steps towards a more fulfilling life and an active role in society. The team designs and administers collection-based learning programmes for different audiences in the museum, in communities, and in local schools. Wishing to better advocate the impact of its work to funders, the Learning Team created an evaluation framework called the Journey of Change, using guidelines from Project Oracle, an organisation that validates educational programmes. The Journey of Change contains 21 target outcomes that outline the positive impacts of the Learning Team’s programmes on participants.

Previously, the Learning Team used only paper questionnaires and observation forms to measure progress towards seven of these target outcomes. These paper-based processes consumed a great deal of staff time and resources. Our team joined the London Transport Museum’s Learning Team to digitise and streamline these processes, allowing the team to better understand the impact of its learning programmes and to report that impact to employees and funders. To examine the existing evaluation process, we interviewed staff members and participated in programme evaluations. From the information gathered, we identified the limitations of the process and developed a list of specifications that guided the design of our three main deliverables:

  • a set of streamlined digital survey forms replacing all previous paper surveys,
  • a suite of automated tools and dashboards to analyse the collected data, and
  • a summary report showcasing progress towards the Learning Team’s target outcomes.

The first deliverable consisted of nine Microsoft Forms surveys, condensed from 16 paper surveys by combining similar forms, standardising language throughout the evaluations, and using conditional (branching) questions so that several paper forms could be merged into one digital form. The Learning Team purchased five iPads to distribute the web-based forms to participants and teachers in the museum. Microsoft Forms also generates QR codes for each form, enabling participants to submit responses from their own mobile devices. Microsoft Forms collects and organises responses in Microsoft Excel spreadsheets, which staff can download for data analysis.

Our second deliverable consisted of five Microsoft Excel documents with built-in dashboards for primary data analysis. Each document contains spreadsheet tabs separating the different types of questionnaires and observation forms by age group. The spreadsheets include built-in analysis formulas that automatically populate charts and tables once data is imported from the downloaded response spreadsheets. The analysis is organised around the outcomes highlighted in the Learning Team’s Journey of Change, allowing easy referencing in biannual reports.
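The kind of calculation these dashboards automate can be sketched in a few lines. The example below is a hypothetical illustration, not the Learning Team's actual spreadsheet logic: given rows shaped like a Microsoft Forms Excel export, it computes the percentage of respondents agreeing with each outcome-linked statement (the question wording and answer labels are invented for the example).

```python
from collections import Counter

# Hypothetical rows, shaped like a downloaded Microsoft Forms export:
# one dict per response, one key per survey question.
responses = [
    {"I learned something new today": "Agree",
     "I am more interested in engineering": "Strongly agree"},
    {"I learned something new today": "Strongly agree",
     "I am more interested in engineering": "Disagree"},
    {"I learned something new today": "Agree",
     "I am more interested in engineering": "Agree"},
]

# Answers counted as agreement on a Likert-style question.
AGREE = {"Agree", "Strongly agree"}

def agreement_rates(rows):
    """Percentage of respondents agreeing with each question."""
    counts = Counter()
    for row in rows:
        for question, answer in row.items():
            if answer in AGREE:
                counts[question] += 1
    return {q: round(100 * counts[q] / len(rows), 1) for q in rows[0]}

rates = agreement_rates(responses)
```

In the actual deliverable, the equivalent aggregation is done with Excel formulas feeding pre-built charts, so staff only need to paste in the downloaded response data.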

Our final deliverable was a data report highlighting progress towards each of the Learning Team’s target outcomes. The report includes a performance overview table that gives RAG (Red, Amber, Green) ratings on the strength and quality of the data supporting each outcome. The report is organised by outcome and includes information on the demographics and audience of each of the museum’s programmes. Each section presents the most important findings from our data analysis in a variety of tables and charts. The report will serve as a template for future biannual reports. Its target audience is the museum’s staff, with a variant sent to funders.
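As a rough illustration of how a RAG rating might be assigned, the sketch below rates one outcome from its sample size and agreement rate. The thresholds are invented placeholders for the example, not the criteria used in the Learning Team's report.

```python
def rag_rating(sample_size, agreement_pct,
               min_responses=30, green_pct=75, amber_pct=50):
    """Assign a Red/Amber/Green rating to one target outcome.

    All thresholds are illustrative placeholders, not the values
    the Learning Team uses in its report.
    """
    # Too few responses, or weak agreement: data does not support the outcome.
    if sample_size < min_responses or agreement_pct < amber_pct:
        return "Red"
    # Strong agreement on an adequate sample.
    if agreement_pct >= green_pct:
        return "Green"
    # Adequate sample but only moderate agreement.
    return "Amber"
```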

After interviewing staff and working alongside them for a few months, our team developed a number of recommendations for the Learning Team:

1. Protect and maintain the iPads:

The iPads purchased for administering the new Microsoft Forms are brand new and need to be well maintained to ensure their longevity. The first three points below are logistical but important; the final point is a longer-term goal for the Learning Team.

  • Obtain cases and screen protectors to ensure longevity of iPads
  • Find a convenient and safe storage location at the museum, as the iPads are more likely to be used there
  • Create a charging system for the iPads that can be used at the museum
  • Obtain more iPads to assist in gathering additional data

2. Update and expand the use of Microsoft Forms:

Microsoft Forms were created for all of the programmes that had evaluation tools in place at the time, but more will need to be added as programmes change or new ones are created. We also recommend exploring new ways to use QR codes around the museum so that visitors can submit feedback at their convenience without needing the iPads.

  • Post QR codes for the surveys in locations around the museum
  • Introduce QR codes as the main surveying method for programmes with older audiences
  • Pull data more often so the data analysis becomes statistically significant

3. Conduct frequent evaluations to measure outcomes:

These final recommendations support the Learning Team’s future evaluation and reporting mechanisms. The museum should continue to develop its evaluation methods by prioritising the recommendations below.

  • Evaluate more often (two weeks per term) to increase the statistical significance of the data
  • Review Project Oracle’s new website and determine whether the accreditation process still works the way it did previously
  • Continue to update sub-outcomes wherever they differ from the Project Oracle submission.