Experiencing a museum for the first time is exciting and a fun way to learn new things. However, a visit can become confusing when there's no clear path or readily available information. You might want to know where the nearest restroom is and not know where to look. You may have questions about an exhibit, but the small card next to it doesn't offer enough explanation, and there may be no employees nearby to ask.
The AR experience needed to be immediate, inclusive, and intuitive. Our solution uses AR glasses to enhance your museum experience. Through gestures and signals, you would have access to information at every location inside the museum. Finding similar exhibits and learning about them would be simple and accessible, even for those who may not speak English as their first language.
UX/UI, AR/VR Design, Spatial Design, Accessibility
9 Weeks
Jessica Walsh
Gabi Garcia Greco
User Research, Usability Testing, Gesture Research
Figma, Blender, Adobe After Effects
Our research told us that not everyone attending a museum speaks English as their first language. It also told us that more than half of museum visitors are often left with questions about an exhibit and the information behind it. Another thing we kept in mind from our survey was that most people have been on some form of extended reality tour, mainly audio tours, so using something new like an AR headset may come more naturally to them.
We conducted a survey to learn more about what people enjoy about museums, and what they feel could be improved by using AR. We had 62 respondents aged 18-60. Some key things we discovered from our survey were:
Alongside our survey, we interviewed someone working in the UX Design field, specifically in AR/VR. They shared many helpful tips on designing an AR experience, and these key takeaways really stuck out to us:
The glasses would need to be assigned to individual museum attendees, which could cause issues when teaching people about Muse.
A tutorial may be required, since a small percentage of people may not know how to navigate the product.
The glasses would need to be accessible in every way, including the visuals, which shouldn't be overpowering. We wanted to focus on creating an experience that wouldn't cause overstimulation.
We created three personas with different mindsets, each with a different level of comfort with technology. We ran a quick sprint to figure out whether our personas would benefit more from a tablet-led experience or a smart glasses experience. We weighed the pros and cons of each device, keeping accessibility in mind, and ultimately decided our solution would be best met using AR Smart Glasses.
To understand our users' needs at each step of the experience, we designed a storyboard showcasing the different stages in the process. It helped us find holes in our experience, surfaced potential pain points, and led us to the best solution for making the experience easy to use and accessible to all. With visual cues and arrows to guide you, you could grab the glasses and set them up in a safe environment that is out of everyone's way.
After weighing the pros and cons, we thought of ways the glasses could supplement each experience throughout a museum tour. We ran through ideas for everything from paintings to textiles, developing possible features along the way. After generating a series of possible features, we were ready to develop a rough storyboard of what the experience might look like to help us step through it. We also researched proximity ranges in AR, specifically for someone who may have Down syndrome, and learned that a fully developed version of this app would need a greater margin of error for its gesture recognition.
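As a rough illustration of what that wider margin of error could look like in practice, here is a minimal sketch of recognition settings that relax based on an accessibility profile. The names, values, and structure below are hypothetical and not taken from our prototype.

```typescript
// Hypothetical sketch: widening the margin of error for gesture recognition
// based on an accessibility profile, so less precise hand movement still registers.
interface RecognitionSettings {
  positionToleranceCm: number; // how far a hand can drift from the ideal gesture path
  holdTimeMs: number;          // how long a pose must be held before it counts
}

const defaultSettings: RecognitionSettings = { positionToleranceCm: 3, holdTimeMs: 250 };
const relaxedSettings: RecognitionSettings = { positionToleranceCm: 8, holdTimeMs: 600 };

// Choose the relaxed profile when a visitor opts into a wider margin of error.
function settingsForProfile(needsWiderMargin: boolean): RecognitionSettings {
  return needsWiderMargin ? relaxedSettings : defaultSettings;
}
```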
To provide the best experience, we researched what to avoid when designing in an AR space. We had to ensure that everyone coming into this experience was comfortable, and we wanted to eliminate any waiting or pauses. This led us to our gesture research: we wanted to know which gestures are already common in everyday life that we could use to reduce potential arm fatigue. Some of our key findings during this development phase were:
With general research on what to avoid when using gestures, we could begin narrowing down the ones we could use. To start, we wanted to know what the most basic gestures are for specific actions. I ran a quick user test in a Zoom meeting with my class to see what people did in different scenarios: I asked my classmates to act out each scenario while I recorded it, then compared the results to our existing research on gestures in AR. The questions I asked were:
As we did research and interviews, we began to build a gesture library we could use. Many of these gestures are common in everyday life, and most are also used in Microsoft's HoloLens. We continued to develop the library as we discovered more gestures we needed, and we ended up animating them to create a clearer vision.
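To give a sense of how a gesture library like this could be organized, here is a minimal sketch mapping each gesture to the action it triggers and the contexts where it is active. The names and mappings are illustrative assumptions, not our final specification.

```typescript
// Hypothetical sketch of a gesture library: each entry maps a gesture
// to the action it triggers and the contexts where it is available.
type GestureName = "tap" | "zoom" | "snapshot" | "like";

interface GestureDefinition {
  gesture: GestureName;
  description: string; // plain-language cue shown during onboarding
  action: string;      // action fired when the gesture is recognized
  contexts: string[];  // where the gesture is active (e.g. "menu", "exhibit")
}

const gestureLibrary: GestureDefinition[] = [
  { gesture: "tap",      description: "Air-tap with your index finger", action: "select",          contexts: ["menu", "exhibit"] },
  { gesture: "zoom",     description: "Pinch and spread two fingers",   action: "magnify",         contexts: ["exhibit"] },
  { gesture: "snapshot", description: "Frame the view with both hands", action: "capturePhoto",    contexts: ["exhibit"] },
  { gesture: "like",     description: "Thumbs up",                      action: "saveToFavorites", contexts: ["exhibit"] },
];
```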
Here is a showcase of the menu system that will change depending on what you are looking at and doing.
Muse features a main menu that offers multiple features to choose from when looking at an exhibit. It changes depending on what is available for that exhibit, and could include viewing info about the exhibit, hearing audio explaining it, or a button to help you navigate to a similar exhibit. For the purpose of this demonstration, the user is fully onboarded and is touring the museum. The menu you see is contextual to the type of exhibit the user is looking at. At this time there is no option to listen to an audio recording, so that option is left out, but it would appear in the same menu system if available.
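One way to picture this contextual menu is as a small function that builds options only from what an exhibit actually supports, so unavailable features (like the missing audio recording) simply never render. This is a hedged sketch with hypothetical names, not the production logic.

```typescript
// Hypothetical sketch: the main menu is assembled from an exhibit's capabilities,
// so options the exhibit doesn't support never appear.
interface Exhibit {
  title: string;
  hasInfoCard: boolean;
  hasAudio: boolean;
  relatedExhibitId?: string;
}

interface MenuOption {
  label: string;
  action: string;
}

function buildExhibitMenu(exhibit: Exhibit): MenuOption[] {
  const options: MenuOption[] = [];
  if (exhibit.hasInfoCard)      options.push({ label: "About this exhibit", action: "showInfo" });
  if (exhibit.hasAudio)         options.push({ label: "Listen",             action: "playAudio" });
  if (exhibit.relatedExhibitId) options.push({ label: "Find similar",       action: "navigateToRelated" });
  return options;
}

// An exhibit with no audio recording, as in the demo above, would show only
// the "About this exhibit" and "Find similar" options.
```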
After an action gesture, different options become available depending on what you did. For example, once the user has taken a snapshot using their fingers, looking down gives them access to either saving the snapshot or deleting it. This is an example of the contextual menu that would pop up.
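The same idea can be sketched for the follow-up menu: the options are generated from the action that just happened. Again, the function and action names here are hypothetical.

```typescript
// Hypothetical sketch: a small contextual menu built from the most recent action gesture.
function buildActionMenu(lastAction: string): { label: string; action: string }[] {
  switch (lastAction) {
    case "capturePhoto":
      // After a snapshot, the user can look down and choose to keep or discard it.
      return [
        { label: "Save snapshot",   action: "saveSnapshot" },
        { label: "Delete snapshot", action: "discardSnapshot" },
      ];
    default:
      return []; // no follow-up options for this action
  }
}
```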
To showcase the experience, we needed a way to demonstrate the menu and the various gestures. Although we didn't include every gesture in our video, we used the main four: tap, zoom, snapshot, and like.
Over the course of the nine weeks I had to work on this project, I learned how important accessibility is when developing applications. This is especially true for any educational or public-facing application that will be used by a large pool of people from diverse backgrounds. Creating inclusive designs helps to resolve issues before they appear. I learned how to utilize Figma, which made usability testing even easier since I was able to prototype rapidly while using a Design System. One of the most important things I learned from creating Muse is that a Design System is extremely important. Creating a ready-to-sell product isn't possible without a system in place that could easily be picked up and used by another team.
If you would like to hear more about my process, feel free to contact me.