Hi there!

This is where I document some of the questions I've answered as a designer and researcher.
You can also find my resume here.

How do you create a successful SXSW experience?

Through adaptability, quick implementation of user feedback, and having fun with your team. We created a cognitive beer tasting experience for SXSW 2016; since then, it has been shown at multiple conferences around the world.

Know your constraints

Before we even thought about what kind of experience we wanted to create, we knew we had to reuse components from a previous project: a retail solution meant to bridge the physical and digital shopping experiences.

Customers at a store would pick up an item from the shelf, and a screen would display detailed information about that item, along with recommendations for additional items that would pair well with it. RFID sensors were placed on the products and RFID readers were placed on the shelves, so the user interaction of choosing and picking up an item was not interrupted.

Concept illustration of a man picking up a bottle.

Then we started to explore how the user could interact with the screen to dig further into the recommendations. We tested Microsoft's Kinect and a Leap Motion to see if users could navigate the experience with gestures, but neither tracked hand and arm movements in enough detail to meet user needs.

Testing gesture recognition with Microsoft's Kinect

The Apple TV 2 had recently come out, so we decided to give it a shot. Unfortunately, it became the Achilles' heel of the project: the remote was quite unintuitive and entirely hindered the experience, as you can see in the following video.

Shortly after, we were tasked with creating one of the SXSW experiences for the IBM Design Studio Hive. The time frame was short, since it needed to be finished in 4-6 weeks, so we got crafty.

Identify the key ingredients for a SXSW experience

The team began to brainstorm ideas that would highlight different technologies we were developing, but most of them felt a bit lacking in the human interaction and engagement department. I decided to spend a few minutes researching patterns or key components of a successful, engaging SXSW experience. So I went to Pinterest and searched for SXSW (obviously). Why Pinterest? It's one of the fastest ways to find recently curated content, which makes it a great tool when you're trying to quickly identify recurring themes or patterns. As you may notice in the image below, most SXSW experiences highlight an aspect of Austin's culture. That makes sense, considering that most attendees are from out of town.

Pinterest "sxsw" search results showing a bunch of things to do in Austin.

I shared this information with the team and proposed that our experience should showcase some of the things Austin has to offer. I also proposed we curate the items on the shelves, ideally choosing something Austin-related (like BBQ sauce or a guitar). That way, when a user picked up an item, the screen would show where in Austin they could go to enjoy an experience related to that object. After further brainstorming, we narrowed it down to a cognitive beer tasting experience, which would showcase IBM Watson's machine learning capabilities as well as some of Austin's best craft beers.

Break it down

We started by refining the initial concept and taking into account multiple constraints (tech availability and time) until we decided what the final experience should be:

1. The user would answer 3 questions about their food preferences.

2. Watson would recommend 3 beers based on those preferences by matching the user's flavor profile to the beer's. 

3. The user would blind taste them and rank them based on taste.

4. The big reveal: Show the user what type of beer they had actually chosen as their top choice and see how accurately Watson had matched them with their favorite type of beer.
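The matching in step 2 ran on Watson, whose internals aren't shown here, but the core idea of matching a user's flavor profile to each beer's can be sketched as a nearest-neighbor search. The flavor dimensions, profile values, and beer names below are illustrative assumptions, not the project's actual data:

```python
# Hypothetical sketch of the matching step: the real system used Watson, so
# the flavor dimensions and scoring below are illustrative assumptions.

def flavor_distance(a, b):
    """Euclidean distance between two flavor profiles (dicts of trait -> 0..1)."""
    traits = set(a) | set(b)
    return sum((a.get(t, 0.0) - b.get(t, 0.0)) ** 2 for t in traits) ** 0.5

def recommend(user_profile, beers, k=3):
    """Return the k beers whose flavor profiles sit closest to the user's."""
    return sorted(beers, key=lambda beer: flavor_distance(user_profile, beer["profile"]))[:k]

beers = [
    {"name": "amber ale",  "profile": {"malty": 0.8, "bitter": 0.3, "fruity": 0.2}},
    {"name": "ipa",        "profile": {"malty": 0.2, "bitter": 0.9, "fruity": 0.5}},
    {"name": "hefeweizen", "profile": {"malty": 0.4, "bitter": 0.1, "fruity": 0.8}},
    {"name": "stout",      "profile": {"malty": 0.9, "bitter": 0.6, "fruity": 0.1}},
]

# A user whose food answers mapped to a fruity, low-bitterness profile:
user = {"malty": 0.3, "bitter": 0.2, "fruity": 0.9}
print([b["name"] for b in recommend(user, beers)])  # hefeweizen ranks closest
```

Any distance or similarity measure over the same profile vectors would slot in here; Euclidean distance just keeps the sketch short.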

During testing we discovered that people enjoyed the experience: they didn't think it would be possible to guess what type of beer they liked based on the foods they preferred and the time of year they preferred to drink. They were also surprised by how accurately we predicted beers they would enjoy, and sometimes impressed when a type of beer they had never tried, and assumed they wouldn't like, ended up being one they really enjoyed.

Making it happen

We first had to gather a lot of data to see whether a relationship between food preference and beer preference actually existed. We began exploring these relationships through card sorting exercises, then scaled up and sent a survey out to the world. The more user data we gathered, the more accurate our predictions would be.

Once we had enough data, we worked with the developers as they tried different algorithms that would help us improve the prediction accuracy. After a few days, we were able to narrow it down to a few questions:

1. What is your favorite berry?
2. What is your favorite cheese?
3. What is your favorite dessert?
4. What time of the year do you prefer to drink beer?
5. When do you prefer to drink beer the most?

From the answers to these questions, we could make an incredibly accurate prediction of what type of beer someone would enjoy. Once we had the backbone figured out, we started working with the designers to mold the experience.
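The developers' actual algorithms aren't described here, but one simple way to narrow a long survey down to a few predictive questions is to score each question on its own: group respondents by their answer and check how often the majority beer style within each group is correct. This is an illustrative sketch with made-up survey rows, not the team's method or data:

```python
# Illustrative sketch (not the team's actual method): score how well a single
# survey question predicts beer style by grouping responses per answer and
# counting how often the majority style in each group is correct.
from collections import Counter, defaultdict

def question_accuracy(rows, question):
    """Fraction of respondents whose beer style matches the majority style
    among everyone who gave the same answer to `question`."""
    groups = defaultdict(list)
    for row in rows:
        groups[row[question]].append(row["beer_style"])
    correct = sum(Counter(styles).most_common(1)[0][1] for styles in groups.values())
    return correct / len(rows)

# Tiny made-up survey sample: each row is one respondent.
rows = [
    {"cheese": "brie",    "berry": "strawberry", "beer_style": "wheat"},
    {"cheese": "brie",    "berry": "blackberry", "beer_style": "wheat"},
    {"cheese": "cheddar", "berry": "strawberry", "beer_style": "ipa"},
    {"cheese": "cheddar", "berry": "blackberry", "beer_style": "ipa"},
    {"cheese": "blue",    "berry": "strawberry", "beer_style": "stout"},
    {"cheese": "blue",    "berry": "blackberry", "beer_style": "wheat"},
]

for q in ("cheese", "berry"):
    print(q, question_accuracy(rows, q))
```

Questions that score well alone, and don't duplicate each other, are the ones worth keeping in a short questionnaire.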

Aide running a testing session for the bar surface.

Some of our team members built a custom bar and TV display (shown in the image above). We couldn't use the RFID sensors as originally intended: each person's cup would be disposed of afterward, and placing RFID tags in hundreds of cups wasn't scalable. Instead, we came up with a more affordable solution: QR code stickers on the bottom of the cups to identify the different beers, detected with computer vision.

That way, whenever a user picked up a cup during the blind tasting part of the experience, the camera below the clear surface top would detect it, and the screen would show details for that specific beer (for example, its ABV, IBU, and flavor profile), all without revealing which type or brand of beer it was. This quickly became a wow factor, since people thought it was "magical" how they controlled what was on the screen just by picking up a cup of beer.
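The interesting design constraint above is that the screen shows stats without spoiling the blind tasting. Here is a minimal sketch of that lookup, assuming the camera-side QR decoding (e.g. with OpenCV's `cv2.QRCodeDetector`) has already produced a payload string; the payloads and beer data are hypothetical:

```python
# Sketch of the cup-detection display logic. The real rig decoded QR stickers
# with a camera under the clear bar top; here we assume decoding is done and
# model only the "reveal nothing" lookup. All payloads and values are made up.

BEERS = {
    "cup-01": {"abv": 5.2, "ibu": 18, "flavor": "crisp, citrusy"},
    "cup-02": {"abv": 6.8, "ibu": 60, "flavor": "piney, bitter"},
    "cup-03": {"abv": 4.9, "ibu": 25, "flavor": "roasty, smooth"},
}

def screen_info(decoded_payload):
    """Return only the stats safe to show during a blind tasting:
    ABV, IBU, and flavor notes, but never the beer's name or brand."""
    beer = BEERS.get(decoded_payload)
    if beer is None:
        return None  # unknown QR code: leave the screen unchanged
    return {"abv": beer["abv"], "ibu": beer["ibu"], "flavor": beer["flavor"]}

print(screen_info("cup-02"))
```

Keeping the name out of the returned dict, rather than just hiding it in the UI, makes it harder for a display bug to leak the reveal early.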

Testing, testing, 1, 2, 1, 2.

We then began to test all the interaction points of the experience: picking up the cups, visual cues on the bar top, the ranking system, comprehension of the beer information on the screen, timing of animations, sound effects, lighting, and so on. We wanted the experience to be as polished and efficient as possible, given the high volume of people that would attend the event.

We recruited users for multiple testing sessions over a couple of weeks, each time testing a new iteration of the experience. We used different testing methods, including card sorting, paper prototypes, interviews, and moderated experience testing.

One of the details we spent the longest time solving was the ranking system used to rank the 3 beer samples from "favorite" to "least favorite". Half of the design team believed we should use a "1st, 2nd, 3rd" ranking system, and the other half thought we should use stars. In the end we chose stars, since users were more familiar with that rating system from online shopping.

 

All good things must come to an end

After working closely with the designers and developers for four weeks, iterating multiple times as we validated the experience, and presenting our findings and progress to our stakeholders, we shipped the experience.

 

The future is now

The experience was a success at SXSW! We got several requests for it to be sent to other events. The problem was that setting up and breaking down the beer tasting stands was a logistical nightmare, so we set out to create a portable version instead. The challenge was to keep the immersive experience intact while making the hardware and setup smaller.

After researching RFID (again), cup holders, and weight sensors, we ended up with a weight-sensitive beer flight. We 3D-modeled a prototype and 3D-printed the casing for the components, wrote and tested an instructional PDF, and packed it all up in a Pelican case. It has now traveled around the world, including Amsterdam, NYC, Vegas, France, and more!
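The portable version swaps computer vision for per-slot weight sensing: a cup counts as "lifted" when its slot's reading drops. This is a hypothetical sketch of that logic; the slot names, threshold, and readings are assumptions, not the actual firmware:

```python
# Hypothetical sketch of the weight-sensitive flight logic: each slot in the
# 3D-printed flight has its own weight sensor, and a cup counts as "lifted"
# when its reading drops below a threshold. All names and values are assumed.

LIFT_THRESHOLD_G = 30.0  # anything lighter than this means the cup is gone

def lifted_slots(readings):
    """Given {slot: grams}, return the slots whose cup has been picked up."""
    return [slot for slot, grams in readings.items() if grams < LIFT_THRESHOLD_G]

# Three sample cups sitting on the flight; the middle one is picked up.
readings = {"left": 180.5, "middle": 2.1, "right": 175.0}
print(lifted_slots(readings))  # the display would then show the middle beer's stats
```

A threshold on absolute weight keeps the sketch simple; a real rig would likely compare against a calibrated baseline per slot to tolerate different pour sizes.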

 

Lessons learned 

These are some user behaviors we learned while working on this project:

1. People dislike interacting in public with technology that requires hand and arm gestures, especially in a semi-formal retail environment.
2. People dislike having their shopping preferences publicly displayed on a screen 27" or larger. Smaller screens are fine.
3. When using stars as part of a rating system, always provide context. Meaning, if 5 stars is the highest rating, make sure you visualize "1 star out of 5" instead of just showing a single star on its own. The latter can be confused with #1, as in 1 single star being the best option.
4. The Apple TV 2 remote interface is confusing to first time users. Especially the "back" button since it says "home" instead of "back".
5. When validating iconography, low fidelity card testing is the fastest method to gather feedback and iterate.
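Lesson 3 above is easy to encode: always render the empty stars alongside the filled ones. A tiny sketch (the function name and glyphs are my own, not from the project):

```python
# Tiny sketch of lesson 3: render a star rating with its full context so
# "1 star" can't be misread as "ranked #1".
def render_rating(stars, out_of=5):
    """Show filled and empty stars together, e.g. 1/5 -> '★☆☆☆☆'."""
    if not 0 <= stars <= out_of:
        raise ValueError("stars must be between 0 and out_of")
    return "★" * stars + "☆" * (out_of - stars)

print(render_rating(1))  # ★☆☆☆☆  (clearly 1 out of 5, not "#1")
```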

How can you interact with a VR world using your voice?
