Friday, August 21, 2015

Video Prototype: Testing and Evaluation Results

In this week's B Prac we ran our video prototype testing sessions. This post details my evaluation of the session.

Outcomes:
  1. What I asked users to do: watch the video (users were advised they could stop, rewind and replay it as they liked), answer a few questions about the video, ask or answer any additional questions, and complete a survey containing questions about the concept and the testing session.
  2. What the users actually did: 7 out of 8 users didn't replay the video, just watched it once through. Some resized the video to watch full screen. All answered my questions. Some had additional questions and feedback to add. All completed the survey correctly.
  3. The measures used: no quantitative measures were used in the testing session; only observations and notes were taken.
  4. The feedback sought: feedback on users' understanding of the game concept as communicated in the video prototype (gained from observation and the Q & A session post-viewing), and feedback about the video production quality, the game concept itself and the testing session (from the survey).

Reflection:

Overall the video prototype was a great tool for communicating the game concept and its rules without having to build anything yet. All users responded positively to the prototype, understanding the core concept of the game, and most understood the game's rules, interactions and mechanics. It was also good to get feedback on how to improve the game at this early stage of design; it will be much easier to incorporate changes now than later down the track.

Testing-wise, I found my protocol adequate for what I wanted to achieve in the time allocated. Towards the end of the session I wasn't as strict about asking the same scripted questions; I just asked the last 3 users if they had any questions or didn't understand anything. I felt this was enough to get the same level of feedback in a shorter time. The lower-scored responses to survey question 5 may have been due to this rushed testing towards the end of the session, where I didn't put as much effort into explaining the instructions, although all the participants I observed managed to watch the video, answer questions and complete the survey without any trouble.


Effectiveness:

I thought the video prototype was very effective in explaining the game concept. Overall users typically had a good grasp of the game's concept and rules after watching the video. 2 students who were not familiar with Guess Who had some trouble understanding those elements of the game. The responses to questions 3 and 7 indicate that more explanation of the guessing element was needed to help those users understand how it translates into the mashup. Explicitly showing or stating how the clues are delivered would also make it clearer to viewers how this part of the game works.


Constraints:

Time constraints meant that I wasn't as thorough towards the end of the session in giving the same instructions and asking the same questions as at the beginning. This may have resulted in collecting less verbal feedback, but I think the impact was minimal, if any.


Implications:

The confusion around juggling the guessing element and the road-crossing element at the same time has made me more aware that the game may need more support to train players how to play - maybe a tutorial, or starting at a really easy level with the difficulty increasing over time. The next testing session will be quite different and more interactive, so I'll be more observant of users' behaviour when interacting with the prototype than I was in this session.


RESULTS:

I had a total of 8 users watch my video and complete the survey. Overall the feedback was positive. After each viewing I asked a few informal questions directly:

1. What was your first impression of the video?
Comments included: "relaxing", "interesting", "good production", "clear", "good impression", "personal language used", "enjoyed the funny moments".

2/3. Did you understand the concept? Did you understand the rules?
6 out of 8 users responded positively, with no problems understanding. The remaining 2 had some trouble with the rules, especially the Guess Who elements of the game; both had very little background knowledge of that game, so they didn't understand how it translated into the mashup.

4. What parts didn't you understand?
One user didn't quite understand how the clues were to be delivered in the game. At this point I also had to explain the rules of Guess Who to the users who had never played it, and describe how the Guess Who elements work in Hop to Who.

After these questions, users filled out an online survey covering the video production quality, the game concept itself and the testing process. The questions and results are as follows:

1. How interested are you in playing this game? (Scale 1-5, 1:not at all, 5: Let me try it now!)
4, 5, 4, 5, 5, 5, 4, 4 (Average 4.5)

2. How unique do you think the game's physical interactions are? (Scale 1-5, 1:not unique at all, 5: very unique)
5, 4, 5, 4, 5, 3, 5, 5 (Average 4.5)

3. Do you have any suggestions to improve the game?
"Might be difficult playing both games at the same time coz the frog game is all about timing, but I think it's gonna be pretty fun"
"personal quibble with games that involve love stories. is there another reason the frog would meet someone? other people probably wouldn't care about this."
"through the video, I thought one of results of this game is to find the heart of Mr. Frog. So, maybe for someone, they don't need finish the whole one round of game. I think it can set a condition that anytime picking up the heart from the puzzle, gamer can win in advance"

4. How would you rate the video and audio production quality? (Scale 1-5, 1: poor, 5: excellent)
5, 5, 5, 4, 4, 4, 5, 5 (Average 4.6)

5. In the testing stage, were my instructions clear? (Scale 1-5, 1: I didn't understand at all, 5: I understood the instructions clearly)
5, 5, 3, 5, 5, 5, 4, 3 (Average 4.4)
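As a quick sanity check of the averages for the four scale questions above, here's a minimal Python sketch (the dictionary labels are my own shorthand, not part of the survey):

```python
# Raw 1-5 scale responses from the 8 participants, copied from the
# survey results above (questions 1, 2, 4 and 5).
scores = {
    "interest (Q1)":     [4, 5, 4, 5, 5, 5, 4, 4],
    "uniqueness (Q2)":   [5, 4, 5, 4, 5, 3, 5, 5],
    "production (Q4)":   [5, 5, 5, 4, 4, 4, 5, 5],
    "instructions (Q5)": [5, 5, 3, 5, 5, 5, 4, 3],
}

# Print each question's mean, rounded to one decimal place.
for label, responses in scores.items():
    mean = sum(responses) / len(responses)
    print(f"{label}: {mean:.1f}")
```

The exact means are 4.5, 4.5, 4.625 and 4.375, which round to the 4.5, 4.5, 4.6 and 4.4 reported above.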

6. Did you have enough time to ask questions or add comments after watching? (Multiple choice)
No I didn't have enough time: 8
I was given some time, but would have liked more: 0
There was plenty of time to ask questions: 0

7. Do you have any other comments?
"Was a good video! Great idea and well explained"
"Might need to consider for those who have no common sense on either games you mashup including me. Tutorial may work well to me anyway."
"great video and interesting concept. was definitely emotionally invested to save the frog from a car tire - very visual but not alarming :)"

View a summary of the responses here: 
https://docs.google.com/forms/d/13DCUIHV2wHqeu_8hDvAC27Seg8d_jv9Kc-tp_w0o2-0/viewanalytics#start=publishanalytics



