Week 10: User Study


During the last week, we had two potential users of our application give us first feedback on the prototype we built over the previous weeks. With their help, the design of our prototype was tested using the so-called “Think Aloud” method. Having launched our prototype, we asked the users to react to the application as they would if they had encountered it on a public display, and to speak their thoughts aloud: what they thought was happening, what they were trying to achieve, and why they acted the way they did.

Afterwards we gave them some specific tasks to fulfill and observed their actions, in order to evaluate and improve our design decisions.

As we launched our prototype application, the users saw the following setup:


It was pretty easy for both users to quickly recognize that they were seeing a dressing room. As they stood in front of the camera and saw themselves, they intuitively began to wave their hands until they saw a red circle appear above their hands. At first they were surprised and could not figure out what that meant. After some thought, they started to place their hands over the visual buttons to see what would happen. At first nothing happened, because our application expects the user to hold a hand over a button for a certain time to actually trigger it. But after some trial and error, they observed that if they held still above a button, the circle filled up red from the inside and the buttons disappeared, showing the category of clothes they had accidentally selected. At this point, we need to think of a better way to tell the user more obviously that their hands can work this “magic”. After this first aha effect, we proceeded to give the users actual instructions.
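The hold-to-trigger behavior described above can be sketched as a simple dwell timer. The class name, bounds, and the one-second threshold below are illustrative assumptions, not our actual implementation:

```java
// Sketch of a dwell-based button: the hand must hover over the button
// for a full dwell period before the selection fires. Names and the
// threshold are illustrative assumptions.
public class DwellButton {
    private final int x, y, w, h;   // button bounds in screen pixels
    private final long dwellMillis; // how long the hand must hover
    private long hoverStart = -1;   // -1 means "not hovering"

    public DwellButton(int x, int y, int w, int h, long dwellMillis) {
        this.x = x; this.y = y; this.w = w; this.h = h;
        this.dwellMillis = dwellMillis;
    }

    /** Call once per frame with the tracked hand position and current time.
     *  Returns true exactly when the dwell period completes (a "click"). */
    public boolean update(int handX, int handY, long nowMillis) {
        boolean inside = handX >= x && handX < x + w
                      && handY >= y && handY < y + h;
        if (!inside) {           // hand left the button: reset the timer
            hoverStart = -1;
            return false;
        }
        if (hoverStart < 0) hoverStart = nowMillis; // hover just began
        if (nowMillis - hoverStart >= dwellMillis) {
            hoverStart = -1;     // fire once, then require a fresh hover
            return true;
        }
        return false;
    }

    /** Fill fraction for drawing the red circle (0.0 = empty, 1.0 = full). */
    public float progress(long nowMillis) {
        if (hoverStart < 0) return 0f;
        return Math.min(1f, (nowMillis - hoverStart) / (float) dwellMillis);
    }
}
```

The `progress` value is what drives the circle filling up red from the inside, which is the only feedback the users had during this first encounter.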


After the first setup, we gave the users the following instructions in typed form and observed their behavior:

  1. Try on a black dress and then a blue T-shirt
  2. Find out how much a blue T-shirt and black pants cost

Having learned from the first aha effect, both users performed the desired actions with ease. This was largely because both users were tech-savvy. Occasionally, though, our application had problems identifying their hands, which caused them to struggle with their hand movements and become a little frustrated.


Week 9: Software Prototype


The task of the last week of our project was to build a first functional prototype of our application.
For the implementation, our group received a PrimeSense sensor, which is similar to the well-known Kinect for the Xbox 360 (http://www.asus.com/Multimedia/Xtion_PRO).

For the implementation of our prototype we chose the following development stack:

  • Processing 2.1
  • SimpleOpenNI 1.96
  • Java 1.7

Our first implementation shows how we plan to overlay different kinds of clothing on our users. Clothes are selected via gesture control and fitted to the user via body-part detection and tracking, with the help of our PrimeSense sensor. The clothes are 3D objects, which allows scaling and rotating them, giving the user a nice view of the clothes from different angles.
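The fitting step can be illustrated with a small sketch: given the projected 2D positions of the shoulders and the torso joint, as body tracking delivers them, compute where and how large a garment overlay should be drawn. The class name and the 1.4 width factor are made-up illustrative values, not numbers from our prototype:

```java
// Illustrative sketch: place and scale a garment overlay from tracked
// joint positions. The 1.4 widening factor is an assumed constant.
public class ClothingAnchor {
    public final float centerX, centerY; // where to draw the garment
    public final float width;            // garment width in pixels

    public ClothingAnchor(float leftShoulderX, float leftShoulderY,
                          float rightShoulderX, float rightShoulderY,
                          float torsoX, float torsoY) {
        // The garment follows the torso joint...
        centerX = torsoX;
        centerY = torsoY;
        // ...and is scaled to the shoulder span, slightly widened so the
        // clothing covers the body silhouette rather than ending at it.
        float span = Math.abs(rightShoulderX - leftShoulderX);
        width = span * 1.4f;
    }
}
```

Recomputing this anchor every frame is what keeps the clothes "attached" to the user as they move in front of the sensor.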


During development of the prototype, we found that drag-and-drop selection of clothes by the user’s hand (introduced during the paper-prototype phase) is redundant. Therefore we decided to implement clothes selection simply by “clicking” on the desired clothing.
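The “click on the desired clothing” selection boils down to a hit test over the displayed items: when a click fires, pick the item whose on-screen bounds contain the hand position. The class and item layout below are hypothetical:

```java
import java.util.List;

// Illustrative sketch: pick the clothing item under the hand position
// at the moment a "click" fires. Names and bounds are hypothetical.
public class ClothingPicker {
    public static class Item {
        public final String name;
        public final int x, y, w, h; // on-screen bounding box
        public Item(String name, int x, int y, int w, int h) {
            this.name = name; this.x = x; this.y = y; this.w = w; this.h = h;
        }
    }

    /** Returns the topmost item under the hand, or null if none. */
    public static Item pick(List<Item> items, int handX, int handY) {
        // Iterate back to front: the last item drawn is rendered on top.
        for (int i = items.size() - 1; i >= 0; i--) {
            Item it = items.get(i);
            if (handX >= it.x && handX < it.x + it.w
             && handY >= it.y && handY < it.y + it.h) return it;
        }
        return null;
    }
}
```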

Currently there is only a small number of 3D clothes that we’ve found online for free. The current UI is also in its very early stages and has to be developed further.