MyUI: Mainstreaming Accessibility through Synergistic User Modelling and Adaptability FP7-ICT-2009-4-248606
MyUI Games and Exercises for User Profiling: prototype Public Document
Deliverable number
D4.5
Date of delivery
08/31/12
Status
Final
Type
Prototype
Workpackage
WP4 - Mainstreaming user-model-based adaptive accessibility into devices and services
Authors
SOTE (Barnabas Takacs), Philips (Bruikman Hester), UNOTT (Johann Riedel, Rob Edlin-White), UC3M (Jose Alberto Hernandez)
Keywords
User modelling, Adaptive UI, Cognitive Games, Physical Exercises
Abstract
This document describes a series of MyUI games used to assess initial user states as well as the evolution of user profiles over time. The focus is on cognitive and physical challenges that affect how users perceive and/or control the MyUI system. Measured profiles can subsequently be used to provide an adapted interface and furthermore to make inferences on user behaviour.
2010-2012 MyUI Consortium
MyUI / FP7-ICT-2009-4-248606
D4.5 / Final
Table of Contents

1. Introduction
2. MyUI Game Applications Description
   2.1 Mapping of Games to User Profile Variables
   2.2 Games User Interfacing
   2.3 Cognitive Games for User Profiling
       2.3.1 Trail-Making-Test
       2.3.2 Cards-Pairs Matching
       2.3.3 Sudoku
       2.3.4 Corsi Block Tapping
   2.4 Physical Exercises for User Profiling
       2.4.1 The Games
       2.4.2 Preliminary Game Evaluations
       2.4.3 System Architecture of the MyUI Physical Exercise Games
       2.4.4 Pointer Device Handling
       2.4.5 Data Logger for Central Database
3. Methodology for Scoring and Future Game Integration
4. Conclusion
References
Appendix A
1. Introduction

Playing games offers a unique opportunity to assess user abilities and capabilities that relate to adaptive interfaces. The high motivation, personal interest and concentration required to carry out a game exercise push players towards their mental and physical limits. This phenomenon allows a computer system to observe the end user and draw inferences from how they play for adaptation purposes.

People often experience a wide variety of changes in capabilities, motivations, preferences, opportunities and preferred learning styles over the course of a lifetime, and this can affect their ability or willingness to use technology devices (which are normally designed by, and largely for, younger people, with limited concern for universal accessibility). Common capability changes include muscular-skeletal and neuro-motor changes affecting the ability to use certain types of controls, sensory changes which impair the ability to detect certain displays, and cognitive changes which can impair the ability to interpret displays and decide on courses of action. Some of these capabilities can be supported or compensated through the use of assistive devices or alternative task strategies.

Impairment levels and accessibility needs are not static. Many of the underlying causes of capability changes have a gradual onset and are progressive; e.g. the increasing opacity of the cornea over a lifespan reduces the ability to read small fonts in poor lighting conditions. However, there are other temporal factors which make impairment less predictable and stable; some fluctuating, some seemingly random. For instance, many older people's visual acuity varies with the time of day, replenished after sleep. It may also vary with task demands; e.g. after a long period on a task with high visual challenge the eye muscles may tire (eye strain).
Visual acuity may also change due to environmental characteristics such as general lighting levels, reflections, glare etc. Finally, a person's effective visual acuity may change with the availability of assistive devices: sometimes the correct spectacles for a particular task are not at hand, and the person attempts the task without them or with less suitable devices. There are similar fluctuations in other domains; e.g. many people with cognitive impairments have "good days and bad days", affected by mood, diet, sleep etc. Many older people follow complex regimes of medication whose cycles can also affect their capabilities.

This document describes a series of MyUI games used as exercises to assess initial user states as well as the evolution of user profiles over time. Our focus is on cognitive and physical challenges that may affect how the user perceives and/or controls the MyUI system. The key objective of the game-like exercises is to provide data on user abilities to the Context Manager, to be included in creating and maintaining a user profile. This profile can then be used to provide an adapted interface for a specific user. The information is also used to make inferences on user behaviour measured during use of other MyUI applications. This document is partly based on, and complements, R4.5.
2. MyUI Game Applications Description

The applications described in this document allow the MyUI system to create and maintain a user profile of sensory, cognitive and motor abilities in a playful manner. From an architecture and user perspective the games behave like any other MyUI application, making them unobtrusive and fun, while at the same time allowing the MyUI system (specifically the Context Manager) to build a more accurate user profile, bringing adaptations more and more in line with users' abilities. Games have to be easy to understand, enjoyable and suitable for the task of user profiling.

The underlying purpose of such exercises is to develop a standardized test battery capable of measuring user profiles for system initialization, or of overwriting already existing user profile variable values. The measures upon which this information is built fall into two major categories: they may be validated or non-validated. The former group contains exercises with a large body of pre-existing data on how thousands of users have performed on a particular task. Examples include cognitive games based on neuropsychological tests originally developed for assessing cognitive impairments, like the Trail-Making exercise. The second group, like the physical game exercises presented below, measures motor skills that may pose different limitations to each user and also reflects the particular physical setup of devices, layout and timing characteristics. These latter measures thus operate primarily on a relative scale, providing an initial assessment and tracking changes as they may occur over time.

2.1 Mapping of Games to User Profile Variables
As user performance on games influences a user's profile on cognitive and motor impairments, which profile variables a game influences, and to what extent, needs to be specified. A methodology for defining an appropriate mapping between game scores and user profiles is described in Chapter 3. As mentioned above, some of these mappings are validated and others are not. This is dealt with by providing an initial mapping and by minimizing the consequences of an invalid mapping. Compensation is done by allowing user control via manual user profile settings and by adding a user profile variable "MyUI Experience" that can influence the weight of game scores on user profile values. The mapping of game scores to user profile variables, and subsequently user profile values, is provided in Table 1 below.
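The weighting idea can be sketched as follows. The function name, the `experience` parameter and the blending rule are illustrative assumptions, not the actual Context Manager algorithm:

```javascript
// Sketch: blend a new game-derived score into an existing user profile
// value, weighted by the hypothetical "MyUI Experience" variable.
// All names and the blending rule are illustrative assumptions.
function updateProfileValue(currentValue, gameScore, experience) {
  // experience in [0,1]: the more MyUI experience a user has, the more
  // weight a fresh game score is given over the stored profile value.
  const w = Math.max(0, Math.min(1, experience));
  const blended = (1 - w) * currentValue + w * gameScore;
  // Profile values are integers in the range [0,4] (see Table 1).
  return Math.round(Math.max(0, Math.min(4, blended)));
}
```

With zero experience the stored profile value is kept; with full experience the game score replaces it outright.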
| Name | User profile area | User profile variables | Measure | Score → Value |
|---|---|---|---|---|
| Card-matching | Cognitive | Working Memory, Attention | No. of clicks | >60 → 0; 50–60 → 1; 40–50 → 2; 30–40 → 3; <30 → 4 |
| Trail-Making-Test Part A | Cognitive | Processing Speed, Attention | Time | <=53 → 0; 54–74 → 1; 75–94 → 2; 95–112 → 3; >112 → 4 |
| Trail-Making-Test Part B | Cognitive | Processing Speed, Attention | Time | <=141 → 0; 142–225 → 1; 226–308 → 2; 309–340 → 3; >340 → 4 |
| Corsi Block Tapping Test | Cognitive | Working Memory | CORSI score (length of the longest block span repeated by user × correctly repeated sequences) | >46 → 0; 31–45 → 1; 16–30 → 2; 2–15 → 3; <2 → 4 |
| Sudoku | Cognitive | Attention | Time | <10 min → 0; 10–20 min → 1; 20–30 min → 2; 30–40 min → 3; >40 min → 4 |
| Balloon | Motor | Processing Speed, Hand precision | Time | <15 sec → 0; 15–30 sec → 1; 30–60 sec → 2; 1–2 min → 3; >2 min → 4 |
| Magic Spoon | Motor | Processing Speed, Hand precision | Time, Range of motion | <15 sec and [-0.1,+0.1] → 0; 15–30 sec and [-0.1,+0.1] → 1; 30–60 sec and [-0.5,+0.5] → 2; 1–2 min and [-1.0,+1.0] → 3; >2 min and [-1.0,+1.0] → 4 |
| Painting | Motor | Processing Speed, Hand precision | Time, Range of motion | <15 sec and [-0.1,+0.1] → 0; 15–30 sec and [-0.25,+0.25] → 1; 30–60 sec and [-0.5,+0.5] → 2; 1–2 min and [-1.0,+1.0] → 3; >2 min and [-1.0,+1.0] → 4 |
| Gravity Wells | Motor | Hand precision | Range of motion | [-1,+1] → 0; [-0.5,+0.5] → 1; [-0.25,+0.25] → 2; [-0.1,+0.1] → 3; [-1,0.0] or [0.0,+1] → 4 |
| Curved Ball | Motor | Hand precision | Range of motion | [-1,+1] → 0; [-0.5,+0.5] → 1; [-0.25,+0.25] → 2; [-0.1,+0.1] → 3; [-1,0.0] or [0.0,+1] → 4 |

Table 1: Summary of user profile variables mapping.
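The threshold mapping in Table 1 is straightforward to implement; as a sketch, the Trail-Making-Test time-to-value mapping could look like this (the function name is our own, the thresholds are those of Table 1):

```javascript
// Sketch: map raw Trail-Making-Test completion times (in seconds) to
// the profile values 0-4 given in Table 1 for Parts A and B.
function tmtValue(seconds, part) {
  // Part A and Part B use different time thresholds (see Table 1).
  const thresholds = part === 'A' ? [53, 74, 94, 112] : [141, 225, 308, 340];
  let value = 0;
  for (const t of thresholds) {
    if (seconds > t) value += 1;   // each exceeded threshold raises the value
  }
  return value; // 0 (fastest band) .. 4 (slowest band)
}
```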
2.2 Games User Interfacing
From a practical and implementation point of view, the role of games for user profiling in the MyUI framework builds essentially on a set of basic games offered to a new user upon logging in. These provide a basic estimate of the user's cognitive and motor skills, which subsequently aids the Interface Adaptation Engine in producing a first user interface proposal. All games have been developed in Flash and JavaScript, and they integrate easily with the CakePHP framework. To help visualize how this works, Figure 1 shows the welcome menu for a user without any impairments. As shown, the second row, fourth column offers the application Games, which redirects the user to the games available to be played. The games portal has been developed as a CakePHP plugin (see Deliverable D4.2 for a review of CakePHP fundamentals). The games menu offers the user a large number of games, each focused on capturing different features of the user, to feed the Context Manager. In the sections that follow we describe in detail the structure and operation of the games currently implemented in the MyUI framework.
Figure 1: Main menu for a user without impairments offering the option of playing games for user profiling
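The reporting path from a game to the back end can be sketched as follows; the payload fields and the serialization are hypothetical illustrations, not the actual MyUI API, which is implemented as a CakePHP plugin (see D4.2):

```javascript
// Sketch: how a Flash/JavaScript game could package its results for the
// MyUI back end. Field names are hypothetical; in the browser this
// object would be POSTed (e.g. via XMLHttpRequest) to the games plugin.
function reportGameResult(gameId, userId, measures) {
  const payload = {
    game: gameId,
    user: userId,
    measures: measures,            // e.g. { timeSec: 82, errors: 3 }
    timestamp: new Date().toISOString(),
  };
  // Return the serialized message a POST request would carry.
  return JSON.stringify(payload);
}
```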
2.3 Cognitive Games for User Profiling

Cognitive games are based on neuropsychological tests originally used to assess mental impairments of psychiatric patients. As such, they offer validated measures backed by a large statistical corpus of user performance data. In the context of MyUI they provide end users with simple but challenging tasks; while certain performance characteristics are measured, the results later modify the default user profile.
2.3.1 Trail-Making-Test

The first type of validated cognitive game in the MyUI system is Trail-Making (TM). One of the key advantages of TM is that it is easy to administer and understand. It was originally part of the US Army Individual Test Battery (1944), developed by psychologists, and was further improved and used in a variety of test batteries (e.g. the Halstead-Reitan Battery). TM provides reliable measurement of visual attention, speed of processing, cognitive flexibility and task-switching abilities (based on time needed and errors made). Since various normative data sets are available as reference, a new user-centered adaptive interface becomes possible by comparing current performance with historic results. Furthermore, it has been shown that TM is sensitive to a variety of impairments [LeHL04] and that overall performance is affected by age and education [Tomb04], both of which are important in the MyUI framework. In the TM game the subject is required to connect 25 circles on the screen in consecutive order, as quickly and as accurately as possible. In part A of the test, circles are numbered from 1 to 25. In part B, the subject has to alternate between numbers and letters (1 – A – 2 – B …). This is demonstrated in Figure 2.
Figure 2: Trail-Making-Test (Part A – Left / Part B – Right).
More specifically, the user must click on the circles in increasing order: find circle number 1 and click it, then circle number 2, then circle number 3, and so on until all 25 circles have been clicked. The game measures three distinct parameters:

- the time the user takes to complete the game;
- the number of out-of-sequence clicks;
- the number of clicks outside the circles (see Figure 2 above).
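Counting the two error types from a click log can be sketched as follows; the input format (each click recording the circle hit, or null for a miss) is an illustrative assumption:

```javascript
// Sketch: count Trail-Making errors from a click log. Each entry is the
// number of the circle that was hit, or null for a click outside any
// circle. The data structure is an illustrative assumption.
function trailErrors(clicks) {
  let expected = 1;                 // next circle in the sequence
  let outOfSequence = 0;
  let outsideCircles = 0;
  for (const hit of clicks) {
    if (hit === null) {
      outsideCircles += 1;          // clicked an empty screen area
    } else if (hit === expected) {
      expected += 1;                // correct circle: advance the sequence
    } else {
      outOfSequence += 1;           // clicked the wrong circle
    }
  }
  return { outOfSequence, outsideCircles };
}
```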
As described above, these three parameters capture several cognitive and motor skills of the user, and can be used to establish a first estimate of some user profile variables as numeric values in the range [0,4], for instance:

- Attention
- Processing Speed

Further details on how measured raw variables are translated into meaningful user profile variable values are discussed in Chapter 3.

2.3.2 Cards-Pairs Matching

In this game the player must select two squares at a time. The fruits behind the squares are first shown for a few seconds; the player must then remember the position of each fruit and select them in pairs. This game measures the time required to complete the game and the number of selections needed to complete it.
Figure 3: Cards Pairs Matching game in the MyUI system.
2.3.3 Sudoku

This is a conventional Sudoku game, where the player must fill each square with a number from 1 to 9 such that no number is repeated in the same row, column or 3x3 square. This game measures the number of errors and the time required to complete each game.
Figure 4: Playing Sudoku in the MyUI system.
2.3.4 Corsi Block Tapping

In this game, squares light up red one after another, in a sequence of two or three squares. The user must then click on the squares in the same order as the sequence appeared. This game measures the time to complete the game and the number of errors made by the user.
Figure 5: MyUI Screenshot of Corsi Block Tapping.
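The CORSI score in Table 1 is defined as the length of the longest block span the user repeated correctly, multiplied by the number of correctly repeated sequences. A sketch, where the input format (an array of trials with the shown sequence and the user's answer) is an assumption:

```javascript
// Sketch: CORSI score = longest correctly repeated block span ×
// number of correctly repeated sequences (see Table 1).
function corsiScore(trials) {
  let longestSpan = 0;
  let correct = 0;
  for (const { shown, answer } of trials) {
    // A trial counts only if the user's clicks match the shown sequence.
    const ok = shown.length === answer.length &&
               shown.every((block, i) => block === answer[i]);
    if (ok) {
      correct += 1;
      longestSpan = Math.max(longestSpan, shown.length);
    }
  }
  return longestSpan * correct;
}
```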
2.4 Physical Exercises for User Profiling
To evaluate fine motor skills, and to assess how these variables evolve over use of the system, the MyUI platform contains a set of physical motion exercises designed to provide an increasing level of difficulty in motion patterns and coordination abilities. These MyUI physical exercise games can be accessed via the integrated platform (see sections below) or installed as a separate application and accessed over the Internet with a simple web browser, as shown in Figure 6. The overall theme of these games is called "Scratch Games", an activity quite popular with most people, especially the elderly. The scratching motion is carried out with the help of a mouse or pointing device (even a simple TV remote, as discussed later), and employs a gesture recognition interface to analyze the timing and shape of predefined motion patterns.
2.4.1 The Games

We have developed a total of five exercises to demonstrate how increasing levels of difficulty in motion patterns may be used to assess user abilities. These MyUI physical exercises aim to mirror real-life interests and activities with a little twist and, in a way, a sense of humor. These features were designed based on our past experience in working with the elderly, where we found that thematic exercises that exploit basic human interests and curiosities, like those founded on esoteric phenomena, better motivate users to carry on with a given task. Finally, an additional third-party-interface layer was also incorporated, allowing future developers and independent teams to integrate their own applications while still providing user assessment as a service. This latter case is demonstrated using the Trail-Making-Test introduced earlier in this document.
Figure 6: Physical exercises summary screen for user ability assessment.
EXERCISE #1 / Balloon. The balloon exercise consists of a combination of three short animated videos showing the motion of a flying balloon. Users are requested to carry out the simplest possible movement by controlling the cursor with the help of the mouse or a remote pointing device. When the exercise starts, it first shows a boat approaching the take-off site (intro video). During this sequence, dramatic context and instructions on what will happen and what to do are given to the user in the form of a subtitle line. Once the intro has ended, the system enters its exercise phase, which is designed to progress its video timeline via the available controllers. Specifically, the user only needs to perform simple UP-AND-DOWN motions over the image. These gestures are recognized and, if successful, the balloon starts flying and goes higher and higher. By maintaining this motion pattern over a period of time the balloon keeps flying higher; motivation is given in the form of wishing to reach the other side and land there. This allows the MyUI platform to assess the vertical motion range of the wrist when using a pointing device. To reach the highest point the user needs to carry out the required motion pattern many times. Progress is visible from the video, and encouragement messages are shown in the subtitle line. When the top, above the clouds, is finally reached, the system plays an outro video, a flying and landing sequence in this case, as instant
feedback to the user that the exercise has ended and also to provide a reward. See Figure 7. (NOTE: The actual video snippets were taken from a commercially available video game, Journey to the Wild Divine, for demonstration purposes. These videos may be replaced with any other content in future use.)
Figure 7: Physical exercises / Balloon Game three stages.
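Detecting the repeated up-and-down strokes from sampled pointer positions can be sketched as follows; the minimum-travel threshold and the sampling format are illustrative assumptions, not the actual MyUI gesture recognizer:

```javascript
// Sketch: count UP-AND-DOWN strokes from sampled pointer Y positions.
// A stroke is counted each time the vertical direction reverses after
// travelling at least minTravel pixels (threshold is an assumption).
function countUpDownStrokes(ySamples, minTravel) {
  let strokes = 0;
  let direction = 0;   // +1 moving down (y grows), -1 moving up, 0 unknown
  let turnY = ySamples.length ? ySamples[0] : 0; // y at last turning point
  for (const y of ySamples) {
    const d = Math.sign(y - turnY);
    if (d !== 0 && d !== direction && Math.abs(y - turnY) >= minTravel) {
      strokes += 1;    // direction reversed far enough: count a stroke
      direction = d;
      turnY = y;
    } else if (d === direction) {
      turnY = y;       // keep tracking the extreme of the current stroke
    }
  }
  return strokes;
}
```

Small jitter below the travel threshold is ignored, so trembling of the hand does not register as extra strokes.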
EXERCISE #2 / Magic Spoon. This exercise increases the complexity of wrist motion patterns and their variations. The motivational force behind it is people's interest in esoteric experiences. Specifically, users may bend a spoon by performing a series of GESTURE ACTIONS as prompted by the MyUI physical exercise's virtual TV display, shown on the left of the image. Note that this display defines a controllable-size area over which the requested gestures need to be performed: should the pointing device, and consequently the mouse cursor, fall outside the area of interest, gestures are ignored. This feature was incorporated to allow the MyUI system to measure hand positioning and pointing capabilities. As illustrated in Figure 8 and Figure 9, respectively, users are prompted to carry out simple gestures, such as a circling motion, left/right sequences, etc. After each gesture is successfully recognized, the spoon is bent a little and a new gesture is displayed on the monitor. For the purpose of the exercise we used the four simple gestures shown here.
Figure 8: Physical exercises / Four basic coordinated gestures used to measure wrist motion range and hand pointing precision in the Spoon Bending game.
Figure 9: Physical exercises / Main display of the Spoon Bending game. The Virtual TV is shown on the left, the bending spoon on the right.
EXERCISE #3 / Paintings. In this exercise an even more complex set of motions and hand coordination is required to succeed. Nine tiles, each showing a different gesture to perform, cover a hidden painting. The game is based on a classic and familiar guessing-game format, where each tile uncovers a different portion of the underlying image and users may guess what the picture represents. In the MyUI framework, the goal is to uncover the entire image while performing the gestures in random order. This involves ACCURATE POSITIONING, moving the cursor over the respective tiles with the help of the pointing device, and subsequently carrying out the COMPLEX GESTURE illustrated on the panel. Gestures of arbitrary length may be defined. The starting state and an intermediate state of the game are shown in Figure 10.
Figure 10: Physical exercises / Two phases of the Paintings game revealing a hidden image beneath.
EXERCISE #4 / Gravity Wells. This exercise measures FREE EXPLORATORY HAND MOTION in the MyUI games framework. Exploration, in this context, means wandering in 2D screen space and accessing all corners and edges of the work area. The game involves controlling a colorful swirl of particles governed by the X and Y position, as well as the speed, of the mouse motion over the image. Users are encouraged to generate varied and ever-changing patterns, with periods of holding the pointing device in one place before moving on to others. The underlying simulation involves multiple gravitational attractors whose dynamic and chaotic properties generate an infinite number of intricate and esthetic patterns, which is a key motivational factor for the exercise (refer to Figure 11). The time of exploration and the screen-space distribution, as well as potential gaps in it, indicate the accessibility of each area.
Figure 11: Physical exercises / Gravity Wells are swirling particles governed via mouse motion over the 2D screen plane to measure free exploratory hand motion.
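The screen-space distribution can be estimated by bucketing pointer samples into a coarse grid; empty cells then suggest regions the user did not (or could not) reach. The grid size and data format below are illustrative assumptions:

```javascript
// Sketch: fraction of screen area covered by logged pointer samples,
// computed on a cols x rows grid. Cells never visited point to possible
// range-of-motion limitations.
function coverageMap(samples, width, height, cols, rows) {
  const visited = new Set();
  for (const { x, y } of samples) {
    const cx = Math.min(cols - 1, Math.floor((x / width) * cols));
    const cy = Math.min(rows - 1, Math.floor((y / height) * rows));
    visited.add(cy * cols + cx);    // mark this grid cell as reached
  }
  // Fraction of grid cells the pointer has entered at least once.
  return visited.size / (cols * rows);
}
```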
EXERCISE #5 / Curved Ball. The final MyUI physical exercise encourages and measures unstructured hand motion. It takes the form of a free-hand ball game, as demonstrated in Figure 12, where users play a squash-like game in a 3D environment. The purpose of such an exercise is to map the operational space and range-of-motion capabilities of the user onto the profile variables of the framework. Plotting the targeted areas of the game, i.e. where the virtual ball passes the return line, and indicating whether the ball was hit or missed, creates a map of navigational capabilities in terms of physical skills, position accuracy and range of motion. This exercise also demonstrates the Third Party Game Plugin capabilities: the actual Curved Ball game itself was not developed by MyUI members, but downloaded from an open website.
Figure 12: Physical exercises / Curved Ball game measuring physical skills, position accuracy and range via unstructured hand motion.
The physical exercise games described above measure the hand precision user profile variable. The exercises are set up to log all raw interaction information into an SQL database. The evaluation algorithms analyze the mouse motion patterns, characterizing the accuracy of pointing and gesture execution, and the motion curve for stability, smoothness, speed of approach and its extremes, in order to find the range of motion and possible limitations of the user. These parameters, summarized in Table 2 below, are iteratively mapped onto hand precision values; with no statistically significant background data yet available, the mapping function results from the initial assessment studies described in the following section.

| GameID | Difficulty | Measures | Variables | Note |
|---|---|---|---|---|
| 1. Balloon | 1 | UP & DOWN wrist motion | Time, Y range, speed and range | Simple HW calibration |
| 2. Spoon | 2 | SIMPLE GESTURES and coarse positioning | Time, XY range, curve gesture accuracy | Involves cognitive processing |
| 3. Paintings | 3 | COMPLEX GESTURES and high-precision POINTING accuracy | Time, XY range, curve gesture accuracy | Involves cognitive processing |
| 4. Gravity Wells | 4 | FREE EXPLORATORY HAND MOTION and range | Range of motion, speed of 2D screen area access | Voluntary exploration, relaxed state |
| 5. Curved Ball | 5 | UNSTRUCTURED HAND MOTION in a task-driven, high-precision context | Range of motion, precision, response time, speed of 2D screen area access | Task driven, performance oriented |

Table 2: Summary of measured user profile variables for physical exercises.
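Simple smoothness and speed statistics of the kind such an evaluation could compute are sketched below. The metric (mean speed plus mean absolute change of speed between segments) is an illustrative choice, not the actual MyUI algorithm; samples are assumed to be `{x, y, t}` with `t` in seconds:

```javascript
// Sketch: speed and "jerkiness" statistics over a logged mouse path.
function motionStats(samples) {
  const speeds = [];
  for (let i = 1; i < samples.length; i++) {
    const dx = samples[i].x - samples[i - 1].x;
    const dy = samples[i].y - samples[i - 1].y;
    const dt = samples[i].t - samples[i - 1].t;
    speeds.push(Math.hypot(dx, dy) / dt);   // segment speed, px/s
  }
  const mean = speeds.reduce((a, b) => a + b, 0) / speeds.length;
  // Jerkiness: average absolute change in speed between segments;
  // a smooth, steady motion yields a value near zero.
  let jerk = 0;
  for (let i = 1; i < speeds.length; i++) jerk += Math.abs(speeds[i] - speeds[i - 1]);
  jerk /= Math.max(1, speeds.length - 1);
  return { meanSpeed: mean, jerkiness: jerk };
}
```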
2.4.2 Preliminary Game Evaluations

Aims and Objectives: The end-user validation studies of the MyUI system, working with older participants in field settings, aimed to produce an in-depth evaluation of the system in its near-final form. Furthermore, they attempted to focus as much as possible on MyUI's underlying concepts (achieving accessibility through automatic adaptation based on sensor input, user profiling, design patterns and dynamic rendering), rather than the applications, peripherals or superficial aspects of the user interface. After completion of the project, the study design will be refined and provided to UC3M for a final evaluation of the system. This study therefore provides real results and also acts as a pilot for the final evaluation. In addition, some validation findings had already been produced as a by-product of earlier studies. Some key design features are as follows:

- We worked in field settings, and our methods were influenced by social research, not just by the more empirical methods of traditional HCI.
- We did not attempt to create randomised controlled trials involving comparison of adaptive and non-adaptive systems.
- We worked in depth with a small number of carefully selected participants with heterogeneous abilities.
- Participants were given task-based instructions and initially left to explore how to achieve them with the technology.
- Many of the findings were based on participants' subjective responses to questions: formal questionnaires, scripted questions and semi-structured interviews.
We supplemented this with observations by researchers, and expert assessments by usability experts and ageing experts. Many of our observations and measures are subjective and qualitative. We continued our ethical and participant-centred approach.
Results: The University of Nottingham Human Factors Research Group planned case studies of five older participants, selected to exemplify a variety of computer and MyUI expertise and a variety of impairments. Each participant session lasted 2.5 to 3 hours, involving one older participant and three researchers. One researcher was dedicated to operating the MyUI technology; the other two conducted the study and made detailed field notes. The results reported below are based on detailed analysis of the observer notes, with no formal analysis of the audio and video material. Table 3 summarizes question responses in the MyUI pre-validation studies for a subset of the total range of questions, as well as good-and-bad observations, along with responses on Acceptance, Usability and Utility. Further details may be found in Appendix A.

SUS-Plus
Recorded scores (1 = strongly disagree; 5 = strongly agree), derived scores (0 = bad; 4 = good) and totals (score out of 20):

| | Statement | Recorded P1–P5 | Derived P1–P5 | Total |
|---|---|---|---|---|
| Q1 | I felt like the system was making changes which I had no control over | 2 3 4 3 4 | 3 2 1 2 1 | 9 |
| Q2 | I think that I would like to use this system frequently | 4 4 4 4 3 | 3 3 3 3 2 | 14 |
| Q3 | I thought there was too much inconsistency in this system | 1 4 2 3 4 | 4 1 3 2 1 | 11 |
| Q4 | I found it useful to be able to change the display myself | 5 5 4 4 4 | 4 4 3 3 3 | 17 |
| Q5 | I didn't like the way the system changed by itself | 2 5 5 3 4 | 3 0 0 2 1 | 6 |
| Q6 | I thought the system was easy to use | 4 5 5 5 2 | 3 4 4 4 1 | 16 |
| Q7 | I think that I would need the support of a technical person to be able to use this system | 1 3 1 2 4 | 4 2 4 3 1 | 14 |
| Q8 | I think the separate applications are well integrated into the whole system | 5 4 4 4 3 | 4 3 3 3 2 | 15 |
| Q9 | I found the system unnecessarily complex | 1 4 2 3 4 | 4 1 3 2 1 | 11 |
| Q10 | I would imagine that most people would learn to use this system very quickly | 4 2 5 2 4 | 3 1 4 1 3 | 12 |
| Q11 | I found the system very cumbersome to use | 1 3 2 3 4 | 4 2 3 2 1 | 12 |
| Q12 | I felt very confident using the system | 5 3 3 4 2 | 4 2 2 3 1 | 12 |
| Q13 | I needed to learn a lot of things before I could start using this system | 1 5 2 3 4 | 4 0 3 2 1 | 10 |
| Q14 | I felt in control of the changes that occurred | 4 4 4 3 2 | 3 3 3 2 1 | 12 |

Table 3-a: Summarized question responses in MyUI pre-validation studies.
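The derived scores in Table 3-a follow the standard SUS convention: for a positively worded statement the contribution is the recorded score minus 1, and for a negatively worded one it is 5 minus the recorded score; the total is the sum over the five participants. As a sketch:

```javascript
// Sketch: SUS-style normalization of recorded 1-5 responses into the
// 0-4 "derived" scores of Table 3-a, plus the per-statement total.
function deriveScores(recorded, positive) {
  // positive = true for positively worded statements (e.g. Q2),
  // false for negatively worded ones (e.g. Q1).
  return recorded.map(r => (positive ? r - 1 : 5 - r));
}
function statementTotal(recorded, positive) {
  return deriveScores(recorded, positive).reduce((a, b) => a + b, 0);
}
```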
Good things P1 Clarity: ease of understanding Likes the use of colours; enhanced clarity. If something needed attention, it was in a different colour. The balance of the whole system P2 Colours quite pleasing to the eye (If using remote control) "ease of application" was good, once you now what you're doing Like the fact that it's on a TV screen and not a small laptop P3 that it's on a TV Big screen very clear P4 Format of the home page speed of access to different applications that most of the applications are relevant P5 graphics clear once you've learned what they mean (none) (none)
Bad things P1 Didn’t know what TTS icon was Would like text on all buttons, including settings and TTS (None) P2 Inconsistent at the moment, Still a prototype, not complete Some of the wording; e.g. "acuity". Prefer plain English Icon no 8 (tools) ‐ needs a word. So does the man shouting (TTS icon) P3 When it changed without my control (none) (none) P4 personal settings menu? (none) (none) P5 terminology unfamiliar the symbols ambiguity‐ not clear
Table 3-b: „Good- and Bad” in MyUI pre-validation studies. P1
P2
P3
P4
P5
Y Y Y Audio Y Y
? Y ?? Y typing Y Y
Y N Y text Y Y
N? Y Y text Y Y
OK N Y audio OK N
Y Y
Y N
Y Y
Y Y
Y N
N Y Y N
Y N Y N
N N Y N
Y Y N N
Y N N N
Y better Y Y Y
Y easier Y Y Y
Acceptance A1 A2 A3 A4 A5 A6
Did you like the email application? Did you like the way the email adapted to your needs? Did you like the different ways of replying? Which method would you use most regularly? Were you happy for the system to save your personal capability information? Do you feel your personal capability information is securely saved on the MyUI system?
Usability Us1 Did you find the email application easy to use? Us2 Was it obvious how to do things? Us3 Did you experience any problems while using email (adapt question according to whether opening/reading etc) Us4 Did you feel in control of the changes that occurred? Us5 Are there any other features you would like including? Us6 Are there any improvements you would make?
Utility Ut1 Ut2 Ut3 Ut4 Ut5
Have you ever written an email before? If so, how does the MyUI email compare to the system you used before? Did you find the email easy to access? Was the display clear? Did the system adapt appropriately to your needs (e.g. eyesight)?
Y Y N simpler different N/A Y with help Y Y Y Y Y N Y Y
Table 3-c: Participant responses in terms of Acceptance, Usability and Utility in MyUI pre-validation studies.
Nearly all of the participants could understand how to play the tested games with minimal instruction, and found them easy to play (apart from problems controlling the pointer with the TrackIR device); most enjoyed playing them. About two thirds of them would play the games again. A few had concerns about the use of games to estimate personal capabilities and about the use of MyUI interactive TV technology to store such information, but most were happy with the idea. Automatic adaptation of the interface based on game performance or other sensors was less well accepted: some liked the experience (or the idea); many did not. Most liked to feel in control, and many liked the idea of using personalisation menus to specify their accessibility needs or capabilities. In the pre-validation study there was a strong preference for self-adaptation rather than automatic adaptation; this preference was less marked in the games study. The adaptations triggered by game use during the games study would have benefitted from further tuning and calibration. Overall, these results indicate that the games should not change the user interface automatically. Because game results are processed by the Context Manager and the adaptation engine, the current MyUI approach is in line with this requirement: the adaptation engine includes dialogue management that asks for implicit confirmation (with an undo option) or explicit confirmation (asking permission before a change). The main relevant findings from the Games Study (n=17) and the Pre-Validation Study (n=5) will be reported in D5.1.

2.4.3 System Architecture of the MyUI Physical Exercise Games

The games are Flash files, integrated into the MyUI system via a CakePHP plugin. In addition, the MyUI physical exercises have an architecture of their own to allow for the collection of game-control data from users. The MyUI Physical Exercises game system has three main components.
The first is the Flash framework that controls the animated exercises; the second is a resident program, called MyUIDevices, that handles local device actions and content and maps them onto mouse motions; the third is a remote Central DataBase logger that captures and stores raw interaction data for user variable mapping and for future or third-party analysis.

The Flash framework comprises a Flash-based application (MyUI_player.swf) embedded into an HTML track controller page (MyUI_Exercises.html, or CakePHP in the integrated version) and external XML files that direct the execution and layout of specific exercises (ScratchGames.xml). The MyUIPlayer itself is built using the Adobe Flex SDK 2.0.1 and PaperVision3D 2.0; the development environment is FlashDevelop 3.0 or above. Adobe Flex is an environment built on ActionScript 3 and MXML, plus the tools for compiling programs written in these two languages, and consists of a set of core classes. ActionScript 3 itself is a dynamically typed, memory-managed, object-oriented, event-driven, single-threaded language with a syntax very similar to Java. The core AS3 library is implemented by the Flash interpreter on all platforms and therefore runs as native code; the API is a set of interfaces. MXML is a declarative language for constructing the user interface and setting up basic interaction between the elements in XML syntax. More complex behaviour is defined in connected AS3 classes and files. MXML is not strictly necessary to create Flex applications, because all MXML files can be mechanically translated into AS3; it merely provides a more concise means of creating the GUI.

More documentation, including tutorials, can be found at the following links:
Adobe LiveDocs: http://livedocs.adobe.com/
AS3 Language Reference: http://livedocs.adobe.com/flash/9.0/ActionScriptLangRefV3/
Flex 2 Language Reference: http://livedocs.adobe.com/flex/2/langref/
Tour de Flex (an Adobe AIR application demonstrating Flex GUI widgets): http://www.adobe.com/devnet/flex/tourdeflex/

Third-party libraries used in our code:
BulkLoader: http://code.google.com/p/bulk-loader/
Tweener: http://code.google.com/p/tweener/
AS3CoreLib: http://code.google.com/p/as3corelib/

PaperVision3D is a 3D rendering library written entirely in AS3. Even though it is under continuous development, it is fairly stable. Nonetheless, regular source code updates are discouraged, because bugs come and go quickly and can lead to surprising behaviour in our code base. We maintain our own version of PV3D, a branch of the PV3D trunk containing a few bug fixes of our own. Equivalent fixes usually appear sooner or later in the trunk, so updates should be followed by checking whether these fixes are still valid.

Directory Layout

The code base of MyUIPlayer is organized in the following directories:
assets/ – image files used as icons and textures, embedded into the compiled SWF application;
overlays/ – standalone Flash apps that can be used as dynamic overlays in MyUIPlayer programs;
src/ – source files of our own code as well as third-party libraries;
Tools/ – miscellaneous tools (currently: ZoomViewer for generating zoomable image tiles);
css/, javascript/, php/, reports/ and chats/ – files for web page integration;
docs/ – documentation.
The source directories are organized as follows:
br/com/stimuli/loading/ – BulkLoader, for managing the loading of a number of external files;
caurina/transitions/ – Tweener, for smooth transitions;
com/serialization/json/ – AS3 object to JSON translation for communication with the server, used in logging;
de/flamelab/util/ – sprintf for AS3 (from corelib);
it/sephiroth/expr/ – arithmetic expression evaluation, slightly customized for our own purposes, needed for PhysX/Dynamics triggers;
threelbmonkeybrain/collections/ – AS3 hashset library;
net/ – core files:
- flex/ – our own Flex components (GUI elements);
- papervision/ – our own PaperVision3D classes (those beginning with PV are slightly modified versions of the corresponding core PV3D classes);
- time/ – timeline-related classes;
- util/ – utility classes;
- zoomviewer/ – the zoomviewer Flex component, which can also be compiled as a standalone app;
- MyUIPlayer/ – the main classes of the application.
dialogs/ – MXML files defining dialog and popup windows;
events/ – custom AS3 events;
logging/ – classes taking care of server-side logging.
Core Files

The main entry point of the application is defined in MyUIPlayer.mxml and the corresponding AS3 file, MyUIPlayer.as. These two files define a subclass of mx.core.Application. MyUIPlayer defines the main GUI elements and provides a fairly thin wrapper around the main class of the application, MyUIViewer, defined in MyUIViewer.as. Further classes in src/net/panocast/MyUIPlayer/:

SplashScreen – the Flash preloader displayed while the application is loaded and initialized;
TrackManager – the class responsible for most click and camera track management, including frame editing (except the GUI);
PhysXData – contrary to its name, in addition to serving as a data store for PhysX/Dynamics descriptions, it is also responsible for managing trigger conditions and events;
CreditManager – a simple class managing credit validation and access to managed content by communicating with its server-side peer;
EditorControls – GUI and behaviour of editor controls; when enabled, it relies heavily on TrackManager to perform the requested frame edits.

Device management (i.e., time and speed controllers, tracker devices, etc.) is handled in MyUIViewer itself, although it should eventually be extracted into separate classes.

Overlays

There are two types of overlays: 3D and flat. 3D overlays can be added to application tracks in 3D; flat overlays can be either rendered over the viewport in a 2D window or on a 3D panel in a panoramic track. Both Flex and plain AS3 projects are supported. All overlays should dispatch an event with the name set to "update" every time the screen must be redrawn; the player subscribes to this event in the overlay automatically. If the main class exports a public method called init(), it will be called with a reference to the MyUIViewer object as its parameter. This can be used to access the public properties and methods of MyUIViewer from the overlay. Similarly, if a method called destroy() is exported, it is called when the overlay is removed from the scene. In addition to init() and destroy(), overlays may have other public attributes and methods which can be accessed and called from the PhysX/Dynamics descriptions.
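The overlay contract described above (dispatching an "update" event, optional init() and destroy() methods) can be sketched as follows. The real overlays are ActionScript 3 running on Flash's event system; the JavaScript dispatcher, ClockOverlay and ToyPlayer below are simplified stand-ins written for illustration, not MyUI code.

```javascript
// Minimal stand-in for Flash's event dispatcher.
class Dispatcher {
  constructor() { this.listeners = {}; }
  addEventListener(name, fn) {
    (this.listeners[name] = this.listeners[name] || []).push(fn);
  }
  dispatchEvent(name) {
    (this.listeners[name] || []).forEach(fn => fn());
  }
}

// A flat overlay following the contract: it dispatches "update"
// whenever its image changes, and exports init()/destroy().
class ClockOverlay extends Dispatcher {
  init(viewer) {        // called by the player with the MyUIViewer reference
    this.viewer = viewer;
    this.redraws = 0;
  }
  tick() {              // overlay-internal state change
    this.redraws += 1;
    this.dispatchEvent("update");  // ask the player to redraw the screen
  }
  destroy() {           // called when the overlay is removed from the scene
    this.viewer = null;
  }
}

// A toy "player" that subscribes to the overlay's update event
// and calls init() with itself, mirroring the behaviour described above.
class ToyPlayer {
  constructor() { this.redrawCount = 0; }
  addOverlay(overlay) {
    overlay.addEventListener("update", () => { this.redrawCount += 1; });
    if (typeof overlay.init === "function") overlay.init(this);
  }
}
```

The important property is that the overlay never redraws the screen itself; it only signals the player, which owns the render loop.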
3D Overlays

3D overlays must be PaperVision3D applications. More specifically, they must export an attribute called do3d of type DisplayObject3D. This display object is added to the 3D scene by the player when the overlay is loaded.

Flat Overlays

Flat overlays are plain Flex or AS3 Flash files. The image they create is rendered on a panel or in a 2D window, depending on how they are added to the project.

Required software/libraries

1. Source code AS (ActionScript and MXML files)
2. Java 1.6 or above (http://java.com/en/download/index.jsp?cid=jdp84244)
3. FlashDevelop 3.0 (http://flashdevelop.org, see Releases thread)
4. Debug Flash Player (http://www.adobe.com/support/flashplayer/downloads.html)
5. Flex 2.0.1 SDK Hotfix 1 (newer versions untested but possibly good; http://labs.adobe.com/technologies/flex/sdk/flex2sdk.html, http://kb.adobe.com/selfservice/viewContent.do?externalId=kb401224)
6. Recommended: Flex Compiler Shell, which speeds up recompilation (http://labs.adobe.com/wiki/index.php/Flex_Compiler_Shell)
7. PaperVision 3D 2.0 Great White (http://papervision3d.googlecode.com/svn/trunk/as3/trunk)
Setup FlashDevelop

1. Open the as3proj project file with FlashDevelop.
2. Open the "Program Settings" dialog box (Tools/Program Settings… or F10).
3. Select AS3Context from the left menu.
4. Adjust the Flex2 SDK location setting according to your setup.
5. Select the "User Classpath" field, press the "…" button that appears, and enter two lines in the dialog:
\frameworks\source
\src
Hotkeys in MyUIPlayer

CTRL-ALT-D: shows panels and wire frame (change resolution with number keys 1-9)
CTRL-ALT-C: console (use CTRL-ALT-arrow keys to scroll back)
CTRL-ALT-S: performance statistics

HTML Integration & Program Control

To create an easy-to-use and easy-to-configure application interface, the MyUIPlayer may be embedded into an HTML page of any design. Integration relies on a set of JavaScript functions that parse the embedded configuration lines. The example below shows the configuration of the above exercises.
// --------------------------------------------------------------------
// EDITABLE PARAMETERS SECTION -- START
// --------------------------------------------------------------------
var iconWidth = 90;
var iconHeight = 90;
var language = '_en';        // default language suffix
var resolution = '_large';   // default resolution suffix
var viewportScale = { "_small": 70, "_medium": 85, "_large": 100 };
var scenes = [
  {
    icon: 'content/MyUI/MyUI_Logo.jpg',
    name: 'Scratch Games',
    program: 'content/MyUI/ScratchGames.xml',
    sites: [
      { icon: 'content/MyUI/Balloon_ico.jpg', name: 'Balloon', track: 'Balloon' },
      { icon: 'content/MyUI/Gesture_ico.jpg', name: 'Magic Spoon', track: 'Spoon' },
      { icon: 'content/MyUI/ScratchGames.jpg', name: 'Paintings', track: 'Paintings' }
    ]
  },
];
var defaultScene = 0;
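The `language` and `resolution` variables are suffixes applied to asset names, and `viewportScale` scales the viewport per resolution. How the integration scripts combine them is not shown in the configuration itself, so the helpers below are a hypothetical illustration of that convention: the function names and the insert-before-extension rule are assumptions, not the actual MyUI code.

```javascript
// Hypothetical helper: insert language and resolution suffixes
// before the file extension, e.g. "logo.jpg" -> "logo_en_large.jpg".
function applySuffixes(path, language, resolution) {
  const dot = path.lastIndexOf(".");
  if (dot < 0) return path + language + resolution;
  return path.slice(0, dot) + language + resolution + path.slice(dot);
}

// Scale a viewport dimension using the viewportScale table (percentages).
var viewportScale = { "_small": 70, "_medium": 85, "_large": 100 };
function scaledWidth(baseWidth, resolution) {
  return Math.round(baseWidth * viewportScale[resolution] / 100);
}
```

For example, with the defaults above, 'content/MyUI/MyUI_Logo.jpg' would become 'content/MyUI/MyUI_Logo_en_large.jpg' under this assumed naming rule.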
The main program control of the application is described in an XML file with a variety of different tags, as described next. The file lists track entities, each describing a single video track; all paths are relative to the location of this file. A policy file can be referenced by URL (e.g. http://url.to/policy/file.xml).
2.4.4 Pointer Device Handling

The physical exercises described in the preceding sections, although they may be carried out using a mouse as input, require the end-user to sit in front of the TV and perform the requested gestures with the help of a pointing device. Since the Flash framework does not provide access to local hardware on a computer for security reasons, we needed to develop a module that captures such information from any sensor we plan to use and passes it to the central framework for control and execution. MyUIDevices.exe is a resident executable that runs in the background to provide device-related and data services to the main application. It was developed in C++ to bridge the communication gap between the Flash application layer and local device access. It integrates a range of sensors, such as the microphone for audio recording and the TrackIR devices for tracking the motion of the TV remote in the user's hand and translating it into mouse actions on the computer screen that may be used to control the application. In addition to its role as a bridge between local hardware and limited-access applications, it also provides raw data logging and processing to measure user variables as required by the higher levels of the MyUI application and adaptation layers.

The exercises for physical interaction focus on wrist motion, in terms of range and speed, as indicators of physical abilities. Having evaluated a number of different technical solutions for measuring motion patterns of the body (Asus Xtion or MS Kinect), the MyUI framework opted for pointer devices as a simple add-on to the existing IR TV remote infrastructure. To achieve this we integrated a low-cost web camera. The TrackIR4 and TrackIR5 (http://www.naturalpoint.com/trackir/) are infrared tracking devices that the MyUI manager is able to use both in active mode (i.e. using the IR signals emitted by the TV remote as a magic wand) and in passive mode, i.e. by placing reflective markers on any device in the home. The setup and use of TrackIR is shown in Figure 13 below. The device is mounted on top of the TV set with a direct view of the user. The software we developed allows the MyUI games to detect the tiny flashes of IR communication from the hand-held remote controller and use them to control mouse motion and clicks. These motion patterns are then grouped into gestures that the system recognizes. During an exercise, end-users are asked to perform a sequence of actions and carry out a set of motions designed to present increasing levels of difficulty.
Figure 13: Device setup and configuration for the TrackIR remote pointing device.
Mouse Emulation Modes

First start the application; it will reside in the task bar and wait for users to connect. To test that it works and to demonstrate the basic operations, a second application called MyUI_TestClient.exe may be used. The MyUI Device Manager accepts commands, returns data values and provides access to various devices; these raw commands may be sent from any application that wants to use its services:

- Connect.
- Send a data source request. If you have one of the above devices connected and properly installed, the MyUI Device Manager will recognize it as "OptiTrackCamera00".
- Subscribe. As a result the camera will turn on its IR illumination and send blob coordinates whenever anything IR-reflective appears in its field of view. You can test this by holding your hand near it.

There are several ways to use this raw data to emulate the mouse and control an application. The basic modes of operation are absolute mode, in which the coordinates of tracked IR dots are mapped directly onto screen coordinates, and relative mode, in which each press of the IR button positions the cursor relative to its last position. To detect clicks there is a time parameter: any signal shorter than it is handled as a mouse click; if the button is kept pressed longer, the remote acts as a pointing device. NOTE: we suggest using a button on your remote that is not used by the system for sending commands. These parameters must be sent using the command:

- enable = "1" or "0"
- clickTime = "500", click = "1": a flash shorter than clickTime milliseconds will be interpreted as a mouse click (default is 500 ms)
- signalWindow = "500": the IR flashing time-window filter (default is 100 ms); the longer it is, the longer the gaps over which the IR signal is still treated as continuous
- ledIllumination = "0": turns IR illumination on or off (for passive marker mode and active remote control mode)

EXAMPLE 1: start mouse emulation in RELATIVE mode with CLICK and LED OFF.
EXAMPLE 2: start mouse emulation in ABSOLUTE mode.
- Stop mouse emulation.
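The click rule (a flash shorter than clickTime is a click; a longer press switches to pointing) and the signalWindow grouping of intermittent IR flashes can be sketched as follows. The parameter names mirror the command parameters above, but both functions are illustrative sketches, not the MyUIDevices implementation.

```javascript
// Classify an IR flash by its duration, following the clickTime rule:
// shorter than clickTime ms -> mouse click, otherwise -> pointing mode.
function classifyFlash(durationMs, clickTime = 500) {
  return durationMs < clickTime ? "click" : "pointing";
}

// Merge [start, end] flash intervals into continuous signals using
// signalWindow: gaps shorter than signalWindow ms join adjacent flashes.
function mergeFlashes(intervals, signalWindow = 100) {
  const merged = [];
  for (const [start, end] of intervals) {
    const last = merged[merged.length - 1];
    if (last && start - last[1] < signalWindow) last[1] = end;
    else merged.push([start, end]);
  }
  return merged;
}
```

With the defaults, a 200 ms flash counts as a click, while an 800 ms press switches the remote into pointing mode.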
NOTES:
#1 scaleX and scaleY depend on the sensor type and the distance of the user to the screen: scaleX = 1/width, scaleY = 1/height.
#2 offsetX and offsetY are in pixels (but can be fractional).
Sensor resolutions: TrackIR5: 640 x 480; TrackIR4: 360 x 288.
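Putting the notes above together, absolute mode maps a blob position onto the screen by normalising it with the sensor resolution (scaleX = 1/width, scaleY = 1/height), scaling to the screen size and adding the pixel offsets. The helper below is a sketch of that arithmetic, not the MyUIDevices code.

```javascript
// Sensor resolutions from the notes above.
const SENSORS = {
  TrackIR5: { width: 640, height: 480 },
  TrackIR4: { width: 360, height: 288 },
};

// Absolute mode: normalise blob coordinates by the sensor resolution,
// scale to the screen resolution and add the pixel offsets.
function blobToScreen(blob, sensor, screen, offsetX = 0, offsetY = 0) {
  const s = SENSORS[sensor];
  return {
    x: (blob.x / s.width) * screen.width + offsetX,
    y: (blob.y / s.height) * screen.height + offsetY,
  };
}
```

For example, a blob at the centre of a TrackIR5 frame lands at the centre of the screen when the offsets are zero.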
Recording Audio for Voice Mail

- Connect using the button on the MyUI_TestClient.exe application interface.
- Send a data source request (copy this string into the main window and press SEND). The system will return a list of the available devices connected to your system. You should see "SoundCapture" among many others; if not, your system does not have audio enabled.
- Subscribe. For each resource, the end-user needs to subscribe to start receiving a data stream. Once subscribed, the device manager will start sending back data values using its default configuration; you will see the stream of messages in the window below. Each device may be configured differently to analyse its raw data using the send-data command.
- Start RECORDING. To start recording the audio stream into a local file, use the command shown below; be sure to replace the file name and make sure you have typed the correct backslashes.
- STOP recording. Once finished, your application should send a stop command to close the WAV file. IMPORTANT: the stopWrite="notempty" string must contain some characters, but it does not matter what.
New Generation TV Pointing Device

The trend in TV input devices is moving towards the inclusion of pointing interaction. An infrared solution requires a camera, such as the TrackIR, on the TV side. As this has implications for the ever-narrowing TV rims, alternatives are being considered by TV manufacturers. The solution used for interfacing with the latest Philips TVs relies on radio-frequency technology (RF4CE), which requires a built-in chip as opposed to a camera.
Since in this implementation movements cannot be captured from infrared signals emitted by the remote control, they are captured by sensors embedded in the remote control itself. Motion data, mainly orientation, is captured and then sent to the TV via RF4CE, and the position of an on-screen cursor relative to the remote control's motion is calculated. For use via the connected TV platform (Philips Net TV), the remote control described in the paragraph above is required. For demonstration purposes, the same approach can be used to control the MyUI applications in a laptop setup; in this case, an RF4CE module is connected to the laptop via USB. The particular device used in the MyUI application is displayed below in Figure 14.
Figure 14: Philips Net TV New Generation pointing device used in the MyUI framework.
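The exact algorithm used by the Philips remote to turn orientation data into cursor motion is not documented here; the sketch below shows one plausible relative-positioning scheme, where yaw and pitch deltas (in degrees) are scaled by a sensitivity gain and the cursor is clamped to the screen. The function name and the gain value are assumptions for illustration only.

```javascript
// Hypothetical relative cursor update from remote orientation deltas.
// gain: pixels of cursor travel per degree of rotation (assumed value).
function updateCursor(cursor, deltaYawDeg, deltaPitchDeg, screen, gain = 20) {
  const clamp = (v, max) => Math.min(Math.max(v, 0), max);
  return {
    x: clamp(cursor.x + deltaYawDeg * gain, screen.width - 1),   // yaw moves horizontally
    y: clamp(cursor.y - deltaPitchDeg * gain, screen.height - 1), // pitch up moves cursor up
  };
}
```

Clamping keeps the cursor on screen however far the remote is rotated, which matches the relative-positioning behaviour described above.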
2.4.5 Data Logger for the Central DataBase

During the operation of the MyUI Flash framework, the system constantly logs, records and updates raw sensor information in a Central DataBase over the Internet, or in local files, for the purpose of more precise evaluation of user abilities. From this raw information, measures of user variables are assessed and communicated to the MyUI adaptation engine using XML-RPC and TCP/IP-based protocols. The DataBase uses MySQL and an Apache server with PHP scripts to store interaction data in the tables described below.
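Each raw interaction event is serialised on the Flash side (the AS3CoreLib JSON classes mentioned earlier handle this) and posted to the PHP scripts in front of the MySQL database. The record layout below is illustrative only; the field names are assumptions and the actual table columns are those shown in Figure 15.

```javascript
// Build an illustrative log record for one raw interaction event.
// Field names are assumptions, not the actual MySQL column names.
function makeLogRecord(sessionId, userId, event) {
  return JSON.stringify({
    session: sessionId,
    user: userId,
    timestamp: event.timestamp, // ms since session start
    device: event.device,       // e.g. "OptiTrackCamera00"
    type: event.type,           // e.g. "mouseMove", "click"
    x: event.x,
    y: event.y,
  });
}
```

Keeping the payload a flat JSON object makes it straightforward for the server-side PHP scripts to map it onto table rows.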
The data structures below (see Figure 15) were designed to capture all aspects of the interaction process and simultaneously support future evaluation of the captured data by independent processes. Unlike validated tests, such as the Trail-Making-Test, where a corpus of statistical data is available as a reference to assess user variables and the application itself can therefore compute and communicate the scores, non-validated measures signal trends and relative improvements, and often require an upgrade of the evaluation algorithms as more and more data becomes available. The basic DB structure and queries can be accessed via PHP scripts, and user profile variables are updated using XML-RPC.
Figure 15: Structure of the Remote Data Logger DataBase.
The particular DB design advocated in the MyUI game system allows for the integration and support of third-party external applications implemented as stand-alone Flash games. In other words, user interaction can be recorded in the finest detail, yielding implicit measures of interaction patterns not foreseen by the original design. As an example, we use the Trail-Making-Test described earlier in this document. The original TMT is a stand-alone Flash application that calculates the score in terms of overall time, missed clicks, etc. With the help of our database logging module, we can measure variables beyond these events, e.g. the path characteristics of the mouse, hand tremors and other features, independently of the exercise itself. This functionality also demonstrates how to incorporate third-party applications and use them to motivate and measure at the same time. This open approach can help future MyUI community developers to come up with their own games and readily integrate them into the system, while keeping its data analysis functionality. For example, during such an exercise the MyUI server still records the entire sequence of mouse motion, which may later be used to reveal a great deal about the user's hand coordination. The Session Manager PHP interface, as well as a plot chart of characteristic mouse motions during an exercise, are demonstrated in Figure 16.
Figure 16: Session Manager interface and motion pattern charts during a physical exercise recorded remotely by the MyUI Flash framework (see text).
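From such a raw mouse trace, the logger can derive measures the original game never computes, such as total path length and average speed. The functions below sketch two such derived measures; they are illustrative, not the MyUI analysis code.

```javascript
// samples: array of {t, x, y} mouse positions (t in milliseconds).
// Total length of the mouse path in pixels.
function pathLength(samples) {
  let len = 0;
  for (let i = 1; i < samples.length; i++) {
    const dx = samples[i].x - samples[i - 1].x;
    const dy = samples[i].y - samples[i - 1].y;
    len += Math.hypot(dx, dy);
  }
  return len;
}

// Average speed in pixels per second over the whole trace.
function averageSpeed(samples) {
  const dt = (samples[samples.length - 1].t - samples[0].t) / 1000;
  return dt > 0 ? pathLength(samples) / dt : 0;
}
```

Richer measures (e.g. tremor indicators from high-frequency direction reversals) can be built on the same per-sample representation.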
3. Methodology for Scoring and Future Game Integration

The process of translating measured raw variables into meaningful user profile variable values is called scoring. As mentioned in Chapter 2, scoring can be done for validated and non-validated games. Validated games base their scores on normative data, while non-validated games produce relative scores over time: a comparison to prior scores rather than to normative data. An example of scoring for a validated game is provided below for the Trail-Making-Test; an approach for scoring non-validated games is provided subsequently. Scoring is based on the procedure and results from the German KTN ([Anon11]), which includes normative data from a total of 18 different studies with healthy adults around the world. It follows the basic steps outlined below:
Transformation of normative data to a score in the style of the Wechsler-Bellevue Scale ([Wech44]): the time needed to complete the task is transformed into a performance value with mean μ = 100 and standard deviation σ = 15 for each age-related cohort. This performance value is used to map the score to the Context Manager using the following rules, shown in Figure 17:

1. Scores from μ - σ and better are mapped to "0".
2. Scores between μ - σ and μ - 2σ are mapped to "1".
3. Scores between μ - 2σ and μ - 3σ are mapped to "2".
4. Scores between μ - 3σ and μ - 4σ are mapped to "3".
Figure 17: Trail-Making-Test / Mapping of measured time to scores (KTN and Context Manager) for 65-69 year old people for Trail-Making Test Part A.
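The mapping rules above translate directly into code. Following the profile-variable descriptions later in this section, values below μ - 4σ (i.e. a game that was effectively not finished) map to "4"; the function below is a straightforward sketch of those rules.

```javascript
// Map a performance value (mean 100, SD 15 per age cohort) to a
// Context Manager score 0-4 according to the rules above.
function mapToScore(performance, mu = 100, sigma = 15) {
  if (performance >= mu - sigma) return 0;     // on average or better
  if (performance >= mu - 2 * sigma) return 1; // weak / slow
  if (performance >= mu - 3 * sigma) return 2; // very weak / slow
  if (performance >= mu - 4 * sigma) return 3; // just managed to finish
  return 4;                                    // unable to finish
}
```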
One challenge in making the above scoring mechanism work for practical user evaluation is that time-based performance measurement produces skewed ("slope") distributions with a left peak, with the consequence that a change at the beginning of the x-axis has more impact on the y-axis than a change at the end of the x-axis. Transformation of normative data to the required score is, however, possible with an area transformation procedure that yields a normal distribution ([MooK00]). Specifically, normalization with area transformation works by first computing a non-linear transformation to obtain the percentile rank, then assigning to each test value the z-value of the normal distribution that corresponds to its percentile rank (e.g. assign z'x=0 = -2.15, which corresponds to 1.6% in the normal distribution, to the x with percentile rank PR = 1.6% of the test values). To arrive at the required and stabilized output, these normalized values are transformed to the final score (mean μ = 100, SD σ = 15). This process is summarized in Figures 18 and 19.
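The area transformation can be sketched as: compute the percentile rank of a raw value within a reference sample, look up the z-value with the same cumulative probability under the standard normal distribution, and rescale to μ = 100, σ = 15. The sketch below uses a small interpolation table of (cumulative probability, z) pairs instead of a full inverse-normal routine, so it is only a coarse approximation of the procedure described in [MooK00].

```javascript
// Percentile rank of x within a reference sample (fraction in [0, 1]).
function percentileRank(sample, x) {
  const below = sample.filter(v => v < x).length;
  const equal = sample.filter(v => v === x).length;
  return (below + 0.5 * equal) / sample.length;
}

// Coarse inverse of the standard normal CDF via linear interpolation
// over a few known (cumulative probability, z) pairs.
const CDF_TABLE = [
  [0.00135, -3], [0.02275, -2], [0.15866, -1],
  [0.5, 0], [0.84134, 1], [0.97725, 2], [0.99865, 3],
];
function zFromProbability(p) {
  if (p <= CDF_TABLE[0][0]) return CDF_TABLE[0][1];
  for (let i = 1; i < CDF_TABLE.length; i++) {
    const [p1, z1] = CDF_TABLE[i - 1], [p2, z2] = CDF_TABLE[i];
    if (p <= p2) return z1 + (z2 - z1) * (p - p1) / (p2 - p1);
  }
  return CDF_TABLE[CDF_TABLE.length - 1][1];
}

// Final score on the mu = 100, sigma = 15 scale.
function areaTransformedScore(sample, x) {
  return 100 + 15 * zFromProbability(percentileRank(sample, x));
}
```

Note that whether a fast completion time should map to a positive or negative z depends on the orientation of the raw measure, so a real implementation must fix that sign convention per test.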
Figure 18: Trail-Making-Test / Slope distribution of collected raw values (see the paragraph above for details).
Figure 19: Trail-Making-Test / Visualization of the percentile rank of random test values in a slope distribution.
The practical meaning of the computed scores and the resulting profile variables can be summarized as follows:

1. Users mapped to "0" performed on average or better.
2. Users mapped to "1" performed weakly/slowly.
3. Users mapped to "2" performed very weakly/slowly.
4. Users mapped to "3" just managed to finish the game.
5. Users mapped to "4" were unable to finish the game.
For games whose score mapping to user characteristics has not been validated, a methodology is needed to enable them to be incorporated into the MyUI system. A simple approach is proposed based upon linear interpolation: the game score range is mapped linearly, in five intervals, to the user profile impairment rating 0 to 4 (where 0 means no detectable impairment and 4 means maximum detectable impairment). This is an initial approximation of how a game score can influence a user profile variable, and it is sufficiently general to apply to any game and any impairment. The next step is to run user studies with a large number of participants to determine the distribution of scores obtained by participants with various levels of impairment (see the example of the TMT game given above); this would then provide better-validated user profile variable values. However, it was not possible to run such mapping (or 'calibration') user studies during the MyUI project due to a lack of resources (time and staff) and the difficulty of obtaining statistically large samples of participants with impairments.
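The linear interpolation approach can be sketched as follows: the raw score range is split into five equal intervals and mapped onto impairment ratings 0-4. Whether a higher raw score means better or worse performance varies by game, so the sketch takes a flag for that; this is an illustration of the proposed approach, not project code.

```javascript
// Map a raw game score linearly onto impairment ratings 0..4.
// higherIsBetter: true if larger raw scores mean better performance.
function impairmentRating(score, min, max, higherIsBetter = true) {
  let t = (score - min) / (max - min); // normalise to [0, 1]
  t = Math.min(Math.max(t, 0), 1);     // clamp out-of-range scores
  if (higherIsBetter) t = 1 - t;       // 0 = no impairment at the best score
  return Math.min(4, Math.floor(t * 5)); // five equal intervals
}
```

For a game scored 0-100 with higher meaning better, the best scores map to 0 (no detectable impairment) and the worst to 4 (maximum detectable impairment).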
4. Conclusion

The MyUI consortium has successfully developed a series of games used as exercises to regularly assess the initial states of end-users as well as the evolution of user profiles over time. We have demonstrated and experimentally evaluated cognitive as well as physical challenges that may affect how the user perceives and/or controls the MyUI system. We have also demonstrated how the process of translating measured raw variables into meaningful user profile variable values may be based on validated statistical measures as well as on empirical observations. Our preliminary studies indicated that, in general, the games presented on the TV were easy enough for our older participants to learn and (apart from problems using the TrackIR) to use, and were enjoyable. About two in three of our participants would play them again unprompted. Most participants were happy for the games to be used to estimate their personal capabilities and for this information to be stored in their TV, but a minority were uncomfortable with these ideas. Automatic adaptation based on game performance had a mixed reception; most participants preferred to specify their own accessibility needs (e.g. font size) rather than have the system estimate their capability levels (e.g. visual acuity) and adapt dynamically. The findings on automatic adaptation might be better if the trigger points for adaptation were better calibrated.
References

[MooK00] Moosbrugger, H., & Kelava, A. (2007): Testtheorie und Fragebogenkonstruktion, 2. Auflage. Berlin Heidelberg: Springer.

[LeHL04] Lezak, M.D., Howieson, D.B., & Loring, D.W. (2004): Neuropsychological Assessment (4th ed.). New York: Oxford University Press.

[HMTG04] Heaton, R.K., Miller, S.W., Taylor, M.J., & Grant, I. (2004): Revised Comprehensive Norms for an Expanded Halstead-Reitan Battery: Demographically Adjusted Neuropsychological Norms for African American and Caucasian Adults. Professional Manual. Lutz, FL: Psychological Assessment Resources.

[Wech44] Wechsler, D. (1939): The Measurement of Adult Intelligence. Baltimore: Williams & Wilkins.

[Anon11] Anonymous Contributors (2011): ktn:start. Konsolidierte Testnormen Neuropsychologie. Retrieved 10:34, 1 March 2012 from http://psytest.psy.med.unimuenchen.de/dokuwiki/doku.php?id=ktn:start&rev=1304677853

[Tomb04] Tombaugh, T.N. (2004): Trail Making Test A and B: Normative data stratified by age and education. Arch Clin Neuropsychol 19(2): 203-214.
Appendix A: MyUI Validation Study at Lark Hill, July 2012

The study will be held in the Freemans lounge, aiming to recruit 6 participants in total: 2 new (i.e. those who have not had any previous involvement in MyUI studies); 2 low-experience (i.e. those who were involved in MyUI studies prior to March 2012: they are aware of the project objectives but have not seen or used the MyUI prototype); and 2 experienced participants (i.e. those who have participated in MyUI studies since March 2012). Studies will be conducted with individual participants to create case studies; each session will take approximately 3 hours overall, and breaks and refreshments will be provided.

Outline of session: welcome, explanation of the session's purpose and structure, informed consent, self-assessment if applicable, and answering any questions. The practical session will consist of participants completing short tasks to explore the email and weather applications, and thereby using the menus and experiencing interface adaptations. They will be guided through the system by completing these short tasks; help and support will be provided if requested. While participants are completing the practical session, researchers will complete an observer checklist, including the times at which events occur on the video. Following the practical session, participants will be asked to complete some questionnaires, and the video of the session will be played back to them. They will be informally questioned about their experiences and about what they were thinking and feeling while carrying out the practical session. Finally, there will be a few open questions discussing adaptation further, and the sensors.

Researcher roles:
- Technical support to look after the technology setup and the demonstrator.
- Two additional researchers to run the study; both will complete observer checklists throughout the practical session, including the time each event occurred.
- One researcher to facilitate the questionnaires.
- One researcher to facilitate the interview/retrospective verbal protocol.
- One researcher to facilitate the final questions.

Room setup:
Welcome area near entrance with consent forms. Individual participant sessions, the participant will be sat in a typical domestic TV viewing height and 1 easy chair approximately 2.5 metres away when facing the TV. The ‘Webcam will be attached to the TV facing the participant. A video camera on tripod will be located behind the participant facing the TV screen.
Measures: Observations, observation checklist (including time, remote control key presses/route, number/type of requests) and targeted questions. Adapted SUS (System Usability Scale) questionnaire. Video recordings – retrospective video verbal protocol and targeted questions (including acceptance, usability and utility). Voice recordings for the final question session. Open questions referring to further adaptation and sensors.

Equipment overview: MyUI's laptop running the MyUI demonstrator; webcam; TV screen which will be attached to the laptop; voice recorder; video camera and tripod; stills camera; study scripts; consent forms; self-assessment questionnaires.

Process Script: Hello {name}. I'm {name} from the University of Nottingham. We are very grateful that you have been able to come today to help us in this research activity. {As you know} this is all part of a project which aims to make the next generation of TV sets more useful and usable by people as they get older. We have done activities to help us understand how usable and understandable different sorts of remote controls are, we have had activities to help us design how a TV screen might look, and we have looked at how easy it is to interact with information on a TV screen using a remote control. Today we are going to ask you about your views on the email and weather applications as well as the overall product. We will ask you to complete a few tasks on your own to
understand whether the system is easy to use, beneficial and appealing to you, but please remember we are testing the technology, we are not testing you. We hope you will be able to give us an insight into how to design things for the benefit of people older than ourselves, and perhaps people you know who are older than yourself, or beginning to find technology difficult to use. First, here's one of our consent forms. Please take your time to read it, or ask if you'd like me to read it out to you, and if you're happy to go ahead please sign it. Hand them a consent form on a clipboard, and a pen; explain if necessary. Okay; thanks very much. You will be asked to attempt a few tasks allowing you to explore the email and weather applications, and thereby use the menus and experience a few interface changes. The session will take 2 to 3 hours, but we will stop for a break in the middle. You will be guided through the system via the completion of short tasks. This system is a prototype, not a finished product, so there may be some occasions where it isn't perfect. The screen displays you will see on the TV include representations of imaginary buttons which you could "press" using a remote control. When using the TV for email it sometimes has places where you could type text with a keyboard of some sort. Today we're not going to be asking you to control the TV with a remote control, nor to type text with a keyboard. We will do those input activities for you, but we would like you to tell us what buttons on the screen you would press or what text you would type to achieve the tasks. Hand them the list of tasks they will be asked to complete in this session. We would like you to attempt the tasks on this sheet on your own if you can. However, please ask me if you require any help, advice or support during these tasks. Do you have any questions or concerns before we start?

Practical session procedure: Separate task instructions printed for the participants to read.
Researcher to fill out the remote control functions table and the observations table (see below).

Task 1: Please open the email application from the main menu.
Email
Please select the email from John in the inbox and read this aloud. (E.g. "Dear Arthur, will you also be attending the next monthly meeting at the British Legion club? John".) John's email / [comment: can't read that – large text jumbled up]

Task 2: Please reply to John's email (e.g. "Dear John, Yes I will be attending the meeting, see you there! Arthur"). Email/Open email/Reply/type text/Send

Task 3: Please delete the original email from John and then return to the main menu. Click delete. [Note: mix-up – participant performed Task 3 prior to Task 2]

Task 4: Please change the font size to smaller. "Can't see anything to make font smaller". Participant then guessed at Settings button/Font size/Smaller

Task 5: Please go back to the main menu. Imagine you want to change your personal profile settings because the screen display isn't very easy to see. Please ask and the researcher will guide you through this task. (Initially guessed at Revert to return to MM.)

Verbal script for task 5 (N.B. provide hints freely in this task to ensure they make the right choices): What buttons would you press to change your personal user profile settings to tell the system you're struggling to see the display? Which of these 6 categories might help to make it easier for you to see the display better? Which button would you press to do this? On this display, a score of 1 represents an eyesight condition in need of greatest help, and 5 indicates near-perfect eyesight. It currently assesses your eyesight as a score of 4. We're now going to show you how you could specify that you need a display more appropriate for your eyesight. You might press "3" and see this … Then you might try "2" and see this …
As you can see, the text size is adjusting to make it easier to see. You might like to see what this text size would look like in an email, by pressing the GUIDE button. This is what you would see … Imagine that you felt this text size was too large. You might press the "smaller" button and get this display … Let's assume this text size is the best for you. You would press the EXIT button, and see this … Notice it has now scored your eyesight as "3". Then you could press OK.

Do you prefer providing the system with estimates of your eyesight, or do you prefer adjusting the text size? Prefers changing font size.

In general, when you want the system to become more accessible, would you prefer to specify information about your capabilities or about your preferred display characteristics? Display characteristics.

Now please go back to the main menu. As you see, this is now displayed differently, to cater for the eyesight score you have provided. For tasks 6 onwards, can you go back to trying things out yourself first and asking for help only if needed?

Task 6: Please alter the display according to your preferences. No alterations required.

Task 7: Please compose a new email to Alice asking her where she is going on holiday this weekend. Email/To box/Alice/Add recipients/type message/Send
Task 8: Please check what the weather is going to be like in Nottingham tomorrow and tell the researcher this forecast. Weather/Select Location/Country/(researchers explained here that it was necessary to press 'Save Location' due to a bug in the system)/City/Forecast (again had to explain that 'Save Location' below has to be pressed to obtain the forecast, due to system errors). Alice has replied to your email; she is going to Seville, in Spain, at the weekend and wants to know what the weather will be like there; she also prefers emails to be sent by audio. Select Location/Country/City/Save Location

Task 9: Please reply to Alice's email with an audio attachment telling her what the weather will be like in Seville at the weekend. Email/Reply/Reply Mode (participant had to be helped at this point as he didn't understand the terminology of 'Mode' in this context) – Audio Recording/Record/Attach (again had to prompt participant as he is not familiar with sending emails)/Send.

Task 10: Please create a new contact for your friend Pepa. Her email address is [email protected]. Home key to MM/Email/Contacts/Add Contact/Pepa/Enter email address/Save. [Had to assist participant with understanding of 'contacts', again as the terminology was not familiar in a technology context and was associated with making contact personally or writing letters.]

Task 11: Please send an email to Pepa asking her whether she would like to meet for afternoon tea today. Home to MM/Email/Contacts/Send email/To box/Subject/type message/Send.

Observation checklist (timeline): Note down instances of: when they first appear to be struggling; when they ask for help / when they work it out / when researchers intervene if they do not ask for help (if they start to get agitated); how they are struggling; general participant comments.
Shorthand notations for observations: Struggling – Strug; Asked for help – Ask; Worked it out – WIO; Researcher intervention – RI; Other observations – please describe in more detail.
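For later timeline analysis, each checklist row can be kept as a small structured record keyed by these shorthand codes. A minimal sketch in Python follows; the record layout and the `describe` helper are illustrative only, not the schema of the MyUI central database logger described elsewhere in this deliverable:

```python
from dataclasses import dataclass

# Shorthand codes from the observation checklist above.
CODES = {"Strug": "Struggling", "Ask": "Asked for help",
         "WIO": "Worked it out", "RI": "Researcher intervention"}

@dataclass
class Observation:
    obs_no: str       # e.g. "4a" – sub-observations are allowed
    task_no: int      # the task the participant was attempting
    time: str         # video timestamp, e.g. "12:40"
    code: str         # one of the shorthand codes above
    note: str = ""    # free-text detail for "other" observations

    def describe(self) -> str:
        # Expand the shorthand code into its full label for reports.
        label = CODES.get(self.code, self.code)
        return f"[{self.time}] task {self.task_no}: {label} – {self.note}"

# Example row modelled on the filled-out table below.
row = Observation("2", 1, "12:40", "Strug", "Can't read email. Text too large.")
print(row.describe())
```

Keeping the video timestamp on every row is what allows the retrospective verbal protocol to jump straight to the relevant moment during playback.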
For an example of a filled out table, see below.
MyUI Validation Study Example Session on Tuesday, 10 July 2012

Observations Table

UP Code: 17

Obs No | Task No | Start Time | Observations/Details
1 | 1 | 12.40 | Not reacting to changes on screen.
2 | 1 | 12.40 | Can't read email. Text too large.
3 | 3 | 12.41 | Deleting email – no delete on list in inbox. Found it in email.
4 | 4 | 12.42 | Trying to work out how to make font larger – Anne shows Settings button then OK.
4a | | 12.42 |
5 | 4 | 12.44 | There is an illustration with no word on the Settings button. Would like a word, e.g. 'Tools'.
6 | 5 | 12.44 | Back to Main Menu – wanted to press 'Revert'.
7 | 6 | 12.45 |
8 | 6 | 12.47 | Comment that task 5 text is quite small, and what is the difference between visual acuity and field of vision? Scale for visual acuity is the other way around to what she would have thought.
9 | 6 | 12.49 | Tried to change font size through Settings and Personal then Display Mode, then found the Font Size option.
10 | 7 | 12.51 | Wanted to type Alice into the Email To field but was also debating between that and clicking Add Recipient. Chose Add Recipient and then Alice.
11 | 8 | 1.04 | Wanted to press Forecast, but Anne advised to press Save Location. Comment that Save Location appears twice.
12 | 9 | 1.07 | Thinks that the TTS (Text to Speech) icon is the audio reply button, then debates and decides on the Reply Mode button.
13 | 9 | 1.09 | Isn't aware that it has recorded. Debates whether to send, or attach then send.
14 | 10 | 1.10 | To add a contact, expected to write in 'To' and save, rather than the other way around. Then mulled it over and used 'Add Recipient'.

Further Questions/Verbal Protocol (Video)

Did you notice all the changes taking place? Participant thought that the system was highlighting the things that we would be using and assumed the system 'knew' what she was doing and was responding accordingly. She liked the way the system enlarged the display automatically when she leaned forward. The email 'scrambled': the participant could not read it out as it was very large and jumbled up. She could see 'Jo…' only, which was John. The researcher asked if P. thought anything could be done with the large text to make it clearer to read. P. responded 'no' – she thought the 2 emails had become mixed up and expected to be able to separate them. P. expected a Delete key to be available in the Inbox. She normally uses Hotmail and expected the system to have the same features.

Researcher asked if the Participant (P.) noticed when the numbers on the display appeared/disappeared. P. responded 'no' – she doesn't like numbers and so preferred the display without them. It makes a difference to her as it is simply more information/clutter that she doesn't use. P. didn't recognize the Settings icon and said it was a 'meaningless graphic' to her. She would prefer to see a word on the button but was not sure how she would change the graphic representation. P. selected the 'Revert' button to return to the Main Menu, but nothing appeared to happen. (The Revert key actually takes settings back to the default settings.) P. couldn't think of a better word to use for this, but believes you just have to learn the terminology for the system you are using. Said 'you can always use the Home key anyway'.

The scale of visual acuity can be interpreted differently by different people, and the participant looked at it differently to the researcher. P. was asked 'do you prefer to provide the system with a personal capability rating, or to alter the system display to adapt?' P. didn't mind which way it was done but said she prefers to be able to view all of the relevant information on the display initially and to be able to adjust it later. The participant attempted to alter the text size first by going to Display Mode. She believed that Display Mode would affect the size of the entire display. The researcher explained what the button was actually for (it adjusts the balance of text and graphics on the display) and asked P. if she would prefer more text or more graphics. P. didn't give a direct answer to this but said she is happy to explore different keys to see what they do. The researcher asked P. if she was expecting to be able to just type 'Alice' into the Email 'To' field to add a recipient. She said she thought that if she did that, the system would ask her if she wanted to add Alice; she had related her experience with her own email system to this one when making this assumption. However, she thought that the MyUI system was easy to work out if you made this mistake. The participant was asked what she thought of the 'Save Location' feature in the Weather application. She replied that she didn't know and that it was simply how the system was set up. She couldn't understand why 'Save Location' appeared twice on the display. She liked it being a different colour as this brought it to her attention. When she selected Spain she expected the drop-down city list to correspond automatically to the country, but you have to Save Location for this to happen. P. opted to select the 'little man' icon (TTS or Text to Speech), then immediately noticed the Reply Mode button and changed her mind. She commented that the TTS button needed a text identifier on it. The P. thought that a confirmation should be given that she had made a recording. She didn't know whether the recording would automatically attach and send or not. The researcher asked how the P. felt about saving a new contact before using it and P. responded 'fine'. R. asked if P. noticed 'Contacts' under 'New Email' and she replied 'not at first'.
Questionnaire: Adapted SUS (System Usability Scale). Each statement is rated on a five-point scale: 1 = Strongly Disagree, 2 = Disagree, 3 = Neither Agree nor Disagree, 4 = Agree, 5 = Strongly Agree.

1. I felt like the system was making changes which I had no control over
2. I think that I would like to use this system frequently
3. I thought there was too much inconsistency in this system
4. I found it useful to be able to change the display myself
5. I didn't like the way the system changed by itself
6. I thought the system was easy to use
7. I think that I would need the support of a technical person to be able to use this system
8. I think the separate applications are well integrated into the whole system
9. I found the system unnecessarily complex
10. I would imagine that most people would learn to use this system very quickly
11. I found the system very cumbersome to use
12. I felt very confident using the system
13. I needed to learn a lot of things before I could start using this system
14. I felt in control of the changes that occurred
1.) Please name 3 things you liked about the system: 1. _____________________________ 2. _____________________________ 3. _____________________________

2.) Please name 3 things you didn't like about the system: 1. _____________________________ 2. _____________________________ 3. _____________________________
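Responses to the adapted SUS can be summarised with the standard SUS scoring convention; below is a minimal sketch, assuming the usual rule that positively worded items score (rating - 1) and negatively worded items score (5 - rating), with the total rescaled to 0-100. The polarity assignment is inferred from the statement wording, not taken from the MyUI study materials, and because the adapted scale replaces several standard items, the result is not directly comparable with published SUS norms.

```python
# Item polarity inferred from the adapted statements above
# (assumption, not part of the study materials).
POSITIVE = {2, 4, 6, 8, 10, 12, 14}   # e.g. "I would like to use this system frequently"
NEGATIVE = {1, 3, 5, 7, 9, 11, 13}    # e.g. "I found the system unnecessarily complex"

def adapted_sus_score(responses):
    """responses: dict mapping item number (1-14) to a rating 1-5.
    Returns a score on a 0-100 scale, higher = more usable."""
    total = 0
    for item, rating in responses.items():
        if not 1 <= rating <= 5:
            raise ValueError(f"item {item}: rating {rating} outside 1-5")
        # Positive items reward agreement; negative items reward disagreement.
        total += (rating - 1) if item in POSITIVE else (5 - rating)
    # Each item contributes at most 4 points; rescale to 0-100.
    return total * 100.0 / (4 * len(responses))

# A participant answering 5 on every positive item and 1 on every
# negative item would score the maximum.
best = {i: (5 if i in POSITIVE else 1) for i in range(1, 15)}
print(adapted_sus_score(best))  # 100.0
```

A uniformly neutral response sheet (all items rated 3) scores 50, which makes the midpoint easy to sanity-check when transcribing paper questionnaires.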
Interview Guide: Please look at the observation checklist and ask questions based on observations. For example: "I noticed that you were struggling at this point. I'm going to play the video back to you; can you please talk me through what you were thinking at this point."

Obs No | Task No | Start Time | Reason for this observation (i.e. reason for struggling)
1 | 1 | 13:17 | Went to New first. Truncation of text in large fonts in boxes is probably confusing him (e.g. "JOHN" becomes "JO…"). The email font was very large but he didn't complain, as he didn't need to read it to complete the task. (Probably an experimental artefact; he might dislike it in a real-life setting.)
7 | 6 | 13:30 |
8 | 7 | 13:35 |
9 | 8 | 13:38 | Didn't see a need to save location (on weather app). He thought that was about start-up defaults next time he goes into the app. He assumed the Forecast button was there to request that a forecast be generated. Once explained, he said it seems to be an unnecessary extra step.
10 | 9 | 13:44 | The "audio" part of the task instructions seemed to make the task complex. (TTS) icon indicates speech – thought it means I could speak to the system. Then he suggested using Skype.
11 | 9 | 13:48 | Then wanted to forward the weather screen, or minimise it and attach it to a message. Thinking as a computer user.
12 | 10 | 13:50 | Thought "New Email" was the way to get into the email application (whereas in fact he was already in it). He also assumed the CONTACTS button meant people who were already contacts, whereas in fact it also has a function to add new contacts.
Targeted Questions:

Acceptance
1. Did you like the email application? Not as much as my own (AOL). Once familiar with a system, older people don't like change so much. We get used to the familiar. Not averse to change but it takes longer to learn new stuff.
2. Did you like the way the email adapted to your needs? Yes
3. Did you like the different ways of replying? Yes
4. Which method would you use most regularly? Text – due to familiarity. But might learn to use audio.
5. Were you happy for the system to save your personal capability information? Yes
6. Do you feel your personal capability information is securely saved on the MyUI system? Yes

Usability
1. Did you find the email application easy to use? Yes; would do after a bit of practice
2. Was it obvious how to do things? Pretty well
3. Did you experience any problems while using email (adapt question according to whether opening/reading etc.)? Only on audio
4. Did you feel in control of the changes that occurred? OK
5. Are there any other features you would like included? No
6. Are there any improvements you would make? Not really

Utility
1. Have you ever written an email before? Yes
2. If so, how does the MyUI email compare to the system you used before? Quite different. Neither better nor worse; would be learnable
3. Did you find the email easy to access? Yes
4. Was the display clear? Yes
5. Did the system adapt appropriately to your needs (e.g. eyesight)? Yes
Open questions (Further adaptation and sensors):

Adaptation
At the end of the interview session, we will show an extreme random adaptation and ask participants' opinions on this. (I'm not exactly sure what this means. Is Bao?) As you may have already seen, we are able to change certain aspects of the display. This can either be done manually (as a user), as you have seen in the task. Alternatively, the system has sensors which can detect your needs and change the display accordingly. Here is an example of how the display might change based on the user's needs. NB – no extreme random adaptation was shown; questions were answered reflecting on the earlier session.

1a. Do you like the change? Font was at times so large that wording was truncated or illegible
1b. Why?
2a. Would you use the display in its current format? Yes
2b. Why?
3a. Are there any people who you think would benefit from this? People with poor eyesight. But it would be better for most people to reduce the font size so the wording was all visible
3b. Why?
4. Are there any changes that could be made that may be beneficial for you, which we haven't shown you today? No

Sensors
We were telling you earlier about some of the sensors that the system uses to detect your needs. These sensors can detect things like where you're looking, how far forward you're leaning, and inactivity. Leaning forward and where you're looking are detected by a camera attached to your TV set.
1. How would you feel about having a camera in your home attached to your TV and facing where you are sitting? Might not like it. I would rather not be dependent on cameras. I have a bad back and change my position quite a lot.
2. How would you feel about the TV detecting your needs and adapting accordingly? OK
3. How do you feel about the TV storing information about your capabilities? OK
Thank participant and offer gift (pot plant, chocs etc)