Friday, December 16, 2011

QB Final Project: CS 820 – Usability & Interaction

CS 820


Usability and Interaction Project Sections and Contents



1.  Introduction




This document collects the CS 820 Usability and Interaction course deliverables, organized by project section, into a final project.

          1.1. Scope – For the constraints, what is the scope of your project? For the limits, what is one thing that you are not doing?


Scope


The scope of my Q’s Diabetic App project is a single mobile phone that determines whether an individual is having an actual seizure or is suffering from hypoglycemia. Only one interaction, or task, will be executed at a time, and the difference between the two conditions will be confirmed.

Limits


The Q’s Diabetic App project is limited to sighted users and the following tasks:

1. Contact family and the primary physician if the user is having a seizure

2. If the sensor is removed, the user may not get an adequate diagnostic reading

3. Clone the Diabetic App image onto the mobile phone and turn off the sensor monitoring system

4. Restore the app image on the mobile phone

5. Restore the mobile phone to “phone” sensor monitoring mode

1.2. Purpose – Why is this interface needed?


The Q’s Diabetic App interface is needed because mobile phones are widely available and can be used to control devices that actually help diabetics. In this user experience, participants wear a sensor, attached to the user’s arm or bed, that can detect whether the user is about to have a seizure or is becoming hypoglycemic. The sensor detects movement both while the user is up and while they are lying in bed. It can also alert the user’s family through the phone’s contact list to let them know that the individual is having a seizure and may need help, and it can alert their doctor if the seizure was severe. The sensor measures the severity of the event, and if the event is hypoglycemia it alerts the user to eat right away to bring their sugar level back up.
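To make the alert flow above concrete, here is a minimal Python sketch of the kind of logic the sensor and app could use. The function names, thresholds, and contact structure are illustrative assumptions, not the actual Q’s Diabetic App implementation (the real prototype is a hyperlinked PowerPoint).

```python
# Hypothetical sketch of the monitoring flow described above.
# Thresholds, function names, and the notify callback are assumptions for
# illustration only -- they are not part of the actual prototype.

SEIZURE_MOTION_THRESHOLD = 0.8    # assumed normalized motion score from the arm/bed sensor
HYPOGLYCEMIA_GLUCOSE_MGDL = 70    # common clinical cutoff for low blood sugar

def classify_event(motion_score, glucose_mgdl):
    """Decide whether a reading looks like a seizure, hypoglycemia, or neither."""
    if motion_score >= SEIZURE_MOTION_THRESHOLD:
        return "seizure"
    if glucose_mgdl is not None and glucose_mgdl < HYPOGLYCEMIA_GLUCOSE_MGDL:
        return "hypoglycemia"
    return "normal"

def handle_reading(motion_score, glucose_mgdl, contacts, notify):
    """Route the event: family and physician for a seizure, an eat-now
    reminder to the user for hypoglycemia."""
    event = classify_event(motion_score, glucose_mgdl)
    if event == "seizure":
        for person in contacts["family"] + [contacts["physician"]]:
            notify(person, "Seizure detected -- please check on the patient.")
    elif event == "hypoglycemia":
        notify(contacts["user"], "Low blood sugar detected -- please eat something now.")
    return event

# Example run with placeholder contacts and a print-based notifier.
contacts = {"family": ["Family member"], "physician": "Primary physician", "user": "Nolan"}
handle_reading(0.9, 110, contacts, lambda who, msg: print(f"{who}: {msg}"))
```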



2.  User Profiles – use the checklist, then identify at least two user classes that are your target populations.




Project Topic: Resolving a misunderstood diagnosis in a diabetic who has both seizures and hypoglycemia, using a mobile phone (e.g., BlackBerry Storm). User tasks will include monitoring seizure activity via a sensor connected to the mobile phone.

USER 1 – Tech-Savvy Early Adopter. This user is Dr. Michael Harrington, a 41-year-old endocrinologist who wants to help his patients in the healthcare industry. He likes to research everything from animal cells to human cells and has realized that he can replicate human islets. With his interest in emerging technology and advanced medical forensic software, he wants to be a part of this growing field. Specific user profile characteristics for this user are as follows:

Psychological Characteristics


Cognitive style


   Spatial/visual


Attitude


   Positive


Motivation


   High


Knowledge and Experience


Reading level


   Above twelfth grade


Typing skill


   High


Education


   Advanced degree


System experience


   Expert


Task experience


   Moderate


Application experience


   Some similar systems


Native language


   English


Culture


   Native culture


   Context culture – occasional home office worker


   Sub-culture – tech savvy, early adopter


Use of other systems


   Frequent


Computer literacy


   High


Job and Task Characteristics


Frequency of use


   High


Primary training


   None


System use


   Discretionary


Job categories


   Endocrinologist


Turnover rate


   High


Other tools


   Telephone


   Contact List


   PC


   Medical information


Task importance


   High


Task structure


   Moderate


Physical Characteristics


Color-blind


   No


Handedness


   Right


Gender


   Male


Accessibility


   No Impairment








 


USER 2 – Not Tech-Savvy Middle-Aged Citizen. This user is Tom Benedict, a 31-year-old dietitian who works for the state hospital in Wake County, North Carolina. Tom has been an avid nutritionist and specialist for North Carolina State. He wants easy-to-use technology. He researches diabetes and has a mobile phone that his kids got him. He reluctantly uses a few advanced features of his mobile phone after his kids demonstrate how these features work. Specific user profile characteristics for this user are as follows:



Psychological Characteristics


Cognitive style


   Spatial/visual


Attitude


   Neutral


Motivation


   Moderate




Knowledge and Experience


Reading level


   Above twelfth grade


Typing skill


   Medium


Education


   College degree


System experience


   Novice


Task experience


   Novice in field


Application experience


   One similar system


Native language


   English


Culture


   Native culture


   Context culture – Meal planner


   Sub-culture – Middle age citizen


Use of other systems


   Little or none


Computer literacy


   Low




Job and Task Characteristics


Frequency of use


   High


Primary training


   None


System use


   Discretionary


Job categories


   State Dietitian


Turnover rate


   Low – Reluctant to monitor seizures in diabetics


Other tools


   Telephone


   Calculator


   Mobile phone – Gift from children


Task importance


   High


Task structure


   High




Physical Characteristics


Color-blind


   No


Handedness


   Right


Gender


   Male


Accessibility


   No impairment




3. User Interface – plan for the interaction and interface design




The following section describes the interface metaphor for the prototype, shows representative user interface screen shots, and contains a link to the final prototype.

3.1. User interface Metaphor – info from the forums, plus any new information




Interface Metaphor: Q’s Diabetic App sensor monitor, i.e., the mobile phone will “become” the physician’s assistant, so to speak. Using the mobile phone touch screen to input health information will help the patient decipher whether they are having a seizure or experiencing hypoglycemia. Additionally, the interface will provide icons that afford the ability to clone the app image onto the mobile phone and to restore the mobile phone to “phone” mode.
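The clone/restore metaphor can also be read as a small set of phone modes. Below is a minimal sketch, assuming three mode names (“phone”, “sensor monitoring”, “silent monitoring”) and the transitions implied by the tasks in this document; the real prototype does not expose such an API.

```python
# Assumed mode names and transitions, based on the tasks described in this
# document; an illustrative sketch, not the prototype's actual design.

ALLOWED_TRANSITIONS = {
    "phone": {"sensor monitoring"},                       # clone the app image / activate the sensor
    "sensor monitoring": {"silent monitoring", "phone"},  # go quiet for sleep, or restore phone mode
    "silent monitoring": {"sensor monitoring", "phone"},  # wake the sensor back up, or restore phone mode
}

class DiabeticAppModes:
    """Tiny state machine for the mobile phone's monitoring modes."""

    def __init__(self):
        self.mode = "phone"

    def switch(self, new_mode):
        if new_mode not in ALLOWED_TRANSITIONS[self.mode]:
            raise ValueError(f"cannot switch from {self.mode!r} to {new_mode!r}")
        self.mode = new_mode
        return self.mode

# Example: activate monitoring, go silent while the user sleeps, then restore "phone" mode.
modes = DiabeticAppModes()
modes.switch("sensor monitoring")
modes.switch("silent monitoring")
modes.switch("phone")
print(modes.mode)  # phone
```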




Representative User Interface Screen Shots




Figure 1 – Q’s Diabetic App Displaying Home Screen



The following image shows the Q’s Diabetic App being activated; the seizure and hypoglycemia levels can now be monitored through the sensor.










During actual seizure activity, the user’s family and primary care physician are notified.







Figure 2 – Q’s Diabetic App Mode







The following image shows the Q’s Diabetic App home login screen after the user inputs information about their health to monitor both hypoglycemia and diabetic seizure levels.








If the user experiences hypoglycemia, they are alerted via the sensor to eat and bring their sugar levels back up.




User Interface Prototype Link


Q’s Diabetic Mobile App prototype can be found here:




4. Use Case Scenarios -- scripted or formal scenarios that tie to the test cases.




Nolan – Type 1 diabetes patient who suffers from frequent seizures

Use Case 1: Turning on the diabetic app sensor to monitor why seizures occur so frequently.

Preconditions:

1.            User is in bedroom trying to take a nap and activates the sensor to monitor seizure activity during sleep.

2.            Mobile phone in silent seizure monitoring mode

3.            User has Q’s Diabetic App available on mobile phone (i.e., icon visible on mobile phone screen)

Use Case:

1.            Using mobile phone, select diabetic type

2.            Q’s diabetic app is shown on mobile phone

3.            Using the mobile phone’s diabetic app, the user activates the sensor

4.            Q’s diabetic sensor is turned on

Post conditions: Q’s diabetic app is turned on and the user can now detect the patterns that occur before or lead up to a seizure
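As a sanity check on the ordering of these steps, here is a short Python walk-through of Use Case 1 against a stand-in app object. The FakeDiabeticApp class and its methods are hypothetical stand-ins for the PowerPoint prototype, included only to show the step sequence and post condition.

```python
# Hypothetical stand-in for the prototype; the class and method names are
# assumptions used to walk through Use Case 1, not real app APIs.

class FakeDiabeticApp:
    def __init__(self):
        self.visible = False
        self.sensor_on = False

    def select_diabetic_type(self, diabetic_type):
        self.visible = True          # selecting a type brings the app up on screen

    def activate_sensor(self):
        self.sensor_on = True

def run_use_case_1(app):
    app.select_diabetic_type("Type 1")   # Step 1: select diabetic type
    assert app.visible                   # Step 2: Q's diabetic app is shown on the phone
    app.activate_sensor()                # Step 3: user activates the sensor
    assert app.sensor_on                 # Step 4: Q's diabetic sensor is turned on
    return app.sensor_on                 # Post condition: monitoring can begin

print(run_use_case_1(FakeDiabeticApp()))  # True
```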

Use Case 2: Monitoring seizure conditions with mobile phone

Preconditions:

1.            User is in the bedroom trying to get some sleep and can still keep tabs on seizure activity

2.            Q’s diabetic app is validating user inputted information

3.            User has Q’s diabetic  application active on mobile phone

Use Case:

1.            Using the mobile phone, decipher the difference between hypoglycemia and a seizure.

2.            Verify that the health information and contact list are correct

Post conditions: Q’s diabetic app and sensor have been deemed active

Use Case 3: Move to diabetic type and seizure activity

Preconditions:

1.            User is in bed sleeping and sensor gives information updates every 20 minutes

2.            Sensor is silent but vibrates in hibernate mode because the user is sleeping

3.            User has Q’s diabetic app active on mobile phone

Use Case:

1.            Using mobile phone, select “Diabetic Type”

2.            Using the mobile phone, select Enter after entering the information, then move to the user’s doctor information

3.            Using mobile phone, turn on seizure activity monitor

4.            Verify family member in the contact list

Use Case 4: User changes mind, wants to restore monitor level to non-moderate

Preconditions:

1.            Using the mobile phone, decipher the difference between hypoglycemia and a seizure.

2.            Verify that the health information and contact list are correct

Use Case:

1.            Using mobile phone, select “Diabetic Type”

2.            Using the mobile phone, select Enter after entering the information, then move to the user’s doctor information

3.            Using mobile phone, turn on seizure activity monitor

4.                  Verify family member in the contact list

Post conditions: Q’s Diabetic App can help the user distinguish a serious medical condition from a temporary one.

Use Case 5: Continue Sensor monitoring experience

Preconditions:

1.            User has Q’s Diabetic App available on mobile phone (i.e., icon visible on mobile phone screen)

Use Case:

1.            Using mobile phone, select diabetic type

2.            Q’s diabetic app is shown on mobile phone

3.            Using the mobile phone’s diabetic app, the user activates the sensor

4.            Q’s diabetic sensor is turned on



Post conditions: Q’s Diabetic App on the mobile phone is in sensor monitoring mode



5. Usability Specification – identify the goals & what to measure -- use 3 QUIS rows




The following table illustrates my Usability Specification with my test results filled in.

| Task # | Task Description or Survey Question | Value Measured/QUIS Goal | Current Level | Min. (or Max.) Acceptable | Planned Target | Best Possible | Test Result 1 | Test Result 2 | Test Result 3 | Avg. Result |
|---|---|---|---|---|---|---|---|---|---|---|
| 1 | Turning on sensor using mobile phone; actor is person (i.e., user) at home | Success rate (QUIS goal: System capabilities) | Est. 98% | 2 (of 3) | 3 | 3 | S | S | S | 3 |
|   |   | Task completion time (QUIS goal: Learning) | Est. 3 sec | 2 min. | 20 sec. | 5 sec. | 4 | 3 | 3 | 3.33 |
|   |   | Number of errors (QUIS goal: Overall user reactions) | Est. .05 | 3 | 0 | 0 | 1 | 0 | 0 | .33 |
|   |   | Verbal feedback (QUIS goal: Overall user reactions) | Est. 0 | 3 (ver. reactions) | 1 | 0 | 0 | 0 | 0 | 0 |
| 2 | Increasing sensor experience via mobile phone | Success rate (QUIS goal: System capabilities) | Est. 96% | 2 (of 3) | 3 | 3 | S | S | S | 3 |
|   |   | Task completion time (QUIS goal: Learning) | Est. 3 sec | 1 min. | 15 sec. | 5 sec. | 4 | 3 | 3 | 3.33 |
|   |   | Number of errors (QUIS goal: Overall user reactions) | Est. .05 | 2 | 0 | 0 | 0 | 0 | 0 | 0 |
|   |   | Verbal feedback (QUIS goal: Overall user reactions) | Est. 0 | 3 (ver. reactions) | 1 | 0 | 0 | 0 | 0 | 0 |
| 3 | Move sensor level experience to monitor user even in sleep | Success rate (QUIS goal: System capabilities) | Est. 90% | 2 (of 3) | 3 | 3 | S | S | S | 3 |
|   |   | Task completion time (QUIS goal: Learning) | Est. 5 sec | 2 min. | 20 sec. | 15 sec. | 7 | 6 | 3 | 5.33 |
|   |   | Number of errors (QUIS goal: Overall user reactions) | Est. 1.5 | 4 | 1 | 0 | 1 | 1 | 0 | .66 |
|   |   | Verbal feedback (QUIS goal: Overall user reactions) | Est. 1.5 | 4 (ver. reactions) | 2 | 0 | 1 | 1 | 0 | .66 |
| 4 | User changes mind, wants to restore seizure-activity experience to the phone and keep a record of results; move sensor to inactive level | Success rate (QUIS goal: System capabilities) | Est. 85% | 2 (of 3) | 3 | 3 | S | S | S | 3 |
|   |   | Task completion time (QUIS goal: Learning) | Est. 10 sec | 2 min. | 20 sec. | 10 sec. | 5 | 6 | 2 | 4.33 |
|   |   | Number of errors (QUIS goal: Overall user reactions) | Est. 2 | 4 | 1 | 0 | 1 | 0 | 0 | .33 |
|   |   | Verbal feedback (QUIS goal: Overall user reactions) | Est. 1.5 | 5 (ver. reactions) | 2 | 0 | 1 | 0 | 0 | .33 |
| 5 | Continue sensor monitoring experience; restore mobile phone to “silent monitoring” mode | Success rate (QUIS goal: System capabilities) | Est. 99% | 2 (of 3) | 3 | 3 | S | S | S | 3 |
|   |   | Task completion time (QUIS goal: Learning) | Est. 5 sec | 2 min. | 30 sec. | 10 sec. | 9 | 2 | 2 | 4.33 |
|   |   | Number of errors (QUIS goal: Overall user reactions) | Est. .05 | 4 | 1 | 0 | 1 | 0 | 0 | .33 |
|   |   | Verbal feedback (QUIS goal: Overall user reactions) | Est. 1 | 5 (ver. reactions) | 2 | 0 | 0 | 0 | 0 | 0 |





The following table lists three qualitative QUIS measures and responses from three test subjects.

QUIS Qualitative Survey Scores:

| # | Question | Score 1 (1–9) | Score 2 (1–9) | Score 3 (1–9) | Average Result |
|---|---|---|---|---|---|
| 1 | On a scale of 1 to 9 with 9 being wonderful, please rate the following: overall reactions to the system. | 8 | 8 | 9 | 8.33 |
| 2 | On a scale of 1 to 9 with 9 being easy, please rate the following: learning to operate the system. | 8 | 7 | 9 | 8 |
| 3 | On a scale of 1 to 9 with 9 being always, please rate the following: the system is reliable. | 9 | 9 | 9 | 9 |
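As a quick check of the arithmetic behind the “Avg. Result” column, each average is simply the mean of the three test results, rounded to two decimals. The snippet below recomputes a few of the values shown above.

```python
# Recompute a few "Avg. Result" values from the per-subject numbers above.

def average(results):
    return round(sum(results) / len(results), 2)

print(average([4, 3, 3]))  # Task 1 completion time -> 3.33
print(average([1, 0, 0]))  # Task 1 number of errors -> 0.33
print(average([8, 8, 9]))  # QUIS question 1 score  -> 8.33
```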





 


6. Test Cases and Test Procedure – use at least 3-5 test cases, testable in 5-10 minutes




Test Procedure

How: Testing will occur in person, with the user directly manipulating the prototype. The user interface prototype is a hyperlinked PowerPoint presentation. The test conductor will verbally describe each test task and allow the user to complete the task unaided. The test conductor will provide verbal feedback to the user, strictly adhering to this test procedure. The user will be presented with only one test case at a time.

When: December 16, 2011

Where: Nolan residence, Miami, Florida



Task 1: Activating Q’s Diabetic App on mobile phone

Preconditions:

1.            Using mobile phone, select diabetic type

2.            Q’s diabetic app is shown on mobile phone

3.            Using the mobile phone’s diabetic app, the user activates the sensor

4.            Q’s diabetic sensor is turned on

Use Case:

1.            Using mobile phone, select “Diabetic Type”

2.            Using the mobile phone, select Enter after entering the information, then move to the user’s doctor information

3.            Using mobile phone, turn on seizure activity monitor

4.            Verify family member in the contact list

Quantitative Data Collection:

1.                  Task Completion Time =

2.                  Number of Errors =

3.                  Verbal Feedback =

Post conditions: Q’s Diabetic App sensor is active and visible on the mobile phone



Posttest Quantitative Data Calculation:

1.                  Success Rate =
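The success-rate field left blank above is filled in after the session as the number of subjects who completed the task divided by the number who attempted it. A minimal sketch, with placeholder counts rather than recorded results:

```python
# Placeholder counts only -- substitute the recorded results for each task.

def success_rate(successful_subjects, total_subjects):
    return successful_subjects / total_subjects * 100

print(f"Success Rate = {success_rate(3, 3):.0f}%")  # e.g., 3 of 3 subjects succeed -> 100%
```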



Task 2: Monitoring health issue using mobile phone

Preconditions:

1.            Using the mobile phone, decipher the difference between hypoglycemia and a seizure.

2.            Verify that the health information and contact list are correct

Use Case:

1.            Using mobile phone, select “Diabetic Type”

2.            Using the mobile phone, select Enter after entering the information, then move to the user’s doctor information

3.            Using mobile phone, turn on seizure activity monitor

4.            Verify family member in the contact list

Quantitative Data Collection:

1.                  Task Completion Time =

2.                  Number of Errors =

3.                  Verbal Feedback =

Post conditions: User has validated that sensor is active



Posttest Quantitative Data Calculation:

1.                  Success Rate =





Task 3: Move (i.e., clone) Q’s Diabetic App onto the mobile phone; change to silent hibernate mode

Preconditions:

1.                  User is at home with the mobile phone (turned on)

2.                  Sensor is attached to the user’s arm or bed

3.                  User has the Q’s Diabetic App active on the mobile phone

Use Case:

1.            Using mobile phone, select diabetic type

2.            Q’s diabetic app is shown on mobile phone

3.            Using the mobile phone’s diabetic app, the user activates the sensor

4.            Q’s diabetic sensor is turned on

Quantitative Data Collection:

1.                  Task Completion Time =

2.                  Number of Errors =

3.                  Verbal Feedback =

Post conditions: Q’s diabetic app is activated and sensor hibernating mode is turned on



Posttest Quantitative Data Calculation:

1.                  Success Rate =



Task 4: User changes mind, wants to restore the seizure-activity experience to the phone and keep a record of results; move the sensor to the inactive level

Preconditions:

1.                  User updates the record of seizure and hypoglycemia activity

2.                  Mobile Phone is monitoring user health information via sensor

3.                  User has Q’s Diabetic application active on mobile phone

Use Case:

1.                  Using the mobile phone while the user sleeps

2.            Mobile phone in silent seizure monitoring mode

3.            User has Q’s Diabetic App available on mobile phone (i.e., icon visible on mobile phone screen)

Quantitative Data Collection:

1.                  Task Completion Time =10

2.                  Number of Errors =2

3.                  Verbal Feedback =4

Post conditions: User has Q’s Diabetic application active on mobile phone



Posttest Quantitative Data Calculation:

1.                  Success Rate =



Task 5: Continue sensor monitoring experience; restore mobile phone to “sensor monitoring” mode

Preconditions:

Use Case:

1.            Using mobile phone, select “Diabetic Type”

2.            Using the mobile phone, select Enter after entering the information, then move to the user’s doctor information

3.            Using mobile phone, turn on seizure activity monitor

4.            Verify family member in the contact list



1.                  Using the mobile phone, select silent “Phone” mode

2.                  Verify the user’s input information

3.                  Verify the mobile phone is in sensor activation mode

Quantitative Data Collection:

1.                  Task Completion Time =15 minutes

2.                  Number of Errors =4

3.                  Verbal Feedback =4

Post conditions: Validate that sensor is in activation mode



Posttest Quantitative Data Calculation:

1.                  Success Rate = 90%





QUIS Qualitative Survey Questions – Ask each user the selected qualitative QUIS questions. Quantitative measures will be collected during each test.



Test Plan

Greeting and Short Introduction to the Project: “Hi, I’m Quiana Bradshaw and we’ll be testing my mobile phone diabetic app user interface prototype. Pretend you are in a room in your home, that you have diabetes, and that you are holding a touch screen mobile phone. I will describe each task, and then you will perform the task as instructed. Let’s begin.”

Task 1: Turning on sensor using mobile phone

Test Conductor Instructions: “Begin Task 1. Using the mobile phone, please turn on the sensor.”

Test Conductor Okay Response: “Okay, good job. That is the end of Task 1.”

Test Conductor - Problem Response 1: “That is not what was expected. Please try again.”

Test Conductor - Problem Response 2: “That didn’t work right. Please try again from the beginning. Using the mobile phone, please turn on the sensor.”

Test Conductor - Problem Response 3: “This isn’t working, but that’s okay. Let’s move on to the next task.”



Task 2: Using Q’s Diabetic App on the mobile phone

Test Conductor Instructions: “Begin Task 2. Using the mobile phone please input diabetes type.”

Test Conductor Okay Response: “Okay, good job. That is the end of Task 2.”

Test Conductor - Problem Response 1: “That is not what was expected. Please try again.”

Test Conductor - Problem Response 2: “That didn’t work right. Please try again from the beginning. Using the mobile phone, please connect the sensors to activate them and input the proper health information.”

Test Conductor - Problem Response 3: “This isn’t working, but that’s okay. Let’s move on to the next task.”



Task 3: Move (i.e., clone) Q’s diabetic app and change to silent hibernate mode

Test Conductor Instructions: “Begin Task 3. Using the mobile phone, first clone Q’s Diabetic App onto the mobile phone. Then, please change to silent hibernate mode.”

Test Conductor Okay Response: “Okay, good job. That is the end of Task 3.”

Test Conductor - Problem Response 1: “That is not what was expected. Please try again.”

Test Conductor - Problem Response 2: “That didn’t work right. Please try again from the beginning. Using the mobile phone, please clone Q’s Diabetic App onto the mobile phone. After that, please turn off the sound and change to silent mode.”

Test Conductor - Problem Response 3: “That’s not quite everything. Remember, we want to do two things: first change to a quiet mode, then shift the sensors.”

Test Conductor - Problem Response 4: “This isn’t working, but that’s okay. Let’s move on to the next task.”



Task 4: User changes mind, wants to restore sound-activated mode from the silent sensor mode on the mobile phone

Test Conductor Instructions: “Begin Task 4. Using the mobile phone, please restore the phone from silent mode back to sound-activated mode.”

Test Conductor Okay Response: “Okay, good job. That is the end of Task 4.”

Test Conductor - Problem Response 1: “That is not what was expected. Please try again.”

Test Conductor - Problem Response 2: “That didn’t work right. Please try again from the beginning. Using the mobile phone, please clone the mode and change back to sound activated mode.”

Test Conductor - Problem Response 3: “This isn’t working, but that’s okay. Let’s move on to the last task.”



Task 5: Continue to validate that user, physician, and family contact information is correct; then restore the mobile phone to “Phone” mode

Test Conductor Instructions: “Begin Task 5. This is the last task. Using the mobile phone, please restore the mobile phone back to the “Phone” mode.”

Test Conductor Okay Response: “Okay, good job. That is the end of Task 5.”

Test Conductor - Problem Response 1: “That is not what was expected. Please try again.”

Test Conductor - Problem Response 2: “That didn’t work right. Please try again from the beginning. Using the mobile phone, please restore the mobile phone back to the “Phone” mode.”

Test Conductor - Problem Response 3: “This isn’t working, but that’s okay. Let’s finish with three survey questions.”



QUIS Qualitative Survey Questions:

1.                  On a scale of 1 to 9 with 9 being wonderful, please rate the following: overall reactions to the system.

2.                  On a scale of 1 to 9 with 9 being easy, please rate the following: learning to operate the system.

3.                  On a scale of 1 to 9 with 9 being always, please rate the following: the system is reliable.



7. Test Results – hold a dry run test, modify and baseline the project, then test with 3-5 subjects and analyze the results. Record the results in your project.




The following table illustrates my test results.

| Task # | Task Description or Survey Question | Value Measured/QUIS Goal | Current Level | Min. (or Max.) Acceptable | Planned Target | Best Possible | Test Result 1 | Test Result 2 | Test Result 3 | Avg. Result |
|---|---|---|---|---|---|---|---|---|---|---|
| 1 | Turning on sensor using mobile phone; actor is person (i.e., user) at home | Success rate (QUIS goal: System capabilities) | Est. 98% | 2 (of 3) | 3 | 3 | S | S | S | 3 |
|   |   | Task completion time (QUIS goal: Learning) | Est. 3 sec | 2 min. | 20 sec. | 5 sec. | 4 | 3 | 3 | 3.33 |
|   |   | Number of errors (QUIS goal: Overall user reactions) | Est. .05 | 3 | 0 | 0 | 1 | 0 | 0 | .33 |
|   |   | Verbal feedback (QUIS goal: Overall user reactions) | Est. 0 | 3 (ver. reactions) | 1 | 0 | 0 | 0 | 0 | 0 |
| 2 | Increasing sensor experience via mobile phone | Success rate (QUIS goal: System capabilities) | Est. 96% | 2 (of 3) | 3 | 3 | S | S | S | 3 |
|   |   | Task completion time (QUIS goal: Learning) | Est. 3 sec | 1 min. | 15 sec. | 5 sec. | 4 | 3 | 3 | 3.33 |
|   |   | Number of errors (QUIS goal: Overall user reactions) | Est. .05 | 2 | 0 | 0 | 0 | 0 | 0 | 0 |
|   |   | Verbal feedback (QUIS goal: Overall user reactions) | Est. 0 | 3 (ver. reactions) | 1 | 0 | 0 | 0 | 0 | 0 |
| 3 | Move sensor level experience to monitor user even in sleep | Success rate (QUIS goal: System capabilities) | Est. 90% | 2 (of 3) | 3 | 3 | S | S | S | 3 |
|   |   | Task completion time (QUIS goal: Learning) | Est. 5 sec | 2 min. | 20 sec. | 15 sec. | 7 | 6 | 3 | 5.33 |
|   |   | Number of errors (QUIS goal: Overall user reactions) | Est. 1.5 | 4 | 1 | 0 | 1 | 1 | 0 | .66 |
|   |   | Verbal feedback (QUIS goal: Overall user reactions) | Est. 1.5 | 4 (ver. reactions) | 2 | 0 | 1 | 1 | 0 | .66 |
| 4 | User changes mind, wants to restore seizure-activity experience to the phone and keep a record of results; move sensor to inactive level | Success rate (QUIS goal: System capabilities) | Est. 85% | 2 (of 3) | 3 | 3 | S | S | S | 3 |
|   |   | Task completion time (QUIS goal: Learning) | Est. 10 sec | 2 min. | 20 sec. | 10 sec. | 5 | 6 | 2 | 4.33 |
|   |   | Number of errors (QUIS goal: Overall user reactions) | Est. 2 | 4 | 1 | 0 | 1 | 0 | 0 | .33 |
|   |   | Verbal feedback (QUIS goal: Overall user reactions) | Est. 1.5 | 5 (ver. reactions) | 2 | 0 | 1 | 0 | 0 | .33 |
| 5 | Continue sensor monitoring experience; restore mobile phone to “silent monitoring” mode | Success rate (QUIS goal: System capabilities) | Est. 99% | 2 (of 3) | 3 | 3 | S | S | S | 3 |
|   |   | Task completion time (QUIS goal: Learning) | Est. 5 sec | 2 min. | 30 sec. | 10 sec. | 9 | 2 | 2 | 4.33 |
|   |   | Number of errors (QUIS goal: Overall user reactions) | Est. .05 | 4 | 1 | 0 | 1 | 0 | 0 | .33 |
|   |   | Verbal feedback (QUIS goal: Overall user reactions) | Est. 1 | 5 (ver. reactions) | 2 | 0 | 0 | 0 | 0 | 0 |



The following table lists three qualitative QUIS measures and responses from three test subjects.

QUIS Qualitative Survey Scores:

| # | Question | Score 1 (1–9) | Score 2 (1–9) | Score 3 (1–9) | Average Result |
|---|---|---|---|---|---|
| 1 | On a scale of 1 to 9 with 9 being wonderful, please rate the following: overall reactions to the system. | 8 | 8 | 9 | 8.33 |
| 2 | On a scale of 1 to 9 with 9 being easy, please rate the following: learning to operate the system. | 8 | 7 | 9 | 8 |
| 3 | On a scale of 1 to 9 with 9 being always, please rate the following: the system is reliable. | 9 | 9 | 9 | 9 |



 


8. Conclusion – analyze the results and identify what needs to change. Note whether the test subjects fit your desired population.




Analysis of Results

Test Subject Demographics:

  1. Female, Age 28, Computer Expertise (1 – 9 with 9 being expert) = 5
  2. Male, Age 31, Computer Expertise (1 – 9 with 9 being expert) = 2
  3. Male, Age 18, Computer Expertise (1 – 9 with 9 being expert) = 6



Things that need to change:

1.      Add more complex capabilities to the prototype

2.      Replace two or more test tasks with more difficult tasks

3.      Consider adding speech recognition to the interface



Test subject fit:

1.                  While the three test subjects were different from each other, they did not include a senior citizen.

2.                  My test subjects all had novice to advanced computer experience. It would be interesting to test with a subject who has very little computer experience.

3.                  Overall, my test subjects were adequate to inform early prototype testing.



Reflection on Test Results:

1.                  Usability tasks, while realistic and useful, were probably too simple.

2.                  At time of Usability Specification development, I overestimated the times subjects would need to complete tasks.

3.                  I’m glad I specified 5 tasks (albeit easy tasks) instead of just 3 or 4 tasks.

4.                  My test plan and procedures worked very well. I’m glad I changed my tests to in-person and direct tester interaction with the prototype (vs. Breeze or WebEx).

5.                  The extra time I spent making the prototype intuitive and easy to use proved worth the effort.

6.                  Test subjects did not try the prototype prior to formal testing, which proved important in making a valid assessment of ease of use.

7.                  The survey results were quite high. This may be due in part to the simplicity of test tasks.

8.                  Post test and survey comments seem to indicate test subjects liked the prototype and the concept it represents (I might want to show this prototype to Verizon).



9. Areas of Future Research – identify recommended process changes and future areas of interaction design research.




Recommended process changes:

1.                  While the basic stages of discovery, development, evaluation, and implementation will probably continue to be required in the design process, future interaction design could likely benefit from techniques such as Agile software development.

2.                  Standard graphical user interface (GUI) screen components like menus, buttons, and windows will likely emerge for contemporary interfaces like mobile communications and gaming devices. The interaction design process should leverage these reusable components to accelerate interaction design.

3.                  Interaction processes need to evolve to support more convergent interaction including collaborative environments, embodied virtuality systems and immersive virtual reality (Heim, 2007).

4.                  Interaction design processes should accommodate complex interactions on increasingly smaller devices.



Future areas of interaction design research:

1.                  Existing and new interaction models will likely evolve to take advantage of emerging technologies including gesture recognition, eye tracking and natural language search.

2.                  Input techniques, including touch and gesture, are being incorporated by multiple vendors led by mobile devices like the pioneering Apple iPhone (Fenn et al., 2008). The constraints of menu-based interactions and small screens will continue to require new innovations in interaction design (Fenn et al., 2008).

3.                  Screen technology including low-cost large-screen displays and glanceable ambient displays will also help reshape office and home environments (Fenn et al., 2008). Touch and gesture will also play a role in these fixed displays and will be influenced by 3-D game controllers and multitouch interaction with large screens like the Microsoft Surface tabletop (Fenn et al., 2008).

4.                  New personal devices not well-adapted to GUIs will leverage new technologies including accelerometers, touch-sensitive fabrics, and location sensing (Fenn et al., 2008). Location-based services are already becoming popular and will most likely continue to improve. These location capabilities will enable richer context-based interactions.

5.                  Environmental interfaces, where the interface migrates from the device to the broader physical environment, are a major evolution (Fenn et al., 2008). These interfaces, exemplified by an incoming message being signaled by the music system and the user issuing a “show me” voice command, allow everything in the user’s vicinity to collaborate in an interaction (Fenn et al., 2008). These environmental interfaces will require new interaction designs.



References:




Fenn, J., Tully, J., Ball, R. J., O'Donovan, P., Kitagawa, M., Raskino, M., Dulaney, K., Baker, B. L., Schlegel, K., Cramoysan, S., Davies, J., Bell, T., Williams, M., Jacobs, J., Andrews, W., Prentice, S., Costello, R., Morrison, S., & Mason, R. F. (2008). Hype cycle for human-computer interaction, 2008. Gartner Research, December 22, 2008.

Heim, S. (2007). The Resonant Interface: HCI Foundations for Interaction Design. Addison Wesley. ISBN: 978-0321375964

Twitter (2011). Q Logo. Retrieved December 16, 2011 from http://twitter.com/qsensei

