Presenter Information

Tom Chappelear

Advisor Information

Ann L. Fruhling, Ph.D., MBA

Location

ROOM 225

Presentation Type

Oral Presentation

Start Date

1-3-2019 12:45 PM

End Date

1-3-2019 2:00 PM

Abstract

User acceptance is a measure that contributes to technology feasibility, a decision point that entrepreneurs use to make better investment decisions (Hoffer, 2011). User acceptance testing is part of a design strategy in which the system designer attempts to minimize risk and to provide design information that an entrepreneur can use when deciding whether to invest (Hoffer, 2011). User acceptance testing instruments range from a low-cost coffee-shop review to expensive lab-based user testing (Shneiderman, 2017). This study builds upon the Unified Theory of Acceptance and Use of Technology (UTAUT) model to measure user acceptance of the system being evaluated (Venkatesh, 2003). User acceptance is an essential attribute of system design; however, there is limited research literature concerning the evaluation of different types of user acceptance instruments (Shneiderman, 2017). Entrepreneurs often launch products containing technology that requires user acceptance testing; given the limited research on user acceptance instruments, this study will compare a low-cost user acceptance instrument to traditional hands-on user acceptance testing instruments. Specifically, we will determine whether a YouTube video presenting the product's user interface and features is comparable to a conventional hands-on user acceptance tool (Todd, 1998).

Bibliography

Hoffer, J. A. (2011). Modern systems analysis and design. Boston, MA: Pearson.

Shneiderman, B., & Plaisant, C. (2017). Designing the User Interface: Strategies for Effective Human-Computer Interaction (6th ed.). Boston, MA: Pearson.

Todd, K. H. (1998). Randomized, Controlled Trial of Video Self-Instruction Versus Traditional CPR Training. Annals of Emergency Medicine, 364-369.

Venkatesh, V., Morris, M. G., Davis, G. B., & Davis, F. D. (2003). User Acceptance of Information Technology: Toward a Unified View. MIS Quarterly, 27(3), 425–478.

Title
Decreasing Technology Design Costs by Using YouTube Video to Evaluate User Acceptance
