For more details on human factors see:
Salvendy, G. (ed.) (2012). Handbook of Human Factors and Ergonomics. Hoboken, NJ: John Wiley & Sons.
Kruijff, E., Swan II, J. E., and Feiner, S. (2010). “Perceptual Issues in Augmented Reality Revisited.” Proceedings of the IEEE and ACM International Symposium on Mixed and Augmented Reality, 3–12.
An introduction to the origins of nonconventional control can be found in this book chapter:
Bullinger, H., Kern, P., and Braun, M. (1997). “Controls.” In G. Salvendy (ed.), Handbook of Human Factors and Ergonomics, 697–728. New York: John Wiley & Sons.
More details on graphical widgets can be found in:
Schmalstieg, D., and Höllerer, T. (2016). Augmented Reality: Principles and Practice. Boston, MA: Addison-Wesley.
More details on using voice as an input modality can be found in the following text:
Pieraccini, R. (2012). The Voice in the Machine: Building Computers That Understand Speech. Cambridge, MA: MIT Press.
Jurafsky, D., and Martin, J. (2008). Speech and Language Processing: An Introduction to Natural Language Processing, Computational Linguistics, and Speech Recognition. Upper Saddle River, NJ: Prentice Hall.
More details on gestures can be found in the following text:
Wigdor, D., and Wixon, D. (2011). Brave NUI World: Designing Natural User Interfaces for Touch and Gesture. Burlington, MA: Morgan Kaufmann.
The following two papers provide more information on using tools as part of a 3D interface:
Ishii, H., and Ullmer, B. (1997). “Tangible Bits: Towards Seamless Interfaces between People, Bits, and Atoms.” Proceedings of the ACM Conference on Human Factors in Computing Systems (CHI ’97), ACM Press, 234–241.
Hinckley, K., Pausch, R., Goble, J., and Kassell, N. (1994). “Passive Real-World Interface Props for Neurosurgical Visualization.” Proceedings of the ACM Conference on Human Factors in Computing Systems (CHI ’94), ACM Press, 452–458.