Posts tagged “HCI”

ChittahChattah Quickies

  • [from julienorvaisas] Six scientists tell us about the most accurate science fiction in their fields [Mad Science io9] – [Brief interviews with scientists discussing where some of the real science resides in our science fiction. Great comments thread, including this one: "The other side of the coin is how has science benefited from science fiction stories."] Ronald Arkin, director of the Mobile Robotics Lab, Georgia Tech: "Realistic depictions of robots are pretty boring, so there's not much to say on what is accurate or not. No positronic brains, no running amok killing everyone and everything. I guess that's the fiction in science fiction. You watch enough videos of robots at real research conferences and it's hard to stay awake… Anyway, [one] comes to mind that is a bit more accurate than most: Hal 9000, in 2001, apart from his apparent psychotic episode, is a robotic system that people live inside. Current research agendas, in human-robot interaction, task planning, command and control, etc., could conceivably lead to such an intelligent system."
  • [from steve_portigal] Will You Try My Paper iPhone App? [Techcrunch] – [Stanford HCI student gets soundly criticized for seeking feedback on paper prototype with actual users! The drama – as often on the web – really takes off in the comments.] When I looked down at his hands, however, instead of an iPhone, he held a few pieces of paper with wireframe drawings in pencil. This was his app. I was supposed to pretend the paper was an iPhone screen and press the hand-drawn buttons as I shuffled through the flow. The idea is that you could point your camera at a magazine rack and get digital versions of the magazines, which you could preview on your iPhone and then purchase individual articles or the entire magazine. It made a lot more sense when he did it (see video). Now, there is nothing wrong with getting your ideas down on paper or paper prototypes to work out the kinks before you start coding. But you might want to wait until you have an actual working app on an iPhone before testing it out in the wild and asking for feedback from normal people.

Steve’s Masters Thesis

This PDF file is a (scan of a) short paper version of my 1994 M.Sc. Thesis from U of Guelph. My entire thesis was up on the web in the very earliest days of the web (back when NCSA Mosaic was the browser people used) as an experiment by a friend at Apple. If I recall, he turned the whole thing into HTML, etc. Anyway, I can’t imagine why anyone would want to read it, but if you are really hard up, perhaps this shorter paper might be of interest. If nothing else, I’ll blog it so I can find it later.

Update: full thesis has been found. See PDF here.

Abstract: An experiment compared the effectiveness of auditory, visual, and combination cues to convey document structure. Subjects demonstrated an equivalent level of understanding of the document structure and its content with either a combination cue or a visual cue. Subjects required more time to answer questions in the combination condition than in the visual condition. This suggests a greater cognitive effort is required. A sound-only condition had the poorest performance both in response time and in the subject’s answers to questions about the document’s structure and its content. Subjects were grouped based on whether or not they replayed sounds as a retention tactic. Subjects who replayed sounds did better than subjects who did not. These results contribute to our understanding of potential uses of sound in user interfaces. The specific cues used here for this particular task do not appear promising. Future research to determine how to make use of sound must carefully consider user tactics for processing sound cues.
