
I've only been awake for a few minutes and already I'm flummoxed. "The whole idea of using an anthropomorphic user interface to the computer is considered to be one of the biggest obstacles in computing," which I took from greyyguy's journal.

I thought like... high computational speeds with more stable hardware and less heat bleed-off were pretty high on the list, myself.
hehe.
To think that the thing that bothers a lot of people is the fact that computers don't have faces....


( 11 comments — Leave a comment )
Apr. 24th, 2002 08:55 am (UTC)
Well, those are big obstacles on the technical side, but I was referring to the user interface aspect :)

I was talking about making computers easy to use. All computer interfaces now are extremely artificial and contrived. We talk to people all the time, and are very used to that mode of interfacing. Computers expect people to learn a whole new way of giving instructions and receiving information. By making a more anthropomorphic interface - making the computer seem and act more human - people can be much more comfortable with using them.

Voice recognition is another aspect of an anthropomorphic interface. Being able to talk to your computer in a natural language and have it understand would be a huge step forward for the user interface.

As for it bothering people that computers don't have faces, I would suggest that it does to a degree. Thinking of people who are not comfortable with computers, like some of my family members whom I don't even trust to turn on my laptop, I think putting a face that they can interact with would add a greater comfort level to using the computer. Of course, if it was a scary face that might make things worse :)

Sorry to have flummoxed you so early in the morning :)
Apr. 24th, 2002 09:08 am (UTC)
I dunno... it's a machine. I don't want a machine to have a face. Faces imply personalities, lives.

and yes, it still seems silly to me that we're attempting to make computers more "human-like".
What's next? Trying to build a desktop PC that passes the Turing test?

I think that making computers more human-like in an attempt to make them more user friendly is just... wasted effort. *Simplicity* makes things easy to use, not something as complex as human physiology.

I dunno.. hehe... it just struck me as we-are-geeks-with-too-much-time-and-too-many-grants...

Apr. 24th, 2002 09:19 am (UTC)
I agree that simplicity is better in machines for the most part. But in this direction, I'm thinking of people that just can't (or won't) learn how to use a computer. The same people that have 12:00 flashing on their VCRs. I don't think that computers need to pass the Turing test to be easier to use, but I think they do need to be friendlier to people.

And while it is complex on the technical side, what could be simpler to use than looking at a face and talking to it?

And part of being a geek is doing things that look useless yet still have that "wow" factor to them :)
Apr. 24th, 2002 10:19 am (UTC)
I think that making machines "friendly" at all is the wrong tack to take.

I think we just have fundamental differences in the ways we view technology.

I know all I will ever want to know about geekhood.
Think I'll go find some luddites to chill with for awhile so I can appreciate the techy world again.
Apr. 24th, 2002 09:55 am (UTC)

Microsoft Bob and the Paperclip are the two most successful products of this idea, ever. A somewhat less successful one is the MS e-book reader, which worked so hard to be just like a book that ninety percent of its users CAN NOT find such functions as "go to chapter 3" or "go to a page a third of the way in". (I worked on that one... oh, lord, it was bad... even I couldn't figure out some of the features, and I was coding them, and there weren't that many.)

"Making computers easy to use" is a good thing to work on. "Making using a computer more like talking to a human" is a bad thing to work on, because on the inside where it counts, a computer is *not* like a human. You end up spending most of your effort hiding what's really going on from the user, and then they get *really* confused.

The "flashing 12" thing is NOT a user-interface problem. It's cost-cutting; the VCR manufacturers don't want to spend the extra two bucks a unit it would cost to put on "Set", "Hour" and "Minute" buttons. Remotes with lots of mysterious buttons and that silly wheel thing they WILL do, because those boost sales... but apparently, people actually won't pay for the ability to set the clock.
Apr. 24th, 2002 11:39 am (UTC)
Microsoft Bob and the Paperclip are the two most successful products of this idea, ever.

I wasn't quite sure if you were joking about that or not, seeing as how those were two of the most reviled things ever. But they were good implementations of trying to make the computer more personable. They just gave the computer an annoying personality.

I agree that on the inside a computer is not like a human, but I don't think that most people are able to get to that level of thinking and change how they look at the technology. The computers and software are sold as time savers and virtual assistants and the like. So the perception is that they are more human-like in their tasks than any other appliance people will use. Since they do human-like tasks of reminding us of appointments and the like, it makes sense to give them a human-like interface. That is the whole goal of voice recognition software: to be able to talk to your computer and have your computer understand. You talk to people all the time. It is a natural way of interacting.

The inside of a computer is nothing like the inside of a person, but I would say there are large groups of people out there that don't care what the inside is like. I would also suggest that a more intelligent interface is the logical step in computing. Just as the operating system hides the complexity of the underlying hardware from the software, the improved interface hides the complexity of the software from the user.
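That layering idea can be sketched in a few lines of code. This is a hypothetical toy, not anything from the thread: a tiny "friendly" front end that maps spoken-style phrases onto underlying operations, the way an OS maps system calls onto hardware. The phrases and operations are made up for illustration.

```python
# Toy sketch of an interface layer: the user speaks in phrases,
# and this layer translates them into "low-level" operations,
# hiding the software's real command structure from the user.

COMMANDS = {
    "what time is it": lambda: "the current time",
    "remind me": lambda: "reminder created",
}

def interpret(utterance: str) -> str:
    """Translate a spoken-style request into an underlying operation."""
    for phrase, action in COMMANDS.items():
        if phrase in utterance.lower():
            return action()
    return "sorry, I didn't understand that"

print(interpret("Hey, what time is it?"))
```

The user never sees `COMMANDS` or the lambdas, just as an application never sees the disk controller; each layer only has to understand the one below it.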
Apr. 24th, 2002 01:56 pm (UTC)
Not joking. Years of research went into Bob, lots of them. Likewise the paperclip-- there was actually a version of the help engine behind it in Excel 5.0, and some of the auto-format components were in Excel 5.0 and WinWord 2.0, and the team developing the front end that everybody hates was huge by MS feature-team standards. That paperclip was exactly what people clamored for, for decades; I remember reading speculative articles in Byte and Compute! that described it fairly well back in the early eighties. Be careful what you wish for, eh?

People are entirely capable of learning how to use tools, real ones or software, if those tools work in simple, obvious, consistent ways. Most of the windowing UI stuff seen on Windows, the Mac, X-Windows, etc., works fairly well for this. The thing that drives folks up the wall is when the computer doesn't act like a tool, when it works inconsistently. Things like Excel's wacky clipboard behavior or some of the unmarked "special" areas in Microsoft editors that select whole words or lines, or when it gives you the Blue Screen of Death at random intervals for no reason you can see.

Apr. 24th, 2002 11:47 am (UTC)
You worked on the MS e-book? I haven't used any of them but I keep hearing really bad things about all of them from just about everybody. Is there something really difficult about emulating a paper book in electronic format on a handheld device? I've always wondered about that...
Apr. 24th, 2002 01:34 pm (UTC)
Re: Also...
Most of the readers on the market aren't really... ahhh... READABLE, as it were. Cheapass low-res non-color LCD screens make them a literal pain to use for extended reading. Microsoft's reader does not have this problem, partly because it only runs on the desktop or Windows CE devices which tend to have good screens, and partly because of some truly nifty anti-aliasing technology they developed.

The MS reader has a different problem. They went in with the notion, "We're going to make this just like a book," and accordingly they dumped the Windows UI conventions and hid the entire UI except for the "previous page" and "next page" buttons. Thus, you can't really do anything once you're in a book. You've also got to jump through some weird digital-rights hoops-- can't copy and paste, can't read a book on more than one device, have to have a .NET passport, have to connect to the 'net and go to this one page on microsoft.com before you can read any copyrighted content at all, stuff like that.
Apr. 24th, 2002 01:57 pm (UTC)
Re: Also...
I thought the LCD screen would be part of the problem, and figured that a WinCE device or Palm would be better than developing a whole new platform, but didn't know if anyone was going in that direction. Now it turns out they are, but not well :)

I can't wait till groups figure out that the whole content protection thing is just annoying customers. I would love to have a dozen books I've bought with me in my pocket, but can't because they haven't figured out how to keep me from stealing them yet.
Apr. 25th, 2002 12:15 am (UTC)
Re: Also...
Some of the WinCE devices out there have really nice screens; one of the machines I had on my desk had something like 400x600 resolution at 96 dpi. Add in ClearType and text was... it was GOOD. I found myself reading about half of The Divine Comedy on it one evening, when I'd just missed a ferry, and it was no trouble at all, not in the car in the darkened parking lot, not on the boat with the weird fluorescent light poking in through the windshield, not up in the passenger cabin. Now, if only I could remember how to set a bookmark...

Yep. Digital rights management is an utter crock at the moment, from the user's point of view. It also doesn't help that the publishers charge the trade-paperback price for most e-books, and full paperback price for the rest.

