
"We don't have the user-centricity. Until we understand context, which is way beyond presence -- presence is the most trivial notion of context." - Bill Gates quoted in the Business 2.0 piece in my previous entry.

Now, I know I work with mobile technologies and spend a lot of time thinking about the next generation of user-centred web applications (along with how we can use elements from the ubiquitous computing world in business and government), so I live steeped in the jargon of the field, but there's nothing actually wrong with that quote.

In fact, it's a clear, correct and concise encapsulation of one of the biggest problems facing interaction designers and application developers today. While knowing that someone is online, and where they're online is good (presence), knowing why they're online and what they're doing while they're online is better (context)...
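To make the distinction concrete, here's a minimal sketch (Python, with hypothetical field names of my own invention, not anything from a real presence API) of how an application might model the two:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Presence:
    # The trivial signal: is the user reachable, and on which channel?
    user: str
    online: bool
    channel: str                      # e.g. "im", "mobile", "desktop"

@dataclass
class Context(Presence):
    # Everything beyond presence: what the user is doing, and why.
    activity: Optional[str] = None    # "in a meeting", "driving", ...
    location: Optional[str] = None    # "office", "home", ...
    goal: Optional[str] = None        # the "why": the hard part to capture

def should_interrupt(ctx: Context) -> bool:
    # Presence only answers "can we deliver a message?"; context lets the
    # application decide whether delivering it right now is actually helpful.
    if not ctx.online:
        return False
    return ctx.activity not in ("in a meeting", "driving")

bill = Context(user="bill", online=True, channel="im", activity="in a meeting")
print(should_interrupt(bill))         # False: present, but don't interrupt
```

The point of the toy `should_interrupt` function is that presence alone can only ever answer "can we reach them?", never "should we?".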

Note to Business 2.0:

Before you pull a random quote that you don't understand, why not spend a little time checking whether it's really garbage before running it? If you need to know more about ubi-comp and why context is a critical concept, go read my introduction to context computing (PDF). Call up my friends Tac, Kyle and Mike in San Francisco. Or better still, go hang out with ubi-comp folks at MIT and Stanford and think about Mark Weiser's original ubi-comp work at PARC.

Feh. Makes me ashamed to be a practicing technology journalist.

Comments

nmg
Mar. 20th, 2003 09:55 am (UTC)

What's also interesting is the way that the terminology is drifting. In the CS community with which I interact, ubiquitous computing has been largely supplanted by pervasive computing, although I've yet to be given a good example of how these two disciplines differ from each other.

sbisson
Mar. 20th, 2003 10:00 am (UTC)
Re:
No difference really. I think ubicomp is the west coast PARC derivation, pervasive computing the east coast MIT derivation...
nmg
Mar. 20th, 2003 10:56 am (UTC)

That would seem right - our local pervasive computing guru seems to have picked up most of his errant ways while on a sabbatical at MIT.

blufive
Mar. 20th, 2003 11:18 am (UTC)
This issue (highly specialist jargon being misread as bollocks by the ignorant) is often raised in letters to Private Eye about their regular "Pseuds Corner", usually when they publish something from an academic paper.

It's also something I run into in a couple of other contexts in my life: engineering (where "stiff", "strong", "tough" and "hard" have different, very precise meanings, and jelly is brittle) and, of course, programming. In the latter, there are times when it's obvious the English language just doesn't have enough synonyms for some concepts to allow a different word for each particular variant involved in a project/architecture/whatever.
stillcarl
Mar. 20th, 2003 01:39 pm (UTC)
Anyone who doesn't laugh at a word such as user-centricity should be shot as a deserter from the English language. (I'm complaining about aesthetics here, not accuracy;)

From: http://www.usercentricity.com/

"User-centricity constitutes a philosophy to support the design of digital media applications. It draws on and applies theory from product design, human-computer interaction, branding and marketing."

Surely they're not suggesting branding and marketing isn't all that's needed?

But anyway, the tendency of technologists to appropriate words and give them new meaning is part of the problem. Trying to use one word when five would be better is treating human language as if it were mathematics. For most people, this doesn't work.

And what about the "way beyond"? A very loose term I would've thought.
redbird
Mar. 21st, 2003 03:15 pm (UTC)
All that granted, that isn't an English sentence. Or rather, it isn't two English sentences: delete the first period and lower-case "until" and it's English. I also suspect that they're abusing the word "notion" here, when they mean something like "part" or "aspect."
sbisson
Mar. 22nd, 2003 12:47 am (UTC)
Re:
It's a verbatim quote of an off-the-cuff answer to a question - I wouldn't expect any better (I know I'm probably far worse when asked questions on panels or when giving presentations).

Printing normal speech patterns verbatim in an interview is never a good idea, but it happens all too often. A good journalist should be able to edit speech patterns out while retaining meaning.
(Deleted comment)
sbisson
Mar. 24th, 2003 05:22 am (UTC)
Re: good journo/bad journo
No, it's SOP to edit for meaning. Sure, you should check that that's OK - but you need to be sure that the reader can understand what's being said - I usually indicate that that's likely at the start of an interview, and get their permission then and there.

And once you've got that, getting the edits done is easy enough - instead of a direct quote, use an indirect quotation.

So instead of 'Bill Gates said, "new blah now blahs..."', it becomes 'Bill Gates said that the latest blah could blah'. Makes more sense to the reader, too...
(Anonymous)
Nov. 18th, 2003 06:40 am (UTC)
Bill Gates on Context
As someone who has worked on context in the field of telephony services, I too cannot understand the comments on Bill Gates' quote. I agree with you that it's a good description of the issues that will be facing the designers of distributed applications.

Just what are the factors that link a user's presence and co-presence to the user's context? Just how do the what, when, where, and who (i.e. the presence and co-presence) provide indications of the why (i.e. the context) of user actions? In other words, how can a system determine what the user's goals are in his interaction with the system, so that it can set its response to meet those goals?
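As a toy illustration of how thin that inference is (hypothetical fields and deliberately naive rules, not a real system), consider:

```python
from datetime import datetime

# A hypothetical presence/co-presence observation: the what, when,
# where and who that are visible to the system.
observation = {
    "what": "shared document open",
    "when": datetime(2003, 3, 20, 9, 30),
    "where": "office",
    "who": ["alice", "bob"],          # co-present colleagues
}

def infer_goal(obs: dict) -> str:
    # A naive rule-based guess at the "why" behind the presence signals.
    if len(obs["who"]) > 1 and "document" in obs["what"]:
        return "collaborating on a document"
    if obs["where"] == "office" and 9 <= obs["when"].hour < 17:
        return "routine work task"
    return "unknown"                  # presence alone underdetermines intent

print(infer_goal(observation))        # -> "collaborating on a document"
```

The real research question, of course, is what replaces those hand-written rules.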

The lack of understanding around this issue is, in my opinion, due to ignorance of the reasons for human interaction with a distributed computing system. Users do not interact with a computing system in order to perform actions on it. They interact to accomplish some task that will aid their collaboration with their colleagues. Understanding that computer interaction is only an instrumental part of a distributed human collaboration can greatly assist our understanding of how these applications should operate.

Tom Gray
PineTEL
tom_gray_grc@yahoo.com