WWW conference

by Eszter Hargittai on May 18, 2004

Today, I will be attending a conference workshop in New York on Measuring Search Effectiveness: The User Perspective. I will be presenting some findings about What Makes an Expert Searcher? Evidence from User Studies. (That paper is not ready for distribution, but I will take this opportunity to link again to the paper that presents the coding scheme I used to analyze most of the data.) The workshop is being held in conjunction with WWW2004, the Thirteenth International World Wide Web Conference.

I am reminded of my attendance at The 4th International World Wide Web Conference in Boston in 1995. I was a senior in college writing a thesis on the unequal international spread of the Internet. I went to this conference in the hope of learning what research was being done about the social implications of the Internet. There were very few sessions on the program about anything other than technical aspects. After one of the few sessions where panelists discussed some philosophical questions related to the Internet, I walked up to someone to ask whether they thought the government was doing anything about the Web. His response: “Yes, I think they have a Web page now.” This wasn’t exactly what I was getting at. I had hoped to see some sessions discussing policy implications. But this was still the era when many people thought the medium was somehow going to evolve in a vacuum, in isolation from existing social institutions.

Looking at this year’s program, it is clear that technical questions are still the overwhelming topic of this particular conference, so perhaps it was a mistake to look for other types of content at WWW4. But this is easy to say today, when the conference scene is littered with meetings discussing all aspects of IT. Back in 1995, there weren’t too many meetings you could go to where people would care to discuss any aspects of the Web.

{ 5 comments }

1

Phill 05.18.04 at 4:33 pm

Strange that you would think this was the case. Back in 1992 I started working with Jock Gill, who ran the Clinton-Gore online campaign, at a time when we had about 100 Web users.

There was a lot of early government use of the Web: Al Gore’s Open Meeting, which was run by the MIT AI lab, and the publications server. I can tell you who made the Web page comment (not me); he was being sarcastic.

The idea that the Web evolved in a vacuum was largely a media construction. The media paid little attention to the real Web developers; it was much easier to report press releases from Silicon Valley startups. Besides, best not to mention that a bunch of folk had taken Ithiel Pool’s roadmap and used it.

The basic idea was to disintermediate the media, to provide a direct line of communication between people and news sources. Of course, until last year everyone thought this idea had failed because CNN etc. controlled the main websites.

The mechanism for disintermediation was not to replace, but to provide a control loop. It has taken some time but the Republican echo chamber is pretty much broken at this point. Without the Web the administration would have been successful in their coverups on Iraq for at least four or five more years.

Of course you might not see this type of idea being presented at academic conferences. Back in 1995 the last thing I was going to do was to tell folk what the game plan was. They would probably have worked out a way to stop it.

You have to compare the Web with the ‘Interactive TV’ concept that the Time Warner guys wanted to impose. The only ‘interactive’ part of the program was that you could order stuff online; it was a purely passive consumption exercise.

Forget Internet Time, it was a bogus concept from the start. The Web has yet to really get started. Back in 1995 Alan Kay pointed out that page numbers only appeared on books fifty years after the printing press and it was a reader who had the idea. He asked what the equivalent would be for the Web, one answer is Weblogs.

2

Mary Kay 05.18.04 at 7:02 pm

Interestingly, when I saw the title of your paper I wondered if you were a librarian and were on your way to a library conference. I’m sure there’s a significant comment to be made concerning this, but it’s just out of my reach.

MKK–retired librarian

3

Scott Martens 05.18.04 at 11:02 pm

“What Makes an Expert Searcher?”

That’s easy. Domain knowledge. Knowing what to search for next when your first set of word choices fails. Remember, search engines almost exclusively search for words. They have very little meaningful semantic knowledge. There are technologies that do a better job of that, but as yet they don’t scale well. For the moment, domain-specific lexical and encyclopedic knowledge is the core competence of expert searchers.

4

eszter 05.19.04 at 4:32 am

Phill – Curious you think people could’ve worked out a way to stop government intervention. Or perhaps I misunderstood your comment.

MKK – Good call, most of the participants in this workshop were from LIS departments or from industry working on various search engines (e.g. people from Yahoo, Verizon, Barnes and Noble). My research does look at information retrieval online so I have connections to that community. The difference is mostly the questions I’m interested in with respect to information seeking and some of the details of the methods I use. For example, for me it is important not to restrict my sample to students or library patrons. Rather, I study the online behavior of average Internet users.

SM – It makes sense to assume that domain knowledge would help, although I’ve found in my studies that it doesn’t necessarily. My findings have to do with the ability to switch between strategies quickly. Even the best users sometimes start out with suboptimal strategies, but as long as they can switch quickly, it’s fine. By the way, I should note that when I say “searcher”, I don’t just mean “search engine user”, I mean information seeker online using whatever means to find content (including typing in a URL, using an AOL channel link, etc.).

5

Phill 05.19.04 at 7:04 am

I think you misunderstand the comment; the Web was not a Clinton administration program. What I was referring to was the attempt by the establishment to wreck the Web and turn it into interactive TV.

Netscape was 100% determined to give corporations everything they wanted. That is why you can visit sites today that bombard you with popups and then try to change your home page. Their model was that they give the browser away for free and the users automatically become the property of the people who bought the Netscape Web server.

But no, having Clinton administration support is not enough to stop the establishment from protecting its own narrow interests.

In the US the establishment stopped a lot of government interventions in the 1990s. That’s why you guys have no national health care.

One of the reasons why there were so many Clinton haters in the media was that the administration did not accept the oft repeated claim that the people need journalists to interpret the news for them. The punditocracy knew the Web was a threat from the start.

If the majority of the US had been on the Web in 1992 the establishment could not have stopped the health care bill with a single ad buy that played almost exclusively in Washington DC and peddled transparent lies.

Sure, Murdoch and co. wanted to stop the process; they tried, they failed.

It will take another decade but the power of the press barons is broken. Google news disintermediated them.

It is going to be much harder for people to hold unexamined views in the future. Even in Saudi Arabia most people under the age of 30 know that the oppression of women in their society is not even typical of the Islamic world.

Sometime the dam will even burst in the US.
