Workshop on Recommender Systems:
Algorithms and Evaluation

University of California, Berkeley
August 19, 1999

Part of the 22nd International Conference on Research and Development in Information Retrieval

I'm very pleased to report that the workshop was a big success. Between 40 and 50 people attended the keynote and various sessions, at times spilling out into the hallway. While size can be as much a curse as a blessing to a workshop, the discussion remained insightful and provocative. The organizers would like to publicly thank all who attended and participated.

A summary of the workshop will appear in SIGIR Forum; I hope everything is accurate. Please send any comments you may have to Ian Soboroff.

You can read the call for papers here. It was sent out on May 5th, and revised with the new date on May 18th.


The world of recommender systems has undergone quite an expansion since the Communications of the ACM published its special issue on the topic two years ago. Projects such as GroupLens have gone on to become successful commercial ventures, and recommendation systems are de rigueur for Internet commerce. The basic technology has moved very quickly from the research world into popular applications. However, many problems remain to be solved.

Several methods for recommender systems have emerged, including approaches that base recommendations on correlations among groups of users and methods that learn about individual users. However, the core challenges of cold-start, sparse ratings, and scalability continue to dominate the field.
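To make the first family of approaches concrete, the following is a minimal sketch of correlation-based (user-user) collaborative filtering: predict a user's rating for an unseen item as a Pearson-weighted average of other users' mean-offset ratings. The ratings matrix and all names here are invented for illustration and do not correspond to any particular system discussed at the workshop.

```python
import math

# Hypothetical ratings matrix: user -> {item: rating on a 1-5 scale}.
ratings = {
    "alice": {"m1": 5, "m2": 3, "m3": 4},
    "bob":   {"m1": 4, "m2": 2, "m3": 5, "m4": 4},
    "carol": {"m1": 1, "m2": 5, "m4": 2},
}

def pearson(u, v):
    """Pearson correlation over the items both users have rated."""
    common = sorted(set(ratings[u]) & set(ratings[v]))
    if len(common) < 2:
        return 0.0
    ru = [ratings[u][i] for i in common]
    rv = [ratings[v][i] for i in common]
    mu, mv = sum(ru) / len(ru), sum(rv) / len(rv)
    num = sum((a - mu) * (b - mv) for a, b in zip(ru, rv))
    den = math.sqrt(sum((a - mu) ** 2 for a in ru) *
                    sum((b - mv) ** 2 for b in rv))
    return num / den if den else 0.0

def predict(user, item):
    """Predict a rating as the user's mean plus a correlation-weighted
    average of the neighbors' deviations from their own means."""
    mu = sum(ratings[user].values()) / len(ratings[user])
    num = den = 0.0
    for v in ratings:
        if v == user or item not in ratings[v]:
            continue
        w = pearson(user, v)
        mv = sum(ratings[v].values()) / len(ratings[v])
        num += w * (ratings[v][item] - mv)
        den += abs(w)
    return mu + num / den if den else mu

print(round(predict("alice", "m4"), 2))
```

Note how the negative correlation with "carol" lets her low rating of "m4" *raise* the prediction for "alice"; exploiting disagreement as well as agreement is one of the properties that distinguishes this family from simple popularity averages.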

Progress in recommender systems also depends on better evaluation methodologies. User studies are difficult to conduct and to generalize from, and issues of presentation and relevance make traditional IR evaluation measures not entirely suited to the domain. Furthermore, test collections such as DEC SRC's EachMovie data set are becoming standard tools, but the need for larger collections in different domains is great.
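One measure that the collaborative-filtering literature of this period commonly used in place of traditional IR measures is mean absolute error (MAE) over held-out ratings. The sketch below assumes invented predictions and held-out ratings purely for illustration.

```python
def mean_absolute_error(predicted, actual):
    """Average |prediction - true rating| over a held-out test set."""
    assert predicted and len(predicted) == len(actual)
    return sum(abs(p - a) for p, a in zip(predicted, actual)) / len(predicted)

held_out = [4, 2, 5, 3]             # true ratings withheld from training
predictions = [3.5, 2.5, 4.0, 3.0]  # the recommender's predictions
print(mean_absolute_error(predictions, held_out))  # 0.5
```

MAE rewards accurate rating prediction everywhere on the scale, whereas ranking-oriented IR measures care mainly about what appears at the top of a result list; which is appropriate depends on how recommendations are presented to the user.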

Thus, the theme of the workshop is the move to the next phase of recommender systems research: from the basic "how do we do it" to "how can we do it better" and "how do we know that it's better".

The objective of the workshop is to bring together researchers and practitioners involved in developing, testing, and fielding recommender systems. The workshop will provide a forum for discussing current practice and recent research results, and develop a roadmap for future recommender systems research.

Preliminary Workshop Schedule


9:15-10:15 Keynote: Recommender System Research: Perspectives and Thoughts
Joseph Konstan (University of Minnesota)

10:15-10:40 Break

10:45-12:00 Session I: Recommendation Algorithms

Memory-Based Weighted-Majority Prediction for Recommender Systems
Joaquin Delgado, Naohiro Ishii (Nagoya Institute of Tech, Japan)

Jester 2.0: A New Linear-Time Collaborative Filtering Algorithm Applied to Jokes
Dhruv Gupta, Mark Digiovanni, Hiro Narita, Ken Goldberg (UC Berkeley)

Clustering Items for Collaborative Filtering
Mark O'Conner, Jon Herlocker (Univ of Minnesota)

12:00-13:00 Lunch


13:05-14:20 Session II: Combining Content and Collaborative Recommendation

Combining Content-Based and Collaborative Filters in an Online Newspaper
Mark Claypool, Anuja Gokhale, Tim Miranda, Pavel Murnikov, Dmitry Netes, Matthew Sartin (WPI)

Bayesian Mixed-Effects Models for Recommender Systems
Michelle Condliff (Boeing), David D. Lewis (AT&T Research), David Madigan (Univ of Washington), Christian Posse (Talaria, Inc.)

Content-Based Book Recommending Using Learning for Text Categorization
Raymond Mooney, Loriene Roy (Univ of Texas)

14:20-14:35 Break

14:35-15:50 Session III: Models for Users and Information Need

Recommenders for Expertise Management
Mark Ackerman, David McDonald, Wayne Lutters, Jack Muramatsu (UC Irvine)

Recommending Web Documents Based on User Preferences
Eric Glover, Steve Lawrence, Michael Gordon, William Birmingham, C. Lee Giles (NEC Research)

  • Ali Kamal (Tivo, Inc.)
  • Joseph Konstan (University of Minnesota)
  • Clifford Lynch (Coalition for Networked Information)
  • Raymond Mooney (University of Texas)

15:50-16:00 Wrap-up

Important Dates

June 9, 1999 Paper submissions due
July 9, 1999 Notification of paper acceptance
July 23, 1999 Camera-ready copy with revisions due
August 19, 1999 Recommender Systems Workshop


Workshop Organizers

Ian Soboroff
Charles Nicholas
Michael J. Pazzani

Program Committee

  • Chumki Basu (Telcordia/Rutgers)
  • Natalie Glance (Xerox Research Centre Europe)
  • Haym Hirsh (Rutgers)
  • Jon Herlocker (Univ of Minnesota)
  • David Hull (Xerox Research Centre Europe)
  • Joseph Konstan (Univ of Minnesota)
  • Loren Terveen (AT&T Labs)
Last modified 12 August 1999. Please send comments or corrections to Ian Soboroff.