Information Filtering LBSC 878 Douglas W. Oard and Dagobert Soergel Week 10: April 12, 1999


Page 1: Information Filtering LBSC 878 Douglas W. Oard and Dagobert Soergel Week 10: April 12, 1999


Page 2:

Agenda

• Information filtering

• Profile learning

• Social filtering

Page 3:

Information Access Problems

                          Information Need
                          Stable         Different Each Time
Collection
  Stable                                 Retrieval
  Different
  Each Time               Filtering

Page 4:

Information Filtering

• An abstract problem in which:
  – The information need is stable
    • Characterized by a “profile”
  – A stream of documents is arriving
    • Each must either be presented to the user or not

• Introduced by Luhn in 1958
  – As “Selective Dissemination of Information”

• Named “Filtering” by Denning in 1975

Page 5:

A Simple Filtering Strategy

• Use any information retrieval system
  – Boolean, vector space, probabilistic, …

• Have the user specify a “standing query”
  – This will be the profile

• Limit the standing query by date
  – Each use, show what arrived since the last use
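A minimal sketch of this strategy in Python. The `Document` shape and the crude keyword-overlap matcher are illustrative assumptions; any retrieval model from the first bullet could stand in for the matching step.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Document:
    text: str
    arrived: date

def run_standing_query(profile_terms, documents, last_use):
    """Apply the standing query, limited by date: show only documents
    that arrived since the last use and match a profile term."""
    matches = []
    for doc in documents:
        if doc.arrived <= last_use:
            continue  # date limit: skip anything seen before
        words = set(doc.text.lower().split())
        if words & set(profile_terms):  # crude OR of profile terms
            matches.append(doc)
    return matches

docs = [
    Document("filtering news arrived today", date(1999, 4, 12)),
    Document("old retrieval story", date(1999, 3, 1)),
]
new_hits = run_standing_query({"filtering"}, docs, date(1999, 4, 1))
```

Each use of the system would then advance `last_use`, so the same document is never shown twice.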

Page 6:

What’s Wrong With That?

• Unnecessary indexing overhead
  – Indexing only speeds up retrospective searches

• Every profile is treated separately
  – The same work might be done repeatedly

• Forming effective queries by hand is hard
  – The computer might be able to help

• It is OK for text, but what about audio, video, …
  – Are words the only possible basis for filtering?

Page 7:

The Fast Data Finder

• Fast Boolean filtering using custom hardware
  – Up to 10,000 documents per second

• Words pass through a pipeline architecture
  – Each element looks for one word

[Diagram: a pipeline of single-word matching elements for “good”, “great”, “party”, and “aid”, combined with OR, AND, and NOT gates]
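In software, the pipeline can be mimicked with one predicate per word and a Boolean combiner. The particular expression below is only one plausible reading of the slide’s diagram, not a documented Fast Data Finder profile.

```python
def word_present(word):
    """One pipeline element: watches the token stream for a single word."""
    return lambda tokens: word in tokens

def boolean_filter(expr, text):
    """Evaluate a Boolean profile over a document's tokens.
    expr is a nested tuple such as ("AND", ("OR", a, b), ("NOT", c))."""
    tokens = set(text.lower().split())
    def ev(node):
        if callable(node):          # a leaf: one word-matching element
            return node(tokens)
        op, *args = node
        if op == "OR":
            return any(ev(a) for a in args)
        if op == "AND":
            return all(ev(a) for a in args)
        if op == "NOT":
            return not ev(args[0])
        raise ValueError(op)
    return ev(expr)

# One plausible reading of the diagram: (good OR great) AND party, NOT aid
profile = ("AND",
           ("OR", word_present("good"), word_present("great")),
           word_present("party"),
           ("NOT", word_present("aid")))
```

The hardware wins by running every element in parallel as the text streams past; the software version trades that speed for flexibility.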

Page 8:

Profile Indexing in SIFT

• Build an inverted file of profiles
  – Postings are profiles that contain each term

• RAM can hold 5,000 profiles/megabyte
  – And several machines can be run in parallel

• Both Boolean and vector space matching
  – User-selected threshold for each ranked profile

• Hand-tuned on a web page using today’s news

• Delivered by email
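The inverted-profile idea can be sketched as below. Term-overlap counting stands in for SIFT’s Boolean and vector-space scoring, and a single global threshold stands in for the per-profile, user-selected one.

```python
from collections import defaultdict

def index_profiles(profiles):
    """Inverted file of profiles: each term's postings list
    names the profiles that contain that term."""
    postings = defaultdict(set)
    for pid, terms in profiles.items():
        for term in terms:
            postings[term].add(pid)
    return postings

def match_document(postings, doc_text, threshold=1):
    """Score each profile by how many of its terms the document
    contains; deliver to every profile at or above the threshold.
    Only profiles sharing a term with the document are ever touched."""
    scores = defaultdict(int)
    for term in set(doc_text.lower().split()):
        for pid in postings.get(term, ()):
            scores[pid] += 1
    return {pid for pid, s in scores.items() if s >= threshold}
```

Inverting the index over profiles rather than documents is what lets one pass over each arriving document serve thousands of standing profiles at once.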

Page 9:

SIFT Limitations

• Centralization
  – Requires some form of cost sharing

• Privacy
  – Centrally registered profiles
  – Each profile associated with an email address

• Usability
  – Manually specified profiles
  – Thresholds based on an obscure retrieval status value

Page 10:

Profile Learning

• Automatic learning for stable profiles
  – Based on observations of user behavior

• Explicit ratings are easy to use
  – Binary ratings are simply relevance feedback

• Unobtrusive observations are easier to get
  – Reading time, save/delete, …
  – But we only have a few clues on how to use them

Page 11:

Relevance Feedback

[Diagram: the relevance feedback loop — initial profile terms become a profile vector; new documents become document vectors; a similarity computation over the vectors produces a rank order; the user selects and examines documents and assigns ratings; the rated vectors update the user model, which produces revised profile vector(s) for the next round]
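The “update user model” step is classically a Rocchio update, the hill-climbing technique named on a later slide. A minimal sketch, with conventional default weights (the α, β, γ values are not from the slides):

```python
def rocchio_update(profile, rel_docs, nonrel_docs,
                   alpha=1.0, beta=0.75, gamma=0.15):
    """p' = alpha*p + beta*mean(relevant) - gamma*mean(nonrelevant).
    Vectors are dicts of term -> weight; negative weights are dropped."""
    terms = set(profile) | {t for d in rel_docs + nonrel_docs for t in d}
    updated = {}
    for t in terms:
        r = (sum(d.get(t, 0.0) for d in rel_docs) / len(rel_docs)
             if rel_docs else 0.0)
        n = (sum(d.get(t, 0.0) for d in nonrel_docs) / len(nonrel_docs)
             if nonrel_docs else 0.0)
        w = alpha * profile.get(t, 0.0) + beta * r - gamma * n
        if w > 0:
            updated[t] = w  # keep only positively weighted terms
    return updated
```

Each pass around the loop nudges the profile vector toward the rated-relevant documents and away from the rated-nonrelevant ones.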

Page 12:

Machine Learning

• Given a set of vectors with associated values
  – e.g., document vector + relevance judgments

• Predict the values associated with new vectors
  – i.e., learn a mapping from vectors to values

• All learning systems share two problems
  – They need some basis for making predictions
    • This is called an “inductive bias”
  – They must balance adaptation with generalization

Page 13:

Machine Learning Techniques

• Hill climbing (Rocchio Feedback)

• Instance-based learning

• Rule induction

• Regression

• Neural networks

• Genetic algorithms

• Statistical classification

Page 14:

Instance-Based Learning

• Remember examples of desired documents
  – Dissimilar examples capture range of possibilities

• Compare new documents to each example
  – Vector similarity, probabilistic, …

• Base rank order on the most similar example

• May be useful for broader interests
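A minimal sketch of the nearest-example idea, using cosine similarity between term-weight dicts (one of the comparison options the slide lists):

```python
import math

def cosine(a, b):
    """Cosine similarity between two sparse term-weight vectors."""
    num = sum(a.get(t, 0.0) * w for t, w in b.items())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return num / (na * nb) if na and nb else 0.0

def score_by_nearest_example(examples, doc):
    """Base the rank order on the MOST similar remembered example,
    so dissimilar examples each cover part of a broad interest."""
    return max(cosine(doc, ex) for ex in examples)
```

Taking the maximum (rather than, say, the mean) is what lets one profile serve several unrelated interests at once.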

Page 15:

Rule Induction

• Automatically derived Boolean profiles
  – (Hopefully) effective and easily explained

• Specificity from the “perfect query”
  – AND terms in a document, OR the documents

• Generality from a bias favoring short profiles
  – e.g., penalize rules with more Boolean operators
  – Balanced by rewards for precision, recall, …
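The “perfect query” construction is mechanical enough to sketch: AND the terms inside each relevant document, then OR those conjunctions. (Rule induction would then shorten this maximally specific starting point under the bias described above.)

```python
def perfect_query(relevant_docs):
    """Maximally specific Boolean profile: one conjunction (a term set)
    per relevant document, implicitly ORed together."""
    return [set(doc.lower().split()) for doc in relevant_docs]

def matches(query, doc_text):
    """A document matches if it satisfies any conjunction,
    i.e. contains every term of some remembered document."""
    tokens = set(doc_text.lower().split())
    return any(conj <= tokens for conj in query)
```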

Page 16:

Regression

• Fit a simple function to the observations
  – Straight line, s-curve, …

• Logistic regression matches binary relevance
  – S-curves fit better than straight lines
  – Same calculation as a 3-layer neural network
    • But with a different training technique
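A one-feature sketch of fitting the s-curve to binary relevance judgments by gradient descent (the feature here could be any retrieval status value; the learning rate and step count are arbitrary illustrative choices):

```python
import math

def sigmoid(z):
    """The s-curve: maps any score to a probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(xs, ys, steps=2000, lr=0.5):
    """One-feature logistic regression by gradient descent:
    P(relevant) = sigmoid(w*x + b), fit to binary labels ys."""
    w = b = 0.0
    for _ in range(steps):
        for x, y in zip(xs, ys):
            p = sigmoid(w * x + b)
            w += lr * (y - p) * x  # gradient of the log-likelihood
            b += lr * (y - p)
    return w, b
```

The fitted curve then turns a raw matching score into an estimated probability of relevance, which is easier to threshold than an obscure retrieval status value.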

Page 17:

Neural Networks

• Design inspired by the human brain
  – Neurons, connected by synapses

• Organized into layers for simplicity
  – Synapse strengths are learned from experience
    • Seek to minimize predict-observe differences

• Can be fast with special purpose hardware
  – Biological neurons are far slower than computers

• Work best with application-specific structures
  – In biology this is achieved through evolution

Page 18:

Genetic Algorithms

• Design inspired by evolutionary biology
  – Parents exchange parts of genes (crossover)
  – Random variations in each generation (mutation)
  – Fittest genes dominate the population (selection)

• Genes are vectors
  – Term weights, source weights, module weights, …

• Selection is based on a fitness function
  – Tested repeatedly, so it must be informative and cheap
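The three operators can be sketched in a tiny genetic algorithm over term-weight vectors. The population size, generation count, and the example fitness function are all illustrative assumptions:

```python
import random

def evolve(fitness, dim, pop_size=20, generations=40, seed=0):
    """Tiny GA over weight vectors in [0,1]^dim:
    selection keeps the fitter half, crossover splices two parents,
    mutation perturbs one gene of each child."""
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, dim)          # one-point crossover
            child = a[:cut] + b[cut:]
            i = rng.randrange(dim)               # mutation
            child[i] = min(1.0, max(0.0, child[i] + rng.gauss(0, 0.1)))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

# Hypothetical fitness: reward weight on the first two terms only.
# A real fitness function would score filtering performance, which is
# exactly why it must be cheap: it is called on every gene, every generation.
best = evolve(lambda g: g[0] + g[1] - sum(g[2:]), dim=4)
```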

Page 19:

Statistical Classification

• Represent relevant docs as one random vector
  – And nonrelevant docs as another

• Build a statistical model for each
  – e.g., a normal distribution

• Find the surface separating the distributions
  – e.g., a hyperplane

• Rank documents by distance from that surface
  – Possibly distorted by the shape of the distributions
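A minimal sketch, under the simplest possible model: each class is a spherical normal distribution, so the separating surface is a hyperplane normal to the difference of the class means, and documents are ranked by signed distance from it.

```python
def mean_vector(vectors):
    """Mean of a set of sparse term-weight vectors (dicts)."""
    n = len(vectors)
    keys = {k for v in vectors for k in v}
    return {k: sum(v.get(k, 0.0) for v in vectors) / n for k in keys}

def rank_by_separating_surface(rel, nonrel, docs):
    """With equal spherical normals, the separating hyperplane's normal
    is (mean_rel - mean_nonrel); rank documents by projection onto it."""
    mr, mn = mean_vector(rel), mean_vector(nonrel)
    w = {k: mr.get(k, 0.0) - mn.get(k, 0.0) for k in set(mr) | set(mn)}
    def score(d):
        return sum(w.get(k, 0.0) * v for k, v in d.items())
    return sorted(docs, key=score, reverse=True)
```

Richer covariance models would bend the surface (and distort the distance ranking), which is the caveat in the last bullet.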

Page 20:

Training Strategies

• Overtraining can hurt performance
  – Performance on training data rises and plateaus
  – Performance on new data rises, then falls

• One strategy is to learn less each time
  – But it is hard to guess the right learning rate

• Splitting the training set is a useful alternative
  – Part for training, part for finding “new data” peak

Page 21:

Social Filtering

• Exploit ratings from other users as features
  – Like personal recommendations, peer review, …

• Reaches beyond topicality to:
  – Accuracy, coherence, depth, novelty, style, …

• Applies equally well to other modalities
  – Movies, recorded music, …

• Sometimes called “collaborative” filtering

Page 22:

Social Filtering Example

[Figure: a matrix of example ratings (on a 1–9 scale) by Amy, Norina, Paul, and Skip for movies grouped under Comedy, Drama, Action, and Mystery]
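The prediction step behind such a matrix can be sketched as follows: weight each other user's rating of an unseen item by how well that user has agreed with the target user on co-rated items. The names come from the slide; the rating values and the similarity formula are illustrative assumptions.

```python
def predict_rating(ratings, target_user, item):
    """Predict target_user's rating of item from other users' ratings,
    weighted by agreement on co-rated items (mean absolute difference
    turned into a similarity in (0, 1])."""
    target = ratings[target_user]
    num = den = 0.0
    for user, r in ratings.items():
        if user == target_user or item not in r:
            continue
        common = set(target) & set(r) - {item}
        if not common:
            continue
        diff = sum(abs(target[i] - r[i]) for i in common) / len(common)
        sim = 1.0 / (1.0 + diff)   # perfect agreement -> 1.0
        num += sim * r[item]
        den += sim
    return num / den if den else None  # None: no informative raters

ratings = {
    "Amy":    {"m1": 9, "m2": 8},
    "Norina": {"m1": 9, "m2": 8, "m3": 7},
    "Paul":   {"m1": 1, "m2": 2, "m3": 2},
}
prediction = predict_rating(ratings, "Amy", "m3")
```

Norina agrees with Amy on every co-rated movie, so her rating of m3 dominates the prediction; Paul, who disagrees sharply, contributes little.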

Page 23:

Some Things We (Sort of) Know

• Treating each genre separately can be useful
  – Separate predictions for separate tastes

• Negative information might be useful
  – “I hate everything my parents like”

• People like to know who provided ratings

• Popularity provides a useful fallback

• People don’t like to provide ratings
  – Few experiments have achieved sufficient scale

Page 24:

Implicit Feedback

• Observe user behavior to infer a set of ratings
  – Examine (reading time, scrolling behavior, …)
  – Retain (bookmark, save, save & annotate, print, …)
  – Refer to (reply, forward, include link, cut & paste, …)

• Some measurements are directly useful
  – e.g., use reading time to predict reading time

• Others require some inference
  – Should you treat cut & paste as an endorsement?
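One way to picture the inference step is a heuristic that maps observed behaviors onto a 0–1 rating. The observation categories follow the slide, but the weights are pure assumptions; calibrating them is exactly the open problem the slide raises.

```python
def infer_rating(obs):
    """Map observed behavior to a 0..1 rating.
    ASSUMED weights: examination up to 0.4, retention 0.3, reference 0.3."""
    score = 0.0
    # Examine: cap reading time at one minute's worth of credit
    score += min(obs.get("reading_seconds", 0) / 60.0, 1.0) * 0.4
    # Retain: saving is treated as a moderate endorsement
    score += 0.3 if obs.get("saved") else 0.0
    # Refer to: forwarding is treated as a moderate endorsement
    score += 0.3 if obs.get("forwarded") else 0.0
    return round(score, 2)
```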

Page 25:

Constructing a Rating Matrix

[Figure: separate rater-by-document matrices of observations — examination, retention, reference, and explicit rating — for raters Jack, Sam, Tim, Ken, Marty, Joe, Jill, and Mary over documents 1–4 are combined into a single rater-by-document rating matrix of estimated values between 0 and 1]

Page 26:

Some Research Questions

• How readers can pay raters efficiently

• How to use implicit feedback
  – No large scale integrated experiments yet

• How to protect privacy
  – Pseudonyms can be defeated fairly easily

• How to merge social and content-based filtering
  – Social filtering never finds anything new

• How to evaluate social filtering effectiveness

Page 27:

Combining Content and Ratings

• Two sources of features
  – Terms and raters

• Each is used differently
  – Find terms in documents
  – Find informative raters

[Figure: a document-by-term matrix (terms such as nuclear, fallout, siberia, contaminated, interesting, complicated, information, retrieval) shown alongside a document-by-rater rating matrix (raters Jack, Sam, Tim, Ken, Marty, Joe, Jill, and Mary, with values between 0 and 1 for documents 1–4)]

Page 28:

Variations on Social Filtering

• Citation indexing
  – A special case of refer-to implicit feedback

• Search for people based on their behavior
  – Discover potential collaborators
  – Focus advertising on interested consumers

• Collaborative exploration
  – Explore a large, fairly stable collection
  – Discoveries migrate to people with similar interests