Introduction to Information Retrieval
CS276 Information Retrieval and Web Search
Pandu Nayak and Prabhakar Raghavan
Lecture 15: Web search basics

Brief (non-technical) history
- Early keyword-based engines ca. 1995-1997: Altavista, Excite, Infoseek, Inktomi, Lycos
- Paid search ranking: Goto (morphed into Overture.com, then Yahoo!)
  - Your search ranking depended on how much you paid
  - Auction for keywords: "casino" was expensive!

Brief (non-technical) history
- 1998+: Link-based ranking pioneered by Google
  - Blew away all early engines save Inktomi
  - Great user experience in search of a business model
- Meanwhile, Goto/Overture's annual revenues were nearing $1 billion
- Result: Google added paid search "ads" to the side, independent of search results
  - Yahoo followed suit, acquiring Overture (for paid placement) and Inktomi (for search)
- 2005+: Google gains search share, dominating in Europe and very strong in North America
- 2009: Yahoo! and Microsoft propose a combined paid search offering

Algorithmic results vs. paid search ads
[Screenshot omitted: a results page showing algorithmic results alongside paid search ads]

Web search basics (Sec. 19.4.1)
[Diagram omitted: a web spider crawls the Web; an indexer builds the indexes (with separate ad indexes); the search engine answers user queries from those indexes]

User needs (Sec. 19.4.1) [Brod02, RL04]
- Informational – want to learn about something (~40% / 65%)
- Navigational – want to go to that page (~25% / 15%)
- Transactional – want to do something, web-mediated (~35% / 20%)
  - Access a service
  - Downloads
  - Shop
- Gray areas
  - Find a good hub
  - Exploratory search: "see what's there"
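The spider–indexer–search pipeline sketched in the architecture diagram can be illustrated with a toy inverted index. This is a minimal sketch, not the slides' material: the corpus, tokenizer (whitespace split), and conjunctive query semantics are all invented for illustration; a real engine adds link analysis, ranking, ad indexes, and much more.

```python
from collections import defaultdict

# A "crawled" corpus: doc_id -> page text (stand-in for the web spider's output).
pages = {
    1: "introduction to information retrieval",
    2: "web search basics and web crawling",
    3: "paid search ads and ranking",
}

# Indexer: build an inverted index mapping term -> set of doc_ids.
index = defaultdict(set)
for doc_id, text in pages.items():
    for term in text.split():
        index[term].add(doc_id)

# Search: a conjunctive (AND) query over the postings lists.
def search(query):
    postings = [index[t] for t in query.split()]
    return sorted(set.intersection(*postings)) if postings else []

print(search("web search"))  # -> [2]
```

The intersection of postings lists is the classic Boolean-retrieval step; ranking the surviving documents is where the later link-based methods (e.g. Google's) come in.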
How far do people look for results?
[Chart omitted. Source: iprospect.com WhitePaper_2006_SearchEngineUserBehavior.pdf]

Users' empirical evaluation of results
- Quality of pages varies widely
  - Relevance is not enough
  - Other desirable qualities (non-IR!!)
    - Content: trustworthy, diverse, non-duplicated, well maintained
    - Web readability: display correctly & fast
    - No annoyances: pop-ups, etc.
- Precision vs. recall
  - On the web, recall seldom matters
  - What matters: precision at 1? Precision above the fold?
  - Comprehensiveness – must be able to deal with obscure queries
    - Recall matters when the number of matches is very small
- User perceptions may be unscientific, but are significant over a large aggregate

Users' empirical evaluation of engines
- Relevance and validity of results
- UI – simple, no clutter, error tolerant
- Trust – results are objective
- Coverage of topics for polysemic queries
- Pre/Post-process tools provided
  - Mitigate user errors (auto spell check, search assist, …)
  - Explicit: search within results, more like this, refine ...
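The slide's point that precision at the top of the ranking matters far more than recall on the web can be made concrete with precision@k. A minimal sketch, with the ranking and relevance judgments invented purely for illustration:

```python
def precision_at_k(ranked_doc_ids, relevant, k):
    """Fraction of the top-k ranked results that are judged relevant."""
    top_k = ranked_doc_ids[:k]
    return sum(1 for d in top_k if d in relevant) / k

ranking = ["d3", "d1", "d7", "d2"]   # engine's ranked output (hypothetical)
relevant = {"d1", "d2"}              # judged relevant (hypothetical)

print(precision_at_k(ranking, relevant, 1))  # -> 0.0 (the top hit is irrelevant)
print(precision_at_k(ranking, relevant, 4))  # -> 0.5
```

Precision@1 of 0.0 here illustrates the slide's concern: a user who only reads the first hit, or only the results above the fold, experiences this engine as a failure regardless of how good its recall is.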