Neural Search Talks — Shallow Pooling for Sparse Labels

In this episode of Neural Search Talks, Andrew Yates and Sergi Castella discuss the paper "Shallow Pooling for Sparse Labels". The paper puts the spotlight on the popular IR benchmark MS MARCO and investigates whether modern neural retrieval models return documents that are even more relevant than the passages originally annotated as relevant. The results have important implications and raise the question of to what degree this benchmark remains an informative north star to follow.

We compare and contrast different methods for information retrieval, including learning to rank, dense retrieval, approximate nearest neighbor search, and learned sparse representations. We also explore how search engines can be evaluated with different types of judgments, such as sparse and graded annotations, and discuss the difficulty of selecting a single best answer when several candidates are equally good or when the questions are complex. Finally, we talk about the future of benchmarking and evaluation in information retrieval, including the role of qualitative evaluation of machine learning models.

Here are more details and resources:



00:00 — Introduction.

01:52 — Overview and motivation of the paper.

04:00 — Origins of MS MARCO.

07:30 — Modern approaches to IR: keyword-based, dense retrieval, rerankers and learned sparse representations.

13:40 — What is "better than perfect" performance on MS MARCO?

17:15 — Results and discussion: how often are neural rankers preferred over original annotations on MS MARCO? How should we interpret these results?

26:55 — The authors' proposal to "fix" MS MARCO: shallow pooling.

32:40 — How does TREC Deep Learning compare?

38:30 — How do models compare after re-annotating MS MARCO passages?

45:00 — Figure 5 audio description.

47:00 — Discussion on models' performance after re-annotations.

51:50 — Exciting directions in the space of IR benchmarking.

1:06:20 — Outro.


