Caseformer: Pre-training for Legal Case Retrieval Based on Inter-Case Distinctions
Abstract
Legal case retrieval aims to help legal workers find cases relevant to the
case at hand, which is important for guaranteeing fairness and justice in
legal judgments. While recent advances in neural retrieval methods
have significantly improved the performance of open-domain retrieval tasks
(e.g., Web search), their advantages have not been observed in legal case
retrieval due to their thirst for annotated data. As annotating large-scale
training data in legal domains is prohibitive due to the need for domain
expertise, traditional search techniques based on lexical matching such as
TF-IDF, BM25, and Query Likelihood are still prevalent in legal case retrieval
systems. While previous studies have designed several pre-training methods for
IR models in open-domain tasks, these methods are usually suboptimal in legal
case retrieval because they cannot understand and capture the key knowledge and
data structures in the legal corpus. To this end, we propose a novel
pre-training framework named Caseformer that enables the pre-trained models to
learn legal knowledge and domain-specific relevance information in legal case
retrieval without any human-labeled data. Through three unsupervised learning
tasks, Caseformer is able to capture the special language, document structure,
and relevance patterns of legal case documents, making it a strong backbone for
downstream legal case retrieval tasks. Experimental results show that our model
has achieved state-of-the-art performance in both zero-shot and full-data
fine-tuning settings. Also, experiments on both Chinese and English legal
datasets demonstrate that the effectiveness of Caseformer is
language-independent in legal case retrieval.
| Reference Key | zhang2023caseformer |
|---|---|
| Authors | Weihang Su; Qingyao Ai; Yueyue Wu; Yixiao Ma; Haitao Li; Yiqun Liu; Zhijing Wu; Min Zhang |
| Journal | arXiv |
| Year | 2023 |
| DOI | not found |
| URL | |
| Keywords | |
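Given the reference key and the bibliographic fields listed on this page, a minimal BibTeX entry might look like the following sketch (the field layout is an assumption; the arXiv identifier and DOI are not listed here, so they are omitted):

```bibtex
@article{zhang2023caseformer,
  title   = {Caseformer: Pre-training for Legal Case Retrieval Based on Inter-Case Distinctions},
  author  = {Su, Weihang and Ai, Qingyao and Wu, Yueyue and Ma, Yixiao and
             Li, Haitao and Liu, Yiqun and Wu, Zhijing and Zhang, Min},
  journal = {arXiv},
  year    = {2023}
}
```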