SciSpace (formerly Typeset)

Xipeng Qiu

Researcher at Fudan University

Publications -  278
Citations -  11673

Xipeng Qiu is an academic researcher at Fudan University. His research spans topics including computer science and automatic summarization. He has an h-index of 43 and has co-authored 227 publications receiving 7,402 citations. Previous affiliations of Xipeng Qiu include New York University and Carnegie Mellon University.

Papers
Journal ArticleDOI

Pre-trained Models for Natural Language Processing: A Survey

TL;DR: The emergence of pre-trained models (PTMs) has brought natural language processing (NLP) into a new era; this survey provides a comprehensive review of PTMs for NLP.
Book ChapterDOI

How to Fine-Tune BERT for Text Classification?

TL;DR: A general solution for BERT fine-tuning is provided and new state-of-the-art results on eight widely-studied text classification datasets are obtained.
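One fine-tuning strategy commonly associated with this line of work is layer-wise (discriminative) learning-rate decay: lower encoder layers, which hold more general features, are updated with smaller learning rates than the top layers. A minimal sketch of the schedule, with illustrative `base_lr` and `decay` values (the function name and defaults are assumptions, not the paper's exact settings):

```python
def layerwise_lrs(num_layers, base_lr=2e-5, decay=0.95):
    """Return one learning rate per layer, index 0 = bottom layer.

    The top layer (index num_layers - 1) uses base_lr; each layer
    below it is scaled down by the decay factor once more.
    """
    return [base_lr * decay ** (num_layers - 1 - k) for k in range(num_layers)]

# For a 12-layer encoder, the bottom layer trains most conservatively.
lrs = layerwise_lrs(num_layers=12)
```

In practice each rate would be attached to the matching layer's parameter group in the optimizer, so that fine-tuning perturbs the general lower layers less than the task-specific upper ones.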
Proceedings Article

Recurrent neural network for text classification with multi-task learning

TL;DR: This paper proposes three mechanisms for sharing information between tasks, modeling text with both task-specific and shared layers so that a task's performance can improve with the help of related tasks in text classification.
Proceedings ArticleDOI

Utilizing BERT for Aspect-Based Sentiment Analysis via Constructing Auxiliary Sentence

TL;DR: This paper constructs an auxiliary sentence from the aspect, converting aspect-based sentiment analysis (ABSA) into a sentence-pair classification task analogous to question answering (QA) and natural language inference (NLI), and fine-tunes a pre-trained BERT model on the resulting pairs.
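The conversion itself is a templating step: pair the original review with a short auxiliary sentence built from the aspect, then feed the pair to a sentence-pair classifier. A minimal sketch, where the template wording and the helper name are illustrative rather than the paper's verbatim choices:

```python
def build_sentence_pair(review, aspect, style="question"):
    """Return (auxiliary_sentence, review) for a sentence-pair classifier.

    style="question" mimics a QA-style auxiliary sentence;
    style="pseudo" mimics a pseudo-sentence consisting of the aspect alone.
    """
    if style == "question":
        aux = f"what do you think of the {aspect} ?"
    elif style == "pseudo":
        aux = aspect
    else:
        raise ValueError(f"unknown style: {style!r}")
    return aux, review

# Each (auxiliary sentence, review) pair becomes one classification input.
pair = build_sentence_pair("The battery lasts all day.", "battery")
```

The sentence-pair framing lets the model reuse BERT's next-sentence-style pretraining rather than treating the aspect as a bare label.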
Proceedings ArticleDOI

Adversarial Multi-task Learning for Text Classification

TL;DR: This paper proposes an adversarial multi-task learning framework that keeps the shared and private latent feature spaces from interfering with each other, and shows that the shared knowledge learned by the model can be regarded as off-the-shelf knowledge and easily transferred to new tasks.
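The adversarial part can be summarized at the level of a single parameter update: the shared encoder follows the task-loss gradient while the gradient that helps a task discriminator tell tasks apart is reversed, pushing shared features toward task invariance. A scalar-level sketch under that framing (all names, rates, and the scalar setting are illustrative assumptions, not the paper's implementation):

```python
def adversarial_update(theta_shared, grad_task, grad_discriminator,
                       lr=0.01, lam=0.05):
    """Gradient-reversal-style update on one shared-encoder parameter.

    theta_shared       : current shared parameter (a scalar here)
    grad_task          : gradient of the task loss w.r.t. theta_shared
    grad_discriminator : gradient of the task-discriminator loss
                         w.r.t. theta_shared (its sign is flipped)
    lam                : weight of the reversed adversarial term
    """
    return theta_shared - lr * (grad_task - lam * grad_discriminator)

# The task gradient is descended; the discriminator gradient is ascended.
theta = adversarial_update(0.5, grad_task=0.2, grad_discriminator=1.0)
```

Flipping the discriminator gradient makes the shared features harder to attribute to any single task, which is what keeps task-specific signal out of the shared space.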