Model

Text Summarization

| Name | Institution | Author | Code | Paper | Model | Year | Language | Environment | Stars | Forks | Citations |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Extractive Summarization as Text Matching | Fudan University | Ming Zhong | View | Extractive Summarization as Text Matching | View | 2020 | Python | PyTorch | 240 | 51 | 23 |
| Text Summarization with Pretrained Encoders | University of Edinburgh | Yang Liu | View | Text Summarization with Pretrained Encoders | View | 2019 | Python 3.6 | PyTorch 1.1.0 | 825 | 325 | 233 |
| GSum | Carnegie Mellon University | Zi-Yi Dou | View | GSum: A General Framework for Guided Neural Abstractive Summarization | View | 2020 | None | None | 14 | 1 | 1 |
| ProphetNet | University of Science and Technology of China; Microsoft; Microsoft Research Asia; Sichuan University | Weizhen Qi | View | ProphetNet: Predicting Future N-gram for Sequence-to-Sequence Pre-training | View | 2020 | Python | torch==1.3.0, fairseq==v0.9.0 | 229 | 44 | 34 |
| Controlling the Amount of Verbatim Copying in Abstractive Summarization | University of Central Florida; Robert Bosch LLC | Kaiqiang Song | View | Controlling the Amount of Verbatim Copying in Abstractive Summarization | View | 2020 | Python 3.7 | PyTorch v1.3, pyrouge, pytorch-pretrained-bert | 33 | 7 | 7 |
| PEGASUS | Imperial College London; Google Research | Jingqing Zhang | View | PEGASUS: Pre-training with Extracted Gap-sentences for Abstractive Summarization | View | 2020 | Python | Google Cloud or gsutil | 915 | 180 | 93 |
| UNILM | Microsoft Research | Li Dong | View | Unified Language Model Pre-training for Natural Language Understanding and Generation | View | 2019 | Python | UniLM v1 | 1723 | 371 | 305 |
| BART | Facebook AI | Mike Lewis | View | BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension | View | 2019 | Python | PyTorch | 10.9k | 2.8k | 392 |
| ALONE | Tokyo Institute of Technology; Tohoku University | Sho Takase | View | All Word Embeddings from One Embedding | View | 2020 | Python >= 3.6 | PyTorch >= 1.4.0 | 17 | 1 | 1 |
| Learning to Extract Coherent Summary via Deep Reinforcement Learning | Hong Kong University of Science and Technology; University of Massachusetts Medical School | Yuxiang Wu | View | Learning to Extract Coherent Summary via Deep Reinforcement Learning | View | 2018 | None | None | None | None | 58 |
| Extractive Summarization with SWAP-NET | Indian Institute of Science; School of Computing, National University of Singapore | Aishwarya Jadhav | View | Extractive Summarization with SWAP-NET: Sentences and Words from Alternating Pointer Networks | View | 2018 | None | None | None | None | 39 |
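
Several of the abstractive rows above (BART, PEGASUS, ProphetNet, UNILM) have publicly released checkpoints. As a minimal sketch of running one of them, assuming the Hugging Face transformers package and the public facebook/bart-large-cnn fine-tune (an assumption; it is not one of this table's code links):

```python
# Minimal sketch: abstractive summarization with a pretrained BART checkpoint.
# Assumes `pip install transformers torch`; facebook/bart-large-cnn is the
# public CNN/DailyMail fine-tune, not a release from this table's code links.
from transformers import pipeline

summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

document = (
    "BART is a denoising sequence-to-sequence pre-training method. It corrupts "
    "text with an arbitrary noising function and learns to reconstruct the "
    "original text, which transfers well to summarization after fine-tuning."
)

# max_length / min_length bound the generated summary length in tokens.
result = summarizer(document, max_length=60, min_length=10, do_sample=False)
print(result[0]["summary_text"])
```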

Natural Language Inference

| Name | Institution | Author | Code | Paper | Model | Year | Language | Environment | Stars | Forks | Citations |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Transformer-XH | University of Maryland, College Park; Microsoft AI & Research | Chen Zhao | View | Transformer-XH: Multi-Evidence Reasoning with eXtra Hop Attention | View | 2019 | Python | NVIDIA Apex | 52 | 12 | 21 |
| XLNet | Carnegie Mellon University; Google AI Brain Team | Zhilin Yang | View | XLNet: Generalized Autoregressive Pretraining for Language Understanding | View | 2019 | Python 2 | TensorFlow 1.13.1 | 5.5k | 1.1k | 1906 |
| RoBERTa | Paul G. Allen School of Computer Science & Engineering, University of Washington; Facebook AI | Yinhan Liu | View | RoBERTa: A Robustly Optimized BERT Pretraining Approach | View | 2019 | Python >= 3.6 | PyTorch >= 1.5, NVIDIA GPU, NCCL | 11k | 2.8k | 686 |
| BERT | Google AI Language | Jacob Devlin | View | BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding | View | 2019 | Python | TensorFlow | 26.6k | 7.5k | 14563 |
| SAE | JD AI Research | Ming Tu | View | Select, Answer and Explain: Interpretable Multi-Hop Reading Comprehension over Multiple Documents | View | 2020 | Python | PyTorch >= 1.1 | 22 | 3 | 22 |
| DFGN | Shanghai Jiao Tong University; ByteDance AI Lab, China | Lin Qiu | View | Dynamically Fused Graph Network for Multi-hop Reasoning | View | 2019 | Python 3 | PyTorch 0.4.1, boto3 | 158 | 31 | 35 |
| ALBERT | Google Research; Toyota Technological Institute at Chicago | Zhenzhong Lan | View | ALBERT: A Lite BERT for Self-supervised Learning of Language Representations | View | 2019 | Python | PyTorch | 2.6k | 475 | 1093 |
| GPT | OpenAI | Alec Radford | View | Improving Language Understanding by Generative Pre-Training | View | 2018 | Python | ftfy==4.4.3, spacy | 1.6k | 419 | 1844 |
| MPNet | Nanjing University of Science and Technology; Microsoft Research | Kaitao Song | View | MPNet: Masked and Permuted Pre-training for Language Understanding | View | 2020 | Python | pytorch_transformers==1.0.0, transformers, scipy, sklearn | 162 | 18 | 8 |
| Longformer | Allen Institute for Artificial Intelligence | Iz Beltagy | View | Longformer: The Long-Document Transformer | View | 2020 | Python 3.7 | cudatoolkit=10.0 | 1k | 125 | 166 |
| DPR | Facebook AI; University of Washington; Princeton University | Vladimir Karpukhin | View | Dense Passage Retrieval for Open-Domain Question Answering | View | 2020 | Python 3.6+ | PyTorch 1.2.0+ | 477 | 84 | 37 |
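
Many rows in this table are pretrained encoders that are applied to inference tasks by fine-tuning on MNLI-style sentence pairs. A minimal sketch of that usage, assuming the transformers package and the public roberta-large-mnli fine-tune rather than any checkpoint linked above:

```python
# Minimal sketch: sentence-pair NLI with a RoBERTa checkpoint fine-tuned on MNLI.
# Assumes `pip install transformers torch`; roberta-large-mnli is the public
# MNLI fine-tune, not a release from this table's code links.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

name = "roberta-large-mnli"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name)

premise = "A soccer game with multiple males playing."
hypothesis = "Some men are playing a sport."

# Premise and hypothesis are joined with the model's separator tokens.
inputs = tokenizer(premise, hypothesis, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Read the label names from the checkpoint config rather than hardcoding them.
pred = logits.argmax(dim=-1).item()
print(model.config.id2label[pred])  # expected: ENTAILMENT
```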

Image Captioning

| Name | Institution | Author | Code | Paper | Model | Year | Language | Environment | Stars | Forks | Citations |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| VirTex | University of Michigan | Karan Desai | View | VirTex: Learning Visual Representations from Textual Annotations | View | 2021 | Python 3.6+ | PyTorch 1.2.0+ | 330 | 31 | 10 |
| Bottom-up and Top-down Attention | Australian National University; Microsoft Research; University of Adelaide; Macquarie University | Peter Anderson | View | Bottom-Up and Top-Down Attention for Image Captioning and Visual Question Answering | View | 2018 | Python 3.6 | PyTorch 0.4.1 | 115 | 28 | 1564 |
| VC R-CNN | University of Electronic Science and Technology of China; Damo Academy, Alibaba Group; Nanyang Technological University; Singapore Management University | Tan Wang | View | Visual Commonsense R-CNN | View | 2020 | Python 3.7 | PyTorch 1.0 | 255 | 37 | 15 |
| AoA | School of Electronic and Computer Engineering, Peking University; Peng Cheng Laboratory; Macau University of Science and Technology | Lun Huang | View | Attention on Attention for Image Captioning | View | 2019 | Python 3.6 | PyTorch 1.0 | 229 | 50 | 99 |
| Improving IC | Ingenuity Labs Research Institute, Queen's University; Department of Electrical and Computer Engineering, Queen's University; School of Computer Science, Fudan University | Zhan Shi | View | Improving Image Captioning with Better Use of Captions | View | 2020 | Python 2.7.15 | PyTorch 1.0.1 | 16 | 5 | 1 |
| Self-Attention Network | National Laboratory of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences; School of Artificial Intelligence, University of Chinese Academy of Sciences; University of Science and Technology Beijing; Wuhan University | Longteng Guo | View | Normalized and Geometry-Aware Self-Attention Network for Image Captioning | View | 2020 | None | None | None | None | 4 |
| Meshed-Memory Transformer | University of Modena and Reggio Emilia | Marcella Cornia | View | Meshed-Memory Transformer for Image Captioning | View | 2020 | Python 3.6 | PyTorch | 194 | 42 | 39 |
| X-Linear Attention Networks | JD AI Research, Beijing, China | Yingwei Pan | View | X-Linear Attention Networks for Image Captioning | View | 2020 | Python 3 | PyTorch > 1.0 | 157 | 20 | 32 |
| Oscar | Microsoft Corporation; University of Washington | Xiujun Li | View | Oscar: Object-Semantics Aligned Pre-training for Vision-Language Tasks | View | 2020 | Python | PyTorch | 279 | 55 | 28 |
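
Most of these captioners share one core operation: the decoder attends over a set of image region features at each step. The sketch below shows that additive attention in PyTorch; the dimensions (36 regions, 2048-d features, 512-d decoder state) are illustrative assumptions, not values from any repository in the table.

```python
# Minimal sketch of additive attention over region features, the building block
# shared by Bottom-Up/Top-Down, AoA, and related captioners. All dimensions are
# illustrative, not taken from any repo in the table.
import torch
import torch.nn as nn

class RegionAttention(nn.Module):
    def __init__(self, feat_dim=2048, hidden_dim=512, attn_dim=512):
        super().__init__()
        self.proj_v = nn.Linear(feat_dim, attn_dim)    # project region features
        self.proj_h = nn.Linear(hidden_dim, attn_dim)  # project decoder state
        self.score = nn.Linear(attn_dim, 1)            # scalar score per region

    def forward(self, regions, hidden):
        # regions: (batch, num_regions, feat_dim); hidden: (batch, hidden_dim)
        e = self.score(torch.tanh(self.proj_v(regions) + self.proj_h(hidden).unsqueeze(1)))
        alpha = torch.softmax(e, dim=1)           # attention weights per region
        context = (alpha * regions).sum(dim=1)    # weighted sum of region features
        return context, alpha.squeeze(-1)

attn = RegionAttention()
context, weights = attn(torch.randn(2, 36, 2048), torch.randn(2, 512))
print(context.shape, weights.shape)  # torch.Size([2, 2048]) torch.Size([2, 36])
```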

NER

| Name | Institution | Author | Code | Paper | Model | Year | Language | Environment | Stars | Forks | Citations |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| LUKE | Studio Ousia; RIKEN AIP; University of Washington; Nara Institute of Science and Technology; National Institute of Informatics | Ikuya Yamada | View | LUKE: Deep Contextualized Entity Representations with Entity-aware Self-attention | View | 2020 | Python | PyTorch | 149 | 16 | 5 |
| MRC Framework | Department of Computer Science and Technology, Zhejiang University; Shannon.AI | Xiaoya Li | View | A Unified MRC Framework for Named Entity Recognition | View | 2019 | Python | PyTorch | 103 | 26 | 45 |
| NER as Dependency Parsing | Queen Mary University; Google Research | Juntao Yu | View | Named Entity Recognition as Dependency Parsing | View | 2020 | Python 2 | PyTorch | 100 | 18 | 8 |
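
The MRC Framework row recasts NER as reading comprehension: each entity type becomes a natural-language query, and entities are extracted as answer spans. A rough sketch of that idea, using a generic SQuAD-tuned model (deepset/roberta-base-squad2, an assumed stand-in rather than the paper's released checkpoint):

```python
# Rough sketch of the MRC-for-NER idea: phrase each entity type as a question
# and extract an answer span. Uses a generic SQuAD-style QA model as a stand-in;
# the actual paper trains its own span extractor with per-type queries.
from transformers import pipeline

qa = pipeline("question-answering", model="deepset/roberta-base-squad2")

passage = "Barack Obama visited Microsoft headquarters in Redmond last June."
queries = {
    "PER": "Which person is mentioned in the text?",
    "ORG": "Which organization is mentioned in the text?",
    "LOC": "Which location is mentioned in the text?",
}

for label, question in queries.items():
    ans = qa(question=question, context=passage)
    print(label, ans["answer"], round(ans["score"], 3))
```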

Relation Extraction

| Name | Institution | Author | Code | Paper | Model | Year | Language | Environment | Stars | Forks | Citations |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Joint Entity and Relation Extraction | College of Computer Science and Technology, Zhejiang University; StatNLP Research Group, Singapore University of Technology and Design | Jue Wang | View | Two are Better than One: Joint Entity and Relation Extraction with Table-Sequence Encoders | View | 2020 | Python 3 | PyTorch 1.4.0 | 61 | 16 | None |
| Downstream Model Design | AI Application Research Center, Huawei Technologies, Shenzhen, China | Cheng Li | View | Downstream Model Design of Pre-trained Language Model for Relation Extraction Task | View | 2020 | Python | PyTorch | 74 | 14 | 4 |
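
As a point of reference for these rows, one common baseline formulation of relation extraction (not the table-sequence encoder above) classifies an entity pair by wrapping both mentions in marker tokens. A hedged sketch with a hypothetical label set:

```python
# Hedged sketch of a common relation-classification baseline: mark the two
# entities with reserved tokens and run sequence classification. The label set,
# marker tokens, and untrained head here are illustrative only.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

labels = ["no_relation", "works_for", "located_in"]  # hypothetical schema
name = "bert-base-cased"
tokenizer = AutoTokenizer.from_pretrained(name)
tokenizer.add_special_tokens(
    {"additional_special_tokens": ["[E1]", "[/E1]", "[E2]", "[/E2]"]}
)

model = AutoModelForSequenceClassification.from_pretrained(name, num_labels=len(labels))
model.resize_token_embeddings(len(tokenizer))  # make room for the marker tokens

text = "[E1] Jeff Dean [/E1] is an engineer at [E2] Google [/E2]."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
# The classification head is untrained here, so the prediction is random
# until the model is fine-tuned on labeled relation data.
print(labels[logits.argmax(dim=-1).item()])
```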

Event Extraction

| Name | Institution | Author | Code | Paper | Model | Year | Language | Environment | Stars | Forks | Citations |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| One for All | Alt Inc.; Department of Computer and Information Science, University of Oregon | Trung Minh Nguyen | View | One for All: Neural Joint Modeling of Entities and Events | View | 2019 | None | None | None | None | 33 |

Natural Language Inference

| Name | Institution | Author | Code | Paper | Model | Year | Language | Environment | Stars | Forks | Citations |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Self-Explaining Structures | Zhejiang University; Computer Center of Peking University; Peng Cheng Laboratory; Shannon.AI | Zijun Sun | View | Self-Explaining Structures Improve NLP Models | View | 2020 | Python | PyTorch | 11 | 1 | None |
| Conditionally Adaptive Multi-Task Learning | Polytechnique Montreal & Mila; Element AI; CIFAR AI Chair | Jonathan Pilault | View | Conditionally Adaptive Multi-Task Learning: Improving Transfer Learning in NLP Using Fewer Parameters & Less Data | View | 2020 | None | None | None | None | None |
| Exploring the Limits of Transfer Learning | Google, Mountain View, CA 94043, USA | Colin Raffel | View | Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer | View | 2019 | Python | TensorFlow | 3113 | 421 | 575 |
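
The T5 row frames every task, including NLI, as text-to-text. A minimal sketch with the public t5-small checkpoint; the "mnli premise: ... hypothesis: ..." prefix follows the task format described in the paper, though the exact string should be treated as an assumption:

```python
# Minimal sketch of T5's text-to-text interface using the public t5-small
# checkpoint. Assumes `pip install transformers torch sentencepiece`. The task
# prefix follows the format in the T5 paper; treat it as an assumption.
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

text = ("mnli premise: A soccer game with multiple males playing. "
        "hypothesis: Some men are playing a sport.")
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=5)

# T5 answers in text as well, e.g. "entailment".
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```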

Machine Reading Comprehension

| Name | Institution | Author | Code | Paper | Model | Year | Language | Environment | Stars | Forks | Citations |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| SpanBERT | Allen School of Computer Science & Engineering, University of Washington, Seattle, WA; Computer Science Department, Princeton University, Princeton, NJ; Allen Institute of Artificial Intelligence, Seattle; Facebook AI Research, Seattle | Mandar Joshi | View | SpanBERT: Improving Pre-training by Representing and Predicting Spans | View | 2020 | Python | PyTorch | 500 | 95 | 282 |
| Hierarchical Graph Network | Microsoft Dynamics 365 AI Research | Yuwei Fang | View | Hierarchical Graph Network for Multi-hop Question Answering | View | 2019 | Python | PyTorch | 32 | 5 | 29 |
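
Both rows build on extractive span prediction: the model scores each token as a possible answer start and end, and the highest-scoring span is decoded. A minimal sketch of that decoding, using the public distilbert-base-cased-distilled-squad checkpoint as a stand-in for a SpanBERT QA fine-tune:

```python
# Minimal sketch of extractive reading comprehension: the model scores every
# token as a possible answer start/end and the argmax span is decoded.
# distilbert-base-cased-distilled-squad is a public SQuAD checkpoint, used here
# as a stand-in for a SpanBERT QA fine-tune.
import torch
from transformers import AutoModelForQuestionAnswering, AutoTokenizer

name = "distilbert-base-cased-distilled-squad"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForQuestionAnswering.from_pretrained(name)

question = "Who published SpanBERT?"
context = ("SpanBERT was released by researchers from UW, Princeton, AI2, "
           "and Facebook AI Research.")

inputs = tokenizer(question, context, return_tensors="pt")
with torch.no_grad():
    out = model(**inputs)

# Naive decoding: take the argmax start and end independently. A production
# decoder would also constrain end >= start and cap the span length.
start = out.start_logits.argmax().item()
end = out.end_logits.argmax().item()
answer_ids = inputs["input_ids"][0, start : end + 1]
print(tokenizer.decode(answer_ids))
```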