# Summarization

## Extractive Summarization

### Models

Models implemented in FastNLP:

1. Get To The Point: Summarization with Pointer-Generator Networks (See et al. 2017)
2. Searching for Effective Neural Extractive Summarization: What Works and What's Next (Zhong et al. 2019)
3. Fine-tune BERT for Extractive Summarization (Liu et al. 2019)
### Dataset

Summarization datasets provided here:

- CNN/DailyMail
- Newsroom
- The New York Times Annotated Corpus
  - NYT
  - NYT50
- DUC
  - 2002 Task 4
  - 2003/2004 Task 1
- arXiv
- PubMed
Preprocessed versions of the public datasets (CNN/DailyMail, Newsroom, arXiv, PubMed) can be downloaded from:

- [Baidu Netdisk](https://pan.baidu.com/s/11qWnDjK9lb33mFZ9vuYlzA) (access code: h1px)
- [Google Drive](https://drive.google.com/file/d/1uzeSdcLk5ilHaUTeJRNrf-_j59CQGe6r/view?usp=drivesdk)

For the non-public datasets (NYT, NYT50, DUC), the preprocessing scripts are located in the `data` folder.
### Dataset_loader

- SummarizationLoader: reads a preprocessed dataset in jsonl format and returns the following fields:
  - text: the article body
  - summary: the reference summary
  - domain: optional, the site the article was published on
  - tag: optional, content tags for the article
  - labels: extractive sentence labels
- BertSumLoader: reads a dataset formatted as input for BertSum (Liu 2019) and returns the following fields:
  - article: vocabulary IDs of each article truncated to 512 tokens
  - segment_id: the 0/1 segment each sentence belongs to
  - cls_id: positions of the '[CLS]' tokens in the input
  - label: extractive sentence labels
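To make the BertSumLoader fields concrete, here is a minimal sketch (not the fastNLP loader itself; the `make_bertsum_input` helper, the whitespace tokenizer, and the toy vocabulary are hypothetical stand-ins) of how `article`, `segment_id`, and `cls_id` can be built from a list of sentences:

```python
# Sketch of BertSum-style input construction: token IDs truncated to 512,
# alternating 0/1 segment ids per sentence, and the positions of every
# '[CLS]' token. Tokenizer and vocabulary here are illustrative stand-ins.

MAX_LEN = 512
CLS, SEP = "[CLS]", "[SEP]"

def make_bertsum_input(sentences, token_to_id):
    """Flatten sentences into article IDs, segment ids, and cls positions."""
    tokens, segment_id = [], []
    for i, sent in enumerate(sentences):
        sent_tokens = [CLS] + sent.split() + [SEP]
        tokens.extend(sent_tokens)
        # sentences alternate between segment 0 and segment 1
        segment_id.extend([i % 2] * len(sent_tokens))
    tokens, segment_id = tokens[:MAX_LEN], segment_id[:MAX_LEN]
    cls_id = [j for j, t in enumerate(tokens) if t == CLS]
    article = [token_to_id.get(t, token_to_id["[UNK]"]) for t in tokens]
    return article, segment_id, cls_id
```

In BertSum, the model scores the encoder representation at each `cls_id` position to decide whether the corresponding sentence is extracted, which is why the loader exposes those positions as a separate field.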
### Performance and Hyperparameters

| Model                           | ROUGE-1 | ROUGE-2 | ROUGE-L | Paper                                       |
| :-----------------------------: | :-----: | :-----: | :-----: | :-----------------------------------------: |
| LEAD 3                          | 40.11   | 17.64   | 36.32   | our data pre-process                        |
| ORACLE                          | 55.24   | 31.14   | 50.96   | our data pre-process                        |
| LSTM + Sequence Labeling        | 40.72   | 18.27   | 36.98   |                                             |
| Transformer + Sequence Labeling | 40.86   | 18.38   | 37.18   |                                             |
| LSTM + Pointer Network          | -       | -       | -       |                                             |
| Transformer + Pointer Network   | -       | -       | -       |                                             |
| BERTSUM                         | 42.71   | 19.76   | 39.03   | Fine-tune BERT for Extractive Summarization |
| LSTM + PN + BERT + RL           | -       | -       | -       |                                             |
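The LEAD 3 row in the table is the standard extractive baseline that simply keeps the first three sentences of the article as the summary. A minimal sketch (the `lead_k` name is ours, and the period-based splitter is a naive stand-in for a real sentence tokenizer):

```python
def lead_k(article: str, k: int = 3) -> str:
    """Extractive LEAD-k baseline: return the first k sentences verbatim."""
    # Naive period-based splitter for illustration only; real pipelines
    # use a proper sentence tokenizer.
    sentences = [s.strip() for s in article.split(".") if s.strip()]
    return ". ".join(sentences[:k]) + "."
```

Despite its simplicity, LEAD 3 is competitive on news data (40.11 ROUGE-1 above) because news articles front-load their most important content, which is why it is reported alongside the learned models.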
## Abstractive Summarization

Still in progress...