[Summarization] Boosting Factual Correctness of Abstractive Summarization with Knowledge Graph

Focus: factual correctness

Proposes two models

Conclusions

Evaluation datasets: CNN/DailyMail and XSum

  1. FC metric: measures factual correctness. A FactCC model fine-tuned on xxx is used as the evaluator; the resulting FC scores are reported in a table (image omitted).
  2. Novel n-grams: a paper by Diab notes that "less abstractive summaries are more factually consistent with the article", so the authors check whether their model "boosts factual correctness simply by copying more portions of the article". To that end, they compute the proportion of n-grams in the summary that do not appear in the article; a higher ratio indicates a more abstractive summary (a short computation sketch follows after this list).
  3. Relation Matching Rate (RMR): measures factual correctness by casting it as the precision of relational triples extracted from the summary. Concretely, a set of triples $R_s = \{(s_i, r_i, o_i)\}$ is extracted from the generated summary, and a set $R_a$ is likewise extracted from the original article. Comparing each $(s_i, r_i, o_i)$ against $R_a$ yields three outcomes: Correct hit ($C$), Wrong hit ($W$), and Miss ($M$) for all other cases. RMR is then defined as:
    $$RMR_1 = 100 \times \frac{C}{C + W}$$
    $$RMR_2 = 100 \times \frac{C}{C + W + M}$$
    To assess the quality of the RMR metric itself, the paper computes the correlation coefficient $\gamma$ between human judgments and RMR, obtaining $\gamma = 0.43$, which indicates an observable relationship between RMR and human evaluation (a short computation sketch follows after this list).
  4. Natural Language Inference (NLI) models: measures factual correctness. A BERT-large model is fine-tuned on the MNLI dataset to predict one of three labels: entailment, neutral, and contradiction. For this task, the article and the generated summary are fed to the NLI model as premise and hypothesis, and the proportion of pairs labeled contradiction is used as the measure: the smaller this proportion, the less the generated summary conflicts with the article (a short scoring sketch follows after this list).
  5. Human Evaluation: three annotators score each summary from 1 to 3 along two dimensions, factual correctness and informativeness; the results are reported in tables (images omitted).
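A minimal sketch of the novel n-gram ratio from item 2, in Python. Whitespace tokenization, lowercasing, and the choice of n are simplifying assumptions; this note does not specify the paper's exact tokenization.

```python
def ngrams(tokens, n):
    """Return the set of n-grams (as tuples) in a token sequence."""
    return {tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)}


def novel_ngram_ratio(article: str, summary: str, n: int = 2) -> float:
    """Fraction of summary n-grams that never appear in the article.

    A higher value means the summary is more abstractive (copies less).
    Whitespace tokenization and lowercasing are simplifying assumptions.
    """
    art_ngrams = ngrams(article.lower().split(), n)
    sum_ngrams = ngrams(summary.lower().split(), n)
    if not sum_ngrams:
        return 0.0
    return len(sum_ngrams - art_ngrams) / len(sum_ngrams)


# Example: a summary copied verbatim from the article has ratio 0.0.
print(novel_ngram_ratio("the cat sat on the mat", "the cat sat on the mat", n=2))
```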
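The RMR scores from item 3 reduce to counting over triple sets. Below is a sketch under the assumption that a wrong hit means the subject and object match some article triple but the relation does not (the note only names the three outcomes); the triple extractor itself (e.g., an OpenIE-style system) is outside this sketch.

```python
from typing import Iterable, Tuple

Triple = Tuple[str, str, str]  # (subject, relation, object)


def rmr(summary_triples: Iterable[Triple], article_triples: Iterable[Triple]):
    """Compute RMR_1 and RMR_2 from relational triples.

    Assumed matching rule:
      - correct hit: the full (s, r, o) triple also appears in the article
      - wrong hit:   (s, o) appears in the article but with a different relation
      - miss:        everything else
    """
    article_set = set(article_triples)
    article_pairs = {(s, o) for s, _, o in article_set}

    correct = wrong = miss = 0
    for s, r, o in summary_triples:
        if (s, r, o) in article_set:
            correct += 1
        elif (s, o) in article_pairs:
            wrong += 1
        else:
            miss += 1

    rmr1 = 100 * correct / (correct + wrong) if (correct + wrong) else 0.0
    rmr2 = 100 * correct / (correct + wrong + miss) if (correct + wrong + miss) else 0.0
    return rmr1, rmr2


# Toy example: one correct hit, one wrong hit, one miss -> RMR_1 = 50, RMR_2 ~ 33.3
summary_t = [("obama", "born_in", "hawaii"),   # correct hit
             ("obama", "lives_in", "hawaii"),  # wrong hit under the assumed rule
             ("obama", "won", "election")]     # miss
article_t = [("obama", "born_in", "hawaii")]
print(rmr(summary_t, article_t))
```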
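For the NLI check in item 4, here is a sketch using Hugging Face transformers. The note describes BERT-large fine-tuned on MNLI; the publicly available roberta-large-mnli checkpoint is substituted here so the snippet runs as-is, and truncating the whole article to the model's maximum input length is a simplification of whatever scoring granularity the paper actually uses.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Substitute MNLI checkpoint; the note's own model is BERT-large fine-tuned on MNLI.
MODEL_NAME = "roberta-large-mnli"

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME)
model.eval()


def contradiction_ratio(pairs):
    """Fraction of (article, summary) pairs the NLI model labels as contradiction.

    The article is used as the premise and the summary as the hypothesis;
    a lower ratio means fewer conflicts between article and summary.
    Long articles are truncated to the model's max length (a simplification).
    """
    id2label = {i: label.lower() for i, label in model.config.id2label.items()}
    contradictions = 0
    for article, summary in pairs:
        inputs = tokenizer(article, summary, truncation=True, return_tensors="pt")
        with torch.no_grad():
            logits = model(**inputs).logits
        if id2label[int(logits.argmax(dim=-1))] == "contradiction":
            contradictions += 1
    return contradictions / len(pairs) if pairs else 0.0
```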

Original: https://blog.csdn.net/SARACH_WONG/article/details/112598320
Author: joshuwang0810
Title: [摘要生成]Boosting Factual Correctness of Abstractive Summarization with Knowledge Graph

