Zero-Shot Relation Extraction Based on Contrast Learning


This paper presents a zero-shot relation extraction approach based on contrastive learning, aiming to improve the practicality of relation extraction. The methodology extracts relational triplets for a knowledge graph and leverages zero-shot learning to handle relations unseen during training. The study compares the proposed approach with previous methods through experiments and concludes that it is effective for real-world scenarios. The work also discusses the cost of manual annotation in relation extraction and motivates zero-shot learning as the more practical formulation.





Presentation Transcript


  1. A Zero-Shot Relation Extraction Approach Based on Contrast Learning Hongyu Zhu, Jun Zeng, Yu Yang, Yingbo Wu School of Big Data & Software Engineering Chongqing University zengjun@cqu.edu.cn The 34th International Conference on Software Engineering & Knowledge Engineering

  2. Content Background Previous Work Methodology Experiments Conclusion

  3. Background A knowledge graph stores facts as relational triplets, each consisting of a head entity, a relation, and a tail entity.
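The triplet structure described above can be sketched as a simple data type (a minimal illustration; the type and field names are my own, not from the paper):

```python
from collections import namedtuple

# A relational triplet links a head entity to a tail entity via a relation.
Triplet = namedtuple("Triplet", ["head", "relation", "tail"])

t = Triplet(head="Turners Falls Airport",
            relation="place served by transport hub",
            tail="Montague")
```

Relation extraction then amounts to recovering such triplets from raw sentences.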

  4. Background Zero-Shot Learning and the RE Task Definition. Given a sentence, the task is to identify the head entity, the tail entity, and the relation between them. Example: "The nearest general aviation airport is Turners Falls Airport in Montague, and the nearest national air service is at Bradley International Airport in Connecticut." Relation & description: place served by transport hub: territorial entity or entities served by this transport hub. The task is well suited to zero-shot learning: manual annotation is time-consuming, and handling unseen relations is more in line with the needs of real-world scenarios. (Zero-shot intuition: told only that a zebra is a horse with black and white stripes, one can recognize a zebra without ever having seen one.)

  5. Previous Work: Data Augmentation. Chia, Yew Ken, et al. "RelationPrompt: Leveraging Prompts to Generate Synthetic Data for Zero-Shot Relation Triplet Extraction." Findings of the Association for Computational Linguistics: ACL 2022.

  6. Previous Work: Textual Entailment. Relation & description: place served by transport hub: territorial entity or entities served by this transport hub (airport, train station, etc.). Each sample is paired with the relation description and judged for entailment with or without it. Sample: "10,000 at Baiyun Airport in Guangzhou were stranded after 55 flights were cancelled." Chen, Qian, et al. "Enhanced LSTM for Natural Language Inference." Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). 2017.

  7. Methodology The overall process of the method: A: concatenate the entity pair to get the positive representation h+; B: negative example generator; C: similarity calculation & contrastive training.

  8. Methodology A: Concatenate the entity pair to get h+. Let h_0 be the [CLS] token vector, let the head entity span tokens i..j, and let the tail entity span tokens k..l. The entity vectors are the averages of their token vectors, h_e1 = (1 / (j - i + 1)) * sum_{t=i..j} h_t and h_e2 = (1 / (l - k + 1)) * sum_{t=k..l} h_t, and the positive representation is h+ = W * tanh(h_0 ⊕ h_e1 ⊕ h_e2), where ⊕ denotes concatenation.
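A minimal NumPy sketch of this entity-concatenation step, assuming inclusive token spans and a learned projection W; all names here are my own illustration, not the paper's code:

```python
import numpy as np

def positive_representation(token_embs, head_span, tail_span, W):
    """Build h+ by concatenating the [CLS] vector with the averaged
    head- and tail-entity token vectors, then projecting through tanh.
    token_embs: (seq_len, d) array; spans are (start, end) inclusive."""
    h0 = token_embs[0]                          # [CLS] token vector
    i, j = head_span
    h_e1 = token_embs[i:j + 1].mean(axis=0)     # average over head entity tokens
    k, l = tail_span
    h_e2 = token_embs[k:l + 1].mean(axis=0)     # average over tail entity tokens
    concat = np.concatenate([h0, h_e1, h_e2])   # h0 ⊕ h_e1 ⊕ h_e2
    return W @ np.tanh(concat)                  # h+ = W · tanh(...)

# Toy example: 6 tokens of dimension 4, projecting 3*4 -> 4.
rng = np.random.default_rng(0)
embs = rng.normal(size=(6, 4))
W = rng.normal(size=(4, 12))
h_pos = positive_representation(embs, head_span=(1, 2), tail_span=(4, 5), W=W)
```

The output dimension is set by W, so h+ can be matched against relation prototypes of the same size.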

  9. Methodology B: Negative Example Generator. Random Negative Samples (RNS): with batch size K, the positive representations of the other samples in the batch, h_i+ for i = 2, ..., K, serve as negatives for h_1+ and enter the loss through sim(h_1+, h_i+).
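The in-batch RNS idea can be sketched as follows (a simple illustration under my own naming; in practice the representations would come from the encoder above):

```python
import numpy as np

def in_batch_negatives(batch_reprs, anchor_idx=0):
    """Random Negative Samples (RNS): within a batch of K positive
    representations, every other sample's positive representation
    serves as a negative for the anchor sample."""
    K = len(batch_reprs)
    return [batch_reprs[i] for i in range(K) if i != anchor_idx]

batch = [np.ones(3) * i for i in range(4)]      # batch size K = 4
negs = in_batch_negatives(batch, anchor_idx=0)  # K - 1 = 3 negatives
```

This costs nothing extra to generate, which is why it supplies the bulk of the negatives.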

  10. Methodology Relational Negative Samples (ReNS). Sample: "In 1954 USAF C-124 transports assisted the French ..., landing at Da Nang's Tourane Airfield." Relation: place served by transport hub: territorial entity or entities served by this transport hub. Masking key words in the sentence increases the model's focus on those words and on other potentially relation-bearing information.

  11. Methodology Entity Negative Samples (ENS). Sample: "Childress Municipal Airport is a commercial airport located within city limits, 4 miles west of central Childress, Texas." The entity pair is the most direct information reflecting the relation, so masking the entity tokens strengthens the model's emphasis on entity pairs.
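A token-level sketch of the masking used for ENS (ReNS works the same way but masks key words instead of entities); the function name, span format, and mask token are my own assumptions:

```python
def entity_negative_sample(tokens, entity_spans, mask_token="[MASK]"):
    """Entity Negative Samples (ENS): mask out the entity-pair tokens
    so the resulting sentence loses its most relation-bearing words."""
    masked = list(tokens)
    for start, end in entity_spans:      # spans are (start, end) inclusive
        for idx in range(start, end + 1):
            masked[idx] = mask_token
    return masked

tokens = "Childress Municipal Airport is a commercial airport west of central Childress".split()
neg = entity_negative_sample(tokens, entity_spans=[(0, 2), (10, 10)])
```

The masked sentence keeps its context words but no longer names the entity pair, making it a hard negative for the original relation.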

  12. Methodology C: Similarity Calculation & Contrastive Training. Sentence-BERT encodes each relation description (e.g., "league in which team or player plays or has played in") into a contrast prototype p_r for that relation. Training pulls the positive sample's representation closer to its relation description's prototype, and pushes the negative samples' representations farther from it.
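At test time, zero-shot prediction with such prototypes reduces to nearest-prototype matching. A sketch under my own naming, with plain vectors standing in for the Sentence-BERT description embeddings:

```python
import numpy as np

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def predict_relation(sample_repr, prototypes):
    """Pick the relation whose description prototype is most similar
    to the sample representation. `prototypes` maps relation name ->
    prototype vector (from the description encoder in the paper)."""
    return max(prototypes, key=lambda r: cosine(sample_repr, prototypes[r]))

protos = {
    "place served by transport hub": np.array([1.0, 0.0, 0.0]),
    "league": np.array([0.0, 1.0, 0.0]),
}
pred = predict_relation(np.array([0.9, 0.1, 0.0]), protos)
```

Because prototypes come from descriptions rather than labeled examples, unseen relations can be added at test time by encoding their descriptions.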

  13. Methodology Similarity Calculation. Similarities to the relation prototype p_r are computed for the positive representation, s+ = inner(h+, p_r), and for each negative, s_i- = inner(h_i-, p_r), i = 1, 2, 3. Each negative contributes a hinge term l_i = max(0, λ - s+ + s_i-), and the loss is the expectation over the negatives: L = E_{i ∈ {1,2,3}}[l_i].
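The garbled formulas on this slide read as a margin-based ranking loss; the sketch below is my reading of it, with lam as the margin λ and the similarities passed in directly:

```python
def margin_loss(s_pos, s_negs, lam):
    """Hinge-style contrastive loss: each negative contributes
    max(0, lam - s_pos + s_neg), averaged over all negatives.
    s_pos / s_negs are similarities to the relation prototype."""
    losses = [max(0.0, lam - s_pos + s_neg) for s_neg in s_negs]
    return sum(losses) / len(losses)

# A negative that is already lam below the positive contributes 0.
L = margin_loss(s_pos=0.8, s_negs=[0.2, 0.5, 0.9], lam=0.5)
```

The loss is zero once every negative's similarity sits at least λ below the positive's, so gradients concentrate on the hard negatives.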

  14. Methodology Contrastive Training Method. Relation labels are connected with their relation descriptions, yielding a second, label-based hinge term built on a softmax over the similarities. The final objective mixes this term with the margin loss of the previous slide: L_total = (1 - α) · L + α · L', with mixing coefficient α.

  15. Experiments Datasets:

  Dataset     #instances   #entities   #relations   avg. len
  FewRel          56,000      72,954           80      24.95
  Wiki-ZSL        94,383      77,623          113      24.85
  Wiki-KB      1,518,444     306,720          354      23.82

  Notice: the relation labels in the training and testing data are disjoint.

  16. Experiments Baselines and Parameter Settings.

  Baselines:
  Model     Notes
  R-BERT    Normal (supervised) RE
  ESIM      Textual entailment
  CIM       Textual entailment
  ZS-BERT   Zero-shot learning

  Parameters (FewRel / Wiki-ZSL):
  Batch size            4
  Hidden layer          768
  Sentence embedding    1024
  α                     0.4
  λ                     7.5
  Similarity function   inner product

  17. Experiments Comparison of Relation Extraction Results: the zero-shot relation extraction task is hard for a normal supervised relation extraction model; the textual entailment formulation is not well suited to this task; our model, ZRCM, outperforms the baselines in almost all settings.

  18. Experiments Ablation over Negative Sampling Methods: Random Negative Samples make up the largest share of the negatives, but the other sampling methods also contribute positively to the final result; Entity Negative Samples fluctuate because of the way those representative samples are generated.

  19. Experiments Parametric experiment on α: the two curves share the same trend, but they reach their best performance at different values of α. (Figure: the effect of α on F1.)

  20. Conclusion We present a novel matching-representation-learning method for zero-shot relation extraction; we design a negative example generator specifically for relation extraction, which captures the most relation-relevant information in sentences; and we carry out extensive experiments that verify the effectiveness of the designed adversarial training.

  21. Thanks for Listening. That's all for my presentation.
