A Zero-Shot Relation Extraction Approach Based on Contrast Learning

Hongyu Zhu, Jun Zeng, Yu Yang, Yingbo Wu
School of Big Data & Software Engineering
Chongqing University
zengjun@cqu.edu.cn

The 34th International Conference on Software Engineering & Knowledge Engineering
 
Content
 
Background
 
Previous Work
 
Methodology
 
Experiments
 
Conclusion
 
Background
 
 
Knowledge Graph
 
Relational Triplet: Head Entity & Relation & Tail Entity
 
Background

Zero-Shot Learning

"Zebra is a horse." "Zebra has black and white stripes."

RE Task Definition

Sample: The nearest general aviation airport is Turners Falls Airport in Montague, and the nearest national air service is at Bradley International Airport in Connecticut.
Relation & Description: Place served by transport hub: territorial entity or entities served by this transport hub

Suitable for Zero-Shot Learning
- Manual annotation is time-consuming
- More in line with the needs of real-world scenarios
 
Previous Work

Data Augmentation

Chia, Yew Ken, et al. "RelationPrompt: Leveraging Prompts to Generate Synthetic Data for Zero-Shot Relation Triplet Extraction." Findings of the Association for Computational Linguistics: ACL 2022. 2022.
 
Previous Work

Textual Entailment

Relation & Description: Place served by transport hub: territorial entity or entities served by this transport hub (airport, train station, etc.)
Sample (with/without the relation): 10,000 at Baiyun Airport in Guangzhou were stranded after 55 flights were cancelled.

Chen, Qian, et al. "Enhanced LSTM for Natural Language Inference." Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). 2017.
 
Methodology

The overall process of the experiment

A: Concatenate the entity pair to get the positive representation
B: Negative Example Generator
C: Similarity Calculation & Contrast Training Method
 
Methodology

A: Concatenate the entity pair to get the positive representation h+

h+ = W(tanh([h_0 ; h_e1 ; h_e2]))
h_e1 = (1 / (j - i + 1)) * Σ_{t=i..j} h_t   (average over the head entity tokens h_i..h_j; h_e2 is computed analogously for the tail)

[CLS] token h_0, head entity tokens, tail entity tokens
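A minimal numpy sketch of this fusion step, concatenating the [CLS] vector with the averaged head and tail entity tokens before a projection (function and variable names are illustrative, not from the paper):

```python
import numpy as np

def fuse_representation(hidden, head_span, tail_span, W):
    """Concatenate [CLS] with averaged head/tail entity tokens, then project.

    hidden: (seq_len, d) token embeddings; spans are inclusive (start, end)
    token indices; W: (d_out, 3*d) projection. All names are illustrative.
    """
    h0 = hidden[0]                                              # [CLS] token
    h_e1 = hidden[head_span[0]:head_span[1] + 1].mean(axis=0)   # head entity average
    h_e2 = hidden[tail_span[0]:tail_span[1] + 1].mean(axis=0)   # tail entity average
    return np.tanh(W @ np.concatenate([h0, h_e1, h_e2]))

rng = np.random.default_rng(0)
hidden = rng.normal(size=(12, 8))       # 12 tokens, hidden size 8
W = rng.normal(size=(8, 24))
h_pos = fuse_representation(hidden, (2, 3), (6, 8), W)
print(h_pos.shape)  # (8,)
```

The tanh keeps every coordinate of the fused vector in (-1, 1), which is convenient for the similarity scores used later.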
 
Methodology

B: Negative Example Generator

Random Negative Samples (RNS): with batch size K, the positive representations of the other K - 1 samples in the batch serve as random negatives for each sample.
 
Methodology

Relational Negative Samples (ReNS)

Sample: In 1954 USAF C-124 transports assisted the French ..., landing at Da Nang's Tourane Airfield.
Relation: Place served by transport hub: territorial entity or entities served by this transport hub

Masking the relation key words increases focus on key words and other potential information.
 
Methodology

Entity Negative Samples (ENS)

Sample: Childress Municipal Airport is a commercial airport located within city limits, 4 miles west of central Childress, Texas.

The entity pair is the most direct information reflecting the relationship; masking it strengthens the emphasis on entity pairs.
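The entity-masking idea behind ENS can be sketched as follows (token spans and the mask token string are illustrative assumptions, not taken from the paper):

```python
def mask_entity_pair(tokens, head_span, tail_span, mask_token="[MASK]"):
    """Build an Entity Negative Sample by masking the head/tail entity tokens.

    tokens: list of token strings; spans are inclusive (start, end) indices.
    The span positions and mask token are illustrative.
    """
    out = list(tokens)
    for start, end in (head_span, tail_span):
        for i in range(start, end + 1):
            out[i] = mask_token
    return out

print(mask_entity_pair(
    ["Childress", "Municipal", "Airport", "is", "in", "Texas"],
    (0, 2), (5, 5)))
# ['[MASK]', '[MASK]', '[MASK]', 'is', 'in', '[MASK]']
```

A ReNS variant would mask relation key words instead of the entity spans, using the same replacement mechanism.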
 
Methodology

C: Similarity Calculation & Contrast Training Method

A contrast prototype is obtained for each relation by encoding its description with Sentence-BERT. Training pulls the positive sample representation closer to the relation description and pushes the negative sample representations farther away.
 
Methodology

Similarity Calculation

s+ = score(h+, r_d)
s_i^- = score(h_i^-, r_d),  i = 1, 2, 3
l(s+, s_i^-) = max(0, γ - s+ + s_i^-)
L = Σ_{i=1,2,3} l(s+, s_i^-)
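A minimal sketch of this margin-based contrast objective, using inner-product similarity (names and the margin value here are illustrative):

```python
import numpy as np

def contrast_loss(h_pos, h_negs, r_desc, gamma=1.0):
    """Hinge-style contrast loss: the positive score score(h+, r_d) should
    exceed every negative score score(h-, r_d) by at least the margin gamma.
    Similarity is the inner product; gamma=1.0 is illustrative."""
    s_pos = h_pos @ r_desc            # scalar positive score
    s_negs = h_negs @ r_desc          # one score per negative sample
    return float(np.sum(np.maximum(0.0, gamma - s_pos + s_negs)))

r = np.array([1.0, 0.0])              # toy relation-description embedding
well_separated = contrast_loss(np.array([2.0, 0.0]),
                               np.array([[0.5, 0.0], [0.0, 1.0]]), r)
print(well_separated)  # 0.0
```

When the positive sample already beats every negative by the margin, the loss vanishes; otherwise each violating negative contributes linearly to the gradient.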
 
Methodology

Contrast Training Method

Relation labels are combined with their relation descriptions, and the final objective mixes the contrastive loss L with the label-based loss L' using a weight λ: L_all = (1 - λ)L + λL'.
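Assuming λ simply interpolates the two scalar losses, the mixing step is just (the values below are illustrative):

```python
def total_loss(l_contrast, l_aux, lam=0.5):
    """Interpolate the contrastive loss with the auxiliary label-based loss:
    L_all = (1 - lam) * L + lam * L'. The default lam is illustrative."""
    return (1.0 - lam) * l_contrast + lam * l_aux

print(total_loss(2.0, 1.0, lam=0.4))  # 1.6
```

With lam near 0 training is driven almost entirely by the contrastive term; raising lam shifts weight toward the label/description term.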
 
Experiments

Datasets

Dataset     #instance   #entities   #relations   avg. len
FewRel         56,000      72,954           80      24.95
Wiki-ZSL       94,383      77,623          113      24.85
Wiki-KB     1,518,444     306,720          354      23.82

Notice: the relation labels within training and testing data are disjoint.
 
Experiments

Parameter Settings
Batch size: 4; hidden layer size: 768; sentence embedding size: 1024; similarity function: inner product; the margin and loss-weight hyper-parameters are set to 7.5 and 0.4.

Baselines
R-BERT (normal supervised RE), ESIM (textual entailment), CIM (textual entailment), ZS-BERT (zero-shot learning).
 
Experiments

Comparison of Relation Extraction Task Results:
- The Zero-Shot Relation Extraction task is hard for a normal supervised relation extraction model.
- The textual entailment formulation is not suitable for this task.
- Our model ZRCM outperforms the baselines in almost all situations.
 
Experiments

Ablation - Negative Sampling Methods
- Random Negative Samples make up the largest share of the negative samples; the other kinds also have a positive effect on the final result.
- Entity Negative Samples fluctuate due to the way their representations are generated.
 
Experiments

Parametric experiment: the effect of λ on F1

The two curves have the same trend, but they achieve their best performance at different places.
 
Conclusion

- We present a novel matching-representation-learning method for Zero-Shot Relation Extraction.
- We design a negative example generator specifically for relation extraction, which fully captures the most relation-related information in sentences.
- We carry out extensive experiments that verify the effectiveness of the designed contrastive training.
 
Thanks for Listening

That's all for my presentation

The 34th International Conference on Software Engineering & Knowledge Engineering