Towards Extreme Multi-label Text Classification Through Group-wise Label Ranking
Jie Xiong @ SEKE23, 2023-07-08
1. Introduction
Extreme Multi-label text Classification (XMC) aims to tag a document with relevant labels from an extremely large label set.
[Slide figure: an example document for classification, "Albert Einstein was a German-born theoretical physicist.", with its labels split into a positive label group and a negative label group.]
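The setup above can be made concrete with a toy sketch. The label names here are illustrative only, not from the paper; a real XMC label set has hundreds of thousands of labels, which is what makes the positives so sparse.

```python
# Toy XMC example: tag a document with labels drawn from a (normally huge) label set.
label_set = ["physicist", "germany", "relativity", "chemistry", "sports"]
doc = "Albert Einstein was a German-born theoretical physicist."

positive = {"physicist", "germany", "relativity"}   # relevant labels (positive group)
negative = set(label_set) - positive                # irrelevant labels (negative group)

# Multi-hot target vector: 1 for positive labels, 0 for negative ones.
y = [1 if lab in positive else 0 for lab in label_set]
```

In a realistic setting `y` would contain a handful of ones among hundreds of thousands of zeros, which is the label imbalance the next slide refers to.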
1. Introduction
Problems in existing XMC methods:
- Ranking learning between labels is missing: we want every score in the positive label group to be greater than every score in the negative label group.
- The sparse positive labels lead to serious label imbalance in XMC, which may hinder model convergence.
2. Proposed Method
We propose a novel Group-wise label Ranking (GRank) loss to address the rank-missing and label imbalance problems of existing XMC models.
We propose a new XMC model with the GRank loss, named X-GRank, and train it end-to-end to eliminate the cascaded errors of existing methods.
2.1 GRank: Group-wise Ranking loss
1. The original intention: all positive labels should have higher scores than all negative labels.
2. We make an equivalent transformation: the minimum positive score should exceed the maximum negative score.
3. Use log-sum-exp (LSE) to approximate the max function and obtain the Group-wise Ranking (GRank) loss.
4. The GRank loss has mutually constrained gradients and thus avoids the label imbalance problem.
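The three steps above can be sketched in code. This is a minimal, assumed pairwise LSE form, `log(1 + sum over negatives i, positives j of exp(s_i - s_j))`, which follows from approximating max/min with log-sum-exp; the paper's exact GRank formulation may differ in details.

```python
import math

def grank_loss(scores, is_positive):
    """Sketch of a group-wise ranking loss via log-sum-exp.

    Wanting min(positive scores) > max(negative scores) and smoothing
    both extrema with LSE gives:
        loss = log(1 + sum_{i in N} sum_{j in P} exp(s_i - s_j))
    Every positive score interacts with every negative score, so the
    gradients on the two groups mutually constrain each other.
    """
    pos = [s for s, p in zip(scores, is_positive) if p]
    neg = [s for s, p in zip(scores, is_positive) if not p]
    pairwise = sum(math.exp(si - sj) for si in neg for sj in pos)
    return math.log1p(pairwise)
```

When every positive score already dominates every negative score the pairwise sum is tiny and the loss is near zero; when a negative outranks a positive, that pair's exponential term dominates the loss.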
2.2 The application of the GRank loss
- The loss of the retrieving stage
- The loss of the ranking stage
- The total loss in model training
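For a two-stage model, the total training loss is commonly a weighted sum of the stage losses. The combination below, including the weight `lam`, is a hypothetical sketch; the slides do not specify how the two terms are combined.

```python
def total_loss(retrieve_loss, rank_loss, lam=1.0):
    """Hypothetical end-to-end training objective: combine the
    retrieving-stage and ranking-stage losses with a weight `lam`.
    Training both stages jointly is what avoids cascaded errors."""
    return retrieve_loss + lam * rank_loss
```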