Session-based Recommendation with Local Invariance

 
Tianwen Chen, Raymond Chi-Wing Wong
What is Session-based Recommendation (SBR)?
In many online services, users' actions are sequential and grouped into sessions.
Session: a sequence of actions that occur in a certain timeframe.
Objective: to recommend the next action given previous actions in the current session.
Applications: video streaming, news feeds, e-commerce.
Problem of Previous Methods
Most of the previous methods predict the next action by following a strict order. However:
- Detailed ordering in some sub-sessions may not matter. (We call this property local invariance.)
- High-level ordering in the complete session is still important.
A good session-based recommender system (SBRS) should pay different attention to the ordering information at different levels of granularity.
 
Contributions
- We are the first to introduce the local invariance property in SBR.
- We propose a model that explicitly considers the local invariance property.
- Experiments on two public benchmark datasets demonstrate the superiority of the proposed method over state-of-the-art methods.
Proposed Method
The model processes a session through four components:
- Embedding Layer: maps the items in the session to item embeddings.
- Local Encoder: extracts local context information that is invariant to subtle position changes in sub-sessions.
- Global Encoder: extracts high-level sequential information that discards insignificant detailed ordering.
- Predictor: outputs the probability distribution of the next item, combining the high-level sequential information with a short-term memory that represents the user's recent interests.
Local Encoder
Objective: extract group features that are invariant to position changes inside each group.
Intuition: groups are formed by similar adjacent items.
Each group feature is a weighted sum of item embeddings, combining two weights:
- Attention weight: measures the similarity between two items, computed with an attention mechanism.
- Gaussian weight: measures the (temporal) closeness between two items, using the pdf of a zero-mean Gaussian over the index offset.
The Gaussian variance adapts to the content: when adjacent items are more similar, groups are larger and the variance is larger; when adjacent items are less similar, groups are smaller and the variance is smaller.
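To make the weighting scheme concrete, here is a minimal pure-Python sketch of computing group features from attention weights and Gaussian weights. The function name, the use of raw similarity scores in place of the paper's learned attention score, and the fixed per-position variances are illustrative assumptions, not the authors' implementation.

```python
import math

def group_features(emb, sims, sigma):
    """For each position i, build a group feature as a weighted sum of all
    item embeddings.  The weight of item j combines an attention weight
    (softmax over content similarities to item i) with a Gaussian weight
    (closeness of index j to index i under a per-position variance)."""
    n = len(emb)
    feats = []
    for i in range(n):
        # Attention weights: softmax over similarity scores to item i.
        exps = [math.exp(sims[i][j]) for j in range(n)]
        total = sum(exps)
        att = [e / total for e in exps]
        # Gaussian weights: unnormalized pdf of N(0, sigma_i^2) at offset j-i,
        # so temporally distant items contribute less.
        gau = [math.exp(-((j - i) ** 2) / (2 * sigma[i] ** 2)) for j in range(n)]
        # Combined weight, then weighted sum of embeddings.
        w = [a * g for a, g in zip(att, gau)]
        feats.append([sum(w[j] * emb[j][d] for j in range(n))
                      for d in range(len(emb[0]))])
    return feats
```

With similar adjacent items and a moderate variance, the group feature at a position is dominated by its nearby, similar neighbors, which is the behavior the slide describes.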
Global Encoder
Objective: extract high-level sequential information from the group features.
The group features are fed into a GRU; the high-level representation is either the last output state (straightforward method) or, more commonly, an attention-weighted combination of all output states.
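The attention-weighted readout over GRU output states can be sketched as follows. Using the dot product with the last state as the attention score is a simplifying assumption standing in for the learned score in the slides; the function name is likewise illustrative.

```python
import math

def global_readout(states):
    """Attention-weighted combination of GRU output states: score each
    state against the last state, softmax the scores over time steps,
    and return the weighted sum of the states."""
    last = states[-1]
    scores = [sum(a * b for a, b in zip(h, last)) for h in states]
    m = max(scores)                                # stabilize the softmax
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    beta = [e / total for e in exps]               # attention over time steps
    dim = len(states[0])
    return [sum(beta[t] * states[t][d] for t in range(len(states)))
            for d in range(dim)]
```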
Predictor
Objective: generate a probability distribution of the next item.
The session representation combines the high-level sequential information with the user's recent interests, is transformed by fully-connected layers to the dimensionality of the item embeddings, and is scored against the embedding of every item with a softmax.
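The predictor's final scoring step can be sketched in a few lines. This assumes the session representation has already been projected to the embedding dimensionality; the function name is an illustrative choice.

```python
import math

def next_item_probs(session_repr, item_embs):
    """Score the session representation against every item embedding
    (dot product) and softmax the scores into a probability
    distribution over candidate next items."""
    scores = [sum(a * b for a, b in zip(session_repr, e)) for e in item_embs]
    m = max(scores)                        # stabilize the softmax
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]
```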
 
Experiments
Datasets: two commonly used benchmark datasets.
- YooChoose (RecSys Challenge 2015): users' click streams on an e-commerce website within 6 months.
- Diginetica (CIKM Cup 2016): users' online purchase records in one week.
Evaluation metrics (following previous work):
- Hit@20: the fraction of test samples in which the desired next item is ranked among the top 20 positions.
- MRR@20: the mean reciprocal rank of the desired next items; the reciprocal rank is set to 0 if the rank > 20.
Compared Methods
- Conventional methods: Item-KNN (WWW '01), BPR-MF (UAI '09), FPMC (WWW '10).
- Neural network-based methods: GRU4Rec (ICLR '16), NARM (CIKM '17), STAMP (KDD '18), RepeatNet (AAAI '19), SR-GNN (AAAI '19).
Among the compared methods, only the proposed method considers the local invariance property.
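The two evaluation metrics are simple to compute from per-sample rankings; a minimal sketch (function name assumed for illustration):

```python
def hit_and_mrr_at_k(ranked_lists, targets, k=20):
    """Hit@k: fraction of test samples whose target item appears in the
    top-k ranking.  MRR@k: mean reciprocal rank of the target item,
    with the reciprocal rank set to 0 when the rank exceeds k."""
    hits, rr_sum = 0, 0.0
    for ranking, target in zip(ranked_lists, targets):
        topk = ranking[:k]
        if target in topk:
            hits += 1
            rr_sum += 1.0 / (topk.index(target) + 1)  # rank is 1-based
    n = len(targets)
    return hits / n, rr_sum / n
```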
Results
- Neural network-based methods perform much better than conventional methods.
- STAMP and SR-GNN do not encode the sequential information strictly but still achieve competitive performance.
- The proposed method LINet outperforms or matches the state-of-the-art methods.
 
Ablation Experiments
To study the effects of the local encoder (LE), global encoder (GE), attention weights (AW), and Gaussian weights (GW).
 
Capability of Considering Local Invariance
Approach: compare the performance on similar pairs of sessions.
A pair of sessions is similar if:
- They have the same multiset of items.
- They have the same last item.
- There are some local ordering differences.
Consider the last item as the next item of all previous items. Local ordering differences do not affect the occurrence of the next item, so a pair of similar sessions has the local invariance property.
In the basic case, the matched items are exactly the same. Generalization: the matched items are highly similar.
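The basic (exact-match) conditions above translate directly into a small predicate; the function name is assumed for illustration, and the generalized highly-similar case is not covered here.

```python
from collections import Counter

def is_similar_pair(s1, s2):
    """Basic similar-pair check: same multiset of items, same last item,
    and at least one local ordering difference (i.e., the sessions are
    not identical)."""
    return (Counter(s1) == Counter(s2)   # same multiset of items
            and s1[-1] == s2[-1]         # same last item
            and s1 != s2)                # some ordering difference
```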
Capability of Considering Local Invariance
We measure the percentage of similar pairs predicted accurately* by each model.
(*: a pair is considered "predicted accurately" if both sessions in the pair are predicted accurately.)
The proposed model outperforms all other methods on similar pairs, i.e., it has a higher capability of considering local invariance.
 
Conclusion
- We introduce the local invariance property in SBR.
- We propose a model that explicitly considers the local invariance property.
- With the combination of the local and global encoders, our model captures high-level sequential information and automatically discards insignificant local ordering information.
- Experiments on two public benchmark datasets demonstrate the superiority of the proposed method over state-of-the-art methods.
 
Q & A
Capability of Considering Local Invariance (Backup)
Formal and generalized definition. Two sessions are similar if:
1. They have the same length.
2. They have the same last item.
3. There exists a perfect matching between the items before the last item of the two sessions, such that the similarity between matched items is at least a threshold and the difference between the indices of matched items is at most a bound. The similarity measure penalizes large index differences with a logarithmic term inspired by discounted cumulative gain.
The matching defines a way to re-order one session into a session similar to the other through swaps in local regions; since such swaps do not change the next item, both sessions in a similar pair have the local invariance property.
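Condition 3 can be checked directly for short sessions by brute force over matchings. This sketch takes a precomputed similarity matrix between the first n-1 items of the two sessions; the function name and the exhaustive-search strategy are illustrative assumptions (the slides do not specify how the matching is found).

```python
from itertools import permutations

def exists_matching(sim, eps, delta):
    """Brute-force check that a perfect matching exists between the
    items of two sessions such that every matched pair has similarity
    >= eps and index difference <= delta.  sim[i][j] is the similarity
    between item i of one session and item j of the other."""
    n = len(sim)
    for perm in permutations(range(n)):
        if all(sim[i][perm[i]] >= eps and abs(i - perm[i]) <= delta
               for i in range(n)):
            return True
    return False
```

Exhaustive search is factorial in the session length, which is acceptable here only because sessions are short; a bipartite-matching algorithm would scale better.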
