Masking for Multi-Class Classification with Dynamic Number of Classes #435
ramondalmau asked this question in Q&A (Unanswered · 0 replies)
Dear Community,
Firstly, I would like to congratulate you on developing what appears to be an extremely useful tool. Before I begin using it, I would appreciate some guidance on whether it is suitable for my specific use case.
I am currently working on a multi-class classification problem where the number of classes is dynamic. In other words, one observation might have 3 classes, while another might have 10, and so on.
In my tabular dataset, each observation `i` is represented not by a single row but by `N_i` rows, with the columns representing the features. The number of rows (`N_i`) per observation is dynamic.

I have structured my dataset as a tensor with the shape `[B, N_max, F]`, where `B` is the batch size, `N_max` is the maximum number of rows across all observations, and `F` is the number of features. For each observation in the batch, I have a label in the range `[0, N_i - 1]`.
For observations where `N_i < N_max`, I pad the rows from `N_i + 1` to `N_max` with zeros. In the loss function, I mask the logits of the padded rows to `-inf` before applying the softmax, which ensures that the probability of these rows being chosen is effectively zero. Consequently, the loss function must accept `X`, `y`, and the mask.
Given these specifics, I would like to know if this library could be adapted to my problem.
Thank you for your time and assistance.
Best Regards,
Ramon