LSTM With Varying K-hot Encoded Vector
Follow-up question from: LSTM with keras. In that example a one-hot encoded vector is used to perform classification with an LSTM. How could this LSTM be used to perform classification with k-hot encoded vectors?
Solution 1:
This is a multi-label classification task (each example can have several classes set at once). To solve it you need to:
Set your output activation to sigmoid:
model.add(Dense(150, activation='sigmoid'))  # one independent sigmoid unit per class, instead of a softmax
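For context, here is a minimal sketch of where such an output layer could sit in the model; the vocabulary size, sequence length, embedding and LSTM sizes are assumptions for illustration, and 150 matches the Dense layer above:

from keras.models import Sequential
from keras.layers import Embedding, LSTM, Dense

vocab_size = 10000   # assumed vocabulary size
seq_length = 50      # assumed input sequence length
num_classes = 150    # one output unit per class, as in Dense(150) above

model = Sequential()
model.add(Embedding(vocab_size, 128, input_length=seq_length))
model.add(LSTM(64))
model.add(Dense(num_classes, activation='sigmoid'))  # sigmoid output for k-hot targets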
Set your targets to indicator (k-hot) encoding:
If you have e.g. 4 classes and a given example has classes 0 and 2 set, your target should be
[1, 0, 1, 0]
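A short helper for building such indicator vectors, as a sketch assuming NumPy and that each example comes with a list of its active class indices:

import numpy as np

def to_k_hot(class_indices, num_classes):
    # class_indices: one list of active class indices per example, e.g. [[0, 2], [1]]
    targets = np.zeros((len(class_indices), num_classes), dtype='float32')
    for row, indices in enumerate(class_indices):
        targets[row, indices] = 1.0
    return targets

print(to_k_hot([[0, 2]], 4))  # [[1. 0. 1. 0.]]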
Use the following loss (per-class binary cross-entropy, averaged over all output units):

import keras.backend as K

def multiclass_loss(y_true, y_pred):
    EPS = 1e-5
    # Clip predictions away from 0 and 1 to avoid log(0)
    y_pred = K.clip(y_pred, EPS, 1 - EPS)
    return -K.mean((1 - y_true) * K.log(1 - y_pred) + y_true * K.log(y_pred))

model.compile(optimizer=..., loss=multiclass_loss)
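This custom loss is the same quantity as Keras's built-in binary cross-entropy applied to sigmoid outputs, so compiling with loss='binary_crossentropy' works as well. A sketch of compiling and training that way, where x_train and y_train are hypothetical and y_train holds the k-hot vectors built as above:

model.compile(optimizer='adam', loss='binary_crossentropy')
model.fit(x_train, y_train, epochs=10, batch_size=32)

At prediction time model.predict returns one probability per class; thresholding these (e.g. at 0.5) yields the k-hot output vector.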