chainer.functions.connectionist_temporal_classification

chainer.functions.connectionist_temporal_classification(x, t, blank_symbol, input_length=None, label_length=None, reduce='mean')[source]

Connectionist Temporal Classification loss function.

Connectionist Temporal Classification (CTC) [Graves2006] is a loss function for sequence labeling where the alignment between the inputs and the targets is unknown. See also [Graves2012].
The output is a variable whose value depends on the value of the option reduce. If it is 'no', it holds the samplewise loss values. If it is 'mean', it takes the mean of the loss values.

Parameters:
- x (list or tuple of Variable) – A list of unnormalized probabilities for labels. Each element of x, x[i], is a Variable object, which has shape (B, V), where B is the batch size and V is the number of labels. The softmax of x[i] represents the probabilities of the labels at time i.
- t (Variable or N-dimensional array) – A matrix of expected label sequences. Its shape is (B, M), where B is the batch size and M is the maximum length of the label sequences. All elements in t must be less than V, the number of labels.
- blank_symbol (int) – Index of the blank symbol. This value must be non-negative.
- input_length (Variable or N-dimensional array) – Length of the valid sequence for each sample in the mini-batch x (optional). Its shape must be (B,). If input_length is omitted or None, all of x is assumed to be valid input.
- label_length (Variable or N-dimensional array) – Length of the label sequence for each sample in the mini-batch t (optional). Its shape must be (B,). If label_length is omitted or None, all of t is assumed to be valid input.
- reduce (str) – Reduction option. Its value must be either 'mean' or 'no'. Otherwise, ValueError is raised.

Returns: A variable holding the CTC loss. If reduce is 'no', the output variable holds an array whose shape is (B,), where B is the number of samples. If it is 'mean', it holds a scalar value.

Return type: Variable

Note
You need to input x without applying activation functions (e.g. the softmax function), because this function applies softmax to x internally before calculating the CTC loss, to avoid numerical limitations. You also need to apply the softmax function to the forwarded values before you decode them.

Note
This function is differentiable only with respect to x.

Note

This function supports (batch, sequence, 1-dimensional input)-data.
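As noted above, decoding should be done on softmax-normalized outputs. As one common decoding scheme (not part of this function), here is a best-path (greedy) decoder sketched in plain NumPy; the name greedy_ctc_decode is a hypothetical helper, not a Chainer API:

```python
import numpy as np

def greedy_ctc_decode(probs, blank=0):
    """Best-path decoding: argmax per frame, collapse repeats, drop blanks.

    probs: (T, V) per-frame label probabilities (e.g. the softmax of the
    network outputs, as the note above requires before decoding).
    """
    best = probs.argmax(axis=1)          # most likely label per time step
    out, prev = [], None
    for p in best:
        if p != prev and p != blank:     # collapse repeats, skip blanks
            out.append(int(p))
        prev = p
    return out
```

Note that repeats separated by a blank frame are kept, while consecutive repeats of the same label are collapsed into one, which matches the CTC alignment convention.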
[Graves2006] Alex Graves, Santiago Fernandez, Faustino Gomez, Jurgen Schmidhuber, Connectionist Temporal Classification: Labelling Unsegmented Sequence Data with Recurrent Neural Networks [Graves2012] Alex Graves, Supervised Sequence Labelling with Recurrent Neural Networks - x (list or tuple of
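For intuition about what the returned value is, the CTC negative log-likelihood for a single sample can be sketched with the forward (alpha) recursion described in [Graves2006]. This is an illustrative, unoptimized NumPy sketch, not Chainer's implementation; the name ctc_loss_naive is hypothetical, and unlike the Chainer function it takes already log-softmax-normalized probabilities:

```python
import numpy as np

def ctc_loss_naive(log_probs, label, blank=0):
    """Illustrative CTC loss for one sample via the forward (alpha) recursion.

    log_probs: (T, V) log-probabilities over V labels for T time steps.
    label:     target label sequence (no blanks, each element < V).
    Returns the negative log-likelihood of the label given log_probs.
    """
    T, V = log_probs.shape
    # Extended label sequence with blanks interleaved: length 2L + 1
    ext = [blank]
    for c in label:
        ext += [int(c), blank]
    S = len(ext)
    neg_inf = -np.inf

    def logsumexp(*xs):
        xs = [x for x in xs if x > neg_inf]
        if not xs:
            return neg_inf
        m = max(xs)
        return m + np.log(sum(np.exp(x - m) for x in xs))

    # alpha[s]: log-prob of all alignments ending in ext[s] at the current step
    alpha = np.full(S, neg_inf)
    alpha[0] = log_probs[0, ext[0]]
    if S > 1:
        alpha[1] = log_probs[0, ext[1]]

    for t in range(1, T):
        new = np.full(S, neg_inf)
        for s in range(S):
            cands = [alpha[s]]
            if s > 0:
                cands.append(alpha[s - 1])
            # Skipping the previous blank is allowed unless the current symbol
            # is a blank or repeats the symbol two positions back
            if s > 1 and ext[s] != blank and ext[s] != ext[s - 2]:
                cands.append(alpha[s - 2])
            new[s] = logsumexp(*cands) + log_probs[t, ext[s]]
        alpha = new

    # Valid alignments end in the last label or the trailing blank
    return -logsumexp(alpha[S - 1], alpha[S - 2] if S > 1 else neg_inf)
```

For example, with two uniform frames over {blank, 1} and target [1], the three valid alignments (blank,1), (1,blank), (1,1) each have probability 0.25, so the loss is -log(0.75).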