    model.compile(loss=losses.mean_squared_error, optimizer='sgd')

You can either pass the name of an existing loss function, or pass a TensorFlow/Theano symbolic function that returns a scalar for each data-point and takes the following two arguments:

  • y_true: True labels. TensorFlow/Theano tensor.
  • y_pred: Predictions. TensorFlow/Theano tensor of the same shape as y_true.

The actual optimized objective is the mean of the output array across all datapoints.

For a few examples of such functions, check out the built-in loss functions listed below.
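The contract described above (a scalar per data point in, the mean across all data points actually optimized) can be sketched in plain NumPy. This is illustrative only, not the Keras backend implementation; the function and variable names are made up for the example:

```python
import numpy as np

# Illustrative sketch of the loss-function contract: the function returns
# one scalar per data point, and the mean across all data points is the
# objective that actually gets optimized.
def mse_per_sample(y_true, y_pred):
    return np.mean(np.square(y_pred - y_true), axis=-1)

y_true = np.array([[0.0, 1.0], [1.0, 0.0]])
y_pred = np.array([[0.5, 0.5], [1.0, 0.0]])

per_sample = mse_per_sample(y_true, y_pred)  # shape (2,): one scalar per sample
objective = per_sample.mean()                # scalar the optimizer minimizes
```

A real custom loss would use backend tensor ops instead of NumPy, but the shape contract is the same.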

Available loss functions

mean_squared_error

    keras.losses.mean_squared_error(y_true, y_pred)

mean_absolute_error

    keras.losses.mean_absolute_error(y_true, y_pred)

mean_absolute_percentage_error

    keras.losses.mean_absolute_percentage_error(y_true, y_pred)

mean_squared_logarithmic_error

    keras.losses.mean_squared_logarithmic_error(y_true, y_pred)

squared_hinge

    keras.losses.squared_hinge(y_true, y_pred)

hinge

    keras.losses.hinge(y_true, y_pred)

categorical_hinge

    keras.losses.categorical_hinge(y_true, y_pred)

logcosh

log(cosh(x)) is approximately equal to (x ** 2) / 2 for small x and to abs(x) - log(2) for large x. This means that 'logcosh' works mostly like the mean squared error, but will not be so strongly affected by the occasional wildly incorrect prediction.

Arguments

  • y_true: tensor of true targets.
  • y_pred: tensor of predicted targets.

Returns

Tensor with one scalar loss entry per sample.
huber_loss

    keras.losses.huber_loss(y_true, y_pred, delta=1.0)
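The `delta` parameter in the signature above is the threshold at which the standard Huber loss switches from quadratic to linear. A hypothetical NumPy reference (an assumption based on the usual Huber definition, not the Keras source) makes that split explicit:

```python
import numpy as np

# Hypothetical reference for the standard Huber loss: quadratic for
# |error| <= delta, linear beyond. Not the Keras implementation.
def huber(y_true, y_pred, delta=1.0):
    err = np.abs(y_true - y_pred)
    quadratic = 0.5 * np.square(err)
    linear = delta * err - 0.5 * delta ** 2
    return np.where(err <= delta, quadratic, linear)

# First error (0.5) falls in the quadratic region, second (3.0) in the linear one.
losses = huber(np.array([0.0, 0.0]), np.array([0.5, 3.0]), delta=1.0)
```

Like logcosh, this limits the influence of large errors while staying smooth near zero.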

categorical_crossentropy

    keras.losses.categorical_crossentropy(y_true, y_pred, from_logits=False, label_smoothing=0)

binary_crossentropy

    keras.losses.binary_crossentropy(y_true, y_pred, from_logits=False, label_smoothing=0)

kullback_leibler_divergence

    keras.losses.kullback_leibler_divergence(y_true, y_pred)

poisson

    keras.losses.poisson(y_true, y_pred)

cosine_proximity

    keras.losses.cosine_proximity(y_true, y_pred, axis=-1)

is_categorical_crossentropy

    keras.losses.is_categorical_crossentropy(loss)

Note: when using the categorical_crossentropy loss, your targets should be in categorical format (e.g. if you have 10 classes, the target for each sample should be a 10-dimensional vector that is all zeros except for a 1 at the index corresponding to the class of the sample). In order to convert integer targets into categorical targets, you can use the Keras utility to_categorical.
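The conversion described above can be sketched in NumPy. The `to_one_hot` helper here is illustrative of what keras.utils.to_categorical does for 1-D integer labels (the real utility also handles shapes and dtypes); it is not the Keras code:

```python
import numpy as np

# Illustrative sketch of integer-to-categorical conversion, like
# keras.utils.to_categorical for 1-D integer labels.
def to_one_hot(y, num_classes):
    out = np.zeros((len(y), num_classes))
    out[np.arange(len(y)), y] = 1.0   # set a 1 at each sample's class index
    return out

labels = np.array([0, 2, 1])
one_hot = to_one_hot(labels, num_classes=3)
```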

When using the sparse_categorical_crossentropy loss, your targets should be integer targets. If you have categorical targets, you should use categorical_crossentropy.

categorical_crossentropy is another term for multi-class log loss.
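The distinction between the two target formats can be checked numerically. This NumPy sketch (illustrative arithmetic, not the Keras graph ops) computes the cross-entropy both ways and shows they agree when the one-hot and integer targets correspond:

```python
import numpy as np

# Toy predicted class probabilities for two samples over three classes.
probs = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.8, 0.1]])
int_targets = np.array([0, 1])        # sparse format: integer class indices
one_hot = np.eye(3)[int_targets]      # categorical format: one-hot rows

# categorical_crossentropy consumes one-hot targets ...
cat_ce = -np.sum(one_hot * np.log(probs), axis=-1)
# ... sparse_categorical_crossentropy indexes the same probabilities directly.
sparse_ce = -np.log(probs[np.arange(len(int_targets)), int_targets])
```

Both reduce to minus the log-probability assigned to the true class; only the target encoding differs.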