#-------------------------------------------------------------
#
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements.  See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership.  The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License.  You may obtain a copy of the License at
#
#   http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied.  See the License for the
# specific language governing permissions and limitations
# under the License.
#
#-------------------------------------------------------------

/*
 * Dropout layer.
 */

forward = function(matrix[double] X, double p, int seed)
    return (matrix[double] out, matrix[double] mask) {
  /*
   * Computes the forward pass for an inverted dropout layer.
   *
   * Drops the inputs element-wise with probability 1-p (i.e., keeps
   * each input with probability p), and divides the kept inputs by p
   * so that their expected values are unchanged.  Because this scaling
   * happens at training time, no rescaling is needed at test time,
   * where the layer is simply skipped.
   *
   * Inputs:
   *  - X: Inputs, of shape (any, any).
   *  - p: Probability of keeping a neuron output.
   *  - seed: [Optional: -1] Random number generator seed to allow for
   *      deterministic evaluation.  Set to -1 for a random seed.
   *
   * Outputs:
   *  - out: Outputs, of same shape as `X`.
   *  - mask: Dropout mask used to compute the output.
   */
  # Normally, we might use something like
  #    `mask = rand(rows=nrow(X), cols=ncol(X), min=0, max=1, seed=seed) <= p`
  # to create a dropout mask.  Fortunately, SystemML has a `sparsity` parameter on
  # the `rand` function that allows us to create a mask directly.
  if (seed == -1) {
    mask = rand(rows=nrow(X), cols=ncol(X), min=1, max=1, sparsity=p)
  } else {
    mask = rand(rows=nrow(X), cols=ncol(X), min=1, max=1, sparsity=p, seed=seed)
  }
  out = X * mask / p
}
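
# Example usage (illustrative sketch only, not part of the layer itself): assuming
# this file sits under the standard `nn/layers/` path of the SystemML-NN library,
# a training-time call could look like the following.
#
#   source("nn/layers/dropout.dml") as dropout
#   X = rand(rows=4, cols=10)        # a small batch of activations
#   p = 0.5                          # keep each activation with probability 0.5
#   [out, mask] = dropout::forward(X, p, -1)
#
# At test/prediction time the layer is simply skipped (use `X` directly); since
# the forward pass already divides by `p`, no extra scaling is required.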

backward = function(matrix[double] dout, matrix[double] X, double p, matrix[double] mask)
    return (matrix[double] dX) {
  /*
   * Computes the backward pass for an inverted dropout layer.
   *
   * Applies the mask to the upstream gradient and divides by p,
   * mirroring the scaling applied in the forward pass.
   *
   * Inputs:
   *  - dout: Gradient wrt `out`, of same shape as `X`.
   *  - X: Inputs, of shape (any, any).
   *  - p: Probability of keeping a neuron output.
   *  - mask: Dropout mask used to compute the output.
   *
   * Outputs:
   *  - dX: Gradient wrt `X`, of same shape as `X`.
   */
  dX = mask / p * dout
}
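
# Example backward usage (illustrative sketch only): given the upstream gradient
# `dout` (same shape as `X`) together with the `p` and `mask` from the forward
# call above, the gradient with respect to the inputs is recovered as
#
#   dX = dropout::backward(dout, X, p, mask)
#
# where `dropout` is the namespace this file was sourced under.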
