mygrad.operation_base.Operation

class mygrad.operation_base.Operation
Base class for all tensor operations that support back-propagation of gradients.
Consider the Operation-instance f. A forward pass through f is defined via f.__call__(...). Thus, given tensors a and b, a computational graph is defined by f.__call__(a, b) -> c, where the "creator" of tensor c is recorded as f:

    (node: a) --+
                +-> [operation: f(a, b)] --> (node: c)
    (node: b) --+
Back-propagating through c will instruct f to back-propagate the gradient to its inputs, which are recorded as a and b. Each node then back-propagates to any Operation-instance that is recorded as its creator, and so on.
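For concreteness, below is a minimal sketch of a user-defined operation following the subclassing pattern described above. The names MultiplyConstant and multiply_constant are hypothetical, used only for illustration, and the use of Tensor._op to register the operation as the creator of the output tensor is an assumption drawn from MyGrad's guide on writing custom operations:

    from numbers import Real

    import numpy as np

    from mygrad import Tensor
    from mygrad.operation_base import Operation


    class MultiplyConstant(Operation):
        """Hypothetical op computing f(x) = x * val for a constant val."""

        def __call__(self, x: Tensor, val: Real) -> np.ndarray:
            # Record the input tensor(s); `backward` reads these from
            # `self.variables` when routing gradients.
            self.variables = (x,)
            self.val = val
            # The forward pass consumes and returns raw numpy data.
            return x.data * val

        def backward_var(self, grad: np.ndarray, index: int, **kwargs) -> np.ndarray:
            # grad is dL/df; for f(x) = x * val, dL/dx = grad * val.
            # `index` identifies which entry of `self.variables` the
            # gradient is being computed for (here there is only one).
            return grad * self.val


    def multiply_constant(x, val: Real) -> Tensor:
        # Assumption: Tensor._op wires the op into the computational graph,
        # recording MultiplyConstant as the creator of the returned tensor.
        return Tensor._op(MultiplyConstant, x, op_args=(val,))

In this pattern, __call__ consumes and produces raw numpy arrays; it is the surrounding graph machinery that wraps the result in a Tensor and records f as its creator.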
Methods

__init__()

__call__(*input_vars, **kwargs)
    Performs a forward pass, f, of this Operation.

backward(grad, **kwargs)
    Back-propagates the gradient through all of the operation's inputs, which are stored in the tuple self.variables.

backward_var(grad, index, **kwargs)
    Given grad = dℒ/df, computes ∂ℒ/∂x_i, where x_i is one of x1, ..., xn (see the sketch following this table).

grad_post_process_fn(grad, var_shape)
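The index argument of backward_var selects which input the partial derivative is taken with respect to. The following hedged sketch, reusing the imports from the example above and a hypothetical two-input multiplication f(a, b) = a * b, makes the contract concrete:

    class Multiply(Operation):  # hypothetical, for illustration only
        def __call__(self, a: Tensor, b: Tensor) -> np.ndarray:
            self.variables = (a, b)
            return a.data * b.data

        def backward_var(self, grad: np.ndarray, index: int, **kwargs) -> np.ndarray:
            a, b = self.variables
            if index == 0:   # gradient w.r.t. a: dL/da = grad * b
                return grad * b.data
            else:            # gradient w.r.t. b: dL/db = grad * a
                return grad * a.data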
Attributes
can_return_view
variables
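To illustrate the back-propagation flow described above, here is a hedged usage sketch that assumes the hypothetical multiply_constant helper from the earlier example, and that calling backward() with no argument seeds dℒ/df with ones:

    >>> import mygrad as mg
    >>> x = mg.tensor([1.0, 2.0, 3.0])
    >>> y = multiply_constant(x, 4.0)  # forward pass via Operation.__call__
    >>> y.backward()                   # dispatches Operation.backward
    >>> x.grad                         # backward_var returned grad * val
    array([4., 4., 4.])

Calling y.backward() instructs the creator operation to invoke backward_var once per entry of self.variables, after which each input node propagates the gradient to its own creator in turn.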