afnio.autodiff.function#
Classes

Function: Base class to create custom autodiff.Function.
- class afnio.autodiff.function.Function(*args, **kwargs)[source]#
Bases: object

Base class to create custom autodiff.Function.
To create a custom autodiff.Function, subclass this class and implement the forward() and backward() static methods. Then, to use your custom op in the forward pass, call the class method apply(). Do not call forward() directly.

Example:
>>> class Func(Function):
>>>     @staticmethod
>>>     def forward(ctx, x: hf.Variable):
>>>         reverse = x.data[::-1]
>>>         out = hf.Variable(data=reverse, role=x.role, requires_grad=True)
>>>         ctx.save_for_backward(x, reverse, out)
>>>         return out
>>>
>>>     @staticmethod
>>>     def backward(ctx, grad_out):
>>>         x, reverse, out = ctx.saved_variables
>>>         grad = f"Here is the feedback for {x.role} (reversed): {grad_out.grad}"
>>>         role = f"Feedback to {x.role}"
>>>         x.grad = hf.Variable(data=grad, role=role)
>>>         return x.grad
>>>
>>> a = hf.Variable(data="This is a string", role="Input string", requires_grad=True)
>>> c = Func.apply(a)
- classmethod apply(*args, **kwargs)[source]#
Applies the forward function of the custom Function class.
This method handles cases where setup_context is defined to set up the ctx (context) object separately or within the forward method itself.
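To make the dispatch concrete, here is a minimal sketch of how an apply() classmethod could support both styles. This is a toy illustration, not afnio's actual implementation: the names _Ctx, ToyFunction, and Reverse are invented, and the real library most likely detects whether setup_context was overridden rather than inspecting parameter names as done here.

```python
import inspect


class _Ctx:
    """Minimal stand-in for the context object passed to forward/backward."""

    def __init__(self):
        self.saved = ()

    def save_for_backward(self, *objs):
        self.saved = objs


class ToyFunction:
    @staticmethod
    def forward(*args, **kwargs):
        raise NotImplementedError

    @staticmethod
    def setup_context(ctx, inputs, output):
        raise NotImplementedError

    @classmethod
    def apply(cls, *args, **kwargs):
        ctx = _Ctx()
        params = list(inspect.signature(cls.forward).parameters)
        if params and params[0] == "ctx":
            # Usage 1: forward receives ctx and sets it up itself.
            return cls.forward(ctx, *args, **kwargs)
        # Usage 2: call the ctx-free forward, then let setup_context
        # record whatever backward will need.
        output = cls.forward(*args, **kwargs)
        cls.setup_context(ctx, args, output)
        return output


class Reverse(ToyFunction):
    @staticmethod
    def forward(x):  # no ctx parameter: Usage 2
        return x[::-1]

    @staticmethod
    def setup_context(ctx, inputs, output):
        ctx.save_for_backward(inputs[0], output)


print(Reverse.apply("abc"))  # prints "cba"
```

Either way, callers only ever see apply(); the context plumbing stays internal.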
- static backward(ctx, *grad_outputs)[source]#
Define a formula for differentiating the operation with backward mode automatic differentiation.
This function is to be overridden by all subclasses.
It must accept a context ctx as the first argument, followed by as many outputs as forward() returned (None will be passed in for non-Variable outputs of the forward function), and it should return as many variables as there were inputs to forward(). Each argument is the gradient w.r.t. the given output, and each returned value should be the gradient w.r.t. the corresponding input. If an input is not a Variable, or is a Variable not requiring grads, you can just pass None as a gradient for that input.

The context can be used to retrieve variables saved during the forward pass. It also has an attribute ctx.needs_input_grad, a tuple of booleans representing whether each input needs gradient. E.g., backward() will have ctx.needs_input_grad[0] = True if the first input to forward() needs the gradient computed w.r.t. the output.
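The contract above (one gradient argument per output, one return value per input, None for inputs that don't need gradients) can be sketched with plain Python. ToyCtx and the standalone backward function are hypothetical stand-ins for illustration, not afnio classes:

```python
class ToyCtx:
    def __init__(self, needs_input_grad):
        # One boolean per input to forward().
        self.needs_input_grad = needs_input_grad


def backward(ctx, grad_out):
    # forward() took two inputs, so backward() returns two gradients,
    # skipping the work for inputs that do not require grad.
    grad_a = f"feedback for a: {grad_out}" if ctx.needs_input_grad[0] else None
    grad_b = f"feedback for b: {grad_out}" if ctx.needs_input_grad[1] else None
    return grad_a, grad_b


ctx = ToyCtx(needs_input_grad=(True, False))
print(backward(ctx, "too verbose"))
# the second element is None: input b does not require grad
```

Checking needs_input_grad lets a backward skip expensive gradient construction (e.g. an extra LLM call) for inputs that will never use it.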
- static forward(*args, **kwargs)[source]#
Define the forward of the custom autodiff Function.
This function is to be overridden by all subclasses. There are two ways to define forward:
Usage 1 (Combined forward and ctx):
@staticmethod
def forward(ctx: Any, *args: Any, **kwargs: Any) -> Any:
    pass
It must accept a context ctx as the first argument, followed by any number of arguments (variables or other types).
Usage 2 (Separate forward and ctx):
@staticmethod
def forward(*args: Any, **kwargs: Any) -> Any:
    pass

@staticmethod
def setup_context(ctx: Any, inputs: Tuple[Any, ...], output: Any) -> None:
    pass
The forward no longer accepts a ctx argument. Instead, you must also override the afnio.autodiff.Function.setup_context() static method to handle setting up the ctx object. output is the output of the forward, and inputs is a tuple of the inputs to the forward.
The context can be used to store arbitrary data that can then be retrieved during the backward pass. Variables should not be stored directly on ctx. Instead, variables should be saved with ctx.save_for_backward() if they are intended to be used in backward().
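As a minimal sketch of the save/retrieve round trip (ToyCtx is an invented stand-in; only the method names save_for_backward and saved_variables mirror the docs above), whatever forward records is handed back to backward in the same order:

```python
class ToyCtx:
    def __init__(self):
        self._saved = ()

    def save_for_backward(self, *variables):
        # Record variables during the forward pass...
        self._saved = variables

    @property
    def saved_variables(self):
        # ...and hand them back, in order, during the backward pass.
        return self._saved


ctx = ToyCtx()
ctx.save_for_backward("input", "intermediate", "output")
x, mid, out = ctx.saved_variables
print(x, mid, out)  # prints "input intermediate output"
```

Routing variables through save_for_backward (rather than ad-hoc ctx attributes) keeps the framework in control of what the backward pass can see.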
- static setup_context(ctx, inputs, output)[source]#
There are two ways to define the forward pass of an autodiff.Function.
Either:

- Override forward with the signature forward(ctx, *args, **kwargs) and do not override setup_context. Setting up the ctx for backward happens inside the forward.
- Override forward with the signature forward(*args, **kwargs) and override setup_context. Setting up the ctx for backward happens inside setup_context (as opposed to inside the forward).
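The two styles are equivalent in what they leave in the context, as this toy sketch shows (SimpleCtx, UpperV1, and UpperV2 are illustrative names, not part of afnio):

```python
class SimpleCtx:
    def __init__(self):
        self.saved = ()

    def save_for_backward(self, *objs):
        self.saved = objs


# Style 1: combined -- forward receives ctx and sets it up itself.
class UpperV1:
    @staticmethod
    def forward(ctx, text):
        out = text.upper()
        ctx.save_for_backward(text, out)
        return out


# Style 2: separate -- forward is ctx-free; setup_context records the state.
class UpperV2:
    @staticmethod
    def forward(text):
        return text.upper()

    @staticmethod
    def setup_context(ctx, inputs, output):
        ctx.save_for_backward(inputs[0], output)


ctx1, ctx2 = SimpleCtx(), SimpleCtx()
out1 = UpperV1.forward(ctx1, "hi")
out2 = UpperV2.forward("hi")
UpperV2.setup_context(ctx2, ("hi",), out2)
assert out1 == out2 and ctx1.saved == ctx2.saved
print(out1)  # prints "HI"
```

Style 2 separates the pure computation from the bookkeeping, which can make the forward easier to test on its own.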