afnio.autodiff.basic_ops#
Classes

- `Add`: Implements an addition operation for `Variable` instances.
- `Split`: Implements a split operation for `Variable` instances.
- `Sum`: Implements a summation operation for a list of `Variable` instances.
- class afnio.autodiff.basic_ops.Add(*args, **kwargs)[source]#
Bases: `Function`

Implements an addition operation for `Variable` instances within the `afnio` framework, supporting automatic differentiation.

This class inherits from `autodiff.Function` and requires both the `forward` and `backward` methods to be defined.

The `Add` function supports both scalar and list `.data` fields:

- Scalars: adds numerical values (`int`, `float`) or concatenates strings.
- Lists: performs element-wise addition of corresponding elements from the lists. Lists must be of the same length.

It automatically handles type-based operations:

- For numerical data (`int`, `float`), it performs arithmetic addition.
- For strings, it concatenates the values.
- Mixed types (e.g., a string and a number) are converted appropriately before the addition is performed.

This operation also tracks `Variable` dependencies, enabling automatic gradient computation through backpropagation.

Example with scalar inputs:
>>> x = Variable(data="abc", role="first input", requires_grad=True) >>> y = Variable(data="def", role="second input", requires_grad=False) >>> result = Add.apply(x, y) >>> result.data 'abcdef' >>> result.role 'first input and second input' >>> result.requires_grad True >>> g = Variable(data="MY_FEEDBACK", role="add gradient") >>> result.backward(g) >>> x.grad[0].data 'Here is the combined feedback we got for this specific first input and other variables: MY_FEEDBACK' >>> x.grad[0].role 'feedback to first input'
Example with batched inputs:
```python
>>> x = Variable(data=[1, 2, 3], role="first input", requires_grad=True)
>>> y = Variable(data=[4, 5, 6], role="second input", requires_grad=False)
>>> result = Add.apply(x, y)
>>> result.data
[5, 7, 9]
>>> result.role
'first input and second input'
>>> result.requires_grad
True
```
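The examples above cover strings and numeric lists; for numeric scalars, a minimal hedged sketch (the import paths and the exact `role` string produced are assumptions, inferred from the examples above rather than confirmed by the source):

```python
from afnio import Variable  # assumed import path
from afnio.autodiff.basic_ops import Add

a = Variable(data=2, role="first input", requires_grad=True)
b = Variable(data=3.5, role="second input", requires_grad=False)
out = Add.apply(a, b)
# Per the arithmetic-addition rule above, out.data should be 5.5.
# out.role presumably combines the roles ("first input and second
# input"), following the pattern of the scalar string example.
```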
- classmethod apply(*args, **kwargs)#
Applies the forward function of the custom Function class.
This method handles both cases: when `setup_context` is defined separately to set up the `ctx` (context) object, and when the context is set up within the `forward` method itself.
- static backward(ctx, grad_output)[source]#
Define a formula for differentiating the operation with backward mode automatic differentiation.
This function is to be overridden by all subclasses.
It must accept a context `ctx` as the first argument, followed by as many outputs as `forward()` returned (`None` will be passed in for non-Variable outputs of the forward function), and it should return as many variables as there were inputs to `forward()`. Each argument is the gradient w.r.t. the given output, and each returned value should be the gradient w.r.t. the corresponding input. If an input is not a `Variable`, or is a `Variable` not requiring grads, you can just pass `None` as a gradient for that input.

The context can be used to retrieve variables saved during the forward pass. It also has an attribute `ctx.needs_input_grad`, a tuple of booleans representing whether each input needs gradient. E.g., `backward()` will have `ctx.needs_input_grad[0] = True` if the first input to `forward()` needs its gradient computed w.r.t. the output.
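To illustrate this contract, a minimal hedged sketch of a backward for a hypothetical two-input function (the `TwoInputOp` class and the gradient pass-through are placeholders, not `Add`'s actual logic; the import path is assumed):

```python
from afnio.autodiff import Function  # assumed import path

class TwoInputOp(Function):  # hypothetical illustration, not Add itself
    @staticmethod
    def backward(ctx, grad_output):
        # One return value per forward() input, as the contract above
        # requires; None for any input that does not need a gradient.
        grad_x = grad_output if ctx.needs_input_grad[0] else None
        grad_y = grad_output if ctx.needs_input_grad[1] else None
        return grad_x, grad_y
```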
- static forward(ctx, x, y)[source]#
Define the forward of the custom autodiff Function.
This function is to be overridden by all subclasses. There are two ways to define forward:
Usage 1 (combined forward and ctx):

```python
@staticmethod
def forward(ctx: Any, *args: Any, **kwargs: Any) -> Any:
    pass
```

It must accept a context `ctx` as the first argument, followed by any number of arguments (variables or other types).

Usage 2 (separate forward and ctx):

```python
@staticmethod
def forward(*args: Any, **kwargs: Any) -> Any:
    pass

@staticmethod
def setup_context(ctx: Any, inputs: Tuple[Any, ...], output: Any) -> None:
    pass
```

Here, `forward` no longer accepts a `ctx` argument. Instead, you must also override the `afnio.autodiff.Function.setup_context()` staticmethod to handle setting up the `ctx` object. `output` is the output of the forward, and `inputs` is a tuple of the inputs to the forward.

The context can be used to store arbitrary data that can then be retrieved during the backward pass. Variables should not be stored directly on `ctx`; instead, they should be saved with `ctx.save_for_backward()` if they are intended to be used in `backward`.
- static setup_context(ctx, inputs, output)#
There are two ways to define the forward pass of an `autodiff.Function`. Either:

1. Override `forward` with the signature `forward(ctx, *args, **kwargs)` and do not override `setup_context`. Setting up the `ctx` for backward happens inside `forward`.
2. Override `forward` with the signature `forward(*args, **kwargs)` and override `setup_context`. Setting up the `ctx` for backward happens inside `setup_context` (as opposed to inside `forward`).
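To make the first of these two patterns concrete, a minimal hedged sketch of a custom `Function` using the combined forward-and-ctx signature (the `Passthrough` class is hypothetical and not part of afnio; the import path is assumed):

```python
from afnio.autodiff import Function  # assumed import path

class Passthrough(Function):
    """Hypothetical op: returns its single input unchanged."""

    @staticmethod
    def forward(ctx, x):
        # Usage 1: the ctx is set up inside forward itself.
        ctx.save_for_backward(x)  # Variables go through save_for_backward
        return x

    @staticmethod
    def backward(ctx, grad_output):
        # One gradient per forward() input; None if none is required.
        return grad_output if ctx.needs_input_grad[0] else None
```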
- class afnio.autodiff.basic_ops.Split(*args, **kwargs)[source]#
Bases: `Function`

Implements a split operation for `Variable` instances within the `afnio` framework, supporting automatic differentiation.

This class inherits from `Function` and requires both the `forward` and `backward` methods to be defined.

The `Split` function divides the `.data` of the input `Variable` into multiple parts using a specified delimiter `sep`. If `maxsplit` is specified, the split operation is limited to a maximum number of splits. It handles both scalar and list `.data` fields:

- Scalars: the scalar `.data` (a single string) is split into substrings based on the specified `sep` and `maxsplit` parameters.
- Lists: each element of the list `.data` (strings) is split individually. If splits of varying lengths occur, shorter splits are automatically padded with empty strings to ensure consistent dimensions.

During backpropagation, feedback is collected and aggregated across all split parts. The combined feedback is propagated back to the original input `Variable`, allowing for the proper computation of gradients.

Example with scalar inputs:
>>> x = Variable(data="afnio is great!", role="sentence", requires_grad=True) >>> result = Split.apply(x, sep=" ", maxsplit=1) >>> [var.data for var in result] ['afnio', 'is great!'] >>> result[0].role 'split part 0 of sentence' >>> g_1 = Variable(data="MY_FIRST_FEEDBACK", role="gradient") >>> g_2 = Variable(data="MY_SECOND_FEEDBACK", role="gradient") >>> result[0].backward(g_1, retain_graph=True) >>> result[1].backward(g_2) >>> x.grad[0].data 'Here is the combined feedback we got for this specific sentence and other variables: <ITEM>MY_FIRST_FEEDBACK</ITEM><ITEM></ITEM>' >>> x.grad[0].role 'feedback to sentence' >>> x.grad[1].data 'Here is the combined feedback we got for this specific sentence and other variables: <ITEM></ITEM><ITEM>MY_SECOND_FEEDBACK</ITEM>' >>> x.grad[1].role 'feedback to sentence'
Example with batched inputs:
```python
>>> x = Variable(
...     data=["afnio is great!", "Deep learning"],
...     role="sentences",
...     requires_grad=True
... )
>>> result = Split.apply(x, sep=" ", maxsplit=2)
>>> [var.data for var in result]
[['afnio', 'Deep'], ['is', 'learning'], ['great!', '']]
>>> g = Variable(data="MY_FEEDBACK", role="gradient")
>>> result[1].backward(g)
>>> x.grad[0].data
'Here is the combined feedback we got for this specific sentences and other variables: <ITEM></ITEM><ITEM>MY_FEEDBACK</ITEM><ITEM></ITEM>'
>>> x.grad[0].role
'feedback to sentences'
```
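The `forward` signature below, `forward(ctx, x, sep=None, maxsplit=-1)`, suggests `str.split`-style defaults; a minimal hedged sketch under that assumption:

```python
# Hedged sketch, assuming str.split semantics for the defaults:
# with maxsplit=-1, every occurrence of sep should be split on.
x = Variable(data="a,b,c", role="csv row", requires_grad=True)
parts = Split.apply(x, sep=",")
# Expected under that assumption: three parts whose .data values
# are 'a', 'b', and 'c'.
```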
- classmethod apply(*args, **kwargs)#
Applies the forward function of the custom Function class.
This method handles both cases: when `setup_context` is defined separately to set up the `ctx` (context) object, and when the context is set up within the `forward` method itself.
- static backward(ctx, *grad_outputs)[source]#
Define a formula for differentiating the operation with backward mode automatic differentiation.
This function is to be overridden by all subclasses.
It must accept a context `ctx` as the first argument, followed by as many outputs as `forward()` returned (`None` will be passed in for non-Variable outputs of the forward function), and it should return as many variables as there were inputs to `forward()`. Each argument is the gradient w.r.t. the given output, and each returned value should be the gradient w.r.t. the corresponding input. If an input is not a `Variable`, or is a `Variable` not requiring grads, you can just pass `None` as a gradient for that input.

The context can be used to retrieve variables saved during the forward pass. It also has an attribute `ctx.needs_input_grad`, a tuple of booleans representing whether each input needs gradient. E.g., `backward()` will have `ctx.needs_input_grad[0] = True` if the first input to `forward()` needs its gradient computed w.r.t. the output.
- static forward(ctx, x, sep=None, maxsplit=-1)[source]#
Define the forward of the custom autodiff Function.
This function is to be overridden by all subclasses. There are two ways to define forward:
Usage 1 (combined forward and ctx):

```python
@staticmethod
def forward(ctx: Any, *args: Any, **kwargs: Any) -> Any:
    pass
```

It must accept a context `ctx` as the first argument, followed by any number of arguments (variables or other types).

Usage 2 (separate forward and ctx):

```python
@staticmethod
def forward(*args: Any, **kwargs: Any) -> Any:
    pass

@staticmethod
def setup_context(ctx: Any, inputs: Tuple[Any, ...], output: Any) -> None:
    pass
```

Here, `forward` no longer accepts a `ctx` argument. Instead, you must also override the `afnio.autodiff.Function.setup_context()` staticmethod to handle setting up the `ctx` object. `output` is the output of the forward, and `inputs` is a tuple of the inputs to the forward.

The context can be used to store arbitrary data that can then be retrieved during the backward pass. Variables should not be stored directly on `ctx`; instead, they should be saved with `ctx.save_for_backward()` if they are intended to be used in `backward`.
- static setup_context(ctx, inputs, output)#
There are two ways to define the forward pass of an `autodiff.Function`. Either:

1. Override `forward` with the signature `forward(ctx, *args, **kwargs)` and do not override `setup_context`. Setting up the `ctx` for backward happens inside `forward`.
2. Override `forward` with the signature `forward(*args, **kwargs)` and override `setup_context`. Setting up the `ctx` for backward happens inside `setup_context` (as opposed to inside `forward`).
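To complement the combined-signature sketch shown under `Add`, a minimal hedged sketch of the second pattern, with `forward` and `setup_context` kept separate (the `First` class is hypothetical; the import path is assumed):

```python
from typing import Any, Tuple

from afnio.autodiff import Function  # assumed import path

class First(Function):
    """Hypothetical op: returns the first of its two inputs."""

    @staticmethod
    def forward(x, y):
        # Usage 2: forward takes no ctx argument.
        return x

    @staticmethod
    def setup_context(ctx: Any, inputs: Tuple[Any, ...], output: Any) -> None:
        # The ctx is set up here instead of inside forward.
        x, y = inputs
        ctx.save_for_backward(x, y)

    @staticmethod
    def backward(ctx, grad_output):
        # One gradient per forward() input; None where not required.
        grad_x = grad_output if ctx.needs_input_grad[0] else None
        return grad_x, None
```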
- class afnio.autodiff.basic_ops.Sum(*args, **kwargs)[source]#
Bases: `Function`

Implements a summation operation for a list of `Variable` instances within the `afnio` framework, supporting automatic differentiation.

This class inherits from `Function` and requires both the `forward` and `backward` methods to be defined.

The `Sum` function aggregates the `.data`, `.role`, and `.requires_grad` attributes of all input `Variable` instances into a single `Variable`. It supports both scalar and list `.data` fields:

- Scalars: computes the arithmetic sum for numerical data (`int`, `float`), or concatenates all string values, wrapping each in `<ITEM></ITEM>` tags.
- Lists: aggregates the corresponding elements of the lists. For numerical data, it sums the corresponding elements; for string data, it concatenates them, wrapping each element in `<ITEM></ITEM>` tags.

During backpropagation, the function distributes the gradient to all input `Variable` instances that require gradients.

Example with scalar inputs:
>>> x = Variable(data="abc", role="first input", requires_grad=True) >>> y = Variable(data="def", role="second input", requires_grad=False) >>> result = Sum.apply([x, y]) >>> result.data '<ITEM>abc</ITEM><ITEM>def</ITEM>' >>> result.role 'first input and second input' >>> result.requires_grad True >>> g = Variable(data="MY_FEEDBACK", role="add gradient") >>> result.backward(g) >>> x.grad[0].data 'Here is the combined feedback we got for this specific first input and other variables: MY_FEEDBACK' >>> x.grad[0].role 'feedback to first input'
Example with batched inputs:
```python
>>> x = Variable(data=[1, 2, 3.5], role="first input", requires_grad=True)
>>> y = Variable(data=[4, 5, 6], role="second input", requires_grad=False)
>>> result = Sum.apply([x, y])
>>> result.data
[5, 7, 9.5]
>>> result.role
'first input and second input'
>>> result.requires_grad
True
```
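Because `Sum` takes a list, it is not limited to two inputs; a minimal hedged sketch with three numeric scalars (the combined `role` string is an assumption):

```python
a = Variable(data=1, role="a", requires_grad=True)
b = Variable(data=2, role="b", requires_grad=False)
c = Variable(data=3, role="c", requires_grad=False)
total = Sum.apply([a, b, c])
# Per the scalar rule above, total.data should be 6 (arithmetic sum).
# The combined role presumably joins the input roles, as in the
# two-input example, but the exact format is an assumption.
```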
- classmethod apply(*args, **kwargs)#
Applies the forward function of the custom Function class.
This method handles both cases: when `setup_context` is defined separately to set up the `ctx` (context) object, and when the context is set up within the `forward` method itself.
- static backward(ctx, grad_output)[source]#
Define a formula for differentiating the operation with backward mode automatic differentiation.
This function is to be overridden by all subclasses.
It must accept a context `ctx` as the first argument, followed by as many outputs as `forward()` returned (`None` will be passed in for non-Variable outputs of the forward function), and it should return as many variables as there were inputs to `forward()`. Each argument is the gradient w.r.t. the given output, and each returned value should be the gradient w.r.t. the corresponding input. If an input is not a `Variable`, or is a `Variable` not requiring grads, you can just pass `None` as a gradient for that input.

The context can be used to retrieve variables saved during the forward pass. It also has an attribute `ctx.needs_input_grad`, a tuple of booleans representing whether each input needs gradient. E.g., `backward()` will have `ctx.needs_input_grad[0] = True` if the first input to `forward()` needs its gradient computed w.r.t. the output.
- static forward(ctx, x)[source]#
Define the forward of the custom autodiff Function.
This function is to be overridden by all subclasses. There are two ways to define forward:
Usage 1 (combined forward and ctx):

```python
@staticmethod
def forward(ctx: Any, *args: Any, **kwargs: Any) -> Any:
    pass
```

It must accept a context `ctx` as the first argument, followed by any number of arguments (variables or other types).

Usage 2 (separate forward and ctx):

```python
@staticmethod
def forward(*args: Any, **kwargs: Any) -> Any:
    pass

@staticmethod
def setup_context(ctx: Any, inputs: Tuple[Any, ...], output: Any) -> None:
    pass
```

Here, `forward` no longer accepts a `ctx` argument. Instead, you must also override the `afnio.autodiff.Function.setup_context()` staticmethod to handle setting up the `ctx` object. `output` is the output of the forward, and `inputs` is a tuple of the inputs to the forward.

The context can be used to store arbitrary data that can then be retrieved during the backward pass. Variables should not be stored directly on `ctx`; instead, they should be saved with `ctx.save_for_backward()` if they are intended to be used in `backward`.
- static setup_context(ctx, inputs, output)#
There are two ways to define the forward pass of an `autodiff.Function`. Either:

1. Override `forward` with the signature `forward(ctx, *args, **kwargs)` and do not override `setup_context`. Setting up the `ctx` for backward happens inside `forward`.
2. Override `forward` with the signature `forward(*args, **kwargs)` and override `setup_context`. Setting up the `ctx` for backward happens inside `setup_context` (as opposed to inside `forward`).
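As noted above, `Sum`'s backward distributes the gradient to every input that requires it. A minimal hedged sketch of that fan-out pattern (the `SumLike` class is hypothetical, not afnio's actual implementation; it assumes `ctx.needs_input_grad` carries one boolean per input `Variable`, which may not match `Sum`'s real list-argument layout):

```python
from afnio.autodiff import Function  # assumed import path

class SumLike(Function):  # hypothetical illustration, not afnio's Sum
    @staticmethod
    def backward(ctx, grad_output):
        # Fan the incoming gradient out to every input that needs one,
        # returning None for inputs that do not require gradients.
        return tuple(
            grad_output if needs_grad else None
            for needs_grad in ctx.needs_input_grad
        )
```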