
args

The Args module provides Arguments Mapping and Arguments Injection mechanisms in Bridgic.
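
The two mechanisms complement each other: Arguments Mapping (configured per worker via args_mapping_rule) controls how the outputs of a worker's dependencies become its call arguments, while Arguments Injection (declared through From and System parameter defaults) pulls in values that do not travel along dependency edges. The sketch below is a minimal orientation example assembled from the patterns documented further down this page; imports are omitted, as in the examples that follow, and the class name, worker keys, and values are purely illustrative.

class OrientationAutoma(GraphAutoma):
    @worker(is_start=True)
    def load(self, user_input: int) -> int:
        return user_input * 2

    # Arguments Mapping: AS_IS (the default) passes load's return value through unchanged.
    @worker(dependencies=["load"], args_mapping_rule=ArgsMappingRule.AS_IS)
    def transform(self, loaded: int) -> int:
        return loaded + 1

    # Arguments Injection: report depends only on transform, but still reads
    # load's output via From, falling back to 0 if "load" were ever absent.
    @worker(dependencies=["transform"], is_output=True)
    def report(self, transformed: int, loaded: int = From("load", 0)) -> int:
        return transformed + loaded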

ArgsMappingRule

Bases: Enum

Enumeration of Arguments Mapping rules for worker parameter passing.

ArgsMappingRule defines how the return values from predecessor workers are mapped to the parameters of the current worker. This controls the data flow between workers in an automa execution graph.

Attributes:

AS_IS (Enum)
    Preserves the exact order and types of return values from predecessor workers. No unpacking or merging is performed.

UNPACK (Enum)
    Unpacks the return value from the single predecessor worker and passes its elements as individual arguments. Only valid when the current worker has exactly one dependency and the return value is a list, tuple, or dict.

MERGE (Enum)
    Merges all return values from predecessor workers into a single tuple, passed as the only argument of the current worker.

SUPPRESSED (Enum)
    Suppresses all return values from predecessor workers. No arguments are passed to the current worker from its dependencies.

Examples:

class MyAutoma(GraphAutoma):
    @worker(is_start=True)
    def worker_0(self, user_input: int) -> int:
        return user_input + 1

    @worker(dependencies=["worker_0"], args_mapping_rule=ArgsMappingRule.AS_IS)
    def worker_1(self, worker_0_output: int) -> int:
        # Receives the exact return value from worker_0
        return worker_0_output + 1

    @worker(dependencies=["worker_0"], args_mapping_rule=ArgsMappingRule.UNPACK)
    def worker_2(self, user_input: int, result: int) -> int:
        # Unpacks the return value from worker_0 (assuming it returns a tuple)
        return user_input + result

    @worker(dependencies=["worker_0", "worker_1"], args_mapping_rule=ArgsMappingRule.MERGE)
    def worker_3(self, all_results: tuple) -> int:
        # Receives all results as a single tuple
        return sum(all_results)

    @worker(dependencies=["worker_3"], args_mapping_rule=ArgsMappingRule.SUPPRESSED)
    def worker_4(self, custom_input: int = 10) -> int:
        # Ignores return value from worker_3, uses custom input
        return custom_input + 1
Note
  1. AS_IS is the default mapping rule when not specified
  2. UNPACK requires exactly one dependency and a list/tuple/dict return value
  3. MERGE combines all predecessor outputs into a single tuple argument
  4. SUPPRESSED allows workers to ignore dependency outputs completely
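
The examples above use AS_IS with a single dependency and UNPACK with a tuple. Two further cases follow from the rule descriptions but are not shown: with several dependencies, AS_IS supplies one positional argument per predecessor in dependency order, and a dict return value under UNPACK is presumably expanded into keyword arguments. The sketch below illustrates that reading; the class and worker names are illustrative, and the dict-to-keyword-argument behavior is an assumption rather than confirmed behavior.

class MappingVariants(GraphAutoma):
    @worker(is_start=True)
    def source_a(self, user_input: int) -> int:
        return user_input + 1

    @worker(dependencies=["source_a"])
    def source_b(self, a_out: int) -> int:
        return a_out * 10

    # AS_IS with two dependencies: one positional argument per predecessor,
    # in the order the dependencies are declared (our reading of the rule).
    @worker(dependencies=["source_a", "source_b"], args_mapping_rule=ArgsMappingRule.AS_IS)
    def combine(self, a_out: int, b_out: int) -> dict:
        return {"total": a_out + b_out, "scale": 2}

    # UNPACK with a dict return value: assumed to map keys to keyword arguments.
    @worker(dependencies=["combine"], args_mapping_rule=ArgsMappingRule.UNPACK, is_output=True)
    def finish(self, total: int, scale: int) -> int:
        return total * scale
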
Source code in bridgic/core/types/_common.py
class ArgsMappingRule(Enum):
    """
    Enumeration of Arguments Mapping rules for worker parameter passing.

    ArgsMappingRule defines how the return values from predecessor workers are mapped 
    to the parameters of the current worker. This controls the data flow between workers 
    in an automa execution graph.

    Attributes
    ----------
    AS_IS: Enum
        Preserves the exact order and types of return values from predecessor workers.
        No unpacking or merging is performed.
    UNPACK: Enum
        Unpacks the return value from the predecessor worker and passes its elements 
        as individual arguments. Only valid when the current worker has exactly one 
        dependency and the return value is a list, tuple, or dict.
    MERGE: Enum
        Merges all return values from predecessor workers into a single tuple as the 
        only argument of the current worker.
    SUPPRESSED: Enum
        Suppresses all return values from predecessor workers. No arguments are passed 
        to the current worker from its dependencies.

    Examples
    --------
    ```python
    class MyAutoma(GraphAutoma):
        @worker(is_start=True)
        def worker_0(self, user_input: int) -> int:
            return user_input + 1

        @worker(dependencies=["worker_0"], args_mapping_rule=ArgsMappingRule.AS_IS)
        def worker_1(self, worker_0_output: int) -> int:
            # Receives the exact return value from worker_0
            return worker_0_output + 1

        @worker(dependencies=["worker_0"], args_mapping_rule=ArgsMappingRule.UNPACK)
        def worker_2(self, user_input: int, result: int) -> int:
            # Unpacks the return value from worker_0 (assuming it returns a tuple)
            return user_input + result

        @worker(dependencies=["worker_0", "worker_1"], args_mapping_rule=ArgsMappingRule.MERGE)
        def worker_3(self, all_results: tuple) -> int:
            # Receives all results as a single tuple
            return sum(all_results)

        @worker(dependencies=["worker_3"], args_mapping_rule=ArgsMappingRule.SUPPRESSED)
        def worker_4(self, custom_input: int = 10) -> int:
            # Ignores return value from worker_3, uses custom input
            return custom_input + 1
    ```

    Note
    ----
    1. AS_IS is the default mapping rule when not specified
    2. UNPACK requires exactly one dependency and a list/tuple/dict return value
    3. MERGE combines all predecessor outputs into a single tuple argument
    4. SUPPRESSED allows workers to ignore dependency outputs completely
    """
    AS_IS = "as_is"
    UNPACK = "unpack"
    MERGE = "merge"
    SUPPRESSED = "suppressed"

From dataclass

Bases: ArgsDescriptor

Implements arguments injection for worker parameters, with an optional default value.

When a worker needs the output of another worker but does not directly depend on it in execution, you can use From to declare arguments injection in its parameters.

Attributes:

key (str)
    The key of the worker to inject arguments from.

default (Optional[Any])
    The default value of the argument, returned when the specified worker does not exist.

Examples:

class MyAutoma(GraphAutoma):
    @worker(is_start=True)
    def worker_0(self, user_input: int) -> int:
        return user_input + 1

    @worker(dependencies=["worker_0"])
    def worker_1(self, worker_0_output: int) -> int:
        return worker_0_output + 1

    @worker(dependencies=["worker_1"], is_output=True)
    def worker_2(self, worker_1_output: int, worker_0_output: int = From("worker_0", 1)) -> int:
        # needs the output of worker_0 but does not directly depend on it in execution
        print(f'worker_0_output: {worker_0_output}')
        return worker_1_output + 1

Returns:

Any
    The output of the worker specified by the key.

Raises:

WorkerArgsInjectionError
    If the worker specified by the key does not exist and no default value is set.

Note
  1. A default value can be set for a From declaration; it will be returned when the specified worker does not exist.
  2. A WorkerArgsInjectionError is raised if the worker specified by the key does not exist and no default value is set.
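
A compact sketch of the fallback behavior described in the notes above: the worker key "maybe_missing" is deliberately absent from the graph, so the injection falls back to the declared default. The class and parameter names are illustrative.

class FallbackAutoma(GraphAutoma):
    @worker(is_start=True)
    def produce(self, user_input: int) -> int:
        return user_input + 1

    @worker(dependencies=["produce"], is_output=True)
    def consume(self, produced: int, optional: int = From("maybe_missing", 0)) -> int:
        # "maybe_missing" does not exist in this graph, so the default 0 is injected.
        # Written as From("maybe_missing") with no default, the same parameter would
        # raise WorkerArgsInjectionError instead.
        return produced + optional
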
Source code in bridgic/core/automa/args/_args_descriptor.py
@dataclass
class From(ArgsDescriptor):
    """
    Implements arguments injection for worker parameters, with an optional default value.

    When a worker needs the output of another worker but does not directly depend on 
    it in execution, you can use `From` to declare an arguments injection in 
    its parameters.

    Attributes
    ----------
    key : str
        The key of the worker to inject arguments from.
    default : Optional[Any]
        The default value of the arguments.

    Examples
    --------
    ```python
    class MyAutoma(GraphAutoma):
        @worker(is_start=True)
        def worker_0(self, user_input: int) -> int:
            return user_input + 1

        @worker(dependencies=["worker_0"])
        def worker_1(self, worker_0_output: int) -> int:
            return worker_0_output + 1

        @worker(dependencies=["worker_1"], is_output=True)
        def worker_2(self, worker_1_output: int, worker_0_output: int = From("worker_0", 1)) -> int:
            # needs the output of worker_0 but does not directly depend on it in execution
            print(f'worker_0_output: {worker_0_output}')
            return worker_1_output + 1
    ```

    Returns
    -------
    Any
        The output of the worker specified by the key.

    Raises
    ------
    WorkerArgsInjectionError
        If the worker specified by the key does not exist and no default value is set.

    Note
    ----
    1. A default value can be set for a `From` declaration; it will be returned when the specified worker does not exist.
    2. A `WorkerArgsInjectionError` is raised if the worker specified by the key does not exist and no default value is set.
    """
    key: str
    default: Optional[Any] = InjectorNone()

System dataclass

Bases: ArgsDescriptor

Implements system-level arguments injection for worker parameters.

System provides access to automa-level resources and context through arguments injection. It supports pattern matching for different types of system resources.

Attributes:

key (str)
    The system resource key to inject. Supported keys:
    - "runtime_context": Runtime context for data persistence across worker executions.
    - "automa": Current automa instance.
    - "automa:worker_key": Sub-automa instance in the current automa.

Examples:

def worker_1(x: int, current_automa = System("automa")) -> int:
    # Access current automa instance
    current_automa.add_worker(
        key="sub_automa",
        worker=SubAutoma(),
        dependencies=["worker_1"]
    )
    return x + 1

class SubAutoma(GraphAutoma):
    @worker(is_start=True)
    def worker_0(self, user_input: int) -> int:
        return user_input + 1

class MyAutoma(GraphAutoma):
    @worker(is_start=True)
    def worker_0(self, user_input: int, rtx = System("runtime_context")) -> int:
        # Access runtime context for data persistence
        local_space = self.get_local_space(rtx)
        count = local_space.get("count", 0)
        local_space["count"] = count + 1

        self.add_func_as_worker(
            key="worker_1",
            func=worker_1,
            dependencies=["worker_0"]
        )

        return user_input + count

    @worker(dependencies=["worker_1"])
    def worker_2(self, worker_1_output: int, sub_automa = System("automa:sub_automa")) -> int:
        # Access sub-automa from worker_1
        sub_automa.add_worker(
            key="worker_3",
            worker=SubAutoma(),
            dependencies=["worker_2"],
            is_output=True,
        )
        return worker_1_output + 1

Returns:

Any
    The system resource specified by the key:
    - RuntimeContext: for "runtime_context".
    - AutomaInstance: for the current automa instance or a sub-automa instance from the current automa.

Raises:

WorkerArgsInjectionError
    - If the key pattern is not supported.
    - If the specified resource does not exist.
    - If the specified resource is not an Automa.
Note
  1. "runtime_context" provides a RuntimeContext instance for data persistence
  2. "automa" provides access to the current automa instance
  3. "automa:worker_key" provides access to a sub-automa from the specified worker key
Source code in bridgic/core/automa/args/_args_descriptor.py
@dataclass
class System(ArgsDescriptor):
    """
    Implements system-level arguments injection for worker parameters.

    System provides access to automa-level resources and context through arguments 
    injection. It supports pattern matching for different types of system resources.

    Attributes
    ----------
    key : str
        The system resource key to inject. Supported keys:
        - "runtime_context": Runtime context for data persistence across worker executions.
        - "automa": Current automa instance.
        - "automa:worker_key": Sub-automa instance in current automa.

    Examples
    --------
    ```python
    def worker_1(x: int, current_automa = System("automa")) -> int:
        # Access current automa instance
        current_automa.add_worker(
            key="sub_automa",
            worker=SubAutoma(),
            dependencies=["worker_1"]
        )
        return x + 1

    class SubAutoma(GraphAutoma):
        @worker(is_start=True)
        def worker_0(self, user_input: int) -> int:
            return user_input + 1

    class MyAutoma(GraphAutoma):
        @worker(is_start=True)
        def worker_0(self, user_input: int, rtx = System("runtime_context")) -> int:
            # Access runtime context for data persistence
            local_space = self.get_local_space(rtx)
            count = local_space.get("count", 0)
            local_space["count"] = count + 1

            self.add_func_as_worker(
                key="worker_1",
                func=worker_1,
                dependencies=["worker_0"]
            )

            return user_input + count

        @worker(dependencies=["worker_1"])
        def worker_2(self, worker_1_output: int, sub_automa = System("automa:sub_automa")) -> int:
            # Access sub-automa from worker_1
            sub_automa.add_worker(
                key="worker_3",
                worker=SubAutoma(),
                dependencies=["worker_2"],
                is_output=True,
            )
            return worker_1_output + 1
    ```

    Returns
    -------
    Any
        The system resource specified by the key:
        - RuntimeContext: For "runtime_context"
        - AutomaInstance: For current automa instance or a sub-automa instance from the current automa.

    Raises
    ------
    WorkerArgsInjectionError
        - If the key pattern is not supported.
        - If the specified resource does not exist.
        - If the specified resource is not an Automa.

    Note
    ----
    1. "runtime_context" provides a `RuntimeContext` instance for data persistence
    2. "automa" provides access to the current automa instance
    3. "automa:worker_key" provides access to a sub-automa from the specified worker key
    """
    key: str

    def __post_init__(self):
        allowed_patterns = [
            r"^runtime_context$",
            r"^automa:.*$",
            r"^automa$",
        ]

        if not any(re.match(pattern, self.key) for pattern in allowed_patterns):
            raise WorkerArgsInjectionError(
                f"Key '{self.key}' is not supported. Supported keys: "
                f"`runtime_context`: a context for data persistence of the current worker; "
                f"`automa`: the current automa instance; "
                f"`automa:.*`: a sub-automa in the current automa."
            )