Introspecting Python Parameter Values via Argument Binding
Sometimes it is important to map function arguments to their parameter names. Python offers basic keyword arguments - which will be discussed below - but if arguments are passed positionally, or are handled by "catch-alls" such as *args or **kwargs, then it quickly becomes more difficult to deterministically map parameter names to their associated values in a function call.
Some situations where this mapping may be desired include:
- Observability tools and call stack tracing - Seeing what functions were called with what arguments helps determine how efficiently a program is running, or can help debug it when things go wrong.
- Determining function/task ownership for audit purposes - If your system is a product used by organizations and users, it is important to have an easy, low-overhead way to determine what resources are being affected by a function call and who requested the change.
- Argument modification - Sometimes a function's arguments need to be validated or modified before being used, and creating a decorator that can handle any arbitrary parameter definition may be useful.
In this post, we will dive into some more magical Python libraries and utilities that allow us to wrap and perform certain operations against arbitrary function parameters. I will start with some Python basics - feel free to skip those sections as needed - before diving into some of the Signature magic and its real-world use cases.
Python, like most other programming languages, has the ability to define functions. These functions usually define input parameters and return values. Argument values can then be provided to the function, which will perform some work and return a result. Note that I will try to use Python annotations where possible to make parameter types more clear.
def add(num_1: int, num_2: int) -> int:
    """Adds and returns the two arguments
    """
    return num_1 + num_2

add(5, 7) # returns 12
Quick note: Parameters are the variables a function defines in its signature, and arguments are the actual runtime values that satisfy those parameters. In the example above, num_1 and num_2 are the parameter definitions, and the values 5 and 7 are the arguments passed to the function.
Functions in Python can accept arguments in two common ways: as positional arguments and as keyword arguments.
Positional arguments are defined at the beginning of the parameter list, are matched to their parameters based on the order in which they are provided, and have no default values. A positional argument must be provided when the function is called.
In the example above, since the function defines num_1 and then num_2, the first argument passed (5) will be saved to num_1 and the second argument passed (7) will be saved to num_2.
Keyword arguments are defined after positional arguments, and they can provide default values in the parameter definition, meaning that a matching argument does not need to be provided when the function is called. These arguments are provided using the parameter identifier, which is the variable name given to the parameter. If they are given positionally, without the identifier, then they act in the same way as positional arguments.
We could call add using keyword arguments:
add(num_1=10, num_2=8) # returns 18
Notice how we have <identifier>=<value>. We are now using keyword arguments to set num_1 and num_2.
add doesn't define any default parameter values, so let's consider the following function:
def exponential(num, power=2):
    return num ** power
The power parameter above is defined with a default value, so if it is not provided, it will be given the default argument value of 2.
exponential(2) # returns 4
exponential(2, 3) # returns 8
exponential(2, power=3) # returns 8 as well
Knowing how arguments and parameters work in Python, what if we want to write a program that prints out what arguments are given to a function, lining up positional and keyword arguments in a dictionary mapping? We will create a decorator that accomplishes this.
We are going to write the log_function_call decorator function, which will print out the arguments passed to any function it wraps while still passing back the wrapped function's return value.
@log_function_call
def pow(num, power=2):
    return num ** power

pow(5)
# -> "pow was called with {'num': 5, 'power': 2}"
# -> 25

pow(5, 3)
# -> "pow was called with {'num': 5, 'power': 3}"
# -> 125

pow(2, power=4)
# -> "pow was called with {'num': 2, 'power': 4}"
# -> 16
To create this, we will need to solve the problem of determining which given arguments line up with which parameter definitions in the function, so that we can print them out. In a more general sense:
How can we intercept and determine what values are passed to a Python callable?
In this case, a callable is anything that can take arguments and return a value, such as a function, a method, or even a lambda function.
Python has a built-in inspect module that provides many different utility functions and classes which allow the caller to get information and metadata about runtime variables during a program's execution. We can leverage this module and its Signature class to introspect a function's arguments and create a mapping of parameter names back to their values, even if they are passed as positional arguments!
Without the Signature class, only keyword arguments can be easily converted into a dict map, which limits the amount of introspection that can be performed. By using the Signature class, a generic decorator such as log_function_call can convert any arguments given to a function into a map of their parameter names and values. This allows us to build a more flexible decorator that can perform extra actions using a function's arguments, whether they are passed positionally or as keywords.
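To make this concrete before wiring it into a decorator, here is a minimal standalone sketch using the add function from earlier: binding a call's arguments through its Signature yields the parameter-name-to-value mapping directly, whether the values were passed positionally or by keyword. (On older Python versions the mapping prints as an OrderedDict rather than a plain dict.)
from inspect import signature

def add(num_1: int, num_2: int) -> int:
    return num_1 + num_2

sig = signature(add)

# Bind positional arguments to their parameter names
print(sig.bind(5, 7).arguments)
# -> {'num_1': 5, 'num_2': 7}

# Binding keyword arguments produces the same style of mapping
print(sig.bind(num_1=10, num_2=8).arguments)
# -> {'num_1': 10, 'num_2': 8}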
Before diving into how the Signature class can help us with the log_function_call decorator, let's start setting up the decorator itself so that we can understand that code before going further. While I assume some basic knowledge of Python decorators, I will briefly explain what the decorator block looks like.
def log_function_call(func):
    def wrapper(*args, **kwargs):
        """
        Wrap the original function call to print the arguments before
        calling the intended function
        """
        print("TODO: Print the arguments here!")
        return func(*args, **kwargs)
    return wrapper
By wrapping a function with the log_function_call above, we will emit a print line before the function actually gets called. The only issue is that we are printing an unhelpful line instead of our actual argument values.
So we have a decorator that intercepts the function call, with access to the arbitrary positional arguments - captured as the *args tuple - and the keyword arguments - captured as the **kwargs map. By generating a signature of the arbitrarily wrapped function - func in the example - we can merge these two generic argument objects into a single, declarative map of
<parameter name>: <arg value>
which can then be acted upon easily to perform different operations, such as printing out the function call and arguments for auditing's sake!
In the following code, we continue from the previous decorator block with the addition of our introspection code: we generate a signature of the wrapped function and then bind the given arguments to the function's parameters, which gives us a mapping of parameter names to their passed values.
from inspect import signature

def log_function_call(func):
    def wrapper(*args, **kwargs):
        """
        Wrap the original function call to print the arguments before
        calling the intended function
        """
        func_sig = signature(func)
        # Create the argument binding so we can determine what
        # parameters are given what values
        argument_binding = func_sig.bind(*args, **kwargs)
        # Fill in default values for any parameters that were not passed
        argument_binding.apply_defaults()
        argument_map = argument_binding.arguments
        # Perform the print so that it shows the function name
        # and arguments as a dictionary
        print(f"{func.__name__} was called with {argument_map}")
        return func(*args, **kwargs)
    return wrapper
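The same machinery also covers the "catch-all" parameters mentioned at the start of the post. As a small sketch (the report function below is a hypothetical example, not from the original code), the bound arguments collect extra positional values under the *args-style parameter name and extra keyword values under the **kwargs-style parameter name:
@log_function_call
def report(title, *sections, **options):
    return len(sections)

report("Q3", "intro", "summary", verbose=True)
# -> "report was called with {'title': 'Q3', 'sections': ('intro', 'summary'), 'options': {'verbose': True}}"
# -> 2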
So when can this actually be used? Well, for our project, we have a Django web application that implements asynchronous Celery tasks. We can schedule these tasks to be run by a background worker, but we require an audit log to determine who started a task and what organization the task is tied back to.
For instance, if we have a task such as resolve_membership that ensures the right members have the proper permissions, it may take in an organization_id parameter identifying the organization to resolve membership for.
from celery import task

@task(bind=True) # <-- Fanciness that just denotes the function as an async task
def resolve_membership(task: TaskResult, organization_id: int):
    org = Organization.objects.get(id=organization_id)
    do_something_with_the_org(org) # arbitrary code being run
While this task is created, scheduled, started, and completed, its current state and result (SUCCESS, FAILED, QUEUED, ...) will be stored in the database by Celery automatically. This gives us a queryable audit log of all resolve_membership tasks that have been run, but it doesn't easily let us see which tasks correspond to which organizations. As a part of task scheduling, we wanted to create an audit log of these tasks and draw a relationship to what organizations they were affecting, and what - if any - user scheduled the tasks.
We could have performed this audit log creation explicitly in every task, like the following code example, but that would have become more difficult to manage as more tasks were created, or if any changes were made to the custom TaskResult class. This would quickly become too cumbersome and would not result in a DRY program structure.
from celery import task

@task(bind=True) # <-- Fanciness that just denotes the function as an async task
def resolve_membership(task: TaskResult, organization_id: int):
    org = Organization.objects.get(id=organization_id)
    # Assign the organization to the task for audit logging purposes
    # NOTE: this would need to be done on EVERY task
    task.organization = org
    # Perform the actual task needed
    do_something_with_the_org(org) # arbitrary code being run
Using a wrapping decorator like the log_function_call above, we were able to create a single audit_ownership decorator that would wrap each task execution. This decorator inspects the task arguments and, given some static arguments on the decorator itself, creates the same ownership relationship without needing to copy anything more than the decorator line itself. This made it much easier to create the task audit log without having to worry about how the ownership was actually being created, and it reduced all of our individual implementations across tasks to a single function block that was much easier to manage.
from celery import task

@task(bind=True)
@audit_ownership(org_param='organization_id')
def resolve_membership(task: TaskResult, organization_id: int):
    org = Organization.objects.get(id=organization_id)
    # Perform the actual task needed
    do_something_with_the_org(org) # arbitrary code being run
The following code block is a paraphrased version of the decorator. Notice how we are using the Signature class to load the value of a given parameter no matter how it was passed to the function, whether as a positional or keyword argument. Note that the following code is a bit advanced and does require some knowledge of how to define decorators in Python.
from inspect import signature

def audit_ownership(org_param: str, task_param: str = 'task'):
    """
    Wrapping a celery task, this will attempt to create an audit log
    tying the task execution back to an organization using passed arguments

    :param org_param: function parameter that denotes the organization ID
    :param task_param: function parameter that denotes the task instance
    """
    def decorator(func):
        """
        This is the actual task decorator, but nested to allow for parameters
        to be passed on the decorator definition
        """
        def wrapper(*args, **kwargs):
            """
            The wrapper replaces the actual function call and performs the
            needed extra auditing work before calling the original function
            """
            # Create a function signature to introspect the call
            sig = signature(func)
            # Create the argument binding so we can determine what
            # parameters are given what values
            argument_binding = sig.bind(*args, **kwargs)
            argument_map = argument_binding.arguments
            # Using the argument binding and decorator arguments, we can
            # fetch the task and organization, no matter how the function
            # was invoked. The power of argument introspection!!
            task = argument_map[task_param]
            organization_id = argument_map[org_param]
            organization = Organization.objects.get(id=organization_id)
            # The actual logic is abstracted out to be more readable.
            # Assume this function creates the relationship between the
            # task result instance and the organization instance
            assign_ownership_to_task(task, organization)
            # Call the original function as it was originally intended
            return func(*args, **kwargs)
        return wrapper
    return decorator
As an extra bonus, we made the org_param decorator parameter able to also be a callable instead of just a str, so that we could dynamically fetch the associated value. This was useful in situations where organization-owned resources were passed to the task instead of the organization itself, so we could have the callable load the child object and return a reference to its parent's organization.
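Here is a hedged sketch of what that variant could look like - the Document model, the reindex_document task, and the lambda resolver are hypothetical, and our real implementation differs - but it shows how the decorator could branch on whether org_param is callable:
def audit_ownership(org_param, task_param: str = 'task'):
    """
    Same idea as above, but org_param may be either a parameter name (str)
    or a callable that resolves the Organization from the bound arguments
    """
    def decorator(func):
        def wrapper(*args, **kwargs):
            sig = signature(func)
            argument_binding = sig.bind(*args, **kwargs)
            argument_map = argument_binding.arguments

            task = argument_map[task_param]
            if callable(org_param):
                # Hypothetical resolver: given the bound argument map, return
                # the Organization instance that owns the passed-in resource
                organization = org_param(argument_map)
            else:
                organization = Organization.objects.get(id=argument_map[org_param])

            assign_ownership_to_task(task, organization)
            return func(*args, **kwargs)
        return wrapper
    return decorator

# Hypothetical usage: the task only receives a document_id, so the callable
# walks from the child Document back to its parent organization
@task(bind=True)
@audit_ownership(org_param=lambda args: Document.objects.get(id=args['document_id']).organization)
def reindex_document(task: TaskResult, document_id: int):
    ...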
Python decorators that perform operations using a function's provided arguments can rely on function signatures and argument bindings to determine parameter values. This allows programs to utilize generalized decorators that can be reused across an application for operations such as debugging or audit logging. Have some more examples of where this could be useful? Post them below!