snakeoil.deprecation module¶
Deprecation related functionality.
This provides a compatibility shim for Python versions lacking warnings.deprecated, allows basic extra metadata to be attached to each deprecation, and tracks every deprecation created by a registry. That tracking lets tests introspect for deprecations that can now be removed.
To use this, instantiate a registry and then use it to decorate functions (exactly like warnings.deprecated in Python 3.13). The registry keeps a record of each deprecation so code analysis can find things that need to be removed once future conditions are met.
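A minimal usage sketch, assuming the registry instance is called as a decorator factory just like warnings.deprecated, with the extra removal_in metadata keyword mirroring the RecordCallable signature documented below; myproject, old_helper, and new_helper are illustrative names:
>>> from snakeoil.deprecation import Registry
>>> deprecations = Registry("myproject")
>>> @deprecations("old_helper() is deprecated, use new_helper() instead", removal_in=(2, 0, 0))
... def old_helper():
...     return new_helper()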
- class snakeoil.deprecation.RecordCallable(msg: str, removal_in: tuple[int, int, int] | None = None, removal_in_py: tuple[int, int, int] | None = None, *, qualname: str)[source]¶
Bases: Record
- classmethod from_callable(thing: Callable, *args, **kwargs) → RecordCallable[source]¶
- class snakeoil.deprecation.Registry(project: str, /, *, record_class: type[RecordCallable] = <class 'snakeoil.deprecation.RecordCallable'>)[source]¶
Bases: object
Deprecation notice creation and tracking of deprecations.
This is a no-op for Python < 3.13 since it is internally built around warnings.deprecated. It can still be used for compatibility in that case, and .is_enabled reflects whether it can actually create deprecations or is just running in no-op compatibility mode; a short check is sketched after the variable list below.
- Variables:
project – which project these deprecations are for. This is used to restrict analysis of deprecation metadata to that codebase.
frame_depth – warnings have to be issued at the frame that triggered the warning. If you have a deprecated function that reaches up the stack to manipulate a frame's scope, this is the number of frames to subtract when this registry issues a deprecation. Any subclasses that override __call__ must adjust this value.
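The compatibility-mode check mentioned above is useful in tests and optional tooling; a minimal sketch, assuming .is_enabled is a plain boolean attribute (everything else here is illustrative):
>>> from snakeoil.deprecation import Registry
>>> deprecations = Registry("myproject")
>>> if not deprecations.is_enabled:
...     print("no warnings.deprecated on this Python; decorations are no-ops")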
- code_directive(msg: str, removal_in: tuple[int, int, int] | None = None, removal_in_py: tuple[int, int, int] | None = None) → None[source]¶
- expired_deprecations(project_version: tuple[int, int, int], python_version: tuple[int, int, int]) → Iterator[Record][source]¶
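A sketch of a pre-release check built on this method, reusing the deprecations registry from the earlier sketches and assuming it yields the records whose removal_in or removal_in_py thresholds have been reached; the version tuple is illustrative:
>>> import sys
>>> stale = list(deprecations.expired_deprecations((1, 0, 0), sys.version_info[:3]))
>>> assert not stale, f"remove these before releasing 1.0.0: {stale}"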
- module(msg: str, qualname: str, removal_in: tuple[int, int, int] | None = None, removal_in_py: tuple[int, int, int] | None = None) → None[source]¶
Deprecation notice that fires for the first import of this module.
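A hedged sketch of deprecating an entire module from its own top level, again reusing the registry from the earlier sketches; the module name and removal version are illustrative:
>>> # placed at the top of myproject/old_module.py
>>> deprecations.module(
...     "myproject.old_module is deprecated, import myproject.new_module instead",
...     qualname="myproject.old_module",
...     removal_in=(2, 0, 0),
... )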
- project¶
- record_class: type[RecordCallable]¶
- class suppress_deprecations(category=<class 'DeprecationWarning'>, **kwargs)¶
Bases: object
Suppress deprecations within this block. Generators and asyncio.Task require special care to function.
This cannot be used to decorate a generator function. Using it within a generator requires explicit code flow for it to work correctly whilst not causing suppressions outside of the intended usage.
CPython's warnings filtering is designed around ContextVar: context specific to a thread, an asyncio.Task, etc. Warnings filtering modifies a context var, thus suppressions are active only within that context. Generators do not bind to the context they started in; whenever they resume, they resume in the context of whatever resumed them.
Do not do this in a generator:
>>> def f():
...     with suppress_deprecations():
...         yield invoke_deprecated()  # this will be suppressed, but leaks suppression to what consumed us.
...
...         # in resuming, we have no guarantee we're in the same context as before the yield, where our
...         # suppression was added.
...         yield invoke_deprecated()  # this may or may not be suppressed.
You have two options. If you do not need fine-grained control, wrap the generator; this class will interpose between the generator and consumer and prevent this issue. For example:
>>> @suppress_deprecations()
... def f():
...     yield invoke_deprecated()
...     yield invoke_deprecated()
If you need the explicit form, use this:
>>> def f():
...     with suppress_deprecations():
...         value = invoke_deprecated()  # this will be suppressed
...     yield value  # we do not force our suppression on the consumer of the generator
...     with suppress_deprecations():
...         another_value = invoke_deprecated()
...     yield another_value
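Outside generators, plain context-manager or decorator use needs no special care; a minimal sketch, with invoke_deprecated standing in for any call that would raise a DeprecationWarning:
>>> with suppress_deprecations():
...     invoke_deprecated()  # the DeprecationWarning is filtered within this block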
- kwargs¶
- class snakeoil.deprecation.suppress_deprecations(category=<class 'DeprecationWarning'>, **kwargs)[source]¶
Bases: object
Suppress deprecations within this block. Generators and asyncio.Task require special care to function.
This cannot be used to decorate a generator function. Using it within a generator requires explicit code flow for it to work correctly whilst not causing suppressions outside of the intended usage.
CPython's warnings filtering is designed around ContextVar: context specific to a thread, an asyncio.Task, etc. Warnings filtering modifies a context var, thus suppressions are active only within that context. Generators do not bind to the context they started in; whenever they resume, they resume in the context of whatever resumed them.
Do not do this in a generator:
>>> def f():
...     with suppress_deprecations():
...         yield invoke_deprecated()  # this will be suppressed, but leaks suppression to what consumed us.
...
...         # in resuming, we have no guarantee we're in the same context as before the yield, where our
...         # suppression was added.
...         yield invoke_deprecated()  # this may or may not be suppressed.
You have two options. If you do not need fine-grained control, wrap the generator; this class will interpose between the generator and consumer and prevent this issue. For example:
>>> @suppress_deprecations()
... def f():
...     yield invoke_deprecated()
...     yield invoke_deprecated()
If you need the explicit form, use this:
>>> def f():
...     with suppress_deprecations():
...         value = invoke_deprecated()  # this will be suppressed
...     yield value  # we do not force our suppression on the consumer of the generator
...     with suppress_deprecations():
...         another_value = invoke_deprecated()
...     yield another_value
- kwargs¶