I just realized Python has become so unfamiliar to me. A recap is needed.
This note captures the ‘interesting’ or ‘tricky’ bits of Python, based on my understanding.
asyncio package
- Functions defined with `async def` are coroutines.
- An `asyncio.Task` object is a task, created using `asyncio.create_task()`. A `Task` can be `cancel()`ed, and can also be `await`ed.
 
- It has a `Queue`: when `Queue.put()` is called, its internal counter of unfinished tasks is incremented; when `Queue.task_done()` is called, the counter is decremented. However, `Queue.get()` just returns an item; it does not decrement the counter! (See the sketch after this list.)
- `Future` is a low-level object representing an eventual result of an async operation.
- There are `asyncio` versions of the synchronization primitives: locks, events, conditions, semaphores, etc.
[python-asyncio-examples|Python asyncIO Examples]
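A minimal sketch of the `Queue` counter behavior described above (`put()` increments the unfinished-task counter, `task_done()` decrements it, `get()` does not):

```python
import asyncio

async def worker(q: asyncio.Queue):
    while True:
        item = await q.get()   # returns an item, but does NOT touch the counter
        print(f"processing {item}")
        q.task_done()          # this is what decrements the unfinished-task counter

async def main():
    q = asyncio.Queue()
    for i in range(3):
        q.put_nowait(i)        # each put increments the unfinished-task counter
    task = asyncio.create_task(worker(q))
    await q.join()             # blocks until the counter drops back to zero
    task.cancel()

asyncio.run(main())
```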
IMPORTANT
`asyncio` still just simulates concurrency without introducing multiple threads in Python. So, under the hood, it is still ‘single threaded’, i.e. atomicity does not really need to be discussed (I figure). More here: [why-saying-asyncio-future-is-generally-not-thread-safe]
This is probably true:
- If you use `asyncio` anywhere in your code, your main function will likely need to be `async def`'ed.
BTW you need to explicitly start the asyncio event loop. In most cases it would be:
```python
async def main():
    ...

if __name__ == "__main__":
    asyncio.run(main())  # This starts the event loop.
```
Or, you can also explicitly call `asyncio.run(foo())` in your normal sync code, assuming your `foo()` function is the top-level `async def`.
concurrent.futures package
- Provides high-level interfaces for asynchronously executing callables.
- Simplifies multi-threading + multi-processing.
- Under the `Executor` base class (abstract):
  - You use `ThreadPoolExecutor` for I/O-bound tasks.
  - You use `ProcessPoolExecutor` for CPU-bound things (sigh, python…).
- `Future` object:
  - State poke: `done()`, `running()`, `cancelled()`.
  - Blocking: `result()`, `exception()`.
  - Callback (wow): `add_done_callback(fn)`.
Common helpers:
- `<your executor, like ThreadPoolExecutor>.submit(fn, *args, **kwargs)`
- `<your executor>.map(func, *iterables, timeout=None, chunksize=1)`
  - Return is an iterator-typed object (can be used in `for ... in` or converted to a list).
    - [example-of-python-futureexecutormap]
  - Note `chunksize` defaults to 1, but it is NOT used by `ThreadPoolExecutor` at all! This parameter is used by `ProcessPoolExecutor`, though.
    - More about multi-process: [second-thought-about-python-multi-process]
  - Order is guaranteed! (just like the built-in `map`)
    - Of course there is a ‘cost’ to this order guarantee: if A is still blocking, B will not yield its result.
  - (For both thread and process pools) the iterable is read/collected upfront when `map()` is called. I.e. if your iterable is HUGE or INFINITE (e.g. your array is huge, or you are using `itertools.count()`), this `map()` won't work for you.
    - This is different from the built-in `map()`.
    - More about ‘iter’: Iterator vs iterable
  - Error handling: if any of the executed functions raises an exception, the same exception is re-raised when you retrieve the corresponding result from the iterator (i.e. when `next()` is called).
  - Best practice: always use `with` with executors.
  - (See the sketch after this list.)
- `concurrent.futures.as_completed(futures, timeout=None)`
  - Returns an iterator that yields `Future` instances as they complete.
  - Useful when you want to process results ASAP.
- `concurrent.futures.wait(futures, timeout=None, return_when=ALL_COMPLETED)`
  - Waits until all are done (or you can set it to block until the first completes, or the first raises an exception).
  - It returns a pair of sets: `(done_futures, not_done_futures)`.
  - (See the sketch below.)
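A minimal sketch of these helpers (`work()` is a made-up function): `Executor.map()` preserving input order and re-raising a worker's exception on `next()`, then `as_completed()` and `wait()`:

```python
import concurrent.futures
import time

def work(n):
    time.sleep(0.1 * n)
    if n == 3:
        raise ValueError(f"bad input: {n}")
    return n * n

with concurrent.futures.ThreadPoolExecutor(max_workers=4) as pool:
    # map(): lazy iterator, results yielded in input order (16, 1, 4, then the error).
    results = pool.map(work, [4, 1, 2, 3])
    try:
        for r in results:
            print("map result:", r)
    except ValueError as e:
        print("re-raised on next():", e)   # the worker's exception surfaces here

    # as_completed(): yields futures in completion order, not submit order.
    futures = [pool.submit(work, n) for n in (2, 1)]
    for fut in concurrent.futures.as_completed(futures):
        print("completed:", fut.result())  # 1 first, then 4

    # wait(): block until a condition is met; returns the (done, not_done) sets.
    done, not_done = concurrent.futures.wait(
        futures, return_when=concurrent.futures.ALL_COMPLETED
    )
    print(len(done), "done,", len(not_done), "pending")
```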
 
Diff between `asyncio.Future` and `concurrent.futures.Future`:
- `asyncio.Future` can be `await`ed; I don't think you can do that on a `concurrent.futures.Future` directly. (`concurrentFuture.result()` is a blocking call, not an awaitable; the supported bridge is `asyncio.wrap_future()`, which wraps it into an `asyncio.Future`.)
- `asyncio.Future` is not thread-safe but the `concurrent.futures.Future` one is? But WDYM by ‘not thread safe’ here?
 
TIP
In most cases, you don't need to use `asyncio.Future` in your application-level code.
You typically encounter and interact with `asyncio.Future` when:
- You’re writing low-level asyncio components: For example, implementing custom network protocols or event loop integrations.
- Bridging synchronous and asynchronous code: when you use `loop.run_in_executor()`, it wraps the `concurrent.futures.Future` returned by the executor into an `asyncio.Future`, making it awaitable. (See the sketch after this list.)
- Working with callback-based libraries: if you have an existing callback-based asynchronous library, `asyncio.Future` can serve as a bridge to make its results awaitable. (What???)
[examples-of-asyncio-future-as-low-level-bridge]
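A minimal sketch of the `run_in_executor()` bridge mentioned above; `blocking_io()` is a made-up stand-in for any blocking call:

```python
import asyncio
import time

def blocking_io():
    time.sleep(1)          # stands in for a blocking file/network call
    return "done"

async def main():
    loop = asyncio.get_running_loop()
    # Runs blocking_io() in the default ThreadPoolExecutor; the returned object
    # is awaitable even though the work happens in a worker thread.
    result = await loop.run_in_executor(None, blocking_io)
    print(result)

asyncio.run(main())
```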
conventions on import
import vs from import
Google prefers `import xxx` over `from xxx import yyy`.
Exception: you can use `from module_name import symbol_name` for:
- Standard library modules: If the module name is much longer than the symbol name (e.g., from collections.abc import Sequence).
- Type hinting: from typing import List, Dict.
- Aliasing: from my_very_long_module_name import MyClass as ShortClass.
- Re-exporting symbols: in `__init__.py` files, to create a public API for a package.
Alias
- import numpy as np
- import very_long_module_name_that_is_annoying_to_type as shortname
Import is not transitive
- If module A imports module B, and module X imports module A but not module B explicitly, can I access module B's members via `A.B.SomethingInB`?
  - Module imports in Python are not transitive: `A.B.SomethingInB` will not work…
  - …unless module A explicitly makes module B an attribute of itself.
  - Basically, `A.B` is an attribute lookup on module A; if that attribute happens to be module B, then `A.B` will be module B.
Example of ‘marking B as an attribute of A’:
```python
import B as _B_internal
B = _B_internal  # Now 'B' is an attribute of the A module object
```
Future import
What are Python future imports? (e.g., `from __future__ import annotations`)
- a special kind of import statement in Python. Its purpose is to allow a module or script to enable features that are not yet the default behavior in the current Python version but are scheduled to become standard (or have already become standard) in a future Python version 😨
- Common these days: `from __future__ import annotations` (PEP 563). (See the sketch after this list.)
  - Without it (note: the stringified behavior was originally slated to become the default in Python 3.10, but that was deferred): when the Python interpreter encounters a type hint in a function or variable annotation, it tries to evaluate that hint immediately, at the time the function or class is defined.
    - Forward references issue: if you have a type hint that refers to a class defined later in the same file, or a class that refers to itself within its own type hints, you would traditionally need to write the type hint as a string (e.g., `name: 'MyClass'`) to avoid a `NameError`.
      - Holy.
    - Circular dependency issue: complex type hints involving imported classes could sometimes contribute to import cycle issues.
    - Startup performance issue.
  - With it:
    - Python treats all type annotations as strings at runtime. They won't be evaluated when the function or class is defined.
- Other examples (already in history)
- `print_function`, so people can use `print()` in Python 2
- `division`, so `/` performs true division and `//` performs floor division in Python 2
- `unicode_literals`, to make all string literals in Python 2 be unicode strings
- `generator_stop` (PEP 479), which turns a `StopIteration` raised inside a generator into a `RuntimeError` instead of silently ending the generator
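A minimal sketch of why `from __future__ import annotations` helps with forward references (a self-referencing `Node` class of my own invention):

```python
from __future__ import annotations

class Node:
    # Without the future import, `Node | None` would be evaluated eagerly here,
    # raising a NameError (Node is not fully defined yet); you would have to
    # write the hint as a string: "Node | None".
    def __init__(self, value: int, next_node: Node | None = None):
        self.value = value
        self.next_node = next_node

n = Node(1, Node(2))
print(n.next_node.value)  # 2
```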
 
Decorator
Some typical uses of decorators (Flask):
```python
from flask import Flask

app = Flask(__name__)

@app.route('/')
def index():
    return "Hello"
```
The `app` here is an instance object of the `Flask` class. It is an instance! A piece of dynamic data! Not a static class type, or a function definition, or signature, or any static data. It is dynamic.
the diff between class decorator and typical function decorator
- Can decorators apply to a ‘class’ too?
  - Yes. The ‘wrapped’ item just becomes a ‘class’ (the type itself, the class object).
  - It can:
    - add attributes or methods,
    - register classes (as Django models and SQLAlchemy do),
    - modify class behavior (like `@functools.total_ordering`),
    - or just wrap the class.
  - (See the sketch just before the `CallCounter` example below.)
- For a class-based decorator, the `__init__()` method should take the ‘func’ argument; decorating returns an ‘instance’ of the decorator class.
- Then the `__call__()` method is the one that gets called (as the instance replaces the original func) when the decorated function is called. So `__call__()` should have the same signature as the original func.
  - Compared with a typical function decorator, the decorator function itself matches up with `__init__`, and the wrapper returned by the decorator function is the counterpart of `__call__`.
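Before the class-as-decorator example below, here is a minimal sketch of the other direction (a plain decorator applied to a class); the registry and the added `describe` method are my own illustration, not any real library's API:

```python
REGISTRY = {}

def register(cls):
    REGISTRY[cls.__name__] = cls                            # register the class
    cls.describe = lambda self: f"I am a {cls.__name__}"    # add a method
    return cls                                              # return the (same) class

@register
class Widget:
    pass

print(REGISTRY)             # {'Widget': <class '__main__.Widget'>}
print(Widget().describe())  # I am a Widget
```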
An example using a class decorator might make this clearer (note that the wrapper is now a callable instance of the `CallCounter` class, and `greet` is now bound to that wrapper instance!):
```python
# Class decorator example (counting calls)
import functools
 
class CallCounter:
    def __init__(self, func):
        functools.update_wrapper(self, func) # Copies metadata
        self.func = func
        self.count = 0
 
    def __call__(self, *args, **kwargs):
        self.count += 1
        print(f"{self.func.__name__} has been called {self.count} times.")
        return self.func(*args, **kwargs)
 
@CallCounter
def greet(name):
    print(f"Hello, {name}!")
 
greet("Alice") # Hello, Alice! (greet has been called 1 times.)
greet("Bob")   # Hello, Bob! (greet has been called 2 times.)
print(greet.count) # Accessing state: 2
```
NOTE
I figure, compared with the ‘decorator’ pattern in C++ (and probably also Java), Python function decorators have an interesting feature: the decorator itself can come from a class method, so applying the decorator (not yet calling the decorated function) involves calling a method on an instance of a class (like `app.route` above).
- This is probably only possible because Python is a dynamic language (i.e. interpreted).
what is @functools.wraps(func)?
- It is a decorator used inside other decorators, to make the wrapper function (the one your decorator creates) “look like” the original function it wraps.
- Normally, the new wrapper function has its own name, docstring, and other metadata.
- This can be problematic for debugging, introspection (tools that examine your code), and documentation.
- Using `@functools.wraps` is a best practice when writing decorators, to maintain the integrity of the original function's metadata.
- `functools.wraps` copies attributes like `__name__`, `__doc__`, `__module__`, `__qualname__`, and `__annotations__` from the original function (`func`) to the wrapper function, and sets `__wrapped__` so that tools can recover the original argument signature. (See the sketch below.)
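A minimal sketch of a typical function decorator using `functools.wraps` (this is also the plain-function counterpart of the `CallCounter` class decorator above); `log_calls` is my own made-up example:

```python
import functools

def log_calls(func):
    @functools.wraps(func)            # copies __name__, __doc__, etc. onto wrapper
    def wrapper(*args, **kwargs):
        print(f"calling {func.__name__}")
        return func(*args, **kwargs)
    return wrapper

@log_calls
def greet(name):
    """Say hello."""
    print(f"Hello, {name}!")

greet("Alice")
print(greet.__name__, greet.__doc__)  # greet Say hello.  (thanks to wraps)
```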
 
what else is there in functools
Also is it a module that I always need to import? → probably yes.
- `lru_cache(maxsize=128, typed=False)` for expensive (e.g. I/O-bound) functions.
- `partial(func, /, *args, **kwargs)`: useful for creating specialized versions of functions with some arguments fixed.
- `reduce(func, iterable[, initial])`: apply a function of two arguments cumulatively to the items of a sequence or iterable, from left to right, so as to reduce the iterable to a single value.
  - For example, `reduce(lambda x, y: x + y, [1, 2, 3, 4, 5])` calculates `((((1+2)+3)+4)+5)` (i.e. this is a sum).
- `total_ordering`: a class decorator that fills in missing comparison methods (`__lt__`, `__le__`, `__gt__`, `__ge__`). If a class defines `__eq__` and at least one ordering method (e.g., `__lt__`), `@total_ordering` will supply the rest. This can save a lot of boilerplate. 👍 (See the sketch after this list.)
- `cached_property`: a decorator that transforms a method of a class into a property whose value is computed once and then cached as an ordinary attribute for the life of the instance (similar to Go's `sync.Once` and `sync.OnceValue`).
  - It won't make sense if the method takes more than just the `self` argument: property access calls the method with only `self`, so any extra required parameter raises a `TypeError` when the attribute is accessed.
- `singledispatch` and `singledispatchmethod`: decorators that transform a function or method into a generic function (also known as a single-dispatch generic function). 😨
  - Useful for writing functions that behave differently for different data types without complex if/elif/else type checks. Example below.
  - Still lots of ❓ here.
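Before the `singledispatch` example, a minimal sketch of `total_ordering` and `cached_property` (the `Version` and `Dataset` classes are my own illustration):

```python
import functools

@functools.total_ordering
class Version:
    def __init__(self, major, minor):
        self.major, self.minor = major, minor

    def __eq__(self, other):
        return (self.major, self.minor) == (other.major, other.minor)

    def __lt__(self, other):
        return (self.major, self.minor) < (other.major, other.minor)
    # total_ordering fills in __le__, __gt__, __ge__ from __eq__ and __lt__.

class Dataset:
    def __init__(self, path):
        self.path = path

    @functools.cached_property
    def stats(self):
        print("computing stats (runs only once)...")
        return {"rows": 1_000_000}      # stands in for an expensive computation

print(Version(1, 2) >= Version(1, 1))   # True, via the generated __ge__
d = Dataset("data.csv")
print(d.stats)                          # computes and caches
print(d.stats)                          # cached; no recompute message
```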
 
Example of singledispatch:
```python
import functools
 
@functools.singledispatch
def process(data):
    print(f"Processing generic data: {data}")
 
@process.register(int)
def _(data): # The name of the specialized function doesn't matter for dispatch
    print(f"Processing an integer: {data * 2}")
 
@process.register(str)
def _(data):
    print(f"Processing a string: {data.upper()}")
 
@process.register(list)
def _(data):
    print(f"Processing a list: {sum(data)}")
 
process(10)         # Output: Processing an integer: 20
process("hello")    # Output: Processing a string: HELLO
process([1, 2, 3])  # Output: Processing a list: 6
process(3.14)       # Output: Processing generic data: 3.14
```
Private/Built-in Attributes
What is __annotations__?
- a dictionary containing function parameter and return value annotations (type hints).
What is __module__, __name__, __doc__?
- `__name__` is just the base name of the class/function/method.
- `__module__`: the name of the module; `__doc__` is the docstring.
what metadata is there on a function?
- `__defaults__`, `__kwdefaults__`, `__code__`, `__closure__`, `__globals__`, `__dict__` (see the sketch below)
  - `__globals__`: a reference to the dict that holds the function's global variables (the global namespace of the module in which the function was defined).
  - `__closure__`: a tuple of cells containing bindings (by reference, via cell objects) for the function's free variables; `None` if the function is not a closure.
  - `__dict__`: just arbitrary attributes assigned to the function, like on any other object.
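A minimal sketch poking at this function metadata (`make_adder` is a made-up helper):

```python
def make_adder(n):
    def add(x, bonus=0):
        return x + n + bonus
    return add

add5 = make_adder(5)
print(add5.__defaults__)                  # (0,) -> default for 'bonus'
print(add5.__code__.co_freevars)          # ('n',) -> free variable captured from make_adder
print(add5.__closure__[0].cell_contents)  # 5 -> value held in the closure cell
print(add5.__globals__ is globals())      # True -> same module namespace

add5.custom_tag = "made by make_adder"    # stored in add5.__dict__
print(add5.__dict__)                      # {'custom_tag': 'made by make_adder'}
```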
 
What is __qualname__?
- Qualified name: it is basically the full dotted path of a function/method or a class (remember we can have nested classes and nested functions/methods).
- Example:
```python
def top_level_func():
    pass
 
class MyClass:
    def method(self):
        pass
 
    class NestedClass:
        def nested_method(self):
            def deeply_nested_func():
                pass
            print(f"Deeply Nested Func __qualname__: {deeply_nested_func.__qualname__}")
        print(f"Nested Method __qualname__: {nested_method.__qualname__}")
 
print(f"Top Level Func __name__: {top_level_func.__name__}")
print(f"Top Level Func __qualname__: {top_level_func.__qualname__}")
 
print(f"MyClass __name__: {MyClass.__name__}")
print(f"MyClass __qualname__: {MyClass.__qualname__}")
 
instance = MyClass()
print(f"Method __name__: {instance.method.__name__}") # For methods, __name__ is just the method name
print(f"Method __qualname__: {instance.method.__qualname__}") # Shows Class.method
 
nested_instance = MyClass.NestedClass()
print(f"NestedClass __name__: {MyClass.NestedClass.__name__}")
print(f"NestedClass __qualname__: {MyClass.NestedClass.__qualname__}")
nested_instance.nested_method()
# nested_instance.nested_method.deeply_nested_func() # This won't work directly
## Output (note the nested-method line prints first, at class definition time):
Nested Method __qualname__: MyClass.NestedClass.nested_method
Top Level Func __name__: top_level_func
Top Level Func __qualname__: top_level_func
MyClass __name__: MyClass
MyClass __qualname__: MyClass
Method __name__: method
Method __qualname__: MyClass.method
NestedClass __name__: NestedClass
NestedClass __qualname__: MyClass.NestedClass
Deeply Nested Func __qualname__: MyClass.NestedClass.nested_method.<locals>.deeply_nested_func
```
Pydantic
what is pydantic?
- Looks like `pydantic` handles data parsing and validation quite well (including JSON, for sure).
- Can marshal/unmarshal JSON to/from models and dictionaries.
- It uses standard type hints. So good support in IDE.
- Widely used in Flask, FastAPI, Django, etc, for request/response validation.
With a Pydantic model, we don't write:
```python
class User:
    def __init__(self, id, name):
        self.id = id
        self.name = name
```
We write:
```python
from pydantic import BaseModel

class User(BaseModel):
    id: int
    name: str
```
- Learn more about typical use patterns of pydantic. (See the sketch below.)
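A minimal sketch of typical Pydantic usage (assuming Pydantic v2 APIs such as `model_validate_json` and `model_dump`):

```python
from pydantic import BaseModel, ValidationError

class User(BaseModel):
    id: int
    name: str

u = User.model_validate_json('{"id": "7", "name": "Ada"}')  # parses JSON, coerces "7" -> 7
print(u.id, u.name)        # 7 Ada
print(u.model_dump())      # {'id': 7, 'name': 'Ada'}
print(u.model_dump_json()) # {"id":7,"name":"Ada"}

try:
    User(id="not-a-number", name="Bob")
except ValidationError as e:
    print(e)               # explains which field failed validation
```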
Iterator vs iterable
If the object has:
- `__iter__` or `__getitem__`, then the object is iterable.
  - The return value of `__iter__` should be an iterator.
An iterator should:
- implement `__iter__` (just return itself)
- implement the `__next__` method, and when there are no more items, raise a `StopIteration` exception.
(See the sketch below.)
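A minimal sketch of the iterable/iterator split (`CountDown` and its iterator are my own illustration):

```python
class CountDown:
    """Iterable: knows how to produce a fresh iterator."""
    def __init__(self, start):
        self.start = start

    def __iter__(self):
        return CountDownIterator(self.start)

class CountDownIterator:
    """Iterator: carries the iteration state."""
    def __init__(self, current):
        self.current = current

    def __iter__(self):
        return self              # an iterator returns itself

    def __next__(self):
        if self.current <= 0:
            raise StopIteration  # signals exhaustion
        self.current -= 1
        return self.current + 1

print(list(CountDown(3)))  # [3, 2, 1]
```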
what is itertools.count()?
It is a function from the built-in `itertools` module that returns an iterator generating an infinite arithmetic progression of numbers.
- A count that keeps going forever.
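A tiny sketch: since `itertools.count()` never ends, pair it with an explicit stop condition:

```python
import itertools

for i in itertools.count(start=10, step=5):
    if i > 25:
        break
    print(i)  # 10, 15, 20, 25
```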