
Python Decorators in 12 Steps (2012) - terminalcommand
http://simeonfranklin.com/blog/2012/jul/1/python-decorators-in-12-steps/
======
cven714
I really liked this answer on stackoverflow--I went from not even knowing
decorators existed to a complete understanding:

[http://stackoverflow.com/a/1594484/1158666](http://stackoverflow.com/a/1594484/1158666)

------
amyjess
Decorators are one of my favorite parts of Python. One thing I did with them
really saved our bacon at my last company multiple times:

Our infrastructure daemons were written in Python, and I wanted to make sure
we had a debuggable system that allowed me to trace any error without having to
manually reproduce it (especially when, say, problems specific to a customer's
network can't be reproduced on ours). I wrote a decorator, called @logme, that
would log a function's name and parameters plus the module it belonged to
(easier said than done!), run the function, then log and return the return
value. Every function was decorated with @logme. I also made a variant for
methods called @logmethod that would also log data about the class the method
belonged to.

Now, obviously, this generated massive amounts of logs, so I put in a healthy
amount of log rotation (the Python logger is _beautifully_ customizable). On
average, logs would keep for a few days.

So many times, I debugged customer problems -- problems that would've been
impossible to reproduce in our lab because of edge cases our customers
encountered -- just from the incredibly detailed logs they'd send me.

Also, it was imperative that the main infrastructure daemon never, ever
crashed. It sat and waited for commands to be sent to it. When it finished
executing one command, it went back to listening. If it got an error running
some command, it would report that error and then go back to listening for
the next command; an error running an individual command should _never_ bring
down the system. The decorators took care of that too. In addition to logging
what goes into and out of every function, @logme and @logmethod also caught
any exception those functions raised and logged them, allowing the system to
recover following any exception just by decorating everything (I would _not_
recommend this for a more stateful system: it only worked as well as it did
because any configuration needed was always loaded from disk when each command
was run, so the daemon overall had no state that could be corrupted by an
error).

------
manish
Udacity's course Design of Computer Programs by Peter Norvig was the best
intro to decorators I have seen.

------
agumonkey
Beware when you use decorators for metaprogramming: subtle pieces of data
might get "lost": [http://apiguy.github.io/blog/2013/06/03/the-dark-side-of-
dec...](http://apiguy.github.io/blog/2013/06/03/the-dark-side-of-decorators/)

~~~
celias
There's also Graham Dumpleton's "How you implemented your Python decorator is
wrong" blog post - [http://blog.dscpl.com.au/2014/01/how-you-implemented-your-
py...](http://blog.dscpl.com.au/2014/01/how-you-implemented-your-python.html)
and the related "Decorators and monkey patching" list
[http://blog.dscpl.com.au/p/decorators-and-monkey-
patching.ht...](http://blog.dscpl.com.au/p/decorators-and-monkey-
patching.html)

~~~
agentultra
The introspection argument is rather difficult... do I intend to see the
descriptor of the wrapped function or the signature of the wrapper? In the
case of stacked decorators it can get a little murky.

------
Sami_Lehtinen
Decorators were easy until I tried writing decorators that could be applied
both bare and with parameters. Then things got quite messy, and it was hard
to understand why they work the way they do.
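The usual way out of that mess is to detect whether the decorator was handed a function (bare use) or configuration (factory use). A sketch, with an invented `repeat` decorator:

```python
import functools

def repeat(arg=None, *, times=2):
    # Usable both as @repeat and as @repeat(times=3).
    def make_wrapper(func, n):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            result = None
            for _ in range(n):
                result = func(*args, **kwargs)
            return result
        return wrapper
    if callable(arg):
        # Bare use: @repeat -- we received the function itself.
        return make_wrapper(arg, times)
    # Factory use: @repeat(times=3) -- return the real decorator.
    return lambda func: make_wrapper(func, times)
```

The callable() check is what makes the dual behavior work, and it is also why these decorators are confusing: the same name is sometimes a decorator and sometimes a decorator factory.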

~~~
zardeh
consider the example

    
    
        @decorator_without_args
        def function():
            pass
    

this is equivalent to

    
    
        function = decorator_without_args(function)
    

now consider the other case, where the decorator takes in some args:

    
    
        @dec_with_args("hello", "world")
        def func():
            pass
    

this is equivalent to

    
    
        func = dec_with_args("hello", "world")(func)
    

Or in other words, the expression after @ needs to evaluate to a Callable
which is then called on the function you are decorating, so with @a_decorator,
you call a_decorator on your function. When you use @with_args(1, 2, 3), you
are calling the function "with_args" and then the result of that function is
your decorator. So an example of this working is like so:

    
    
        debug_list = []
        other_list = []
        def dynamic_debug_decorator(output_list=debug_list):
            # dynamic debug decorator should return a decorator function
            # realistically, ddd is a decorator factory, not a decorator
            def debug_decorator(function):
                # so this is our real decorator
                def decorated_function(*args, **kwargs):
                    # and this is the new function we return
                    output_list.append(function)  # because silly example
                    return function(*args, **kwargs)
                return decorated_function
            return debug_decorator
    

so now using that decorator:

    
    
        @dynamic_debug_decorator  
        def function():
            pass
    

this fails, because dynamic_debug_decorator expects a list as its argument,
but is now receiving a function. Instead we need to call it like so:

    
    
        @dynamic_debug_decorator
        def function():
            pass
    

this works, using the default values.

~~~
RussianCow
I think you meant to call `dynamic_debug_decorator` in your last example.
Right now the two examples you show are exactly the same (both incorrect). It
should be:

    
    
        @dynamic_debug_decorator()
        def function():
            pass

~~~
zardeh
you are exactly correct.

------
timothycrosley
See:
[https://github.com/timothycrosley/hug](https://github.com/timothycrosley/hug)
for an interesting (IMHO) use of decorators. It might help expand your
understanding.

------
cJ0th
Maybe writing lots of decorators helps you apply this concept when you need
to solve complex problems. However, I fail to understand the usefulness of
the examples given in all the tutorials I have read about decorators. Sure,
they help you stay DRY when you want to log something or deal with
authentication. Yet I find they infringe the "explicit is better than
implicit" rule.

When I read code not written by myself and see this:

    
        @something
        def foo(bar):
            ...
    

It is not immediately clear to me what foo actually does, whereas

    
        def foo(bar):
            # some function calls here
            # next: what foo does with these results
            # some more function calls on what has been done before
    

gives me a hint without scrolling through the code.

~~~
itp
When I worked at my last company, we had to expose an easy-to-use API for an
optimistic-concurrency database that required cooperating clients to retry
operations in a variety of situations. Rather than asking every client to
implement correct retry behavior, we tried to expose a simple retry loop in an
idiomatic way in each client language.

Python was far and away my favorite, since we could take advantage of
decorators to make everything relatively seamless.

Consider a function something like this:

    
    
      def do_something_transactionally(db, *args, **kwargs):
          tr = db.create_transaction()
          # do some things with the transaction object
          tr.commit().get()
    

This code was broken in a number of ways -- there's no retry, or handling of
errors from commit, etc.

So we exposed a decorator in Python to handle all of these issues for the
common case:

    
    
      @transactional
      def do_something_transactionally(db, *args, **kwargs):
          # do some things with the transaction object
    

This was more or less equivalent to:

    
    
      def do_something_transactionally(db, *args, **kwargs):
          tr = db.create_transaction()
          while True:
              try:
                  # do some things with the transaction object
                  tr.commit().wait()
                  break
              except Error as e:
                  tr.on_error(e).get()
    

It was actually even niftier than that, though, because the decorator allowed
you to pass in either a transaction object or a database object. If the caller
provided a transaction rather than a database, then the decorator did not
create a new transaction or commit it; this allowed composition of more
complicated transactional calls of decorated functions.
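That either-a-database-or-a-transaction dispatch can be sketched roughly like this (the Database and Transaction classes here are toy stand-ins invented to make the sketch runnable; the real API surely differed):

```python
class RetryableError(Exception):
    pass

class Transaction:
    def __init__(self, db):
        self.db = db

    def commit(self):
        # Toy conflict simulation: fail the first `fail_times` commits.
        if self.db.fail_times > 0:
            self.db.fail_times -= 1
            raise RetryableError("simulated conflict")
        self.db.commits += 1

class Database:
    def __init__(self, fail_times=0):
        self.fail_times = fail_times
        self.commits = 0

    def create_transaction(self):
        return Transaction(self)

def transactional(func):
    def wrapper(db_or_tr, *args, **kwargs):
        # Already inside a transaction: just run the body. The outermost
        # caller owns retry and commit, so decorated functions compose
        # into a single atomic transaction.
        if isinstance(db_or_tr, Transaction):
            return func(db_or_tr, *args, **kwargs)
        # Given a database: open a transaction and retry until commit
        # succeeds (the real code presumably called tr.on_error(e) here).
        while True:
            tr = db_or_tr.create_transaction()
            try:
                result = func(tr, *args, **kwargs)
                tr.commit()
                return result
            except RetryableError:
                continue
    return wrapper
```

Note that on a conflict the whole function body re-runs, which is exactly why the body must be side-effect free outside the transaction.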

So I will definitely grant you that if you encountered this code, you would
need to understand what @transactional meant, but I would definitely argue
that this was cleaner and easier to deal with than asking every function to
explicitly implement the retry logic.

~~~
matthiasv
That looks actually more like a use case for a context manager, i.e.

    
    
     with transactional(db) as transaction:
              # do something
    

It's pretty clear at the call site what is and what isn't transactional and
most importantly is practically the pythonic way of writing code with
guaranteed cleanup.

~~~
itp
We really wanted to support a context manager there, believe me. IIRC the
problem was there is/was no facility for retry with a context manager, so
there would still need to be structure around it.

I could be wrong, though -- I only remember being seduced by the context
manager option a couple times before running into the same wall.

I don't think the context manager, even if it had been doable, would have
supported composition in the style of:

    
    
      @transactional
      def thing_one(tr, *args):
          # stuff
    
      @transactional
      def thing_two(tr, *args):
          # other stuff
    
      @transactional
      def both_things(tr, *args):
          thing_one(tr, *args)
          thing_two(tr, *args)
    

If it's not clear, in this case a caller could provide a database object to
thing_one or thing_two and they would be executed as a single atomic
transaction, but a call to both_things with a database object would execute
both thing_one and thing_two inside of a single atomic transaction.

~~~
icebraining
You're not wrong, it was an explicit decision:
[https://mail.python.org/pipermail/python-
ideas/2013-May/0206...](https://mail.python.org/pipermail/python-
ideas/2013-May/020633.html)

------
theVirginian
This type of article seems to be a right of passage from basic to upper-
level/intermediate understanding of Python. Where it really gets crazy is if
you are making decorators that take parameters and in turn call decorated
functions from within their block. You can do some really crazy stuff, even
decorate classes.

I would like to see some more articles on Python metaclasses as I think that
could benefit a lot of people.

~~~
squidbidness
Just FYI, The expression is "rite of passage."

~~~
theVirginian
Whoops, thanks. My phone doesn't like that word apparently

------
seibelj
Decorators are very easy to understand. They take the arguments going into
the decorated function, do some work, call the decorated function, and then
do more work on the returned values.

This could mean validating function inputs, changing inputs, raising
exceptions, doing cleanup, etc. I use them liberally for web service endpoints
to validate security, data integrity, etc.
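As a toy illustration of that idea (the decorator name and validation rule here are invented, not anyone's production code):

```python
import functools

def validate_positive(*param_names):
    # Hypothetical endpoint guard: reject the call before the endpoint
    # runs if any of the named keyword arguments is not a positive number.
    def decorator(func):
        @functools.wraps(func)
        def wrapper(**kwargs):
            for name in param_names:
                value = kwargs.get(name)
                if not isinstance(value, (int, float)) or value <= 0:
                    raise ValueError("invalid %s: %r" % (name, value))
            return func(**kwargs)
        return wrapper
    return decorator

@validate_positive("amount")
def charge(amount):
    return "charged %s" % amount
```

The endpoint body stays free of validation boilerplate, and the same guard can be stacked onto every endpoint that needs it.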

~~~
philh
I don't care for that explanation, because they don't have to call the
original function. (E.g. an @skip('not yet implemented') test decorator.) Or
they could call it multiple times. The post-decoration function doesn't even
need to be a function[1]. And that doesn't touch on decorated classes, either.

I find them easiest to understand using the source-rewriting explanation:

    
    
        @foo
        def bar(...):
           ...
    

is the same as

    
    
        def bar(...):
           ...
    
        bar = foo(bar)
    

Classes can be decorated in the same way. This also explains why decorators
taking arguments work how they do: because

    
    
        @foo(...)
    

turns into

    
    
        bar = foo(...)(bar)
    

and thus foo needs to _return_ a decorator, not _be_ a decorator. (Where a
decorator can be thought of as "any function that accepts a single argument",
or as "any function which happens to be useful as a decorator", or whatever.)

But explaining them like this doesn't make it obvious what they're _for_. (I
can worry about that later, but others find it easier to understand things
given a motivating example.) And some people haven't got their heads around
first-class functions yet. So while this would be my preferred explanation if
I was talking to a version of me who didn't know decorators yet, it's not the
best for everyone.

[1] This hasn't previously occurred to me, and I've never seen it explored,
but I just verified that it works:

    
    
        def const(fun):
            return fun()
    
        @const
        def bar():
            return 3
    
        print 3 * bar # 9
    

I'm not sure if this would be useful for anything.

------
aldanor
The hardest part is making a decorator preserve the function's signature;
however, it's quite doable, and there are a few packages for that -- e.g.
_wrapt_ and _decorator_ on PyPI.
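For the common cases, the standard library's functools.wraps gets you most of the way; a quick illustration (since Python 3.4, inspect.signature follows the __wrapped__ attribute that wraps sets, which is why the reported signature below matches the original even though the wrapper itself takes *args/**kwargs -- closing that last gap is part of what wrapt is for):

```python
import functools
import inspect

def passthrough(func):
    # functools.wraps copies __name__, __doc__, __module__, etc.,
    # and records the original function as wrapper.__wrapped__.
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

@passthrough
def greet(name, punctuation="!"):
    """Say hello."""
    return "hello " + name + punctuation
```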

------
ChrisArgyle
This is the exact article that taught me decorators. Simple, to the point,
well-written examples with helpful sample output. A very efficient way to
learn decorators.

------
Animats
I was expecting a 12-step program to get Python programmers to stop using
decorators.

------
jdimov9
If you find decorators difficult to grasp, it's because of articles like this
one :)

