
Using attrs for everything in Python - StavrosK
https://glyph.twistedmatrix.com/2016/08/attrs.html
======
mkolodny
While the conciseness of attrs is nice, it's not very readable.

This:

    
    
        class Point3D(object):
            def __init__(self, x, y, z):
                self.x = x
                self.y = y
                self.z = z
    

...is much easier to read than this:

    
    
        import attr
        @attr.s
        class Point3D(object):
            x = attr.ib()
            y = attr.ib()
            z = attr.ib()
    

Readability is what I love about Python. I don't think the conciseness of
attrs is worth the loss in readability.

~~~
mixmastamyk
Also, I read the post and looked over the docs, and don't see any explanation of
what `.s` and `.ib` are supposed to mean (though I can figure it out from
context). Does anyone know what the letters stand for? Struct and instance
b…???

~~~
Myrmornis
> They are a _concise_ and highly _readable_ way to write attrs and attrib
> with an explicit namespace.

> For those who can’t swallow that API at all, attrs comes with serious
> business aliases: attr.attrs and attr.attrib.

I'm definitely in the can't swallow camp. There's no precedent for using
attribute namespaces to name things with half the name on one side and half
the name on the other side, so who the hell does this project think it is to
start doing it? :) Seriously though I'm pretty sure I hate it.

And I'm having trouble liking the "proper" names. So we use a class decorator
"@attrs" to declare "this class is going to use class attributes to declare
the available instance attribute slots, a bit like Django or Schematics fields".
Maybe it would have been better to call it "@attrs.model", since it is
essentially a model definition in the terminology of other data modeling
projects?

~~~
gknoy
If you use `import ... as ...`, you can rename them to nicer names (full gist
at [0]):

    
    
        from attr import (
            s as AttributedModel,
            ib as attribute,
        )
        
        @AttributedModel
        class Bar(object):
            x = attribute()
            y = attribute()
    
    
    

0:
[https://gist.github.com/gknoy/6e884fce3edda08a1566823fec37ba...](https://gist.github.com/gknoy/6e884fce3edda08a1566823fec37ba70)

------
RubyPinch
It feels a bit hampered by "bad" design decisions to me

I think it's safe to say that the current view of what decorators do is: a)
filter things going to functions (so they error, or hit a cache), or b)
register a function with another library (e.g. Flask's route decorator).

This library does a lot more than that, so it seems that maybe a metaclass
(which exists to mutate the class's creation) would be more appropriate?

This seems to act as an __init__ replacement, but it would be cool if it acted
as an __init__ assistant (in other words, run the attr.s-generated __init__,
then run the user-defined __init__)
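For what it's worth, attrs does provide a hook along these lines: if the class defines `__attrs_post_init__`, the generated `__init__` calls it after assigning the declared attributes (whether it's available depends on the attrs version installed). A minimal sketch:

```python
import attr

@attr.s
class Point3D(object):
    x = attr.ib()
    y = attr.ib()
    z = attr.ib()

    def __attrs_post_init__(self):
        # Runs after the generated __init__ has assigned x, y, z.
        self.norm = (self.x ** 2 + self.y ** 2 + self.z ** 2) ** 0.5

p = Point3D(3, 4, 0)
print(p.norm)  # 5.0
```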

The ordering of attrs matters a bit more than one would expect: not only does
it define the argument order for creating a new instance, it also defines the
order in which attributes are compared when ordering instances. You would not
want an id=attr.ib() at the top of the class definition, where it makes the
most sense!

The ordering of attrs is done through the .ib() call itself (which gives
itself a number, and increments a global counter). This is a way of doing it,
and practically might be the best way (it won't lead to any inconsistencies
within a class), but it still feels weird
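The counter mechanism is simple enough to sketch in a few lines. This is an illustrative stand-in, not attrs' actual implementation: each attribute object grabs a number from a shared counter at creation time, and the decorator sorts by it to recover declaration order.

```python
import itertools

_counter = itertools.count()

class Attribute(object):
    """Stand-in for attr.ib(): records its creation order."""
    def __init__(self):
        self.order = next(_counter)

def collect(cls):
    """Stand-in for the decorator: list class attributes in creation order."""
    found = [(name, a) for name, a in vars(cls).items()
             if isinstance(a, Attribute)]
    found.sort(key=lambda pair: pair[1].order)
    cls.__attrs__ = [name for name, _ in found]
    return cls

@collect
class Point(object):
    y = Attribute()
    x = Attribute()

print(Point.__attrs__)  # ['y', 'x'] -- declaration order, not alphabetical
```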

I personally won't use it, but I might make a competitor to it for personal
use

~~~
emidln
Your view of what decorators do doesn't seem to reflect the usage I've
experienced writing Python professionally over the last 10 years.

Decorators take an object and return an object (by convention a callable,
although this isn't enforced in the slightest). Maybe they modify a function
or method signature, maybe they produce an unrelated object, or maybe they
augment the object by adding side effects like caching or logging.

There is a special syntax (@) legal for classes, functions, and methods that
provides a shortcut for the common pattern of:

    
    
        def decorator(): ...
        def foo(): ...
        foo = decorator(foo)
    

such that we can say:

    
    
        @decorator
        def foo(): ...
    

Things I've seen decorators do:

    
    
        * add logging
        * add tracing
        * enable type validation
        * register functions/objects for various reasons
        * produce full objects out of functions
        * swap out implementations for global functions/vars
        * change output formats to json/xml/yaml
        * take a class, inspect a database schema and bind various internal attributes to update items in a row for that schema (somewhat common in ORMs)
    

I don't see this attrs library as any more egregious than other uses. In
particular, I don't see why mutating a type via a metaclass is preferable to
using a decorator. I can offer that decorators generating objects compose
_much_ better than metaclasses do (at least they have in Python 2.x; I haven't
had a chance to use 3 professionally).
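As a concrete instance of the "register functions/objects" case in the list above, here is a minimal registration decorator (the names are illustrative only):

```python
# A registry that decorated functions get recorded into.
HANDLERS = {}

def register(name):
    """Record the decorated function under `name`, returning it unchanged."""
    def decorator(func):
        HANDLERS[name] = func   # side effect: registration
        return func
    return decorator

@register("greet")
def greet(who):
    return "hello, %s" % who

print(HANDLERS["greet"]("world"))  # hello, world
```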

~~~
RubyPinch
I know how decorators work and you might be missing the point I was making by
a fair mile

The following don't mutate the mental map of how the function works; it's "this
function is still what I typed": "add logging", "add tracing", "register
functions/objects for various reasons".

The following play a filtering role; the inside isn't "changed", and the mental
map becomes "this function will operate within this feature in some way, but
otherwise is still what I typed": "enable type validation", "change output
formats to json/xml/yaml", "swap out implementations for global functions/vars"
(assuming the implementations are comparable).

The following is "what I've typed is not a proper representation, and some
things may act differently than usual", e.g. things that could take lists no
longer can, due to needing to be serializable (and this is the least common
decorator case):

"take a class, inspect a database schema and bind various internal attributes
to update items in a row for that schema (somewhat common in ORMs)"

(and that is usually implemented via subclassing a Model class, which in turn
can use a metaclass to get the full mutate-class-on-creation fun, IIRC)

\- - -

> In particular, I don't see why mutating a type via a metaclass is preferable
> to using a decorator.

It's not a technical topic, it's a semantics topic. While both can do whatever
silliness they want, decorators are usually reserved for incidental tasks on
the side, and for wrapping the original function in some light filtering.
Metaclasses are expected to mutate the class (hence requiring a metaclass), so
semantically a subclass or metaclass would be more appropriate for this task
than a decorator (from the standard library, Enum comes to mind as an example
of this)

~~~
d0mine
Just a data point: MacroPy uses decorator syntax for "case" classes that
produce objects with behavior similar to "attr.s"-generated objects
[https://github.com/lihaoyi/macropy](https://github.com/lihaoyi/macropy)

~~~
dr_zoidberg
A few weeks ago, someone posted an article where the author showed how they
could "inline" functions by playing with the AST. That was done to increase
performance for the case of very simple functions along the lines of:

    
    
        def mul_by_pi(x):
            return x * 3.14
    

(I'm sure there are better examples, but that was the idea). I've seen
production code where methods and functions are used to provide bitmasking
with a class-dependent mask, for example (where the mask is a const, not an
attribute), which would benefit from inlining.

From what I've seen, it seems MacroPy could do that, but there's no direct
example of it anywhere, and it seems they haven't tested it. Has it been done
before?

My other problem is that the client is extremely averse to adding any new
dependency (which is understandable), but I might be able to convince them if
I can show a significant speedup on inlined code. The previous post I
mentioned isn't a viable alternative because only "toy code" (as the author
said) was provided, and my team isn't knowledgeable enough in AST
handling to feel comfortable hacking it together, but a more mature project
like MacroPy definitely looks viable.

------
skrause
If you just want an object that stores a few attributes and you don't want to
define a namedtuple, you can use types.SimpleNamespace in Python >= 3.3:
[https://docs.python.org/3.5/library/types.html#types.SimpleN...](https://docs.python.org/3.5/library/types.html#types.SimpleNamespace)

It's a pretty unknown feature from the stdlib, but sometimes very useful.
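A quick illustration, stdlib only:

```python
from types import SimpleNamespace

# A lightweight attribute bag: no class definition required.
p = SimpleNamespace(x=1, y=2, z=3)
p.w = 4                  # attributes can be added freely
print(p.x + p.y)         # 3

# Unlike a plain object(), equality compares the attribute dicts.
print(SimpleNamespace(a=1) == SimpleNamespace(a=1))  # True
```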

------
japhyr
Would there be any value in a convention for the __init__() method that allows
direct assignment of arguments to attributes? For example, replacing the
following:

    
    
        class Point3D(object):
            def __init__(self, x, y, z):
                self.x = x
                self.y = y
                self.z = z
    

with:

    
    
        class Point3D(object):
            def __init__(self, self.x, self.y, self.z):
                # Other initialization work
    

It seems __init__ could be made to parse the list of parameters, and for any
self.foo that's not defined, assign the corresponding argument to that
attribute. But I have no experience in language design. Is there a reason this
wouldn't work?
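As literal syntax it wouldn't work without changing the grammar (parameter names must be plain identifiers), but the same effect can be had today with a decorator built on `inspect.signature`. A sketch; `auto_assign` is a hypothetical helper, not part of the stdlib:

```python
import functools
import inspect

def auto_assign(init):
    """Bind each __init__ argument to an attribute of the same name."""
    sig = inspect.signature(init)

    @functools.wraps(init)
    def wrapper(self, *args, **kwargs):
        bound = sig.bind(self, *args, **kwargs)
        bound.apply_defaults()
        # Skip the first parameter (self); assign the rest as attributes.
        for name, value in list(bound.arguments.items())[1:]:
            setattr(self, name, value)
        init(self, *args, **kwargs)
    return wrapper

class Point3D(object):
    @auto_assign
    def __init__(self, x, y, z):
        pass  # other initialization work

p = Point3D(1, 2, 3)
print(p.x, p.y, p.z)  # 1 2 3
```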

~~~
m_ke
you can just do

    
    
      class B:
          def __init__(self, a, b, c):
              self.__dict__.update(locals())

~~~
desdiv
That makes `self` a field too, which you probably don't want.

~~~
xapata
Especially because that creates a circular reference and annoys the garbage
collector.
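A variant that sidesteps both problems by dropping `self` before the update:

```python
class B:
    def __init__(self, a, b, c):
        args = locals().copy()
        del args["self"]            # avoid storing a circular self-reference
        self.__dict__.update(args)

obj = B(1, 2, 3)
print(obj.a, obj.b, obj.c)      # 1 2 3
print("self" in obj.__dict__)   # False
```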

------
mhneu
"Another place you probably should be defining an object is when you have a
bag of related data that needs its relationships, invariants, and behavior
explained. Python makes it soooo easy to just define a tuple or a list."

Yes, and defining a tuple or a list is often better than a small object
because of Python's problems with serializing (pickling) objects. If you're
doing anything with data, anything functional, anything with multiprocessing,
anything with distributed programming -- you want to avoid using objects as
small bags of data. (Namedtuples should have been a way to improve this, but
they are implemented as a class, so they have serialization issues too.)

Lists of data can be serialized easily. Lists of objects cannot. dill tried
really really hard to fix this, but the problems that remain are problems
created by the Python language spec.

~~~
glyph
None of this is a problem with attrs. Serialize however you like.

    
    
        >>> import attr
        >>> 
        >>> @attr.s
        ... class Thing(object):
        ...     a = attr.ib()
        ...     b = attr.ib()
        ...     
        ... 
        >>> @attr.s
        ... class Many(object):
        ...     things = attr.ib()
        ...     
        ... 
        >>> many = Many([Thing(1, 2), Thing(3, 4)])
        >>> many
        Many(things=[Thing(a=1, b=2), Thing(a=3, b=4)])
        >>> attr.asdict(many)
        {'things': [{'a': 1, 'b': 2}, {'a': 3, 'b': 4}]}
        >>> import pickle
        >>> pickle.dumps(many)
        "ccopy_reg\n_reconstructor\np0\n(c__main__\nMany\np1\nc__builtin__\nobject\np2\n
        Ntp3\nRp4\n(dp5\nS'things'\np6\n(lp7\ng0\n(c__main__\nThing\np8\ng2\nNtp9\nRp10\
        n(dp11\nS'a'\np12\nI1\nsS'b'\np13\nI2\nsbag0\n(g8\ng2\nNtp14\nRp15\n(dp16\ng12\n
        I3\nsg13\nI4\nsbasb."
        >>> import json
        >>> json.dumps(attr.asdict(many))
        '{"things": [{"a": 1, "b": 2}, {"a": 3, "b": 4}]}'
        >>>

~~~
rspeer
...how does it do that?

You call attr.asdict() and it searches the attribute values, including inside
lists, for more attr objects to convert into dict values?

What kinds of values does it search through? The documentation doesn't say, it
just gives an example where it works inside a list for some reason.

~~~
xapata
Why would it need to recurse? The method probably just returns a copy of the
instance's __dict__. Maybe updating it with the __slots__ and their values.

~~~
rspeer
It would need to recurse because you want something you can encode in JSON,
not a dictionary containing a list containing miscellaneous instances.

We have an example there of an 'attrs' instance, containing a list containing
'attrs' instances, and _all_ of the instances turn into dictionaries.
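Roughly: attrs marks the classes it generates with an `__attrs_attrs__` attribute, and `asdict` recurses into such instances as well as lists, tuples, and dicts, leaving other values alone. A stand-in sketch of the idea (no attrs required; `Thing` here fakes the marker attrs would normally add):

```python
import collections

_Attr = collections.namedtuple("_Attr", "name")

class Thing(object):
    """Stand-in for an attrs class: attrs tags classes with __attrs_attrs__."""
    __attrs_attrs__ = (_Attr("a"), _Attr("b"))

    def __init__(self, a, b):
        self.a, self.b = a, b

def as_dict(obj):
    # Recurse into attrs-style objects and common containers.
    if hasattr(obj, "__attrs_attrs__"):
        return {a.name: as_dict(getattr(obj, a.name))
                for a in obj.__attrs_attrs__}
    if isinstance(obj, (list, tuple)):
        return [as_dict(x) for x in obj]
    if isinstance(obj, dict):
        return {k: as_dict(v) for k, v in obj.items()}
    return obj

print(as_dict([Thing(1, 2), Thing(3, 4)]))
# [{'a': 1, 'b': 2}, {'a': 3, 'b': 4}]
```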

------
whalesalad
This is how I develop in Python. I do not use this helper library (and will
certainly look into it), but definitely wish that more Python hackers built
software in this way. Passing plain-old classes around that are very small and
close to their domain is hugely valuable. Adding convenience methods or
synthetic properties with a @property decorator is also hugely important. It
lends itself to a very nice DSL where, for example, you can have image.png
where the png property is a literal PNG representation of the image.

------
radarsat1
Haven't finished reading yet but I just wanted to say that this _really_
resonates with me...

> You know what? I’m done. 20 lines of code so far and we don’t even have a
> class that does anything; the hard part of this problem was supposed to be
> the quaternion solver, not “make a data structure which can be printed and
> compared”.

YES. I don't know much about attrs, but if it helps solve this, I'm in.

------
rwcollins
The problem with constructs like this one is that it looks nice at first
glance, but adds a cognitive load to an ecosystem that already has about "20
ways to do it" for every single problem.

Python is beginning to rival C++ in that respect.

~~~
Pitarou
Please don't tell the C++ community that! We really don't want to encourage
them.

To someone who uses Python every day, the extra cognitive load of `attr` is
more than made up for by all the boilerplate it eliminates. But for beginners,
behind-the-scenes magic is a real turn off.

I think that making `attr` an optional library is a fair compromise, but
perhaps it would be a good idea to add a note to the library reminding the
programmer not to use it if the code is likely to end up in the hands of
novices.

~~~
crdoconnor
>To someone who uses Python every day, the extra cognitive load of `attr` is
more than made up for by all the boilerplate it eliminates.

That's funny because the first thought I had when reading this was "this does
not eliminate nearly enough boilerplate to justify using that much magic".

>But for beginners, behind-the-scenes magic is a real turn off.

It's a turn off for beginners and experts but a turn on for intermediates.

------
yladiz
While I do see the usefulness for the specific example, and while it does seem
to reduce boilerplate, which is really nice, how useful is it outside of the
Point3D example? For example, how often do you actually do gt/lt comparisons
between classes, or test classes for equality when they're not mathematical
constructs like this? And if you add an attribute that shouldn't be used in
repr or comparison, you have to explicitly say so in the attribute
declaration, meaning it would be much less clear what's going on down the road
-- as in it won't be apparent how the comparison works or why this is being
printed the way it is -- which I don't like in a language like Python. It
almost seems like you're trading language-level boilerplate for this library's
boilerplate, and while it may be less with this library, it obfuscates what's
going on at the expense of more explicit code.

I can kind of understand something like this in Ruby, as it kind of encourages
this (and it also has nicer class string-ification by default), but in Python
I'd rather have all of my logic explicitly stated, even at the expense of more
boilerplate. Also, how often do you really need to override those gt/lt
methods in production code? If I saw that, rather than an explicit e.g. `.gt`
method, I'd consider it a code smell.

------
Animats
In the end, this is about defining a typed structure. "@attr" plus validator
constraints is effectively a typed structure.

Python got further with minimal declarations than any other mainstream
language. Yet there's a demand for typed variables. This is going into future
Python, although in a strange and clunky way, because types don't fit the
syntax.

We seem to be converging on a consensus on when to declare types. The
breakthrough was "auto" in C++, which declares a variable as the type of
the value to which it is initialized. This is the default in Go and Rust.
Function formal parameters and structure slots have explicit type
declarations, while local variables are usually implicitly typed.

Python seems to be moving in this direction, but from the no-type-declarations
direction. Unfortunately, because the language itself doesn't do this well,
it's being kludged through decorator macros.
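The shape of that compromise in PEP 484 terms: parameters and return values get annotations, locals don't, and CPython records the annotations without enforcing them.

```python
def magnitude(x: float, y: float, z: float) -> float:
    total = x * x + y * y + z * z   # local variable: no annotation needed
    return total ** 0.5

print(magnitude(3.0, 4.0, 0.0))   # 5.0
# CPython stores the annotations but does not enforce them:
print(magnitude.__annotations__)  # {'x': <class 'float'>, ..., 'return': <class 'float'>}
```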

~~~
alexchamberlain
The latest version does allow you to annotate variables with types... Not sure
it's doing anything with them yet.

~~~
Animats
It's not. See [1]. The retrofitting of type declarations to Python is taking a
very strange path.

(Evil interpretation: this is an effort by Python's little tin god to sabotage
PyPy. Type information is a huge win for a compiler; it can now generate hard
code for a specific type. It doesn't help CPython much, since inside CPython
everything is a CObject. With type information, one could write things like
NumPy, with its arrays of uniform type and functions which operate on them, in
Python. The PyPy compiler could then generate efficient code for them. This
would make PyPy the primary Python compiler. But if type annotation isn't
enforced in CPython, code won't port reliably to a compiler which enforces
it.)

[1]
[https://www.python.org/dev/peps/pep-0484/](https://www.python.org/dev/peps/pep-0484/)

~~~
weberc2
How is this "sabotaging" PyPy? Also, even with type declarations, I'm not sure
you could get Numpy-esque performance, given that Numpy doesn't have to
dereference every element in the array.

~~~
Animats
If the compiler knows a Python array is uniformly of type float, it can, and
should, use a dense array of floats to represent it.

~~~
weberc2
Is this safe in all cases? What about complex types? Specifically, what if you
assign a variable to an element of the array, and then replace that element in
the array? You'd have to take care not to update the value of the variable.

~~~
Animats
Assigning a variable to a destination declared to be an incompatible type
should produce an error at compile time if possible, and at run time
otherwise. That's what type declarations are all about.

~~~
weberc2
I think you misunderstand. Here's an example:

    
    
        foos = [Foo(a=0), Foo(a=1)]
        f0 = foos[0]
        foos[0] = Foo(a=2)
        
        print(f0.a)       # prints 0
        print(foos[0].a)  # prints 2
    

`foos` is a list of type `Foo`, but it still can't be safely made into a dense
list (at least not naively).

~~~
sanxiyn
You are right. Array elements need to have value semantics to do complete
unboxing. But for NumPy-like use cases, they do.

It should be noted that PyPy _already_ does the unboxing Animats suggested,
without any type annotations, and has for at least 4 years. It is not
theoretical.
[https://morepypy.blogspot.com/2011/10/more-compact-lists-
wit...](https://morepypy.blogspot.com/2011/10/more-compact-lists-with-list-
strategies.html)

------
jgalt212
This seems nice, but other than easy destructuring of functions that return
long tuples (which really should return dicts), I cannot figure out a good use
case for it over using a dictionary.

I am keen to hear others' opinions.

~~~
Myrmornis
You'd have to expand more on your proposal for using a dictionary for people
to be able to reply. The article was full of use cases, so why not address
those rather than simply state that you can't think of any?

When you say use a dictionary, do you mean subclass `dict`? One of the aims of
the project is to constrain the available instance attributes, as in a normal
class. How would your dict proposal do that?

~~~
jgalt212
In that case, I'd just deal with the overhead of a normal class. More
boilerplate, of course, but I'd have more control over the class.

As to the use cases, I was not really buying into them, other than the nice
destructuring example, which is actually a very nice one.

~~~
chc
I think the point is that this _is_ basically a normal class, but one with the
boilerplate already written for you. If you don't mind boilerplate, this
library does nothing for you. If you don't want to write classes and think
dicts are good enough, this library does nothing for you.

If you do want to have classes but don't want the boilerplate that comes with
making a basic-but-full-featured class, then that's where this library comes
in.

------
mpdehaan2
Hmm, IDK.

Stuff like @attr.ib is kind of cute looking. (This is not a good thing.)

Though in reality, I'd probably want more explicit methods. If you are using
something like Django REST Framework, you can validate with the serializer.
And often you want to validate the combination of variables and how they
interact.

Kind of feels a little non-pythonic to me. Clever, but the ways it is changing
things are stuff you have to unroll if the behavior grows, versus stuff you
just add to.

~~~
ctoth
From the docs:

(If you don’t like the playful attr.s and attr.ib, you can also use their no-
nonsense aliases attr.attributes and attr.attr).

~~~
safewayclubcard
I would consider that a negative.

> There should be one-- and preferably only one --obvious way to do it.

What's going to happen when all professional code is using `attr.attributes`
but all the documentation uses `attr.s`?

Anyway, it's a minor point. I personally think it is a poor decision when a
programmer decides to be cute rather than clear.

------
davidmanescu
[https://xkcd.com/927/](https://xkcd.com/927/) comes to mind when I read this.

------
0xmohit

      I love Python; it’s been my primary programming language for
      10+ years and despite a number of interesting developments in
      the interim I have no plans to switch to anything else.
    

"interesting" and "developments" are referred to as haskell and rust
respectively in the original post.

"A History of Haskell: Being Lazy with Class" [0] by Paul Hudak _et al_
suggests that it emerged before 1990. See [http://haskell.cs.yale.edu/wp-
content/uploads/2011/02/histor...](http://haskell.cs.yale.edu/wp-
content/uploads/2011/02/history.pdf#page=7) the page in the report showing
the Haskell timeline.

[0] [http://haskell.cs.yale.edu/wp-
content/uploads/2011/02/histor...](http://haskell.cs.yale.edu/wp-
content/uploads/2011/02/history.pdf)

------
anateus
For lightweight containers that improve on the built-in namedtuples consider
using the excellent _boltons_ library's _namedutils_ :
[http://boltons.readthedocs.io/en/latest/namedutils.html](http://boltons.readthedocs.io/en/latest/namedutils.html)

------
jplahn
This feels a little bit like Lombok in the Java world. Can anybody with
experience with both comment on that?

We use Lombok a fair amount within our code to abstract away a lot of the
class setup, which has been nice. But as I move more towards the Python world,
I'm interested to see how attr fits in.

~~~
eliasdorneles
Lombok is kind of a bigger commitment than attrs, because it interfaces with
Java compiler internals (last time I used it, it was only possible to compile
with the Sun/Oracle JDK).

attrs seems like a lower risk IMO; it's just a plain Python library, no other
deps.

If you like Lombok, I'd expect you'll love attrs. :)

------
Myrmornis
How does this compare to schematics? It avoids inheritance, whereas in
schematics and similar projects you'd have to inherit from their `Model` class
(and thus acquire methods you might not want).

~~~
StavrosK
Schematics looks a bit like a less elegant version of schema. It does seem
more focused on validation than attrs, I agree.

~~~
asciihacker
Schema [1] does not appear to have any support for object initialization.

[1]
[https://pypi.python.org/pypi/schema/0.6.2](https://pypi.python.org/pypi/schema/0.6.2)

------
stuaxo
The short names confuse me a bit on this.

~~~
hynek
[http://attrs.readthedocs.io/en/latest/overview.html#on-
the-a...](http://attrs.readthedocs.io/en/latest/overview.html#on-the-attr-s-
and-attr-ib-names)

~~~
eliasdorneles
Hello, Hynek! Thanks for attrs, wonderful library!

I'd suggest marketing the "serious business aliases" more prominently. It's
not about aesthetics, but expectations: people like me are immediately puzzled
by "attr.ib()", thinking "why ib?".

The reason is that after reading a lot of Python code, our brains are already
trained to recognize attribute and method names after the dot.

Also, it's a reasonable expectation to be able to import a function or
submodule and have the code still make sense, but this won't:

    
    
      from attr import s, ib
    
      @s
      class Thing(object):
          x = ib()
    
    

I understand it looks like such a small thing to you and others already used to
this DSL, and also that it's not the most important technical aspect of the
library. However, I do think it's an important human aspect of it.

I have the feeling that making the "no-nonsense" alternative more
prominent in the examples and documentation would reduce the cognitive load
for newbie users, and maybe make your own life easier by not having to explain
and paste this link every time someone finds it odd (which will probably
continue to happen).

Cheers!

~~~
hynek
Hello, thank you for the rare nice words in this thread!

I was caught a bit by surprise by the submission of this article to HN (it's
like two weeks old now), close before a new release, and there's some
inconsistency between the GitHub README and the docs on RTD.

The version that's on PyPI now
([https://pypi.org/project/attrs/](https://pypi.org/project/attrs/)) has it
as the _first_ sentence after the first code sample anyone ever sees (and
people still complained because they just don't read :() and the new version
([https://attrs.readthedocs.io/en/latest/](https://attrs.readthedocs.io/en/latest/))
has a whole section around the example explaining attrs' scope and ib/s:
[https://attrs.readthedocs.io/en/latest/overview.html](https://attrs.readthedocs.io/en/latest/overview.html)
(the new version also adds and promotes aliases that make more sense, but
that's beside the point).

I feel like I’ve _really_ done my due here and I’ll have to accept that I
can’t make everyone happy. :|

------
stinos
I can't find it at the moment, but wasn't there another solution to this
problem where you could just populate an object at will (like MATLAB's struct,
for instance)? IIRC it was dict-based and allowed something like

    
    
        foo = SomePythonWizardry()
        foo.x = 1
        foo.a.b = 2
        print( foo ) #yields e.g. foo.x = 1, foo.a.b = 2
    

Or am I just dreaming? It would be all the goodness and none of the weird
syntax.

~~~
desdiv
You're probably thinking of jsobject¹:

    
    
        from jsobject import Object
        foo = Object()
        foo.x = 1
        foo.a = {}  # foo.a.b will fail without this line
        foo.a.b = 2
        print(foo)  # prints: {'a': {'b': 2}, 'x': 1}
    

¹
[https://pypi.python.org/pypi/jsobject/](https://pypi.python.org/pypi/jsobject/)

------
asciihacker
Added to the Python OOP catalog: [https://github.com/metaperl/python-
oop](https://github.com/metaperl/python-oop)

------
BerislavLopac
attrs seems to be all over the place, but I find Schematics
([http://schematics.readthedocs.io/en/latest/](http://schematics.readthedocs.io/en/latest/))
much better designed and more powerful.

~~~
asciihacker
Yes, Schematics is impressive. Thank you, I added it to my list of Python OOP
extensions [1]. I wonder why you need to supply a dictionary to the object
instead of keyword pairs.

[1] [https://github.com/metaperl/python-
oop](https://github.com/metaperl/python-oop)

~~~
BerislavLopac
> I wonder why you need to supply a dictionary to the object instead of
> keyword pairs.

My understanding is that the main reason is to make constructors more
flexible, with arguments like _raw_data_, _trusted_data_ etc. But it is
trivial to write a simple helper function that would accept keyword pairs and
build a model instance.

------
carapace
This is wonderful and cute as hell and please PLEASE don't ever use it in
production code. ;-)

~~~
glyph
Why not? It's aggressively tested and benchmarked to ensure it's correct and
its performance impact is as close to zero as possible. (It's pretty close.)

~~~
carapace
tl;dr: I don't like metaprogramming.

\----

Both of those things are very good, even great, but they don't speak to the
issue: It's [bad] magic.

In this case, it does at module-load-time what should have been done
previously at build-time. (And yes, I know Python normally doesn't have build-
time the way e.g. C does.)

I'm working on a large production codebase right now where several of the
cowboy-coders here have added all sorts of crazy metaprogramming and it's a
PITA.

Let me tell you a story. We had a class decorator that introspected the
attributes of the class and added "constants" to the _enclosing module_
providing named string values for each of the class's attributes. It lets us
access dict versions of these objects (they're data models) like
foo_as_dict[foo_module.SOME_ATTR_NAME_BUT_IN_CAPS] ...

Good luck finding foo_module.SOME_ATTR_NAME_BUT_IN_CAPS in foo_module.py
though, because it's not there. (and notice that if the model field name
changes ALL of your uses of the "constant" have to be refactored to the new
name too, so it's not really as helpful as it might seem in the first place.)

And now every new developer has to ask, "Where are these coming from?" and we
get to point out the magic extra-clever decorator. (Try to imagine the
horrible wonderful "voodoo" that decorator encompasses... Imagine the innocent
newbie encountering it.)

We have a "service builder" thing that takes a bunch of arguments and build a
web service object _in memory_ at module-load time. No source for any of that
machinery. A bug in the service gives you a traceback running through a bunch
of weird generic meta-code.

Did I mention the guy who originally wrote it bailed to a new job a month
after he finished it? No one else knows how it works. We can read the code and
reverse engineer it, of course, but that's kind of B.S., no? A junior dev could
debug the real service code if it existed, but the meta-code that generates
the services is another challenge (that they shouldn't have and the company
shouldn't have to pay them to overcome.)

We had a unittest that read a JSON file and built a test suite _in memory_ to
run a bunch of other tests. They finally let me rewrite _that_ as a build-
time script that scans the exact same JSON files and emits plain-vanilla
unittest modules, which can then be run as normal, and the dev/user can look at
the actual code of the test, rather than the meta-code of the test-builder.

The same problem applies to this attr package.

If you're trying to trace into or debug code that uses attr you've got to know
attr at least enough to be able to follow what it's doing. It's an _in-joke_.
;-)

(P.S. I'm a big fan dude. Nice to interact with you. All respect.)

~~~
glyph
The same problem does not apply.

I understand that spooky action-at-a-distance magic can really ruin a
codebase's maintainability. But what you've developed here is not a judicious
appreciation of its danger and its power, but a blanket aversion to both its
risks and its benefits.

An apt metaphor would be, let's say: toast. If you try to make some toast, but
accidentally burn your house down, it's understandable that you might want to
have your bread un-toasted for a while. But it doesn't make sense to switch
your diet to be entirely raw as a result, especially if you eat a lot of
chicken.

In Python, metaprogramming is like fire. People who haven't seen it before are
fascinated by it, and try to touch it. This always goes badly. But that
doesn't mean it's useless or we shouldn't have it. There are many things it's
good for, if it's carefully and thoughtfully applied.

attrs goes out of its way to avoid any behind-the-scenes effects. It even
generates Python code, rather than using setattr(), so that if any error
happens in the constructor, you can step through it normally in the debugger,
and see it in the traceback. Its semantics are clear and direct. It doesn't
have these problems.

~~~
carapace
I like toast, and a toaster is better than a folded wire clothes hanger
perched over the stove burner, but this robot toast-o-matic is too fancy for
my kitchen at work. We gotta toast that toast and ain't nobody got time to
debug some fancy toaster.

:-) Metaphors. Heh.

Let me reiterate the first part of my statement above, to wit: "this is
wonderful and cute as hell".

I like it.

I could _use_ it if it generated complete code for the classes you define.
That would actually put to rest my "metaprogramming BAD" problem in this case.

Tucking __init__ source in linecache for the debugger is neat but it's also
more magic. You might wind up stepping through code that doesn't exist in any
file. Is this wonderful hacky weirdness mentioned in the docs? I didn't see
it.

And what about __repr__? No precomputed string template?

Don't imagine that I'm crouching in the dirt tentatively reaching out to the
monolith of metaprogramming. I'm into it. I read GvR's "The Killing Joke" for
_fun_. But he called it that for a reason. :-)

I still recall a junior dev slamming into a wall trying to extend a Django
HTML form widget. The reason? The cowboys over at Django made it with a
metaclass.

We should look at a problem and think, "What's the simplest thing that will
work?" Or, more to the point, "Does _this_ problem require metaprogramming to
solve?" ("or am I just showing off?" could be the rest of that question...)

HTML form widgets don't require metaclasses. The problem attrs solves doesn't
require metaprogramming. It's a simple DSL for data models. Right? Riiiight?

If attrs took in a spec (as Python code or whatever) and emitted all your
boilerplate, ready to go, I would love it so much. You would get all of the
benefits without any of the downside. I know that's boring and not sexy or
fun, but I just want to get this bug fixed and ship it and get on with my
life. I'll play with attrs at home, on my own time, but I'm sick of magic at
work.

(Damn, am I really that old? heh)

