Do you, Programmer, take this Object to be part of the persistent state of your application, to have and to hold, through maintenance and iterations, for past and future versions, as long as the application shall live?
- Erm, can I get back to you on that?
Of course, that only works to a certain extent. Removed features can't be "migrated" cleanly, and often config files (or worse -- code written by users in most DSLs) aren't well-structured enough to make migration straightforward.
The compatibility and extensibility issues mostly come up with the first approach, and can often be avoided by using a more flexible persistence format, which can be anything from a fully domain-specific format up to JSON, XML, protobuf, etc.
Which means if you aren't careful, you can't easily move to anything else for serialization, ever again. Your code now uses protobuf-specific objects everywhere, because that's what protobuf encourages. I'm currently in a codebase where countless method signatures (which should be serialization-agnostic) take or return `Message`-derived objects because that's what we get when we read in a request or emit a response, and using those types everywhere was just so tempting.
And now, we have new requirements that introduce some dynamism to our data model, in a way protobuf doesn't provide, so we're trying to move away from protobuf. It's turning out to require a rewrite of practically everything, because these protobuf classes are our data model, so everything depends on them.
What I've come to prefer is for serialization to be implemented at the boundaries of your service, with your models at least somewhat isolated from any given serialization technique. Protobuf is a foot-gun here because it blends these roles in a way that's hard to get away from.
I think this is the right way to do it. Just like how UTF-8 decoding into a string type is kept at the borders. Inevitably, someone comes along with a requirement that implies the first iteration of the data modeling was not only wrong, but backwards-incompatibly wrong.
It's hard to convince coworkers that it isn't code duplication though.
> Protobuf is a foot-gun here because it blends these roles in a way that's hard to get away from.
I'm not sure; in many ways it is just trying to give you a way to supply it the data to serialize with those models. It'd be nice to not have the "foot gun", but I'm not sure what such a serialization framework would look like.
ActiveModel::Serializers work like this in Rails, although I haven’t tried any similar approaches in statically-typed languages where protobuf is so commonly used.
This is exactly what I think about using ORMs, too, and keep repeating it. Using ORM-generated model classes as your models is a semi-automatic footgun with a hair trigger.
The main point was that serialization needs to be thought about very carefully, because it will involve compatibility issues. It shouldn't be an automatic streaming of the current object structures to disk.
    if (islittleendian() && sizeof(mystruct) == REFSIZE_mystruct)
        memcpy(buffer, &mystruct, sizeof(mystruct));
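The contrast with that naive struct dump can be sketched like this (all names here are illustrative, not from the thread): write each field explicitly in a fixed byte order, prefixed with a version tag, so the wire layout is independent of compiler padding and endianness, and old readers can at least detect data from a newer version instead of silently misreading it.

```c
#include <stdint.h>

/* Explicit, versioned encoding: fixed field order, fixed-width
 * little-endian integers, independent of the in-memory layout. */
#define WIRE_VERSION 1u

typedef struct { uint32_t id; int32_t score; } Record;

static void put_u32le(uint8_t *p, uint32_t v) {
    p[0] = (uint8_t)v;         p[1] = (uint8_t)(v >> 8);
    p[2] = (uint8_t)(v >> 16); p[3] = (uint8_t)(v >> 24);
}

static uint32_t get_u32le(const uint8_t *p) {
    return (uint32_t)p[0]       | (uint32_t)p[1] << 8
         | (uint32_t)p[2] << 16 | (uint32_t)p[3] << 24;
}

/* Always 12 bytes on the wire: version, id, score --
 * never sizeof(Record), which the compiler controls. */
static void record_write(const Record *r, uint8_t out[12]) {
    put_u32le(out,     WIRE_VERSION);
    put_u32le(out + 4, r->id);
    put_u32le(out + 8, (uint32_t)r->score);
}

static int record_read(const uint8_t in[12], Record *r) {
    if (get_u32le(in) != WIRE_VERSION)
        return 0;  /* unknown version: refuse, don't guess */
    r->id    = get_u32le(in + 4);
    r->score = (int32_t)get_u32le(in + 8);
    return 1;
}
```

It's more code than the memcpy, which is exactly the point: the duplication is what buys you a stable format you can evolve deliberately.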