That's like saying that God existed before creating the universe. Then you have to ask who created God? And if God was created only from data and not any logic, then the config file must have been really huge and unmaintainable.
Given any non-trivial data-only config file, it will always grow to the point that you'll end up needing to generate it automatically with logic. And that goes double for God's config file.
Do you have an example? I don't think I've ever needed to generate a config automatically (except in niche cases like generating configs for services in Chef), and if I'm understanding correctly, nothing proposed in this thread would help with that scenario.
I suspect you're front-loading a lot of logic from the app bootup into the config file, and I'm not sure what you stand to gain by conflating those two things.
Like: what's the execution order of a config file? Can you refer to a value later in the file? If so, how does it determine which value gets evaluated before the next? If not, how do you set up circular dependencies? Can you redefine config values halfway through a file? If not, then with sufficient complexity you're going to have to fall back on at-boot pre-processing anyway, so just treat the config like dumb data to begin with, do all your logic in the bootup, and put all the values/whatever in the config file. And god knows, I would be strongly tempted to murder an engineer that introduced a config file capable of rewriting itself; that engineer has clearly never had to debug another person's shitty code before.
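To make that split concrete, here's a minimal sketch of what I mean by "dumb data plus logic at boot" (the file name and keys are made up for illustration, not from any real project):

    // The config file is pure data: no references, no execution order to reason about.
    const fs = require("fs");
    const config = JSON.parse(fs.readFileSync("config.json", "utf8"));
    // e.g. { "host": "db.internal", "port": 5432, "connectTimeoutMs": 5000 }

    // All derived values and "logic" live in bootup code, where you can debug them.
    const dbUrl = `postgres://${config.host}:${config.port}/app`;
    const retryBudget = Math.ceil(config.connectTimeoutMs / 1000) * 3;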
I had never used Helm / Kubernetes until 3 months ago.
Not two weeks ago I needed to add a loop to a Helm config file in order to basically say "all this same config, libraries, etc., just run this other command instead" ... because someone who makes those decisions had ~100 lines of environment-injected configuration + boilerplate in the YAML that I needed, couldn't get rid of, and would otherwise have had to copy / paste.
Since then, those environment variables have been pulled out into a different file (refactoring!), and now we've replaced a loop over 100 lines of config with 2x sets of 15-20 lines of config boilerplate. Better, but still a lot of bull. I don't know what the right answer is, because we've got less Helm templating bullshit in there, but we still need boilerplate. Because it's not like I can tear down an entire Kubernetes + Helm infrastructure just because I don't like how the config files are written.
Configs / config generation is hard, and generally awful. If you don't see it that way, congratulations; you're either a genius in your field, you don't have enough experience, and/or you're wrong. If you believe it's easy and we're all missing something, then please, by all means, write a book on how / why configurations aren't as hard as the rest of us say they are.
> Configs / config generation is hard, and generally awful. If you don't see it that way, congratulations; you're either a genius in your field, you don't have enough experience, and/or you're wrong.
The point I'm trying to make is that you're describing broken frameworks, data flows, and workflows, and blaming it all on config generation. If you have a counter example, I'd love to see it. Discussing these things in the abstract is pretty pointless and based in emotional language/semantic quibbling rather than meaningful things people can reason about and discuss, like code comparison or time tradeoffs.
Hell, because no specific GOOD examples of configuration-as-code have been brought up, literally everyone in this thread could be considering a different pet example of theirs. It's OBVIOUSLY a waste of everyone's time without examples. Why bother commenting at all? Just to go out of your way to punch down without contributing to the discourse?
You say this is easy. Seems to me that you're claiming to be elevated above us all with something we don't know, claiming that everyone else is doing it wrong, all the while hiding behind anonymity.
Stop clutching your pearls and playing the victim. No one is punching down; you're claiming knowledge you don't have and are being called out for it.
Look at any one of the references cited in the thread.
I've developed programs with tens of thousands of lines of expanded JSON in their config files. No fucking way I'm maintaining all that by hand as pure data.
The opposite of DRY (Don't Repeat Yourself) is WET (Write Everything Twice or We Enjoy Typing or Waste Everyone's Time) -- but "twice" could be ten times or more; there's no limit. Writing it all out again and again by hand as pure literal data, and hoping I didn't make any typos or omissions in each of the ten repetitions, without some sort of logical algorithmic compression and abstraction, would be idiotic.
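Just to make "algorithmic compression" concrete, here's a trivial, hedged sketch (the keys, names and counts are invented, nothing from the real project) of the kind of generation I mean:

    // Instead of hand-writing hundreds of near-identical JSON entries,
    // generate them from a small description. All names here are hypothetical.
    const channels = ["red", "green", "blue", "alpha"];

    const expanded = channels.map((name, i) => ({
        id: `channel_${i}`,
        label: name,
        min: 0,
        max: 255,
        default: i === 3 ? 255 : 0    // alpha defaults to opaque
    }));

    require("fs").writeFileSync("channels.json", JSON.stringify(expanded, null, 2));

One loop, no typos, and adding a fifth channel is a one-word change instead of another hand-copied block.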
FYI, here are some concrete examples, and some demos of a multi-player, cross-platform, networked AR system I developed that's based on shitloads of JSON config files describing objects, behaviors, catalog entries, user interface, a multi-player networking protocol, etc.
Pantomime extensively (and extensibly) uses the simple JSON templating system I described in this other link, which I wrote in C#. Everything you see is a plug-in object, and they're all described and configured in JSON, and implemented in C#:
If I were to rewrite it from scratch, I'd simply use JavaScript instead of rolling my own JSON templating system, because it would have been much more flexible and powerful.
Oh wait -- I DID rewrite at least some of that stuff from scratch! To illustrate that superior JavaScript-centric approach, here's an example of some other JSON based systems I developed with Unity3D and JavaScript, one for scripting ARKit on iOS, and the other for scripting financial visualization on WebGL, both using UnityJS (an extension I developed for scripting and configuring and debugging Unity3D in JavaScript).
One nice thing about it is that you can debug and live code your Unity3D apps while they're running on the mobile device or in the web browser, using the standard JavaScript debugging tools!
UnityJS is a plugin for Unity 5 that integrates JavaScript and web browser components into Unity, including a JSON messaging system and a C# bridge, using JSON.net.
Here's a demo of another more recent application using UnityJS that reads shitloads of JSON data from spreadsheets, including financial data, configuration, parameters, object templates, etc.
Here is an article about how the JSON spreadsheet system works; it discusses some ideas about JSON definition, editing and templating with spreadsheets. It's about a year old, and I've developed the system a lot further since writing it.
>I’ve been developing a convenient way of representing and editing JSON in spreadsheets, that I’m very happy with, and would love to share!
>I’ve been successfully synergizing JSON with spreadsheets, and have developed a general purpose approach and sample implementation that I’d like to share. So I’ll briefly describe how it works (and share the code and examples), in the hopes of receiving some feedback and criticism.
Here is the question I’m trying to answer:
>How can you conveniently and compactly represent, view and edit JSON in spreadsheets, using the grid instead of so much punctuation?
>My goal is to be able to easily edit JSON data in any spreadsheet, conveniently copy and paste grids of JSON around as TSV files (the format that Google Sheets puts on your clipboard), and efficiently export and import those spreadsheets as JSON.
>So I’ve come up with a simple format and convenient conventions for representing and editing JSON in spreadsheets, without any sigils, tabs, quoting, escaping or trailing comma problems, but with comments, rich formatting, formulas, and leveraging the full power of the spreadsheet.
>It’s especially powerful with Google Sheets, since it can run JavaScript code to export, import and validate JSON, provide colorized syntax highlighting, error feedback, interactive wizard dialogs, and integrations with other services. Then other apps and services can easily retrieve those live spreadsheets as TSV files, which are super-easy to parse into 2D arrays of strings to convert to JSON.
[...]
>Philosophy: The goal is to leverage the spreadsheet grid format to reduce syntax and ambiguity, and eliminate problems with brackets, braces, quotes, colons, commas, missing commas, tabs versus spaces, etc.
>Instead, you enjoy important benefits missing from JSON like comments, rich formatting, formulas, and the ability to leverage the spreadsheet’s power, flexibility, programmability, ubiquity and familiarity.
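To sketch the core idea in code (a simplification for illustration, not the actual implementation, which handles nesting and much more): a spreadsheet arrives as TSV, which splits trivially into a 2D grid of strings, and each row becomes a key/value pair of a JSON object.

    // Hypothetical three-column layout: key, type, value. Not the real format.
    function tsvToObject(tsv) {
        const grid = tsv.split("\n").map(row => row.split("\t"));
        const result = {};
        for (const [key, type, value] of grid) {
            if (!key || key.startsWith("//")) continue;       // comment rows are free
            result[key] = (type === "number") ? Number(value)
                        : (type === "json")   ? JSON.parse(value)
                        : value;                               // plain string
        }
        return result;
    }

    // "title\tstring\tHello\nretries\tnumber\t3"  =>  { title: "Hello", retries: 3 }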
I really appreciate the post; I have a much better understanding now.
I don't use configs in this way, or if I did, I would not be inclined to call them configs. I can certainly appreciate the problem of processing JSON objects in many different contexts. I was referring more to the UX concept of providing a configuration interface: short of something like Emacs that gives full functionality, simpler and easily debuggable is emphatically better.
The topic of this discussion is YAML (and JSON as an alternative), and the topic of this thread is "using text template engines to generate YAML", which covers a lot more than just config files. YAML and JSON and template engines are used for a hell of a lot more than just writing config files, but they're also very useful for that common task too. The issues that apply to config files also apply to many other uses of YAML and JSON. Dynamically generated YAML and JSON are very common and useful, and have many applications besides config files.
The fact that you've never done and can't imagine anything complicated enough to need more than a simple hand-written data-only config file doesn't mean other people don't do that all the time. It's simply a failure of your imagination.
What I can't understand is what you were getting at about "punching down". When you say things like "I would be strongly tempted to murder an engineer", that sounds like punching down to me. Nor do I understand why you were complaining that nobody gave any examples, saying "no specific GOOD examples of configuration-as-code have been brought up". Don't my examples count, or do you consider them bad?
So what was bad about my examples (or did you not read them, or follow any of the links that you asked for)? Pantomime had many procedurally generated config files, using the JSON templating engine I described, one for every plug-in object (and everything was a plug-in so there were a lot of them), as well as some for the Unity project and the build deployment configuration itself. It also used dynamically generated JSON for many other purposes, but that doesn't cancel out its extensive use of JSON for config files.
Here are some concrete examples of some actual JavaScript code that dynamically generates a bunch of JSON, both to create, configure, send messages to, and handle messages from Unity3D prefabs and objects, and also to represent higher level interactive user interface objects like pie menus.
What this illustrates should be blindingly obvious: that JavaScript is the ideal language for doing this kind of dynamic JSON generation and event handling, so there's no need for a special purpose JSON templating language.
Making a JSON templating language in JavaScript would be as silly as making an HTML templating language in PHP (cough i.e. "Smarty" cough).
JavaScript is already a JSON templating language, just as PHP is already an HTML templating language.
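To make that comparison concrete, here's a hedged sketch (the object shape is invented for illustration, not the actual UnityJS format) of using a plain JavaScript function as the "template":

    // No special templating language needed: a plain function is the template.
    function makeButton(label, x, y, onClick) {
        return {
            type: "Button",
            label: label,
            transform: { position: { x: x, y: y, z: 0 } },
            interests: { Click: { handler: onClick } }
        };
    }

    // Parameterize, loop, and compose with ordinary JavaScript.
    const buttons = ["Save", "Load", "Quit"].map((label, i) =>
        makeButton(label, 0, -i * 40, () => console.log(label + " clicked")));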
UnityJS applications create and configure objects by making lots and lots of parameterized JSON structures and sending them to Unity, to instantiate and parent prefabs, configure and query properties with path expressions, define event handlers that can drill down and cherry pick exactly which parameters are sent back with events using path expressions (the handler functions themselves are filtered out of the JSON and kept and executed on the JavaScript side), etc.
At a higher level, they typically suck in a bunch of application specific JSON data (like company models and financial data), and transform it into a whole bunch of lower level UnityJS JSON object specifications (like balls and springs and special purpose components), or intermediate JSON user interface models like pie menus, to create and configure Unity3D prefabs and wire up their event handlers and user interfaces. Basically you're transforming JSON to JSON, and associating callback functions, and sending it back and forth in messages and events between JavaScript and Unity.
There are also a bunch of standard JSON formats for representing common Unity3D types (colors, vectors, quaternions, animation curves, material updates, etc), and a JSON/C# bridge that converts back and forth.
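Roughly, here's a sketch of the kind of message this produces (the field names and path expressions are illustrative; the real conventions differ in detail):

    // A hypothetical "create a prefab" spec, using conventional JSON encodings
    // for Unity types (vectors, colors) and path expressions for properties.
    function makeBallSpec(name, position, color) {
        return {
            prefab: "Prefabs/Ball",
            name: name,
            update: {
                "transform/localPosition": { x: position.x, y: position.y, z: position.z },
                "component:MeshRenderer/material/color": { r: color.r, g: color.g, b: color.b, a: 1 }
            }
        };
    }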
This is a straightforward function that creates a bunch of default objects (tweener, light, camera, ground) by creating and configuring a few Unity3D prefabs, and sets up some event handlers, a pie menu, and camera mouse tracking handlers.
Notice how the "interests" for events include both a "query" template that says what parameters to send with the event (and can reach around anywhere to grab any accessible value with path expressions), and also a "handler" function that's kept locally and not sent to Unity, but is passed the result of the query that was executed in Unity just before sending the event. The point is that every "MouseDown" handler doesn't need to see the exact same parameters; it's a waste to send unneeded parameters, and some handlers need to see very specific parameters from elsewhere (shift keys, screen coordinates, 3d raycast hits, camera transform, other application state, etc). So each specific handler gets to declare exactly which, if any, query parameters are sent with the event, up front in the interest specification, to eliminate round trips and unnecessary parameters.
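Here's a rough sketch of the shape of an interest (the path expressions and field names are illustrative, not the exact UnityJS format):

    // The query describes which values Unity should gather and send with the
    // event; the handler stays on the JavaScript side and never crosses the bridge.
    const interests = {
        MouseDown: {
            query: {
                shiftKey: "input/shiftKey",
                screenPosition: "input/mousePosition",
                raycastHit: "raycast/hit/point",
                cameraPosition: "camera/transform/position"
            },
            handler: (obj, result) => {
                // Only the parameters this handler asked for arrive here,
                // so there is no round trip and no unneeded data on the wire.
                console.log("hit at", result.raycastHit, "shift:", result.shiftKey);
            }
        }
    };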
The following code is a more complex example that creates the Unity3D PieTracker object, which handles input and pie menu tracking, and sends JSON messages to the JavaScript world.pieTracker object and JSON pie menu specifications, which handle the messages, present and track pie menus (which it can draw with both the JavaScript canvas 2D api and Unity 3D objects), and execute JavaScript callbacks (both for dynamic tracking feedback, and final menu selection).
Pie menus are also represented by JSON of course. A pie can contain zero or more slices (which are selected by direction), and a slice can contain zero or more items (which are selected or parameterized by cursor distance). They support all kinds of real time tracking callbacks so you can provide custom feedback. And you can make JSON template functions for creating common types of slices and tracking interactions.
This is a JavaScript template function MakeParameterSlice(label, name, calculator, updater), which is a template for creating a parameterized pie menu "pull out" slice that tracks the cursor distance from the center of the pie menu, to control some parameter (e.g. you can pick a selection like a font by moving into a slice, and also "pull out" the font size parameter by moving further away from the menu center, and it can provide feedback showing that font in that size on the overlay, or by updating a 3d object in the world, to preview what you will get in real time). This template simply returns a blob of JSON with handlers (filtered out before being sent to Unity3D, and kept and executed locally) that does all that stuff automatically, so it's very easy to define your own "pull out" pie menu slices that do custom tracking.
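As a rough sketch of the shape of such a template (not the real implementation; just enough to show the handlers being kept and run locally):

    // Hypothetical simplification of a "pull out" slice template.
    function MakeParameterSlice(label, name, calculator, updater) {
        let lastValue = null;
        return {
            label: label,
            items: [{
                onTrack: (item, distance) => {
                    // Map cursor distance from the menu center to a parameter value,
                    // then push live feedback (overlay text, 3D object preview, etc).
                    lastValue = calculator(distance);
                    updater(name, lastValue, { preview: true });
                },
                onSelect: () => {
                    // Commit the final value when the slice is selected.
                    updater(name, lastValue, { commit: true });
                }
            }]
        };
    }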
It sounds like you're assuming that configurations are only created before running a program. But you can also create them while programs are running, to configure dynamically created objects or structures, too. And you can send those configurations as messages, to implement, for example, a distributed network object system for a multi-player game. So you may be programmatically creating hundreds of dynamic parameterized "configuration files" per second.
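For example, a hedged sketch (the message shape and send() function are hypothetical) of generating such a "configuration" at runtime and sending it as a message:

    // The same kind of JSON you might put in a config file, generated on the
    // fly and broadcast so every peer applies the same "config".
    function spawnNetworkedObject(send, playerId, kind, position) {
        const spec = {
            action: "create",
            owner: playerId,
            prefab: kind,
            transform: { position: position },
            replicate: true
        };
        send(JSON.stringify(spec));
        return spec;
    }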
How about normally-Turing-complete languages that can be stripped down to non-Turing-completeness to make a configuration DSL?
This is exactly what Tcl supports / was designed to do (and in turn is one of my motivations for developing OTPCL). This is also exactly what your average Lisp or Scheme supports.
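By analogy (this is neither Tcl nor OTPCL, just the general shape of the idea sketched in JavaScript): restrict the config language to a whitelist of declarative commands, so the file stays data-like and always terminates.

    // Only whitelisted, side-effect-free commands are available to the config.
    const presets = { defaults: { retries: 3, timeoutMs: 5000 } };
    const commands = {
        set:     (cfg, key, value) => { cfg[key] = value; },
        include: (cfg, name)       => { Object.assign(cfg, presets[name] || {}); }
    };

    function evalConfig(lines) {
        const cfg = {};
        for (const line of lines) {
            const [cmd, ...args] = line;
            if (!commands[cmd]) throw new Error("unknown command: " + cmd);
            commands[cmd](cfg, ...args);
        }
        return cfg;
    }

    // evalConfig([["include", "defaults"], ["set", "host", "db.internal"]])
    // => { retries: 3, timeoutMs: 5000, host: "db.internal" }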
Programming language design and implementation is a huge and hard problem. What you get is an incomplete, frustrating language full of semantic oddities and confusions, without any serious support tooling to help you out.
If you use it in anger, you quickly need all the usual language features: importing libraries, namespaces, functions, data structures, rich string manipulation, etc. But you rarely get them.
At run-time, you don’t have a debugger or anything else, leading to a maddening bug-fixing experience, because config cycle times are really high.
Because it’s a niche language, only one poor soul on a team ends up the expert on all the plentiful traps.
Eventually... you give up and end up generating the config in a proper language and it feels like a breath of fresh air.
One of the most ridiculous examples of this was the Smarty templating language for PHP.
Somebody got the silly idea in their head of implementing a templating language in PHP, even though PHP is ALREADY a templating language. So they took out all the useful features of PHP, then stuck a few of them back in with even goofier inconsistent hard-to-learn syntax, in a way that required a code generation step, and made templates absolutely impossible to debug.
So in the end your template programmers need to know something just as difficult as PHP itself, yet even more esoteric and less well documented, and it doesn't end up saving PHP programmers any time, either.
>Most people would argue, that Smarty is a good solution for templating. I really can’t see any valid reasons, that that is so. Specially since “Templating” and “Language” should never be in the same statement. Let alone one word after another. People are telling me, that Smarty is “better for designers, since they don’t need to learn PHP!”. Wait. What? You’re not learning one programming language, but you’re learning some other? What’s the point in that, anyway? Do us all a favour, and just think the next time you issue that statement, okay?
>I think the Broken Windows theory applies here. PHP is such a load of crap, right down to the standard library, that it creates a culture where it's acceptable to write horrible code. The bugs and security holes are so common, it doesn't seem so important to keep everything in order and audited. Fixes get applied wholesale, with monstrosities like magic quotes. It's like a shoot-first-ask-questions-later policing policy -- sure some apps get messed up, but maybe you catch a few attacks in the process. It's what happened when the language designers gave up. Maybe with PHP 5 they are trying to clean up the neighborhood, but that doesn't change the fact when you program in PHP you are programming in a dump.