Cool idea, early code, we'll see whether it goes anywhere.
Actually, gRPC is the only requirement for pipelines. I hope we can make this clearer in the future! :-)
> Sadly, gaia is in alpha phase and we currently only support pipelines written in go.
Any examples of this being used in the wild?
I feel like I am missing a small but critical piece to understand how this should be used and where.
Also, we've created a 'pipelines as code' feature called Dinghy that may have helped. And our installer & configurator provides a much smoother install & configure experience. Details at www.Armory.io
Hit us up at email@example.com if you have more specific feedback (our exec team reads emails to that addy).
Really any task that can be broken into concrete and independent steps could be made into a pipeline for scaling and reliability.
Unless pipeline code has the ability to match estimated job utility to device capabilities, it won't be useful in many non-trivial cases.
Unless there is an automated way to store intermediate assets such that data locality between stages is (at least somewhat) optimal, a significant amount of time will be spent in process migration.
I don't mind YAML myself, but could I use something like Enaml and get the same benefits as you see them?
The canonical reference is the main website: https://concourse-ci.org/
* Compile a program.
* Run some tests.
* Create a Debian binary package.
It looks like I'd have to write a final task to upload to a staging repository in the pipeline itself.