Hacker News
AutoRest: OpenAPI Specification code generator (github.com/azure)
57 points by gfortaine on Oct 6, 2017 | 21 comments



So that's basically a code-generator for OpenAPI/Swagger specs? How is it different from swagger-codegen?


I asked the HN community for feedback on using Swagger Codegen, AutoRest last year: https://news.ycombinator.com/item?id=12485310

They're essentially the same but written in different languages (Java vs C#/TypeScript).

Eventually we picked Swagger Codegen due to its active community and wide range of supported languages/frameworks.


We are doing it another way:

Start with a gRPC spec -> protoc-gen-swagger (https://github.com/grpc-ecosystem/grpc-gateway)

I think there is a New York Times GitHub repo which does:

Swagger -> gRPC


But what's the point of starting with gRPC? I'm sure you have some reasons it's working better for you. It would be cool to hear the benefits over just using Swagger alone.


The API is consumed by mobile apps and a website.

Therefore, for the apps we just need gRPC and its generators for each platform; protobuf serialization comes for free with gRPC.

So only the website needs another step. And since grpc-web (gRPC in the browser) is not finished yet, we rely on Swagger, following the idea described here: https://coreos.com/blog/grpc-protobufs-swagger.html
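
For reference, the CoreOS idea boils down to annotating the gRPC service definition with HTTP bindings, which protoc-gen-swagger then turns into a Swagger spec. The service and route names below are invented for illustration:

```protobuf
syntax = "proto3";

import "google/api/annotations.proto";

// Served natively over gRPC to the mobile apps; grpc-gateway exposes
// the same method to the website as GET /v1/users/{id}.
service UserService {
  rpc GetUser(GetUserRequest) returns (User) {
    option (google.api.http) = {
      get: "/v1/users/{id}"
    };
  }
}

message GetUserRequest { string id = 1; }
message User { string id = 1; string name = 2; }
```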

Possible future alternatives I can think of:

  use grpc-web (no more grpc-proxy)
  use Istio's grpc-bridge https://istio.io/ (https://envoyproxy.github.io/envoy/configuration/http_filters/grpc_json_transcoder_filter.html)
Regarding Istio/Envoy grpc-bridge:

I don't know if it makes sense to make Istio "too smart", i.e. you suddenly have to install a generated Go file (from your proto definition) on Istio.

Therefore I think we'll stick with the gRPC-proxy solution from CoreOS for now, so that the website developers have the chance to use the API directly without going through Istio.

Istio makes sense for tracing and all the other fancy features. :)


Slightly off-topic:

Are there any tools which can validate an OpenAPI Specification against a running instance of the API it describes?

I've played around with apiaryio/dredd, but was hoping to find something more targeted towards OpenAPI/Swagger.


If I understood correctly, this might do what you want: https://github.com/cbkelley/swaggerValidator

My own experience is only with the "server-side" of Swagger validation. Some time ago, I had to build a simple stateless "gateway"-style Node.js backend for a customer webapp. It took API requests, checked authorization, fetched data from a couple of non-public services, and combined them into a JSON reply for consumption by the frontend webapp.

I wanted to keep the backend really simple and have guarantees that it was always returning good data, so I could focus on the more complex frontend code. I thought a "specification-driven" approach would be suitable, where I first described my intended backend REST API with Swagger, and then wrote the Node.js code that implemented that REST API specification. Usually, things are done the other way around: you write the backend code first, and then generate the spec that describes your implementation.

I think I ended up using the swagger-express-validator library to a) validate incoming JSON/form POST requests, b) automatically select the correct Node.js controller that should serve the request, and c) validate that the HTTP/JSON replies the controller eventually returned were correct (per the Swagger spec).

https://github.com/gargol/swagger-express-validator

It worked quite well. The Swagger spec served as a kind of "index" of the backend, similar to a C header file, and if you did an "oops" and returned bad data you would immediately get a fatal error during development. I caught multiple corner cases where the upstream APIs were returning unexpected data that would normally have been discovered only by monkey-testing the UI.
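
For what it's worth, the response-validation part (c) boils down to something like the following stdlib-only sketch. The real library wires this into Express middleware and uses full JSON Schema, so `Schema` and `validateResponse` here are simplified stand-ins, not the library's API:

```typescript
// Simplified stand-in for spec-driven response validation: check a
// reply object against the schema the Swagger spec declares for the
// route, and fail loudly during development on any mismatch.
type Schema = { required: string[]; properties: Record<string, string> };

function validateResponse(schema: Schema, body: Record<string, unknown>): string[] {
  const errors: string[] = [];
  for (const field of schema.required) {
    if (!(field in body)) errors.push(`missing required field: ${field}`);
  }
  for (const [field, type] of Object.entries(schema.properties)) {
    if (field in body && typeof body[field] !== type) {
      errors.push(`field ${field}: expected ${type}, got ${typeof body[field]}`);
    }
  }
  return errors;
}

// Roughly what a Swagger "definitions" entry declares (simplified).
const userSchema: Schema = {
  required: ["id", "name"],
  properties: { id: "string", name: "string", age: "number" },
};

console.log(validateResponse(userSchema, { id: "42", name: "Ada" })); // []
console.log(validateResponse(userSchema, { id: 42 }));                // two errors
```

In the real setup the same check runs on incoming request bodies too, which is what gave the "always returning good data" guarantee.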

This library allows choosing the Node.js/Express controller by a Swagger spec, but I think I ended up rolling my own:

https://github.com/swagger-api/swagger-node


Hi there, I'm the founder of a company called Stoplight, and we have a purpose-built solution for this very use case. You can read more about it here: https://stoplight.io/platform/scenarios.

Basically, you set up test cases for your API(s), and we automatically contract-test the inputs/outputs of the requests against your OAS specification where possible. If anything does not validate against the schemas defined in your OAS specification, the test will fail with descriptive errors. If your OAS is ever updated, those changes automatically carry over to the tests, since the tests just reference the OAS spec (rather than duplicating data from it).

A couple more things:

- You can create these tests with our visual UI, or write the underlying JSON that describes the tests by hand.

- You can run the tests inside of our UI, or install Prism (our command line runner) to run them completely outside of Stoplight (CI, terminal, etc).

- We plan to support OAS3 later in Q4 of this year.

We live and breathe API tooling and specifications. If you have any questions about process, our product, API strategy, etc, happy to chat - just shoot me an email at marc [at] stoplight.io!


What's the goal?

FWIW, an interesting inverse approach is to define routes/APIs/validation directly from Swagger specs, for which there are a few libraries in various languages.
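
As a rough sketch of that inverse approach (the spec shape is trimmed down and the handler names are invented), the spec's `operationId`s can drive route dispatch directly, so a route without a handler fails at startup instead of at request time:

```typescript
// A trimmed-down OpenAPI "paths" object acting as the routing table.
const spec: { paths: Record<string, Record<string, { operationId: string }>> } = {
  paths: {
    "/users": { get: { operationId: "listUsers" } },
    "/users/{id}": { get: { operationId: "getUser" } },
  },
};

// Handlers are looked up by operationId, so the spec stays the
// single source of truth for which routes exist.
const handlers: Record<string, (params: Record<string, string>) => unknown> = {
  listUsers: () => ["ada", "grace"],
  getUser: (params) => ({ id: params.id }),
};

type Route = { method: string; path: string; handler: (p: Record<string, string>) => unknown };

// Build the route table from the spec, failing fast on gaps.
const routes: Route[] = [];
for (const [path, ops] of Object.entries(spec.paths)) {
  for (const [method, op] of Object.entries(ops)) {
    const handler = handlers[op.operationId];
    if (!handler) throw new Error(`spec declares ${op.operationId} but no handler exists`);
    routes.push({ method, path, handler });
  }
}

console.log(routes.map((r) => `${r.method.toUpperCase()} ${r.path}`));
// -> [ 'GET /users', 'GET /users/{id}' ]
```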


Developers working with Django might look at pairing this with DRF OpenAPI:

https://github.com/limdauto/drf_openapi


swagger-codegen does this fairly well. How about server-side code generation? swagger-codegen does that too, but not so well if you are targeting Go, for example.


It's fairly easy to modify for your needs, though. Have a look at the templates: https://github.com/swagger-api/swagger-codegen/tree/master/m... You can make a copy of that template directory, adjust the templates, and pass the directory as a parameter to swagger-codegen.
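
For example (the paths below are illustrative; `-t` is what points the generator at the copied template directory):

```shell
# Copy the stock templates (e.g. the Go ones) and tweak them locally.
cp -r modules/swagger-codegen/src/main/resources/go my-go-templates

# Point swagger-codegen at the modified templates with -t.
java -jar swagger-codegen-cli.jar generate \
  -i api.yaml \
  -l go \
  -t my-go-templates \
  -o ./generated
```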


Hi, for any feedback on the Go server stub generator, please let us know via https://github.com/swagger-api/swagger-codegen/issues/new.

Also worth sharing: the Go client generator in the latest master (2.3.0) has been refactored. Please give it a try and let us know if you have any feedback.

Disclosure: I'm a top contributor to Swagger Codegen.


This reminds me of the SOAP/WSDL days.

The lesson learned from those days is that if your API is complicated enough that you need such tools, then you should really do some analysis into how you can simplify your API.


It's all about having a machine-readable definition of the API. When you have that, you can support strictly-, strongly-typed languages much better. In a language like C, Java or Rust, which would you rather have: a JSON data type that is essentially Map<String, Object>, or a struct (or class, or whatever) that actually represents the fields and types that exist?

It’s useful to be able to quickly generate a client library with fair ergonomics for such languages, rather than having to use a weakly-typed, string-heavy general-purpose HTTP library. Languages like JavaScript, Ruby and Python don’t benefit so much from things like this, but even so there are definite advantages to it.
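
In TypeScript terms, the difference looks like this (the `User` interface is a made-up stand-in for what a generator would emit from the spec, not any particular generator's output):

```typescript
// Untyped: the parsed JSON is effectively Map<String, Object>;
// typos and wrong types surface only at runtime.
const raw: Record<string, unknown> = JSON.parse('{"id": 7, "name": "ada"}');
console.log(typeof raw.id); // "number" at runtime, but unknown to the compiler

// Generated from the spec: the compiler knows the fields and types.
interface User {
  id: number;
  name: string;
  email?: string; // optional, as the spec would declare it
}

const user: User = { id: 7, name: "ada" };
// user.nmae          -> compile-time error (typo caught)
// user.id.split(",") -> compile-time error: number has no split()
console.log(user.id + 1); // 8
```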


This library uses clojure.spec with Swagger APIs to check data validity: https://github.com/metosin/compojure-api


On the contrary: the lesson learned from those days is that codegens and declarative inputs to codegens (i.e. API specs) have their place, despite that particular stack falling out of fashion.

APIs that were put out when the "RESTful" movement was at its peak were documented solely in prose, a big regression from the WSDL days. OpenAPI (or, a few years ago, Swagger and its contemporaries RAML and API Blueprint) attempts to retrofit what we've lost.

Spec-driven codegens allow shops that put out APIs to quickly bootstrap first-party clients, gaining an instant competitive advantage over similarly-positioned competitors who don't publish a first-party client.


So why not keep using SOAP and WSDL?


SOAP vs. REST is not simply XML+WSDL vs. JSON+OpenAPI. REST is resource-focused, where SOAP is operation-focused. REST takes advantage of HTTP's built-in features (verbs, caching, etc.) in ways that SOAP doesn't (SOAP is basically RPC tunneled over HTTP POSTs).

I work on an API team that is in the early stages of switching from SOAP services to REST (using Swagger/OpenAPI). Shifting mindsets from RPC-based to resource-based APIs is challenging, but very beneficial (of course, there are still use-cases for RPC-style APIs).


Ah, then I had a misconception of OpenAPI; my impression was that it was mostly used by people who wanted to use REST the way they used SOAP (using URLs and such, but still fundamentally RPC). Does that mean OpenAPI lets you model and describe a real hypermedia application, following the REST constraints? Including ignoring the URLs and focusing on media representations and their relationships?


Your API is a contract you're extending to your customers. You need to be able to specify the behavior of a given operation with enough precision that you and your customer have a good understanding of how it will work.

The problem with SOAP/WSDLs is that you break the contract every time you sneeze at the damn thing (adding a field, etc.). Swagger/OpenAPI is generally more permissive, as long as you know what you're doing.

API versioning is hard, regardless of the tool you use. Contract-first API design prioritizes predictability over fluidity. The decision on which trade-offs are correct will depend on the context.



