Postman recently started requiring a cloud account just to use my five REST requests from the scratchpad. That annoyed me so badly that I deleted that piece of cr*p immediately. Never looking back.
Then I found Bruno and fell in love. Thanks for the great work!
For me, I had a project full of test cases in Postman, and one day while I was connected to the internet it phoned home to update. After the update it said I needed an account (okay, use the work email, no big deal), but after logging in it said all my test cases and collections were gone. From there it was shell scripts with curl for me.
- chain requests, passing data from one request to another,
- add tests on every response: body, headers, certificates, etc. You can use JSONPath or XPath, for example,
- there is some syntax sugar for constructing request bodies (GraphQL bodies are annoying with curl, JSON too),
- there is some syntax sugar for retrying requests on failed asserts, delaying requests, etc.
Under the hood, Hurl uses libcurl, so a lot of curl's options are exposed through Hurl (and we benefit from a lot of curl features like HTTP/3 and IPv4/IPv6, and from curl's speed and reliability, of course!).
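For the curious, a Hurl file exercising the chaining and assert features above looks roughly like this (hypothetical endpoints, syntax from memory; check the docs for the exact grammar):

```hurl
# Log in and capture the token from the JSON body
POST https://api.example.com/login
{
  "user": "bob",
  "password": "secret"
}
HTTP 200
[Captures]
token: jsonpath "$.token"

# Reuse the captured token, then assert on the response
GET https://api.example.com/items
Authorization: Bearer {{token}}
HTTP 200
[Asserts]
header "Content-Type" contains "json"
jsonpath "$.items" count > 0
```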
From the first three lines of the linked-to paragraph:
> Hurl can run HTTP requests but can also be used to test HTTP responses.
> Different types of queries and predicates are supported, from XPath and JSONPath on body response, to assert on status code and response headers.
Insomnia did the same. There was an option to migrate to an offline account but the migration did not work.
So I turned to Bruno.
I have been happy with it, but there are some strange issues. Sometimes it does not save settings when I press Ctrl+S. I still don't know when this happens, but I've lost some work a couple of times because of it.
I am using Insomnia but haven't been prompted for account setup etc (switched to it from Postman after the cloud nonsense).
I hope Bruno gains support for separate environments and environment-level variables soon, because the ability to set global variables along with environment-specific variables/overrides makes the setup much more manageable (I think there is already a feature request open for this).
PS: if this is possible via some sort of external JSON/script, I am more than happy to set that up. I am just not familiar enough with Bruno tips/tricks :(
Same here regarding Insomnia: all my requests have disappeared. I was searching for an alternative, and this is a godsend :) somehow I missed it when searching for alternatives.
We used Postman but it got forbidden in our org for that reason, so no more.
Can't say I really miss it. I personally prefer just using a Jupyter notebook for these kinds of tasks. With a custom tool like this it becomes a dead end with the data. Maybe you want to decode it if it is in a binary format. Or you want to plot some basic stats?
This has always been my approach really. Use curl for basic stuff, use a full featured repl for deeper exploration. I use Ruby but same idea.
I never quite saw enough value in tools like postman. Usually either what you are doing is trivial enough for vanilla curl or complex enough to warrant reaching for a general purpose programming language.
I can relate to this. I've tried to use Postman and Insomnia in the past, but the UI is pretty complicated with a lot of domain specific terms.
Instead, I just hacked together a small Python library, called all the APIs from there, and pushed to Git. Everyone on my team understands it, I have 100% control, and no cloud needed.
> With a custom tool like this it becomes a dead end with the data.
Not sure if you're referring to Postman or Bruno. The biggest purported benefit of Bruno over Postman is that it saves API request collection files in simple, human-readable text files that are designed to be committed to a source control repo and easily shared, in a way that's not particularly tied to the Bruno app.
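For reference, a single request in Bruno is one `.bru` file in the collection folder, which is why the files diff and review cleanly in git. A sketch from memory (field names may differ slightly between Bruno versions):

```
meta {
  name: Get User
  type: http
  seq: 1
}

get {
  url: {{baseUrl}}/users/1
}

headers {
  Accept: application/json
}
```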
You are putting up all these hypotheticals as if all the info isn't easily available on Bruno's site, and then declare "I don't have time to watch a 15-minute video". Hint: YouTube at 2x speed and transcripts are your friends.
I didn't know about Bruno until I saw this post on HN.
Why don't you just rtfm? I'm not interested in "convincing" you, you're free to proudly declare that you refuse to read the docs, and you have an odd definition of "rant".
Searching for conflicts probably (I don't know you, so I might be getting the wrong impression from this thread; sorry if that's the case) takes up more of your time than whatever you save by not doing whatever you deem "inefficient time usage".
You don’t care enough to read the manual, but clearly care enough to ask a bunch of questions that could have easily been answered by yourself with a modicum of effort. The home page even has an example of the file format, is even opening the page too hard of an ask?
Did you just want people to do all the work for you to “convince” a random stranger? In what world is that reasonable?
I've always preferred Insomnia over Postman. The interface fits me better. But now they also started requiring cloud login, so maybe I'll give Bruno a try if it degrades further
But then you have to deal with this annoying “feature” of it deleting your data when it feels like it. Can’t tell you how much work I’ve lost to this bug.
I’ve just switched to using hurl. Can’t be bothered with any of this nonsense with all these GUI programs
Crunch of capitalism. Anything that provides value can be turned into something that generates money. Not many people will walk away from money out of the goodness of their heart.
> Anything that provides value can be turned into something that generates money.
I don't have much of a problem with that. Heck, looks like Bruno has multiple tiers that they sell the product for, and it looks like a very reasonable way to make money.
The problem is this idea of "infinite growth" that modern markets demand, which results in the eventual enshittification of nearly all tech products these days. Postman could have had a very nice, reasonable business, but once they took an obscene amount of funding (nearly half a BILLION dollars, for an API client tool!!!) their fate was sealed. Beyond taking too much money, Postman clearly thought they had a much bigger moat than they actually do.
For internal APIs, check out Aspen's private-by-default approach & AI powered testing to streamline your workflow. No cloud storage & built-in JSON editing.
I moved to k6 - a tool meant for load tests, but flexible and great for any kind of API test. It's great for all kinds of scenarios, since the tests are written in JavaScript.
Postman dug its own grave after selling itself out for VC money. The “File over app” philosophy is a direction we should be supporting in the post-ZIRP VC-money world.
We recently moved over to bruno (from postman) for integration tests, and have had a great experience so far!
Pros:
- DevEx is great, the experience moving between writing tests through text vs the client is seamless (allows different expertise to collaborate more easily)
- Fast and has all the basic features you’d expect
- Offline first
- Continuous improvements being made
Cons:
- Somehow, if a test fails to connect to the server, it counts as a passed test. So watch out for that one! I really hope it gets fixed soon!!
- Would be great to be able to dump the variables after a run through CLI
- Failed outputs from assertions and expects could be a bit more verbose, but that's most likely the result of a dependency, I assume
IDEs such as JetBrains Rider and Visual Studio offer a .http file feature for making HTTP requests, reducing the necessity of tools like Postman and Bruno, particularly for basic scenarios. Regardless, I'm happy to see Bruno as an alternative. I don't often need to test HTTP APIs but when I tried Postman in the past signing up was just too painful when I only needed to use it for a few basic calls.
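For anyone who hasn't seen the format, a `.http` file is just requests separated by `###` markers, e.g. (hypothetical endpoints):

```http
### List users
GET https://api.example.com/users
Accept: application/json

### Create a user
POST https://api.example.com/users
Content-Type: application/json

{
  "name": "Ada"
}
```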
I myself use Paw [0] because it's native to macOS, but I'm a little worried about its longevity, as it is supported by a SaaS business. But so far it's been great for documenting APIs for my personal projects.
The extensions, the ability to write your own extensions, and chaining request/response values are crazy good. When I switched from macOS to a Linux box last year, paw.cloud (RapidAPI) was one of the few blockers for me - that's how good the software is. Not to mention the Keychain integration for credential encryption was nice.
Postman is really bad; nobody should use it. The same goes for similar solutions, even this one.
Quickly turning into abandonware. I tried moving my paw library from one computer to another and it crashes opening the file. Had to start from scratch.
Envious? I’m hoping for a native GUI application for Linux.
Electron looks ugly, it doesn’t integrate, fails to handle HiDPI usually, in best case it eats a ridiculous amount of memory (factor 5x compared to native) and in worst case it has security issues due to Blink and lot of JS.
The person being replied to was talking about technical aspects similar between the two (functionality like HiDPI and performance) while you're mentioning socially-oriented differences. I would say both statements are true but that what the authors of each statement value is different.
Writing that it is just like Flash is a statement that Electron should be dropped and never used by anyone.
I disagree with that, and I provided my argument to show that whatever technical inconveniences the parent poster noted are not enough to say “Electron should be trashed and razed out”.
Bruno has been taking root in our organization (displacing postman).
It was nice to find out about the for-pay golden edition in this thread to support Bruno’s efforts. The feature split for free vs Individuals vs Organizations seems well done.
As feedback to the Bruno team:
for Individuals - I most value the gRPC/websocket, then load-testing
for Organizations - My org would most value central license mgmt
Can someone explain what Postman or Bruno is for? I know it's something for interacting with APIs, but why would I use it? I interact with APIs a lot with curl or wrappers in my languages but have never really needed anything else.
For me it's more convenient than curl for three reasons mainly:
1. Easier to organize collections of requests in a visual hierarchy
2. Environments mean you can use the same collection to easily execute against local, dev, staging, production, etc.
3. Pre- and post-execution scripts mean you can programmatically extract values and chain into other requests (think grabbing an access token from an oauth request then using that token for an authenticated request)
It's basically just convenience features, nothing you can't get with other tools.
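For example, point 3 boils down to "read a value out of one response and splice it into the next request". A sketch of that pattern in plain Python, with a canned JSON body standing in for the real auth call (the field names and URL are made up for illustration):

```python
import json

# Canned auth response standing in for a real token endpoint
# (assumption: the endpoint returns JSON with an "access_token" field).
auth_response_body = json.dumps({"access_token": "abc123", "expires_in": 3600})

def build_authed_request(auth_body: str, url: str) -> dict:
    """Extract the token from one response and splice it into the next request."""
    token = json.loads(auth_body)["access_token"]
    return {
        "method": "GET",
        "url": url,
        "headers": {"Authorization": f"Bearer {token}"},
    }

req = build_authed_request(auth_response_body, "https://api.example.com/users")
print(req["headers"]["Authorization"])  # Bearer abc123
```

Tools like Postman and Bruno just wrap this extract-and-reuse step in pre/post-request scripts so you don't have to glue it together yourself.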
One more thing I don't see mentioned is that you can share requests with your team easily, so if you worked on an API integration, you can document it, share it with your team, and when the next time someone else needs to fix something, they can find something that worked at some point and has all the required fields.
Of course, you could also just check these requests into version control as curl commands, so if your team has the technical know-how, it's about the same.
Or even better, you write some tests in your language to make these requests, then you have an integration test, too.
As someone who uses curl and postman regularly, both tools have their places: I've found curl most useful for quick ad-hoc requests, or if I need to figure out why my service is no longer working. Postman I've found most useful to create a library of requests that are available on-hand: If I need to call services but I don't want to have to remember what the exact URL is or what the exact payload is.
To put it simply: Postman is for everything but what curl is for. Sure it also performs the actual request somewhere under the hood but that’s mostly irrelevant. Having a single integrated user interface (it could also be a text UI) to craft a request, sometimes sending JSON, sometimes a file, then sending it to inspect the result, then modifying the request then doing it again is very powerful. Not to mention OAuth/OIDC support and the like.
Sometimes, you have to find out how an API works, exactly. Sometimes you want to test your own APIs in ways the regular clients do not allow.
A lot of people in tech/tech-adjacent roles can't use the CLI and need an easier alternative. Also, instead of keeping a huge recursive directory of .txt/.md files full of curl commands, programs like these bundle request workflows into 'collections', etc.
For me a big usecase is the ability to save specific requests and categorize / name them. It's a good way to document test data. If someone creates a feature which I want to test or debug I'd have to dig into the database to find which parameters I'd need to use, or I just find the right request in the tool.
My use case (covered by postman but not Bruno) is to test a code base where unit testing is not available.
We have multiple environments with multiple parallel versions (think like dev/staging/prod and current/legacy), these deployments mostly have the same API with slightly different urls and credentials
We use multiple environments to easily switch between the various versions both for one-off operations (like a clear cache call that only needs to be called few times a year in response to external actions) and to manually test features.
I can see why not everybody would have this use case
IIRC Bruno does not have enough environment/variable/pre- and post-request scripting support
Bruno/Postman is basically curl + a filesystem + git + jq + make.
If you're really comfortable with all those tools, you won't understand Postman because you'll say "why don't people just chain all these tools together?". It's like the famous hn comment on dropbox, "why don't people just use rsync"?
It's technically true, and if you're adept at those tools, you should probably use them instead of Postman. But yea, it's a useful product for a bigger set of people.
I had the same question. My current workflow when deploying a REST API is to write my own thin wrapper in Python and publish that so people can use it. I don’t know if Postman/Bruno saves me from having to write a wrapper. I also don’t understand the deal with “collections”. Maybe Postman/Bruno are good for creating a test suite?
It can be used for API testing: collections, token handling, etc. Mostly for API testing; collections can be shared among team members and source-controlled.
I am using Bruno on all my REST projects and the entire team is more than happy. We have a Bruno file for most endpoints demonstrating basic usage, and as soon as something is problematic, someone creates a Bruno file in the repo and links to it so others can interactively discover what the problem is about. Since it's cross-platform and doesn't require any kind of onboarding, it's a joy to use, and nobody has complained. Previously, people used Postman, Thunder, curl, and whatnot, but now everybody is on the same page. People created all sorts of scripts, such as automatic token refresh, so you jump straight to the action. The fact that I can edit it in VS Code is also amazing. Thank you.
Funny. I just found Bruno two weeks ago after getting fed up with Insomnia and am loving it. It feels like what Postman and Insomnia were when they started. Simple and to the point.
There are a few minor features I miss (being able to bulk edit headers is the main one), but overall I highly recommend it.
Personally, it was Insomnia shifting to requiring an account and promoting "enterprise" features that clutter the experience and get in the way of me doing my job.
From there, I did research and found `insomnium` and a coworker found `bruno`. Bruno aligned closer to what I value, so now I use that.
Oh wow, I never noticed. It appears I’m running a ~1 year old version (2023.4.0) that just stopped auto-updating. It doesn’t have any restrictions and doesn’t require an account.
Generating revenue through the creation of value via innovation is great; using dark ux patterns to extract revenue from a captive user base, not so much.
I think in cases like Postman, people don’t like it when features they previously used for free suddenly come with a price tag; rather than innovate and create new value that is worth paying for, some companies are opting to take away features users previously had. Yes, they own/control the software and want to make money, which is fine, but they created a dissatisfying user experience as they tried to coercively move users into their cloud offering, whether users wanted it or not, and regardless of whether it provided user value. After doing that, they shouldn’t be surprised that a portion of their user base decides to ditch the product and complain about the experience. This is compounded by the fact that they leveraged the contributions of an open source community, and at the same time there are other options freely available that aren’t locked into a proprietary cloud login.
There is no universe where I see myself paying for postman. It was a bloated, hobbling mess that is now a chundering monster that has to somehow make money to satisfy VCs who invested in a glorified cURL interface and have to stuff it with features to try and entice enterprises. It is everything wrong with software and I welcome any lightweight alternatives
This. The UI was one of the most appalling, unintuitive messes I've ever encountered on any software product.
The game was up for Postman when I realised I could replicate all its functionality I needed by asking ChatGPT to write me some Perl scripts that made heavy use of LWP.
This. Proprietary software tends to be very inflexible, often low quality, and difficult to integrate. Fighting with these often causes more work than just doing from scratch or adapting open source code.
Startups are often all about disrupting an industry by lowering the cost of a product. If you can lower it to zero that demonstrates the incumbent businesses are obviously not providing a valuable service that's worth paying for.
Your comment does not make sense. If someone offers a service for free and people start to use it, that does not mean the old incumbent service that costs money is not providing value. It shows both provide value, since people were using both. It just shows that people prefer free over paid for the same service.
The new entrant may have lower internal costs or may subsidise the price initially, but that does not mean one provides more value than the other.
In the Bruno case, it sounds nice with open source, free for the basic features, etc., but it's not much different from any other freemium offering. They will also try to make money on premium features and other related tools. Their own development time will be spent on those features, and they will outsource a lot of the development of the base tool to the open source community.
Great to see that most people are smart enough not to apply the same model to every problem in their life.
Productivity tools better to be lean, simple, free & open source in some cases. This is one of those.
Of course, you can keep throwing money at bulky software that continuously makes things slower and more complicated, just because you never have to worry about money thanks to VC funding. I’m old enough not to buy that this is how startups operate in general, though.
I think one issue is that in organizations it takes effort to get new paid software approved and to start paying for it, even if the money is very small. Like an engineer who costs $80-150k/year needing a couple of tools that cost €50-100/year apiece.
Usually in an organization it is much easier to just increase spend with already-approved vendors. Increase the AWS spend by $50/year? Nobody is going to ask you about it. Or start using some tool that does not involve payments - nobody even knows you are using it.
It would be great if we could change that. A world where independent software developers could make a decent living by selling small quality tools would be a better place.
We have all been fucked over by VC management, especially from YC-funded product bloat and rot. Free means it will not get worse by the developers adding micro transactions and subscriptions.
Hey everyone, this is Anoop - creator of Bruno.
Happy to see Bruno at the top of HN!
I will try to address some common questions in this comment.
> Well based on historical experience with Postman and Insomnia most probably Bruno will go the same way once they get enough users hooked in.
Especially once a VC gets into the fold.
We will never take VC funding. We have received around 10 inbound reach-outs from VCs to date and have declined all of them. We will remain independent, and I have written about it in detail here: https://www.usebruno.com/blog/bootstrapping
> I didn't stick with Bruno. I think it was due to not having an equivalent to Postman's pre-request scripts.
Bruno has come a long way; we now support pre-request scripts and a lot more.
> But can it handle oauth2? I had to write a httpie script recently just to test an oauth2 api.
We have released OAuth2 support; some rough edges are being polished.
> Good thing it's open source. Money being involved, I don't have long term hopes for it's openness.
I understand this is a hard problem. We are fully bootstrapped and independent. We earn money via selling the Golden Edition. We will build more developer products in the long term, and the goal is to make even the golden edition features also open source in the future. I am committed to this cause.
> History goes in circles. New API client appears, adds features, userbase grows. Forced login is added, users are angry and look for alternative. New API client appears.
I have felt this pain. That's one of the reasons why I declined VC funding and chose to remain independent. Having seen 10 years of this cycle, it's enough. We don't want to repeat the same saga.
Some good links where I have discussed open source, freedom, and monetization:
Thank you for the information and the commitment to open source. I know the Gold edition isn't open source under an OSI license, but is it source available for people who purchase it?
I'm strongly considering buying it because it's not a subscription, and is open source. As a general rule, I don't buy proprietary software anymore after having been burned in the past. I have no issues with open source software that has proprietary features as a way to make the project sustainable, if they are source available for paying customers, and personal modifications are allowed. Obviously redistributing any of that code, even my own modifications, would not be acceptable and would violate the license.
You mentioned that the goal is to make the Gold edition features open source eventually, so would you consider going source available?
Good. Postman and Insomnia have shot themselves in the foot with an overly complex UI, a mess of a format for managing your collection of API calls, a broken sync process, and the loss of the simple ability to run everything offline without a login, etc. I would also add that streaming (SSE) support would be great.
This looks amazing. Postman is a great tool overall, but definitely seems to have gotten clunky and overly engineered. I don't need so many damn features. 99% of use cases are just hit an api. Will be trying this out immediately!
Very similar idea to Bruno: everything is configured in text, which I always find more productive than a full-blown GUI where I need to tab from text field to text field to get anything done.
What I want, is an API explorer for CLI. A tool that takes an OpenAPI Spec and then converts it into a set of examples and REST endpoint calls I can just fire off or mess with. Kinda like how Swagger does for their web-based docs, but for CLI, and bonus points if you can generate some examples.
I feel like this should be possible with today's tech, but am coming up dry finding anything while searching.
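A bare-bones version of the idea does fit in a few lines. This sketch walks a toy, inline OpenAPI spec and prints a curl template per operation; a real tool would also need parameter substitution, request bodies, and auth on top (the spec and URLs here are made up):

```python
# Toy OpenAPI fragment standing in for a real spec loaded from YAML/JSON.
spec = {
    "servers": [{"url": "https://api.example.com"}],
    "paths": {
        "/users": {
            "get": {"summary": "List users"},
            "post": {"summary": "Create a user"},
        },
        "/users/{id}": {
            "get": {"summary": "Fetch one user"},
        },
    },
}

def curl_templates(spec):
    """Yield a commented curl command template for each operation in the spec."""
    base = spec.get("servers", [{"url": ""}])[0]["url"]
    for path, ops in spec["paths"].items():
        for method, op in ops.items():
            yield f"# {op.get('summary', '')}\ncurl -X {method.upper()} '{base}{path}'"

for cmd in curl_templates(spec):
    print(cmd)
```

Generating example bodies from the spec's schemas would be the harder (and more valuable) half of the tool.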
Have you tried using an AI-powered HTTP client for exploring APIs? It can take an OpenAPI spec and generate code snippets based on your requests, letting you test and mess around with endpoints directly. Sounds pretty similar to what you described! Just sayin', there might be a tool out there that can streamline your workflow
I am currently looking for a solution to run automated tests on a sql website generator I am working on ( https://sql.ophir.dev )
I wanted to use hurl (https://hurl.dev/), but Bruno's UI seems useful while developing the tests... Has anyone tried both? Which is better for automated testing, including when the response type is HTML and not JSON?
History goes in circles.
New API client appears, adds features, userbase grows.
Forced login is added, users are angry and look for alternative.
New API client appears...
One thing I like doing when working with APIs is to have an echo server at hand.
I.e. something that I can query via curl or via a network library that I’m using, and see in response what kind of request it actually received. It helps me verify that I’m making correct requests (and not misusing curl or a network library).
Currently I google for that and use the first online result that comes up. Is there an open source local equivalent, or does Bruno offer some solution for that?
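A minimal local equivalent is only a few lines of stdlib Python. This is just a sketch (no TLS, no chunked bodies), but it covers the "what did my client actually send" use case: it replies with the method, path, headers, and body it received.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class EchoHandler(BaseHTTPRequestHandler):
    """Echoes back the request it received as a JSON document."""

    def _echo(self):
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length).decode("utf-8", "replace")
        payload = json.dumps({
            "method": self.command,
            "path": self.path,
            "headers": dict(self.headers),
            "body": body,
        }).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

    # Same behaviour for the common verbs.
    do_GET = do_POST = do_PUT = do_DELETE = _echo

    def log_message(self, *args):  # keep the console quiet
        pass

def start_echo_server(port: int = 0) -> HTTPServer:
    """Start the echo server on 127.0.0.1 in a background thread.

    port=0 asks the OS for a free port; read it back from server_address.
    """
    server = HTTPServer(("127.0.0.1", port), EchoHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

Then `curl -d 'hi' http://127.0.0.1:<port>/whatever` shows you exactly what your client sent, no third-party service involved.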
I've been using https://httpbin.org/ to do some client testing and so far it has been great. They provide a Docker image, which makes it easy to run locally.
> One thing I like doing when working with APIs is to have an echo server at hand.
> I.e. something that I can query via curl or via a network library that I’m using, and see in response what kind of request it actually received. It helps me verify that I’m making correct requests (and not misusing curl or a network library).
Did you consider using a proxy during development? A man-in-the-middle server that does nothing but relay and record all communications between client and server?
Bruno is awesome. The import-from-Postman thing is really what makes the difference - it makes adoption instant (because yes, everyone is still running on Postman).
Regarding long-term strategy, they seem to have a valid (non-subscription-based) model where you can get some very nice premium features for a one-off licence price (which is the kind of business model we need to support, instead of the madness we have today where everything is a subscription).
I needed something like this today - was going to download Insomnia but I happened to check HN.
I tried it out; it's exactly what I wanted. Postman, as echoed in previous comments, has gotten out of control with the upsell and the confusing UI.
I bought the pre-order, seems like a good deal at $9, that's nearly the cost of a latte around here and if this works out I'll get a lot more use out of it than a latte.
Been using Bruno for a while now and it's done what I need, I'm not doing any thing crazy just testing endpoints with maybe a small bit of scripting afterwards. Very pleasant to use.
I've replaced Postman with Bruno (desktop app); it works great so far! It's nice to put the collection folder in git so I can collaborate with others and commit the changes to the repo.
Does anyone know of any good guides to getting the most out of these kinds of tools? I’m mostly interested in Postman since that’s what we have to use at work.
I make requests with it, have things organized in collections, and use a variable for JWT handling but that’s as fancy as I’ve gotten and I know these tools have a lot more depth than I’m using.
The complaints about Postman collection files being hostile to Git collaboration are valid, but I will just say: we managed to cobble together a reasonable workflow, persisting our Postman collections in source control, by writing a utility that strips out all the GUIDs that Postman needlessly changes every time you do a collection export. We use Newman to run the collections in our CI pipelines, and as the main way to run tests locally.
It’s painful having to import/export the collection from a file every time you want to modify the collection or debug a test, though, and the JSON files are still not entirely diffable/code-reviewable even with the GUIDs stripped. But it works - you can collaborate as a team with Postman without using their cloud.
But the idea of switching to a tool that actually wants you to work that way is definitely tempting.
Thank you for the patience. We have been occupied with adding File Uploads and OAuth2 support in Feb and prepping for the Golden Edition release. This issue is a priority. Hoping to get the PR merged soon.
Good to see some more open source competition in this space. SoapUI had most of those features some 15 years ago. I never understood why Postman became so popular, my current team even throws money at them.
I found it uncomfortable to watch it making a request once someone pointed out to me that it highlighted letters in the order SOPA when waiting for a response
Looks very nice; I've also given up with Postman. gRPC support would probably get Bruno into my daily workflow immediately. I spent a couple hours in Postman trying to get gRPC to work and could not- the .proto files were never used successfully, and I ended up having better experiences with Kreya [0] and grpcui [1].
Bruno is great. I'm in the process of moving my team to it. CLI, GUI, human-readable files. They are hitting all the right points. And it's open source, so I don't care if they close up later down the line. It can always be forked, like insomnium.
I do admit it is rough around the edges (bugs and a bunch of missing features) but there is a lot of momentum right now and devs are working hard. Not long before postman parity.
This looks great, and less bloated than Postman but after a few seconds of testing the Linux (Snap) version, I noticed the system file browser opens with what is probably an invalid font (all characters appear as squares).
As well, it would be nice to be able to import my rather large Postman collections (and vice versa - provide a collection from Bruno to a Postman user).
Looking forward to when I can switch to this full time.
The Postman snap works, and works well. It takes about 5-7 seconds to start up. Honestly, startup time doesn't matter too much for me. I was there when tapes transitioned to floppies. We learned patience then.
But it also took about 5-7 seconds to see that the file browser is unusable in Bruno, so I backed out. I am looking forward to returning, though, when it's fixed and I can create a collection.
Interesting talk about Bruno by the founder! It includes the naming origin (very nice, btw), the philosophy, a brief demo, and an overview of the architecture, with a couple of interesting surprises at the end. It's a good watch!
It looks great. Postman always loses me because I usually only need simple requests, yet I have to go through a big structure. As a result, I only ever edit one request over and over again, the one I have configured correctly.
The key thing about Postman is that I was able to configure my own script to refresh the API token. For the internal API, we have a short-lived "access_token" (~1 minute) and a long-lived "refresh_token" (~1 day) based on the user's email address and password. When renewing the token, you receive information about its lifespan. You cannot create a new token from scratch every time because you will be rate limited. I put all that logic in a Postman hook script.
Making these HTTP requests in Curl – due to token refreshing – is painful.
$ eval $(stat -s refresh_token) # set some stat vars; we will use the change time st_ctime (BSD stat)
$ if [ $(($(date '+%s') - st_ctime)) -gt 86400 ]; then rm refresh_token; fi
$ if [ ! -f refresh_token ]; then ./get_refresh_token.sh; fi
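For what it's worth, the same expiry logic is about as short in Python. A sketch: it keys off the file's mtime rather than ctime, and takes the fetch step (the equivalent of ./get_refresh_token.sh) as a caller-supplied callable.

```python
import os
import time

TOKEN_FILE = "refresh_token"   # cached token path, matching the shell version
MAX_AGE = 86400                # one day, matching the shell version

def ensure_refresh_token(fetch):
    """Drop the cached token once it is older than MAX_AGE, then fetch if missing.

    `fetch` is a caller-supplied callable that writes a fresh token file.
    """
    if os.path.exists(TOKEN_FILE) and time.time() - os.path.getmtime(TOKEN_FILE) > MAX_AGE:
        os.remove(TOKEN_FILE)
    if not os.path.exists(TOKEN_FILE):
        fetch()
```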
I just switched to Bruno because apparently if you have a secret in the “initial values” column it uploads it to their cloud automatically.
It is cool so far. I like that it is more git friendly. I have noticed an increase in request time which is interesting. I assume because postman caches more stuff.
Damn, I wasn't sure if I cared about another postman/insomnia like tool, but I saw the cute dog logo, and it sold the tool to me. Maybe I am just silly, but that got me. Congrats to the team for developing it!
> What if the folders and requests within the API collection could be mapped to folders and files on the filesystem? … Ultimately, I realized that file-based API collections were the future. Unlike other tools, such as Postman or Insomnia, Bruno would allow users to maintain their collections in a source code repository using Git for collaboration.
Can someone help me understand what is meant by this "API can be mapped to folders & files" comment?
probably thinking of mapping the API routes to folder structure? like customer/transactions/{id} mapped to similar folder structure with a file containing the source for the API call
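As a sketch, that mapping could look something like this on disk (names here are illustrative; Bruno stores one request per .bru file, with collection metadata in a bruno.json at the root):

```
customer/
  bruno.json                  # collection metadata
  transactions/
    list-transactions.bru     # GET /customer/transactions
    get-transaction.bru       # GET /customer/transactions/{id}
```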
Funny, I literally wrote an API tester a week ago for my job. It doesn't have a pretty syntax (in fact, it is the opposite of pretty; tests must be written in XML), but it does understand our authentication flow, and it is able to assert not only the status codes and the returned JSON, but also the contents of Excel files, since we do lots of reporting.
I haven't seen anything else able to compare Excel files. Also, test frameworks are usually aimed at developers (unsurprisingly), not at testers who might not be comfortable writing scripts (looking at you, Katalon).
Insomnia and Postman import the collection with the host as a variable, centrally editable, while Bruno imported it resolved with no variable, so I have to edit every one of my endpoints whenever I change the host, which happens often in our setup.
My employer just moved from Postman to Bruno. So far, no major complaints or issues. The switch was prompted by the forced move to the cloud; we don't want credentials to leak.
Very tempting! But, will there be a lifetime version/license? Would rather pay upfront instead of ending up potentially not really using it much for 2 years.
It says "one time payment" but then it says "2 years of updates".
I interpreted that as having to pay a second time two years into the future if I will want another two years of updates then. A bit misleading.
It's similar to a lot of pricing plans out there. For example I purchased Home Designer Pro 2022 a couple years ago for $450 and I can use the HDP 2022 forever. They have, of course, HDP 2024 but I can't use that one. I can upgrade for an upgrade price. However I don't need to upgrade and am happy to just use 2022 for the rest of time.
I’ve been happy with Postman, but it has gotten ridiculous lately, and I would ditch it in a heartbeat. It’s so painfully obvious that Postman has turned into a money-generating corpse. A good API testing tool isn’t rocket science, go forth and eat their lunch :) $9 for a perpetual license is a pretty damn good deal, and a total slap in the face to Postman and Insomnia, you have my pre-order!
Perpetual license - nice. I do not buy software subscriptions for my development tools. Only perpetual. In cases like Postman - sure I'll take free version but I was holding off from doing anything serious with it.
All data local only - even nicer.
Ported a couple of requests from Postman. All works.
Congrats. Unless I hit some stones during testing, I will be buying a license when available.
Most likely Postman will go to the trash bin very soon.
I tried Bruno a couple of days ago, and it looks good so far. I am waiting for the environment variable interpolation in body fix [1] to merge before switching over to it.
While we are at it, is there a nice tool to document (foreign) APIs in a collaborative way? Maybe using a markup language? Shared via Git? There are so many undocumented APIs out there that I use daily, and it would be good to have somewhere to put my annotations.
I'm tired of Postman freezing when sending JSONs over a MB in size. I had to purge it from the system a couple of times because the tab wouldn't even load on startup, making the whole thing unresponsive.
I like this a lot. Happy to support any project willing to reject the usual “success story” of selling out to VCs and turning your product into subscription-based SaaS junk.
Yes, Bruno is far better than Postman in many aspects. One of them is that it doesn't rely on a third-party proprietary server, which provides more security and more control over data.
I’m the principal maintainer at work of a couple hundred bash scripts that use httpie and jq. I don’t have a huge appetite to redo that work… but this certainly seems nicer!
Can someone help me understand the difference between the .bru file and an OpenAPI spec (Swagger), aren't they representing the same thing with different syntax?
I would love to see more UI done in this style. That is to say treating your business logic as an API and then exposing it publicly to clients on different platforms.
MQTT is in the pipeline? That’s gonna be a huge deal for those of us working on IoT projects that also use REST for service data. I’ll be watching this for sure.
I just started to use the intellij built-in http client for my quick and dirty curl debugs and I gotta say it's pretty good for my limited needs so far.
We recently adopted Bruno at our company and I'm very happy with it. Versioning your requests with Git is great, and the client is really nice as well.
I've been looking for an alternative to Postman since its enshittification. Postman does not even work anymore if it can't reach the internet. This looks actually useful and seems to have support for OAuth2 (which is the main reason I use a separate app instead of python/curl + sh etc).
I looked into using Bruno ever since Postman enshittened the free client. I ended up just installing the version of Postman from before collections were removed and ignoring the prompt to update.
I'm struggling to remember the reason I didn't stick with Bruno. I think it was due to not having an equivalent to Postman's pre-request scripts. If that could be added, I'd certainly give Bruno another try.
Edit: Another comment has mentioned that Bruno now has (or maybe always had and I just didn't see) pre-request scripts.
Looks very good! Gonna import my Postman collection and give it a try. My collections in Postman keep disappearing and reappearing since the required account update happened. Glad there is an alternative now!
Postman/Bruno/Insomnia all suffer from the 'curse' of web design making its way into professional tools, leading to lower information density and more 'white space'.
Feeling so strongly on the matter is why I started a project that has unfortunately been comatose for the past year: https://nokori.surge.sh/
The design isn't there yet, but I'm aiming for a professional-looking, information-dense UI.
How does it compare to the rather resource-hogging Postman? My go-to setup for quick APIs was curl with fish; as Postman grew it became so bloated I'd rather stick to my curl.
I just finished watching the introductory video and found the whole concept of Postman and Bruno disappointing. It's baffling that one has to dive into the code to unearth the appropriate routes—a completely unnecessary and time-consuming endeavour. This process should be automated, ensuring that documentation and code are always perfectly aligned, eliminating any out-of-sync issues. The endpoints themselves should be directly accessible from automatically generated documentation using tools like Swagger. This isn't just a convenience; it's a necessity for efficient and effective development.
So, what is the utility of tools like Postman, Bruno and others when one has automatically generated docs with Swagger with which one can interact? As an example, you can check my current project where I have set docs like that (still need to add an endpoint field): https://peacefounder.org/PeaceFounder.jl/dev/schema
Imagine you have a scenario where you need to get a token first, then with GET retrieve the id of an item, then with POST create a new entity related to that id. And then you need to call another microservice with the id created from the POST request.
How are you going to do that in Swagger?
And the more serious question is: how are you going to force developers to keep Swagger up to date so I can execute such a scenario?
> Imagine you have a scenario where you need to get a token first, then with GET retrieve the id of an item, then with POST create a new entity related to that id. And then you need to call another microservice with the id created from the POST request.
This is a good explanation. I would however be inclined to do that within a script.
> And the more serious question is: how are you going to force developers to keep Swagger up to date so I can execute such a scenario?
The Swagger spec is created automatically from the routes themselves, so it is always up to date. To maintain an endpoint, an integration test can be written which executes the scenario.
This would complete the e2e DX if you could generate SDKs from .bru. Even if it's through an OpenAPI export, I'm sure it will happen eventually.
I'm sure Bruno is still in the 1st phase of enshittification, but at least with local .bru files we can ensure long-term maintainability.
On the other hand, devs are most likely the stingiest individuals on the planet and expect all of their tools to be free. So I respect the team for even releasing it open source in the first place.
I just installed Bruno. I'm a bit disappointed that it takes so many steps to make a request. Create a collection, name it, and choose a location. Create a request. Name it. Why?
Well this is fun to see. After Postman deleted my local data after I declined a cloud account, I started working on my own tool: https://github.com/EvWilson/sqump
Some similar ideas - actually treats the file system as authoritative, runs locally, can share collections via source control with teammates. Difference in this case is that I used Lua as a lightweight scripting layer that I gave all my necessary tools to. So now I have a HTTP toolkit, and some for Kafka as well (which I use a good bit). I’ve been able to use it to replace all my API testing and development, as well as perform some more involved migrations and some dashboard-like actions (e.g. can list out resources and then check failures for each of their IDs).
It’s also just a single binary with the web UI and CLI bundled in, which works more for me. Still early days for the little tool, but hoping it could be helpful for someone else.
A layoff happened, and we didn't yet have our Postman software in our list of services to remove employees from. This is not Postman's fault.
One person had "deleted" all his collections and workspaces after the layoff to clear his laptop of all things related to our company. After we got an email from Postman saying our workspaces were deleted, I removed the laid off users. Since I removed the laid off user, the "trash bin" associated with them was also deleted. Postman support restored all the collections but the "environment" was gone. Which was all of our QA test keys, etc...
Our Postman collections are still in shambles after that, and we don't have any employees to manage it anymore. While I totally don't hold Postman accountable - there is definitely a reason why "no-cloud" is a good way to go with these kinds of tools.
It makes no sense in the first place for such a tool to even need a login functionality and cloud saves...
What's really needed to store information about a few http requests? Maybe a few kilobytes. I never understood it and I particularly don't understand how any company could fall for that. If they instead invested in teaching their engineers how to use curl even that would have paid off more.
The benefit of using postman is that you can open the app, see your (shared) collections, easily change the params and hit send. Can curl be used like that?
Of course you can. You can use any tool that lets you write down commands, run it, and edit it. Shells, editors, interactive notebooks like Org Mode, etc. The beauty is that it's just text that you can copy and paste between your tool of choice. You're not locked in to a single tool.
It's not very fun to run the auth call, then copy and paste the access token to the next call, and have to update all of your curl cmds all the time... Even if you use env variables, that's a horrible way to use env variables.
You’re making the case for automation, which happens to be something the shell excels at. Use unexported shell variables or command substitution (e.g., “$(pbpaste)”). Directly use the result of the auth call without going through the clipboard if possible. Create a shell script if shell history isn’t enough. Use interactive notebooks if you need something more advanced. The possibilities are infinite.
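For instance, the token hand-off can be a single command substitution; a rough sketch (the endpoint and JSON field names are made up for illustration):

```shell
# Log in once and capture the access token from the JSON response.
TOKEN=$(curl -s https://api.example.com/auth \
  -H 'Content-Type: application/json' \
  -d '{"user":"me","pass":"secret"}' | jq -r '.access_token')

# Reuse it in subsequent calls -- no clipboard involved.
curl -s https://api.example.com/items \
  -H "Authorization: Bearer $TOKEN"
```

Drop the two commands in a script or shell function and the "update all of your curl cmds" problem largely disappears.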
If I understood correctly, you claimed that saving requests, modifying them, or parametrizing them was somehow more cumbersome to do with curl than with a GUI. I was just pointing out that the shell is literally designed for all those use cases. And human users don't interactively use curl without a shell.
Also, using curl and the shell allows you to progressively iterate. So "write a script at that point" was kind of the point. Though you don't need to go that far to just feed authentication info.
GUI solutions don't have endless possibilities. You start and end with the exact same tool. The clicks and form fillings can't easily be copied around and iterated on unlike commands in a REPL. You can only perform tasks defined by the author of the application.
I was responding to a claim that essentially boiled down to shells can't parametrize input. How is it even remotely comparable to "the infamous Dropbox comment" to point out that shells can do that better than GUI tools?
Also expecting end users to develop their own file syncing solution on top of FTP is unreasonable. Expecting software engineers to be able to use curl instead of a GUI form is not.
By attitude, you mean not agreeing with every negative comment made about curl regardless of its accuracy. I guess it was just a thinly veiled insult for not being on the "right" side, then.
These tools are Mad Libs for curl commands presented as GUI forms. They do a subset of what curl + shell does while requiring more clicks. The target demographic is people who need to know the tool it's meant to replace in order to do their jobs effectively. That's not easier or more convenient in my book.
You're describing the same tool with a much worse UI: everyone recreating the tool themselves. There is much value in avoiding that, hence people use integrated tools even with the risks of lock-in.
What UI do you need to recreate and how is it "much worse?" You essentially type in the same information with curl, but without all the mouse clicks and cursor movements.
I found Bruno after Insomnia adopted the Postman strategy of being cloud first, with a disastrous migration - I momentarily lost all my local projects after an update.
I've been using it for a while and I really like the offline first + git collaboration aspect of it. Only missing Websockets functionality at the moment.
Thank you. As soon as Postman asked for a login I uninstalled it and have been curling from text files ever since. My younger coworkers won't drop Postman though. Maybe this will help them switch.
Can't echo this enough. Thank you! Beyond just the login reqs from Postman, the whole Postman UI has become an overcomplicated mess in my opinion. I just want something simple to make remote HTTP calls. I can understand adding some useful extra things like variable interpolation and separate environments, but beyond that, Postman went way off the "enterprisey" deep end.
They locked users' data behind a login screen a week after Postman, then had a half-assed "sorry you misunderstood our corrupt intentions, we'll back out the change" apology.
The irony that I switched to Insomnia after Postman started demanding a login... and now I've been actively looking for alternatives (Bruno being on the list) now that Insomnia has done the same thing.
Your link makes sense, and I believe you.
Have struggled with similar issues. But who knows what the future will bring? Google once said "don't be evil"; Oracle bought Sun. Is there any way you can guarantee your future actions? Contracts, articles of association/company constitution? Maybe set up a trust or charity? I don't think there is.
Hi Annop. Thanks for sharing; this looks like a good alternative to Postman. I see the company is based out of India (awesome) but wanted to know if the company has gone through the steps needed to sell to teams in the healthcare industry in the US/UK, i.e. HIPAA, SOX2, PCI, GDPR, etc.
If they never take possession of the user's data into their environment, then most, if not all, of those don't even come into play?
Like, that's kinda the whole point of offline-first, local-only tools: you can 100% use them in an environment you control and take responsibility for. Once you take control of customers' data, there's a whole litany of due diligence that must be considered, often at considerable cost.
While I agree with you and understand that the data is local only, we can only use tools approved by the company. I would like to suggest that my team take Bruno for a 3 – 6 month test drive (since Postman was unable to check the boxes) but cannot without approval….
Yes, that is what I mean. All the modern "ChatGPT"-style APIs use this, so if you're building anything that invokes them you can choose between buffering the entire response and modifying/relaying it once complete, or building up a toolchain of streaming-capable utilities.
Of all the http-client applications I tested (Curl, Postman, Insomnia, Bruno), somewhat hilariously Curl has the best support. It will output all 'Transfer-Encoding: chunked' with line-by-line buffering, whereas Postman only supports responses precisely following the `Content-Type: text/event-stream` format (strictly less powerful than curl, as this format requires newlines in between events, and a bunch of overhead on top of that). The others buffer the entire response before displaying anything.
The `Content-Type: text/event-stream` format is fine enough, but I personally prefer to just plainly chunk the responses and let the client choose whether to buffer them into memory and operate on the entire value at once, or interpret them live as the chunks come in. With tools like gjp-4-gpt (Gradual JSON Parser 4 GPT's) you can even interpret partial JSON objects live as they come in and display complex/nested data structures in a generative manner.
Personally I use a lame but effective simple 85.9 KiB static binary filter, a small C program, that removes the chunk sizes so the response is ready for use by other programs, e.g., in a pipe. Buffer is set at 8 KiB.
Is there a way to experiment with one of these streaming JSON GPT APIs non-interactively by just sending an HTTP request, without needing a third-party program, an account, use of a JavaScript engine, etc.?
The unknown length isn't much of a problem for me in practice: GPT's are slow enough that getting a large chunk is almost impossible. I like the idea of the C filter, but in the end you're just piping the data to the program, why add the middle step? Is it to protect against too-large chunks in some way?
I don't know a public API that returns JSON slowly, but you could simulate it by just taking a JSON string, splitting it into 3-5 char chunks, and sending each of those in a `Transfer-Encoding: Chunked` response at ~100ms intervals.
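To see what that wire format actually looks like, here is a small POSIX-shell sketch that slices a JSON string into 4-byte pieces and emits them in chunked-transfer framing at ~100ms intervals (the payload is a made-up example):

```shell
# Emit a JSON payload in HTTP/1.1 chunked-transfer framing,
# a few bytes at a time, to simulate a slow streaming API.
body='{"ok":true}'
while [ -n "$body" ]; do
  chunk=$(printf '%s' "$body" | cut -c1-4)    # next slice of up to 4 bytes
  body=$(printf '%s' "$body" | cut -c5-)      # remainder
  printf '%x\r\n%s\r\n' "${#chunk}" "$chunk"  # <hex size>CRLF<data>CRLF
  sleep 0.1                                   # pace the chunks
done
printf '0\r\n\r\n'                            # terminating zero-size chunk
```

Prepend response headers and pipe it into `nc -l` to fake a server, or just inspect the output to see the framing that curl normally strips for you.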
Actually, now that I look at the underlying mechanism behind `Transfer-Encoding: Chunked`, it looks like it's already basically the same as the netstrings. What I'm referring to is the (variable length) contents of the netstring/chunk being sequential slices of a JSON object.
"I like the idea of the C filter, but in the end you're just piping the data to the program, why add the middle step?"
Only for the flexibility to use more programs. Otherwise every program I use to process HTTP responses needs to be able to accommodate chunked transfer encoding. Plus only a minority of sites send chunked responses. Instead, have one program that does one thing: remove the chunked transfer encoding.
IIUC, what you want is uniform chunk sizes where you know the size before you send the request.
GPTs sound annoying if they are so slow that they only output a few characters every ~100ms..
Ah I see, I'm working a bit further up the stack from you so the JS runtime handles making the transfer encoding of the response more or less irrelevant, for any response with any encoding you can access `response.text()` and get the entire contents when they're ready, or do something like `for await (const chunk of response.body) { ... }` to work with the pieces as they roll in.
> IIUC, what you want is uniform chunk sizes where you know the size before you send the request.
I don't think so... I don't really want anything! Just a GUI that displays existing HTTP chunked responses as they come.
> GPTs sound annoying if they are so slow that they only output a few characters every ~100ms..
That's perhaps an exaggeration, but in general the speed and "smartness" of the model are inversely correlated.
I'm not really a GUI person nor do I use JS. I'm happy to see HTTP responses in textmode. I tried playing around with Postman and some other similar programs a while back in an attempt to understand how they could be superior to working with HTTP as text files. But I failed to see any advantage for what I need to do. One problem with GUIs/TUIs IMHO is that generally few people write them or modify them. And so users must look around to try to find one written by someone else that they like. Then they are stuck with it, and whatever "options" its author chooses to provide. Whereas with textmode programs, I can easily control all the aesthetics myself. Even when using other peoples' software, if there is something that annoys me, I can usually edit the source and change it.
Best of luck. Hope you can find the right program for viewing HTTP.
The "Lite API" mode of current Postman is actually decent, it's the only GUI client I know that supports streaming responses, but you have to use `Content-Type: text/event-stream`. You can't save/share queries, but the local history is decent enough for local development. I prefer it to the Insomnia mutable fixed length saved query implementation for "hacking around" with many different APIs.
It's not open source, but it's in my workflow anyway. The JetBrains HTTP tool is excellent, and has been getting better and better for quite some time.
I think it's just the nature of how I use APIs. I do discovery with curl then just write a golang CLI for long term use. I'd probably save a lot of boilerplate using something like http-files though.
I will usually take curl commands I end up calling a lot and use a function or script to make those calls easily. I find this to be way better for me personally. I feel using Postman teaches you only Postman, whereas making shell tools and learning curl are so much more valuable. Plus, you can combine them with fzf, jq, fx, yq and friends to easily customize.
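A minimal sketch of that kind of wrapper function (API_BASE, TOKEN_FILE, and the paths are hypothetical; adapt to your API):

```shell
# Reusable curl wrapper: base URL and cached token location are
# assumptions you would set in your shell profile.
API_BASE=${API_BASE:-https://api.example.com}
TOKEN_FILE=${TOKEN_FILE:-$HOME/.cache/api_token}

api() {
  path=$1; shift                     # first argument is the request path
  curl -sS "$API_BASE$path" \
    -H "Authorization: Bearer $(cat "$TOKEN_FILE")" \
    "$@"                             # pass through any extra curl flags
}

# Usage: api /users | jq '.[].name'
#        api /users -X POST -d '{"name":"Alice"}'
```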
Shameless plug for Hurl [1]: it's a cli tool, based on plain text and curl to run and test HTTP requests. It's just libcurl + the possibility to chain and test response. You may like it! (I'm one of the maintainers)
I know people don't love VS Code, but is there a reason Thunder Client isn't more popular? I know it's feature-lacking compared to Postman, but is it more so than curling?
It limits collections to 50 endpoints. I also couldn't immediately see how to plug tokens into variables.
Bruno is intuitive and has no pay-to-remove limits.
I also have fallen back to just using curl + jq and a set of saved commands, since both Postman and Insomnia have decided to make my life harder, not easier. Good old plain Unix command line tools never fail you.
You may consider giving the httpie CLI a test drive if you are dealing with JSON endpoints. You can put in a Bearer header and JSON arguments quite easily.
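For illustration (the endpoint is made up), httpie's shorthand turns headers and JSON fields into simple key/value arguments:

```
# Sends {"name": "Alice", "active": true} as a JSON body
http POST https://api.example.com/users \
  Authorization:"Bearer $TOKEN" \
  name=Alice active:=true
```

`Header:value` sets a header, `field=value` a JSON string, and `field:=value` a raw JSON value (boolean, number, array).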
While I do hate Postman because it's overcomplicated in everything, attitudes like yours have shifted people and money away from developer tools. Instead of all the possible tools we could have from many developers, we are now totally dependent on big tech to sponsor them, like VSCode etc. And over time they will move in a direction that promotes other services from the same company, like Copilot with VSCode.
"A GUI for a command line tool" is, these days, at the level where you can get the first 80% of the value by asking ChatGPT to write it for you by copy-pasting in the bits of the man page for the options you want.
(The second and third 80%'s still need human intervention).
Developer Tool Developers are hurting themselves. I think it is reasonable to not want to have to log in to every darn thing you use on the internet. Especially developer tools.
Someone needs to invent a hat that developers have to wear: when the developer starts to write signup, login or authentication code, a hand pokes out of the hat and slaps him.
Default-on telemetry as well. I'm reminded of Balena Etcher phoning home with the names of ISO files you flash, which leaks the IPs etc. of which users are creating Tor/Tails bootable USBs: https://gitlab.tails.boum.org/tails/tails/-/issues/16381#not...
Yes, telemetry too! And automatic updates. They are all symptoms of companies’ obsession with having some kind of ongoing post-purchase “relationship” with their users. As a user, I don’t want this relationship! I just want to buy the tool and use it, without ever interacting with the manufacturer ever again.
If I buy a circular saw, I don’t want a relationship with Makita. I don’t want to have to log in to use it. I don’t want it telling Makita how many boards I cut and how well the saw is working. No offense, but I’m just not that into you, Manufacturer.
Judging by vscode's or copilot's or postman's popularity, login is not the issue. The issue is the quality of the tool and number of developers they can hire.
I tend to collect little Ruby one or two liners for common REST calls, and now Go. Takes care of things like getting a JWT token and including it in the right header, as reusable code.
Takes slightly longer the first time than curl or Postman. But much more powerful in terms of using in scripts for operations tasks.
Hehe, always the same. Popular non-FOSS software always dies in these kinds of ways. That, and that it's Chrome-only, is why I never wanted to use Postman. I've been using RESTED [1] in Firefox happily for quite some time. Although I don't use it that much since unit tests and Django Rest Framework's web UI is usually sufficient for testing and debugging.
vim-rest-console[0] has been my Postman alternative for years. It basically wraps curl and makes it super easy to make different requests from a single text file. Can even write YAML and have it converted to JSON before being sent in the body. Really great tool
Great point. I almost wrote, "why use a .bru file when more established scripting languages and libs exist for this?" But it seems the point here is to bridge the gap between some devs (maybe more junior) on a team who prefer a GUI, while also providing a more VCS/CLI-driven experience for the rest of the team.
Not to mention, that bru syntax looks really nice.
You can't continue to make a product worse over time without expecting to lose some customers. When someone tells you why they stopped using your product, don't brush off the feedback as whiny entitled complaints.
Baloney. As others have mentioned, Bruno also has a paid option.
Postman's problem is they took hundreds of millions of funding that somehow thinks an API request tool is worth $5.6 billion (though I'm sure that's waaaaaaaaaaaaaay down from the 2021 ZIRP frenzy). So now Postman has long since gone down the full enshittification path with a totally unnecessarily complicated UI and authentication and cloud features that hardly anybody really wants.
I like Bruno simply because I believe their approach is much better: simpler UI and, importantly, as their demo video points out, collections of API requests can be saved in an easy-to-understand format in git. Much better than some proprietary cloud storage.
>How dare they try and run a business and get paid for their work
They can run whatever they want. I, and the parent, won't be using it.
Bruno has a paid option too, btw, and I'm fine with that. I'd be fine even if it was the only option. Login and extra BS features to tie me in? No, thanks.
Running a business is perfectly acceptable. Paid tools, likewise, are perfectly acceptable. Granted, most software is built directly or indirectly on the free work either of the intellectuals who discovered the foundational knowledge required to build it or of the engineers who built the tools.
What’s not acceptable is the rug-pull of presenting something as free and trying to claw back money from the endeavor after you’ve got users locked in. If you need money, don’t offer a free product. If you offer a free product, make sure you can afford it.
No one is going to judge you for charging the worth of your offering. However, you’re also not going to get free goodwill and advertising for no reason.
Spend your time and energy wisely, that is your personal responsibility.
I really like the idea of serializing requests to a Git-friendly text format.
But if we want a Git-friendly text format, why not mimic HTTP/1.1 request syntax as much as possible? Maybe with Jekyll-like YAML front matter for metadata that doesn’t fit?
So for Get Users.bru instead of the current example of:
meta {
name: Get Users
type: http
seq: 1
}
get {
url: https://reqres.in/api/users
body: none
}
headers {
Content-Type: application/json
}
We could adopt a format like:
---
name: Get Users
type: http
scheme: https
seq: 1
---
GET /api/users HTTP/1.1
Host: reqres.in
Content-Type: application/json
IntelliJ (and all the other variations) has something very similar to this called HttpClient [0]. Being able to commit and basically just read the file is very useful. You can also do validation and scripting with it too.
Second this. One of the first VSCode extensions I install and recommend for non-devs in the team as well (BAs, QAs, etc). Kind of insane how human friendly HTTP 1.1 is.
This makes me think of an alternative that no one seems to be mentioning: http/rest files. They're git-friendly and there are community plugins to operate them from every major IDE.
Oh yes. This one is Jetbrains only but there is also a VScode alternative for this. There is a plugin called httpyac and I believe it also supports the same kind of configuration (???) and variable syntax. It's great not switching to other apps for making an http request.
I've purchased the golden copy of Bruno, not just because this is the right way to do software, but also because of the Bru DSL and git-based sharing "everything as code" model.
Meanwhile, I often dev on iPad Pro with keyboard and trackpad, and instead of Postman, Insomnia, etc., I've enjoyed HTTPBot:
I have nothing against this app or the other graphical HTTP tools like Postman, Insomnia, etc., as people clearly get value out of them, but personally, I've moved everything over to Hurl --> hurl.dev
- Open source
- Text files all the way down
- Easily understood DSL
- Easily distributed
- Easily versioned
- Fast
Download the executable, copy the two lines below into "first-test.hurl" and you're up and running.
GET http://localhost:3000
HTTP 200
I realize I'm fanboying a bit here, but Hurl really has been so helpful for our particular use case at wafris.org where we're trying to support a large number of different HTTP clients. We can just give an integration partner access to our hurl suite and they're good.
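For a taste of the chaining and asserts, a sketch of a single Hurl file following the patterns in the Hurl docs (URLs, fields, and values here are made up):

```
# First request: log in and capture the token from the response
POST https://example.org/api/login
{"user":"bob","password":"secret"}
HTTP 200
[Captures]
token: jsonpath "$.token"

# Second request: reuse the captured token, assert on the body
GET https://example.org/api/users
Authorization: Bearer {{token}}
HTTP 200
[Asserts]
header "Content-Type" contains "json"
jsonpath "$.users" count >= 1
```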
I tried Hurl after Insomnia went the way of Postman. The highlights you list were the strong drivers for testing it out. Where Hurl fell short was composing requests. Example: X.hurl response has authToken. Y.hurl uses authToken. Z.hurl uses authToken. There's no import ability[1], so you've got to use other tooling to copy X.hurl into Y.hurl and Z.hurl.
Ultimately settled on Bruno. It's backed by readable text files[2] as well. The CLI works for scripting. And the GUI is familiar enough that I've managed to convert Postman holdouts at my dayjob.
You reused a lot of "concepts" like captures, queries, filters, predicates etc... I take it as a compliment, there is a place for everybody (I'm one of the Hurl maintainers)
I did. And it is! The Hurl project definitely made Nap leaps and bounds better. Thank you for that. I wanted to take a different approach to how http tests are written (YAML, environments, parallelism, etc.), but for things that I already knew Hurl did very well (assertions especially, but also some other things like cookies) I pretty much followed the Hurl docs and reverse engineered them.
>Bruno stores your collections directly in a folder on your filesystem. We use a plain text markup language, Bru, to save information about API requests. You can use git or any version control of your choice to collaborate over your API collections. Bruno is offline-only. There are no plans to add cloud-sync to Bruno, ever.
AMEN! I've been annoyed by Insomnia and Postman for a good while for these reasons exactly. My latest client of choice, the Thunder Client VS Code extension, also started pulling the same shit, and I'm cutting it off too.
Then I found Bruno and fell in love. Thanks for the great work!