I love the idea of Atuin but it's just way too slow with large history files. I've synced my history on my own for the past decade, have something like 170k lines, and the ctrl-r history search just crawls.
I don't need most of the history, but there's 0 chance in hell I'm auditing that many lines to decide what I need and what I don't need.
History is useful enough to exist as a feature, I up-arrow routinely, but it doesn't actually matter when it doesn't exist.
I find the idea of going out of your way to preserve and migrate years of shell history and make it searchable in a db to be about like this:
You have a problem: water is flooding your kitchen floor. Normally you deal with a spill with a mop or towels. There is now too much water, so you decide your normal towels aren't good enough, and you get more and better towels, or even put a sump pump in the corner to keep pumping all this water away.
I've written a lot of complicated pipelines with awk and sed and the like, but they were either one-offs that are of hardly any value later, or I made them into a script; the few things that are neither are so few they automatically don't matter.
> You don't need any of it.
> History is useful enough to exist as a feature, I up-arrow routinely, but it doesn't actually matter when it doesn't exist.
It's absurdly naive to think the simplistic constraints of your own workflow are a general rule.
History can also contain potentially sensitive things like hostnames of non-public systems, usernames, filenames, URLs. I would not want that stuff to hang around indefinitely.
This is one thing Atuin announced it would solve: ignore patterns for keys, passwords, and so on. From the announcement I gathered it's going to be far more useful than HISTIGNORE, both out of the box and in capabilities.
I use those `!` bash history features all the time, e.g. `!?some_test` to just rerun a test case I ran several months ago. I don't need to sync histories between PCs (they are different enough) but history is important.
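For anyone who hasn't used them, these expansions are built into bash (nothing Atuin-specific; see the HISTORY EXPANSION section of `man bash`):

```bash
!!              # rerun the previous command
!make           # rerun the most recent command starting with "make"
!?some_test     # rerun the most recent command *containing* "some_test"
!?some_test?:p  # the :p modifier prints the match without running it
```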
Good tools change the way you work. Imagine if version control systems were notoriously unreliable and difficult to get working. You'd be saying "nobody needs their entire commit history, if you need a version create a source archive and back it up".
But version control systems do work, so we use them, we keep history and we tag releases and don't really need to bother with source archives any more.
Nobody is saying history is a substitute for documentation or an audit trail or anything else, but it is a useful tool if it works. Consider a case where you're exploring a new dataset you found online. You download it into a directory, run some commands to transform it, load it into a database, etc. You don't know if this will ever be useful. But if it does turn out to be useful, you now have a log of everything you did to get there. Trying to document everything up front would be insane and you'd never get any exploratory work done.
I had a similar thought when I first looked at it, but then I thought about my browser history and URL bar. It is sort of a lot of work to open files to write scripts, keep them organized, and make them accessible just to make some commands simpler to run. I wrote https://github.com/ionrock/we for this very reason. I moved most args to env vars and made it easy to load different env vars via files. Maybe the history is a better way to make these things reproducible and useful, by avoiding the indirection that scripts require?
While I agree it may not work with everyone's workflow, maybe it could be a powerful change to some folks' workflows. I'm going to try it out and see for myself!
I'm self-hosting my Atuin server (although it is offsite, so about 50 ms away), and have right at 100,000 items as shown by `atuin status`.
I don’t have any speed or lag issues, it comes up instantly and searches instantly.
Fwiw.
I love the project (will donate now; I remember when the project wasn't taking donations, and I suggested they should).
Thanks
Edit: I should add: I have the client on about 15 to 20 different VMs, all with various OSes and versions. The server part I'm running with Docker (I think, via the exact steps suggested in the docs). All works great in my use case, and I do have some very long and complex commands that it's storing.
The delay between the physical keystroke and rendered text insertion + search filtering is uncomfortably high. Compared to FZF's instantaneous rendering of the same, this is a UX downgrade.
I’m an atuin user too, and think it’s great. It’s a significant improvement over the bash history configurations (incantations?) I used previously.
Where atuin really shines is in keeping a single unified history across multiple shell windows, which my incantations could never get to work correctly on all the platforms I use (zsh/bash on OSX/Linux/msys/cygwin/babun).
I’ve also enjoyed running SQL queries on my atuin-history to learn more about my own workflows to see where I can optimize.
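If anyone wants to try the same, something along these lines works; the db path and table/column names here are assumptions that can vary by Atuin version and platform, so check yours first with `sqlite3 <db> .schema`:

```bash
# List your ten most-run commands (path and schema assumed, not guaranteed).
sqlite3 ~/.local/share/atuin/history.db \
  "SELECT command, COUNT(*) AS n
   FROM history
   GROUP BY command
   ORDER BY n DESC
   LIMIT 10;"
```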
Here are some of my frequent use cases: long Kubernetes commands, ssh into an IP I don't have memorized, curl commands when testing an API. Anything on the CLI that is long and either hard to remember or just annoying to type.
I tend to write scripts or docs when things are annoying and hard to remember. Shell history is usually littered with sensitive stuff; I usually go way out of my way to prevent my machines from saving any of it, but to each their own.
With fish's shell history, for example, I can just type 's' and it completes it to 'ssh user@host.tld', because that's the last one starting with 's' I used. If it's not right, I can type it up to 'ssh' and press up arrow to pick the ssh command I want.
Then I might remember that I did this fancy jq thing once to parse a field in a specific way, I can easily use Atuin to look for it with a nice text-mode UI just by pressing C-r and typing 'jq' as the initial filter.
I often hit CTRL-R to reload a service's config. I press `CTRL+R`, enter `reload`, and keep hitting `CTRL+R` until the right service appears. Enter. Done. Usually way quicker, especially when switching between distros, as one calls it httpd and one apache, once it's systemd and once it's an init script, and so on.
Yes, I use ctrl+r occasionally. While the demo of Atuin (https://atuin.sh/) clearly looks cool and more powerful than ctrl+r, I must say that ctrl+r has always been enough for me.
I feel like I get a decent return on mining historical command usage for new (single keystroke?) aliases to set up; the most useful ones change over time for me.
Another use case I feel pays off is complicated one-liners where I need to do something similar but not quite the same again - good starting time saver. This depends on you being a mostly cli kinda person obviously, if you instinctively reach for excel over awk then ymmv.
I'll sometimes write a useful one-liner that I want to turn into a shell script, but if I'm in the middle of something and don't have time, I'll do this:
<my> | <cool> | <one liner> # add script for this
The # comment makes it easy for me to search through my history to find one-liners I want to build a shell script from.
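Concretely, that makes retrieval a plain substring search:

```bash
# Plain bash: grep the history file for the marker comment.
grep '# add script' ~/.bash_history

# Or with Atuin, a substring search does the same job.
atuin search 'add script'
```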
I'm similar. I do most of my work in Emacs and I don't do much sys admin, so I don't really use the shell much. It's always interesting to me to see other people's workflows, though.
Not to be dramatic but atuin is life-changing. I can resume my working context from any project, no matter how far in the past (e.g. what docker or cmake or make commands I run). I don't need a personal wiki of cli tricks like an "ffmpeg cheatsheet" or whatever, I just access my own personal ffmpeg history.
Oh, also, I use tmux to split my terminal and do separate things in each one, so I like that atuin consolidates all the histories from my separate panes.
I have the same question. Lots of comments here from people hand-rolling complex solutions for what it seems like fish does out of the box: commands per directory, partial completions. Heck, there's even an embedded web server UI for searching and manipulating the history.
Multi device sync is not there without some effort but I don’t really care to mix my personal history with my work machines anyway.
> Multi device sync is not there without some effort but I don’t really care to mix my personal history with my work machines anyway.
Yep, obviously there are many benefits to per-device history, but I think I'd find it more annoying having it synced between devices, especially if there are commands that either won't work on a particular machine or might even be dangerous in a different environment.
Fish has smoother usability, autocompletion, and search. It is useful on a fresh machine in a vanilla configuration. Installing bash plugins is not always possible, and installing some sync plugin on a sensitive server is a no-no!
Also, my biggest problem with bash: sometimes it does not keep part of recent history if the bash process gets killed. Fish does not have this problem.
I usually keep useful commands in notes, and sync my notes instead.
I use fish's own completion (the dim text that appears after you type) for most stuff. I also use the fish up-arrow search instead of Atuin, like this: `atuin init fish --disable-up-arrow`
Atuin is there when I need to find something more complex I remember doing 3 months ago and could actually repurpose today.
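For reference, the relevant line in `~/.config/fish/config.fish` looks like this (the flag is from Atuin's own init command):

```fish
# Bind Atuin to Ctrl-R only, keeping fish's native up-arrow history search.
atuin init fish --disable-up-arrow | source
```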
I've kept my shell history in a sqlite database since 2017. Around 120k records at this point. I never synced history from the work laptops, only personal history.
In 2017 I wrote my own bash script (later optimized for zsh) to just record everything in sqlite with hooks on the prompt. [1]
I mostly work on a Mac right now and don't need to support Linux anymore, so I wrote an app for Mac that syncs the history over iCloud and has a GUI interface. [2]
Anyway, storing years of shell history somewhere you can do complex searches, and actually find some magic command you ran a few years ago, is priceless.
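For the curious, the core of that kind of script is small. A minimal sketch (not the linked script; the table name and schema are my own illustration, and a real version would also dedupe repeated entries):

```bash
db="$HOME/.shell_history.db"
sqlite3 "$db" 'CREATE TABLE IF NOT EXISTS history
               (ts INTEGER, cwd TEXT, cmd TEXT);'

__log_history() {
    # Take the last history entry and strip the leading number.
    local cmd=$(HISTTIMEFORMAT= history 1 | sed 's/^ *[0-9]* *//')
    cmd=${cmd//"'"/"''"}       # escape single quotes for the SQL literal
    local cwd=${PWD//"'"/"''"}
    sqlite3 "$db" "INSERT INTO history VALUES
                   (strftime('%s','now'), '$cwd', '$cmd');"
}
# Runs before every prompt in bash.
PROMPT_COMMAND="__log_history${PROMPT_COMMAND:+;$PROMPT_COMMAND}"
```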
I have some local setup replicating what Atuin does, but I'm realizing I don't use it at all even though I thought it'd be useful, and I don't use my shell history much overall. I have an alias "aled" to edit my aliases quickly, so it's easy for me to add new ones, and that's where I put the commands I want to use regularly, or even the ones I'd like to be able to find if I ever need them again in years. It's easier to add documentation in a .zshrc than in a shell history in any case.
Global aliases (which you can use anywhere, not only at the beginning) are also nice to compose aliases with each other.
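That's zsh's `alias -g`, for anyone unfamiliar:

```zsh
# Global aliases expand anywhere on the line, not just in command position,
# so they compose: `dmesg G usb L` becomes `dmesg | grep -i usb | less`.
alias -g G='| grep -i'
alias -g L='| less'
```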
I'm curious about the calculus involved in dedicating oneself to a full-time open-source project. Could someone with prior experience share insights on generating income or potential future exits solely through open-source contributions?
Freelancers typically make money by coding up solutions, leaning heavily on open-source, for individual clients (as opposed to mass-market software) or other forms of consulting. Sometimes they will become the caretaker of one or more "projects" that multiple clients rely on. That maintenance is billable. It's not uncommon for contract programmers to become small businesses. If they think they can get support from the community, open-source becomes the best option.
Atuin tends to get shared more readily on Mastodon/Twitter than it does HN, which explains everything other than the 2023-HN-spike. We've also been on a few podcasts and newsletters
Said this last time Atuin popped up, but absolutely love it. I’m by no means a power user but there’s just something so elegant about the UX of combined shell history across multiple machines.
Best of luck. I hope there's a path forward where open source can provide a reasonable income stream.
I maintain a couple of open source package for emacs -- it's a labour of love. I'm happy to help folks with their issues, but it's easy to say "sorry, I don't have capacity to add this feature" or "no, I don't think this is a good fit for the project". If I depended on this for money.. well this would change the whole approach, wouldn't it?
Does Atuin offer the ability to save the current path (with a timestamp as a bonus) for the command as well?
Looking at the OP, it looks like it does: "recording additional command context".
Often (meaning once every few months) I have to SSH to some less used machines and remember a few incantations where path and time is crucial.
In bash there is a hacky way to add the current path and timestamp to history, but I've never gotten it to work exactly right. If you add a timestamp, it seems to duplicate the timestamp when you repeat the command.
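For reference, the hack I've seen looks roughly like this (an untested sketch; the entry rewriting may be where the duplicated timestamps come from):

```bash
export HISTTIMEFORMAT='%F %T  '   # timestamps are handled natively

# Append the working directory to each history entry as a trailing comment.
__tag_cwd() {
    local n last
    n=$(HISTTIMEFORMAT= history 1 | awk '{print $1}')
    last=$(HISTTIMEFORMAT= history 1 | sed 's/^ *[0-9]* *//')
    case $last in *'# cwd:'*) return;; esac   # don't tag twice
    history -d "$n"
    history -s "$last  # cwd: $PWD"
}
PROMPT_COMMAND="__tag_cwd${PROMPT_COMMAND:+;$PROMPT_COMMAND}"
```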
I wonder if atuin could parse tags given at the end of a command, after a `#`?
sudo somecommand arg1 arg2 # admin
mpv arg1 # media
I have work-related stuff that I'd love to be able to tag and then filter history based on that. Hmm now that I think of it, maybe it will just work like that anyway.
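If Atuin records the trailing comment verbatim (which I'd expect, but haven't verified), plain substring search already works as a poor man's tag filter:

```bash
atuin search '# admin'   # everything tagged "# admin"
```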
I just deployed this to my "everything" NixOS server with `services.atuin.enable` and synced a few of my machines up with it. Very cool! I hope this move goes well for Ellie!
Not the amazing utility Atuin is, but I have been using this tiny bash script to manage named history files, which I find pretty useful when juggling between projects. I'll have to try out Atuin and see if it is easier to use. Sharing the gist:
https://gist.github.com/appsmatics/ff27e885460bd345eabe1c5f7...
At this point I just auto-assume there's a domain for every word I'm interested in, though some are not open to registration or prohibitively expensive.
All the best, Ellie. I use Atuin and never miss any command from my past sessions.
The only feedback I want to call out is that sometimes when I close the terminal tab, the Atuin server may keep running in the background and I get a warning message.
Logging into another machine just to get a long command resonated with me and is enough that I've decided to give it a try. That, and I'm a sucker for tools written in Rust :)
I'm curious to hear what people's use cases are for retaining long shell history. For myself, I never really use it beyond the current work day (up arrow).
I definitely refer to months-old command lines. I work on 10-15 projects: ~2 are every-day projects, ~6 are every-week projects, and the rest come up every month or two. I often recall commands I used on those month-ago projects. I often remember the first few letters and press up arrow, or I Ctrl-R for a part of the command line I remember. I rarely have to dig out the docs to remember how to type the full line.
For work, there are a lot of commands with complex invocations that I do not use regularly enough to bother remembering all the flags. It is really useful to just grep through my full shell history to find all the times I ran that command and copy/paste the relevant one.
I use this and it's been fantastic and rock solid. (I've been using Unix shells intensively for 15 years; I started using it a year ago and never looked back.)
Hm, I just use it on my personal machine. I would pay, but it doesn't look like I'll need to:
> Atuin will continue to be open source and available for free in its current form as a self-hosted tool. By going full-time I hope I can focus on adding new premium hosted features for advanced users, and begin to support business usage.
I'm one of those who disables shell history persistence, ever since I first found people's user folders on misconfigured Apache servers in the '90s, complete with .bash_history files containing passwords and hostnames.
I don't recall ever needing to dig up old commands I typed.
> 1. If someone steals my laptop & breaks in, can they get access to all my history
Yes, but this is the case anyway with current shell history. I think if someone breaks into your laptop you have bigger problems than your shell history. It's best to get into the habit of not pasting secrets into your shell
> 2. After breaking, if they run `atuin key` will get them the key for my history which they can use from any device (if they know the userid)
They would need your username, your password, _and_ your encryption key
> 3. If you are running servers passing passwords as command line arguments in that device, they have all that.
Yes. If you're doing this, then all of your passwords are currently stored as plaintext in your home directory - with or without Atuin. I'd consider them no longer secure if this is the case, as any program you run could read .bash_history
Atuin by default comes with a set of filters to ignore secrets and not record them to history - AWS creds, slack creds, GitHub tokens, etc etc. So it may well reduce the impact of this
> If you are running servers passing passwords as command line arguments in that device, they have all that.
I make a point out of never doing that. It’s way too easy to accidentally expose things. For instance, doing a live demo with an audience, and using Ctrl-R out of muscle memory? Suddenly you flashed your password in front of everyone.
Generally, I’d recommend using a tool like Unix `pass` or your default OS keyring to store your secrets, then you can run `command1 --password=$(command2)` to feed a password from one command to another. If I really have to type something sensitive, I prefix the whole shell command with a space, which in many shells can be configured to mean that it doesn’t enter history. If you do so by accident, the shell history file can be edited in vim.
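Concretely, something like this, using `pass` as the example secret store:

```bash
# Only the literal "$(pass show db/prod)" lands in history, never the secret
# (the expanded value is still briefly visible in `ps`, but not in history).
mysql -u admin -p"$(pass show db/prod)" mydb

# The leading-space trick: bash needs HISTCONTROL set for it to work.
export HISTCONTROL=ignoreboth
 export API_TOKEN=not-actually-secret   # leading space: not recorded
```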
What do you have in the history that's sensitive? Keys and passwords should not be in shell history anyway (e.g. I delete them from bash history if I enter them by mistake).
I don't think it's that unusual. What comes to mind immediately: it's not unusual for me to clone something from a private git repo, where a username+password is needed for permissions. In which case it's possible to put in `git clone http://username:password@example.com` or another git command that interacts with remotes. (To be clear, the "password" is typically a token and not a human-generated string, but it still functions like a password.)
For that example: any reason the server doesn't just run an SSH server? Then you can use `git clone` in the "usual way", using SSH certificate authentication.
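That is, the credential never appears on the command line at all:

```bash
git clone git@example.com:org/repo.git   # auth handled by your SSH key/cert
```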
I think she is tackling a problem here that can't be solved by another tool, no matter how good it is. Thus, ever getting enough people to pay for it will be tough.
The issue isn't "I can't remember how to run a command I just ran." The issue is that the universe of CLI tooling you use is too large, too inconsistent, and too complex to remember how to use them all. Conventions may be wildly different between Windows and Unix, BSD and GNU, many tools have existed for over 50 years now and have accumulated enormous feature creep. Many newer tools try to improve upon perceived complexity of past tools, but by being different, they introduce even more complexity into the overall set of tools for anyone who can't abandon the past tools. There are huge debates about environment variables, config file formats, whether parameters should use one dash or two, what even is a parameter versus an argument versus a flag, how a tool should use STDOUT versus STDERR, how it should use exit codes, whether output should be structured or free text, and nobody agrees on the answers. There is very little standardization, and where standards exist, you can't count on anything to actually follow these.
Contrast these with the tools of a painter or woodworker. They're similar enough that learning to paint in a high school art class will transfer muscle memory near perfectly to every brush and surface you ever use for the rest of your life. Creating a tool like this is throwing up your hands and saying no human can ever hope to remember how to use their tools, so they need an additional tool that remembers for them. But now we also need to remember how to query this memory augmenter, so you've introduced yet another thing to learn for anyone who isn't willing to just stop trying to learn other tools at all and rely 100% on yours.
It isn't to say it can't be useful, but you're trying to solve an ecosystem problem with a tool. You can't. At best, you can alleviate a tiny portion of the difficulty for a very small number of users sufficiently similar to you. Then you run into the culture of not having to pay for these things mentioned elsewhere. On systems like Windows and Mac, they may be paid systems, but once you pay, you automatically get the full suite of system utilities and CLI tooling. BSD and GNU were free creations made largely by university professors and industry professionals in their spare time for the purpose of sharing, not for making money. Fair or not, the expectation became, and will likely remain, that these tools either come as part of a larger package or are donated from the spare time of their own users.
Exceptions are few and far between. You've got things like curl and openssl that sustain themselves reasonably well as open source CLI packages, but even those don't charge for the tool itself. They only succeed because they're so ubiquitous that if a barely perceptible proportion of users ever donate or pay for support, that is still enough. That model doesn't work if your userbase isn't virtually the entire world of computing.
This is the dream for many of us, but I hope she comes up with a good business model that doesn't rely on the goodwill of people (e.g., open core) because FOSS is terrible for earning money. The reason is that FOSS is essentially part of the commons, but without being maintained by taxes.
In general, people just want free stuff, companies rarely pay for support, and SaaS providers will steal your business if they can. I can think of several apps that macOS users are paying for, such as Bartender, Alfred, or MailMate. Clearly, there's a market for utilities, but only with scarcity.
The author's project is a command line productivity tool. I have co-authored a now-archived command line productivity tool with half the number of stars (6-7k, maybe worth more today due to inflation) in the past, and my observation is people are generally very stingy with this type of project. We did make people's lives a little bit better, but unlike frameworks, libraries, etc., it's not going to end up in any money-making product, so people don't think it's essential (it's not), and companies, which tend to donate larger sums than individuals, are completely out of scope. I think the total donation (through a PayPal link in the README) we got over more than half a decade was less than $100. In comparison, I once made some web-based analytic tools when playing a casual mobile game, and got a few thousand in donations over a year or two -- not much considering the time that went into it, but two orders of magnitude better than the command line productivity tool.
However, we hardly ever marketed our project and never tried to "build a following" or beg for donations in any way, so maybe the author will do a lot better than us. The server component should also help remind people it's not free.
Edit: One thing I forgot: I think command line utilities are in a worse financial position than GUI utilities, because people are accustomed to paying for GUI apps, but aren’t accustomed to paying for things in the shell at all.
>I think command line utilities are in a worse financial position than GUI utilities, because people are accustomed to paying for GUI apps, but aren’t accustomed to paying for things in the shell at all.
I think that's because command line tools appeal more to technical people. For convenience's sake and ease of use, most average computer users will use GUI tools or applications.
I'm specifically talking about developer utilities. Even developers are accustomed to paying for GUI apps but not command line stuff. To be fair I've never seen anyone selling a local (non-SaaS) command line only tool either.
How to pay for the open source commons is far from a solved problem, but I'm glad individuals are trying to make it work for themselves anyway.
I wish we wouldn't act like producing software and making gobs of money are inextricably linked. Yes, we absolutely need to find a way to fund people who are building critical infrastructure. But sometimes, "I quit my job to work on open source" can be more akin to "I quit my job to hike the Appalachian Trail." I wish tech had a lot fewer people who were here for the money.
>I wish we wouldn't act like producing software and making gobs of money are inextricably linked. Yes, we absolutely need to find a way to fund people who are building critical infrastructure
Don't you see the contradiction? Critical infrastructure costs gobs of money. The software has to pay for itself, or it has to survive on crumbs; that's just reality. Software is really, really expensive to create and maintain, because it takes a lot of time, and time costs money.
I wish Richard Stallman hadn't duped a generation into thinking that they have to use licenses that make Amazon richer instead of just using proprietary licenses to protect yourself, as the licenses were designed to do, so that people with more lawyers can't just steal your work.
Why did our whole generation listen to a guy who was caught on camera eating something off of his foot?
> I wish Richard Stallman hadn't duped a generation into thinking that they have to use licenses that make Amazon richer instead of just using proprietary licenses to protect yourself, as the licenses were designed to do, so that people with more lawyers can't just steal your work.
This is why I like Open Source instead of Free Software. There's no practical difference between the two in terms of licences that are compatible with the two definitions. However, Free Software is an ideology that considers proprietary software to be immoral. Whereas Open Source is a perspective that only cares about the economics of producing software, and is perfectly compatible with capitalism.
Note, there are many scenarios for which I find Amazon getting rich off OSS work is perfectly OK, even advantageous for the contributors, since that kind of freedom and control is the whole point of OSS. It's just that being paid for your OSS contributions is probably not one of those scenarios, and people need to be aware of it, indeed.
You have to weed out the people who just want to dick-around at home versus people who are actually providing value to society, so the bar is set very high right now.
Maybe it's crazy, but I'm one of those people trying to make a living off the goodwill of people and companies. I have been working for free for several months on Biome (https://biomejs.dev), a fast formatter and linter for JS/TS/JSX. At the moment we do not have enough donations to be paid for our contributions.
I can confirm that funding one's work on an open-source project through sponsoring is one hell of a ride.
As you said, most people are not concerned by OSS funding, let alone companies that have a hard time justifying paying for something free...
However, OSS is not incompatible with some form of monetization, coming up with a plan to sell courses, custom services, or cloud options is probably a safer road.
While it's certainly challenging, it's not terrible, but it requires some thought and a sustainable business model, something many FOSS developers don't want to do. https://piero.dev/category/foss-funding/
In terms of making money, I feel like the sweet spot might be a closed source app with a rich open source ecosystem around it, such as Raycast or Obsidian Notes.
The apps themselves are closed source and making money, but the extensions and add-on functionalities are mostly open source.
While I doubt I'd quit my day job for it, over the past couple of years I've been poking at my own database-backed shell history. The key requirements for me were that it be extremely fast and that it support syncing across multiple systems.
The former is easy(ish); the latter is trickier, since I didn't want to provide a hosted service, and there aren't easily usable "bring your own wallet" APIs like S3 that could be used. So I punted and made it directory-based, compatible with Dropbox and similar shared storage.
Being able to quickly search history, including tricks like 'show me the last 50 commands I ran in this directory that contained `git`' has been quite useful for my own workflows, and performance is quite fine on my ~400k history across multiple machines starting around 2011. (pxhist is able to import your history file so you can maintain that continuity)
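I don't know pxhist's actual schema, but against any sqlite-backed history with directory and timestamp columns, that kind of query is roughly (table and column names assumed for illustration, not pxhist's real layout):

```bash
sqlite3 "$HOME/.shell_history.db" \
  "SELECT cmd FROM history
   WHERE cwd = '$PWD' AND cmd LIKE '%git%'
   ORDER BY ts DESC
   LIMIT 50;"
```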
Built something similar (though I've yet to get around to the frontend for it; I vaguely intend to borrow one).
I neither love nor hate it as a sync mechanism, but I ended up satisficing with storing the history in my dotfile repo, treating the sqlite db itself as an install-specific cache, and using sqlite exports with collision-resistant names for avoiding git conflicts.
CouchDB might be useful for this scenario due to its multi-master support so devices can sync to each other without using a centralized database. It's also very performant, though if you put gigabytes of data into it, it'll also consume gigabytes of RAM.
I have a lot of feelings, but I don't have a blog so far. I feel that universities should allocate some of their funding to many of these open source projects, and the open source community should be better managed rather than relying on donations. As for me, my plan is to start my own company and work on hardware.
Awesome to hear that! Been following Glicol for a while because it is such a unique and bold project!
While Sonic Pi is also beautiful and much easier to start with as a beginner, I later found the hard way that its architecture is incredibly messy - lots of unrelated parts glued together with duct tape. The simplicity and cleanness of Glicol's code is what made me immediately love it!
I'm working on a hobby project at the moment, which is currently all about sequencing, and my long-term plan is to integrate Glicol in some way - it's a great project, and I can't wait to start digging around further.
I'm looking forward to seeing what you do with hardware - I'm sure it'll be cool!
Maybe this doesn't apply to all situations. It's just that in my case, especially in music technology, I often see funding being allocated to things that I think are somewhat disappointing, and of course, that's just my subjective opinion. More often than not, it is an overall lack of funds. Many open source projects are actually of great academic value and worthy of study. In fact, many contributors to open source projects love blogging. I often feel more rewarded reading these blogs than dozens of pages of academic papers. I can clearly feel that I will have more time to make pure open source contributions during my funded Ph.D., or even when I am lecturing at the university. But now I have to balance it with some practical considerations.
I do think a similar thing to the GP; in my case it's because universities have the problem of "how to fund potentially society-changing projects that mostly go nowhere and depend on really qualified people" closer to solved than any other institution.
There's a lot of problems that I do think universities could be working on. Creating free software is one of them.
I'm curious whether you've figured out a good monetization strategy for Atuin. The article doesn't answer that question.
Like you said, you won't be paying your rent with sponsors any time soon, but you've already quit your job. Are you living on your savings and trying to come up with valuable paid features meanwhile? Is that the plan?
Anyway, good luck with whatever you're up to! Building a good monetization strategy is hard
I'm a big fan of keeping tons of bash history. I have an idea for your tool: allow per-project history tracking. I do this myself for some projects using some bash history hacks: https://blog.gpkb.org/posts/project-local-bash-history/
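The linked post has the full details; the core of that kind of hack (my paraphrase, not the post's exact code) is swapping HISTFILE from a prompt hook:

```bash
# Use a project-local .bash_history when one exists in the current directory.
__project_hist() {
    local f="$HOME/.bash_history"
    [[ -f "$PWD/.bash_history" ]] && f="$PWD/.bash_history"
    if [[ "$f" != "$HISTFILE" ]]; then
        history -a    # flush pending lines to the old file
        HISTFILE=$f
        history -c    # drop in-memory history
        history -r    # load the new file
    fi
}
PROMPT_COMMAND="__project_hist${PROMPT_COMMAND:+;$PROMPT_COMMAND}"
```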
I got fired with severance, and took the opportunity to do this. It feels so rewarding to work on your own project, waking up and just deciding what you want to do next with it. There were also periods where anxiety got to me, because it's not giving me enough money to live on (it barely supports its own costs). But all in all I'm enjoying it and learning a lot in the process. If anything, it makes me want to keep trying to find financial independence and new ideas.
That's how I approach open source in the first place. And I like the part where others can help you along your way, while you allow them to use your code as well.
This person has gone from having a solid business model (having a job) to having at best a very vague one (“I hope I can focus on adding new premium hosted features for advanced users, and begin to support business usage.”).
You know what this sounds like to me? "I quit my job to play with and walk my dog full time. I am hoping this will give me time to develop skills to eventually earn some money by maybe walking other people's dogs, or maybe doing public performances with my dog."
Eventually I suspect someone will abuse this for the free (and encrypted) storage, and then they'll have to fight that battle, which won't be fun, and likely end up charging. Probably should start charging for storage sooner rather than later, and just allow self hosting free.
I'd give users several megabytes of command storage with LRU forgetting: when the storage is full, the least recently used command is erased as each new command is remembered.
Store shit in that, if that floats your boat.
Rate limiting is obviously applicable here; a normal user doesn't generate large numbers of commands in a short period.
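If it ever becomes a quota'd store, LRU eviction is a few lines of SQL anyway (the schema here is entirely made up for the sketch):

```bash
# Drop the least-recently-used rows for a user who is over quota.
sqlite3 server.db \
  "DELETE FROM history
   WHERE rowid IN (
     SELECT rowid FROM history
     ORDER BY last_used ASC
     LIMIT 1000
   );"
```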
This is fantastic. I'd love to live in a world populated by tens of thousands of incredible, practical projects like Atuin, whose maintainers earn a good living. Best of luck, Ellie!
Open sourcing is an utter waste of time. Why are you still working for big corporate companies free of cost? This is how 70% of open sourcing literally works: you work for developers who are employed at firms, and often the firms fork your projects too. Four years ago I inspected the iOS WhatsApp bundle; they used a lot of open source projects. I'm not talking about contributing to Linux itself, but about projects copied by Apple and Windows businesses.
If you do open source work on something like Linux or servers or whatever, which would otherwise only be available at high cost, then it could be considered a great hobby. But open sourcing innovative inventions isn't good in the software industry; they just remain free for commercial firms, and all your intentions go to waste. Most people will see this comment as negative. On the other side, one day major job loss will directly affect you as well, even if you are settled well. You will understand!
Not really. You just need to be strategic. $5/m for example would be reasonable for a hosted version of this, maybe add some AI touches for pro users, and then have thousands just expense it as it is well below thresholds, or just pay for it themselves.
$5 a month with 1,000 users is $5,000 a month. You add in the costs and taxes and you are left supporting a thousand users that you can't afford to hire for. If you had 2,000 users you might be able to afford someone, but now you have to support 2,000 users.
If prices were tripled you could afford to staff for now and for the future (R&D).
When small you need to go higher. When you are big you can use size to scale.
$5 a month, the only support you get is the ability to cancel :-). However I kind of agree, given that this has sensitive data perhaps, so customers might get annoying about that, charge a bit more.
Yeah the Marco Arment approach of "I'm one guy, please don't ask me to spend my time on support instead of development" is clearly the correct one at that scale.
Like absolutely you want to accept bug reports and feedback, but your users should not expect active help with normal usage.
Why would they need to hire? I'm running a $5k/mo business with thousands of users and I'm working solo. I have 1-2 support chat messages per day, which takes 10 minutes.