b. any particular reason why you chose to implement it the way you did (e.g. I see you use Python + Textual as opposed to something like https://charm.sh/libs/)
c. any major functionality you feel it's missing?
d. any limitations (e.g. doesn't work with Oracle?)
A. I was using the DuckDB CLI and kept hitting walls on analyses. Had a shower thought of “I wonder if anyone has used Textual for a DuckDB client” and decided to build one myself.
B. This uses Textual, a Python framework for TUIs. Python is my language of choice.
C. Hoping for more db adapters soon. Canceling queries is a missing piece I have yet to figure out.
E. Maybe if you don't use Python apps and do have another SQL client you really like.
I would like to put in a vote for k9s, which is also on the list at Terminal Trove. [0] It's the most convenient tool I've ever found for Kubernetes management. Based on that experience I'll definitely be checking out Harlequin.
Harlequin has been on my "to investigate" list since it popped up on HN a few weeks/months ago. I still need to experiment with it a bit more, but I had been thinking of building some tragic equivalent to this, and now I don't have to :)
The HN hive mind is amusing, as I came here to post this exact comment. It looks really interesting, and I'm also tired of giving up so much memory for just a SQL GUI tool.
pspg is a pager intended to be used with SQL command line clients. It was originally created for Postgres (hence the name), but it also works with MySQL and others.
Every time I notice something is a Python codebase, I feel a letdown: it will likely break some other Python project on my dev machine that I may not have touched in a while, and I'll only discover it got silently broken when I most need it.
Especially if it's a nice TUI app, like this one looks to be.
EDIT: Guys, in my experience, virtual environments do not fix this problem in all cases. At least, not sufficiently for me (after getting used to Nix's guarantees, for example). Not to mention, there are multiple ways/attempts at creating and working with virtual environments: https://twitter.com/pmarreck/status/1735363908515295253. See below comment.
but stuff like this would usually get installed globally, and your projects would instead have a venv.
personally my favorite tool is pyenv, which allows you to have many versions of Python on your machine as well as many virtualenvs (which can be assigned to any version you have installed)
this allows you to keep every project you have isolated not only to the packages required to run it, but also to the Python version required.
I work on a handful of projects that run on 3.10, 3.11 and 3.12. Each has their own independent python version, and within that version they also have their own python packages (pip environment).
At the end of the day these are simply directories on disk.
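A minimal sketch of that workflow, assuming pyenv plus the pyenv-virtualenv plugin are installed (the version and project name here are just examples, not anything from the thread):
pyenv install 3.12.1
pyenv virtualenv 3.12.1 myproject-3.12
cd ~/code/myproject
pyenv local myproject-3.12    # writes .python-version so this directory always resolves to that venv
pip install -r requirements.txt
Once `pyenv local` has run, any shell that cds into the project picks up the right interpreter and packages via the pyenv shims, with no manual activation.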
Yeah. I've heard this same comment before in different forms from "Python apologists" about a dozen times. In practice it's still a lot of hair-pulling, because your entire comment assumes that EVERY Python project is already doing things exactly this way and hasn't screwed up a single part of it: https://twitter.com/pmarreck/status/1735363908515295253
but also...
> but stuff like this would usually get installed globally
well, you just killed (apparently unknowingly?) your whole argument right there, because globals are bad and absolutely not project-specific, and they absolutely do cause compatibility issues between different Python projects.
If you ever come around to Nix, it takes care of this problem for good (as in, it guarantees that you will never have 2 projects that step on each other), across every ecosystem, not just Python's. Unfortunately, I don't see very many Python projects at all that contain a flake.nix file, which is a damn shame, because it would cause people like me to hate Python just a tiny bit less
It works for installing dependencies from Pip, so yes, unless that Python project is doing something bizarre that it shouldn't be doing.
It's functionally identical to node_modules or Ruby Bundler or Perl local::lib.
It's so weird to me that people continue to hate Python for a problem that literally every programming language has had since shared libraries were invented.
requirements.txt is just a list of packages for pip. It doesn't even need to be called requirements.txt, it could be called poopoopeepee.txt. That name is just a convention.
You could do `for line in requirements.txt, pip install <line>`. They are ostensibly the same. It is not a magical lockfile. It is Unix. It is just a list of packages. If you are in a virtualenv, you will be fine.
virtualenv activation just sets the $PATH to refer to those binaries. You can use them directly.
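A quick sketch of both points, assuming a project directory containing a requirements.txt (the script name is made up):
python -m venv .venv
.venv/bin/pip install -r requirements.txt    # same idea as looping over the file line by line
.venv/bin/python myscript.py                 # works without "activate"; activation only prepends .venv/bin to $PATH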
the production deployment for my core python app lives in /srv/app, the venv lives in /srv/venv
To update packages on the system, it is as simple as
cd /srv/app
/srv/venv/bin/pip install -r requirements.txt
Then to invoke this with the correct runtime, it is as easy as
/srv/venv/bin/gunicorn ...
In this example I am running the gunicorn application server. This is running the specific gunicorn version installed into my virtualenv.
The name `/srv/venv` is my decision. You can call that whatever you want and put it wherever you want. For instance, if you have two projects called application-foo and application-bar, you can have the following:
/srv/application-foo/app - the codebase (ie, the github repo)
/srv/application-foo/venv - the corresponding virtualenv
/srv/application-bar/app
/srv/application-bar/venv
Some people will even put their venv dir inside of their source tree and exclude it from git (add to .gitignore), but I do not like this approach because my deployments destroy the app dir and unzip the build into that location on each deploy.
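As a sketch, standing up that layout for one of the two projects looks something like this (the repo URL is a placeholder):
mkdir -p /srv/application-foo
git clone https://github.com/example/application-foo /srv/application-foo/app
python -m venv /srv/application-foo/venv
/srv/application-foo/venv/bin/pip install -r /srv/application-foo/app/requirements.txt
# repeat for application-bar with its own venv; the two environments never touch each other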
I cannot speak for every Python-based project (as distinct from a pip package) out there. A lot of people do not know what they are doing; open source is literally all the rope to hang yourself with. Anything is possible, and people without engineering experience will glue things together without understanding how they work.
If you are installing something via pip, then yes, you can create N virtualenvs and use them however you want. They are 100% isolated environments.
If you are using homebrew, apt-get, dnf/yum, arch etc... then those are going to obviously vary from distro to distro and that is outside the scope of this discussion.
I try to stick to Python's native tools as much as possible. Using a distribution package is going to cause issues, for sure. I.e., don't `apt-get install python-pil`; use a virtual environment and `pip install Pillow`.
alright, I will bookmark this and try it next time I want to play with a Python project.
OK, how would I include all of these under the same PATH regime?
So for example say I want to run this project from a commandline location elsewhere... I'd only be able to have one venv activated at the same time in the same session, right?
I guess that's part of my issue with this. I want to be able to access 10 different Python projects' commands from the same command line at any time.
Make a ~/venvs dir and go into it, then create your venvs and install the packages
mkdir -p ~/venvs
cd ~/venvs
python -m venv harlequin
~/venvs/harlequin/bin/pip install harlequin
Now this binary is available at
$ ~/venvs/harlequin/bin/harlequin
Repeat for the rest
cd ~/venvs
python -m venv pgcli
~/venvs/pgcli/bin/pip install pgcli
cd ~/venvs
python -m venv httpx
~/venvs/httpx/bin/pip install httpx
Wash, rinse, repeat. Now you have all these binaries available and can alias them
alias harlequin="~/venvs/harlequin/bin/harlequin"
alias pgcli="~/venvs/pgcli/bin/pgcli"
alias httpx="~/venvs/httpx/bin/httpx"
This is a pain in the ass though and usually simple CLI tools like this do not collide with each other. So that is why I say install globally, or install into your "global junk drawer" virtualenv.
Meanwhile, for actual projects that you are developing on, those would have their own isolated venv.
I have a junk drawer venv where I install tools like this. If something goes wrong, it is as simple as rm -rf the venv and make a new one. And then I have isolated ones for each of the actual systems I maintain. Again, I use pyenv for this to make it a little easier to manage in conjunction with their specific python versions such that I do not ever interact with my distribution's Python. This is cross platform so it works across mac, linux etc. Very easy workflow, isolated, safe, can get blown away and recreated in a heartbeat.
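For what it's worth, a sketch of that junk-drawer setup using pyenv-virtualenv (version and names are just examples):
pyenv virtualenv 3.12.1 junk-drawer    # assumes 3.12.1 was already installed with pyenv
pyenv activate junk-drawer
pip install harlequin pgcli httpx
# if it ever gets messed up: pyenv uninstall junk-drawer, then recreate it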
You know, at this point in the complexity story, you're literally at (or beyond) the level of Nix complexity, which is the very solution that everyone who engages with this level of tooling seems to be trying to avoid. Since Nix already solves this problem definitively, without having to jump through all these hoops, why don't Python projects just use Nix? Then they could all be colocated on the same machine, all be accessible from the same PATH, and all have their specific dependencies, none of which would ever collide with each other!
Like, you're LITERALLY making a FANTASTIC argument for Nix usage in the Python ecosystem, here. In fact I'm going to bookmark this conversation now because of how ridiculously complicated your answer is compared to just using Nix.
Here's the Nix whitepaper. It's 14 pages or so. Read it on your next lunch break.
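For comparison, the day-to-day Nix commands are short too; a sketch, assuming flakes are enabled and the tools are packaged in nixpkgs (pgcli is; I haven't checked harlequin):
nix profile install nixpkgs#pgcli    # install into your user profile, isolated from everything else
nix shell nixpkgs#pgcli              # or just pull it into an ephemeral shell
nix develop                          # enter a project's dev shell, if it ships a flake.nix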
Your Tweet shows that you don't actually know what these tools do. There's not much overlap in functionality between Pyenv, Tox, and Poetry, for example.
Also, nobody active in the Python community will argue that there's 1 correct way to do packaging. That's a serious straw man.
Fortunately, none of those tools you mentioned other than venv are actually required to run Python applications, and there are in fact exactly 2 recommended ways to install Python applications:
1) Use your system's package manager
2) Use a venv, either manually (as shown in the sibling comment) or using the Pipx tool, which just creates venvs for you (a pipx sketch follows below).
All of the other tools you mentioned (except for Pyenv) represent ~20 years of active development and iteration on how to manage projects and build packages for distribution, and end-users shouldn't even have to be aware of their existence. And Pyenv is just Rbenv but for Python.
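The pipx route from option 2 is a one-liner per tool; a sketch, assuming pipx itself is already installed (e.g. from your system's package manager):
pipx install harlequin    # creates an isolated venv and puts the harlequin command on your PATH
pipx install pgcli        # each tool gets its own venv, so they cannot conflict with each other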
As I've pointed out elsewhere, this is exactly the same situation as with literally every other programming language that doesn't generate standalone executables, and is even a problem with those that do, if they rely on dynamic linking. The special ire towards Python in this case is neither warranted nor valid.
Your pinned post on Twitter is predicated on double standards and lack of basic understanding of the tools you're criticizing. I'm not sure that's a good way to represent oneself.
> Your pinned post on Twitter is predicated on double standards and lack of basic understanding of the tools you're criticizing
If it means that I get to see less Python, then I guess... Mission accomplished? LOL
My first, second and tenth experiences with Python were all negative. Everything from trying to exit the REPL the first time I used it and getting chastised for doing it wrong (which meant that it knew what I was trying to do, and instead of just doing that, decided to be a little snit about it, which is about the jerkiest attitude a tool can take), to the Python 2.x->3.x transition pains, to the significant whitespace, to every project stepping on the dependencies of every other project (you might argue that "I just didn't use venv right" but I did follow every package's installation instructions!), to... Well, just read this, he did a good summary: https://medium.com/nerd-for-tech/python-is-a-bad-programming...
There's literally nothing I like about it. It just seems like a poorly written, older Ruby with a lot of baggage and a minefield of gotchas (except that I like Ruby... or did, before Elixir). Ruby should have absolutely fucking had Python's current market share, and I am 1000% convinced that if it did, everyone would be happier. Whenever Elixir tries to eat Python's lunch (like in the ML space which it's doing now), a part of me is as glad as a puppy. The language, despite its ubiquity, absolutely sucks in my mind, is not only a terrible introduction to programming for newbs but also likely an annoying language to work in, and the people who don't see it HAVE to be blind. That is the only conclusion I can come to, and I'm entitled to my opinion, strong as it is.
Next you're going to say that it is unprofessional to have such strong feelings about code and languages. To that I say: So what, dude. I care. Caring means having very positive feelings about the design decisions behind languages, or very negative feelings about the same. (I don't like Go, either. Python's basically a step above PHP on the "tech I want to minimize my time to zero with" department.)
> every project you have should be in a virtual environment. it is not hard.
If it's not hard then why is it not being done by default?
I am not a Python dev and I don't want to be, but when I have to install something via `pip` there's always pain.
"You could learn it", yeah yeah, Python is so special everyone has to learn it even if they don't ever work with it. How about learning from something modern like Go and Rust? `go install github.com/user/tool@latest`, boom, done. `cargo install the_tool`, boom, done.
Although your tool looks more like Virtualenvwrapper.
It's good to have other options of course (and yours looks nice), but it's also good to at least make a case for improvement over what's already out there.
His attitude is very Pythonic. That's another thing I didn't like. Other languages let you write whatever you want (maybe to a fault, but still) and support you. This guy's all like "Python has venv and everything else is superfluous so why did you write this". Sigh.
While I do like pipx, and am a diehard Python fan, if the interpreter used to install the project changes, I still have to reinstall the project, which is annoying.
I suppose I could get around that by installing a Python interpreter outside of brew, and only use that for packages.
Docker wastes a lot of extra space/resources (disk space, memory, possibly CPU) and can have issues when it comes to needing networking (you start to need a container coordinator). But yes, I've used it in the past. Seems more geared towards server apps than anything else though (although being able to run macOS via Docker is kind of neat).
I did not get a chance to give this a whirl yet but I am excited to do so. I mainly used pgcli but it’s pretty buggy and certainly doesn’t approach IDE status.
If you develop SQL you need to write code and view tables. A grid with monospace characters is perfectly suitable for that.
But other than that, no, people don't love being in a terminal. It just happens that all open source portable toolkits like Qt and GTK do not work via SSH and are in general total abominations for developers and users. The VT100 standard is 45 years old and turned out to be the lowest common denominator to write GUIs for, for better or worse.
Me. I love being in the terminal. Full-screen nvim with split buffers. If I want a terminal I can spawn it from inside nvim as a new buffer, or use my terminal emulator (kitty)’s native split functionality. I can do git actions without leaving nvim, too.
It doesn't seem like it to me. ISTM that they're implying that most sql tools have a gui component which isn't friendly to running remotely over an ssh session.
I've personally never seen a good experience of display forwarding over ssh.
Of course, most people and tools get around this by tunneling a connection to the database over ssh and running the GUI locally.
I like TUI because it usually works nicely with multiplexer like zellij or tmux.
Open a new pane or tab where you can open a new set of TUI apps.
Like having Helix or Nvim open, but you want to quickly check your database query code changes. You can just open a new tab, run this SQL IDE, fire off some queries, and check whether the results match; if so, go back to coding new features. If not, change the code, check the queries again, and so on.
Speaking for myself, it wasn't a "movement against application bloat" so much as a "frustrated response to these darn bloated applications." There wasn't anything ideological about it, I literally ran out of memory a few times when using VSCode. The important thing is that editing plain text works perfectly well in a terminal, whereas you need a modern GUI for most other business-related software.
I switched to emacs during the pandemic because of Zoom and Slack (along with my horrible browser habits). VSCode is pretty reasonable on resources compared to many Electron apps, and I slightly prefer it to emacs in terms of overall experience. But emacs is also good, and there were just too many Zoom calls where my computer ran out of memory, with VSCode having a glaringly high footprint. I think at the time its terminal emulator was either excessively inefficient, or it had a specific resource leak. So maybe things have gotten better, but I've stuck with emacs regardless.
These days I can let a few dozen unread tabs in Firefox fill in the extra ~1GB of memory VSCode was formerly occupying :)
I do a lot of work with SQLite, and I do sometimes use one of the GUI clients (specifically sqlitebrowser.org). However, I mostly use the command line client that comes with SQLite, not because of how many features it has (not a lot) but because I use Bash and everything that comes with it, across various sessions managed with Tmux, as part of what I do with SQLite. It’s the same reason I usually use Vim in a terminal session instead of the GUI version. Using the GUI version means stepping away from a lot of my best tools.
1. I occasionally have to browse databases through SSH.
2. The CLI / TUI apps take less memory.
3. They also don't lag and lag is something I am absolutely sick of.
4. Easy to build habits in terms of blind keyboard pressing (very much like cashiers on the older DOS keyboard-only software in retail shops; you just know where everything is and are many times more productive).
I originally wrote this because sometimes a CLI or TUI is just super convenient. I used to use the DuckDB and SQLite CLIs a lot, but was frustrated by their limitations, especially for doing data analysis work (my background).
I'm not excited about it being a TUI, I'm just excited because it looks reasonable and I haven't found a macOS SQL client for writing queries that I was excited about yet
Suggestion: under "works with your database" consider naming the databases rather than only offering a list of icons.
It also seems a bit of hubris to claim a SQL IDE "works with your database" when SQL Server and Oracle, two of the database products with the largest market share, are not supported (yet?).
Alright, so first off, for 90% of SQL Server's existence, Microsoft was openly open-source-hostile, so give me a fucking break with this. Microsoft used to be way worse than even Apple about keeping everything in their ecosystem; at least Apple is built on BSD underpinnings and was therefore also always compatible with POSIX stuff.
Oracle... last I checked on that monstrosity it had about 20 different clients all written at different times for different use cases over the long course of Oracle's history. So, which of these 20 different clients should one write a TUI for?
Fair enough (author here). I just launched support for databases other than DuckDB and published a guide to creating adapters for new DBs. I'm expecting the community to step up here, since I'd rather spend my time adding features to Harlequin. ODBC should be coming shortly. The hard part is honestly just having access to a DB server for testing.
> SQL Server and Oracle, two of the database products with the largest market share
Agree on the first point, but is 'market share' (a metric comparing commercial sales revenue, which by definition excludes open source software) really that relevant when critiquing a tagline for a FOSS tool implicitly targeting FOSS databases, at a time when FOSS dominates?
If this wasn't an MIT licensed project without a single hint of a commercial offering I'd get it, but come on.
It's relevant in that they're prolific databases, so there's a nontrivial chance that a user of one of them will:
- stumble across the tool
- read that it "works with their database"
- hopefully read the actual list of supported databases
- become disappointed to learn that it does not in fact work with their database
In the case that hopes are not met, the user actually downloads the application and discovers the hard way that it does not work with their database.
So market share does seem relevant to the frequency of disappointment though. Is it going to be a frequent enough occurrence to be worth spending any time doing anything about? I don't know.
Looks nice already, but a true "SQL IDE" should also strive for feature parity with existing database frontends like LibreOffice Base, or with old-style MS-DOS/TUI applications for database access which had a similar featureset. Meaning an integrated view of database design, data entry/inspection (with full-screen, form-like views for individual records where appropriate!), custom querying (including a more-or-less seamless integration of QBE and raw SQL) and report generation. Hopefully we're going to see some of this in future releases!