Which is why the XQuartz/&c. user experience on macOS really surprised me. It's absolutely unusable. Inkscape for macOS basically may as well not exist, as far as my experience with it goes.
Are there other comparable GTK+ apps that work well under macOS or is this a common story?
Apparently the Inkscape developers were waiting for this OSX-native-like version to be further developed before they'd promote it. Which is a shame, as release-early-release-often seems like it would have been a better idea. Oh well. :/
I was a long time Fireworks holdout, and Designer is maybe the closest in feel compared to Sketch or Illustrator (and they're copying each other better now).
All said, the bang for buck is great though, and the UI is way easier than Inkscape's, I gotta say ;)
They just haven't shipped your preferred feature.
Do you have any idea how long Illustrator users had to wait for artboards to land?
Visual Studio for Mac is also Gtk 2.x, but they (Xamarin at the time) invested heavily in a custom Gtk fork as much of the hacks were not suitable upstream.
rpm -qa | grep inkscape
ldd /usr/bin/inkscape | grep gtk
libgtkmm-3.0.so.1 => /lib64/libgtkmm-3.0.so.1 (0x00007f12a6a2f000)
libgtk-3.so.0 => /lib64/libgtk-3.so.0 (0x00007f12a4dbb000)
Qt is an easy sell on fresh projects, but switching a large existing codebase is huge.
Inkscape's UI uses so many common GTK elements that replacing it would really hose some of the creature comforts the community has gotten used to. It'd be a huge project with not a lot of ROI to go and redo all the chrome.
There have been a lot of major projects ported to Qt over the years, both commercial and open source (e.g. Wireshark, Subsurface).
In Wireshark, for example, the meat is in the networking code, and that's where the developers' important skills lie. In Inkscape, on the other hand, the non-trivial contributions probably happen very near the graphical frontend. If GTK goes down the gutter, retraining the developers to reach the same level in Qt will set Inkscape back years.
I'm just a light user of the tool and am genuinely curious about what things it may have lost in the transition.
For a specific platform, or multiple?
Yes, you won't have to worry: you can be certain it will be poorly supported, today and in the future, on all platforms, at least as far as the topic we're discussing (native UI likeness) is concerned.
Slack hogs RAM and burns battery like crazy, and it's a bloody chat app -- as simple as it gets.
Now imagine an Electron version of Inkscape with complex vector editing, gradients and so on on screen...
It doesn't match the performance of a native equivalent, but doesn't seem unreasonably slow or bloated either.
- Boxy SVG - https://itunes.apple.com/us/app/boxy-svg/id611658502
- Vectr - https://itunes.apple.com/us/app/vectr/id1204645754
100% feature parity is a meaningless test in any case. Sketch doesn't have 100% parity with Inkscape, which doesn't have 100% parity with Affinity Designer, which doesn't have 100% parity with Illustrator, etc.
Frankly, VS Code is probably the only moderately complex application that really shows off Electron. Many Electron apps just aren't that well written, and, not to besmirch any developers working in other toolkits, writing good JS in a larger codebase is a different kind of skill than most are used to. Beyond this, the techniques and approaches for performance gains are also fairly different.
I don't think Chrome, JS, Node or Electron are going anywhere any time soon. There are some definite value wins in that space. That doesn't make it a great fit here, but I'm happy to be surprised.
1. lots of transparency with the GPU acceleration on - it very quickly becomes an order of magnitude slower than the CPU renderer, especially if you do tricks to generate 3-4 translucent shapes from every path you draw by hand like I do nowadays
2. adding new objects to a complex file starts getting super slow (somewhere around 4-5000 paths, less if you're generating lots of virtual paths via various effects) - I'm not sure if this is due to running out of physical memory, or trying to insert new items in the middle of a very large and complex list of items, or what
3. a few large bitmap effects at 300dpi can very quickly bring Illustrator to its knees - I'm not sure if this is due to using up tons of memory, unoptimized image convolution routines, or simply having to grovel through a lot of data.
4. also you can do some really terrible things to Illustrator's performance very quickly by applying a distortion mesh to a shape with a pattern fill that contains a lot of copies of its pattern
5. in general there's a lot of ways to send performance over a cliff by making the program generate a hell of a lot of shapes from simple rules - scatter brushes deposit a lot of copies of a shape along the path you draw, art brushes distort a shape along your drawn path, you can generate multiple paths with various programmatic effects applied to them from a single path...
You could probably edit moderately complex files with a theoretical vector package working under the handicap of being interpreted JS running in a neutered web browser. I did nice stuff with Illustrator way back in 2000, on machines that ran slower than a modern box would after the additional overhead of running compiled JS in a neutered web browser. But I sure wouldn't consider it for high-end work.
For that matter, I'm a pretty big fan of the chromebook model, so I'm all the more happy to see web based apps working, even if google seems to be taking a couple steps back in that space. I'm simply unsure if the time/effort it would take, combined with the differing skills required, would yield something better... more portable maybe, but not necessarily better.
Who knows though.
No, you just have to worry about JS frameworks that change every week, and HTML/CSS support which also tends not to be very stable over time. Forget about pixel-perfect UIs.
Also, the way people write those pseudo-desktop apps, their UI breaks in really funny ways when your Internet connection lags. This is not Electron issue per se, but the issue of culture around web tools.
Inkscape lets you search for elements with its XML editor, but that widget is really fiddly and doesn't support many really useful use cases (quickly toggling visibility, selecting multiple elements, regrouping by dragging elements out of a group...).
Sure, it's not open source, but it is free.
There are flags to do just that, but it's a very lengthy process, because you have to compile GTK and other libraries first.
It works well, although there are occasional bugs, such as the shortcut keys using Ctrl instead of Cmd, if I recall correctly.
brew tap caskformula/caskformula
brew install caskformula/caskformula/inkscape
I know you have to learn to love these things when you use them daily, and you can isolate yourself from the most offensive bits through gradual configuration... but when you step back and look at it objectively, c'mon, it's a steaming pile o' crap.
Oh yeah, and moving to gitlab isn't going to help that.
I'm very grateful to my designer friend who recommended it to me.
It's 50 Euros... just buy it and get happy :)
Thankfully, with Docker, hosting complex software has become easier. It’s still not free though.
There’s also the thing about availability. Even with GitHub going down every now and then, a self-hosted solution will probably do worse.
The issues, wiki, CI, and user ecosystem of GitHub aren't portable. If GitHub shut down, changed its policies, or simply did something you don't like, you have no recourse if you bind your software to their platform.
If you use gitlab, phabricator, gogs, or any other open source solution, either you are hosting it yourself - everyone wins - or you are using a centralized host equivalent to github, except now if they pull the rug out from under you you still have all the open source releases up to that point that you can pivot to - and you wouldn't necessarily need to self host, because someone will pick up the reins.
This is not in GitLab yet. What is in Gitlab is a way to export your projects including all the issues so you can import them on a self hosted server without losing fidelity. But that doesn't change the day to day workflow.
Also, the more players we have the less proprietary web interfaces for things like issues and PR will dominate.
I'm curious what you mean by this. In a world of discontinued acquihired services and breaking changes, firebase has just kept working across major organizational and product changes.
For example you can still use the old client libraries and URLs from before google bought them and it just works. REST urls, admin UI urls, everything redirects perfectly.
But git isn't putting all your eggs in one basket. It's a distributed version control system. GitHub is one way to access your source but it doesn't have to be the only one. That's purely a choice made by folks.
If you want to put your eggs into many baskets then great! But that doesn't prevent you from using GitHub at all.
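As a sketch of the many-baskets setup: git happily pushes the same history to several remotes. Here two local bare repos stand in for GitHub and GitLab, so the example is self-contained; in practice the remote paths would be real hosting URLs.

```shell
# Two bare repos stand in for two hosting services.
cd "$(mktemp -d)"
git init -q --bare hostA.git     # stand-in for github
git init -q --bare hostB.git     # stand-in for gitlab

# A working repo that pushes the same history to both.
git init -q work
cd work
git config user.email a@example.com
git config user.name a
git commit -qm "initial" --allow-empty
git remote add hostA ../hostA.git
git remote add hostB ../hostB.git
git push -q hostA HEAD:refs/heads/master
git push -q hostB HEAD:refs/heads/master

# Both "hosts" now serve the identical commit:
git ls-remote ../hostA.git refs/heads/master
git ls-remote ../hostB.git refs/heads/master
```

If one basket disappears, the other (and every clone) still holds the full history.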
In theory, yes. In practice you quickly end up having long discussions in "pull requests" and the like, with all references using github.com URLs. If (let's hope that doesn't happen) GitHub one day does what SourceForge did and injects ads and malware, you're in trouble, as everybody still points to their domain.
As soon as you start using PRs, you almost certainly have part of your project's version control history (the reasons for changes) in the PR discussions, and now if GitHub ever shuts down you have data loss.
I'm just waiting for GitHub to be bought up by MS and turn into Sourceforge 2.0.
github is losing money like crazy while gitlab and bitbucket are actually profitable.
Sure it's possible but I don't get it. Don't forget as more of these OSS projects take advantage of GitLab's free repositories the more money GitLab will lose.
They're not expecting to be profitable right now, but their losses have risen quite a bit.
There was a recent discussion about ElectronConf (organised by GitHub) that highlights this.
It doesn't have a culture so this is really inflammatory, and unnecessarily so. Or were you referring to GitHub the organization and not the set of people that use it?
They are also somewhat infamous for having hired one self-proclaimed "notorious SJW" who previously harassed a GitHub user for his contribution to some unrelated Twitter flamewar and later went on to work at GitHub on "community management and anti-harassment tools". Go figure.
Git is a distributed system but almost every single time GitHub goes down you get people complaining about OSS and others using / relying on GitHub but honestly you really don't have to at all. Git was designed so you wouldn't need to.
So yeah hating on GitHub can be a bit popular on HN.
How in the heck did Canonical squander such an incredible opportunity to be the de facto standard for Ubuntu/FOSS code hosting by letting Launchpad go stale so badly?
They freaking built it into their distribution of apt with PPA shortcuts, etc.
Apart from realizing too late that people had jumped onto git instead of their Bazaar, the PPA thing is also limited to Ubuntu: no support for other distributions, not even Debian. If they had made it a more universal solution back in the day, this surely would have been a compelling argument for a lot of projects to move there. Another pain point is the UI; maybe not as bad as SourceForge, but still cluttered and confusing.
For instance, see this FAQ answer: it comes right out and says that they have no interest in making it easy for anyone else to run instances of Launchpad, and that they only open-sourced it so that outside users could contribute improvements to Canonical's hosted service. I'm not surprised that nobody wants to work on a project like that unless they're getting paid to do so.
They hired bzr developers, including some really bright folks (Robert Collins, who I knew from the Squid project, was one of them, but there were others).
Had git not exploded in popularity and pretty much decimated the competing DVCS projects (or just grew so much faster that the end result was the same...tiny percentage user base for everything other than git), the landscape may have looked different. It took several hosting services a while to figure out that git was all that mattered.
Hell, even Microsoft and Google gave up on hosting code. If they can't do it, how can we expect Ubuntu to get it right enough fast enough? Github both got lucky and made some very good decisions very early. Survival bias looks like anybody could have done it...but, a lot of stuff had to fall into place, and it wasn't obvious to everyone what it ought to look like or how to get there.
To be clear: the resources+alternatives aren't a problem, it's big G's inability to convert them into a revenue stream with light support/maintenance costs.
But suddenly "everything" was on git; github was a rocket; bzr just felt slower and slower as my repos grew; launchpad was incredibly confusing; I'd been using git/github for outside projects and finally tried git for one of my own, and then started getting confused between the UIs. I picked the one that had the better trajectory. Also, we (the startup I was with) were starting to hire people and using git was easier than forcing the new people to learn bzr.
Mercurial looks great, and always has. I just never really had an incentive to switch to it.
Edit: Sorry, got lost in my own story. I agree with the siblings here. "junk" is rather strong for something that's so universally used. I don't love git, but it definitely won my support.
> To be fair git is junk;
git's UI could certainly benefit from some simplification ; this doesn't make git junk, especially considering that it's incredibly fast and reliable.
> it's just the standard so everyone has mostly learned it and it gets the job done, but at least Mercurial is better. Probably bzr too.
Bazaar and git actually have lots in common, they both have the advantages of DVCS (which seem to be often confused with "the advantages of git") (like, say, "rename"?)
Here are the main differences I noticed:
Bazaar is definitely slower, but you need a big-open-source-sized repo before noticing the difference.
Bazaar doesn't have "rebase" by default, but you can install it as a plugin and it works.
Bazaar has an optional "automatic push after each commit", which encourages linear history, and which I find to be more appropriate 90% of the time in a small, experienced team (small and frequent commits instead of feature branches).
Bazaar's UI uses revision numbers instead of commit hashes; this makes lots of things easier, like knowing which of two commits came first, telling a commit number to your coworkers, or bisecting without requiring support from the VCS (git-bisect).
Bazaar allows remotely fetching a single revision from a repo without downloading the whole history. You can't do this with git. The best I found was to loop "git fetch --depth N" with an increasing value of N until "git checkout <commit_number>" succeeds. This is a pain, especially when working with submodules.
Bazaar doesn't have an index (aka "staging area"), and by default "bzr commit" will commit all the local modifications. Considering that partial commits are dangerous (test suites generally don't track what's in the index, creating a mismatch between "what was tested" and "what's being pushed"), this is a welcome simplification (of course, you can "git stash"/"bzr shelve" if needed).
Bazaar doesn't distinguish between a branch, a repo, and a working copy. All of these are repositories (potentially sharing ancestors) accessible through a URL. So there's no need for "remotes"; you directly push/pull to/from a URL (no, you don't have to re-specify it each time).
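For what it's worth, the "increasing --depth" loop described above can be sketched like this. A small local `origin.git` repo (with made-up commits) stands in for the remote so the example is self-contained; in practice the origin would be a real URL and the target a real SHA.

```shell
set -e
tmp=$(mktemp -d)

# Build a toy "remote" with five commits.
git init -q "$tmp/origin.git"
cd "$tmp/origin.git"
git config user.email a@example.com
git config user.name a
for i in 1 2 3 4 5; do
  echo "rev $i" > file.txt
  git add file.txt
  git commit -qm "commit $i"
done
target=$(git rev-parse HEAD~3)      # an old commit we want to reach

# Shallow clone, then keep deepening until the target commit exists.
git clone -q --depth 1 "file://$tmp/origin.git" "$tmp/shallow"
cd "$tmp/shallow"
depth=1
until git cat-file -e "$target^{commit}" 2>/dev/null; do
  depth=$((depth * 2))
  git fetch -q --depth "$depth" origin
done
git checkout -q "$target"
git log -1 --format=%s
```

Doubling the depth keeps the number of round trips logarithmic, but it can still transfer far more history than the one revision you wanted.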
I hope no git user was harmed during the reading of this post :-)
The flow goes like this for me:
- Make small amount of new code backed by tests in a feature branch
- Run some tests locally (not all)
- Push to the feature branch
- CI will run all the tests
- If a test fails, I get a notification about it in my IDE and can fix it immediately
I personally enjoy having staging area (and IDE supported local stash) that help me keep small amount of difference to the origin for development purposes.
Mercurial works this way too. I strongly prefer Mercurial to git, but I like partial commits and the staging area is one of the few things about git I wish Mercurial had.
There is also `git clone --depth 1` but you have to use ssh+git protocol
If this is a frequent case for you, you could easily add an alias for this.
Initialized empty Git repository in /tmp/phobos/.git/
$ git remote add origin https://github.com/dlang/phobos.git
$ git fetch --depth=1 origin 6e5cdacfa6ac018c6ef42aa9679893676f293f21
error: no such remote ref 6e5cdacfa6ac018c6ef42aa9679893676f293f21
However, the commit exists:
$ git fetch --depth=1000
[... a long time after ...]
$ git checkout 6e5cdacfa6ac018c6ef42aa9679893676f293f21
Note: checking out '6e5cdacfa6ac018c6ef42aa9679893676f293f21'.
HEAD is now at 6e5cdacf... phobos 0.2
$ git fetch --depth=1 origin 6e5cdacfa6ac018c6ef42aa9679893676f293f21
error: Server does not allow request for unadvertised object 6e5cdacfa6ac018c6ef42aa9679893676f293f21
$ git fetch --depth=1 origin phobos-0.2
This also doesn't work on gitlab.
$ git clone -q --depth=10 https://gitlab.com/fdroid/fdroidclient.git
$ cd fdroidclient
$ git fetch --depth=1 origin 5d2c2bc6e636e40eee80c59d1de6c1eff0ba4472
error: no such remote ref 5d2c2bc6e636e40eee80c59d1de6c1eff0ba4472
$ git fetch -q --depth=200
$ git checkout 5d2c2bc6e636e40eee80c59d1de6c1eff0ba4472
Note: checking out '5d2c2bc6e636e40eee80c59d1de6c1eff0ba4472'.
HEAD is now at 5d2c2bc... Merge branch 'fix-npe-verifying-perms' into 'master'
Please be honest: did you see it work at least once?
> If you want the first command to work, you will probably have to host somewhere other than github
Please keep in mind that those are generally third party projects.
Convincing all the maintainers to move away from github is going to be hard.
It seems we're back to square one.
No, I never tried this before. I was just guessing based on the client error message. But a quick look in the source reveals that this is indeed a server setting that defaults to false:
If fetching non-tagged, non-branchhead commits is actually a frequent use case for you, you could ask github whether they might change their config. You are not the first person to want this: https://github.com/isaacs/github/issues/436
> It seems we're back to square one.
Almost :). You said:
> Bazaars allows remotely fetching one single revision from a repo, without requiring to download the whole history. You can't do this with git.
As it turns out that is not correct – git can absolutely do that. But the two biggest hosters don't allow it.
"However, note that calculating object reachability is computationally expensive. Defaults to false."
"it is off by default and generally advised against on performance reasons."
> > Bazaars allows remotely fetching one single revision from a repo, without requiring to download the whole history. You can't do this with git.
> As it turns out that is not correct – git can absolutely do that. But the two biggest hosters don't allow it.
I stand corrected; I should have written instead: "git's design makes this operation so expensive that git disables it by default, which means you can't use it with github and gitlab, and probably the vast majority of git servers in the world, making it unusable in practice".
> you could ask github whether they might change their config.
Git is like C: it's very good at what it was intended for, but everyone and their mother using git on the command line for VCS now is as if everyone had just used C to do every website, game, and app since 1980.
Git, like C, is a solid core but by now we should have better abstractions to help people be even more productive.
> you can only really use Git if you understand how Git works. Merely memorizing which commands you should run at what times will work in the short run, but it’s only a matter of time before you get stuck or, worse, break something.
which is exactly why a lot of people claim that git is shit. The fact that it "won" makes those people more angry.
OTOH, I'm the kind of guy who just must take everything apart before using it, and I like the brutal simplicity of git ;)
The naming of commands could be better, though.
Who cares? git won; you need to use it if you are in this field. Yes, I liked bzr much better, I even liked darcs better, but what can you do? git won, good or not. That git sucks is indisputable; nonetheless we need to learn it, and this tutorial is what made it possible for me to make some peace with git.
Oh and neutering git reset --hard because it's incredibly dumb for a version control system to just throw away shit. Instead, a backup commit is made first (and some other minor goodies): https://gist.github.com/chx/3a694c2a077451e3d446f85546bb9278
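This isn't the linked gist, but a minimal sketch of the same idea: commit everything to a throwaway commit before resetting, so the discarded work stays reachable from the printed hash (and the reflog).

```shell
# Hypothetical wrapper: back up all local changes as a commit,
# then perform the hard reset.
safe_reset_hard() {
  git add -A
  git commit -qm "backup before hard reset" --allow-empty
  backup=$(git rev-parse HEAD)
  git reset -q --hard HEAD~1
  echo "discarded changes saved as $backup"
}

# Demo in a scratch repo:
cd "$(mktemp -d)"
git init -q
git config user.email a@example.com
git config user.name a
git commit -qm init --allow-empty
echo "precious work" > notes.txt
safe_reset_hard
git show "$backup:notes.txt"    # the "lost" file is still recoverable
```

The working tree ends up clean, exactly as with a plain `git reset --hard`, but nothing is actually thrown away until the backup commit is garbage-collected.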
I really wish people would stop talking about computer tech as "winning" and "losing". I mean, it's slightly better than "X is the new Y Killer from X Corp" bullshit we used to get, but its still ridiculous.
Mercurial, SVN, Darcs etc are all valid tools to use and all are maintained.
> you need to use it if you are in this field.
Wow, cargo culting much?
You should be familiar enough to use it when required, sure.
You don't need to use it if you're starting a new project. My client projects default to Mercurial, and I'll give them help getting up and running with hg if they aren't familiar already.
If you have developers who want to collaborate on your work, who are able to get git to do what they want, but who object to using something like Mercurial, you need to question their motives.
They're either not smart enough to actually use git, and instead just memorise commands without any clue what they're doing, OR they are objecting because we all know cool kids use git and they are a cool kid.
2. Mozilla has also standardized on Mercurial
3. Mercurial’s changeset evolution is awesome: https://www.mercurial-scm.org/doc/evolution/
Is there some set of problems plaguing git users that I've just never run into?
Somehow, I rarely see a Mercurial tutorial give that same advice unless you are doing something really experimental.
I don't want to pounce on you just because you prefer Mercurial to git, so this isn't really directed at you, but in general this line of argument is always a bit frustrating to me. I've never lost data with git, but I've lost data with Mercurial several times because of the terrible UI of "hg resolve", which throws away your changes without warning unless you remember to type "hg resolve -m". None of git's questionable UI decisions (and there are many) has caused me remotely as much trouble as "hg resolve".
You have to use "git add" on a bunch of files that you have used "git add" on before.
As far as I can tell, every other revision control system tracks a file for "commit" once it has had even a single "add". This is the default case and what 99% of people want--"I told you to keep track of the file. Now keep track of it until I tell you otherwise."
git is the only revision control system I know of where I have to "git add" the same file over and over and over and over ... before doing "git commit".
But that is fairly standard git UI practice--"Optimize the 1% case and make the 99% case annoying."
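To be fair, git does have a flag for the commit-everything-tracked case: `git commit -a` stages every already-tracked modified file, so the one-time `git add` really is one-time for that workflow. A small demo in a scratch repo:

```shell
cd "$(mktemp -d)"
git init -q
git config user.email a@example.com
git config user.name a

echo v1 > file.txt
git add file.txt            # the one-time "start tracking" add
git commit -qm "v1"

echo v2 > file.txt          # later edit: no re-add needed
git commit -aqm "v2"        # -a stages all tracked, modified files
git show HEAD:file.txt
```

Whether having to remember `-a` counts as good UI is, of course, the point under debate.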
It will, however, nuke changes that are not committed. Which is exactly what I use it for... But your script sounds like a decent solution if you want that to be undoable as well.
That's like complaining that git threw away your changes because you forgot to commit them before pushing. Yes, hg resolve is a little bit confusing the first time you encounter it. But all you're losing is the conflict resolution. You didn't lose the two heads that you were trying to merge, nor did you lose the conflict markers.
If that's the only place that confused you in hg's interface, then it did a way better job than git in its user interface.
I do agree ending up with git was unfortunate. After trying all the popular choices, Monotone was what I would have picked. Fast, secure, portable and simple.
But Linus didn't like it because it supported cherry-picking, which apparently is an anti-feature for a VCS :)
Monotone was slow back when git didn't exist, but later versions greatly improved that.
Also, it's clear Linus didn't like cherry-picking, but other people convinced him to add it to git.
I'll confess that I feel that github will eventually wane. But, I can't say why I think that right off. When they run out of money? Start failing at the hosting portion of what they do in favor of other things?
Not just ads. Bundling malware into installers of popular projects.
Book: Innovators Dilemma
I'm not saying I'd bet money on Github being a front runner in say 5 years, but it isn't that unlikely for them to still be the site in 5 or 10 years.
GitHub's major selling point is mindshare/popularity, and git by default makes moving to another git hosting service relatively trivial. Yes, there are issues to transfer, but it's nowhere near as difficult as in the days of SVN, where you'd need to have shell access to dump the repo, or hope it was a modern enough svn version to support remote dumps (and that they didn't time out).
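That "relatively trivial" move is basically a mirror clone plus a mirror push. A sketch with local paths standing in for the old and new hosts (in practice both would be remote URLs):

```shell
cd "$(mktemp -d)"

# Fake "old host" with some history on it.
git init -q --bare old-host.git
git init -q seed && cd seed
git config user.email a@example.com
git config user.name a
git commit -qm "history" --allow-empty
git push -q ../old-host.git HEAD:refs/heads/master
cd ..

# The migration itself: mirror clone, then mirror push to the new host.
git init -q --bare new-host.git
git clone -q --mirror old-host.git migration.git
git -C migration.git push -q --mirror ../new-host.git

# The new host now serves the same refs as the old one.
git ls-remote new-host.git refs/heads/master
```

`--mirror` carries every branch and tag, which is exactly why issues, wikis, and PR discussions are the part that actually locks you in.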
Thanks TranquilMarmot for posting the link to the repo.
We detached this comment from https://news.ycombinator.com/item?id=14534872 and marked it off-topic.
They seem to want to differentiate themselves (e.g. as "not Photoshop" in GIMP's case), but seem to equate that with ignoring good UI/UX design.
I've been using Photoshop for a long time, and I've learned a lot of its shortcuts and intricacies. Basically, when I want to accomplish something in Photoshop, I already have an idea on how I'll go about it, using the functionality that's available. But GIMP, on the other hand, never really clicked for me. I find it very unintuitive and limiting, and it's a huge pain to have to do something in GIMP when Photoshop's not readily available. I've convinced myself that this is because GIMP has a much inferior UX and is orders of magnitude more limiting than PS (at least the subset of their features I use in my day-to-day usage).
On the other hand, since my light vector editing needs have been satisfied by Photoshop for a long time, I haven't really learned Illustrator. Recently, for various reasons, I've had to do some heavier-than-usual vector editing stuff, but still nothing requiring more than simple Beziers, fills and strokes, so I've been doing it in Inkscape since it's just been handy. After some time, I decided to try and use Illustrator, figuring it'd be like a whole new world. And then, surprisingly, I realized I don't really like it. The interface was illogical and not in line with my mental model at all. I struggled to complete basic tasks, and finally gave up and did the job in Inkscape. Basically, it was very reminiscent of the Photoshop―GIMP situation.
So my conclusion is that the tools and their UX are very powerful in giving me a mental model of a task, and significantly more so than I would have imagined. So it might not be 100% true to say that the UX in these tools is inferior. It's just so different from what we're used to that we have a very, very hard time separating the "different" from "worse" in our heads.
The expected behaviour in Photoshop is that if you drag-move, you only move the selected layer. It seems like in GIMP if you drag-move, you drag the highest layer that has a painted pixel under your cursor. There are probably situations where this saves time, but more often I try and drag some text around and I end up moving a background layer by mistake.
I often find myself using Inkscape to save time, it's intuitive enough and it works well.
All in all, I've always found Gimp more intuitive and easy to use than Photoshop, probably because I learned it first!
It does take a few days for things to sink into your noggin after so long in a different tool, but I certainly think Inkscape's quality is very high. It's worth learning, unlike (imo) GIMP.
I don't think the Inkscape UI is really that hard to understand, it's just different. Granted, I haven't used Photoshop and Illustrator since version 6.0, but I had no trouble getting into Gimp or Inkscape within a couple of days of using it. I didn't find it hard to grok the UI and the intended behavior, it became pretty usable very quickly (bugs aside...).
As a developer, I can easily understand what the intended behavior in these programs is, everything is very logical.... but maybe that's the problem many users have ;)
Edit: and I can see how the bugs can confuse users to no end. That might even be the greatest disadvantage, especially since Inkscape was really buggy the last time I used it half a year ago :(
It's easy to understand the behavior and UI from a programmer/developer's perspective, but the main portion of the target group in this sector thinks completely differently.
For a thing that should probably take about an hour, it takes me half a day, to relearn the handful of things I need to do.
Which, I think, is pretty good, considering I'm not at all a graphics person, and usually have to search around just to discover the terms I need to search for how to do what I want.
My results are amateurish, but good enough, and probably wouldn't exist without Inkscape. I like it.
Inkscape's interface isn't too bad, not good, but I rarely have significant issues with it. GIMP is an absolute nightmare of UI/UX design, I can't understand how someone hasn't fixed that by now.
The issue, as you point out, is lack of designer. Both Qt and GTK can be extensively customised, but you need to know what you're doing (in an artistic sense).
Of course you can understand that. Try writing actual code to fix any pet peeve with GIMP's UI/UX (or any app's UI/UX). Then we'll sit down and talk again about the role of volunteers in a free software project :)
The bad thing about the UI at the moment for me is the lack of HiDPI support (as far as I can tell). And back when I was using OSX, the interface seemed a bit janky since it used XQuartz and it generally wasn't very native both in terms of UI rendering and its default keyboard shortcuts.
Nope, we bloody well know GIMP's UI sucks in many ways. But UI doesn't fix itself as you might suspect. For every few thousands of people who complain about its UI we maybe get just one occasional contributor.
Adobe Illustrator is, for a novice, surprisingly not that much better.