Inkscape Moves to GitLab (inkscape.org)
525 points by dabber on June 11, 2017 | 208 comments



I used to use Inkscape constantly on Windows & Linux, and really like it. I found the UI intuitive and it did absolutely everything I asked of it.

Which is why the XQuartz/&c. user experience on macOS really, really surprised me. It's absolutely unusable. Inkscape for macOS basically may as well not exist, as far as my experience with it goes.

Are there other comparable GTK+ apps that work well under macOS or is this a common story?


There's a branch/package by one of the devs that uses the native menu bar, native copy paste, drag and drop, and a few other things that make it feel like an actual Mac application. No clue why it isn't the official distribution, but it's been that way for years.

https://inkscape.org/cs/~su_v/galleries/


This hasn't been updated in 2 years.


I used to use that, and for my needs (pretty lightweight) it was fine.

Apparently the Inkscape developers were looking for this OSX-native-like version to be further developed before they'd promote it. Which is a shame, as release-early-release-often seems like it would have been a better idea. Oh well. :/


I ran into this issue and ended up using Affinity Designer. I was pleasantly surprised; it's actually quite comparable to Adobe Illustrator. It's not open source, but at $50 it's much more reasonable than Adobe.


I second this. I've been using Affinity Designer for posters, UI mock-ups, icons, etc. Honestly, I can't use Inkscape at all. I find the interface highly bloated and the whole application generally slow.


How does Designer compare to Illustrator feature-wise, and as far as usability? We already have Illustrator. Is it worth adding Designer?


I use the two side by side. At times I've found Designer's SVG support actually worked better than Illustrator's. Illustrator is definitely faster, but there are quite a number of useful pixel/recolouring features in Designer. Generally, though, Illustrator's maturity and advanced features (mesh, isolate edit, Pathfinder!) eclipse what Designer brings to the table.

I was a long time Fireworks holdout, and Designer is maybe the closest in feel compared to Sketch or Illustrator (and they're copying each other better now).

All said, the bang for buck is great though, and UI is way easier than Inkscape I gotta say ;)


Just another data point: to me Affinity Designer actually seems to render much faster than Adobe Illustrator (on a 2015 Macbook Pro with Intel integrated graphics chipset).


Thanks for this: I didn't know about Designer, and was similarly let down by how much less pleasant Inkscape was on Mac, coming from a lovely experience of it on Windows.


On Affinity Designer's site the price is $39.99, so even more reasonable.


20% off for a while with the launch of Affinity Photo for iPad at Apple's WWDC last week.


Affinity Designer development seems to have stalled. There are many requests for features that have been accepted but untouched for a year or two. My current showstopper: export to DXF.


I don't feel that "stalled" is correct at all. They had a massive feature update via 1.5 in Oct., and 1.6 entered customer beta in May.

They just haven't shipped your preferred feature.


Ye gods, someone's pet peeve hasn't been touched in a year, and that means development has "stalled". Young people today.

Do you have any idea how long Illustrator users had to wait for artboards to land?


Are you playing with a cheap laser engraver off ebay?


One problem is that they are still on Gtk 2.x. There is so much work that was done in 3.x over the last 7 years that improves things.

Visual Studio for Mac is also Gtk 2.x, but they (Xamarin at the time) invested heavily in a custom Gtk fork as much of the hacks were not suitable upstream.


I just launched Inkscape, and noticed the port seems to be done! In Fedora 26, they ship Inkscape 0.92 which uses Gtk+ 3.


0.92.x does not use gtk3 (there is a compile flag to enable it, but it's still experimental)


You're both right. Fedora enables this option on F26.

    $ rpm -qa | grep inkscape
    inkscape-0.92.1-4.20170510bzr15686.fc26.x86_64

    $ ldd /usr/bin/inkscape | grep gtk
    libgtkmm-3.0.so.1 => /lib64/libgtkmm-3.0.so.1 (0x00007f12a6a2f000)
    libgtk-3.so.0 => /lib64/libgtk-3.so.0 (0x00007f12a4dbb000)


I wonder ... if they should do what Wireshark did, and abandon GTK for Qt.


Probably, and I've heard from some Inkscape devs that they'd like to. But it's a pretty massive project, which would directly result in the current developers being less proficient at the UI toolkit they use.

Qt is an easy sell on fresh projects, but switching a large existing codebase is huge.


At this point moving to Qt for something as large as Inkscape would be too massive I imagine. Just +1'ing your point.

Inkscape's UI uses so many common GTK elements that switching would really hose some of the developed creature comforts the community has gotten used to. It'd be a huge project with not a lot of ROI to go and redo all the chrome.


Sure, but how does it compare with gtk 2 vs 3? We're already talking about replacing a working toolkit with a significantly different one.


> Qt is an easy sell on fresh projects, but switching a large existing codebase is huge.

There have been a lot of major projects ported to Qt over the years, both commercial and open source (e.g. Wireshark, Subsurface).


I'm well aware. We've also sent spacecraft to Mars. Porting a large codebase from one UI toolkit to another, all the while giving up hard-earned knowledge in a volunteer-driven project is still a super daunting task.


I'd argue that this depends a lot on how the code is structured. Back in the day of Win32 and MFC it was not unusual at all to very tightly integrate logic and the UI, often even outsourcing parts of state machines to GUI controls etc.; such a code base is difficult to port to anything. If, on the other hand, the application is written in a more layered approach, then porting from one toolkit to another may mean re-engineering the UI and only the UI. Complex custom controls mentioned by Raphael below are another factor.


It's a graphic design program. If there's a project where tight integration between the graphics library and the rest of the codebase is expected (for performance reasons) and hard to avoid, it's here.

In Wireshark, for example, the meat is in the networking code, and the important skills of the developers are there; in Inkscape, on the other hand, the non-trivial contributions probably happen very near the graphics frontend. If GTK goes to the gutter, retraining the developers to get to the same level in Qt will set Inkscape back years.


I'd argue that both Wireshark and Subsurface have much simpler UIs (and use mostly standard widgets) than Inkscape.


If Wireshark is any example, definitely not. I've not met anyone that likes the Qt version (in VoIP, it's an indispensable tool used all the time) - it seems like it lost a ton of polish.


This is somewhat off-topic, but I'm curious since every person who has had this sentiment that I've known was unable to provide specific parts which were lacking. Do you perhaps have a short list of top things which are missing their polish with the Qt version of Wireshark?

I'm just a light user of the tool and am genuinely curious about what things it may have lost in the transition.


I'm using it for debugging barcode scanners talking to a web service. I much prefer the Qt-based UI. It's much faster and obviously feels much more native to the platform.


> more native to the platform

For a specific platform, or multiple?


The Qt version is miles ahead on OSX since it's no longer a clunky X11 app.


Qt works much better than GTK+ on both Windows and macOS. Unlike with GTK+, it's much harder to distinguish a Qt app from a native app on those platforms.


Cue the downvotes, but at least if you use Electron and build a custom UI you don't need to worry about it being unsupported or poorly supported in the future on certain platforms. It's just HTML + CSS which is going to be around a lot longer than the current iteration of UI frameworks.

(Ok yes, you could still face issues with the underlying runtime being unsupported, but I have a feeling that and JavaScript are going to stick around for a while)


>Cue the downvotes, but at least if you use Electron and build a custom UI you don't need to worry about it being unsupported or poorly supported in the future on certain platforms.

Yes, you won't have to worry: you'll be certain that it is poorly supported today and in the future on all platforms, as far as the topic we're discussing (native UI likeness) goes.


I'll take a non-native looking app that works (every Electron app I've tried on macOS) over XQuartz-based garbage that hangs forever at startup, as the current "stable" Inkscape release does for me.


Maybe, but not for a graphics editor.

Slack hogs RAM and burns battery like crazy, and it's a bloody chat app -- as simple as it gets.

Now imagine an Electron version of Inkscape with complex vector editing, gradients and co on screen...


It's not even very hard to imagine. Just go to any web-based diagram drawing service (there's plenty of that), and look how sluggish and crappy it feels even when performing simplest and most basic tasks. Now multiply that crappiness 10x for the complexity of vector drawings (100x if you're doing vector art), and additional 10x for the complexity of the effects vector formats can handle - and you'll get a ballpark of how Inkscape would feel if it was an Electron app.


I've used https://app.moqups.com/ for diagramming/wireframing for a few years, and it doesn't feel sluggish to me at all. Feels pretty snappy, in fact.


We don't have to imagine. Someone has made a clone of Sketch for the web: https://app.designer.io/. There's an app to download as well.

It doesn't match the performance of a native equivalent, but doesn't seem unreasonably slow or bloated either.


There are also two other Electron-based vector graphics editors on the app store:

- Boxy SVG - https://itunes.apple.com/us/app/boxy-svg/id611658502

- Vectr - https://itunes.apple.com/us/app/vectr/id1204645754


Can we tell without feature parity? If I were writing such a webapp, I'd skip features that were slow.


The OP suggested that no web/Electron based editor could have "complex vector editing, gradients and co on screen". This has nothing to do with achieving 100% feature parity, so yes, we can tell. Anything else is moving the goalposts.

100% feature parity is a meaningless test in any case. Sketch doesn't have 100% parity with Inkscape, which doesn't have 100% parity with Affinity Designer, which doesn't have 100% parity with Illustrator, etc.


Boxy SVG [1] is an Inkscape clone built with Electron and the Xel widget toolkit [2]. I would say its UI is on par with native macOS apps such as Pages or Keynote.

[1] https://itunes.apple.com/us/app/boxy-svg/id611658502

[2] https://xel-toolkit.org/


I think the downvotes are partly because of your "Cue the downvotes" at the beginning... beyond that, even as a big fan of Electron-based applications, it's not a great fit for a lot of things. I'm not sure whether a vector graphics program is a good fit or not, or where the edges in performance may be. I know that using it for heavy filters on raster art would be comparatively too slow for many.

Frankly, VS Code is probably the only moderately complex application that really shows off Electron. Many Electron apps just aren't that well written, and, not to besmirch any developers working in other toolkits, writing good JS code in a larger codebase is a different kind of skill than most are used to; beyond this, the techniques and approaches for performance gains are also fairly different.

I don't think Chrome, JS, Node or Electron are going anywhere any time soon. There are some definite value wins in that space. That doesn't make it a great fit here, but I'm happy to be surprised.


My experience with many years of pushing Illustrator's limits is that the big performance hits are:

1. Lots of transparency with GPU acceleration on - it very quickly becomes an order of magnitude slower than the CPU renderer, especially if you do tricks to generate 3-4 translucent shapes from every path you draw by hand like I do nowadays.

2. Adding new objects to a complex file starts getting super slow (somewhere around 4-5000 paths, less if you're generating lots of virtual paths via various effects) - I'm not sure if this is due to running out of physical memory, or trying to insert new items in the middle of a very large and complex list of items, or what.

3. A few large bitmap effects at 300dpi can very quickly bring Illustrator to its knees; I'm not sure if this is due to using up tons of memory, unoptimized image convolution routines, or simply having to grovel through a lot of data.

4. You can also do some really terrible things to Illustrator's performance very quickly by applying a distortion mesh to a shape with a pattern fill that contains a lot of copies of its pattern.

5. In general there are a lot of ways to send performance over a cliff by making the program generate a hell of a lot of shapes from simple rules - scatter brushes deposit a lot of copies of a shape along the path you draw, art brushes distort a shape along your drawn path, you can generate multiple paths with various programmatic effects applied to them from a single path...

You could probably edit moderately complex files with a theoretical vector package working under the handicap of being interpreted JS running in a neutered web browser. I did nice stuff with Illustrator way back in 2000, on machines that ran slower than a modern box would after the additional overhead of running compiled JS in a neutered web browser. But I sure wouldn't consider it for high-end work.


I agree... I only meant that there are differences, and the style of programming it takes for high-performance JS is sometimes counterintuitive to what one would do in another language. The browser is effectively a sandboxed operating system at this point... there's a LOT you can do there.

For that matter, I'm a pretty big fan of the chromebook model, so all the more happy to see web-based apps working, even if Google seems to be taking a couple steps back in that space. I'm simply unsure if the time/effort it would take, combined with the differing skills required, would yield something better... more portable maybe, but not necessarily better.

Who knows though.


> if you use Electron and build a custom UI you don't need to worry about it being unsupported or poorly supported in the future on certain platforms. It's just HTML + CSS which is going to be around a lot longer than the current iteration of UI frameworks.

No, you just have to worry about JS frameworks that change every week, and HTML/CSS support which also tends not to be very stable over time. Forget about pixel-perfect UIs.

Also, the way people write those pseudo-desktop apps, their UI breaks in really funny ways when your Internet connection lags. This is not Electron issue per se, but the issue of culture around web tools.


I've been very pleased with Inkscape for general draw program use. I use it to create Wikipedia illustrations, since Wikipedia prefers SVG.


For simple vector diagrams, I've found https://www.draw.io (similar to MS Visio) to be very convenient. The code is also open source (Apache-2.0): https://github.com/jgraph/draw.io


I didn't know about draw.io's license - that is great. I have been enjoying yEd [1] but have also been looking for other diagramming software, since rotating text is difficult in yEd and it's not free software [2].

[1] https://www.yworks.com/products/yed

[2] http://yed.yworks.com/support/qa/515/why-not-an-open-source-...


I agree that Inkscape is awesome, but the UI still could use some love in some places. What I am regularly missing, for example, is a widget that displays nested groups as intuitively as Adobe Illustrator does.

Inkscape gives you the ability to search for elements with its XML widget, but that one is really fiddly and doesn't support many really useful use cases (quickly toggling visibility, selecting multiple elements, regrouping by dragging elements out of a group...).


Like the Objects dialog in 0.92?


Oh, did I miss out on something? ...I'll have a look at that then, thanks for the hint! :-)


I found the same issue, so promptly moved to use BoxySVG, which I must say is pretty amazingly simple and modern.

Sure, it's not open source, but it is free.


When I was using OS X, I used to compile my own version using Brew that behaved like a native application.

There are flags to do just that, but it's a very lengthy process, because you have to compile GTK and other libraries first.

It works well, although there are occasional bugs, such as the shortcut keys using Ctrl instead of Cmd, if I recall correctly.


You know, I'd be willing to do that. Do you know of a good tutorial on this?


No need to compile GTK, etc. yourself. Homebrew has you covered.

    brew tap caskformula/caskformula
    brew install caskformula/caskformula/inkscape
Mentioned at the bottom of the page [1].

[1]: https://inkscape.org/en/download/mac-os/


You've just explained why you find it intuitive / better... (familiarity). Inkscape/Illustrator/whatever doesn't really have a layout, it's just: shit all over the place, some in tool palette thingimombobs, some hidden under contextual menus and file menus. Sometimes there is some effort to make mode based or hierarchy give some kind of sense of order but it's always artificial, there are too many tools and there don't need to be.

I know you have to learn to love these things when you use them daily, and you can isolate yourself from the most offensive bits through gradual configuration... but when you step back and be objective, cmon, it's a steaming pile o' crap.

Oh yeah, and moving to gitlab isn't going to help that.


Save yourself the hassle and buy Affinity Designer – it does a better job of most of what I did before with Adobe Photoshop & Illustrator (or Sketch for that matter).

I'm very grateful to my designer friend who recommended it to me.

It's 50 Euros... just buy it and get happy :)


It works okay-ish for me, although I pretty much never use copy&paste in XQuartz on mac. But I still can't name a better free vector graphics program on mac.


Me too. Not a great issue; though the XQuartz background thing kinda screws up Command+Tab, it's still perfectly usable, if a little rough around the edges.


GIMP works decently well under OSX. It didn't until recently, though; it used to be the same XQuartz story until they finished the native GTK+ build.


Every time a project moves to GitLab or GitHub it is great news; I find them much easier to contribute to. It's especially good news when it's GitLab; it's just an all-around awesome service.


It's really bad news when it's GitHub, as far as I'm concerned (Python...). Self-hosted GitLab or Phabricator is great news (GNOME,[1] Wikimedia[2]).

[1]: https://wiki.gnome.org/Initiatives/DevelopmentInfrastructure...

[2]: https://www.mediawiki.org/wiki/Requests_for_comment/Phabrica...


Self-hosting isn’t easy. It adds non-trivial administrative overhead. That time might otherwise be used “productively”.

Thankfully, with Docker, hosting complex software has become easier. It’s still not free though.

There’s also the thing about availability. Even with GitHub going down every now and then, a self-hosted solution will probably do worse.


Care to elaborate on the GitHub python issues for someone not totally in the loop? Thanks!


It isn't so much UX now, it is that you are bonding a lot of locked-in parts of your ecosystem to a proprietary platform.

The issues, wiki, CI, and user-ecosystem of github aren't portable. If Github shut down, changed their policies, or just simply did something you don't like, you have no recourse if you bond your software to their platform.

If you use gitlab, phabricator, gogs, or any other open source solution, either you are hosting it yourself - everyone wins - or you are using a centralized host equivalent to github, except now if they pull the rug out from under you, you still have all the open source releases up to that point that you can pivot to - and you wouldn't necessarily need to self-host, because someone will pick up the reins.


I really wish GitLab had its issues stored in a Git repo like it stores its Wiki. It would open up a lot of cool new workflows / collaboration.


I opened up an issue for this a year ago: https://gitlab.com/gitlab-org/gitlab-ce/issues/4084

This is not in GitLab yet. What is in GitLab is a way to export your projects, including all the issues, so you can import them on a self-hosted server without losing fidelity. But that doesn't change the day-to-day workflow.


I agree that Issues are pretty locked in, but otherwise it's fine, and you don't have to use GitHub Issues to track work if you don't want to. CI is mostly driven by hooks, and those should be portable. The wiki is just a collection of Markdown documents, so you could pull those off and put them anywhere else you want. Ecosystems are tied by default, so not much you can do there I suppose. I agree with the sentiment, but I think the risk is worth it for a lot of people, and I say that hosting my own private GitLab server.
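For instance, GitHub exposes each wiki as its own git repository, so it can be cloned and taken elsewhere (a sketch; "user/project" is a placeholder):

    # the wiki lives in a parallel repo next to the main one
    git clone https://github.com/user/project.wiki.git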


It's pretty easy to migrate off GitHub, if one cares to. I'm not sure why you'd be bonded really.


Why so much hate for GitHub? My general experience with them has been positive :)


I don't think it is hate, more that people have learned to not put all their eggs in the same basket. Ask any Firebase user what they feel about this.

Also, the more players we have, the less proprietary web interfaces for things like issues and PRs will dominate.


> Ask any Firebase user what they feel about this.

I'm curious what you mean by this. In a world of discontinued acquihired services and breaking changes, firebase has just kept working across major organizational and product changes.

For example you can still use the old client libraries and URLs from before google bought them and it just works. REST urls, admin UI urls, everything redirects perfectly.


Firebase increased prices three times in one year.


> I don't think it is hate, more that people have learned to not put all their eggs in the same basket.

But git isn't putting all your eggs in one basket. It's a distributed version control system. GitHub is one way to access your source but it doesn't have to be the only one. That's purely a choice made by folks.

If you want to put your eggs into many baskets then great! But that doesn't prevent you from using GitHub at all.


For a dumb git host GH is passable; the problem is with their additional features, which many projects use. If/when GH goes the way of the dodo, they will be at the mercy of third parties' ability to scrape and import this stuff.


> If you want to put your eggs into many baskets then great! But that doesn't prevent you from using GitHub at all.

In theory, yes. In practice you quickly end up having long discussions in "pull requests" and the like, with all references using github.com URLs. If (let's hope that doesn't happen) Github one day does what Sourceforge did and injects ads and malware, you're in trouble, as everybody still points to their domain.


Yes, if you use Github but don't use PRs you're not putting all your eggs in one basket.

As soon as you start using PRs, you almost certainly have part of your project's version control history (the reasons for changes) in the PR discussions, and now if Github ever shuts down you have data loss.


> I don't think it is hate, more that people have learned to not put all their eggs in the same basket.

I'm just waiting for GitHub to be bought up by MS and turn into Sourceforge 2.0.


Jokes aside, you actually make a good point.

github is losing money like crazy while gitlab and bitbucket are actually profitable.


GitHub is losing money but that doesn't mean GitLab and BitBucket are profitable. BitBucket is part of a larger organization so it really doesn't matter if it loses money. As for GitLab I can't find anything but by 2019 they're expecting SaaS to be the majority of the way they make money (which is identical to GitHub) so I'm not sure why you expect the incumbent to fail and the underdog to succeed with identical business models.

Sure it's possible but I don't get it. Don't forget as more of these OSS projects take advantage of GitLab's free repositories the more money GitLab will lose.


Source on that claim? If I remember right, the last numbers I saw published (from some bloomberg article last year) showed them spending a lot, but profitable.


Not parent but here you go: https://www.bloomberg.com/news/articles/2016-12-15/github-is...

They're not expecting to be profitable right now, but their losses have risen quite a bit.


Thx, that was the article I was thinking about, and clearly remembered wrong.


There is the fact that it is proprietary, and the fact that it's got most of the other eggs, but most important in my book is the fact that it has an incredibly unstable and racist culture.

There was a recent discussion about ElectronConf (organised by GitHub) that highlights this [1].

[1] https://news.ycombinator.com/item?id=14480868


> it has an incredibly unstable and racist culture

It doesn't have a culture so this is really inflammatory, and unnecessarily so. Or were you referring to GitHub the organization and not the set of people that use it?


Not the OP but I presume GitHub-the-company was meant since it is them who are behind ElectronConf, not their customers.

They are also somewhat infamous for having hired one self-proclaimed "notorious SJW" who previously harassed a GitHub user for his contribution to some unrelated Twitter flamewar and later went on to work at GitHub on "community management and anti-harassment tools". Go figure.


GitHub is great but there is a group of people, many of whom are on HN, who prefer OSS to be on non-proprietary systems. Personally I think you should just use whatever tool works best but not everyone agrees.

Git is a distributed system but almost every single time GitHub goes down you get people complaining about OSS and others using / relying on GitHub but honestly you really don't have to at all. Git was designed so you wouldn't need to.

So yeah hating on GitHub can be a bit popular on HN.


People had positive experience with SourceForge too, before they started bundling adware. I wouldn't really use these proprietary platforms for anything more than a dumb DVCS host.


Github praises open source but they sure do keep their own stuff as proprietary as possible. (Even their EE is obfuscated code)


None of the extra data (issues etc) is stored in git. However they do have a comprehensive HTTP API, so one can migrate away using that.
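For example, the issues can be pulled down page by page from GitHub's public REST API (a sketch; "user/project" is a placeholder):

    # fetch open and closed issues, 100 at a time
    curl "https://api.github.com/repos/user/project/issues?state=all&per_page=100&page=1"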



Question: is all the auxiliary project data (issues, etcetera) also stored in the git repository, or is that data stored separately by github (and thus impossible to migrate out of it)?


Separate. You can fetch most of the issues and so on as long as 1) GitHub does not change its policy, 2) you have an account, and 3) the project has not been deleted or somehow locked.


They play too much politics, I like to keep politics out of tech.


Your experience has been positive because the closed-source platform fueled by $350,000,000 in VC hasn't had to address revenue issues, yet.


Because they comply with government censorship [1]

[1] https://en.wikipedia.org/wiki/Censorship_of_GitHub#Russia


> During the decision about which platform would host our git repositories, we discounted staying on Launchpad itself as its git support was very weak compared to other platforms and the project doesn't appear to be actively developed.

How in the heck did Canonical squander such an incredible opportunity to be the de facto standard for Ubuntu/FOSS code hosting by letting Launchpad go stale so badly?

They freaking built it into their distribution of apt with PPA shortcuts, etc.

Unbelievable.


Apparently they started this thing right at the time when distributed version control systems started to take off, years before github (more or less at the same time git appeared). They were very early and clearly missed a big opportunity there.

Apart from realizing too late that people jumped onto git instead of their bazaar, the PPA thing is also limited to Ubuntu; there's no support for other distributions, not even Debian. If they had made it a more universal solution back in the day, this surely would have been a compelling argument for a lot of projects to move there. Another pain point is the UI; maybe not as bad as sourceforge, but still cluttered and confusing.


Yeah, among many other reasons, this is an example of Canonical's general proprietary attitude coming back to bite them.

For instance, see this FAQ answer[1]: it comes right out and says that they have no interest in making it easy for anyone else to run instances of Launchpad, and that they only open-sourced it so that outside users could contribute improvements to Canonical's hosted service. I'm not surprised that nobody wants to work on a project like that unless they're getting paid to do so.

[1]: https://answers.launchpad.net/launchpad/+faq/920


They bet quite heavily on Launchpad and related infrastructure, but a few things happened on the way to the opera, and here we are.

They hired bzr developers, including some really bright folks (Robert Collins, who I knew from the Squid project, was one of them, but there were others).

Had git not exploded in popularity and pretty much decimated the competing DVCS projects (or just grew so much faster that the end result was the same...tiny percentage user base for everything other than git), the landscape may have looked different. It took several hosting services a while to figure out that git was all that mattered.

Hell, even Microsoft and Google gave up on hosting code. If they can't do it, how can we expect Ubuntu to get it right enough fast enough? Github both got lucky and made some very good decisions very early. Survival bias looks like anybody could have done it...but, a lot of stuff had to fall into place, and it wasn't obvious to everyone what it ought to look like or how to get there.


Google and Microsoft obviously can do it; they just don't want to spend resources on a public service when there are many other alternatives.


Pretty sure Google can't do user-facing tools that have a greater support burden than "FYI type your query into the box" (and also don't generate revenue)

To be clear: the resources+alternatives aren't a problem, it's big G's inability to convert them into a revenue stream with light support/maintenance costs.


Are you implying bzr isn't the next big thing??


To be fair git is junk; it's just the standard so everyone has mostly learned it and it gets the job done, but at least Mercurial is better. Probably bzr too. I have probably 5 more years of experience with it than with Mercurial, and I'm still googling for the correct incantation for things that are just infrequent enough that I never commit them to memory. With Mercurial, the command is what you would expect 90% of the time.


I remember when dvcs really started taking off, I went with bzr because of the simplicity of the UI. I was also on Windows at the time (wow, that was a while ago!), and git didn't have native windows support back then while bzr did. I still much prefer bzr's UI from what I remember of it. It was simple, consistent, and perfectly fine with the occasional typo.

But suddenly "everything" was on git; github was a rocket; bzr just felt slower and slower as my repos grew; launchpad was incredibly confusing; I'd been using git/github for outside projects and finally tried git for one of my own, and then started getting confused between the UIs. I picked the one that had the better trajectory. Also, we (the startup I was with) were starting to hire people and using git was easier than forcing the new people to learn bzr.

Mercurial looks great, and always has. I just never really had an incentive to switch to it.

Edit: Sorry, got lost in my own story. I agree with the siblings here. "junk" is rather strong for something that's so universally used. I don't love git, but it definitely won my support.


I liked your story, and "junk" was definitely hyperbole. Git gets the job done.


Long time user of both bazaar and git here.

> To be fair git is junk;

git's UI could certainly benefit from some simplification; this doesn't make git junk, especially considering that it's incredibly fast and reliable.

> it's just the standard so everyone has mostly learned it and it gets the job done, but at least Mercurial is better. Probably bzr too.

Bazaar and git actually have lots in common; they both have the advantages of DVCS (which often seem to be confused with "the advantages of git") (like, say, "rename"?).

Here are the main differences I noticed:

Bazaar is definitely slower, but you need a big-open-source-sized repo before noticing the difference.

Bazaar doesn't have "rebase" by default, but you can install it as a plugin and it works.

Bazaar has an optional "automatic push after each commit", which encourages linear history, and which I find to be more appropriate 90% of the time in a small experienced team (small and frequent commits instead of feature branches).

Bazaar's UI uses revision numbers instead of commit hashes; this makes lots of things easier, like knowing which of two commits came first, telling a commit number to your coworkers, or bisecting without requiring support from the VCS (git-bisect).

Bazaar allows remotely fetching one single revision from a repo, without requiring you to download the whole history. You can't do this with git. The best I found was to loop "git fetch --depth N" with an increasing value of N until the "git checkout <commit_number>" succeeds, as sketched below. This is a pain, especially when working with submodules.
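In shell form, that workaround looks something like this (a sketch; $commit stands for the full SHA you want, and the starting depth is arbitrary):

    # deepen the shallow clone until the wanted commit is present
    depth=50
    until git checkout "$commit" 2>/dev/null; do
        depth=$((depth * 2))
        git fetch --depth="$depth"
    done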

Bazaar doesn't have an index (aka "staging area"), and by default, "bzr commit" will commit all the local modifications. Considering that partial commits are dangerous (test suites generally don't track what's in the index, creating a mismatch between "what was tested" and "what's being pushed"), this is a welcome simplification (of course, you can "git stash"/"bzr shelve" if needed).

Bazaar doesn't distinguish between a branch, a repo, and a working copy. All of these are repositories (potentially sharing ancestors) accessible through a URL. So there's no need for "remotes"; you directly push/pull to/from a URL (no, you don't have to re-specify it each time).

I hope no git user was harmed during the reading of this post :-)


If by "test suites" you mean automatic testing, then I probably don't agree.

The flow goes like this for me:

- Make small amount of new code backed by tests in a feature branch

- Run some tests locally (not all)

- Push to the feature branch

- CI will run all the tests

- If a test fails, I get a notification about it in my IDE and can fix it immediately

I personally enjoy having a staging area (and an IDE-supported local stash) that helps me keep a small amount of difference from origin for development purposes.


> Bazaar doesn't have an index (aka "staging area"), and by default, "bzr commit" will commit all the local modifications.

Mercurial works this way too. I strongly prefer Mercurial to git, but I like partial commits and the staging area is one of the few things about git I wish Mercurial had.


> Bazaar allows remotely fetching one single revision from a repo, without requiring you to download the whole history. You can't do this with git. The best I found was to loop "git fetch --depth N" with an increasing value of N until the "git checkout <commit_number>" succeeds. This is a pain, especially when working with submodules.

There is also `git clone --depth 1`, but you have to use the ssh+git protocol.


I don't want HEAD, I want a specific commit. How do I specify this from a "git clone" command line, as you're suggesting?


I don't think that's possible. You'd have to `git init`, `git remote add ...`, `git fetch --depth 1 MYREMOTE MYCOMMIT`.

If this is a frequent case for you, you could easily add an alias for this.


    $ git init
    Initialized empty Git repository in /tmp/phobos/.git/
    $ git remote add origin https://github.com/dlang/phobos.git
    $ git fetch --depth=1 origin 6e5cdacfa6ac018c6ef42aa9679893676f293f21
    error: no such remote ref 6e5cdacfa6ac018c6ef42aa9679893676f293f21

However, the commit exists:

    $ git fetch --depth=1000
    [... a long time after ...]
    $ git checkout 6e5cdacfa6ac018c6ef42aa9679893676f293f21
    Note: checking out '6e5cdacfa6ac018c6ef42aa9679893676f293f21'.
    [...]
    HEAD is now at 6e5cdacf... phobos 0.2


This looks like github configuration. My (probably newer) git tells me:

    $ git fetch --depth=1 origin 6e5cdacfa6ac018c6ef42aa9679893676f293f21
    error: Server does not allow request for unadvertised object 6e5cdacfa6ac018c6ef42aa9679893676f293f21
Note that this works fine and has the same effect:

   $ git fetch --depth=1 origin phobos-0.2
If you want the first command to work, you will probably have to host somewhere other than github.


> This looks like github configuration

This also doesn't work on gitlab.

    $ git clone -q --depth=10 https://gitlab.com/fdroid/fdroidclient.git
    $ cd fdroidclient
    $ git fetch --depth=1 origin 5d2c2bc6e636e40eee80c59d1de6c1eff0ba4472
    error: no such remote ref 5d2c2bc6e636e40eee80c59d1de6c1eff0ba4472
    $ git fetch -q --depth=200
    $ git checkout 5d2c2bc6e636e40eee80c59d1de6c1eff0ba4472
    Note: checking out '5d2c2bc6e636e40eee80c59d1de6c1eff0ba4472'.
    [...]
    HEAD is now at 5d2c2bc... Merge branch 'fix-npe-verifying-perms' into 'master'

Please be honest: did you see it work at least once?

> If you want the first command to work, you will probably have to host somewhere other than github

Please keep in mind that those are generally third party projects. Convincing all the maintainers to move away from github is going to be hard.

It seems we're back to square one.


> Please be honest: did you see it work at least once?

No, I never tried this before. I was just guessing based on the client error message. But a quick look in the source reveals that this is indeed a server setting that defaults to false:

https://git-scm.com/docs/git-config#git-config-uploadpackall...

If fetching non-tagged, non-branchhead commits is actually a frequent use case for you, you could ask github whether they might change their config. You are not the first person to want this: https://github.com/isaacs/github/issues/436
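For what it's worth, on a server one controls, the option named in those docs is a per-repository setting (a sketch, assuming the truncated link points at the allowReachableSHA1InWant knob):

    # server-side, in the hosted repository; GitHub/GitLab don't expose this
    git config uploadpack.allowReachableSHA1InWant true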

> It seems we're back to square one.

Almost :). You said:

> Bazaar allows remotely fetching one single revision from a repo, without requiring you to download the whole history. You can't do this with git.

As it turns out that is not correct – git can absolutely do that. But the two biggest hosters don't allow it.


Extract from the pages you linked:

"However, note that calculating object reachability is computationally expensive. Defaults to false."

"it is off by default and generally advised against on performance reasons."

> > Bazaar allows remotely fetching one single revision from a repo, without requiring you to download the whole history. You can't do this with git.

> As it turns out that is not correct – git can absolutely do that. But the two biggest hosters don't allow it.

I stand corrected; I should have written instead: "git's design makes this operation so expensive that git disables it by default, which means you can't use it with github and gitlab, and probably the vast majority of git servers in the world, making it unusable in practice".

> you could ask github whether they might change their config.

Yeah, right.


Regarding your fetch issue: you can just specify the SHA/tag/branch head that you want to fetch.


Considering how many people are incredibly productive with Git, calling it junk might be unfair.


People are productive in C too.

C (like git) is very good at what it was intended for, but everyone and their mother using git on the command line for VCS now is as if everyone had just used C to do every website, game, app, etc. since 1980.

Git, like C, is a solid core but by now we should have better abstractions to help people be even more productive.


How many people are incredibly productive with git? Or most just kinda productive, as long as an expert isn't too far away from them?


You can become that expert too by reading this tutorial. https://www.sbf5.com/~cduan/technical/git/ It doesn't teach you commands at first because:

> you can only really use Git if you understand how Git works. Merely memorizing which commands you should run at what times will work in the short run, but it’s only a matter of time before you get stuck or, worse, break something.


You shouldn't _need_ to understand how git works internally to use it (and if you did, that would confirm that git isn't good). You should understand the abstract model of DVCS that git presents (just like you'd need to understand the abstract model of a car to drive it).


Understanding the abstract model is exactly what people mean when they say this. It just happens that Git's implementation is incredibly close to the abstract model. This was particularly true in the beginning, before regiments such as pack files were introduced. Those regiments complicate the implementation somewhat, but haven't changed the abstract model at all.


> Git's implementation is incredibly close to the abstract model

which is exactly why a lot of people claim that git is shit. The fact that it "won" makes those people more angry.


Oh well.

OTOH, I'm a kind of guy who just must take everything apart before using it and I like the brutal simplicity of git ;)

The naming of commands could be better, though.


I've never seen anyone complain about Git on the grounds that it _isn't_ full of leaky abstractions.


err, s/regiments/refinements/g. Thank you, autocomplete.


> git isn't good

who cares? git won, you need to use it if you are in this field. Yes, I liked bzr much better, I even liked darcs better, but what can you do? git won, good or not. That git sucks is indisputable; nonetheless we need to learn it, and this tutorial is what made it possible for me to have some peace with git.

Oh, and neutering git reset --hard, because it's incredibly dumb for a version control system to just throw away shit. Instead, a backup commit is made first (and some other minor goodies): https://gist.github.com/chx/3a694c2a077451e3d446f85546bb9278


> git won,

I really wish people would stop talking about computer tech as "winning" and "losing". I mean, it's slightly better than the "X is the new Y Killer from X Corp" bullshit we used to get, but it's still ridiculous.

Mercurial, SVN, Darcs etc are all valid tools to use and all are maintained.

> you need to use it if you are in this field.

Wow, cargo culting much?

You should be familiar enough to use it when required, sure.

You don't need to use it if you're starting a new project. My client projects default to Mercurial, and I'll give them help getting up and running with hg if they aren't familiar already.

If you have developers who want to collaborate on your work, who are able to get git to do what they want, but who object to using something like Mercurial, you need to question their motives.

They're either not smart enough to actually use git, and instead just memorise commands without any clue what they're doing, OR they are objecting because we all know cool kids use git and they are a cool kid.


1. Git's not "winning" at Facebook: https://code.facebook.com/posts/218678814984400/scaling-merc...

2. Mozilla has also standardized on Mercurial

3. Mercurial’s changeset evolution is awesome: https://www.mercurial-scm.org/doc/evolution/


The implication of which is that git is a leaky abstraction?



git is not complicated 95% of the time. You really only need to learn a handful of commands and read a blog post or two about branching models and you are good to go.

Is there some set of problems plaguing git users that I've just never run into?


Given that practically every non-beginner git tutorial starts off with "back up your repository", it's quite clear that git gets wedged a lot.

Somehow, I rarely see a Mercurial tutorial give that same advice unless you are doing something really experimental.


If you're writing a non-beginner git tutorial and you feel the need to include advice to "back up your repository" then you haven't done your job. It's incredibly hard to lose data with git - no matter what changes you make, the old commits are still around, because they're immutable. If you lose track of them, there's always "git reflog".

I don't want to pounce on you just because you prefer Mercurial to git, so this isn't really directed at you, but in general this line of argument is always a bit frustrating to me. I've never lost data with git, but I've lost data with Mercurial several times because of the terrible UI of "hg resolve", which throws away your changes without warning unless you remember to type "hg resolve -m". None of git's questionable UI decisions (and there are many) has caused me remotely as much trouble as "hg resolve".
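For instance (a sketch; the hash is a placeholder), even after a botched rebase or reset, the old tip is still listed in the reflog and can be made reachable again:

    $ git reflog                      # where HEAD has been, newest first
    a1b2c3d HEAD@{1}: rebase: ...     # the commit you "lost"
    $ git branch rescued a1b2c3d      # pin a branch to it so it's safe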


It's way too easy to lose work with git. The easy availability of git reset --hard is a menace. I am using this https://gist.github.com/chx/3a694c2a077451e3d446f85546bb9278 shell script to make it not lose data. And it's a disgrace I need to do this. Disk space is free (within measurement error, especially for 99.99% of codebases) so just put that thing somewhere and if necessary I can use date and pickaxe to dig it up.
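A stash-based variant of the same idea (a minimal sketch, not the linked gist, which makes a backup commit instead):

    safe_reset() {
        # snapshot the working tree, untracked files included;
        # recover later with: git stash pop
        git stash save --include-untracked "backup before reset --hard"
        git reset --hard "$@"
    }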


I do agree with that; "git reset --hard" should stash the changes in the working copy somewhere. I'm sure you'd agree, though, that backing up your repository is not going to protect you from "git reset --hard" unless what you're really doing is backing up the working copy, and if that's what you're doing, there's a built in feature to do that in git called "git commit". =)


Except that "git commit" isn't sufficient.

You have to use "git add" on a bunch of files that you have used "git add" on before.

As far as I can tell, every other revision control system tracks a file for "commit" once it has had even a single "add". This is the default case and what 99% of people want--"I told you to keep track of the file. Now keep track of it until I tell you otherwise."

git is the only revision control system I know of where I have to "git add" the same file over and over and over and over ... before doing "git commit".

But that is fairly standard git UI practice--"Optimize the 1% case and make the 99% case annoying."


git reflog has the previous refs; git reset --hard does not remove anything that has been committed.

It will, however, nuke changes that are not committed. Which is exactly what I use it for... But your script sounds like a decent solution if you want that to be undoable as well.


A revision control system losing data on a bad ctrl+r is, as I mentioned, a menace.


Having to navigate the reflog as a beginner can be overwhelming.


A backup can still be very useful if you perform something non-trivial like e.g. history rewriting. Sure you unlikely lose something but restoring a backup might often be the easier solution to restore the state you started from.


re: hg resolve,

That's like complaining that git threw away your changes because you forgot to commit them before pushing. Yes, hg resolve is a little bit confusing the first time you encounter it. But all you're losing is the conflict resolution. You didn't lose the two heads that you were trying to merge, nor did you lose the conflict markers.

If that's the only place that confused you in hg's interface, then it did a way better job than Git with its user interface.


It's not really similar to forgetting to commit before pushing, but as another poster pointed out it's fair to compare it to "git reset --hard". The difference in my mind is that "hg resolve -m" is part of the workflow you'd use commit the changes in the working copy. It would be like if git threw away your changes if you ran "git commit" with no arguments.


I wished I could back up my repository when I worked with AccuRev or SVN. Knowing with absolute certainty that the worst-case scenario was just reverting to a copy meant that I could try anything in git, even when I barely knew what I was doing. Freedom to experiment without consequences made learning git a much faster process than previous version control systems I'd worked with.


That's too bad for Mercurial. Backing up a repo is just good practice—if you care enough to do version tracking, you should care that you have a backup. I have backups of all of my repos, but I have never once gotten Git into an unusable state.


In Mercurial, you backup in case Mercurial does something it isn't supposed to do and you lose data. With Git, you backup because git is hard to reason about and there's a good chance that the command you're running will put you into a technically-valid state that you don't understand or know how to get out of and you don't want to spend hours Googling git jargon to figure out how to get out of it. This might be detached-head-state for beginners, or an odd rebase that somehow lost your data and your company's git experts can't figure out where it went. Obviously the probability of such a state is inversely proportional to your understanding of git.



Yeah, it was hyperbole. Git does the job.


I agree that the CLI of Mercurial is much more thought out than that of git, but calling git crap is unreasonable.


Fair point. I was exaggerating.


Git did have one big advantage over Hg at the time, which was speed; a relevant criterion for Linux.

I do agree ending up with git was unfortunate. After trying all the popular choices, Monotone was what I would have picked. Fast, secure, portable and simple.

But Linus didn't like it because it supported cherry-picking, which apparently is an anti-feature for a VCS :)


This is bullshit. git had to support cherry-picking from day one and monotone was extremely slow. Some background: https://lwn.net/Articles/574075/


Your link has nothing to do with your argument.

Monotone was slow back when git didn't exist, but later versions greatly improved that.

Also, it's clear Linus didn't like cherry-picking, but other people convinced him to add it to git.


Why should git be junk? Sure its commands may not be the most intuitive but apart from that it's fast, flexible and works well. Git has won, get over it.


The time I spend waiting for my dvcs is trivial compared to the time I spend interacting with it. I use git, and it gets the job done, but I'll still lament that fate chose git over any of the superior alternatives.


What compelling reason does bzr offer average teams to use it over git?


Code hosting services are never forever. Sourceforge aged and fell from grace. Google Code shut down. Github's time will pass as well.


Sourceforge fell from grace due to ads. Google Code just went the way of many of Google's side projects. (Also, I don't recall it being that heavily used. Just my memory?)

I'll confess that I feel that github will eventually wane. But, I can't say why I think that right off. When they run out of money? Start failing at the hosting portion of what they do in favor of other things?


> Sourceforge fell from grace due to ads.

Not just ads. Bundling malware into installers of popular projects.


They make a crazy amount of money from paid accounts and the hosted option. They won't run out anytime soon.


Google Code was the best when you wanted public Subversion with issues and a wiki. IMO it was a thing that started dragging on SourceForge, which then flailed. Once git started its uptick, GitHub became the leader. Classic S-curve adoption of the new shiny.

Book: Innovators Dilemma


Maybe. But that was a common thing to say about search engines before the mid-to-late 00s, about Facebook and social networks just a few years ago (and to this day), and about messaging perhaps to this day as well, though that has pretty much stabilized too.

I'm not saying I'd bet money on Github being a front runner in say 5 years, but it isn't that unlikely for them to still be the site in 5 or 10 years.


I think github will have learnt from sourceforge's mistakes though.


While that's better than not learning from the mistakes of the past, how can you possibly believe that there are no new mistakes to be made, or opportunities to miss out on?

GitHub's major selling point is mindshare/popularity, and git by default makes moving to another git hosting service relatively trivial. Yes, there are issues to transfer, but it's nowhere near as difficult as in the days of SVN, where you'd need to have shell access to dump the repo, or hope it was a modern enough svn version to support remote dumps (and that they didn't time out).


So will Gitlab.


I can't find a link to their GitLab instance/repositories. Where is it?



I couldn't find it either. Neither on their project pages, nor in that specific blog article, nor even via the "Search" option (where I searched for inkscape) on the gitlab.com site.

Thanks TranquilMarmot for posting the link to the repo.


Even more ridiculously, the page named "Inkscape's Repositories" (https://inkscape.org/en/develop/inkscape-git/) does not contain a link to their repository.


Self-hosted GitLab, or gitlab.com? Would a link in the article to the repo be too hard?



I really hope other FOSS projects take the same initiative


I am very curious how many devs will stop and how many will start contributing because of this move.


Now just please make use of https://hosted.weblate.org/ for translations


[flagged]


We've banned this account for trolling. Would you please not create accounts to break the site guidelines here?

We detached this comment from https://news.ycombinator.com/item?id=14534872 and marked it off-topic.


:)


I really want to learn to use inkscape well, but just can't grok the interface. It's a sad symptom shared by many open-source projects.

They seem to want to differentiate themselves (e.g. as "not Photoshop" in GIMP's case) but seem to equate that with ignoring good UI/UX design.


This is what I used to think, but reflecting on my usage of these tools lately has shaken my belief. Here's my theory, illustrated on a personal example:

I've been using Photoshop for a long time, and I've learned a lot of its shortcuts and intricacies. Basically, when I want to accomplish something in Photoshop, I already have an idea on how I'll go about it, using the functionality that's available. But GIMP, on the other hand, never really clicked for me. I find it very unintuitive and limiting, and it's a huge pain to have to do something in GIMP when Photoshop's not readily available. I've convinced myself that this is because GIMP has a much inferior UX and is orders of magnitude more limiting than PS (at least the subset of their features I use in my day-to-day usage).

On the other hand, since my light vector editing needs have been satisfied by Photoshop for a long time, I haven't really learned Illustrator. Recently, for various reasons, I've had to do some heavier-than-usual vector editing stuff, but still nothing requiring more than simple Beziers, fills and strokes, so I've been doing it in Inkscape since it's just been handy. After some time, I decided to try and use Illustrator, figuring it'd be like a whole new world. And then, surprisingly, I realized I don't really like it. The interface was illogical and not in line with my mental model at all. I struggled to complete basic tasks, and finally gave up and did the job in Inkscape. Basically, it was very reminiscent of the Photoshop―GIMP situation.

So my conclusion is that the tools and their UX are very powerful in giving me a mental model of a task, and significantly more so than I would have imagined. So it might not be 100% true to say that the UX in these tools is inferior. It's just so different from what we're used to that we have a very, very hard time separating the "different" from "worse" in our heads.


GIMP also has some weird omissions. You can't lock a layer so that it won't move, for instance. This is apparently coming in the unstable branch, but it boggles my mind that this wasn't a feature from day one. There is also 'anchor' but that doesn't seem to mean what anchor means in just about any other CAD/Graphics application.

The expected behaviour in Photoshop is that if you drag-move, you move only the selected layer. In GIMP, it seems that if you drag-move, you drag the topmost layer that has a painted pixel under your cursor. There are probably situations where this saves time, but more often I try to drag some text around and end up moving a background layer by mistake.

I often find myself using Inkscape to save time, it's intuitive enough and it works well.


"Drag layer under pointer" and "drag selected layer" are two options of the move tool; you'll find them in the tool options pane.

All in all, I've always found Gimp more intuitive and easy to use than Photoshop, probably because I learned it first!


I've been using Gimp forever and I have the opposite experience: to me it's much more intuitive and powerful than Photoshop. I have used both in the past, but I always come back to Gimp because for me it's so much better.


Reading between the lines, I suspect you're trying to say that Inkscape is gratuitously different from Adobe Illustrator? Well, its interface is also suspiciously similar to (older versions of) Corel DRAW, so maybe it's not so much deliberate differentiation as it is just borrowing from a different tradition than the one you're used to.


When I last used both regularly (2009), Inkscape's editing tools were pretty far ahead of Adobe Illustrator. I found it much easier to work with Bézier curves, automatic tracing, complex stroke and fill styles, and especially gradients.

It does take a few days for things to sink into your noggin after so long in a different tool, but I certainly think Inkscape's quality is very high. It's worth learning, unlike (imo) GIMP.


Maybe you are just used to certain UIs and behaviors? OSS doesn't want to 'be different'; it just doesn't adhere to industry practices and doesn't try to imitate the competition, especially in niches like video/photo/audio editing, which is IMHO a good thing.

I don't think the Inkscape UI is really that hard to understand; it's just different. Granted, I haven't used Photoshop or Illustrator since version 6.0, but I had no trouble getting into GIMP or Inkscape within a couple of days of use. I didn't find it hard to grok the UI and the intended behavior; they became pretty usable very quickly (bugs aside...).

As a developer, I can easily understand what the intended behavior in these programs is; everything is very logical... but maybe that's exactly the problem many users have ;)

Edit: and I can see how the bugs can confuse users to no end. That might even be the greatest disadvantage; Inkscape behaved quite buggily the last time I used it, half a year ago :(


I have the opposite issue: I've been using Inkscape for years for vector editing, and I have a really hard time getting used to Illustrator. Inkscape's keyboard commands in particular are much more practical than Illustrator's for my workflow.


Open source UIs are by and large designed by committee. Even when a project is governed by a "benevolent dictator", it is hard to turn away contributions from the wider community. Even if a UI starts out simple and elegant, it will eventually suffer death by a thousand cuts.


I never liked Illustrator, but Inkscape is straightforward to me. Then again, I am a long-time user of Corel DRAW, which is quite similar, so Inkscape feels like home; I guess your background matters. It's simple things like point selection that make all the difference. Inkscape is open source with a good UI, while GIMP feels like they are trying to cram as many tools as possible into the same application with no regard for user experience.


I use Inkscape and it seems fine and logical.


I'm thinking right now: maybe it's too logical.

It's easy to understand the behavior and UI from a programmer/developer's perspective, but the main portion of the target group in this sector thinks completely differently.


My demographic: I use Inkscape about once a year, to make a single small thing that I know would probably be best as a vector graphic, for sizing options.

For a thing that should probably take about an hour, it takes me half a day to relearn the handful of things I need to do.

Which, I think, is pretty good, considering I'm not at all a graphics person and usually have to search around just to discover the terms I need before I can even search for how to do what I want.

My results are amateurish, but good enough, and probably wouldn't exist without Inkscape. I like it.


Yes, making things illogical always helps to improve a product.


I agree; the list of well-designed and usable OSS software is very short, IMO. I suspect that most OSS projects don't have a skilled UI/UX designer, so they make do with what they have. Not to mention that GTK+ and Qt do not look very nice in general.

Inkscape's interface isn't too bad: not good, but I rarely have significant issues with it. GIMP is an absolute nightmare of UI/UX design; I can't understand how someone hasn't fixed that by now.


Qt, by default, mimics the look and feel of the host OS. On Windows and Mac, Qt draws its widgets using the native theming, so it's a little unfair to say it doesn't look nice. A well laid-out Qt program should be indistinguishable from a native application. I'm sure people have very strong opinions on whether native UIs are nice or not!

The issue, as you point out, is lack of designer. Both Qt and GTK can be extensively customised, but you need to know what you're doing (in an artistic sense).
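
For a concrete sketch (a toy example I made up for illustration, not code from any real app): opting out of the native look in Qt is technically a one-liner; making the override actually look good is the part that needs a designer.

    #include <QApplication>
    #include <QPushButton>

    // Hypothetical toy app, purely for illustration.
    int main(int argc, char *argv[]) {
        QApplication app(argc, argv);

        // Without a stylesheet, Qt renders this button in the host
        // platform's native style (Windows, macOS, ...).
        QPushButton button("Native by default");

        // One app-wide rule replaces the native button look.
        // Technically trivial; aesthetically, you're on your own.
        app.setStyleSheet("QPushButton { padding: 8px; border-radius: 4px; }");

        button.show();
        return app.exec();
    }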


> GIMP is an absolute nightmare of UI/UX design; I can't understand how someone hasn't fixed that by now.

Of course you can understand that. Try writing actual code to fix any pet peeve with GIMP's UI/UX (or any app's UI/UX). Then we'll sit down and talk again about the role of volunteers in a free software project :)


I wouldn't say it's bad. It's confusing at first, like most interfaces with a ton of controls, but you gradually get used to it and know where things are.

The bad thing about the UI at the moment, for me, is the lack of HiDPI support (as far as I can tell). And back when I was using OSX, the interface seemed a bit janky since it used XQuartz, and it generally wasn't very native, both in terms of UI rendering and its default keyboard shortcuts.


There is support for HiDPI, to an extent (see e.g. [1]). What there doesn't seem to be support for is mixing a HiDPI and a low-DPI screen.

[1] http://www.inkscapeforum.com/viewtopic.php?t=18684
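
For what it's worth, if your Inkscape build uses GTK 3 (an assumption on my part; the stock 0.92 packages are still GTK 2), GTK's standard scaling variables are worth a try:

    # assumes a GTK 3 build of Inkscape
    GDK_SCALE=2 GDK_DPI_SCALE=0.5 inkscape

GDK_SCALE doubles the widget geometry and GDK_DPI_SCALE halves the font DPI so text isn't scaled twice; both apply globally, which is exactly the mixed-DPI limitation above.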


This is coming in the next GTK version.


> They seem to want to differentiate themselves (e.g. as "not Photoshop" in GIMP's case), but they appear to equate that with ignoring good UI/UX design.

Nope, we bloody well know GIMP's UI sucks in many ways. But the UI doesn't fix itself, as you might suspect. For every few thousand people who complain about the UI, we maybe get just one occasional contributor.


I think it's more that they ignore the UX you are used to from other tools; that's not necessarily bad, just different.


Usually I'm the first to lament the sad state of UX in many (most?) OSS programs, but I've always found Inkscape to be very reasonable in that regard. However, I don't have experience with e.g. Illustrator, so there was no unlearning of muscle memory required.


I use it -- I agree that it's clunky, and there's probably a YMMV factor depending on your window manager. But once you learn where everything is, you can get a surprising amount done with it.

Adobe Illustrator is, for a novice, surprisingly not that much better.



