Miguel loves C and that's perfectly fine - it's a great language, easy to read and write, powerful, but does suffer from some of the same fundamental problems as C++... Both languages are tightly bound to today's base machine architecture - they are called "systems programming languages" for this very reason.
I find it odd that we have this continual debate about older (and widely used) systems programming languages (the ones that are actually used and form the basis of modern computing....), but neglect the arguably more interesting - and obvious - problem: the lack of a viable, consumer-class, non-von Neumann machine architecture. What would it mean for the advent and design of new high level programming languages that are safe, performant, productive, portable, and that directly map to this new machine model?
The future will always be built on the past, but this doesn't mean we have to keep re-inventing the same problems to solve... Here's to a future of new machine models and the languages used to program them efficiently, effectively, safely, and easily.
Agg is more like a meta-library for constructing drawing APIs. Using it directly is quite painful, which is why it even ships with Agg2D, a simplified API.
Another easy to use C++ drawing API is Qt's QPainter.
It all comes down to how you design the API. C++ doesn't force you to come up with Agg if you really want something like QPainter. I expect the C++ translation of Cairo to be equally simple. We're talking about translating cairo_foo(ctx, params); to ctx.foo(params); and adding automatic memory management.
API design is hard. If you have it in you to design a good C API, you could probably do the same in C++.
I'd rather they provided some standard template interfaces to all of the common 2D graphics algorithms (simple line clipping, tessellation, polygon clipping, bezier subdivision, etc) and not couple it to a classical scanline rasterizer CPU graphics library. But I don't think that even this belongs in the language standard.
There are a bunch of other open-source libraries they'd need to "standardize" if they want to draw text...
Because it means you can start making assumptions about available resources on the platforms you target. If all you need is the included 2D API in the standard (which would be terse at best), then as soon as you can throw a compiler flag for, say, C++14, you can assume those primitives are there. You don't have to fight to either bundle a 3rd party library with your software, or pray to god the included version on the target system still has a version with the same ABI, etc. You can just use it outright without hesitation, because standard compliance means you have it at your disposal.
A thousand ad hoc separately licensed libraries with wildly varying build chains, supported compilers and operating systems, with similarly varying code quality, naming conventions, build configuration options, conflicting runtime selection, error handling styles, documentation quality, etc. may be the bread and butter of your standard C++ programmer -- I know it is of mine -- but it hardly seems like the ideal state of things, especially having also used languages with much, much larger standard libraries.
What practical problems have you, personally, run into as a result of standard library "bloat"?
Just yesterday, as a practical problem I've personally had with the "leanness" of non-standard libraries, I spent a good half of the day trying to wrangle a set of third-party libraries into submission. 4 library configs x 3 platforms, each varying just enough to eat up that half-day setting up library paths, library names, and library dependencies, tracking down #defines for "optional" functionality that was forgotten (or off because it doesn't work yet -- I haven't figured out which) on one platform, all in yet another build system (adapting their projects to ours would take too long), and harmonizing runtime library settings to match instead of crash...
A whole lot of busywork that simply doesn't happen for standard libraries, because "batteries included" means it's already been taken care of, without cutting off the rabbit hole of extra libraries when you need them.
The problem of package and configuration management is an OS problem and should not be the responsibility of compiler developers, who would ideally be working on an OS-agnostic language compiler. For every non-language feature you push into the standard, you increase the amount of coupling the compiler has to an OS. (It's already the case with existing APIs, even down to basic IO - it actually forces the operating system to be designed to match the requirements of the language, and thus limits the scope for experimentation and other solutions.)
C++ has significant usage outside of our mainstream operating systems. It's used in embedded systems, and they have the exact opposite problems you describe - the trouble of taking things out of the language, rather than putting them in.
The way we maintain packages should be agnostic of any particular OS or language - we need a way to eliminate the "hidden knowledge" that goes into building and configuring the packages in the first place, so that any particular setup can be automatically reproduced. It's not a solved problem yet, but Nix is making great steps towards that capability, which don't require adding bloat to existing solutions. (We usually call it the Open/Closed principle)
The C++ Committee are attempting to solve the problem in a different way - by saying "we'll handle everything for you". What about adding libraries to ISO C++ for scraping websites, or accessing RDBMS, parsing text, or one of the dozens of common programming problems that exist? Why graphics in particular, and where do we stop attempting to push things into the standard? Screw it, why don't we just make ISO C++ a standard operating system?
But if you insist on having a GUI library for C++, why not use Qt? Linking to Qt isn't really that complicated. If you're beginning, you can just use qmake and you're done.
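For what it's worth, the qmake route really is short; a minimal project file (file and target names here are hypothetical) is just:

```
# hello.pro -- qmake generates the makefile, include paths, and link flags
QT      += widgets     # Qt 5 module name; on Qt 4 the widgets live in QtGui
TARGET   = hello
SOURCES += main.cpp
```

Then `qmake && make` builds it, with no manual library-path wrangling.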
With Visual Studio or Eclipse I am still downloading the libs for my architecture, sometimes compiling them with an entire set of potential issues and dependencies, configuring paths, etc...
They have NuGet for .NET in Visual Studio, you'd think this could be done for native software.
It is actually far easier to do this on Linux with package managers, which is sad for Windows imho.
The "answer" seems to be to download as source and plug vcxproj dependencies right into your solution. Which, coupled with Windows's problems with static libraries (I use nothing else on OS X - static library right out of Homebrew), is hilarious.
It's not even that great a library from what I can tell.
If you're coding to a specific IDE, it can be made pretty simple.
> They have NuGet for .NET in Visual Studio, you'd think this could be done for native software.
You're quite correct: it can be done for native software. Apparently, it even has been.
I won't hold my breath for Xcode or KDE to support the same, though.
> It is actually far easier to do this on Linux with package managers, which is sad for Windows imho.
Equally sad, I've found this doesn't apply much beyond "Hello World".
No, but keeping these concerns "downstream" causes these problems to be repeatedly re-solved. This is wasteful in its massive redundancy. How many times have I seen the neophyte make a mistake when trying to link SDL? I've beyond lost count. And that's one of the super easy cases.
> For every non-language feature you push into the standard, you increase the amount of coupling the compiler has to an OS.
And decrease the amount of coupling the end programs have to the OS, the number of which hopefully significantly outnumber the compiler. All this assuming we're talking about functionality that must interoperate with the OS of course -- things like new container types have no such concerns.
> The problem of package and configuration management is an OS problem
> The way we maintain packages should be agnostic of any particular OS or language
These two statements directly conflict with each other, by my parsing. I agree in theory with the latter, but in practice things like #include path configuration are fundamentally language-specific. In some distant future where package management is a solved problem, lean standard libraries might make more sense. However, it'd take some effort to convince me that package management is even a solvable problem in the generalized case.
OS focused package managers like RPMs and APT have as a rule caused me far more work than they've solved. They don't have the versions I need, they don't configure things for the platforms I need, they don't run on the OSes I need. Even for something as fundamental as, say, integrating the boost libraries, I've found that the moment I want to go beyond "hello world" it's simpler to completely ignore APT entirely and build the bloody thing yourself.
Non-OS package managers like pip, RubyGems, CPAN modules, NuGet packages, etc. I've had more luck with. I've heard good arguments against standardization of libraries in the Python world -- why freeze the API and discourage competition when package management is easy and works?
I've yet to see anything resembling progress of a similar sort in the world of C++. You say "Nix is making great steps towards that capability, which don't require adding bloat to existing solutions" -- but I've yet to see it. I suspect I never will: I suspect it'd take over a decade to become a proper force, and more interest in solving the problem than I've seen. I suspect those who would solve such problems are instead writing entire new languages and ecosystems that will one day relegate C++ to the world of legacy programming, leaving behind so much language cruft.
> The C++ Committee are attempting to solve the problem in a different way - by saying "we'll handle everything for you". What about adding libraries to ISO C++ for scraping websites, or accessing RDBMS, parsing text, or one of the dozens of common programming problems that exist? Why graphics in particular, and where do we stop attempting to push things into the standard? Screw it, why don't we just make ISO C++ a standard operating system?
What's your stance on the .NET Framework? It sounds like the poster child that could have inspired this paragraph in frustration, or perhaps you don't consider that a standard library. It has libraries for scraping websites, accessing RDBMSs, parsing text, and many of the dozens of common programming problems that exist, including graphics. You might call it emacs syndrome. I say it's a damn sight better than where C++ currently is, even if it has given me grief trying to port Mono.
> It's used in embedded systems, and they have the exact opposite problems you describe - the trouble of taking things out of the language, rather than putting them in.
This, at least, I'm willing to not label a corner case. But even C++'s existing, pitifully tiny standard library carves out some parts that don't need to be made available on embedded targets, and I see no reason such a graphics library wouldn't also be carved out, or be particularly difficult to "take out".
People who have C++ already have their own toolkits, or use ones provided by the platform... I think it's just too late for this.
It's different with a network stack... that's badly needed! And you use it everywhere (backend, frontend).
I think people just won't use it... because of the points I've made earlier.
A better proposition for graphics (because it's forward-looking) would be to focus on what LLVM is doing, for instance... like the ability to compile directly to GPU instructions.
In the direction of OpenCL, CUDA... heterogeneous computing.
Something that can do graphics AND vectorized computation...
...then it would just be a matter of creating 2D graphics kernels and wrapping an API on top of them. C++ has a unique position here!
They need to think looking forward... not backwards.
I'll grant a graphics library seems a tad silly at this point, I'd much rather see more containers, or working modules. I'm more concerned by the aversion to standard library "bloat" than a specific aversion to a 2D graphics library.
Well, you don't want a graphics library in a kernel, for example. In Rust most of the feature requests for the standard library are to remove things these days, for this reason (see M:N scheduling, GC, etc.) :)
C's standard library doesn't seem to encumber Linux much: they just eschew it or reimplement the parts they need.
C++'s language semantics are a much bigger problem; last I heard, kernel devs generally didn't like the idea of implementing the supporting cruft behind exception handling, for example.
All of these were included to counter one of the weakest points of C and C++: the language may be portable, but even simple programs aren't. That's one area where Java was a huge improvement over C.
So, the argument should not be "If the proposed graphics API does not require changes to the language it really doesn't need to be there.", but one should ask "how far should we go in adding standard libraries?". The answer to that question should guide the answer to the "Do we want a 2D graphics API in the standard?" question.
I find 2D graphics to be going a bit far for that, but I can see why people would see it otherwise.
And to those stating that they don't see this making it into embedded use: C already has optional features (http://en.wikipedia.org/wiki/C11_(C_standard_revision)#Optio...). Yes, that is less than ideal from the portability viewpoint, but that's the real world for you.
What is C++ trying to be, Java?
In what way was Cairo the "clear winner"?
Life's way too short to not have things destruct on scope exit. I'm pretty confident that my own projects would be a lot more verbose and much, much more error prone (like, failure-to-get-off-the-ground error-prone) if I had to deal with C's limitations.
I was shocked to read this when I saw the blog post from Miguel and wanted to find out why development of Antigrain had stalled.
Edit: But I respect the man! :)
See the Dart VM for a good example... it's everything wrapped in a C API... (and also, if you want an example of beautiful C++ code for a complex application, hack on the VM). In every language you can write pretty code or nasty code... I think the only exception here would be Haskell (as proof of the impossibility of writing code that doesn't look ugly).