Miguel loves C and that's perfectly fine - it's a great language: easy to read and write, powerful - but it does suffer from some of the same fundamental problems as C++... Both languages are tightly bound to today's base machine architecture - they are called "systems programming languages" for this very reason.
I find it odd that we have this continual debate about older (and widely used) systems programming languages (the ones that are actually used and form the basis of modern computing...), but neglect the arguably more interesting - and obvious - problem: the lack of a viable, consumer-class, non-von Neumann machine architecture. What would it mean for the advent and design of new high-level programming languages that are safe, performant, productive, portable, and that directly map to this new machine model?
The future will always be built on the past, but this doesn't mean we have to keep re-inventing the same problems to solve... Here's to a future of new machine models and the languages used to program them efficiently, effectively, safely, and easily.
Agg is more like a meta-library for constructing drawing APIs. Using it directly is quite painful. It even comes with Agg2D, which is a simplified API.
Another easy-to-use C++ drawing API is Qt's QPainter.
It all comes down to how you design the API. C++ doesn't force you to come up with Agg if you really want something like QPainter. I expect the C++ translation of Cairo to be equally simple. We're talking about translating cairo_foo(ctx, params); to ctx.foo(params); and adding automatic memory management.
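A rough sketch of what that translation could look like - the Context class and member names here are made up for illustration, just mirroring the underlying cairo calls:

    // Hypothetical RAII wrapper over cairo's C API, showing the
    // cairo_foo(ctx, params) -> ctx.foo(params) translation plus
    // automatic cleanup in the destructor.
    #include <cairo.h>

    class Context {
    public:
        explicit Context(cairo_surface_t* surface) : ctx_(cairo_create(surface)) {}
        ~Context() { cairo_destroy(ctx_); }  // no manual destroy at call sites

        Context(const Context&) = delete;
        Context& operator=(const Context&) = delete;

        void move_to(double x, double y) { cairo_move_to(ctx_, x, y); }
        void line_to(double x, double y) { cairo_line_to(ctx_, x, y); }
        void stroke()                    { cairo_stroke(ctx_); }

    private:
        cairo_t* ctx_;
    };

The constructor/destructor pair is all the "automatic memory management" amounts to; everything else is a one-line forwarding call.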
API design is hard. If you have it in you to design a good C API, you could probably do the same in C++.
What is this? Python? Why "standardize" Cairo? I can already use Cairo if I want to.
I'd rather they provided some standard template interfaces to all of the common 2D graphics algorithms (simple line clipping, tessellation, polygon clipping, Bézier subdivision, etc.) and not couple it to a classical scanline-rasterizer CPU graphics library. But I don't think that even this belongs in the language standard.
There are a bunch of other open-source libraries they'd need to "standardize" if they want to draw text...
Because it means you can start making assumptions about available resources on platforms you target. If all you need is the included 2D API in the standard (which would be terse at best), then as soon as you can throw a compiler flag for, say, C++14, you can assume those primitives are there. You don't have to fight to bundle a third-party library with your software, or pray to god the included version on the target system still has a version with the same ABI, etc. You can just use it outright without hesitation, because standard compliance means you have it at your disposal.
While I agree that a 2D API doesn't sound like something that needs to be in the standard library, it sounds to me like they're just basing the API on Cairo's and not necessarily forcing everyone to use Cairo bindings. Individual standard library vendors would be free to implement that API any way they like, just like video card vendors have their own unique, yet consistent OpenGL implementations.
Do we really need a vector graphics library in standard C++? Why not just create a new ISO standard using Cairo as the base, and stop making the language even more bloated than it currently is? If the proposed graphics API does not require changes to the language, it really doesn't need to be there.
Quality standard libraries are wonderful in that you can rely on their presence.
A thousand ad hoc separately licensed libraries with wildly varying build chains, supported compilers and operating systems, with similarly varying code quality, naming conventions, build configuration options, conflicting runtime selection, error handling styles, documentation quality, etc. may be the bread and butter of your standard C++ programmer -- I know it is of mine -- but it hardly seems like the ideal state of things, especially having also used languages with much, much larger standard libraries.
What practical problems have you, personally, run into as a result of standard library "bloat"?
Just yesterday, as a practical problem I've personally had with the "leanness" of non-standard libraries, I spent a good half of the day trying to wrangle a set of third-party libraries into submission: 4 library configs x 3 platforms, varying just enough to eat that half a day setting up library paths, library names, and library dependencies, tracking down #defines for "optional" functionality that was forgotten (or off because it doesn't work yet -- I haven't figured out which) on one platform, all in yet another build system (adapting their projects to ours would take too long), and harmonizing runtime library settings to match instead of crash...
A whole lot of busywork that simply doesn't happen for standard libraries, because "batteries included" means it's already been taken care of, without cutting off the rabbit hole of extra libraries when you need them.
I run into many of the same issues as you, but pushing some library to become standard does not solve the problem at all - it just forces the problem on somebody else. Somebody still needs to configure and maintain the library for a particular distribution, but rather than it being a distribution's package maintainer (or yourself), the problem is pushed onto the compiler/standard library devs, and the distribution's maintainers will still need to configure the standard library for their own distributions anyway. This is the "bloat" - it's increasing the workload for those developers. (I don't personally work on a C++ compiler, but I've worked on tooling for other languages, and I've maintained packages.)
The problem of package and configuration management is an OS problem and should not be the responsibility of compiler developers, who would ideally be working on an OS-agnostic language compiler. For every non-language feature you push into the standard, you increase the amount of coupling the compiler has to an OS. (It's already the case with existing APIs, even down to basic IO - it actually forces the operating system to be designed to match the requirements of the language, and thus limits the scope for experimentation and other solutions.)
C++ has significant usage outside of our mainstream operating systems. It's used in embedded systems, and they have the exact opposite problems you describe - the trouble of taking things out of the language, rather than putting them in.
The way we maintain packages should be agnostic of any particular OS or language - we need a way to eliminate the "hidden knowledge" that goes into building and configuring the packages in the first place, so that any particular setup can be automatically reproduced. It's not a solved problem yet, but Nix is making great steps towards that capability, which don't require adding bloat to existing solutions. (We usually call it the Open/Closed principle)
The C++ Committee are attempting to solve the problem in a different way - by saying "we'll handle everything for you". What about adding libraries to ISO C++ for scraping websites, or accessing RDBMS, parsing text, or one of the dozens of common programming problems that exist? Why graphics in particular, and where do we stop attempting to push things into the standard? Screw it, why don't we just make ISO C++ a standard operating system?
I think Herb has said the answer to "why graphics in particular" is: beginners. If someone wants to start out programming in C++, it can be disheartening to learn that just drawing some simple shapes requires using external libraries, with all the complexity that brings.
Maybe starting out programming in C++ isn't a good idea, then. I don't think it is, for this and many, many, many other reasons besides. Shoving a graphics library into C++ doesn't make it substantially easier for beginners -- the biggest problem with C++ is not that it doesn't have enough features.
But if you insist on having a GUI library for C++, why not use Qt? Linking to Qt isn't really that complicated. If you're beginning, you can just use qmake and you're done.
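For what it's worth, the QPainter side of that is about as small as drawing code gets - a minimal sketch drawing into an off-screen image (with qmake, "QT += gui" in the .pro file takes care of the linking):

    // Minimal QPainter example: draw a circle into an off-screen
    // QImage and save it, no widgets or event loop required.
    #include <QImage>
    #include <QPainter>

    int main() {
        QImage image(200, 200, QImage::Format_ARGB32);
        image.fill(Qt::white);

        QPainter painter(&image);
        painter.setPen(Qt::blue);
        painter.setBrush(Qt::yellow);
        painter.drawEllipse(50, 50, 100, 100);  // x, y, width, height
        painter.end();                          // finish painting before saving

        return image.save("circle.png") ? 0 : 1;
    }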
Maybe some effort should be put into making it much easier to use external libraries in a C++ IDE then?
With Visual Studio or Eclipse I am still downloading the libs for my architecture, sometimes compiling them with an entire set of potential issues and dependencies, configuring paths, etc...
They have NuGet for .NET in Visual Studio, you'd think this could be done for native software.
It is actually far easier to do this on Linux with package managers, which is sad for Windows imho.
The main roadblock is the disgusting combinatoric explosion of compiler versions and debug/release, single-/multi-threaded runtime versions.
The "answer" seems to be to download as source and plug vcxproj dependencies right into your solution. Which, coupled with Windows's problems with static libraries (I use nothing else on OS X - static library right out of Homebrew), is hilarious.
Sometimes that's the answer. But just yesterday this set of library vcxproj s ended up being generated by CMake... with one supported platform each and absolute paths in them. Not worth even checking into the depot, nevermind including in your .sln, not that the first person to take a crack at it didn't try checking them in and adding them to the sln...
It's not even that great a library from what I can tell.
> I run into many of the same issues as you, but pushing some library to become standard does not solve the problem at all
No, but keeping these concerns "downstream" causes these problems to be repeatedly re-solved. This is wasteful in its massive redundancy. How many times have I seen a neophyte make a mistake when trying to link SDL? I've long since lost count. And that's one of the super easy cases.
> For every non-language feature you push into the standard, you increase the amount of coupling the compiler has to an OS.
And decrease the amount of coupling the end programs have to the OS, the number of which hopefully significantly outnumber the compiler. All this assuming we're talking about functionality that must interoperate with the OS of course -- things like new container types have no such concerns.
> The problem of package and configuration management is an OS problem
> The way we maintain packages should be agnostic of any particular OS or language
These two statements directly conflict with each other, by my parsing. I agree in theory with the latter, but in practice things like #include path configuration are fundamentally language-specific. In some distant land of the future where package management is a solved problem, lean standard libraries might make more sense. However, it'd take some doing to convince me package management is even a solvable problem in the generalized case.
OS-focused package managers like RPM and APT have, as a rule, caused me far more work than they've saved. They don't have the versions I need, they don't configure things for the platforms I need, they don't run on the OSes I need. Even for something as fundamental as, say, integrating the Boost libraries, I've found that the moment I want to go beyond "hello world" it's simpler to ignore APT entirely and build the bloody thing myself.
Non-OS package managers like pip packages, RubyGems, CPAN modules, NuGet packages, etc. I've had more luck with. I've heard good arguments against standardization of libraries in the Python world -- why freeze the API and discourage competition when package management is easy and works?
I've yet to see anything resembling progress of a similar sort in the world of C++. You say "Nix is making great steps towards that capability, which don't require adding bloat to existing solutions" -- but I've yet to see it. I suspect I never will: I suspect it'd take over a decade to become a proper force, and more interest in solving the problem than I've seen. I suspect those who would solve such problems are instead writing entire new languages and ecosystems that will one day relegate C++ to the world of legacy programming, leaving behind so much language cruft.
> The C++ Committee are attempting to solve the problem in a different way - by saying "we'll handle everything for you". What about adding libraries to ISO C++ for scraping websites, or accessing RDBMS, parsing text, or one of the dozens of common programming problems that exist? Why graphics in particular, and where do we stop attempting to push things into the standard? Screw it, why don't we just make ISO C++ a standard operating system?
What's your stance on the .NET Framework? It sounds like the poster child that could have inspired this paragraph in frustration, or perhaps you don't consider that a standard library. It has libraries for scraping websites, accessing RDBMSes, parsing text, and many of the dozens of common programming problems that exist, including graphics. You might call it emacs syndrome. I say it's a damn sight better than where C++'s currently at, even if it has given me grief trying to port Mono.
> It's used in embedded systems, and they have the exact opposite problems you describe - the trouble of taking things out of the language, rather than putting them in.
This, at least, I'm willing to not label a corner case. But even C++'s existing, pitifully tiny standard library carves out some parts that don't need to be made available on embedded (freestanding) targets, and I see no reason such a graphics library wouldn't also be carved out the same way, or be particularly difficult to "take out".
It would be ok if it had started from the beginning with this.. maybe people would use it.. and then when iOS and Android happened, it would have been adapted to work on them.. but now?
people doing C++ either already have their own toolkits, or use the ones provided by the platform.. I think it's just too late for this..
it's different with a network stack... that's badly needed! and you use it everywhere... (backend, frontend)
I think people just won't use it.. because of the points I've made earlier..
A better proposition for graphics (because it's futuristic) would be to focus on what LLVM is doing, for instance... like the ability to compile directly to GPU instructions..
In the direction of OpenCL, CUDA.. heterogeneous computing..
something that can do graphics AND vectorized computation
..then it would be just a matter of creating 2D graphics kernels and wrapping an API on top of them - C++ has a unique position here!
they need to think looking forward.. not backwards..
> It would be ok if it had started from the beginning with this.. maybe people would use it.. and then when iOS and Android happened, it would have been adapted to work on them.. but now?
I'll grant a graphics library seems a tad silly at this point; I'd much rather see more containers, or working modules. I'm more concerned by the aversion to standard library "bloat" than by a specific aversion to a 2D graphics library.
> What practical problems have you, personally, run into as a result of standard library "bloat"?
Well, you don't want a graphics library in a kernel, for example. In Rust most of the feature requests for the standard library are to remove things these days, for this reason (see M:N scheduling, GC, etc.) :)
Kernel hacking's a bit of a corner case, wouldn't you agree?
C's standard library doesn't seem to encumber Linux much: They just eschew it or implement it.
C++'s language semantics are a much bigger problem; last I heard, kernel devs generally didn't like the idea of implementing the supporting cruft behind exception handling, for example.
It's been possible to use a subset of C++ in kernel mode on Windows for longer than that. I have seen code that does it that predates Win8 by more than a decade. (Not that I consider it good or a good idea.)
None of the containers requires changes to the language, nor does the regex support, the RNG support, etc. Taking that to the extreme, you don't even need character I/O streams, as people could build them on raw byte streams, adding char-to-byte conversions and formatting.
All of these were included to counter one of the weakest points of C and C++: the language may be portable, but even simple programs aren't. That's one area where Java was a huge improvement over C.
So, the argument should not be "If the proposed graphics API does not require changes to the language it really doesn't need to be there.", but one should ask "how far should we go in adding standard libraries?". The answer to that question should guide the answer to the "Do we want a 2D graphics API in the standard?" question.
I find 2D graphics to be going a bit far for that, but I can see why people would see this otherwise.
And to those stating that they don't see it making it into embedded use: the C family already has optional features (see C11: http://en.wikipedia.org/wiki/C11_(C_standard_revision)#Optio...). Yes, that is less than ideal from the portability viewpoint, but that's the real world for you.
Apparently it's an often-requested feature. But the question you stated is exactly what the Committee is trying to find an answer to by forming the study group. Nothing's been decided yet; they're just trying to chart the possibilities.
yeah.. I don't think that adds much to the language.. as for 2D, we are better off plugging into the native 2D APIs of the systems we are targeting.. that's the true value of C and C++
I mean, C++ is the de facto standard for graphics drawing, so it kind of makes sense. It's kind of similar to how Go has an HTTP library as part of its standard library.
I don't know many people who'd call the JVM good C++, though. It's got a lot of baggage.
Life's way too short to not have things destruct on scope exit. I'm pretty confident that my own projects would be a lot more verbose and much, much more error prone (like, failure-to-get-off-the-ground error-prone) if I had to deal with C's limitations.
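To make the scope-exit point concrete, a toy sketch (the File class is made up for illustration):

    // Toy RAII wrapper: the destructor runs on every exit path,
    // so early returns and exceptions can't leak the handle.
    #include <cstdio>
    #include <stdexcept>

    class File {
    public:
        File(const char* path, const char* mode) : f_(std::fopen(path, mode)) {
            if (!f_) throw std::runtime_error("open failed");
        }
        ~File() { std::fclose(f_); }  // no cleanup needed at call sites
        File(const File&) = delete;
        File& operator=(const File&) = delete;
        std::FILE* get() const { return f_; }
    private:
        std::FILE* f_;
    };

    void write_report(bool abort_early) {
        File out("report.txt", "w");
        if (abort_early) return;           // file still gets closed here
        std::fputs("hello\n", out.get());
    }                                      // ...and here

In C the equivalent needs an fclose on every path out of the function, which is exactly the kind of verbosity and error-proneness I mean.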
really, in the case he is pointing out (choosing Cairo for Moonlight) it's just a matter of wrapping a C API in a C++ application.. so people can embed it more easily...
See the Dart VM for a good example.. it's everything wrapped in a C API.. (and also, if you want an example of beautiful C++ code for a complex application, hack on the VM).. in every language you can write pretty code or nasty code.. I think the only exception here would be Haskell (as proof of the impossibility of writing code that doesn't look ugly)