Related to the Coral accelerator: is it usable as a general accelerator outside AI workflows? When I looked into it, there was no general SDK for programming the Coral, only an "ingest ML model" compiler.
Not surprising. It always seemed likely to me that you get model bias if you train your models on model-generated data, like a feedback loop (second-order effects?). Similar to how applying a linear system over and over stretches the inputs in the direction of its largest eigenvector.
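To make the eigenvector analogy concrete, here's a toy sketch (the map and numbers are my own example, not from any real training setup): repeatedly applying the same linear map to a generic vector lines it up with the eigenvector of the largest eigenvalue.

```python
import math

# A linear map with eigenvalue 2 along [1, 0] and eigenvalue 0.5 along [0, 1].
def apply(v):
    return (2.0 * v[0], 0.5 * v[1])

v = (1.0, 1.0)  # generic starting direction
for _ in range(20):
    x, y = apply(v)
    n = math.hypot(x, y)
    v = (x / n, y / n)  # renormalize so only the direction matters

# The off-axis component shrinks by a factor of 0.25 per step, so after
# 20 iterations v is essentially the dominant eigenvector [1, 0].
print(v)
```

The same mechanism is why a training loop fed its own output can keep amplifying whatever direction of bias dominates.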
Now wait till the generated content is indistinguishable from human content (to humans) and it will be hard to figure out what's in your training set.
Not a direct answer, but last year I got a 13" eink second screen and it's been the best investment I've made for my eye health since getting glasses. The difference was noticeable from the start: at the end of a day without the eink screen my eyes feel tired, like having sand in them. After a day of using the eink screen my eyes feel a lot better; still relatively tired, but the kind of tired you get at the end of a day not spent in front of a computer.
I got an eink secondary monitor for my workstation. I used to get pretty bad eye strain from LCD screens that lasted for days. It got so bad I forgot how it feels to wake up without my eyes hurting. Now it's all gone.
The product offering of eink screens is virtually non-existent, the only usable one for interactive work that I found was the Dasung 13.3", which is the one I got.
One geometric model that actually handles numerical approximations is called exact geometric computation. It's pretty interesting and starts by explicitly tracking the arithmetic error of the operations you do. At its core it's a mix of interval arithmetic and multi-precision floating point. This is something you can do since the floating point implementations on most (all?) CPUs give you strong guarantees on the error bound of floating point arithmetic operations: under IEEE 754 the basic operations are correctly rounded, i.e. within half an ulp (unit in the last place) of the exact result. Now you have an interval for the result and can build your geometric constructs on interval arithmetic rather than classic, single-point scalar arithmetic.
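A minimal sketch of what this looks like (my own toy, nothing like CGAL's actual kernel): represent each value as an interval and widen the endpoints by one ulp after every operation, so the exact real result is guaranteed to stay inside the interval.

```python
import math

# Toy interval arithmetic with "outward rounding": since each float operation
# is within one ulp of the exact result, widening both endpoints by one ulp
# keeps the true value enclosed. math.ulp(x) returns ulp(abs(x)), so this
# works for negative endpoints too.

def widen(lo, hi):
    return lo - math.ulp(lo), hi + math.ulp(hi)

def iadd(a, b):
    return widen(a[0] + b[0], a[1] + b[1])

def imul(a, b):
    products = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return widen(min(products), max(products))

# The literal 0.1 rounds to the nearest float (within half an ulp), so
# widening once gives an interval enclosing the real number 0.1.
x = widen(0.1, 0.1)
y = iadd(imul(x, x), x)  # encloses 0.1 * 0.1 + 0.1 = 0.11 exactly
print(y[0] <= 0.11 <= y[1])  # the exact value lies inside the interval
```

The intervals stay extremely tight (a few ulps wide) as long as the computation is short; the interesting engineering starts when they aren't, as discussed below.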
Geometry works with constructs, not numbers. An intersection is a construct, collinearity is a construct, orientation / handedness is a construct, etc. If you define your fundamental constructs to be rigorous then the algorithms you build on top will also be rigorous. If we consider the article's inside-outside test example, this result would be computed based on the intersections (intervals) we've computed. Since we know for sure the correct result is within our interval of error, inside-testing boils down to comparing intervals, and there are three possible situations: 1. [a, b] < [c, d]; 2. [a, b] > [c, d]; 3. [a, b] overlaps with [c, d]. In cases 1 and 2 we have a robust conclusion; in case 3 we recompute the whole predicate with higher precision until it becomes decidable.
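The three-way comparison plus fallback can be sketched like this (names and the exact-rational fallback are my own illustration; a real kernel would escalate precision instead of jumping straight to rationals, but the role is the same):

```python
from fractions import Fraction

def compare_intervals(a, b):
    """Return -1 or 1 when the intervals are disjoint, None when they overlap."""
    if a[1] < b[0]:
        return -1   # case 1: [a0, a1] < [b0, b1], robust conclusion
    if a[0] > b[1]:
        return 1    # case 2: [a0, a1] > [b0, b1], robust conclusion
    return None     # case 3: overlap, the filter failed

def exact_compare(fx, fy):
    """Exact fallback: re-evaluate the predicate with rational arithmetic."""
    return (fx > fy) - (fx < fy)  # -1, 0, or 1

# Example: is 0.1 + 0.2 equal to 0.3? The float intervals overlap, so the
# interval filter cannot decide and we recompute exactly.
eps = 1e-16  # crude stand-in for a properly tracked error bound
s = 0.1 + 0.2
result = compare_intervals((s - eps, s + eps), (0.3 - eps, 0.3 + eps))
if result is None:
    result = exact_compare(Fraction(1, 10) + Fraction(2, 10), Fraction(3, 10))
print(result)  # 0: exactly equal, which plain float comparison gets wrong
```

The point is that the cheap interval filter answers the vast majority of queries, and the expensive exact path only runs on the degenerate cases.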
The gold standard for exact geometric computation is CGAL [1]. All the research around it is very interesting and refreshing to read.
Hi, author here. I looked into that side of things, but CGAL only offers exact precision implementations for lines and circular arcs. The fact that Bézier curves are left as an exercise for the reader is further proof of the disconnect between computational geometry and modern computer graphics.
Fair point. At the same time the exact computation paradigm is proof that problems like these can be solved in a rigorous way if you start with the right foundation.
By the way thanks for writing the article, it was a refreshing read.
I'm curious; I've played with OpenSCAD a little, which uses CGAL, and found it to be painfully slow (with no particular point of comparison). Do you think you have a way to do similar CSG calculations faster, or at least trade speed for accuracy or something?
It's very plausible that these techniques could be used to fix the output of a CSG library that uses floating point math, but I'm not sure what the specifics would look like. If anyone has ideas in that vein, I'd be very interested to hear them.
I was thinking along the lines (based on skimming the documentation) that CGAL uses arbitrary-precision, integer-based rationals, which are slow, and that floating point with error tracking might potentially be faster.
Unfortunately, it's probably way beyond my ability to delve into it.
'Exact real arithmetic' is probably the best starting point for what you're describing here. It does have plenty of application areas besides computational geometry. The issue with simple interval arithmetic is that intervals will widen when chaining multiple computations. To cope with this and obtain rigorous answers, one needs to set a required bound on the final result and do arbitrary-precision computations as entailed by that final bound.
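A toy version of that refinement loop (the expression and API are my own example, using directed rounding from Python's decimal module as a stand-in for a real arbitrary-precision interval library): evaluate the whole expression at some precision with downward and upward rounding, and if the resulting interval is wider than the requested bound, double the precision and try again.

```python
from decimal import Decimal, Context, ROUND_FLOOR, ROUND_CEILING

def eval_interval(prec):
    """Enclose (1/3 + 1/7) * 21 using directed rounding at `prec` digits."""
    down = Context(prec=prec, rounding=ROUND_FLOOR)
    up = Context(prec=prec, rounding=ROUND_CEILING)
    lo = down.multiply(down.add(down.divide(Decimal(1), Decimal(3)),
                                down.divide(Decimal(1), Decimal(7))),
                       Decimal(21))
    hi = up.multiply(up.add(up.divide(Decimal(1), Decimal(3)),
                            up.divide(Decimal(1), Decimal(7))),
                     Decimal(21))
    return lo, hi

def refine(target_width):
    """Raise working precision until the final interval meets the bound."""
    prec = 2
    while True:
        lo, hi = eval_interval(prec)
        if hi - lo <= target_width:
            return prec, lo, hi
        prec *= 2

prec, lo, hi = refine(Decimal("1e-20"))
# The exact value is (1/3 + 1/7) * 21 = 10; the interval brackets it.
print(lo <= 10 <= hi)
```

At low precision the chained divisions make the interval far wider than the individual rounding errors, which is exactly the widening problem: the required working precision is driven by the bound on the final result, not by any single operation.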
I work on developing better tools for solving computational geometry problems robustly and efficiently, so I got quite excited when this paper first appeared.
However, while the type-theoretic developments based on Abstract Stone Duality are interesting, they mostly ignore the problem of efficiency by simply representing every real number as a Dedekind cut. Thus, it doesn't scale without significant advances in compiling real arithmetic. A problem I'm working on presently, but it might take a few years...
At https://www.3dhubs.com we're using Rust to build the geometry processing side of our platform, which is the most computationally intensive custom part. Previously it was in C++ and we're a lot happier after migrating to Rust two years ago.