The German government, and maybe to a lesser extent the EU more generally, feels that the privacy of citizens from corporations (especially foreign ones, especially American tech ones) is important to preserve and protect. Privacy from the government is a different matter. The government, due to its being duly elected by the people, has an implicit right to pry into people's affairs if such prying can be justified as serving the public interest.
It's quite a different attitude than the USA. The USA is almost unique in being individualist to what some might consider an extremist degree, to the point where if the government intends to violate someone's fundamental rights, they may do so only under very limited circumstances and must document everything and prove that those circumstances had been met. In most of the democratic world, individual rights are just one factor that needs to be balanced against public safety, public order, etc. and the government has much wider latitude to violate even constitutionally protected rights on its own say-so.
This is how you get German prosecutors on 60 Minutes, grinning and laughing as they describe the shock people undergo as they are arrested and their computers confiscated and searched over literal mean tweets. For the Germans, it's normal -- necessary, even, to have a functional society. To Americans it's abhorrent.
I went to a somewhat highly regarded (not MIT or CalTech tier) tech school, and then to a state university.
The tech school considered it a boast that it had more graduate students than undergrad. It was clear where the professors' emphasis was. I recognize the lecture halls where you couldn't ask questions, and the barely-anglophone instructors. (Everyone in the EE department, in particular, seemed to come "fresh off the boat" from China bringing precious little English knowledge with them. The prof for my introductory EE course mumbled on top of it.)
Then I went to state school. Ho-lee shit. Complete difference. The bad profs were incompetent chucklefucks who couldn't cut it in real academia. The good profs actually cared about teaching undergrads.
I learned a lot about choosing a college -- a few years and a few tens of thousands of dollars too late.
"Up to" is still doing a lot of work there. What kinds of workloads are we talking that get the big numbers, and what can we realistically expect on real workloads?
I'm reminded of 90s advertisements in which the new G3 processor was supposed to be so many times faster than the Pentium or even Pentium II. Their chosen benchmark: how long it takes to run a Photoshop plugin. On Mac OS pre-X, a Photoshop plugin got 100% of the CPU because there was no preemptive multitasking. Windows 9x versions of Photoshop had to share the CPU with whatever else was running.
> Testing conducted by Apple in January 2025 using preproduction 13-inch and 15-inch MacBook Air systems with Apple M4, 10-core CPU, 10-core GPU, and 32GB of RAM, as well as production 1.2GHz quad-core Intel Core i7-based MacBook Air systems with Intel Iris Plus Graphics and 16GB of RAM, all configured with 2TB SSD. Tested using Super Resolution with Pixelmator Pro 3.6.14 and a 4.4MB image. Performance tests are conducted using specific computer systems and reflect the approximate performance of MacBook Air.
It's not common, as only one was ever made, but the Lisp processor described in Sussman and Steele's paper "Design of LISP-based Processors, or SCHEME: A Dielectric LISP, or Finite Memories Considered Harmful, or LAMBDA: The Ultimate Opcode", had built-in, hardware-implemented garbage collection.
I was once at a meetup for Lisp hackers, and discussing something or another with one of them, who referred to Lisp as a "low-level language". When I expressed some astonishment at this characterization, he decided I needed to be introduced to another hacker named "Jerry", who would explain everything.
"Jerry" turned out to be Gerald Sussman, who very excitedly explained to me that Lisp was the instruction set for a virtual machine, which he and a colleague had turned into an actual machine, the processor mentioned above.
Lynn Conway, co-author along with Carver Mead of "the textbook" on VLSI design, "Introduction to VLSI Systems", created and taught this historic VLSI Design Course in 1978, which was the first time students designed and fabricated their own integrated circuits:
>"Importantly, these weren’t just any designs, for many pushed the envelope of system architecture. Jim Clark, for instance, prototyped the Geometry Engine and went on to launch Silicon Graphics Incorporated based on that work (see Fig. 16). Guy Steele, Gerry Sussman, Jack Holloway and Alan Bell created the follow-on ‘Scheme’ (a dialect of LISP) microprocessor, another stunning design."
The original Lisp badge (or rather, SCHEME badge):
Design of LISP-Based Processors or, SCHEME: A Dielectric LISP or, Finite Memories Considered Harmful or, LAMBDA: The Ultimate Opcode, by Guy Lewis Steele Jr. and Gerald Jay Sussman, (about their hardware project for Lynn Conway's groundbreaking 1978 MIT VLSI System Design Course) (1979) [pdf] (dspace.mit.edu)
The Great Quux's Lisp Microprocessor is the big one on the left of the second image, and you can see his name "(C) 1978 GUY L STEELE JR" if you zoom in. David's project is in the lower right corner of the first image, and you can see his name "LEVITT" if you zoom way in.
Here is a photo of a chalkboard with status of the various projects:
The final sanity check before maskmaking: A wall-sized overall check plot made at Xerox PARC from Arpanet-transmitted design files, showing the student design projects merged into multiproject chip set.
One of the wafers just off the HP fab line containing the MIT'78 VLSI design projects: Wafers were then diced into chips, and the chips packaged and wire bonded to specific projects, which were then tested back at M.I.T.
We present a design for a class of computers whose “instruction sets” are based on LISP. LISP, like traditional stored-program machine languages and unlike most high-level languages, conceptually stores programs and data in the same way and explicitly allows programs to be manipulated as data, and so is a suitable basis for a stored-program computer architecture. LISP differs from traditional machine languages in that the program/data storage is conceptually an unordered set of linked record structures of various sizes, rather than an ordered, indexable vector of integers or bit fields of fixed size. An instruction set can be designed for programs expressed as trees of record structures. A processor can interpret these program trees in a recursive fashion and provide automatic storage management for the record structures. We discuss a small-scale prototype VLSI microprocessor which has been designed and fabricated, containing a sufficiently complete instruction interpreter to execute small programs and a rudimentary storage allocator.
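The core idea in that abstract -- programs stored as trees of linked records, interpreted recursively -- can be sketched in a few lines. This is my own illustration in Python, not the paper's actual design or microcode; Python lists stand in for the linked record structures, and the operator names are invented for the example:

```python
# Sketch: a recursive interpreter over program trees, in the spirit of
# the abstract's "processor [that] can interpret these program trees in
# a recursive fashion". Lists play the role of linked record structures.

def evaluate(expr, env):
    if isinstance(expr, str):          # a symbol: variable lookup
        return env[expr]
    if not isinstance(expr, list):     # a number: self-evaluating
        return expr
    head, *rest = expr
    if head == "quote":                # (quote x) -> x, unevaluated
        return rest[0]
    if head == "if":                   # (if c t e): recurse into one branch
        cond, then, alt = rest
        return evaluate(then if evaluate(cond, env) else alt, env)
    if head == "lambda":               # (lambda (params) body) -> closure record
        params, body = rest
        return ("closure", params, body, env)
    # application: evaluate operator and operands, then recurse into the body
    _, params, body, closed = evaluate(head, env)
    args = [evaluate(a, env) for a in rest]
    return evaluate(body, {**closed, **dict(zip(params, args))})

print(evaluate(["if", 1, ["quote", "yes"], ["quote", "no"]], {}))  # -> yes
```

The real chip added what this sketch leaves out: the storage allocator and garbage collector that manage those record structures in hardware.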
Here's a map of the projects on that chip, and a list of the people who made them and what they did:
Just 29 days after the design deadline time at the end of the courses, packaged custom wire-bonded chips were shipped back to all the MPC79 designers. Many of these worked as planned, and the overall activity was a great success. I'll now project photos of several interesting MPC79 projects. First is one of the multiproject chips produced by students and faculty researchers at Stanford University (Fig. 5). Among these is the first prototype of the "Geometry Engine", a high performance computer graphics image-generation system, designed by Jim Clark. That project has since evolved into a very interesting architectural exploration and development project.[9]
Figure 5. Photo of MPC79 Die-Type BK (containing projects from Stanford University):
The text itself passed through drafts, became a manuscript, went on to become a published text. Design environments evolved from primitive CIF editors and CIF plotting software on to include all sorts of advanced symbolic layout generators and analysis aids. Some new architectural paradigms have begun to similarly evolve. An example is the series of designs produced by the OM project here at Caltech. At MIT there has been the work on evolving the LISP microprocessors [3,10]. At Stanford, Jim Clark's prototype geometry engine, done as a project for MPC79, has gone on to become the basis of a very powerful graphics processing system architecture [9], involving a later iteration of his prototype plus new work by Marc Hannah on an image memory processor [20].
[...]
For example, the early circuit extractor work done by Clark Baker [16] at MIT became very widely known because Clark made access to the program available to a number of people in the network community. From Clark's viewpoint, this further tested the program and validated the concepts involved. But Clark's use of the network made many, many people aware of what the concept was about. The extractor proved so useful that knowledge about it propagated very rapidly through the community. (Another factor may have been the clever and often bizarre error-messages that Clark's program generated when it found an error in a user's design!)
9. J. Clark, "A VLSI Geometry Processor for Graphics", Computer, Vol. 13, No. 7, July, 1980.
[...]
The above is all from Lynn Conway's fascinating web site, which includes her great book "VLSI Reminiscence" available for free:
These photos look very beautiful to me, and it's interesting to scroll around the hires image of the Quux's Lisp Microprocessor while looking at the map from page 22 that I linked to above. There really isn't that much to it -- even though it's the biggest one, it isn't all that complicated -- so I'd say that "SIMPLE" graffiti is not totally inappropriate. (It's microcoded, and you can actually see the rough but semi-regular "texture" of the code!)
This paper has lots more beautiful Vintage VLSI Porn, if you're into that kind of stuff like I am:
A full color hires image of the chip including James Clark's Geometry Engine is on page 23, model "MPC79BK", upside down in the upper right corner, "Geometry Engine (C) 1979 James Clark", with a close-up "centerfold spread" on page 27.
Is the "document chip" on page 20, model "MPC79AH", a hardware implementation of Literate Programming?
If somebody catches you looking at page 27, you can quickly flip to page 20, and tell them that you only look at Vintage VLSI Porn Magazines for the articles!
There is quite literally a Playboy Bunny logo on page 21, model "MPC79B1", so who knows what else you might find in there by zooming in and scrolling around stuff like the "infamous buffalo chip"?
I remember seeing a Java microprocessor for sale years ago. It claimed that the CPU's native instruction set was Java bytecode.
I can't find that exact microcontroller that I remember, I think the domain is gone, but there are other things like this, including some FPGA cores which make the same claim that I remember from that microcontroller I read about in the early 2000s. I wonder how those would perform compared to a JVM running on a traditional instruction set on the same FPGA.
Nah, it was a processor whose native instruction set was Java bytecode. It did garbage collection natively, and all the rest. It was not Jazelle, nor an ARM CPU that interpreted bytecode and ran it.
I think it was the "aJile" processor listed in your final link, but I'm not 100% sure. It was over 20 years ago that I read about it, and I was about to buy a development kit when I got pulled off all the Java work I was doing.
WebDAV didn't come out until the back half of the 90s, and it was slow to be adopted at first.
Back in the day, you could author a web page directly in GruntPage, and publish it straight to your web server provided said server had the FPSE (FrontPage Server Extensions), a proprietary Microsoft add-on, installed. WebDAV was like the open-standards response to that. Eventually in later versions of FrontPage the FPSE was deprecated and support for WebDAV was provided.
Managers should not be evaluated based on code output -- it's not their job. However, writing code here and there -- to evaluate new technologies, make a rough prototype, or demonstrate a technique to be adopted by individual contributors -- may aid them in their management responsibilities and should be embraced when it does.
I've seen what happens when a manager is also responsible for individual coding duties. He ended up with roughly twice the work, shifting between two mutually incompatible mental modalities all the time, cranky with his subordinates and making a lot of sad phone calls to his fiancée explaining that he'd be late home from work, again. Not a good fate for any worker, even if the pay and prestige are better.
Except those technologies are now deprecated and you don't know when they might be removed. Jetpack Compose is now the vendor-favored way to build apps, so best practice is to use that.
I don't care what "best practices" are. Seemingly everyone sticks to these, yet here we are discussing that software quality everywhere throughout the industry has taken a dip.
> Except those technologies are now deprecated and you don't know when they might be removed.
Views and activities and XML layout will never be removed, of that I'm certain. After all, Compose does use views in the end. That's the only way to build UIs that the system itself understands. And, unlike SwiftUI, Compose itself isn't even part of the system, it's a thing you put inside your app.
I don't care about deprecations. Google has discredited itself for me and its abuse of the @Deprecated annotation is one of the reasons. The one thing that's very unfortunate is that all tools unquestionably trust that the person who puts @Deprecated in the code they maintain knows what they're doing, and nothing allows you to selectively un-deprecate specific classes or packages; you can only ignore all deprecations in your class/method/statement.
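To make the granularity problem concrete, here's a minimal sketch (my own example, with hypothetical names): the only escape hatch the toolchain offers is `@SuppressWarnings("deprecation")`, and it silences every deprecated use in the annotated scope rather than un-deprecating one chosen API:

```java
// Sketch of the all-or-nothing suppression granularity.
class LegacyApi {
    @Deprecated                       // the maintainer's word is taken as final
    static int oldCall() { return 42; }
}

public class Demo {
    // Silences ALL deprecation warnings in this method, not just for
    // LegacyApi.oldCall -- there is no per-API un-deprecation.
    @SuppressWarnings("deprecation")
    public static void main(String[] args) {
        System.out.println(LegacyApi.oldCall());
    }
}
```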
And, by the way, I also ignore the existence of Kotlin. I still write Java, albeit Java 17. The one time I had to deal with Kotlin code (at a hackathon) it felt like coding through molasses.