Your link says ſ (the long s) didn't disappear (from English) until several hundred years after the movable type printing press, and makes no mention of physical problems when using that letter, suggesting instead that its removal gave type a more modern feel:
> Pioneer of type design John Bell (1746–1831), who started the British Letter Foundry in 1788, is often "credited with the demise of the long s".[12] Paul W. Nash concluded that the change mostly happened very fast in 1800, and believes that this was triggered by the Seditious Societies Act. To discourage subversive publications, this required printing to name the identity of the printer, and so in Nash's view gave printers an incentive to make their work look more modern.
It's more of a pet theory I have. The 1787 Printer's Grammar mentions the following:
"Kerned Letters being attended with more trouble than other Sorts, Founders are sometimes sparing in casting them; whereas they rather require a larger number than their Casting-Bill specifies; considering the chance which Kerned Letters stand, to have their Beaks broke, especially the Roman f, when it stands at the end of a line, where it is exposed to other accidents, besides those from the lye-brush: but in still more danger are Kerned Letters of the Italic; especially d f l, when they stand, with their Beaks unguarded, at the end of lines; and at the beginning of lines, f g j [long s] y run a great hazard; though of these, f and [long s] in particular are most liable to suffer."[0]
So, foundries are less likely to cast letters that break easily. This is just 4 years before Bell dropped the long s, so while the other reasons outlined in the Wikipedia article are probably the main ones, I speculate that it was also an economic decision based on them breaking quite easily, especially when the new "modern" look required ever sharper and finer details.
And my point was that it is (partly) this material aspect of typography that contributed to the disappearance of a whole letter from the English written language. It doesn't really matter that it happened hundreds of years after the "invention" of the printing press; it's still related to it.
Related to it, perhaps, but not so relevant, since my comment was in response to mousethatroared writing 'As I understand it, English lost a lot of characters when the movable type printing press was created.'
Also, the long s is not a letter of the English alphabet but rather a form of the letter 's', like how ꝛ (the r rotunda) is an archaic variant form of the letter 'r'. https://en.wikipedia.org/wiki/R_rotunda
Similar, I believe, to how Greek σ becomes ς in word-final position, but both are lower-case sigma.
For more, read Paul Nash's "The abandoning of the long s in Britain in 1800", which mentions the material and economic aspects, but then digs deeper into why it happened so suddenly in 1800 (which he speculates is related to the Act).
My wish is for a fast SVG renderer in the browser. At the moment, basic vector drawing is fast, but almost any use of filters or effects lags the browser. There's a lot that SVG could do for (web) UI, but won't, because it's so slow. Here's a small thought experiment I made a while ago for using SVG for more unconventional web design. Sadly it lags a lot.
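To make the "filters lag" point concrete, here is a minimal sketch of the kind of usage I mean (plain TypeScript against the DOM; the element id, counts and sizes are made up for illustration): a feGaussianBlur filter over a group of animated shapes, which renderers tend to re-evaluate on every frame.

```typescript
// Sketch only: assumes the page contains <svg id="stage" width="800" height="600">.
const SVG_NS = "http://www.w3.org/2000/svg";
const svg = document.querySelector<SVGSVGElement>("svg#stage");
if (!svg) throw new Error('expected an <svg id="stage"> element in the page');

// Define a blur filter once, in <defs>.
const defs = document.createElementNS(SVG_NS, "defs");
const filter = document.createElementNS(SVG_NS, "filter");
filter.setAttribute("id", "soften");
const blur = document.createElementNS(SVG_NS, "feGaussianBlur");
blur.setAttribute("stdDeviation", "8");
filter.appendChild(blur);
defs.appendChild(filter);
svg.appendChild(defs);

// A group of circles with the filter applied to the whole group.
const group = document.createElementNS(SVG_NS, "g");
group.setAttribute("filter", "url(#soften)");
for (let i = 0; i < 200; i++) {
  const circle = document.createElementNS(SVG_NS, "circle");
  circle.setAttribute("cx", String(Math.random() * 800));
  circle.setAttribute("cy", String(Math.random() * 600));
  circle.setAttribute("r", "12");
  circle.setAttribute("fill", `hsl(${(i * 7) % 360}, 70%, 60%)`);
  group.appendChild(circle);
}
svg.appendChild(group);

// Animating the filtered group forces the blur to be recomputed every frame;
// this is where the lag shows up compared with plain, unfiltered vector drawing.
let t = 0;
function tick() {
  t += 0.016;
  group.setAttribute("transform", `translate(${Math.sin(t) * 40}, 0)`);
  requestAnimationFrame(tick);
}
requestAnimationFrame(tick);
```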
I can also highly recommend Enzo Mari's Autoprogettazione furniture. Although slightly more involved in construction, all you need is standard planks, a hand saw, a hammer and some nails. The instruction PDF can be found online (chairs in the latter half):
Two of the core ideas are that the majority of the work can be undertaken by a single person with basic carpentry skills, using readily available materials in standard sizes so there is minimal cutting and waste.
Having built a couple of smaller structures, I don't see why these can't be done today. Ignoring the current trend of building for curb appeal instead of practicality, you can build a house using standard-size materials (8/10/12/16 foot). Even studs come in 92 5/8" to accommodate the top/bottom plates in 8-foot walls.
I'm always intrigued by the Segal method, but it's so closely tied to the sizes of construction materials available at the time and I can't help wondering if anything has changed since.
I've been thinking about doing this, and my main thought is that the insulation standards were probably effectively non-existent at the time, beyond "as few draughts as possible"!
Edit: not sure if these will be geo-blocked, but there are a couple of programmes about that project here:
Oh, absolutely. It's just that I've seen some of the original buildings in south east London and they looked absolutely freezing - single glazing with metal-framed windows. The walls looked like thin painted plywood, but I don't think that can be the case. It looks like they've updated them now: https://programme.openhouse.org.uk/listings/1615
Just looking through that PDF, unlike the chair in the blog, some of that furniture is not as robust. On pages 46 & 47 the load rests on the corner of the wood, and pages 46, 47 & 52 put almost all the load onto the screws.
The chair in the blog benefits from having essentially all the load borne by the wood; any screws or nails would be superficial only. We have several good hardwood chairs here with dovetail joints and spring-based cushions - and they are excellent.
The chair in the blog is not robust. It is simple, but not robust. It contains a hinge with an extreme amount of force placed on a small area of wood. It will deform very quickly, changing the angle of the recline. Similarly, the sharp points in contact with the ground will wear/weather quickly, putting it out of level. Rustic and utilitarian, but not meant to last. Imho.
I think you should check your own knowledge before second-guessing Enzo Mari - a designer who did the work and had extensive knowledge of form and materials spanning decades.
This is not an "argument from authority" but "Chesterton's Fence".
Maybe Enzo's implementation was fantastic, but the problem with not specifying these things is that it's not clear whether one or one hundred nails should be used to secure each part.
One thing to observe is that people were lighter back then (i.e. prior to 1974, when this was published), so the load-bearing capacity of a chair was less important than it is today. Also bear in mind that wood has become far more expensive, and people today would likely be using less dense wood.
On page 52 for example, each leg (E) is nailed/screwed into C by just three points. The C wood itself is in a strong configuration, but the legs are almost an afterthought. Without any lower support (e.g. as shown on page 56), the nails/screws will eventually be levered out. The actual loading on the leg itself is not great, with three nails/screws seemingly aligned with the grain. The result would be a split that runs along the grain, and that may have been what happened to the left leg in the picture.
Yeah, and not specifying them - as well as not specifying the assembly steps, which for some models aren't obvious; I know, I built the bed frame on page 24 - is part of the learning process.
"Autoprogettazione" is not a DIY book, or a guide, or a procedure. It is a project to gift common people the insight of what goes into making an object. The result Enzo seeks is not that you end up with another object/furniture in your house, but that you end up with a new appreciation of what makes an object stand up in 3D and, for example, support your weight, or flatter your eye.
If you want to follow step-by-step orders, there were and are already hundreds of books for that, and thousands of workers do the same in factories - just executing orders.
Very nice! Thanks for that. I really like basic agricultural do-it-yourself furniture. We should reclaim our furniture and be less dependent on IKEA et al. I'm defo going to try the adjustable table at the top of the PDF, that's a work of art.
Ironically, companies like IKEA have started selling products aimed at people making their own furniture; the "outdoor bench made from pallets" is pretty popular, and IKEA & co sell cushions just for those.
I never understood those tbh, used pallets are splintery. I wouldn't be surprised if you can buy pallets specifically made for use in upcycle projects.
fake edit after a quick search: yup, you can buy readymade pallet benches or benches "inspired by" pallet projects.
Like Amazon continuing the DIY door desk long after it became more expensive than just buying desks. For some reason, people will pay a premium for the refurbished industrial look.
My understanding is that Amazon continues it for 2 reasons: it's similarly priced, but more importantly they can get the doors at massive scale and they're all the same. Styles, colors and materials of desks change over time. They sometimes have to source ~10k desktops in a quarter with little lead time. You can't do that with desks, but you can with doors.
Or at least that's why they told us we wouldn't get white desks in Seattle when we had them in SFO, even though we got the same legs.
What is also interesting in Enzo Mari's concept is that there are no instructions; you have to figure out the best order of operations and how to offset some planks against others.
And, last but not least, the great Christopher Schwarz and the team at Lost Art Press just put out a whole video series and book on how to make a highly respectable chair design from very basic materials and tools:
> Can it be brutalist when you have the richness of wood in full display, veins, shimmer and knots all apparent?
Insofar as brutalism is about showcasing the raw building materials, yes, I think this is precisely what brutalism is about. Brutalism often uses concrete, but the big idea is to showcase the underlying material. (And if wood is more beautiful than concrete, great!)
IME, most people who complain about brutalism don't know what it is anyway, and even if they roughly do, they are only familiar with decades-old, run-down versions and not the original vision.
I totally agree. I think the neo-liberal university model is the real culprit. Where I live, universities get money for each student who graduates: up to 100k euros for a new doctorate. This means that the university and its administration want as many students to graduate as possible. The (BA and MA) students also want to graduate within the target time: if they do, they get a huge part of their student loans forgiven.
What has AI done? I teach a BA thesis seminar. Last year, when AI wasn't used as much, around 30% of the students failed to turn in their BA theses. A 30% drop-out rate was normal. This year, only 5% dropped out, while the amount of ChatGPT-generated text has skyrocketed. I think there is a correlation: ChatGPT helps students write their theses, so they're not as likely to drop out.
The university and the admins are probably very happy that so many students are graduating. But some colleagues also see an upside to this: if more graduate, the university gets more money, which means fewer cuts to teaching budgets, which means that the teachers can actually do their job and improve their courses for those students who are actually there to learn. But personally, as a teacher, I'm at a loss as to what to do. Some theses had hallucinated sources, some had AI slop blogs as sources, and the texts are robotic and boring. But should I fail them, out of principle about what the ideal university should be? Nobody else seems to care. Or should I pass them, let them graduate, and reserve my energy for teaching those who are motivated and willing to engage?
I can say from some working experience in the United States that way too many jobs require a university degree. I remember being an intern, or at my first job after uni (which I struggled a great deal to complete), looking around and thinking: "There is no way that all of these people need a uni degree to do their jobs." I couldn't believe how easy work was compared to my uni studies (it was hell). I felt like I was playing at life with a cheat code (infinite lives, or whatever). I don't write that to brag; I am sure many people here feel the same. So many jobs at megacorps require little more than common sense: come to work on time, dress well, say your pleases and thank-yous, be compliant, do what is asked, etc. Repeat, and you will have a reasonable middle-class life.
Then there's Europe, where making it easy to get a master's degree just led to jobs requiring people to waste time getting yet another unneeded degree.
This entire situation was predictable, and I personally called it out years ago - not because of some unique ability, but because this is what happened in India and China decades upon decades ago.
There are only so many jobs that pay a good salary.
So everyone had to become a doctor, lawyer or engineer. Business degrees were seen as washouts.
Even for the job of a peon, you had to be educated.
So people followed incentives and got degrees - in any way or form they could.
This meant that degrees became a measure, and they were then ruthlessly optimized for, till they stopped having any ability to indicate that people were actually engineers.
So people then needed more degrees and so on - to distinguish their fitness amongst other candidates.
Education is what liberal arts colleges were meant to provide - but this worked only in an economy that could still provide employment for all the people who never wanted to be engineers, lawyers or doctors.
This mess will continue indefinitely, because we simply cannot match/sort humans, geographies, skills, and jobs well enough - and verifiably.
Not everyone is meant to be a startup founder. Or a doctor. Or a plumber, or a historian or an architect or an archaeologist.
It's a jobs market problem, and it has been this way ever since the American economy stopped being able to match people with money for their skills.
Yep, it's a job market problem. Only degrees that are somehow limited in their supply will continue to hold value; the rest approach worthlessness. Neither the state nor the universities have any interest in limiting the supply.
In my country doctors earn huge salaries and have 100% job security, because their powerful interest groups have successfully lobbied to limit the number of grads below the job market's demand. Other degrees don't come even close.
I agree. I tend to think though that the best way forward is to ignore all of these education issues and just focus on raising the floor. The difference between a "good-paying job" and a "not-so-good-paying job" should be small, and everyone should be able to have a good life regardless of what job they have. Then people can choose to go to college if they want to learn about things, and maybe to learn about subjects related to a job they want, but not because they think it's a way to make more money.
Well, see Germany. They do it pretty well. The expected lifetime earnings of university graduates and of those who took the trade/apprenticeship route are very similar. Does anyone know of other countries like that? Is it also true in Austria or Switzerland?
This is why you need the degree. HR has a stack of resumes a mile high, if they can throw out all the non-degrees to narrow the field then their job is easier.
> Some theses had hallucinated sources, some had AI slop blogs as sources, and the texts are robotic and boring. But should I fail them, out of principle about what the ideal university should be?
No, you should fail them for turning in bad theses, just like you would before AI.
That's probably what should happen, but it's not what happens in reality. In grading I have to follow a very detailed grading matrix (made by some higher-ups), and the requirements for passing and getting the lowest grade are so incredibly low that it's almost impossible to fail if the text even somewhat resembles a thesis. The only way I could fail a student is if they cheated, plagiarised or fabricated stuff.
The person who used the AI slop blog as a source, we just asked to remove it and resubmit. The person who hallucinated sources is, however, being investigated for fabrication. But this is an incredibly long process to go through, which takes away time and energy from actual teaching / research / course prep. Most of the faculty is already overworked and on the verge of burnout (or recovering post-burnout), so everybody tries to avoid it if they can. Besides, playing cop is not what anybody wants to do, and it's not what teaching should be about, as the original blog post mentioned. If the university as an institution had some standards and actually valued education, it could be different. But it isn't. The university only cares about some imaginary metrics, like international rankings and money. A few years ago they built a data center costing millions just for gathering data on everything that happens in the university, so they could make more convincing presentations for the ministry of education — to get more money and to "prove" that the money had a measurable impact. The university is a student factory (that's a direct quote from a previous principal).
Yeah, our information and training systems are kinda failing at dealing with the reality of our actual information environment.
Take law and free speech, for example - a central tenet of a functional democracy is having effective ways to trade ideas.
A core response in our structure to falsehoods and rhetoric is counter speech.
But I can show you that counter speech fails. We have reams upon reams of data inside tech firms and online communities that show us the mechanics of how our information economies actually work, and counter speech does diddly squat.
Education is also stuck in a bind. People need degrees to be employable today, but the idea of education is tied up with the idea of being a good educated thinking human being.
Meaning you are someone who is engaged with the ideas and concepts of your field and has a mental model in your head - one that takes calories, training and effort to use for complex reasoning about the world.
This is often overkill for many jobs - the issue isn't doing high-level stats in a data science role, it's doing boring data munging and actually getting the data in the first place. (Just an example.)
High quality work is hard, and demanding, and in a market with unclear signals, people game the few systems that used to be signals.
Which eventually degraded the signal until you get this mess.
We need jobs that give a living wage, or provide a pathway to achieving mastery while working, so that the pressure on the education lever can be reduced and spread elsewhere.
I get the feeling that you aren’t asking for the short version, because most people wouldn’t latch onto that point and create an account for it.
Hmmm.
An example - the inefficacy of fact-checking efforts. Fact checking is quintessentially counter speech, and we know that it has failed to stop the uptake and popularity of falsehoods. And I say this after speaking to people who work at fact-checking orgs.
However, this is in itself too simple an example.
The mechanics of online forums are more interesting to illustrate the point - Truth is too expensive to compete with cheaper content.
Complex articles that debunk certain points can be shared in a community, but the community doesn't read them. They do engage heavily with emotional content, which ends up supporting their priors.
I struggle to make this point nicely, but the accuracy of your content is secondary to its value as an emotional and narrative utility for the audience.
People are not coming online to be scientists. They are coming online to be engaged. Counter speech solves the issue of inaccuracy, and is only valuable if inaccuracy is a negative force.
It is too expensive a good to produce, vs alternatives. People will coalesce around wounds and lacunae in their lives, and actively reject information that counters their beliefs. Cognitive dissonance causes mental strife, and people will simply reject information rather than alter their priors.
Do note - this is a point about the efficacy of this intervention in upholding the effectiveness of the market where we exchange ideas. There will be many individual exchanges where counter speech does change minds.
But at a market level, it is ineffective as a guardian and tonic against the competitive advantage of falsehoods against facts.
——
Do forgive the disjointed quality of this response. It's late here, and I wish I could have just linked you to a bunch of papers, but I don't think that would have been the response you were looking for.
I've been recommending Network Propaganda recently. The book has the data that makes the case better than I can about structural issues in the information ecosystem.
Also started going through this legal essay (paper?) recently: "Lies, Counter-lies, and Disinformation in the Marketplace of Ideas".
The book "Nexus" by Yuval Noah Harari essentially makes this same point. The way he phrases it is that information's primary role throughout history hasn't necessarily been to convey objective truth but to connect people and enable large scale cooperation. So more information is not necessarily better.
Worth a read or you can check out one of his recent podcast appearances for a quicker download.
In The Netherlands, we have a three-tier tertiary system: MBO (practical job education / trades), HBO (college job education / applied college) and WO (scientific education / university).
A lot of the fancy jobs require WO. But in my opinion, WO is much too broad a program, because it tries to both create future high tier workers as well as researchers. The former would be served much better by a reduced, focused programme, which would leave more bandwidth for future researchers to get the 'true' university education they need.
> In grading I have to follow a very detailed grading matrix (made by some higher-ups), and the requirements for passing and getting the lowest grade are so incredibly low that it's almost impossible to fail if the text even somewhat resembles a thesis. The only way I could fail a student is if they cheated, plagiarised or fabricated stuff.
This is another example of "AI is exacerbating existing problems". :-) That kind of grading policy is absurd and should never have existed in the first place, but now AI is really making that obvious.
I've talked with professors at a major US research university. The Master's students are all paying a lot of money to get a credential. That's the transaction. The professors don't really care about cheating as long as the students go through the motions of completing the assigned work. It's just a given, and like you say, it takes more time than they have to go through the academic dishonesty process for all the students who are getting outside help or (now) using AI.
The larger work of the intellectual and academic forces of a liberal democracy is that of "verification".
A core part of the output is showing that it is actually what it claims to be.
The reproducibility crisis is a problem precisely because a standard was missed.
In a larger perspective, we have mispriced facts and verification processes.
They are treated as public goods, when they are hard to produce and uphold.
Yet they compete with entertainment and "good enough" output, which is cheaper to produce.
The choice to fail or pass someone doesn’t address the mispricing of the output. We need new ways to address that issue.
Yet a major part of the job you do is to hold the result up to a standard.
You and the institutions we depend on will continue to be crushed by these forces. Dealing with that is a separate discussion from the pass or fail discussion.
> Some theses had hallucinated sources, some had AI slop blogs as sources, and the texts are robotic and boring. But should I fail them, out of principle about what the ideal university should be?
I don't think you should fail them - instead, give them feedback on how to improve their thesis themselves, and how to make better use of tools like ChatGPT.
If, instead of flat-out failing to turn in their thesis, they are submitting work that needs more iteration due to bad use of AI, that sounds like a net win to me. The latter can be turned into something useful.
I did consider that, but the 1980 IBM Displaywriter uses a filled downwards triangle, not a house character, to indicate the center line [0].
But you're right that the Displaywriter-inspired 1984 DisplayWrite DOS program [1] did use the house character for the same purpose. (Although CP437 also included a filled downwards triangle character, at 0x1F.)
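For reference, a tiny sketch of those two code points (TypeScript, with the glyph mapping written out by hand, since standard codecs treat this range as control characters rather than the glyphs the IBM PC character ROM actually drew):

```typescript
// Sketch only: the display glyphs CP437 assigned to two code points
// that most codecs would otherwise treat as control characters.
const cp437DisplayGlyphs: Record<number, string> = {
  0x1f: "▼", // filled downwards triangle, as on the Displaywriter
  0x7f: "⌂", // the "house" character DisplayWrite used for the center line
};

for (const [code, glyph] of Object.entries(cp437DisplayGlyphs)) {
  const hex = Number(code).toString(16).toUpperCase().padStart(2, "0");
  console.log(`0x${hex} -> ${glyph}`);
}
```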
I'd wager serious money that if you put that on a sign and surveyed people, at least in the US, they'd all still conclude it is a "New York" to "London" flight.
What's the use of a communication tool, if it doesn't actually communicate anything to real people?
In my region at least, -5 ~ -2°C, or -5°C ~ -2°C.
If something is confusing people, we replace it with a suitable substitute. Re-educating people is really just a last resort. Is there anything keeping us from changing it other than ego?
I highly recommend reading this paper^[0] on permacomputing. It explains the concept in depth.
> In this paper, we argue for the potential of permacomputing as a rich framework for exploring creative design constraints building on a long history of applying constraints in art, design and cultural practices.
https://en.m.wikipedia.org/wiki/Long_s
AFAIK it dropped out of use because the top hook of the long s punch broke easily, and it could easily be replaced with a basic s.