Oh wow. So I read "The Bug" of hers just after it came out (14 years ago) and it's such a poignant read about someone being slowly driven mad because he can't find a very peculiar bug which unpredictably haunts the sales people demo'ing their product. It's a great read and I highly recommend it to anyone here. The technical accuracy is hilarious too but you don't need to be a programmer whatsoever to understand the novel. I won't spoil the reveal of the actual bug or the ending but it's very much worth it.
After reading it I loaned my hardcover copy to a guy named Boris when I told him how much I liked it. Boris: I don't know where you are or what you're up to nowadays, but if you somehow end up reading this, I kinda want it back. Ping me.
And the terrifying thing about language is that we can still strive to be precise and yet (perhaps even unknowingly) be wildly misunderstood...perhaps malevolently. I've heard someone describe programming as telling a very precise story to a very stupid human. However, I think it would be better to describe a compiler as a more humble listener. One who is willing to say "I'm sorry, you just told me to multiply a string, and I think I'm misunderstanding you." Thinking of compiler errors as an expression of humility can explain the comfort one finds in a stronger type system.
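The "multiply a string" example can be made concrete. A toy illustration in Python (the function name is mine, purely for illustration), where the interpreter plays the humble listener instead of guessing:

```python
def humble_listener():
    # Python refuses to guess what multiplying two strings means;
    # it reports its confusion rather than producing garbage.
    try:
        return "price" * "quantity"
    except TypeError as err:
        return f"I'm sorry, I think I'm misunderstanding you: {err}"
```

A static type checker such as mypy would raise the same objection before the program ever runs, which is the comfort of a stronger type system the comment describes.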
It is the same comfort one finds by not speaking publicly on twitter but speaking to a person who has the charity to say "huh? I think you just said this thing that sounds wildly offensive, and I think I'm misunderstanding you."
I would actually like to see the opposite...get rid of all elective coursework and try to graduate students in two years. They'll have an incomplete education, but we all have incomplete educations.
If anything, we as a society and as individuals are creating more harm by pointlessly overeducating ourselves.
Most of the CS that students learn in school will be obsolete in a few years, and they probably won't be using it in their jobs anyway.
But the humanities they learn will be useful their whole lives.
Are trees and graphs suddenly not used anymore? Are loops and folds and calculus invalidated? I harp here once a month about how even senior Google engineers and Amazon principals don't know the optimal asymptotic time complexity for sorting 50m pairs of 32-bit unsigned numbers, as the answer changed substantially earlier this decade.
The humanities are more important than they get credit for, but this argument that everything is invalidated and industry teaches you everything is absurd.
I suspect we'll be teaching quicksort forever, because it's a really good example of a divide and conquer algorithm.
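Quicksort really is the canonical teaching example of divide and conquer: partition around a pivot, recurse on each side, combine. A minimal (non-in-place) Python sketch of that structure:

```python
def quicksort(xs):
    # Base case: a list of 0 or 1 elements is already sorted.
    if len(xs) <= 1:
        return xs
    # Divide: partition the rest of the list around a pivot.
    pivot, rest = xs[0], xs[1:]
    left = [x for x in rest if x < pivot]
    right = [x for x in rest if x >= pivot]
    # Conquer and combine: sort each half recursively, then concatenate.
    return quicksort(left) + [pivot] + quicksort(right)
```

Production implementations partition in place and choose pivots more carefully, but the recursive shape is exactly what makes it a durable classroom example.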
But that's not really the premise I was responding to. Plus, I obviously enjoy being The Person In The Room talking about how the state of the art in sorting puts it at linear time, right up against the information-theoretic limits for bitwise comparisons.
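For fixed-width keys like 32-bit unsigned integers, the classical route to linear time is radix sort, which makes O(n) passes over the data rather than O(n log n) comparisons (whether the commenter has this or a more recent technique in mind isn't stated). A minimal LSD radix sort sketch:

```python
def radix_sort_u32(nums):
    # Least-significant-digit radix sort over four 8-bit digits.
    # Each pass distributes the numbers into 256 buckets by one byte,
    # then concatenates the buckets in order; bucket appends preserve
    # relative order, so the sort is stable and the passes compose.
    for shift in (0, 8, 16, 24):
        buckets = [[] for _ in range(256)]
        for n in nums:
            buckets[(n >> shift) & 0xFF].append(n)
        nums = [n for bucket in buckets for n in bucket]
    return nums
```

Four passes over n items is O(n) for this fixed key width, which is why the comparison-sort lower bound of Ω(n log n) doesn't apply here.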
I've never been a fan of quicksort as an algorithm. It's more like, "Let's fit this sort option to the weird hardware constraints we arrived at instead of asking for better hardware."
However, if I had one criticism it would be that the interview and resulting article focus quite a bit on her being a victim of prejudice rather than focusing on the great work she has done. But maybe that's me.
I changed the title to a phrase from the article which doesn't cover the whole thing but at least covers some of it.
In reality the algorithms are doing exactly what the humans would do. School districting for example...this has to be the most objective determination in public policy. In 99% of the US the school districts are already set. Where you go to school is determined by your home address or some other rule. None of the humans involved are particularly motivated to find a spot in their hearts to bend these rules out of pity...so why would the algorithms? No one is going to risk their job seating your child in another school district because their nuanced worldview has resolved that you merit being an exception. Bureaucracy is a meat computer.
The reality is that public policy has become algorithmic by public demand. Look at California's "three strikes" law. This is a very simple algorithm that has widespread public support, regardless of actual applicability.
Fairness is also a policy motivation. The public has rejected policies that are unfairly applied. We actually have little desire to grant a policy bureaucrat dictatorial power to apply benefits as they see fit simply by virtue of their having read Descartes.
CS majors should be required to take at least 20 hours of philosophy or history. Or, be required to do 20 hours of community service. It's too easy to live in a code bubble and forget the rest of the world lives on $2 a day. That's why we have Uber which solves problems for the first world and the third world remains a disaster.
There are plenty of financiers who are draining the wealth of the world into their own pockets who were educated in the humanities.
Yeah, that sounds a bit like a blank check for someone to corrupt the generated output under the guise of a nebulous "It's better for our community" while scaremongering about the big bad algorithm.
> Algorithms make decisions all through his borough, the East Bronx, including where children go to school, where police patrols are assigned, and the schedule for garbage pickup.
Which seems a bit bizarre, open source algorithms would be nice, but the tie in to 'the humanities' seems tenuous at best.
I think it's an interesting topic, though. Do tech companies need more humanities grads? I'm skeptical because plenty of other industries have caused their own social problems, and the humanities haven't saved them. Look at advertising, which was originally built by artists rather than hackers.
It may well be true that big tech companies need to be more socially aware and environmentally conscious, but that doesn't mean "the humanities" is necessarily the answer.
What would help, as a start, is an appreciation for the values (and value) of the humanities.
On HN specifically, you frequently see something akin to Schadenfreude when referring to students of the liberal arts serving fries to tech grads. That is to some degree just factually wrong, considering many liberal arts undergrads go on to rather lucrative careers in law, or powerful careers in politics.
But it is also short-sighted, because history has always been a two-step process, with technology enabling us to do new things, and other disciplines enabling us to use these capabilities for something resembling the "common good".
I always like this thought experiment: Would you rather live in a world with the Middle Ages' social/political/legal system and today's technology–or the other way around?
This doesn't mean that social and political progress are the exclusive domain of those who studied Greek mythology. In fact, this article explicitly argues that it shouldn't be: Hackers should engage with that community.
But what it does take is, sometimes, a bit of humility. Too often, I feel the tech community is looking at technological solutions to social and political problems. See cryptocurrencies, or the encryption debate as examples where you can feel cynicism, bordering on disdain, for the political process.
The problem with that is, first, that it's wrong: a lot of power rests in politics, and China is a living example that the internet community is currently unable to win a direct confrontation with the power of the state. It is also wrong in that it portrays all governments as equally bad, all laws as subject to being changed or broken when expedient, when clearly there are differences in the rule of law between countries, and there are differences in the willingness of politicians to act with integrity.
Secondly, using technology to create facts is somewhat undemocratic in itself. The Federal Reserve may be committing all sorts of errors. But it is hard to deny that it derives legitimacy from a transparent, tested process much more democratic than bitcoin's. It is also an illusion to suggest that bitcoin is free of ideology, only because that ideology takes the form of an algorithm decided in the past, ticking away with almost no method to influence it.
You say that as though technology isn't part of the political process.
If the last few years have taught us anything it's that the general population is not a fan of the existing banking laws. But that hasn't resulted in reform because the banks and law enforcement agencies have disproportionate political power and prevent that reform.
Technologies that can disrupt the balance of power that keeps the broken status quo aren't apolitical, but that doesn't make them dirty. They're a tool that can be used for good or for ill. It's not as if the other side isn't using their own tools. AES didn't exist in 1950 but neither did IMSI catchers, room 641A, the Utah Data Center, always-on surveillance devices in everyone's pocket, etc.
And resisting government overreach with maths is a whole lot cleaner than doing it with bullets and IEDs, which are also part of politics.
Ullman's book probably doesn't oversimplify like that, but the title of the article alludes to it.
What I would like to know is, what are some good examples of industries that are doing the right thing, that could be used to guide the way forward for tech companies?
I'm very skeptical that there are any industries that have successfully self-policed in this way. But there are many examples of industries that have exploited shared resources and human weaknesses for a quick profit, and inflicted lasting harm in the process: oil, tobacco, intensive farming, fashion, cars, guns, advertising. (And just like social media, most of those have their good points too!)
Tech and social media in particular is an outlier because it's so successful, but even so it's not totally unique -- the economic and social effect of oil is massive too.
One possible positive example that occurs to me is nuclear weapons. They could have proliferated massively, but so far they haven't quite. Those weapons are developed by states rather than corporations, though.
I really object to the stereotype of socially-awkward programmers being to blame for everything, because we don't understand how real squishy humans work. The problems we face are the same ones that have faced every highly successful and transformative industry. They're at the root of business and capitalism.
The publications generally regarded as the top, such as The Economist, the New York Times, and, until this year, The Wall Street Journal: they are not catering to the lowest instincts. When they have made mistakes (somebody will inevitably bring up Judith Miller), they have self-corrected, with new policies, and by firing people.
I also wouldn't consider the wish for "more humanities in tech" an attack on you personally, or on any single person. In fact, Google, Apple, and many smaller companies have an excellent track record on many issues. Their leadership actually has an appreciation for non-technology issues far better than most companies'. Yet the example of Apple engineers preparing to go to jail to protect their customers stands out, because others (Yahoo, AT&T) so eagerly cooperated to intrude on customers' privacy.
I do worry, though, that it's being degraded by the widespread removal of the firewall between the news side and the advertising side of the business.
That appears undemocratic. Part of the solution to this tension is to point out that democracy is more than just majority rule, nicely enshrined in the saying "Democracy isn't two wolves and a sheep voting on what's for dinner".
Another part is the custom that constitutions are mostly concerned with the process of democracy, not the contents. It's somewhat tenuous in light of the US constitution's amendments, some of which seem to deal with subjects that don't directly relate to the political process. But I think a case can be made that, for example, religion is constitutionally protected because it plays an important role in society's deliberations.
I once saw the late Justice Scalia, for whom I usually have nothing but disdain, at a talk, and he had a relevant point: Too often, people call for a constitutional amendment to enshrine their ideology, removing it from being a subject of the usual political process. That is, I believe, valid for attempts to elevate animal rights, or restrictions on abortions, to the constitution, among other things.
I think this discussion mostly serves to show that politics is almost never about easy decisions. Because easy decisions are made silently, never to appear in the news or become the subject of partisan strife. But the tech community has a tendency to believe in oh-but-that's-so-obvious, and the result of that belief is, for example, the idea that "smart contracts" could replace the ambiguity and complexity of law in capturing the myriad circumstances of life that may make an appearance in a contract, or a dispute.