Ellen Ullman on the importance of making algorithms accessible to the public (slackhq.com)
50 points by fagnerbrack 8 months ago | 48 comments



Tangentially related due to it being Ellen Ullman.

Oh wow. So I read her novel "The Bug" just after it came out (14 years ago), and it's such a poignant read about someone being slowly driven mad because he can't find a very peculiar bug that unpredictably haunts the sales people demoing their product. It's a great read and I highly recommend it to anyone here. The technical accuracy is hilarious too, but you don't need to be a programmer whatsoever to understand the novel. I won't spoil the reveal of the actual bug or the ending, but it's very much worth it.

After reading it I loaned my hardcover copy out to a guy named Boris when I told him how much I liked it. Boris, I don't know where you are or what you're up to nowadays, but if you somehow end up reading this, I kinda want it back. Ping me.



> The beauty of language is that we can be imprecise and still be understood.

And the terrifying thing about language is that we can still strive to be precise and yet (perhaps even unknowingly) be wildly misunderstood, perhaps even malevolently. I've heard someone describe programming as telling a very precise story to a very stupid listener. However, I think it would be better to describe a compiler as a more humble listener. One who is willing to say "I'm sorry, you just told me to multiply a string, and I think I'm misunderstanding you." Thinking of compiler errors as an expression of humility can explain the comfort one finds in a stronger type system.
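To make that concrete, here's roughly the exchange I have in mind (TypeScript chosen arbitrarily as the statically typed stand-in; the snippet and its variable name are just my illustration):

    // tsc refuses to compile this: a string isn't a valid operand
    // for arithmetic, so the "multiplication" never happens.
    const nonsense = "hello" * 3;

The compiler objects before the program ever runs, which is exactly the "I think I'm misunderstanding you" move.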

It is the same comfort one finds by not speaking publicly on Twitter but instead speaking to a person who has the charity to say "huh? I think you just said this thing that sounds wildly offensive, and I think I'm misunderstanding you."


I don't think ascribing humility to a piece of code makes any kind of sense.


I would love to see an article on this topic - this one is not it though. The reason: we struggle at many major universities to determine this balance. On the one hand, students need a lot of CS in order to become mildly proficient; there is just so much to learn. On the other, school is a last chance to expose students to some of the big ideas of the world, the stuff (as Keating says in Dead Poets) that makes life worth living. How can you do both in 4 short years?


High school is where students should be getting their breadth. At some point you have to learn what you will do, and balance that out with a realization that more and more school just equals more and more debt.

I would actually like to see the opposite...get rid of all elective coursework and try to graduate students in two years. They'll have an incomplete education, but we all have incomplete educations.

If anything, we as a society and as individuals are creating more harm by pointlessly overeducating ourselves.


> On the one hand, students need a lot of CS in order to become mildly proficient

Most of the CS that students learn in school will be obsolete in a few years, and they probably won’t be using it in their job anyway.

But the humanities they learn will be useful their whole lives.


What exactly goes obsolete?

Are trees and graphs suddenly not used anymore? Are loops and folds and calculus invalidated? I harp here once a month about how even senior Google engineers and Amazon principals don't know the optimal asymptotic time complexity for sorting 50m pairs of 32-bit unsigned numbers, as the answer changed substantially earlier this decade.
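For the curious, here's a toy sketch of the kind of non-comparison sort behind that claim (the function name and code are my own illustration in TypeScript, sorting bare 32-bit keys rather than the pairs I mentioned, and certainly not anyone's production code):

    // Toy LSD radix sort for 32-bit unsigned keys, one byte per pass.
    // Four bucket passes regardless of n, so it's O(n) overall,
    // versus the O(n log n) lower bound for comparison sorts.
    function radixSortU32(keys: number[]): number[] {
      for (const shift of [0, 8, 16, 24]) {
        const buckets: number[][] = Array.from({ length: 256 }, () => []);
        for (const k of keys) buckets[(k >>> shift) & 0xff].push(k);
        keys = buckets.flat(); // stable: preserves order from earlier passes
      }
      return keys;
    }

Each pass is a stable bucket-by-byte pass, so after the fourth pass the keys are fully sorted.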

The humanities are more important than they get credit for, but this argument that everything is invalidated and industry teaches you everything is absurd.


If all one gains from a CS degree is knowing a few programming languages, they either weren't paying attention or their school's program was less than stellar.


Hogwash. If your CS program isn't teaching the general fundamentals (algorithms, data structures, basic machine architecture, compiler theory, etc) that's unfortunate. But a CS degree isn't a 4 year equivalent of a programming certificate.


If it becomes obsolete, then it isn't computer science.


We could argue quicksort could be rendered obsolete soon.


We still teach all sorts of slow and inefficient sorting algorithms, because it helps to understand the core principles of sorting algorithms and computational complexity. I don't think I've ever actually used a bubble sort or a selection sort in anger, but the process of studying those algorithms taught me to reason more generally about algorithms, which is really the entire point. Computer science is a branch of mathematics; mathematics isn't really about knowing things, but about knowing how to work things out.

I suspect we'll be teaching quicksort forever, because it's a really good example of a divide and conquer algorithm.
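For anyone who hasn't seen it in a while, the whole divide-and-conquer idea fits in a few lines (a textbook-style sketch in TypeScript, not an in-place or production version):

    // Textbook quicksort: pick a pivot, split the rest into smaller
    // and not-smaller parts, sort each recursively, glue back together.
    function quicksort(xs: number[]): number[] {
      if (xs.length <= 1) return xs;
      const [pivot, ...rest] = xs;
      const left = rest.filter(x => x < pivot);
      const right = rest.filter(x => x >= pivot);
      return [...quicksort(left), pivot, ...quicksort(right)];
    }

That split-recurse-combine shape is the thing worth teaching, far more than the constant factors.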


I don't disagree! It has didactic value.

But that's not really the premise I was responding to. Plus, I obviously enjoy being The Person In The Room talking about how the state of the art in sorting puts it at linear time and right up against the edge of the information-theoretic limits for bitwise comparisons.


Obsolete or not, I'd hope it remains in the collection of sorting algorithms that are presented to students in an undergraduate algorithms course. It's instructive to compare and contrast the strengths and weaknesses of these various approaches.


Do any courses include radix or discrimination sort yet? Mine sure didn't.

I've never been a fan of quicksort as an algorithm. It's more like, "Let's fit this sort option to the weird hardware constraints we arrived at instead of asking for better hardware."


I think quicksort is a fantastic illustration of the 'divide and conquer' principle and for that reason alone it deserves a spot in any kind of CS course. Besides that, it is also a fairly good sorting algorithm with an interesting pathological case that is fortunately reasonably easy to test for. All in all, when you need to sort stuff, quicksort does just fine as a first choice.


I don't disagree that it has didactic value. I think in industry its time is probably going to decline.


That’s just not true.


She strikes me as a person who has a lot of interesting things to say. I haven't read her books, so I thank the author for the exposure.

However, if I had one criticism it would be that the interview and resulting article focus quite a bit on her being a victim of prejudice rather than focusing on the great work she has done. But maybe that's me.


I recently finished her “Close to the Machine”, and I loved it.


Can someone point me to what the argument here is for why "hackers need the humanities"? I can't seem to find it at the link.


I imagine someone else made up that title and thought 'close enough'. They obviously don't know Hacker News readers...

I changed the title to a phrase from the article which doesn't cover the whole thing but at least covers some of it.


In the article, she never directly states that hackers need to study the humanities. What she was basically saying - in the part about Vacca - is that there needs to be human oversight to algorithms that affect public policies (school districting, police paroles, ...). So the implication, in that example, is that programmers should work with people in the social sciences to deliver solutions that may not be algorithmically optimal, but better meet the needs of the community.


This assumes that humans will be compassionate, will show pity, and will always bend the rules to benefit those in need, whereas algorithms are harsh and cruel.

In reality the algorithms are doing exactly what the humans would do. School districting for example...this has to be the most objective determination in public policy. In 99% of the US the school districts are already set. Where you go to school is determined by your home address or some other rule. None of the humans involved are particularly motivated to find a spot in their hearts to bend these rules out of pity...so why would the algorithms? No one is going to risk their job seating your child in another school district because their nuanced worldview has resolved that you merit being an exception. Bureaucracy is a meat computer.

The reality is that public policy has become algorithmic by public demand. Look at California's "three strikes" law. This is a very simple algorithm that has widespread public support, regardless of actual applicability.

Fairness is also a policy motivation. The public has rejected policies that are unfairly applied. We actually have little desire to grant a policy bureaucrat dictatorial power to apply benefits as they see fit simply by virtue of their having read Descartes.


[dead]


We've already asked you to please post civilly and substantively, so we've banned the account.

https://news.ycombinator.com/newsguidelines.html


The problem with the hacker mindset as it applies to algorithms affecting lives is that most folks who studied computer science (or any of the hard sciences, for that matter) usually went through programs that never approach human interaction beyond how to hook the user and keep them involved.

CS majors should be required to take at least 20 hours of philosophy or history, or be required to do 20 hours of community service. It's too easy to live in a code bubble and forget that the rest of the world lives on $2 a day. That's why we have Uber, which solves problems for the first world while the third world remains a disaster.


I think we have Uber because you can make money from people who can afford a taxi ride, but you can't from someone who lives on $2/day. Not that I disagree that learning about people in real hardship is important.


Why shouldn't a philosophy major do community service? How does reading Foucault or Plato make someone inherently virtuous?

There are plenty of financiers who are draining the wealth of the world into their own pockets who were educated in the humanities.


Sure, some computer people live in a bubble. But the idea that humanities professors are going to help change that, instead of making them even more deluded, is I think wrong. The humanities are currently a toxic breeding ground for pro-statist, pro-racist, and pro-sexist ideas. It's only because their sexist and racist views express the kind of sexism and racism that government types approve of that they get promoted as being "humanitarian".


Exactly. If you want to learn about the human condition, you don't enroll in a literature program at Yale, you go work at McDonald's for a year.


> may not be algorithmically optimal, but better meet the needs of the community

Yeah, that sounds a bit like a blank check for someone to corrupt the generated output under the guise of a nebulous "It's better for our community" while scaremongering about the big bad algorithm.


Corruption could happen in any case, but I still agree with having human oversight [1].

[1] https://en.wikipedia.org/wiki/All_models_are_wrong


She is only asked about it in the last question, and mentions a New York City politician wanting open source algorithms:

> Algorithms make decisions all through his borough, the East Bronx, including where children go to school, where police patrols are assigned, and the schedule for garbage pickup.

Which seems a bit bizarre; open source algorithms would be nice, but the tie-in to 'the humanities' seems tenuous at best.


Perhaps it was left out in editing, or is made in Ullman's recently published book Life in Code: A Personal History of Technology.


This is a nice little interview, but the title is very misleading! The word "humanities" only appears once outside the title and the answer doesn't really address the title's claim.

I think it's an interesting topic, though. Do tech companies need more humanities grads? I'm skeptical because plenty of other industries have caused their own social problems, and the humanities haven't saved them. Look at advertising, which was originally built by artists rather than hackers.

It may well be true that big tech companies need to be more socially aware and environmentally conscious, but that doesn't mean "the humanities" is necessarily the answer.


I don't believe she's necessarily calling for the hiring of humanities grads.

What would help, as a start, is an appreciation for the values (and value) of the humanities.

On HN specifically, you frequently see something akin to Schadenfreude when referring to students of the liberal arts serving fries to tech grads. That is to some degree just factually wrong, considering many liberal arts undergrads go on to rather lucrative careers in law, or powerful careers in politics.

But it is also short-sighted, because history has always been a two-step process, with technology enabling us to do new things, and other disciplines enabling us to use these capabilities for something resembling the "common good".

I always like this thought experiment: would you rather live in a world with the Middle Ages' social/political/legal system and today's technology, or the other way around?

This doesn't mean that social and political progress are the exclusive domain of those who studied Greek mythology. In fact, this article explicitly argues that it shouldn't be: hackers should engage with that community.

But what it does take is, sometimes, a bit of humility. Too often, I feel the tech community is looking at technological solutions to social and political problems. See cryptocurrencies, or the encryption debate as examples where you can feel cynicism, bordering on disdain, for the political process.

The problem with that is, first, that it's wrong: a lot of power rests in politics, and China is a living example that the internet community is currently unable to win a direct confrontation with the power of the state. It is also wrong in that it portrays all governments as equally bad, and all laws as subject to being changed or broken when expedient, when clearly there are differences in the rule of law between countries, and there are differences in the willingness of politicians to act with integrity.

Secondly, using technology to create facts is somewhat undemocratic in itself. The Federal Reserve may be committing all sorts of errors. But it is hard to deny that it derives legitimacy from a transparent, tested process that is much more democratic than bitcoin's. It is also an illusion to suggest that bitcoin is free of ideology, only because that ideology takes the form of an algorithm decided in the past, ticking away with almost no method to influence it.


> Too often, I feel the tech community is looking at technological solutions to social and political problems. See cryptocurrencies, or the encryption debate as examples where you can feel cynicism, bordering on disdain, for the political process.

You say that as though technology isn't part of the political process.

If the last few years have taught us anything, it's that the general population is not a fan of the existing banking laws. But that hasn't resulted in reform, because the banks and law enforcement agencies have disproportionate political power and prevent that reform.

Technologies that can disrupt the balance of power that keeps the broken status quo aren't apolitical, but that doesn't make them dirty. They're a tool that can be used for good or for ill. It's not as if the other side isn't using their own tools. AES didn't exist in 1950, but neither did IMSI catchers, room 641A, the Utah Data Center, always-on surveillance devices in everyone's pocket, etc.

And resisting government overreach with maths is a whole lot cleaner than doing it with bullets and IEDs, which are also part of politics.


I guess what I'm reacting to is not specifically this interview, but the notion I've seen in a few places that "big tech companies need the humanities". For example, John Naughton in the Guardian: https://www.theguardian.com/commentisfree/2017/nov/19/how-te...

Ullman's book probably doesn't oversimplify like that, but the title of the article alludes to it.

What I would like to know is, what are some good examples of industries that are doing the right thing, that could be used to guide the way forward for tech companies?

I'm very skeptical that there are any industries that have successfully self-policed in this way. But there are many examples of industries that have exploited shared resources and human weaknesses for a quick profit, and inflicted lasting harm in the process: oil, tobacco, intensive farming, fashion, cars, guns, advertising. (And just like social media, most of those have their good points too!)

Tech and social media in particular is an outlier because it's so successful, but even so it's not totally unique -- the economic and social effect of oil is massive too.

One possible positive example that occurs to me is nuclear weapons. They could have proliferated massively, but so far they haven't quite. Those weapons are developed by states rather than corporations, though.

I really object to the stereotype of socially-awkward programmers being to blame for everything, because we don't understand how real squishy humans work. The problems we face are the same ones that have faced every highly successful and transformative industry. They're at the root of business and capitalism.


This is a contentious opinion, but I believe journalism, at least in part, may be a good example.

The publications generally regarded as the top, such as The Economist, the New York Times, and, until this year, The Wall Street Journal, are not catering to the lowest instincts. When they have made mistakes (somebody will inevitably bring up Judith Miller), they have self-corrected, with new policies, and by firing people.

I also wouldn't consider the wish for "more humanities in tech" an attack on you personally, or on any single person. In fact, Google, Apple, and many smaller companies have an excellent track record on many issues. Their leadership actually has an appreciation for non-technology issues far better than most companies'. Yet the example of Apple engineers preparing to go to jail to protect their customers stands out, because others (Yahoo, AT&T) so eagerly cooperated to intrude on customers' privacy.


You're right, journalism is a good positive example.

I do worry, though, that it's being degraded by the widespread removal of the firewall between the news side and the advertising side of the business.


I agree that it's an error to describe bitcoin as free of ideology, but regarding the fact that it's an algorithm decided in the past and hard to change, wouldn't that describe Constitutions too?


That is, indeed, an apt comparison. Constitutions are a strange beast, because they are not subject to majority rule, but require supermajorities far beyond 50%.

That appears undemocratic. Part of the solution to this tension is to point out that democracy is more than just majority rule, nicely enshrined in the saying "Democracy isn't two wolves and a sheep voting on what's for dinner".

Another part is the custom that constitutions are mostly concerned with the process of democracy, not the contents. It's somewhat tenuous in light of the US constitution's amendments, some of which seem to deal with subjects that don't directly relate to the political process. But I think a case can be made that, for example, religion is constitutionally protected because it plays an important role in society's deliberations.

I once saw the late Justice Scalia, for whom I usually have nothing but disdain, at a talk, and he had a relevant point: Too often, people call for a constitutional amendment to enshrine their ideology, removing it from being a subject of the usual political process. That is, I believe, valid for attempts to elevate animal rights, or restrictions on abortions, to the constitution, among other things.

I think this discussion mostly serves to show that politics is almost never about easy decisions, because easy decisions are made silently, never to appear in the news or become the subject of partisan strife. But the tech community has a tendency to believe in oh-but-that's-so-obvious, and the result of that belief is, for example, the idea that "smart contracts" could replace the ambiguity and complexity of law in capturing the myriad circumstances of life that may make an appearance in a contract, or a dispute.


Actually, modern advertising (post-WW2) was built by psychologists; the artists just dressed up the intentions. Edward Bernays pivoted from being a propagandist to advertising and virtually created the foundation for Madison Avenue as we understand it. Until social media and its dopamine-manipulation strategies, advertising was focused on the manipulation of the viewer.


Huh, interesting! I was under the impression that the typical ad executive of the 50s and 60s would have studied philosophy or classics or something like that -- wannabe writers and artists who got tempted into high-paying jobs on Madison Avenue, similar to today's wannabe mathematicians and physicists who end up working at tech companies and hedge funds. But maybe that's an incorrect stereotype.


I think big tech companies are perfectly aware of how people react to their actions.


In fact, do they not study reactions in order to maximize the amount of time spent on their platforms?


Everyone needs the humanities. You go to Princeton and you're gonna find Brian Kernighan teaching a computer science and digital humanities crossover class and explaining tech basics to the 'educated laymen'. The people at the top of the field know this; why haven't the rest of us figured this out yet?



