One of the dumbest things I’ve ever published (dcurt.is)
143 points by vernalkick on May 30, 2019 | 61 comments



It's frustrating to read articles making the kinds of points that Dustin's original piece did, and it's slightly cheering to see somebody change their mind. But this:

"There is no valid counterargument to the abstract idea. The fact is that if a company had better data and analyzed it more completely, they could obviously produce better experiences for their users."

I think you don't need hindsight to understand that this is wrong. Modern ad-driven companies work to maximize 'daily user interactions', not quality of experience. A lot of people think these are strongly correlated, but I think optimizing for interactions is likely to drive your products in bad directions for users.


>"There is no valid counterargument to the abstract idea. The fact is that if a company had better data and analyzed it more completely, they could obviously produce better experiences for their users."

There is a simple counterargument to the phrasing: could does not imply would.

When you have relatively little information about your users, the two goals "make the user happy" and "make the user engage more with the product" are approximately the same goal. So making the user happy = engagement, with little data.

But when you have a lot of data about your users' emotions and behavior, the two goals become separate. A typical business is going to pursue the second goal, not the first. They're not charities.

It's sort of like an archer shooting at two neighboring targets. With fuzzy vision, they look like the same target, but with clearer vision, the archer sees them separately. The users want the archer to hit one target, but the archer will aim for the other. So, the users may be better off if the archer's vision is a little fuzzy, to make the targets blend together.


This is a good take, well phrased.

To add another example:

In the abstract, the fact is that a government with absolute monitoring and enforcement power over its citizens could obviously move faster to improve society and address injustices.

However, like you said, could does not imply would, and most people would say that giving governments absolute power to monitor and enforce their whims on citizens is a terrible idea.


> if a company had better data and analyzed it more completely, they could [...]

They could do a better job at doing whatever it is that solely-for-profit companies do!

Which may include "provide a better user experience" but may also include other activities such as "sell the data of their customers to third parties for profit" or "make ads for unnecessary products even more accurately targeted" or "more effectively scrape the contact books of their customers to provide better sales leads" or "use individual data to influence custom pricing models to offer higher prices or decline service to some potential customers who are statistically less likely to be attractive customers" or "more accurately predict where individuals associated with government regulation might be trying to constrain the company's business activities" ...


> > "There is no valid counterargument to the abstract idea. The fact is that if a company had better data and analyzed it more completely, they could obviously produce better experiences for their users."

> I think you don't need hindsight to understand that this is wrong. Modern ad-driven companies work to maximize 'daily user interactions', not quality of experience. A lot of people think these are strongly correlated, but I think optimizing for interactions is likely to drive your products in bad directions for users.

Unfortunately that is also not a counterargument to "the abstract idea": this shows that companies don't produce better experiences, not that they couldn't produce better experiences. In fact I think, much as I am on your side on this, that the evidence shows that companies with more information can and do provide better experiences—for just long enough to normalise the privacy violation and lower expectations, at which point they turn from their users to the true audience of advertisers and other, uh, curious parties.


Another way to think of it is that you can't truly optimize for user interests if you're not beholden to them; and if you take possession of their data, you're no longer beholden to them. To serve your users fundamentally means to empower them.


Case in point: does anyone honestly think pay-to-play games are "better" products for the user? Charging easily hundreds, if not thousands, of dollars for what used to be a single purchase, and doing so in such cookie-cutter fashion that the games are basically all the same game with different graphics?


When you optimize too hard for a proxy, the correlation that made you choose that proxy in the first place no longer holds, because at the limits you are dealing with details, not the big picture.
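That dynamic can be sketched with a toy model (both metric shapes below are invented purely for illustration): a proxy like "engagement" that always rewards more interruptions, versus the real goal, "satisfaction", which peaks and then falls. An optimizer pointed at the proxy lands far from where users are happiest.

```python
# Toy illustration: optimizing a proxy metric vs. the true goal.
# Both curves are made up, but they capture the shape of the problem.

def engagement(notifications_per_day):
    # Proxy: keeps rising with more notifications (diminishing returns),
    # so a proxy-optimizer always pushes for more.
    return notifications_per_day ** 0.5

def satisfaction(notifications_per_day):
    # True goal: peaks at a moderate level, then drops as the
    # notifications become a nuisance.
    return notifications_per_day - 0.1 * notifications_per_day ** 2

candidates = range(21)
best_for_proxy = max(candidates, key=engagement)    # 20: "send as many as possible"
best_for_users = max(candidates, key=satisfaction)  # 5: users peak at a moderate level
```

Note that the two objectives agree at first (more notifications raise both curves), which is exactly why the proxy looks safe with "fuzzy vision"; they only diverge once you can resolve the difference and keep optimizing.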


Here's the gist of the original piece (from 2014):

> Apple is going to realize very soon that it has made a grave mistake by positioning itself as a bastion of privacy against Google, the evil invader of everyone’s secrets. The truth is that collecting information about people allows you to make significantly better products, and the more information you collect, the better products you can build. ...

https://dcurt.is/privacy-vs-user-experience

How the times change. Not a whiff of Facebook. Snowden who? Google was still mostly not evil.

From the more recent article:

> I argued in my essay that Tim Cook had conflated privacy with security. He may have. But in the five years since 2014, the following fact has become absurdly clear to me: there is no difference between privacy and security. Security is an illusion, just like the lock on your front door. ...

In time we'll all come to realize that privacy is merely a special case of security. What can't be seen can't be attacked.

Until then, a lot of damage will have been wrought on people too distracted or naive to care.


> In time we'll all come to realize that privacy is merely a special case of security.

I can hardly wait. I can't tell you the number of times I've been lambasted for asserting that very thing.


Overall, I agree.

It's a bit amusing, though, that the original piece wasn't far off. I've heard the 2015 MacBook Pro was the best laptop they've made, and since then there have been a number of missteps. I was unfortunate enough to personally experience the awful lack of a tactile escape key on one of the newer MBP+Touch Bar laptops for about a year.


It doesn't take data science to understand that putting a purely visual button on a device that people do not look at while using is a tremendous blunder.

A computer company forgetting how a keyboard works. Really now.


Yeah, in retrospect, I probably should have emphasized that I was amused by the coincidence, not that there was causation.

Obviously, gathering usage statistics from hardware / software analytics wouldn't have captured the fact that there are touch typists who would be quite displeased by (a) the loss of the tactile escape key, and (b) the downward quality of the keyboards in general.


First of all, I commend the author for publicly acknowledging their mistakes here. Though I admit I still struggle to understand the mindset that allowed the author to make those wrong-headed conclusions on this topic in the first place. The best I can figure is that this person bought into the great neoliberal lie of our time - the claim that most corporations will value long-term societal stability and well-being over short-term profits + market capture opportunities.

When the author writes the following, I wonder if he is still struggling today to make that connection:

> The building of tools to aggregate private information in order to ostensibly improve user experience has in fact, at scale, caused strange and negative things to happen. Some of these things are threatening totally unrelated social constructs like democracy, addiction, and human decency.

Those constructs are absolutely all related to private information in a predictable way. They're related by the fact that disrupting those fields can be very profitable. Disrupting democracies at scale is profitable to governments. Addiction has been profitable for all of human history. Training people to believe that human decency can be expressed through which brands of products they buy or do not buy is incredibly profitable for certain brands that sell a certain "ethos", while simultaneously convincing buyers that any other sort of direct action (protests, legislation) is unnecessary because the market can make meaningful change if people would only choose to buy from more ethical companies.

The problems we're currently facing as a society, due to the motivations of powerful corporations being misaligned with the best interests of the general public, are not unique to tech either. It's long past time to acknowledge that these outcomes are neither strange nor unpredictable.


Unpopular opinion, but: I think too much privacy is bad for society.

Consider the world where privacy is the absolute king:

- Much harder to solve crimes as police work is thwarted at every turn

- Much easier to plot terrorist attacks/mass shootings/etc

- Can't set up cameras/audio recording on your own property/phone/car/etc. without the consent of the recorded (even if they are a criminal), or you are breaking the law.

- Hard to create companies that deal with data because of the high legal barrier you have to clear

- Much easier to hide your past mistakes, making things like "background checks" less useful

A world where privacy is king has a lot of problems and a lot of avenues of abuse. Before you grab your pitchforks, I do think there is a reasonable compromise somewhere in the middle, but some of the privacy "purists" I see on this site have some really extreme views, in my opinion.


> Unpopular opinion, but: I think too much privacy is bad for society.

Not unpopular in general, just unpopular among specific groups (including HN).

I agree with you, generally. We give up certain freedoms in order to live in a civilized society as opposed to living in the woods as hermits, and I would rather err on the side of more safety / less privacy than vice versa. But there’s no right or wrong; it’s just your personal philosophy. Some people believe that without absolute privacy, the government will start imposing surveillance on everything and we will live in a dystopian, Minority Report, social credit score future. It’s the same reason a lot of people want the right to bear arms; they want the (perceived) ability to balance against a slippery slope scenario with the government or other large organizations exerting too much control.


No privacy advocates are arguing for a world where 'privacy is absolute king'. Most of the things you bring up are covered by existing, established law, much of it centuries old. The debates tend to revolve around this:

> companies that deal with data

Here the problem is the opposite - it's now not only possible but typical for private entities to accumulate massive amounts of previously private information about individuals, far beyond what the most powerful totalitarian governments ever could. We got there very quickly and our societal and legal norms and frameworks have not caught up. What's to be done about this?


I'm 100% behind all of your points. That's the world I want to live in, for the exact same reasons you do not...

The only one I would even want to bend is the "can't set up cameras/audio recording on your own property" bit. I'm not sure how that would work: in a privacy-centric world, wouldn't the person you are recording avoid the entire area to protect their privacy?


What about when those cameras start sending data to some central third party who then tracks people? What if everybody installs those cameras? Do you now have to avoid everywhere?

This is exactly what has happened with the tracking scripts from Facebook and Google. They provide some service, but the tradeoff is that you have to let them spy on your users.

So the counter-question is: "Can you not just avoid Google - to protect your privacy?"


And with Nest's new neighborhood watch feature, my dad just took me on a CCTV tour around the neighborhood, as all the neighbors had opted into sharing their doorbell feeds. It was cool. And creepy.


Damn. That's a neighborhood I'd be moving out of.


The role of the purists might be to counteract the "privacy is dead and buried" crowd.


You're being downvoted, but to a certain extent, I agree.

I personally find people like Richard Stallman extremist to the point of considering their positions to be cult-ish or religious. But I think people like that serve as a valuable counterbalance to the people at the other end of the spectrum who don't care about software freedom and are purely pragmatists (or who actively try to thwart FOSS).

At the very least, when bad things happen, and the people on the other side say "we never could have foreseen this", we can point to RMS and say, "yes, you could have".


Yes, but I've always disliked the idea that one extreme will balance out another. It's like advocates of Fox News or CNN saying their bias is justified because it balances the bias of their counterpart on the left/right. In theory, if people watched both and digested the arguments of both sides, there might be some balance. In practice, it tends to simply entrench people in confirmation bias.


I think you should consider the differences between types of extremism.

The difference between Stallman and Fox News is that Stallman's extremism is well reasoned and consistent while Fox News is only consistent in serving the interests of "the man behind the curtain". Stallman offers a base for logical reasoning while Fox News offers a bunch of positions only good for uncritical acceptance.


I honestly don't see many people arguing for the sort of absolute privacy you're positing here. People are arguing against the continued reduction of privacy.


I think people are more concerned with the invasive data collection that happens during day-to-day, casual internet/cell phone use. They are wondering why Company X would need to know your blood type, prescription history, and political preferences to make a smartphone that opens Snapchat 0.3 seconds faster.


I think you may be conflating right to privacy with certain individual rights, but I'll start with a relatively trite observation by Benjamin Franklin that you should not give up liberty for security, and if you do, you deserve neither.

Nobody said society was or should be perfectly "safe" and it's not something we should optimize for. We should optimize for individual liberty and the rule of law. (I'm an American, for what it's worth). As for rebuttals, here's what I would consider for each of your points as a potential balancing counterpoint:

> "much harder to solve crimes as police work is thwarted at every turn" - there are reasons police require warrants to obtain information. This should continue to be the case. Warrants make their job harder, of course, but it also preserves individual freedom. This balancing act is at the crux of our society and always will be. I lean towards making it harder for the state if I have to choose.

> "much easier to plot terrorist attacks / etc" - see above point about security and liberty. We are not optimizing for complete safety, and any open society will run into these issues. The alternative is not something I would like to consider, personally. The tools that might solve these issues are also readily used to suppress dissent and empower the state. Best to make it difficult to employ them.

> "can't set up cameras on your property" - this is an individual rights issue not a privacy issue. You have the right to do this on your property.

> "hard to create companies that deal with data" - I don't think this is particularly hard right now, and we could use a few more laws about this to prevent third parties from wielding more power than the state without any recourse to the individual. In the end, it's not as much about privacy but rather about rights of the individual to appeal their being "on a list" or discriminated against as a result of being arbitrarily added to the list. If you want a concrete example of this, I'd say let's talk about Credit Ratings agencies and the power they wield in the United States.

> "much easier to hide your past mistakes" making things like "background checks" less useful. - I'm pretty sure it'll still be damned near impossible to really bury this kind of stuff without significant means. Furthermore, as a hypothesis, I'd argue the general benefit to society of not having a giant database is a net positive and the outliers / background check avoiders are probably very very few statistically. Leave blacklisting to the legal system where a felon, sex-offender, etc are tagged for life but at least it's done via laws. (see the previous HN post about a company tracking which bars you've been kicked out of)

I don't feel that my views here are extreme. Mostly I'm left feeling quite powerless to do anything without being tracked and monetized by somebody (at a minimum), and worse - potentially having my real-world life altered because of some random algorithmic detection that I have not opted into, did not realize, and cannot control or appeal. That feels weird, and I think there's a middle ground between the two that we can continue to investigate.

A combination of views on optimization as well as trust in government to uphold law in the way one expects likely determines a lot about where one falls in the continuum of individual privacy vs. access to information on this. I would argue for checks and balances.


That post might have been wrong, but that doesn't make it dumb.

When something is wrong and a waste of time (in the sense that the author should have known better), that's dumb.

When something is wrong but not obviously so, and it helps us summarize existing opinions, or guide exploration, and/or otherwise advances the state of our understanding, it's not dumb.

For example, a wrong scientific hypothesis or a wrong startup thesis is not always dumb; they are only dumb when the author should have known better, but chose not to do the homework. Incidentally, this might be the easiest way to fail a YC interview - demonstrate that you think the homework is beneath you.

It's ok to be wrong, just make sure it's "the right kind of wrong".


Absolutely. Something can be wrong, but constructively so, adding to the discourse in a positive way by still raising valid points even if it draws conclusions that don't quite follow from those points. It's why I often find myself upvoting things I might really disagree with because I nonetheless believe it adds to the conversation in an interesting and constructive way. And conversely, why I sometimes downvote things I agree with when the comment is shallow, snarky, antagonistic, or dismissive of other possible views.


This article really grates on me. The author seems to live in a bubble of his own making, never once considering that maybe it wasn't the expansive changes of a whole five years that set him wrong, but the fact that he never sought an outside opinion. After all, Richard Stallman, Jaron Lanier, and countless other people have been warning about this exact moment for decades now.


That's Dustin for you.

He steals "kudos" even. If you hover over a little icon on his website to try and find out what it does, he'll record it and publish it as praise of his article.


That's a standard Svbtle thing, and yes, it's awful.

Oh! I just realized Dustin is the guy behind Svbtle.


Five years ago I believed this same thing, exactly as you described. I went all-in on Google services and left all the options to send them data on, trusting them to be the benevolent system you described. And believing that the benefit I was getting in terms of cool UX would be worth it.

Now with how horrible ad-tracking has gotten, and the decline in Google's products (Inbox especially, for me) the shine is wearing off quickly. I'm strongly considering abandoning the Google platform as completely as possible whenever my Pixel 2 bites the dust.


I used to share the exact same feeling. Two weeks ago, I finally made the jump and switched to an iPhone. The first thing I did was go back through and take more time selecting the Google services that are actually important to me, adjusting their tracking levels to ones I'm more comfortable with.

Having spent a few weeks on the phone (coming from a Samsung S7), the biggest difference I feel is that the iPhone doesn't nag me to interact with it the way my old one did. No blinking notification light, no Samsung pestering me to accept some changes to their terms and conditions, etc.


My biggest pet peeve about my Nexus 5 was the notification banner that would spawn a few minutes before an alarm was going to go off.


Google is bad, but at least they normally let you control your device; Apple locks things down so much that you can't even do that.

I really hope that there's a real market out there for people who want to own their phones and control their destinies. I really don't want to live in a world where my only choice is which feudal lord I serve.


This. Though it seems to be getting harder to root Android phones, at least from my very limited perspective. It's also why I'm a bit horrified at the direction Windows has taken, making some things nearly impossible to disable or remove. I find myself having to dig into the registry much more than I used to in order to stop annoying behavior or system-draining processes. (For example: I'm happy to stay updated with the latest security patches, but do not reboot my computer on your own authority when I may be running very long processes on it. It's rather annoying to wake up and find that a model I was training was aborted because Update decided to reboot.)


> I really hope that there's a real market out there for people who want to own their phones and control their destinies.

If there is, I haven't found anyone servicing it. This is why I've been reduced to building my own smartphone. That appears to be the only way I can keep control over my own machine.


>But in the five years since 2014, the following fact has become absurdly clear to me: there is no difference between privacy and security. Security is an illusion, just like the lock on your front door. Advanced cryptography can prevent immediate threats, but in the long run, it is impossible to keep things private at scale. Humans can only build flawed software. There will always be bugs. And thus your “private” information is not now and will never be safe in the hands of a third party, no matter how competent. The only solution is to keep the information within only your control, and that is how Apple has attempted to architect its systems.

>And: (2) If, after Snowden, Experian, and countless other examples of leaks, you think security is going to protect your privacy, you are either ignorant or insane.

This feels incoherent. Security is exactly what Apple is leaning on, e.g. by designing iMessage so they themselves can’t read the content.

And just like Google, they are also asking you trust them, e.g. to not release an update that uploads all of your plain text messages to Apple.

What’s different is the size and nature of the attack surfaces, how distributed vs. centralized they are, and how each company agrees to use the data they do legitimately collect.

There are interesting, persuasive arguments to be made in favor of the different approaches out there, but I think it’s important to have conceptual clarity.


I'm not sure if this represents a personal fault or something any self-aware, self-critical person would experience (I hope it's the latter), but when I look back on anything I wrote 4-5 years ago (or longer), at least a third of it is vaguely embarrassing, with my views becoming more nuanced and more scaffolded with experience and information as time goes by.


I think that's normal and indicates growth over time.


The dumbest thing anyone wrote (and he did it in this later article) is to personify the algorithm.

The algorithm is code, put in place by someone. It is not magical. It is not hard to predict. It doesn't run around gaining sentience.

Personifying "the algorithm" is idiotic, and it removes all the guilt from the operator, who is the one who fine-tuned all the trade-offs.

Think of what he describes as "the algorithm" as a person working as your personal assistant. If that person uses all the information about your life to help you, they are a good assistant. If instead they use that information in ways you didn't know about, just like Google does, that person is a criminal. Yet Google gets a pass, because it wasn't them, it was the algorithm.

See how dumb and detrimental (to everyone but Google) speaking of the algorithm as an entity is?


"The truth is that collecting information about people allows you to make significantly better products, and the more information you collect, the better products you can build."

This is basically right; the only thing making it wrong is the question "better for whom?"


>There is no valid counterargument to the abstract idea.

I think that there is. It's hard to explain on short notice, but it's the intersection of the abstract concepts of model overfitting, precognition based on data (such as pregnancy), and what happens when the products are not built to serve the user whose data powers them.
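The overfitting part of that intersection can be sketched in a few lines (synthetic data and invented numbers, nothing from the article): a model with as many parameters as data points "explains" everything it has seen, yet says nothing trustworthy even one step outside the data.

```python
# Toy overfitting sketch: 8 noisy points from a simple relation (y = x).
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 8)
y = x + 0.05 * rng.standard_normal(8)

overfit = np.polyfit(x, y, deg=7)  # 8 points, 8 coefficients: exact fit
honest = np.polyfit(x, y, deg=1)   # simple linear model

# The flexible model reproduces every data point, noise included ...
train_err = np.abs(np.polyval(overfit, x) - y).max()  # ~0

# ... but just outside the data, the two models tell different stories.
x_new = 1.2
pred_honest = np.polyval(honest, x_new)    # close to the true value of 1.2
pred_overfit = np.polyval(overfit, x_new)  # typically far off the mark
```

The same trap scales up: a richer model of a user can "explain" their behavior perfectly and still serve them badly outside the situations it was fit to.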


From the title and the domain name I was sure this was going to be about his 2011 article rationalizing that iPhone 4's 3.5 inch screen was a better size than larger Android screens of the day.

https://dcurt.is/3-point-5-inches


Just because Apple has changed their mind doesn't mean he was wrong? I know multiple people who won't upgrade past the iPhone SE because it is better for them. One of them has pointed out that Apple used to market iPhones as "the perfect size for your hand"---they don't say that anymore.


Absolutely. Especially because my target has always been "the biggest screen that will comfortably fit in my pocket". I strongly look forward to the coming era of folding screens. Sadly, the Galaxy Fold isn't going to cut it. One or two more generations, at least.


This seems like a post he should have published to his private diary. It comes off as very self-important and navel-gazing.

Everyone wrote dumb stuff years ago. Move on with your life.


I'd say the dumbest thing that Dustin ever posted was when he said his blog could only be updated by his email address, and then I updated his blog for him by simply spoofing the sender. Via Gmail SMTP.

Edit: Aftermath. He deleted the blog post. It's still in the Wayback Machine, I suspect. https://news.ycombinator.com/item?id=1441914


I think when it comes to superlatives and this particular author, he hit somewhere very close to the theoretical maximum a while back and it's still public and helpfully labeled.

https://dcurt.is/the-best

Although if someone has invented a mouthfeelier fork in the meantime and Curtis has written about it, I'd like to read that piece as well.


I don't know what I just read but it comes across as a delusion of grandeur.

There is nothing about the original article that makes it a 'thesis', or a 'piece', or even an 'essay' by any standard. I was scrolling up and down wondering if I'd missed something, expecting some seriously detailed research. It is being given an inflated sense of its own importance.

'Privacy vs. User Experience' is a 500 word long opinion. It starts off like this:

> Apple is going to realize very soon that it has made a grave mistake by positioning itself as a bastion of privacy against Google, the evil invader of everyone’s secrets. The truth is that collecting information about people allows you to make significantly better products, and the more information you collect, the better products you can build.

There is no thesis or structure here, it's a fucking blog post. There are no references, there is no research. It's opinion.

In my personal opinion, this follow-up is even dumber than the earlier post it criticises.

Edit: if I were to offer something more constructive, this follow-up post should be called 'Famous Last Words'.


That word is used in different ways. This level of nitpicking breaks the site guidelines, which ask:

"Please respond to the strongest plausible interpretation of what someone says, not a weaker one that's easier to criticize. Assume good faith."

https://news.ycombinator.com/newsguidelines.html


I think that's an uncharitable view of the author's choice of words. He also doesn't even use "thesis" in the way you imply, he refers to "the thesis of the essay", which unambiguously is the meaning of the word that is "the central claim of the essay".


I disagree. You accept his blog post as an essay and in those terms I'll agree, you can see a thesis in there.

But it's not an essay. It's a blog post. There are no references or sources. There is no way for you to take that post and research the underpinnings of its conclusion. It is wholly self-referential.

The original post is a hot take, pure opinion. It's not even a noteworthy post; it's just part of the Svbtle echo chamber, with inflated worth.

There is no essay in either the original post or the apology for it, and similarly there is no thesis. Language changes, and a thesis is now commonly understood to be a significant body of work with academic weight and a grounding in empiricism, or at least extensive research.

Going back to the Ancient Greek (and beyond) roots of 'thesis' to claim an innocent usage of the term is a monumental stretch.


I don't think the lack of sources makes it "not an essay". You can dispute the value of the writing, but it's a bit weird to call it pompous to use words in their most common sense:

Essay

1. A short piece of writing on a particular subject

Thesis

1. A statement or theory that is put forward as a premise to be maintained or proved.


> The original post is a hot-take, it's pure opinion.

Sure. And it's also an essay.


What the author said:

> It has a strong thesis.

As in, the gist/central meaning is good/strong.

What I think that you think he said:

> It is a strong thesis.

As in, PhD thesis.


Do you say your opinion has a good thesis? Or are you checking the dictionary for the traditional definition?

In any case, it's high-flown language for a blog.


Outside of academia, I’d wager the usage of thesis in the article is more common.


In which case, let's talk about my original essay about dcurtis' apology for publishing something dumb (in his words) and my thesis that he is talking out of his ass (sorry, πρωκτός).



