Bulverism (wikipedia.org)
56 points by apsec112 on Dec 25, 2020 | 86 comments


At this point I basically view preoccupation with fallacies, presenting arguments in formal format (premise, premise, conclusion), etc. as a proxy for immaturity. That isn't how people are persuaded, that isn't how people ever were persuaded, and that isn't how people ever will be persuaded. It also isn't how people have ever been able to access any sort of truth. Western philosophy has been incredibly effective at tearing apart everything that exists, but utterly ineffective in building anything back up from the ashes.

It's only after you've tried, many times over, in futility, to build a coherent & effective belief system from first principles that you'll realize this approach is a dead end. All we have are beliefs, and our brains retroactively make up stories about why we think they are true. In this framework it is actually very useful to illustrate how your ideological opponent came to their beliefs. Then people can determine whether they find that story relatable.


>That isn't how people are persuaded, that isn't how people ever were persuaded, and that isn't how people ever will be persuaded.

Why would anybody have to be persuaded? The point of thinking isn't just to manipulate people.


The point is to persuade yourself. Or rather, to satisfy the collection of niggling doubts[1] we call the "inner critic". That voice constantly questioning us in our own heads, poking holes in our logic, and ensuring we're actually thinking and not pretending to think.

The problem with fallacies, as I see it, is they're often employed to fool that inner critic. They can dress up any emotional, thoughtless reaction in the guise of reason.

"I thought that person was rude" becomes "I believe this person is engaging in an ad hominem attack." Or "they have a belief I don't like" becomes "Clearly, the other person has a case of confirmation bias."

I think our inner critic will detect us taking such mental shortcuts; it won't be quieted so easily. I also think that's why fallacy-first thinkers will spam their views across threads. Unable to persuade themselves, let alone others, they resort to rhetoric to silence their own creeping doubts.

[1] http://www.peirce.org/writings/p107.html


The original article is about a manipulation technique, hence the focus on that aspect


This comment reminds me of a tweet I saw years ago, that roughly said "programmers read posts like a compiler: stopping immediately & printing an error when they see something they disagree with"

Read the very next couple of sentences in my original comment.


The point of the fallacies is in line with your point. These are arguments we believe, even though they are not logically sound.

That is, learning them will help you identify your own beliefs that are worth questioning. And, in this case, what behavior you may use that makes it hard to engage with others on their beliefs.


> That is, learning them will help you identify your own beliefs that are worth questioning.

But the OP's point is that they're actually not a particularly good way of identifying that (particularly the rhetorical fallacies). Many if not most things people believe which are incorrect do not involve any logical fallacy. Conversely, if I wanted to identify which of my current beliefs are probably incorrect, beliefs I hold largely because of the appeal of authorities supporting them, for example, would probably be a very bad place to focus my doubts.


I disagree. Authority is a very common basis for why you believe something. Broadly, we call this trust. It isn't necessarily a bad thing.

This can also explain why a rhetorical appeal won't work. Authority works, but only with a shared belief in the authority. Such that if your message requires introducing a new authority to your audience, it has less of a chance of working. Zero, if there is an active lack of belief in your authority figure.

If examined with empathy, it can also let you accept someone's belief without believing it.


I don't think you actually do disagree. :)

Authority is a very common base for why I believe things. 'Appeal to authority' is also a classical logical fallacy, i.e. 'but this conflicts with the positions of most scientists' is regarded as a weak and inadmissible argument in a debate versus someone who claims to have performed an experiment themselves. However, this logical fallacy is not particularly useful in assessing which of my own beliefs are incorrect, as suggested by the OP, because beliefs I have in common with most scientists are unlikely to be the beliefs I most need to question.


> Many if not most things people believe which are incorrect do not involve any logical fallacy.

Every false belief involves a logical fallacy, either the fallacy of false premises or a fallacy of reasoning, because starting with true premises and employing sound logic will always produce a true conclusion.


And herein lies the immaturity the grandparent was talking about.

"starting with true premises and employing sound logic will always produce a true conclusion"

is only useful where premises can be determined to be true, and the conclusion follows a logical syllogism end to end.

This is only done (or even achievable) for quantifiable / statements of fact ("a book is on the table", "the earth is round", "the GDP is 10 trillion" and the like).

For 99% of what humans care about - claims and their veracity, values, morals, politics, goals, tastes, aesthetics, interests, etc. - there are no "true statements" to begin with, and most of them don't involve an end-to-end logically sound syllogistic process, except in their simplest sub-parts.

E.g. two political enemies might have different opinions on what to do with the national debt, but they can agree on the actual amount of the national debt, and use pure logic to calculate it, check where it was spent, etc. That, though, won't help them determine logically and agree on whether something like welfare or military spending is good or bad, and it's usually those latter questions that matter in such discussions.

The thing is, people still name all those other things (that are not amenable to a logical process end to end) as true and false, based on their own opinions.

Logicians can disagree all they want, this ain't gonna change.

And it's those "true" and "false" statements that matter in most arguments, not their less interesting, quantifiable, easily-accepted-by-everybody, logically analysed brethren.

The "let's point people to logical fallacy lists, as the solution" crowd doesn't understand this.


> This is only done (or even achievable) for quantifiable / statements of fact

Those are also the only things which can be unambiguously incorrect. So the problem isn't that there are incorrect beliefs that are not due to fallacies; it is that there are beliefs which are not knowably correct or incorrect, which is a very different issue. But even there, if we are to discuss them at all and ask whether our beliefs are correct, the same analytical tools allow us to discuss why and whether those beliefs are justified while getting beyond rhetorical trickery. This allows us to pinpoint where disagreement lies, and either to converge on belief rationally, or to identify disagreement on premises that are not subject to resolution through logic (and which, if they are to be resolved, must be resolved through other means).

> That, though, wont help them determine logically and agree on whether something like welfare or military spending is good or bad, and it’s usually those latter questions that matter in such discussions.

To the extent that there is a difference in axiomatic preferences, you can't deal with that through logic. OTOH, you can identify it through logic, and in particular you can pierce the common rhetorical attempts to conceal it once an actor has determined that such a difference exists and wishes to manipulate another party despite it, using the tools of logic.

> Logicians can disagree all they want,

But they don’t. Literally nothing you say prior to this in your post (and nothing after it except for the last sentence) is even slightly controversial. And while there may be people that meet the description in your last sentence, you seem to miss the point of what the understanding of logical fallacies and the application of that knowledge are about as badly as the people you are attacking.


>nothing you say prior to this in your post (and nothing after it except for the last sentence) is even slightly controversial.

And yet, the philosophy of the past 2-3 centuries (Anglo-Saxon especially) is full of counter-examples who think the opposite - and there are tons of philosophically naive people today as well.


This. (And even some of those beliefs which can be determined to be true/false aren't determined by representing my beliefs as a logical syllogism. If you want to evaluate whether my belief about the book being on the table is still correct, it's probably more productive to look at the table than to evaluate whether my argument depends on implicit premises like my eyesight having been reliable and nobody moving the book in the last five minutes. And if I'm wrong, it's probably memory failure or knowledge gaps rather than faith in the premise that the book could not have been moved.)

And some of my beliefs which are pure logical fallacies when it comes to formal debate like 'Brain Force Plus is unlikely to be a useful nootropic because the man promoting it, Alex Jones, is a mountebank' are actually pretty useful heuristics.


Fallacies are your failure to apply scientific method. The fact that you are aware of them doesn’t change the fact that you don’t know how to use it.

It’s like a blind person learning that something isn’t red: they’ll continue guessing until they come up with the right colour.


> Fallacies are your failure to apply scientific method

No, they are mostly failures to apply sound abstract logic.


While not wrong, this seems less than helpful. Not all of us have the equipment/context to replicate experiments.

Such that the existence of a fallacy does not mean that something is false. Assuming it does is itself a logical fallacy.


Logical fallacies are just glorified cargo cults


This is why I love Greek philosophy so much. They were keen observers of the human condition, but they didn’t bind it all up in (formal) logic. Their philosophies were aimed at being practical life philosophies. How to live a good life (according to the philosophers). I find much in stoicism to be of particular use in modern life. CBT, cognitive behavioral therapy, has many similarities with and could even be viewed as strongly influenced by stoicism. CBT really works for a lot of people and is one of the most successful therapeutic approaches in modern psychology at improving people’s outlook and fighting depression.


I don't understand, so you just pick beliefs and ignore that the reasoning behind them may be fallacious? A fallacy might not point you towards the truth, but it can identify when something is not the truth.


A fallacy can not identify when something is not the truth. A fallacious statement is not a necessary result of the argument that led to it, but it can still be completely true.

There’s even a fallacy about it! https://en.m.wikipedia.org/wiki/Argument_from_fallacy


This isn't me saying how people should pick their beliefs. This is me saying how people do pick their beliefs. It's an exhortation to know thyself, and to let go of the delusion that you're different. If you're stuck in the illusion that your beliefs all follow logically from some carefully-chosen first principles, you have a muddier view of the world (and yourself) than someone who hasn't put the least thought whatsoever into choosing their beliefs.


You can observe the world. I'm assuming you're not locked in a room somewhere and handed a set of first principles that you have to use to reason out what the world is all about. You adopt beliefs that explain observations and successfully predict future observations. When beliefs fail to do that you abandon or refine them. It's in this process of refining beliefs that you watch out for fallacious reasoning to try to ensure that you are correctly understanding what you observed. I don't know anyone who just picks some first principles and goes from there. I agree that seems silly.


Striving for a single “The Truth” seems like a fundamentally losing battle when I’d have to convince 8+ billion other humans to agree with me in every aspect of how I see the world, but an entire world that could fit in my single viewport would be a very narrow world indeed.


I agree, striving to find a single great Truth and convince everyone of it does sound like a losing battle. Luckily this is neither desirable nor necessary.

Ascertaining the small-t truth of individual beliefs is useful insofar as it enables you to better achieve things you want to do with tangible results in the real world. There is no need to convince anyone of anything in this endeavor, because they can observe the results for themselves.


I've read an interesting opinion on this in the past, something like: trying to improve one's argumentative skill by studying fallacies is like trying to improve one's mastery of the violin by studying examples of poor technique.

There's value to it, but it's not a way to reach true greatness in a field.


It's probably not the path to greatness. I'd say it's more a way to have a basic foundation for thinking.

The thing about fallacies is that they can be persuasive. But if you recognize them for what they are, you can defend against the error.

To extend your metaphor, very few people will be masters of the violin. But it's pretty cool to play it anyway.


> At this point I basically view preoccupation with fallacies, presenting arguments in formal format (premise, premise, conclusion), etc. as a proxy for immaturity. That isn't how people are persuaded, that isn't how people ever were persuaded, and that isn't how people ever will be persuaded.

Escaping the rhetorical tricks that facilitate persuasion independent of the merits of a position is actually the point; the approach you are attacking is a tool for doing this for abstract systems of ideas, empirical science provides the same thing for fact claims.

> Western philosophy has been incredibly effective at tearing apart everything that exists, but utterly ineffective in building anything back up from the ashes.

The practical knowledge and technology built on the back of empirical science—a system itself built on the back of Western philosophy, and specifically an outgrowth of the exact aspect of Western philosophy you are attacking—is more than a little “anything” that Western philosophy has provided the tools to build up from the ashes of the self-delusion it tears down.


Perhaps the value in philosophy is destruction. As you say our brains already build up beliefs, why do we need a formal system for that? What we don't do naturally is re-evaluate those beliefs in the presence of new information.


Popper was onto something with putting everything under criticism.

And yet, as every piece of computer hardware needs an OS to be more than a brick, one yearns for an explanation of what "is" as badly as "what is not".


The intent is something more like studying the chess tricks and traps books[1]. Humans learn new things by exploring the boundaries of those things. This is as true for secondhand stories of explorations as it is for firsthand experiences. I think it's just humans a-human-ing. Nothing immature about learning.

[1] There are a bunch of them. Here's more on one: https://www.goodreads.com/book/show/8051867-the-greatest-eve...


Picking apart someone's belief through explaining their logical fallacies presupposes that they got to their belief through a logical process.

"Reasoning will never make a Man correct an ill Opinion, which by Reasoning he never acquired." -- Jonathan Swift, 1721.

To rephrase in a more modern way: you can't use reason or logic to get someone to stop believing something they didn't use reason or logic to start believing.


I always like the short version: "you can't reason someone out of a belief they didn't reason themselves into"


That’s ironic because haven’t you presented your argument in a formal format here? And attempted to build something for a coherent and effective belief system? And passed on sharing the relatable story that would allow others to understand how you’ve come to believe what you’ve outlined?

Just because you haven’t been able to access any truth through philosophizing doesn’t mean it’s not possible.


The limit of your vocabulary is the limit of your world. If you don’t know about fallacies then you don’t truly understand their existence.


Maybe a more conversational approach where

- trust is established,

- a relationship built, and

- questions are asked in a sincere effort to trigger reflection

could produce better results. Takes more patience and effort than we typically invest these days.


How do you really propose to establish trust though? Based on what?

Trust requires the initial establishment of some reason to believe the other person, and we're in the middle of discussing why we don't.


"Conversational" implies dialogue.

We're by no means the same, but the Venn has substantial overlap.

By the time we speak to people, we often find them far less monstrous than we imagine, even if their traditions differ.


The solution, I think, lies in systems thinking. “Premise, premise, conclusion” can rightfully be used to represent (better: project) a perspective of a frame of a system.

Simple causality doesn’t do justice to the intricate complexity of existence and the patterns that govern it, but you can surely map systemic observations, much like you can project a snapshot of a 3D room onto a lower-dimensional surface. Just don’t act like that’s the whole truth, because it isn’t, and it won’t be.


They're all just buzzword shortcuts for rapidly dismissing people and arguments we don't like. If we're going to hold arguments up to the standards of pure logic during any mildly subjective debate, one can only conclude that both opponents are wrong and always will be. All that's left after that is the political game of winning over the audience, which is almost certainly the reason the debate was held in the first place.


For what it's worth, I don't identify preoccupation with identifying fallacies, or with rhetorical classifications like "bulverism" or "steelmanning" with philosophy. At least, if you take a degree in philosophy I think you'll find that it's very much not like this. In my experience, this approach is much more characteristic of online forum argumentation, and subcultures like new atheism and rationalism. (Case in point: I mean "rationalism" in the LessWrong sense, not the Cartesian sense!) I agree with you that the style is irritating and superficial.


Looks like you're looking for this: https://en.wikipedia.org/wiki/Argument_from_fallacy

That is to say people assume that logical equivalence is the same as logical entailment.
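The entailment-vs-equivalence slip can be made concrete with a brute-force truth table (a small illustrative sketch, not anything from the thread): "p implies q, and q holds, therefore p" is an invalid form, yet p may still happen to be true.

```python
from itertools import product

def implies(a, b):
    # Material implication: a -> b is false only when a is true and b is false.
    return (not a) or b

# Does "(p -> q) and q" entail p?  Search every truth assignment for a
# counterexample where the premises hold but the conclusion fails.
counterexamples = [
    (p, q)
    for p, q in product([False, True], repeat=2)
    if implies(p, q) and q and not p
]

print(counterexamples)  # [(False, True)]
```

A counterexample exists, so the inference is fallacious; but in the other row where the premises hold (p and q both true), the conclusion is true anyway, which is exactly the argument-from-fallacy point: an invalid argument proves nothing either way about its conclusion.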


Well, I for one encounter this fallacy constantly, and discussion would get a net boost if all participants recognized and avoided it.


Pieces like this make me want to find the original books and read them, I wonder how many more insights I could find there. C.S. Lewis seems to resonate particularly well with me and with how I reason (same holds for Chesterton).

I think Lewis is right in blaming Bulverism. So I asked myself: can I identify a way in which Bulverism originates? How can I avoid it, and how can I suggest that other people avoid it? Sorry for the excursus, but I hope someone will find it useful :D

It's very interesting, to me at least, to notice that getting "completely" rid of bulverism seems very difficult - perhaps impossible? Perhaps because it's part of how humans start the process of knowledge.

Let me explain...

Reasoning can go, a priori, in several different directions. Which one to pick? From a practical point of view, you pick the one that stimulates you the most: and this way, someone becomes a mathematician, someone else an engineer, someone else a poet, a doctor, and so on.

This movement has both some bulverism and some anti-bulverism: you hear someone speak, and a-priori, you feel like he's right (is there a name for this fallacy too?).

And I think that starting the process of knowledge with some pre-conception is unavoidable and also what makes diversity possible. So when does a safe practice become a threat?

I think it's when we lose faith in the positiveness of reality, that is: in the possibility of knowing what's true, and in the idea that this knowledge is what really makes us more human.

Bulverism is saying that I value more what I think than what's true, or that I value more to blame you than to affirm the truth.

It's interesting to see in how many ways reasoning can go wrong once you exclude that reasoning has its natural end in the knowledge of what's true.


One of the highest utility-density pieces of advice I've ever gotten:

"Assume positive intent. Ask lots of questions."


This doesn't protect someone (or a whole society) from bad actors who know how to exploit the assumption of positive intent. This might work for arguing with Aunt Tillie at Thanksgiving dinner, but it's a loophole actively exploited by people in power whose motives run counter to what's good, in general, for society as a whole. See: tobacco executives, climate change deniers, those who profit off anti-vaxx sentiment, televangelists, etc.


Engaging people and meeting them where they are, with honest discussion, can sometimes turn an ardent detractor of something positive into an ardent supporter of something positive.

Can't win if you don't try. :).


"Ardent detractors" aren't the people I'm talking about. I'm talking about bad faith actors who understand exactly what is harmful about their position and are choosing that position anyway, because their incentives are aligned with maintaining the harmful situation. These aren't people you can engage in good faith with precisely because they will use your assumption of good faith against you.


Works well with most people, but not with the >1% who actually don’t have positive intent.


Climate denialism comes to mind here:

https://www.youtube.com/watch?v=RLqXkYrdmjY


That becomes real clear once you've asked enough questions.


...and then you'll have to ask all the same questions with the next person in that group, or maybe even the next time the same person talks, because an intentional part of the tactics involved is to waste the time of people who assume positive intent.


I see that primarily in heated internet conversations that touch on identity and ego, such as politics and religion (programming language/technology religion included!). On such topics, there very often is bad faith, perhaps more often than not (feels like it!). In such cases, trying to pinpoint the intentions of every participant in every single interaction is exhausting and typically pointless. The solution is to avoid those topics like the plague in the first place, if you have a choice, unless it's with people you trust. In most other cases, you can (and should) productively assume good faith. Usually.


Yeah, but then the problem is that politics, the topic that invites a ton of bad-faith tactics, also has tons of real-life repercussions. And if you just try to ignore it because of the bad-faith stuff, then the people with the bad-faith arguments are more likely to get their way. So it sucks to engage, but there’s no other great choice.


Good point! It's tough: what's good for the individual may not be best for society at large. Thanks to those who do endure and engage in a positive way with the genuine intent to make a difference. It's surely not easy in the environment we have today.


How is it clear that people are intentionally wasting the time of those who assume positive intent?


I think the existence of trolling is the prima facie case for this. Unless you believe that trolling doesn’t exist.


True -- my default is to assume that people are not trolls unless they are repeatedly inflammatory and unwilling to engage in meaningful discussion of any kind.


Intentions are never ‘clear’, but that doesn’t mean they aren’t a factor.


To you, presumably, but not necessarily to others.


I like how you framed it, giving the lower bound only.


"Some arguments are valid and some conclusions true, regardless of the identity and motives of the one who argues them."

Yes I know, "both sides", but this has been the biggest challenge I've had speaking with progressive friends. Unless I agree that someone's beliefs/opinions/arguments are totally invalid because of who they are or who they voted for, I'm also part of the problem.


It sounds less like a dispute about the objective validity or soundness of arguments, and more like a dispute about which side you’ve chosen in a conflict.


> Unless I agree that someone's beliefs/opinions/arguments are totally invalid because who they are or who they voted for, I'm also part of the problem..

Look, we've got a bunch of people attacking the foundations of our democracy.

If you voted with them, yeah, you've got a pretty steep climb to convince me that you're not part of the problem.

"Shallow understanding from people of good will is more frustrating than absolute misunderstanding from people of ill will. Lukewarm acceptance is much more bewildering than outright rejection."--Rev. Martin Luther King Jr.


Somebody can be part of the problem and also be right about something.

Call it "accidentally right" if you need to deprive them of the merit of being right. But let's not turn facts and logic into collateral damage of a fight that happens at a different level.


A stopped watch is correct twice a day--yet everybody would still throw it out.


Sure. But you won't say that it's not 8:30am when it's 8:30am just because a broken watch happens to say so.

Humans care about reputation. Our trust in people is based a lot on past performance and on what other people thought about that past performance. Thus, we also find it dangerous to let people get away with being perceived as competent in something where they aren't, because they may capitalize on that in the future, and their wrong ideas may be believed because you conceded that they had competence in the past.

However, it cuts both ways. If you want to expose somebody's incompetence, you surely don't want to behave in ways that are clearly dishonest, bending facts just to strip the other person of being accidentally right once or twice a day. That shortcut won't pay off. I've seen so many discussions between otherwise honest people derailed by this now-entrenched practice. Call out fallacies by all means. Give credit when credit is due, and move on to attack arguments that are worth attacking.


There isn’t a hill for people to climb, there’s a fortress you’re standing on top of with an assault rifle firing at them before they can say a word. The rise of this closed-off-to-discussion approach is legitimately dangerous.


You are judged by the company you keep.

Supporting people who "Stand by and stand ready" is "legitimately dangerous."

Supporting people who think kidnapping a governor is a good idea is "legitimately dangerous."

Supporting people who think inciting violence is perfectly okay is "legitimately dangerous."

Yeah, putting the supporters of such people in the "part of the problem" bucket is perfectly valid, thanks.

The majority of people in Germany didn't really support gassing the Jews. However, the majority of people in Germany also didn't stand up and oppose it either--and that's why we consider them "part of the problem".

“The only thing necessary for the triumph of evil is for good men to do nothing.” ― Edmund Burke


The main thing is that people do this because it works fairly well. Nobody ever says how to counter them. Arguably, even professional persuaders like politicians haven’t figured it out, because the most common response during elections or debates is just to counter-attack with similar tactics.


I think the best response depends on whether you want to persuade the person you are talking to or the bystanders. You could either call out the tactic or use something like non-violent communication.


I think most of the time this sort of tactic is used with an audience in mind, so presumably we would want to convince the audience. Does calling out “that’s just a bulverism fallacy!” really work?


Obviously you wouldn't want to use those exact words unless the audience is a bunch of philosophy majors. But for example, here [1] is a video of Chris Christie effectively calling out a somewhat similar technique from Marco Rubio.

[1] https://www.youtube.com/watch?v=CkdpzRDxTXU


You could try it out and see if it works. My gut says that it's good logic, but poor rhetoric. Maybe something like 'My opponent sees conspiracies everywhere. Let them prove my motives rather than assume them'?


The funny thing is that this article is attacking Bulverism by describing it as a fallacy, thereby proving that Bulverism is a useful mode of thinking.

I quite enjoy the Bulverism mindset. It's close to the bug-seeker or pen-tester mindset. If the code seems complex, you have to assume there is a mistake somewhere. You put yourself in the shoes of the one who wrote the code and try to imagine where some complex interactions could be badly anticipated. To exploit, you have to understand what went wrong.

Seeking the truth probably can't be done in the absolute. It's probably a better idea to hold a variety of beliefs and update them with Bayes' rule as you gather information, while keeping in mind that other people may have received different information.
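The Bayesian updating mentioned here can be sketched in a few lines (a toy illustration; the prior and likelihood numbers are invented, not from the comment):

```python
def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Return P(H | E) from a prior P(H) and the likelihoods
    P(E | H) and P(E | not H), via Bayes' rule."""
    numerator = prior * p_e_given_h
    return numerator / (numerator + (1 - prior) * p_e_given_not_h)

# Start 50/50 on some claim; observe evidence that is twice as likely
# if the claim is true (0.8) as if it is false (0.4).
posterior = bayes_update(0.5, 0.8, 0.4)
print(round(posterior, 3))  # 0.667
```

Someone who received different evidence would update the same prior toward a different posterior, which is the point about other people having seen different information.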


The description there gets into assumed motive which is really irrelevant to the fallacy. The fallacy isn't in the assumption of error, but rather in attacking a characterization of the process by which the opponent arrived at their argument rather than the argument itself. That one might infer from someone engaging in that fallacy that they have also assumed the opponent wrong and sought an explanation is, well, a reasonable conclusion but to highlight that as the fundamental issue is, ironically, making the same mistake that is at issue with Bulverism: focussing on the process by which one gets to the argument rather than the flaws in the argument.


I think the original problem with all of this is that we see arguments as fights that we need to win. This is wrong. We should argue to understand each other better and to come up with better ideas than we separately could. I've written a longer piece on how fallacies affect how we argue: https://ochronus.online/how-to-stop-winning-arguments/


If you can get your hands on it, I highly recommend an old book by (Robert?) Thouless called: "Straight and crooked thinking", which identifies the form of several common disingenuous argumentation techniques, brings out the fallacy or dishonest rhetorical device employed, and suggestions on how to counter them (or at least spot them more easily).


Where you went wrong is in wanting a way to quickly and easily disqualify your opponent's objections, so you took out this concept of bulverism instead of looking at whether there may be some truth to it.


"You" meaning the person who submitted the link? Huh?


It's an ironic comment bulverizing the idea of bulverism. Who 'you' is doesn't matter so much. Might be the inventor of bulverism or the link poster.


Thank you for the clarification! Makes sense now.


Wow, so much of our political discourse makes sense now...


You only say that because you are desperate to believe there's more to politics than a villainous will to power


Excellent example!


Watching Rudy Giuliani's common sense right now, making a case for why Trump won, even though his chances of actually winning are now something like less than 1% on betting markets. No court case has gotten to the stage where significant evidence could be presented (they've been rejected for standing or other reasons). The media has all but inaugurated Biden. And it looks like the Republican party will turn against Trump too now. Trump is really a populist politician who somehow won the Republican party nomination, but he doesn't have backing in the institutions of government, especially the courts. He let most of his closest political allies be prosecuted and removed, or he fired them. And his support in the Republican party is also shaky; they would love to get rid of him.

Joseph Stalin said some version of "It's not the people who vote that count, it's the people who count the votes."

It's interesting to listen to Rudy make his points and argue his case for how the votes should be counted.

He's a skilled orator.

https://www.youtube.com/watch?v=NAzCECx8XeE&t=547s



