Why do we still teach people to calculate? (freakonomics.com)
30 points by amichail 5 days ago | 87 comments

Because being able to do basic order-of-magnitude arithmetic in one's head is a critical thinking tool essential to detecting nonsense claims in real time.

It'll also help keep you from getting ripped off, whether that's a cashier mistakenly giving you the wrong change, friends calculating a restaurant split in their favor, or a malicious business trying to screw you over (banks!).

Mental arithmetic is vital to mental self defense.
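
The "detecting nonsense" skill is essentially one rough division or multiplication done in your head. A toy illustration in Python, with an invented claim:

    # Claim: "our city of 1,000,000 people uses 10 billion liters of water a day"
    liters_per_person = 10_000_000_000 / 1_000_000
    print(liters_per_person)   # 10,000 L/person/day: implausible, since household
                               # use is on the order of hundreds of liters a day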


Interesting how, in those examples, malice is implied in the safest circumstances and mistake in the most hostile.

In the schools I'm familiar with, this isn't really a skill that's taught. The emphasis is on carrying out steps correctly on paper. Making a good estimate without showing work would definitely be frowned upon.

In Germany this was part of the physics curriculum.

It made sense since order-of-magnitude estimations constantly come up naturally in all fields of physics, whether you are asking "is this plausibly possible", "what kind of instrument do I need for this measurement" (and later "can we even measure this in this setup") or "are the results of my experiment plausible". There's a reason we joke that to the physicist g=10 and pi=3.
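
In that spirit, a typical plausibility check with g=10, using invented numbers:

    # A stone dropped from a bridge hits the water after about 3 s. How high is it?
    g, t = 10, 3                 # the physicist's g
    h = 0.5 * g * t**2
    print(h)                     # 45 m, so a claimed height of 400 m is nonsense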

It's difficult to test this kind of thinking in a written test without turning it into something entirely different, but not everything in school has to be on a test. Typically half of our marks were made up from classroom participation.


Once you learn how to use a slide rule to estimate a calculation to just one or two significant digits, you no longer need the slide rule.

Taking the steps multiple times (practicing) will make it fast and reliable.

Guessing may get you a part of the way if you're very good at it, but repeatability is what really hammers it into you. It also gets you out of trouble when, for any reason, your guessing instinct is not at full steam (when tired or otherwise incapacitated, anxious, etc.), since you can usually fall back fairly quickly on the steps you have drilled.


It's also just useful. If I and my coworkers had to constantly stop to pull out a calculator (or an equivalent app) to compute anything, we'd never get anything done.

Precisely. It is critically important to develop the correct cognitive abstractions.

Because you need to be able to run numbers in your head for a lot of practical tasks, and inability to do napkin math can cost you very dearly.

This isn't just folk wisdom either. There's an active subfield of cognitive science that focuses on working memory/cognitive load and math instruction. Broadly speaking, it's hard for students to successfully transition to even basic algebra if they don't have basic calculations down.

That fits with my non-expert intuition.

If you don't get the basics of arithmetic it is incredibly hard to grasp the idea of "equals" as a pivot point. Division by hand helps to give us a hint that numbers can be broken up into constituents and still be the same.

You internalise a whole set of behaviours, typically via repetition, and it unlocks understanding beyond itself imo


It's all about building higher and higher abstractions: Arithmetic -> Algebra -> higher-order algebraic algorithms.

For example, when I taught differential equations, my non-scientific observation was that otherwise smart students struggled more with getting the algorithms down if they struggled with algebra. Having to constantly jump down to a lower level of abstraction while performing the algorithmic steps is what led to confusion and mistakes.


And calculators get broken very quickly on construction sites. And you must be capable of adding and subtracting and finding the LCD and GCD of fractions to read a tape measure in US customary units.

If durability is the chief constraint, and tape measures are adequately durable and already on-hand for the work being performed, then:

The tape measure itself can be used to perform basic arithmetic, and will even return results in Freedom Fractions if that's how it is labeled.


There must be ruggedized calculators out there, so that argument doesn't seem very compelling. People use their phones at construction sites, and you can run whatever type of calculator you want on a cell phone.

If you run every basic calculation through a calculator, you'll make every task take 3x the time it requires today, in the best case.

I have seen people eyeball measurements, take notes, and bring back perfectly manufactured goods (blinds, cupboards, etc.) in return. Especially in construction, nothing is precise, so being able to eyeball correctly after a quick measurement with a tape, and to do some basic math on the numbers, not only accelerates tasks but is a crucial pillar of being able to work productively on construction sites.


I just left a comment on a different thread about how detached from reality a lot of tech workers are, and then here I find a guy being like "but rugged calculators are a thing, why do you need to do math in your head?"

Tell me you have never built a thing[1] in your entire life more complex than an IKEA bookshelf, without telling me. I do hobby projects all the time and I couldn't even tell you how many mental-math problems I do per hour while doing so. It's a LOT. Depending on what I'm making, I might well spend more time doing math in my head and on paper than I do actually putting tools to materials to build the thing.

[1]: by this I mean actually building. With your hands.


Hell even installing shelves in my closet had me doing trigonometry on a scrap of printer paper to figure out how close I could space together the shelves and still be able to get the boards through the door frame.
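
For flavor, here is one plausible shape of that calculation, with invented dimensions (not the commenter's actual numbers): tilting a board into the gap between two fixed shelves requires clearance of roughly the board's diagonal.

    import math

    # Can a shelf board be tilted into the gap between two already-mounted shelves?
    depth, thickness = 30.0, 1.8          # board cross-section in cm (invented)
    gap = 31.0                            # vertical clearance between shelves, cm
    diagonal = math.hypot(depth, thickness)
    print(round(diagonal, 2), diagonal <= gap)   # 30.05 True: it just fits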

Calculation is one of the core skills of human cognition and is correspondingly foundational for just about all of human enterprise. Not doing it means limiting your ability to participate in said enterprise.


I never said you don't need to do math in your head. I said that the argument "calculators are fragile, so you have to do math in your head" doesn't hold water. You're arguing something different, which I tend to agree with.

> and finding the LCD and GCD of fractions to read a tape measure in US customary units.

Well, that's what tolerances are for. I don't need a perfectly accurate answer. Just one that's within 1/32" or more likely 1/4" to 1/8".


I don't see what tolerances have to do with it? You still have to do math if you have to site a member between two other members, regardless of the tolerance. For example, to frame an 8-foot wall you have to cut your studs down to accommodate the top plate and bottom plate and the trim plate above the top plate. Similarly with windows, doors, and headers. Whether you have a 1/16 tolerance or a 1/8 tolerance, it doesn't matter. The only time I've had to deal with tolerances when framing is in making sure the drywall doesn't wave, by furring out or planing down each stud to bring the wall into tolerance.

The idea that you have to do a full LCD or GCD to work with fractions in engineering or building is not rational.

What is going on with your tape measures? Confused European over here.

Customary units are given in fractions of an inch, e.g. half, quarter, eighth, sixteenth, etc. So you're dealing with simple fractions. That is why I buy tape measures that also have a metric scale: to sidestep all that bullshit entirely when adding, multiplying, dividing, or subtracting, yet still be able to measure distances in inches.
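
For the curious, Python's fractions module does exactly the common-denominator bookkeeping that tape-measure arithmetic needs (a minimal sketch, measurements invented):

    from fractions import Fraction

    # 3/8" + 5/16": by hand you'd first rewrite both over the denominator 16
    a = Fraction(3, 8)
    b = Fraction(5, 16)
    print(a + b)   # 11/16
    # the metric version is plain decimal addition: roughly 9.5 mm + 7.9 mm = 17.4 mm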

The decimal point was discovered very late in American mathematics.

I keep a fractional-inch tape measure in the toolbox in my work truck because that's what everyone else (in the US) has, and I need to be able to work with others. I also have to carry around two sets of wrenches, two sets of sockets, two sets of Allen keys, and it's all quite frankly absolute bullshit. But that's the world I get to participate in.

At home in my own geeky little workshop, where I don't need to satisfy anyone else's proclivities, everything is exhaustively and exclusively metric-only. My tape measures are metric, my hand tools are metric, my drills are metric, and my fasteners are metric. Working with Freedom Fractions is absolutely forbidden in my workshop (as are all screws with Phillips, JIS, Pozidrive, or Reed-Prince heads, but that's a different rant).

And, I must say: It's better this way.


> screws

Torx?


For screws that go into wood (or thereabouts; sheet metal screws can count here too), Robertson is first on my list. The slightly tapered interface fits snugly and tends to hold screws onto a driver bit very well while providing good angular alignment, and it can do all of this without needing magnets. It can also allow for a bit of angular misalignment when necessary, and the chonky squared-off corners and flats resist cam-out/stripping rather well. I'm not even Canadian and I use Robertson wherever I get a chance.

Second general choice is hex-head screws. These snap into inexpensive shallow magnetic driver bits with a satisfying click. But they're ugly and kind of rough once installed, and they're not available in flat-head versions. (Hex-head shoulder screws are my only choice for drill-point screws: They maintain positive angular alignment by default and that's crucial for drilling holes in metal.)

Torx is fine too, I suppose. I don't have an anti-Torx rule in my workshop, but I try to avoid buying them. They don't tolerate angular misalignment as well as Robertson, and they don't maintain positive angular alignment like a hexagonal shoulder screw does, and they don't stay put with friction like Robertson or snap onto a magnetic driver bit like hex. There's lots of stuff Torx is not very good at doing.

Theoretically, I can probably put more torque into Torx than any of the other options listed here, but I don't find that to be a practical advantage in this kind of application: When I can already drive a Robertson screw through a chunk of old-growth wood, I don't need to improve that part.

---

For machine screws, I've standardized on stainless steel button-head socket cap screws as a first choice. I've got a deeper selection of them than a good hardware store does, and they're all sorted. (Why stainless instead of graded? Because I don't want them to rot, whether sitting in a bin for decades or used outdoors or whatever, and I do not want to stock more than one kind. My fastener collection is crazy enough without also multiplying it by different grades.)

There's other stuff, though, too. For instance: Regular socket cap screws have their place -- it just isn't first place.

And at the scale of things I build, I mostly use M3, M4, and M5.

I buy regular-length Bondhus non-ball hex keys in simple bulk packaging to fit my standard M3, M4, and M5 machine screws. They're very high quality tools, and they're rather inexpensive in bulk. And thus, it is no big deal if I misplace one while I'm working -- I've got more on-hand, and I'm not afraid to get more coming if stock gets low.

Bondhus makes a decent-quality hexagonal screwdriver, too, and these are nice to keep around because the design is not like the gigantic T-handled abortions that so many other manufacturers sell: It's just screwdriver-shaped, and it works just like a familiar screwdriver does -- but for socket-cap screws! This was the discovery that allowed me to completely abolish Phillips screws forever from my workshop.

I've also got sets of hex keys -- of course I do. Long, ball-end, plain, whatever. I try to avoid cornering myself into situations where these non-standard variations would be useful to begin with, and the long versions are mostly only useful for disassembling stuff that someone else has overtorqued. (But overtorqued fasteners are a different rant.)


I mostly use screws for metal and plastic, so I use metric hex as well. Therein lies one disadvantage of the metric system: M3 is often too large, M2 is too small, and M2.5 is not super common. I use 2.5, so it’s only a minor annoyance.

The metric system has uniform computational cost whenever you add/subtract or divide any two lengths.

The inch-foot system makes a tradeoff: easier computation when your lengths are certain combinations of simple fractions, but harder when they're not.


Yes, we should start my six-year-old with group theory. When we go to the farmer's market I can ask him under what transformations an apple is invariant, and then pay for it by having him solve a Rubik's cube.

A pile of apples (or a stack of one dollar bills) would make a pretty natural introduction to the Peano axioms, and from there the basic mathematical operations.

- A pile without apples is a pile of apples. We call it the empty pile.

- The successor function is to add an apple to the pile. Apply the successor function to the pile of apples and you still have a pile of apples.

- A pile of apples is as large as itself

- If the left pile has as many apples as the right pile, then the right pile also has as many apples as the left pile

- If you have two piles of apples you can add them by checking if the second pile is empty. If it is empty, you are done and the result is in the first pile. If not, take an apple from the second pile and add it to the first pile. Then check again and repeat.
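
That last rule is directly executable. A minimal sketch in Python (hypothetical names, just to make the point):

    # A pile is either the empty pile or one more apple on a smaller pile
    class Empty:
        pass

    class Succ:
        def __init__(self, rest):
            self.rest = rest

    def add(first, second):
        # the rule above: move apples from the second pile until it is empty
        while not isinstance(second, Empty):
            first, second = Succ(first), second.rest
        return first

    two = Succ(Succ(Empty()))
    three = Succ(two)
    five = add(two, three)   # five applications of Succ around Empty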

And then you can teach him about the bijection between money and apples. After a little more set theory you can start constructing two piles of apples, one representing what you have, one what you have to give away ...

The possibilities are endless.


Not sure if serious or you are the supreme master of deadpan humor. Or both.

You joke, but it's become a thing for kids to do Rubik's cube. I think it's great for algorithmic thinking. My kid has a whole bunch of classmates who spend their time cubing, going from one algo to a more advanced one, discussing with their friends.

When I took Abstract Algebra 1, I had to sit in the front row because I was too cheap to buy glasses.

I learned about commutators from a guy on the front row who was really good at Rubik's cubes. Concrete examples relatable to your hobbies are great.

I also tried to pay for my lunch by commutating, but the lunch lady wouldn't have it.


You could have him find the roots of the tree it fell from (with multiplicities).

To me this seems very ill thought through, and it doesn't feel like they actually remember what math was like. They are suggesting letting the computers do the computation part, but we already do that once that skill has been mastered; i.e., simple calculators are allowed in calculus exams.

The problem with bad ideas in education is that the system might be stupid enough to follow through with them, like that new-math nonsense.


I spent years having a negative opinion of that same "new math nonsense"; my only experience with it being viral examples of its worst excesses. But then I had kids in elementary school and saw it up close every day. It was weird and frustrating to me as a child of the 80s, but eventually I came to see that it was teaching approaches to problem solving that I was intuitively using myself. There's still rote memorization involved, but that's not the primary focus. As a result, I think it has positioned my kids better than I was at their age.

I think there is a huge problem when parents or other adults can't sit down with the kids and help them with their homework. Whoever thought that was a good idea should be barred from all education decisions, ever.

It's interesting to discuss mathematical literacy with primary school teachers, mathematicians, statisticians, data scientists. I do periodically.

Primary school teachers respect core fundamentals as kids acquire "mental muscle memory" and realize they have to both create some axiomatic knowledge (axiomatic inasmuch as you know 9x9 is 81 from rote recall, not because of a belief in inductive reasoning) and begin an uplift to reasoned knowledge (that 2^2 x 2^2 is 2^(2+2) is 2^4), plus some coding/transcoding (1/2 == 0.5). Cuisenaire rods come in and out of fashion. Crows can count. Kids are sometimes dumber than crows.

Mathematicians are very much in Davis/Hersh "what is mathematics anyway" territory. I believe Hersh noted that you can be in a field where only 3 other people worldwide can talk to you cogently about your work, and peer review is meaningless.

Statisticians are very comfortable that approximations work, but are at times less concerned with accuracy, and very much concerned with methodology. I've had quite remarkable conversations with them about sample size, and how UX people can survive on 5 responses. I can never predict which side of the problem they're going to come down on.

Data scientists are almost intuitive at times. Sometimes the reliance on codified knowledge (numpy/pandas) and a belief in a p-hacked value or an obvious Excel error is frightening. I think they divide, sheep from goats, into the numerate and the highly visual.

I consider myself semi-literate, mathematically speaking, but in fact I stumble over basic arithmetic all the time, and I struggle with the ideas behind complex numbers and trig. I have to re-prove things which should be known, re-induce belief in things which are based on inductive reasoning; I question commutativity all the time. How the hell can 2 x 3 be the same as 3 x 2? There's a fundamental left-right ordering in my brain, and at times I ask myself whether it also exists in Farsi or Hebrew or Thai or boustrophedon text flows, suggesting that not all right-to-left ordered languages obey it; yet alas, I do.

I also still don't entirely understand why school focussed on trig so much given that very few of us are navigating by sextant. I suspect at times it was dividing us into the ones which drink from the hand, and the ones which lap from the stream.

Do they teach decimal to octal and hex and binary in primary school yet? Will the world be different when the last of the duodecimal measurement learners have died?


While I can agree that in the later years, the math becomes unnecessarily prescriptive, as a mathematician myself, I am categorically not in the 'what is mathematics anyway' camp, at least not for kids.

For children, it is fundamental that they calculate. Learning arithmetic is the first introduction to the idea of infinity, which is a vital concept all children must come to terms with. The natural numbers pop up everywhere, and counting with them is a basic life skill.

Mathematicians and philosophers can pontificate later as to what the exact nature of mathematics is, and if the real numbers are actually real, but none of that means that kids shouldn't learn how to count.


Nothing I said, nothing whatsoever was meant to imply kids should not learn to count or acquire the times tables, and learn the fundamentals of mathematics we call arithmetic on the streets.

Nothing.


I don't think I said you did say that. It was a response, not an argument.

> I also still don't entirely understand why school focussed on trig so much given that very few of us are navigating by sextant.

Trig is useful for practical problems involving angles, and ubiquitous in applications in engineering and statistics. One year in school we spent a long time on logarithms and trig. Approaching 20 years later I consider it time well spent, and my one regret is that I didn't strive to understand it more deeply at the time, and had to revisit some details later (mostly about logarithms, but same category IMO).


> Mathematicians are very much in davis/hersh "what is mathematics anyway"

This reminds me of getting my CS degree in the early 2000s, then slowly realizing that my comp-sci program had taught me lots of interesting things about how computers and programs work, and essentially nothing about how to write code for a living.

Also reminds me of playing Dominion (the card game) with another student, me saying "Okay, I've got 1 and 2 and 2 and 3, that's 8..." and him gasping "Wow! How did you do that so fast?"

Dude was a math grad student. I narrowly restrained myself from saying "Well, you see, I finished the 3rd grade..."


I got my degree in the mid 90's and let me tell you - high school guidance counsellors in the 1980's had no clue at all what computer science was.

I arrived at university expecting to be writing a lot of programs and was soon struggling with a heavy load of mathematics and proofs. Our introduction to computer programming professor made it clear that that class was the only one where we would be taught to program. Computer science, he said, was mostly done with paper and pencil.

It wasn't quite true. We eventually did a lot of programming, but it was nothing like I was expecting.


> high school guidance counsellors in the 1980's had no clue at all what computer science was.

My problem was that the fucking college counselors didn't know either.

When I signed up, I told the lady I wanted to be a computer programmer, and she said "You'll want computer science, then," and that wasn't true. The CS program was only interested in teaching me how to be a CS professor. A few years in I was scouring the course lists for classes on graphics, web, anything with a GUI even, and there was nothing. It was 100% command line C++, and they didn't even bother teaching us about IDEs and debuggers. Just Telnet, vi, g++.

It was like taking a pure math degree to become an architect.


> Just Telnet, vi, g++.

Those have stood the test of time far better than any specific IDE or debugger.


That doesn't necessarily make them a great choice for teaching.

A '57 Chevy has stood the test of time, but a modern car with an automatic transmission, power steering, power brakes, and air bags is probably a better choice for a driving school today.


The problem with actually learning to code in school is that you'd spend a lot of time learning specifics that become outdated, which you could've self-studied anyway. Idk what it was like in the 1980s, but in 2015 our CS practice problems and tests were mostly pencil+paper, with actual code only used as a teaching tool in projects and some homework (more in lower divs).

That’s why I think the old SICP-based curricula are so great. Learning Scheme isn’t really the point. It’s all about fundamentals of computation.

We used Scheme too, it was a good way of teaching general computation.

> Dude was a math grad student. I narrowly restrained myself from saying ...

Do not show restraint, and neither will I. For I am a mathematician, not a fucking calculator.

But once, early in undergrad, I was capable of multiplying 4-digit numbers rather quickly. But tallying never seemed useful except while grading exams -- during which my brain populates a lookup table over the course of an hour, and then I can tally small sums without hesitation. But after that day of grading, the lookup table is flushed for more important uses of short term memory.


> Do not show restraint, and neither will I. For I am a mathematician, not a fucking calculator.

I don't expect mathematicians to be lightning calculators, but expressing awe at grade-school arithmetic is a bit much.


It sounds like your rapidity was found remarkable. Not everybody has that. Not all mathematicians have that. Perhaps respect was due.

At the end of the day, writing code is not difficult. Understanding what it's actually doing is the hard part. Moreover, understanding how to write code that reasons about code (and higher) is when you start to need computer science, or really just axiomatic systems (math).

> At the end of the day, writing code is not difficult. Understanding what it's actually doing is the hard part.

That sounds nice in the abstract, but in practice it's slow going trying to do web dev when your experience is nothing but command line C++.


It's really not.

So you are typing Javascript into your browser's console to see what it does instead of a text editor and compiler. The iterative process is still very similar. Write code -> get the syntax correct -> see if output matches your mental model -> repeat.

You still want unit testing or some other system of reliably testing your code and detecting regressions. You need to use a distributed version control system to track your work over time and collaborate with others. You need to be able to figure out why your code is suddenly taking much longer to execute than you thought it would. You need to figure out why the memory usage keeps going up and never comes down. You need to be able to gather requirements for the software you are writing.

All of those things are skills you need to learn regardless of the specific programming language.


> It's really not.

From personal experience, yes, it is. People spend years getting really good at Java, or Ruby, or SQL, or whatever. You'll get better quicker at all of those if you start with a solid grounding in basic coding principles, but basic principles alone are not enough.

Maybe you're one of those 10x programmers I keep hearing about who can master any subfield instantly. That's lovely for you. Most of us do better with a bit of specialization, especially at the beginning of our careers.


I guess my honest answer is that web dev is very far removed in general from computer science and touches on design, marketing, etc, more than computer science. There's computer science in the back end and in the implementation of the front end, but like a significant portion of web dev is just aesthetics, not any kind of CS work. Perhaps there should be someone making that obvious at some point.

To see patterns in things and to extend from the known to beyond.

While numeracy is certainly useful, isn't that only the surface point? The depth in deeper learning (we all learn - Edward Deming - but learn what?) is in shaping the self and the internally and externally perceived worlds.


I have rarely skimmed over such a long interview with virtually no content.

Ok, we get it. Wolfram Research has fantastic and somewhat underappreciated products (Wolfram Alpha is far more impressive than LLMs!) and you want to cash in on the AI hype. But please don't take it out on innocent students.


There are two parts:

1. Basic math, e.g. long division. This is what all the commenters are focusing on, and I largely agree with folks - we need to keep teaching this.

2. HS / College math. E.g. memorizing all the rules to solve integral calculus equations. Or solving all those circuits in Intro to EE. I agree with Wolfram here. At this point, students should be mentally developed enough to start focusing on formalizing and reducing problems rather than rote memorization or hand calculations. It's good to go over all the rules and practice a few times. But to make a whole quarter of it seems a bit much.


He is making a really long-winded point I'm not sure I completely understand, so feel free to correct me if I am off here, but I could not really disagree more.

While there is value in emphasizing things like defining/abstracting a problem rather than calculating the result, the "algorithm" typically taught to elementary schoolers for long division is pretty much how you would naively/algorithmically program a division operator that can give a remainder. The multiplication "algorithm" given to me at a young age was what first gave me insight into the connection between addition and multiplication. Abstracting this away in education could not possibly be doing society, math, or computation any service whatsoever, at least not in the way (I think?) he is imagining.
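
For concreteness, the schoolbook long-division procedure really is the naive algorithm (a sketch, not any textbook's exact presentation):

    def long_divide(dividend, divisor):
        # walk the dividend's digits left to right, exactly as on paper
        quotient, remainder = 0, 0
        for digit in map(int, str(dividend)):
            remainder = remainder * 10 + digit   # "bring down" the next digit
            quotient = quotient * 10 + remainder // divisor
            remainder %= divisor
        return quotient, remainder

    print(long_divide(1234, 7))   # (176, 2), since 176*7 + 2 = 1234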

I do think there is value to asking whether a problem needs to be solved or redefined or abstracted in a different manner. I had a calculus teacher in college a long time ago who had a really hilarious way of emphasizing this - he'd often bury "trap" questions in his tests that looked really simple but would test your knowledge of a simple algebraic or trigonometric trick he'd only covered during lecture, and lack of knowledge of this trick would lead one down a hopeless 20+ page response to the problem, to the point where if you found yourself going off the rails with a complex solution, you knew you messed up somewhere. This forced me to look at the problem from a variety of angles to see if I could solve it in a more clever, general, or simpler way, or see if it could be rewritten (this was usually the correct way to solve it). I really valued that, even years and years later, but maybe that's not the kind of thing Wolfram is getting after in this interview.


This is why I think we'd be better off doing a load of olympiad questions. Those are exactly the kind of thing where you can practice the flash-of-insight type thing that often gives an elegant solution.

I think that learning some basic by-hand computation skills is useful. The question is where you draw the line. The current practice seems to be to stop at long division, which seems a reasonable place to stop. There are hand methods for more complicated calculations, such as computing square roots, that are not taught in school, which is probably fine. I don't think leaving out long division would save that much time, and I think it's a good mental exercise for kids to have to learn a fairly complex algorithm. If I could suggest one improvement, it would be that later advanced math courses return to long division and explain at a deeper level why the algorithm works.
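
One such hand method is the digit-by-digit square root, which chews through pairs of digits much like long division does. A sketch of the integer version:

    def hand_sqrt(n):
        # group the digits of n in pairs from the right, as done on paper
        s = str(n)
        if len(s) % 2:
            s = "0" + s
        root, rem = 0, 0
        for i in range(0, len(s), 2):
            rem = rem * 100 + int(s[i:i+2])
            # largest digit d with (20*root + d) * d <= current remainder
            d = 0
            while (20 * root + d + 1) * (d + 1) <= rem:
                d += 1
            rem -= (20 * root + d) * d
            root = root * 10 + d
        return root               # floor of the square root

    print(hand_sqrt(1521))        # 39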

Every time this topic is brought up I can't help but think about Asimov's short story: https://en.m.wikipedia.org/wiki/The_Feeling_of_Power

As someone with reasonably good mental math skills (120+ score consistently on https://arithmetic.zetamac.com/, default settings), I cannot really say it helps me much in real life, at least not on the surface. It sure is nice to be able to estimate some things accurately without having to resort to a calculator, but in no way would anything change in the way I think or the conclusions I make if I had to use one, so there might be some truth to what Mr Wolfram is suggesting.

Calculation requires practice, which builds intuition. Moreover, it's just more time-consuming to input things into a calculator. It's much faster to just have 8 / 4 = 2 in your head.

The only time I said "why don't we just use a calculator" in school was during the oddly specific focus on integration techniques in calculus (trig substitution etc). Seems like the problems were carefully set up to be solvable that way, but IRL you'd actually need a computer.

I hope they stop teaching people to calculate. Then I'll start up my career in fraud and embezzlement.

> Then I'll start up my career in fraud and embezzlement.

You're planning to become a politician?


I imagine you know more than enough math to already do that!

I know plenty of math, but having more ignorant victims will make it easier.

>Why Do We Still Teach People to Calculate?

I would say it is intended for everyone who cannot figure it out for themselves.


Because it’s easy to test and grade?

Because a computer isn't embedded in our heads yet. Wait, we have one - why not use it?

My parents used to force me to memorize my times tables and quiz me all the time.

Definitely paid off. Can calculate numbers in my head pretty easily. Definitely used to help with tips or commissions or whatever.


I stopped about a third of the way through, once machine learning cropped up (I'm biased against it, as I'd rather humans use energy to learn and practice than a computer do it, but I accept I might be missing something), but I support the first two points:

1. Define the problem and why we're going to solve it (in D&D terms, I'd say this is a wisdom check)

2. Translate the problem into math language (needs imagination and experience, which is an intelligence check)

2.5. Levitt noted the importance of interleaved practice; why practice a tool intensively for the test only to never use it again? Supported by the authors of Make It Stick: The Science of Successful Learning.

That all sounds better than "here kids, learn these tools out of context".


If we get hit by an EMP blast and get blown back into the 19th century, those calculation skills would make some of us rich.

As would a stack of paper books of logarithms.

I have inherited my Grandfather's pocket slide rule.

"because you live in a country where the primary purpose for you as a citizen is to go into debt, so you better know how to, at the very least, do financial planning"

Arguably, being able to do financial planning will keep you out of debt. At the same time, the number of people who understand compound interest is shockingly low - a great boon for credit issuers hoping to exploit you.
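
The exact calculation is a one-liner, which makes the widespread confusion all the more striking (illustrative numbers only):

    # $1,000 carried on a card at 24% APR, compounded monthly, for 3 years
    principal, apr, years = 1000.00, 0.24, 3
    balance = principal * (1 + apr / 12) ** (12 * years)
    print(round(balance, 2))   # 2039.89: the debt roughly doubles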

It would be fantastic if they taught it in school. They don't, though, which is all kinds of questionable.

When interest rates were 2.5% guaranteed for 30 years and the govt was running a 3.1 trillion deficit on 3.6 trillion of revenue, hell yeah we went into debt! 2020 was a good time to be able to calculate.


