Here - I'll explain it to you (and what atoms and particles and molecules etc. are). They're human abstractions. Particles don't behave like particles because the programmer sat down and thought extremely hard about how to create a universe with the greatest amount of variation, dynamics, and complexity using the least amount of resources (i.e. compute power). Instead of modeling individual point particles and their trajectories at every instant and at every point in space, it programmed the universe in terms of the statistics of those trajectories (i.e. the universe only computes probabilities and probability amplitudes for quantized events). Humans think of this behavior as being modelled by some sort of unified 'field', but just like anything else this is also an abstraction.

The dynamics and edges are there to make further information compression possible, which plays a vital role in the universe's evolution (i.e. the universe is an infinitely recursive function, and the life forms within it are vital in enabling it to perform recursive compression of its state space transitions and to optimize its own information exchange dynamics).
Not really. Everything in the human brain is actually an abstraction. The image or 'vision' of what's 'out there' is just a computation done by the human brain. It's a 'holographic' 3-dimensional projection, shaped by many years of evolution, that only gives you enough information to survive. If you want to play that game, we shouldn't be having this discussion at all; but we are having a discussion, so it's not really coincidental. There is no mountain; it's information all the way up and down.
Not to be confused with modern computing, digital computing, or Von Neumann computing: computation is simply having states which change in a dependent way.
It's not even required that the dependencies be linear in time or discrete, except in the narrow case of Turing-computability.
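A minimal sketch of that definition, nothing more: a handful of states plus a rule that makes the next state depend only on the current one. An elementary cellular automaton is enough; the rule number and width below are arbitrary choices of mine.

    # Computation in the minimal sense: states plus a dependency rule.
    # Rule 110 is an arbitrary pick from the 256 elementary CA rules.
    RULE = 110

    def step(cells):
        """Next row of cells from the current one (periodic boundary)."""
        n = len(cells)
        nxt = []
        for i in range(n):
            left, centre, right = cells[i - 1], cells[i], cells[(i + 1) % n]
            index = (left << 2) | (centre << 1) | right
            nxt.append((RULE >> index) & 1)
        return nxt

    state = [0] * 15 + [1]
    for _ in range(5):  # each new state depends only on the previous one
        print("".join("#" if c else "." for c in state))
        state = step(state)

No instruction pointer, no memory bus, no clock in the Von Neumann sense, and it's still computation under that definition.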
Not that big of a mountain: for a thing to exist it has to be implementable, and only computable things are implementable; otherwise they would not “fit” into a universe.
And yet we know that most* functions aren't computable. Are we saying most of mathematics doesn't exist? If that's true then how can we talk about it?
I'm asking these questions rhetorically, but they're serious questions that need to be answered (or at least attempted) to maintain even a pretense of intellectual coherency.
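For what it's worth, the "most" there is just the usual Cantor counting argument, sketched in standard notation:

    % programs are finite strings over a finite alphabet, so
    |\{\text{computable } f : \mathbb{N} \to \{0,1\}\}| \le |\{\text{Turing machines}\}| = \aleph_0
    % while the set of all such functions is uncountable
    |\{f : \mathbb{N} \to \{0,1\}\}| = 2^{\aleph_0} > \aleph_0

So all but countably many functions have no program that computes them.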
You certainly can make the constructivist argument that only the rationals exist and that real numbers and everything that builds on them are some kind of fever dream, but personally I've never seen any even remotely compelling exposition of that position. Maybe that's just a "me problem" though? I really don't know; I find that this kind of metaphysics pushes up to, and sometimes past, my cognitive ability.
Constructivism seems obviously* true to me, and these Aleph arithmetics do in fact seem like a fever dream, but I’m not qualified enough to provide the exposition you’re asking for. To me real numbers exist, but only as approximation functions bounded by available memory and compute, and not as members of an uncountably infinite set in some platonic meta reality.
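To make that "approximation functions" reading concrete, here's a rough sketch of one common constructive encoding: a real is just a procedure that, given a precision, hands back a rational within that tolerance. The interface and names are my own choices, nothing canonical.

    from fractions import Fraction

    # A constructive "real": not a completed infinite object, but a procedure
    # that, given n, returns a rational within 2**-n of the value it denotes.
    def sqrt2(n):
        """Rational approximation of sqrt(2) to within 2**-n, by bisection."""
        lo, hi = Fraction(1), Fraction(2)
        while hi - lo > Fraction(1, 2 ** n):
            mid = (lo + hi) / 2
            if mid * mid < 2:
                lo = mid
            else:
                hi = mid
        return lo

    print(sqrt2(10), float(sqrt2(10)))  # a rational within 1/1024 of sqrt(2)

Nothing uncountable ever has to be held in hand: only a finite program and finitely many rationals.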
Check out the works and interviews of Joscha Bach if you haven’t already, he’s influenced my thinking on this quite a bit.
*obviously not in the dismissive sense, but in the sense of “we hold these truths to be self-evident”
We don’t even get to “the reals” before this issue crops up though. Irrational numbers were rigorously studied before infinity and the reals, as far back as 500 BC.
I’m not a computer scientist so I may have a gap here, but demonstrating that two quantities are incommensurable (showing that no unit makes up the two quantities m and n times, with m and n integers) does not seem like something that can be approximated empirically or computationally in many cases. The precision required may be one step beyond your current capacity.
Constructivism may be right. But I don’t have a good argument for why finished computation or empirical approximation (there are always limits to measurement) should be the be-all and end-all. Unless we take them to be the final adjudicators, why shouldn’t there be incommensurable quantities? We need very strong arguments that they provide the final say, but we know they have limits: their capacity and memory.
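For reference, the classical demonstration that the side and diagonal of a square share no common unit is a finite proof rather than a measurement; in modern notation:

    % suppose diagonal d and side s were commensurable, d/s = m/n in lowest terms
    \frac{d}{s} = \frac{m}{n}, \quad \gcd(m, n) = 1
    \;\Rightarrow\; m^2 = 2 n^2
    \;\Rightarrow\; 2 \mid m
    \;\Rightarrow\; 4 \mid m^2 = 2 n^2
    \;\Rightarrow\; 2 \mid n,
    \quad \text{contradicting } \gcd(m, n) = 1

It takes finitely many steps and no measurement at all, which is perhaps exactly the kind of thing the disagreement is about.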
Computation isn't a human concept; it's as natural as mathematics. It's something to be discovered, and it can only work in certain ways, as defined by nature.
Sorry, but I disagree. Dirac disagrees here as well. We discovered the positron from mathematics, not the other way around. Many other discoveries have been made through the application of mathematics (check out AlphaGo and ChatGPT).
Mathematics being an exceptionally useful human tool is compatible with mathematics not being "natural". Must hammers grow on trees?
It is also worth pointing out that the basic mathematics of deep learning are quite old and relatively simple. It was actually the technological advancement of being able to economically perform trillions of grade-school arithmetical operations per second that unlocked it, not some "mathematical discovery".
I'm sorry but I'm not sure what you mean by mathematics not being 'natural.' Einstein, Feynman, and many others used mathematics to give us an amazing exposition of the universe. Feynman in fact stated that a 'particle' takes every possible 'path' into account, for every possible trajectory that exists, and that the path of stationary (least) action (i.e. least information exchange) dominates. If you're telling me this isn't mathematical then you honestly don't know much about mathematics.
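The usual way Feynman's statement gets written down is the path-integral amplitude, a sum over every path weighted by its action:

    \langle x_f, t_f \mid x_i, t_i \rangle
      = \int \mathcal{D}[x(t)] \, e^{\, i S[x(t)] / \hbar},
    \qquad
    S[x(t)] = \int_{t_i}^{t_f} L(x, \dot{x}) \, dt

In the classical limit the stationary-action paths (\delta S = 0) dominate the sum.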
Frankly, it's clear that you haven't studied mathematics in any real depth.
Everything can be described (read: approximated, modeled) in mathematical terms; that's the whole point! That doesn't mean mathematical objects and processes must exist independently of those descriptions.
If I have one widget, and someone gives me a widget, I now have two widgets. If I then need to give someone one widget, then I will have one widget left. If someone else needs a widget, and so I break my widget in half and we share, I have half a widget left. Then five people turn up and give me three widgets each. I now have 15 and a half widgets.
The basic, most fundamental principles on which mathematics is based are surely natural and discoverable.
Other types of mathematics are born out of these principles.
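If it helps, the entire widget ledger is a few lines of exact arithmetic (a toy sketch, obviously):

    from fractions import Fraction

    widgets = Fraction(1)       # I have one widget
    widgets += 1                # someone gives me a widget       -> 2
    widgets -= 1                # I give one widget away          -> 1
    widgets -= Fraction(1, 2)   # break my widget in half, share  -> 1/2
    widgets += 5 * 3            # five people give three each     -> 15 1/2
    print(widgets)              # 31/2, i.e. 15 and a half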
To me this is exactly why mathematics is not natural. Counting natural numbers is not natural.
> one widget
And what's a widget? This implies that there are "objects" and that an object has a "boundary". Counting is completely human-defined. You can count Earth as one and Mars as two, or you can count the solar system as one and an atom in another galaxy as two. It's a man-made system that helps us think.
Or you can count particle by particle... well never mind, we're in a thread of an article about why particles are actually probability clouds :)
Why is my mug one object, instead of two, or three, or 10^26 objects? Counting is very arbitrary. Seeing a mug as a whole instead of a bunch of subatomic waveforms[1] is a choice (that our brain hardware made for us).
Hm, so essentially your argument is that an object is a label that we put upon a quantity of stuff?
I'm not sure that I agree, but for the sake of argument, even if I accept that principle, you can still count how many of that quantity of stuff you have.
If I decide that a mug is made up of one part, then if I get a second mug, I have 2 mugs. If I instead say that a mug is made up of 10^26 objects, and I get another 10^26 objects, I'll have 2×10^26 objects.
It's easier to count 1 + 1 than 10^26 + 10^26, but there's no change in the fundamental principle of counting just because we don't agree on the number we have to count up to.
All of these are predictable and create natural mathematics through addition and subtraction (or multiplication/division, which are basically just repeated addition/subtraction).
> Seeing a mug as a whole instead of a bunch of sub-atom waveforms[1] is a choice [...]
Our brain is doing that because that bunch of subatomic waveforms has useful properties when considered together. It can hold quantities of other bunches of subatomic waveforms, for example, whereas a different collection of subatomic waveforms, like my desk, would not hold my coffee.
Your contention that there's no such thing as an object seems a bit solipsistic, and more of a philosophical question than a relevant or useful way of thinking about the universe as we experience it.
Because we don't have direct access to reality. Axioms exist as parts of models, which are human abstractions. Thus at some point to formally talk about reality we have to make rigorous assumptions about it.
That human beings are as good as we are at finding axioms that appear to correspond pretty well to reality is amazing to me. It's a really interesting philosophical question to ask why it is that we are.
On the contrary, it's the people who haven't done advanced studies of mathematics that tend to be confused/wowed by the popular mysticism surrounding it.
There is also a certain type that leans into that mysticism for personal gain, which IMHO is irresponsible and promotes the myth that mathematics is inaccessible.
It's not really a matter of mysticism. It's metaphysics. The question "why does Mathematics work so well to describe our world?" is entirely valid and interesting.
No, they used the equivalent of a crazy, maybe infinitely, dense “bit”, and all “programming” is something invented by the human brain to model this unknowable “objective” architecture. The universe isn’t actually a computer, and even if it is, it’s a computer so far beyond us that computer science isn’t adequate to describe it.
Well... sort of. The last part is my hypothesis. The first part isn't. Particles don't have a trajectory (location and momentum) within spacetime in the way we normally think of them as having one. This has been known for a very long time :)