
The Derivative of a Number - j2kun
http://rjlipton.wordpress.com/2014/08/19/the-derivative-of-a-number/
======
scythe
Here's a lemma that helps to prove D(n) is well-defined:

LEMMA: D(pq•r) = D(p•qr):

D(pq•r) = pq D(r) + r D(pq)

= pq D(r) + rp D(q) + qr D(p)

= p (q D(r) + r D(q)) + qr D(p)

= p D(qr) + qr D(p)

= D(p•qr)
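For concreteness, here's a minimal Python sketch (the function name is mine) that computes D(n) from the prime factorization; the lemma above is what guarantees the result doesn't depend on how you split n into factors:

```python
# A sketch of the arithmetic derivative: D(p) = 1 for primes p and
# D(ab) = a*D(b) + b*D(a), which for n = p1^e1 * ... * pk^ek gives
# the closed form D(n) = n * (e1/p1 + ... + ek/pk).

def arithmetic_derivative(n: int) -> int:
    """Compute D(n) for a positive integer n from its prime factorization."""
    if n <= 1:
        return 0  # D(1) = 0, since D(1) = D(1*1) = 2*D(1)
    factors = {}  # prime -> exponent
    m, p = n, 2
    while p * p <= m:
        while m % p == 0:
            factors[p] = factors.get(p, 0) + 1
            m //= p
        p += 1
    if m > 1:
        factors[m] = factors.get(m, 0) + 1
    # n * e / p is exact for every prime p dividing n
    return sum(n // p * e for p, e in factors.items())

print(arithmetic_derivative(6))  # 2*D(3) + 3*D(2) = 5
```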

There are a number of generalizations of the notion of "derivative"; in
particular, you can take the derivative of a regular expression:

[https://github.com/scythe/dreg/blob/master/re-deriv.pdf](https://github.com/scythe/dreg/blob/master/re-deriv.pdf)

...which I've been slowly turning into a regex matcher.
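Here's a hedged sketch of the idea (not the linked implementation, just the standard construction): the derivative of a regex with respect to a character c matches the suffixes of its matches that begin with c, so matching is just repeated derivation followed by a nullability check:

```python
# Sketch of regular-expression matching via Brzozowski derivatives.
from dataclasses import dataclass

@dataclass
class Empty: pass            # matches nothing

@dataclass
class Eps: pass              # matches only the empty string

@dataclass
class Chr:
    c: str                   # matches a single character

@dataclass
class Alt:
    a: object; b: object     # a | b

@dataclass
class Cat:
    a: object; b: object     # a followed by b

@dataclass
class Star:
    a: object                # a*

def nullable(r):
    """Does r match the empty string?"""
    if isinstance(r, (Eps, Star)):
        return True
    if isinstance(r, Alt):
        return nullable(r.a) or nullable(r.b)
    if isinstance(r, Cat):
        return nullable(r.a) and nullable(r.b)
    return False             # Empty, Chr

def deriv(r, c):
    """Derivative of r with respect to the character c."""
    if isinstance(r, Chr):
        return Eps() if r.c == c else Empty()
    if isinstance(r, Alt):
        return Alt(deriv(r.a, c), deriv(r.b, c))
    if isinstance(r, Cat):
        d = Cat(deriv(r.a, c), r.b)
        return Alt(d, deriv(r.b, c)) if nullable(r.a) else d
    if isinstance(r, Star):
        return Cat(deriv(r.a, c), r)
    return Empty()           # Empty, Eps

def matches(r, s):
    for c in s:
        r = deriv(r, c)
    return nullable(r)

r = Star(Cat(Chr('a'), Chr('b')))  # (ab)*
print(matches(r, "abab"))          # True
```

A production matcher would additionally simplify the derivatives as it goes (e.g. `Alt(Empty(), x)` to `x`), since they grow otherwise.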

~~~
gergoerdi
If you're interested, regex matching via derivatives has been done before;
see e.g.
[http://matt.might.net/papers/might2011derivatives.pdf](http://matt.might.net/papers/might2011derivatives.pdf)

------
martincmartin
Barbeau was a professor of mine back in the late 1980s, great to hear that
he's still around. It was a wonderful class, with Ravi Vakil [1] and Nima
Arkani-Hamed [2] in it.

[1] [http://en.wikipedia.org/wiki/Ravi_Vakil](http://en.wikipedia.org/wiki/Ravi_Vakil)
[2] [http://en.wikipedia.org/wiki/Nima_Arkani-Hamed](http://en.wikipedia.org/wiki/Nima_Arkani-Hamed)

~~~
adamgravitis
He taught our Calculus II course a few years ago, which had rather famous
exams.

A couple years later I was poking around the library and found a book he'd
just published entitled something like "50 challenging undergraduate
mathematics problems", of which I was annoyed to recognize several :-)

------
dead10ck
I feel like I'm missing something. Isn't the derivative of a constant just
defined to be 0? Why does the definition in the article restrict it to primes?

~~~
martincmartin
You're right, it's not the normal definition of derivative. He's defining a
function on numbers that has some similarities to the derivative (the
multiplication rule is the same, and therefore so are some other properties),
but isn't the derivative.

It's common in mathematics to take a word that already means something, and
re-use it for a different but related concept. For example, in group and field
theory, you talk about addition and multiplication, but they're not
necessarily the standard definition.

~~~
robzyb
> It's common in mathematics to take a word that already means something, and
> re-use it for a different but related concept. For example, in group and
> field theory, you talk about addition and multiplication, but they're not
> necessarily the standard definition.

That's true, but I feel like the article misled me because it didn't give a
proper and timely explanation of what it means by "derivative".

------
nilsimsa
Shouldn't derivatives be with respect to change in another variable? For
example, d/dx of f(x).

~~~
tel
That's the intuition used to develop the concept, but it becomes increasingly
difficult to apply that intuition in more exotic locales.

Thus, it's important to eventually seek out more abstract ways of
characterizing the derivative (and integral). In more advanced mathematics,
you usually state that the derivative is any operation which follows two rules

    
    
        1. Linearity, d(ax + by) = a d(x) + b d(y)
        2. The Product Rule, d(xy) = x d(y) + d(x) y
    

and then try to squeeze things until that operation is defined uniquely.[0]

Likewise, it's often valuable to define integration as nothing more than the
relationship such that

    
    
          I(region, derivative(quantity)) 
        = I(boundary(region), quantity)
    

which is known as the generalized Stokes theorem. It is basically the
"Fundamental Theorem of Calculus" on steroids: it characterizes integration
in terms of nothing more than its algebraic/topological relationship with
derivation... which is itself abstracted as mentioned above.
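A toy finite instance of this relationship (my own sketch, using a path of points as the simplest possible "region"): summing the discrete derivative over all the edges equals evaluating the quantity at the two boundary points, because the sum telescopes:

```python
# Toy one-dimensional instance of the generalized Stokes relation:
# I(region, d f) == I(boundary(region), f).

f = [x * x for x in range(6)]             # a quantity on points 0..5
df = [f[i + 1] - f[i] for i in range(5)]  # its discrete derivative, on edges

# Summing df over the region telescopes down to f at the boundary.
print(sum(df), f[-1] - f[0])  # 25 25
```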

---

Why do all this? Because you can squeeze most of Calculus so that it depends
only upon this "abstract interface" and then apply things you learned from
calculus all over the place.

---

Finally, note that this is more like a "proposed" derivative than "the"
derivative on natural numbers. The author notes that linearity fails, for
instance. Thus, some intuition might "port over" but we shouldn't expect too
much of it to do so.
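To make the failure of linearity concrete, here's a tiny recursive sketch (mine, not the article's code) of the arithmetic derivative:

```python
# Arithmetic derivative via the product rule: D(p) = 1 for primes,
# D(p*m) = p*D(m) + m*D(p) = p*D(m) + m.

def D(n: int) -> int:
    if n <= 1:
        return 0
    p = 2
    while n % p:          # find the smallest prime factor of n
        p += 1
    if p == n:
        return 1          # n is prime, so D(n) = 1
    m = n // p
    return p * D(m) + m   # product rule, using D(p) = 1

# The product rule holds by construction...
assert D(6 * 35) == 6 * D(35) + 35 * D(6)
# ...but linearity fails: D(2 + 3) = D(5) = 1, while D(2) + D(3) = 2.
print(D(2 + 3), D(2) + D(3))  # 1 2
```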

Which echoes back to your original question—there's not really a notion of
instantaneous change for us to be talking about... so how much sense does it
make to talk about a derivative here?

Apparently, more than no sense at all, but less than you might want.

[0] Note that all we need to state this property of the derivative is a notion
of multiplication and addition. This structure, at its most abstract, is
usually called a ring (but can be made even weaker if needed). An example
"exotic" ring might be concurrent processes. If P and Q are two processes then
P*Q is P "followed by" Q and P + Q is P and Q "together". Can we write a
derivative here? Who knows? (As another comment in this thread suggests, this
kind of formulation can be used to consider the "derivative of a grammar" to
be a parser! It's also well-known that the derivative of an algebraic data
type is its "zipper"!)
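A small illustration (my own sketch) of the zipper remark, using a list as the data type: the zipper is a one-hole context, i.e. the elements before the focus (kept reversed so moves are cheap) together with the elements from the focus onward:

```python
# A list zipper: (reversed prefix, suffix starting at the focus).

def to_zipper(xs, i):
    """Focus position i of xs."""
    return (xs[:i][::-1], xs[i:])

def left(z):
    """Move the focus one step left."""
    before, after = z
    return (before[1:], [before[0]] + after)

def right(z):
    """Move the focus one step right."""
    before, after = z
    return ([after[0]] + before, after[1:])

z = to_zipper([1, 2, 3, 4], 2)  # focus on the element 3
print(z)                        # ([2, 1], [3, 4])
print(right(z))                 # ([3, 2, 1], [4])
```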

~~~
AnimalMuppet
> That's the intuition used to develop the concept, but it becomes
> increasingly difficult to apply that intuition in more exotic locales.

Then maybe it's better to use a different term for things that are different.
Maybe it's better to keep the term "derivative" for the rate of change of one
thing with respect to another thing, and to let the generalizations that
aren't that be called something else.

~~~
Chinjut
Perhaps. Would you also say "Maybe it's better to keep the term 'sum' for
combining the count of two sets, and to let the generalizations that aren't
that (e.g., adding integers, adding vectors, adding polynomials, ...) be
called something else"?

~~~
AnimalMuppet
I'd say that you're historically wrong - that "sum" was used for adding
integers, and then found to extend naturally to adding vectors, polynomials,
matrices, real numbers, complex numbers, quaternions, and so on. If you want
to extend it to combining the count of two sets (because you're trying to re-
found all of mathematics on set theory), then the word still fits.

Just don't try to make the set theory version the "real" version, and then try
to deny the use of the word in other places. Those other places were using it
first; you don't have the right to hijack the word.

~~~
Chinjut
By "count of two sets", I don't mean to invoke set theory in any imposing,
modern sense. Just the observation that, historically, we were adding counting
numbers (in particular, non-negative ones with such properties as "The sum of
x and y is always at least as large as x itself") long before we were adding
integers.

Regardless, the point still stands: why would you allow the word "sum" to fit
all those uses (disparate, but with a web of family resemblances), but not
grant the same to "derivative"?

~~~
AnimalMuppet
Because it seems to me that the web of family resemblances for "derivative"
should include the rate of change of one thing with respect to another, not
just that the product rule is satisfied. That is, it seems to me that the
attempts to extend "derivative" are extending it to the point that the web of
family resemblances no longer fits.

~~~
tel
Unfortunately, all I can offer by way of comfort is that choosing "rate of
change" as your centralizing analogy for "derivative" has been shown, through
the history of mathematics, to be a great start but a slow finish.

Frequently mathematics benefits a lot from abstracting to algebra because, at
this point, it's purely about how to define elements and operations by their
apparent behavior instead of by their metaphor or interaction with a larger
idea (such as notions of space, continuity, rate, change... all of those
require quite a _lot_ of mechanics to get in place, while algebra is very
light-weight).

As an example, there have been a lot of attempts to discretize calculus for
computers. Usually, the goal here is to create a scheme of discretization
which, in the limit, resembles the smooth computations we'd like to perform.
This has been a successful program in practice, but it's known to be fraught
with weird edge cases. It's easy to create discretized situations which
violate intuition.

Much of the reason these schemes run into trouble is that they attempt to
generalize from the notion of "rate of change".

There's also the idea of discrete calculus (not "discretized") which is what
you get when you apply the algebraic laws _alone_ to some very standard
notions of discrete spaces (oriented simplicial complexes, in particular—the
simplest discrete object which "has enough topology" to meaningfully have the
algebraic laws of integration applied to it).

What you get in this case is a rich theory of discrete calculus which
rederives half of manifold learning and graph theory as a special case. All of
the laws follow precisely—and they must, as the entire construction was built
to prevent such violations.

Finally, you can examine discrete calculus to find a notion of "rate of
change" if you like. But it's alien from that which you might be familiar with
from continuous domains. It would have been very difficult to arrive at this
point trying to generalize that intuition.

But it's practically inevitable (not to say it's easy, just inevitable) if
you say that you want to take the algebraic structure of derivatives and
integration and apply it to oriented simplicial complexes.

------
tempodox
Wonderfully strange. I had to implement it as a shell command:

[https://github.com/tempodox/NumberDerivative](https://github.com/tempodox/NumberDerivative)

