

A pointer is just a number - srl
http://bytbox.net/blog/2012/10/pointer-is-just-number.html

======
_kst_
He's wrong. A pointer, at least in C, is not a number, and saying that it is
just causes confusion.

Pointers have _some_ of the same properties as numbers (specifically
integers). In particular, you can do arithmetic on them -- but only in a very
restricted way. You can add an integer to (or subtract an integer from) a
pointer, yielding a pointer, and you can subtract one pointer from another,
yielding an integer.

Most current architectures have a single monolithic addressing space, and
pointers can be treated as indices into that address space, viewed as a very
large array. But the C standard is carefully written to permit other memory
models.

For example, pointers to distinct objects cannot meaningfully be subtracted or
compared, except for equality or inequality.

Sure, a pointer is represented as a sequence of bits, and that sequence of
bits can be treated as an integer -- but the mapping is unspecified.
Similarly, a floating-point representation can be mapped onto an integer
representation; that doesn't mean that the integer value 1069547520 will help
you understand the floating-point number 1.5.

A pointer is a pointer. If you want to understand how pointers work, learn
about pointers.

See also sections 4 and 6 of the comp.lang.c FAQ at <http://www.c-faq.com/>.

------
EvaPeron
A function is a type, is a set, is a number. And that, frankly, is difficult
to get around. Coming from the C / Java world, it is hard to understand that
"types" are really sets (in a mathematical sense) which in turn, are numbers
(e.g. the set of natural numbers), and a function (in terms of the value it
returns) is a number as well and can be considered as being an "operation"
upon a set, i.e., put set A into set B and one has set C. Function == Type ==
Set == Number, or, perhaps, a Number is simply an instance of a Set which in
turn is another way of saying Type, so, basically, Function == Set. What makes
it conceptually difficult is that the word "function" entails "operation",
which entails "time", when, really, one could think of "function" as sort of
being "those steps one has to go through in order to define a set of type S".

A set is a theorem, so now we have, Theorem == Type == Set == Number. A
function is simply the way of defining or proving the Theorem, e.g. if set A
== B, and B == C, then A == C is a function, which itself is a Set, conjoining
A and C.

The "type" of an object (integer, char, whatever) is really a Set, and the way
to construct that set (also being itself a set) is a function, which is what
one might call a "program".

These things are different ways of looking at the same thing. A "type" (set)
is the Theorem; its proof is the function (the program which constructs the
Theorem or Set).

The only difference between function/program and Set/Theorem is that the
former "constructs" the latter, the former being "temporal" and the latter
being "eternal", or, put another way, the former being the "building blocks"
and the latter the "finished product".

So yes, pointers are numbers, which in turn are Sets, constructed by
functions, and these sets construct the Theorems. The type is the Theorem, the
function is the construction of that Theorem, done vis-a-vis Sets, and that
(IMHO) is what the thing we call "computer science" is all about. :-)

~~~
aaronblohowiak
>the word "function" entails "operation", which entails "time"

Pure functions do not. Pure functions are relations from a domain to a
codomain, implying a "mapping", but not an "operation", which I believe
implies change. This mapping (from the domain to the codomain) can be
presented as a subset of the cartesian product of the domain by the codomain
(which you allude to.) This is devoid of any notion of "time".

~~~
EvaPeron
"A touch, a touch I do confess." (Hamlet) LOL. Yes, "mapping" is a better term
than "operation" for reasons you put forward. :-)

------
evincarofautumn
A good thought, but C’s syntax makes it a bit hairy to explicate that you’re
using an integral value _as_ a pointer:

    
    
        #include <stdint.h>
        #include <stdio.h>

        struct big_data { int i; };

        void f(intptr_t p) {
            printf("%d\n", ((struct big_data*)p)->i);
        }

        int main(int argc, char** argv) {
            struct big_data data = { 42 };
            intptr_t p = (intptr_t)&data;  /* without the cast, this
                                              doesn't even compile cleanly */
            f(p);
            return 0;
        }
    

So there’s no real pedagogical gain. If someone understands what “a pointer is
just an integer” _means_, then they probably understand pointers already.

~~~
aaronblohowiak
also, you need to be careful about your assumptions about the sizeof a pointer
and the sizeof an integer.

------
jaylynch
I've always found it helpful to explain a pointer as exactly that - a pointer,
ie. a person standing there pointing at something.

If you need to keep track of something and know that no matter where you put
him he's always going to be pointing at the same thing, sometimes it's going
to be easier to move him around than, say, the 72-story building he's pointing
at.

If you need to explain it more technically, then the simple truth that they
track the memory address of something has always seemed the best approach --
then start learning what that actually means if need be. Anyone hoping to use
C for much without constantly shooting themselves in the foot should learn at
least the basics of such things anyway.

------
dbh937
Coming from Java, I didn't understand the point of pointers (no pun intended)
until I was re-reading through chapter 1 of K&R C.

At one point, they mention how when you pass a variable into a function, it
only passes the value really, "copying" the variable, in a way. Then they go
on to say that changing the original variable is talked about in chapter 5,
which covers pointers. Then it just clicked. If you want to change the
original value, pass a pointer to said variable. If you want to just use the
value of a variable, pass the variable through the function.

It's so much easier than what I was doing before, assigning the variable to
the return value of a function.

~~~
Jach
I'd stick with avoiding passing pointers unless you really need to change more
than one parameter within the function or unless you have a "large" data
structure or unless you're using C++ and passing objects around, then you want
to avoid the copy constructor and destructor. (Java passes all objects by
pointer-value.) I'd also recommend getting in the "const pointer" habit,
because then anyone (or compiler) reading your code can be guaranteed you
aren't going to change things passed to your function.

------
jere
Er... yes? But I suppose every piece of data on a computer could be considered
a number as well.

~~~
jerf
Not "considered"... _is_. It's not just an interesting trivia tidbit, it's a
fundamental part of the machine ceasing to be simply magic and becoming
comprehensible. _Everything_ is just numbers flying around.

Furthermore, if that's a revelation of some sort, or if you'd forgotten it,
I'd commend to you a reading of any assembly language specification. Probably
something simpler like MIPS or something, since in the end they are all the
same w.r.t. the point I'm making here. I suggest this because it's important
to see that at the very, very bottom of the machine, the very, very few things
that the CPU can be truly said to be capable of, it's damn near "add,
subtract, multiply, jump to some other instruction depending on the results of
said", some instructions to fiddle with loading numbers from various places
and a bit for receiving messages or interrupts when certain numbers arrive in
certain places, and _little else_. There's no "display text" or "paint
monster" instructions down there.

It's actually very important to understand this. When studying assembler, it's
easy to correctly notice what is there; it's just as important to notice what
_isn't_.

~~~
jere
>Furthermore, if that's a revelation of some sort... There's no "display text"
or "paint monster" instructions down there.

Ouch. It wasn't. A little painful though after having studied computer
engineering and even recently writing games in DCPU-16.

But anyway I will agree with most of what you said. The coolest part for me
was always learning that there really was _no magic_ underlying computers.

>Not "considered"... is.

My point is that everything in the computer being a number (large pieces of
data being a single number) is just one of many interpretations and not a
necessarily typical or useful one.

At bottom, everything in the computer is actually some configuration of
electrons. But it's not very useful to work conceptually at that level either.

~~~
jerf
Sorry, didn't mean that targeted at you directly. I meant in general, for the
reader.

"At bottom, everything in the computer is actually some configuration of
electrons. But it's not very useful to work conceptually at the level either."

And actually part of my point is that it is all of those things, at once. It's
electrons and it's all just numbers and it's text, too. It's not really some
of them but really not other things, it's all of them all at once.

~~~
aaronblohowiak
risking getting too deep in the mud, saying things are all of them at once is
not correct. You can treat most sections of memory as executable, numerical,
textual or otherwise, but that does not make them so. For instance, control
code points in ASCII are not text; they are an in-band control mechanism. NULL
is not text, not in the way most people's working definitions of text would
imply. The fact that there are abstractions upon binary sequences is a matter
of convention and occasionally of hardware implementation. Teaching that
there are meaningful layers of abstraction is part of having a consistent
(sound) understanding of the computer.

~~~
jerf
I did not mean everything is text; take it as everything is electrons,
numbers, and text/pictures/video/emails/etc., and other layers too. There's
no "right" or "true" abstraction layer, they're all true. Not necessarily the
same kind of true in absolutely every case, but there's no "true" true here,
where one can dismiss some layer or other as not important. The forest and the
trees are both true.

