
Duality of Vector Spaces (2017) - hosolmaz
https://solmaz.io/notes/duality-vector-spaces/
======
ivan_ah
For anyone not familiar with the concept, a one-form is something that takes
vectors as inputs and produces numbers as outputs. See
[https://en.wikipedia.org/wiki/One-form](https://en.wikipedia.org/wiki/One-form)

~~~
smlckz
Can I think of one-forms as partially applied (curried) dot product? (Forgive
me for the usage of pseudo-Haskell notation below.)

Let V F be a vector space over the field F. The (curried) inner product is

    dot :: V F -> V F -> F

Now I take a vector v :: V F and partially apply it to ''dot'' to get a
covector/one-form v* :: V F -> F.

If I take all vectors in V F and partially apply each of them, I should get
the covector space V* F :: V F -> F over the field F.

Now, if this makes sense, tell me how to construct

    V** F

from this and show that it's isomorphic to V F.

~~~
layoutIfNeeded
Yes.

In other words: if vectors are column vectors, then covectors are row vectors,
and vice versa. In this case, taking the dual is transposing your vectors, and
the transpose of a transpose gives you back the original vectors.

But mathematicians try to build the most general, most abstract framework,
which doesn't rely on the specifics of a concrete example (i.e. the dot
product of finite-dimensional real vectors).
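That concrete example can be written out in the thread's pseudo-Haskell; a minimal sketch, assuming vectors are plain lists of Doubles (a hypothetical representation with no dimension checking):

```haskell
-- Hypothetical concrete representation: vectors as plain lists of Doubles.
type V = [Double]
type F = Double

-- The curried dot product, as in the comment above.
dot :: V -> V -> F
dot u v = sum (zipWith (*) u v)

-- Partially applying dot to a vector yields a covector (a one-form).
covector :: V -> (V -> F)
covector = dot

-- The canonical map into the double dual: a vector v becomes the
-- functional "evaluate the given one-form at v".
toDoubleDual :: V -> ((V -> F) -> F)
toDoubleDual v phi = phi v

main :: IO ()
main = do
  let v = [1, 2, 3]
      w = [4, 5, 6]
  print (covector w v)                 -- 32.0
  print (toDoubleDual v (covector w))  -- also 32.0: the same pairing
```

Note that toDoubleDual needs no choice of basis or inner product, which is why the map into the double dual is "canonical" while the map into the dual is not.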

------
snicker7
Dual spaces are an essential component of multilinear algebra, a subject that
is simultaneously essential yet often ignored in undergraduate education.

------
Lucasoato
Remember that V** = V is valid only for finite-dimensional vector spaces.

~~~
RossBencina
It is true that finite dimensional vector spaces are reflexive. But the
broader claim that _only_ finite dimensional vector spaces are reflexive is
false. For example, Hilbert spaces are reflexive and infinite dimensional. See
the last paragraph here:

[https://en.wikipedia.org/wiki/Hilbert_space#Duality](https://en.wikipedia.org/wiki/Hilbert_space#Duality)

See also:

[https://en.wikipedia.org/wiki/Reflexive_space](https://en.wikipedia.org/wiki/Reflexive_space)

~~~
edflsafoiewq
OP is correct assuming "dual" is defined as all the linear functionals on a
space. The dual of an infinite-dimensional space in this sense always has
larger dimension than the original space.

The difference comes in that for Hilbert spaces, "dual" is usually taken to
mean only the _continuous_ linear functionals.

~~~
snicker7
eg "continuous" vs "algebraic" duals.

------
delaaxe
Does this have anything to do with covariance and contravariance in terms of
CS? I couldn't understand from the article what those terms mean in the
context of vector spaces.

[https://en.wikipedia.org/wiki/Covariance_and_contravariance_...](https://en.wikipedia.org/wiki/Covariance_and_contravariance_\(computer_science\))

~~~
fchu
(See
[https://en.m.wikipedia.org/wiki/Covariance_and_contravarianc...](https://en.m.wikipedia.org/wiki/Covariance_and_contravariance_of_vectors)
for more context)

Yes, as the terms "co/contra-variance" loosely relate to how certain derived
attributes "vary" when base attributes change:

- vectors: whether the components scale in the same direction as the base
vectors (i.e. are obtained through the base transformation matrix) or the
opposite direction (through its inverse)

- generics: whether a generic of a subclass of T is interpreted as a
subclass (same direction) or a superclass (opposite direction) of the generic
of T.

(Maybe they all formally tie together through category theory? I don't know
but would love to hear from someone more educated about it!)
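The vector-side distinction can be checked numerically; a minimal sketch in Haskell, with a hypothetical 2-D change of basis whose matrix and inverse are hard-coded:

```haskell
import Data.List (transpose)

type Vec = [Double]
type Mat = [[Double]]  -- row-major

-- matrix acting on a column vector
matVec :: Mat -> Vec -> Vec
matVec m v = [sum (zipWith (*) row v) | row <- m]

-- row vector acting on a matrix
vecMat :: Vec -> Mat -> Vec
vecMat r m = [sum (zipWith (*) r col) | col <- transpose m]

a, aInv :: Mat
a    = [[2, 0], [0, 4]]      -- change-of-basis matrix (diagonal, for simplicity)
aInv = [[0.5, 0], [0, 0.25]] -- its inverse, hard-coded for this example

x, phi :: Vec
x   = [4, 8]  -- vector components in the old basis
phi = [1, 2]  -- covector components in the old basis

x', phi' :: Vec
x'   = matVec aInv x  -- contravariant: transforms with the inverse matrix
phi' = vecMat phi a   -- covariant: transforms with the matrix itself

main :: IO ()
main = do
  print (sum (zipWith (*) phi  x))   -- pairing in the old basis: 20.0
  print (sum (zipWith (*) phi' x'))  -- same value in the new basis: 20.0
```

The point of the opposite transformation rules is exactly that the pairing phi(x) stays the same number no matter which basis you compute it in.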

------
teleforce
I've got the feeling that the problems that vector spaces are trying to solve
can be elegantly solved by a quaternionic approach [1]

[1][https://www.researchgate.net/publication/2130951_On_quaterni...](https://www.researchgate.net/publication/2130951_On_quaternionic_functional_analysis)

------
fchu
There is beauty in first constructing V* as the one-forms over V, which seems
like a "one-way" derivation from it, and then finding through V** = V that
they're actually two very-equal faces of the same coin.

------
soVeryTired
The machinery described in the article is powerful and useful, but you don’t
_need_ it to understand dual spaces. Column vectors in R^n form a vector
space. Row vectors map them into real numbers via standard matrix
multiplication (on the left). Also vice-versa with right multiplication.

So row vectors are the dual of column vectors. Job done!
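That picture can be checked directly; a small Haskell sketch (the 1x3/3x1 shapes are an arbitrary example):

```haskell
import Data.List (transpose)

-- general matrix product over row-major lists of lists
matMul :: [[Double]] -> [[Double]] -> [[Double]]
matMul a b = [[sum (zipWith (*) r c) | c <- transpose b] | r <- a]

row :: [[Double]]
row = [[1, 2, 3]]       -- a covector, written as a 1x3 row vector

col :: [[Double]]
col = [[4], [5], [6]]   -- a vector, written as a 3x1 column vector

main :: IO ()
main = print (matMul row col)  -- [[32.0]]: a 1x1 result, i.e. a real number
```

Standard matrix multiplication is all the machinery needed: a 1xN row times an Nx1 column is a 1x1 matrix, which is just a scalar.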

Edit: I admit you need a bit more for covariance and contravariance though.

