
Matrices in Julia - alexhwoods
http://alexhwoods.com/2015/08/22/linear-algebra-in-julia/
======
ced
_Having only been around since 2012, Julia’s greatest disadvantage is a lack
of community support._

That is true, but if you post a question to their forum, you will get swift
answers, often from the core language team:

[https://groups.google.com/forum/#!forum/julia-users](https://groups.google.com/forum/#!forum/julia-users)

Unfortunately these posts don't always show up at the top of a Google search,
so it's sometimes better to search the user group directly.

~~~
ffriend
_often from the core language team_

Very friendly core language team, by the way :)

------
contravariant
My favourite trick in Julia using matrices is the following one liner:

fib(n) = (BigInt[1 1; 1 0]^n)[2,1]

Which defines a fast, exact Fibonacci function. It's so far the easiest and
only way I've managed to calculate the 10^9th Fibonacci number.
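
The same one-liner idea ports to any language with arbitrary-precision
integers. Here is a minimal pure-Python sketch (the helper names are mine;
exponentiation by squaring stands in for Julia's built-in matrix `^`):

```python
def mat_mul(a, b):
    # 2x2 integer matrix product
    return [[a[0][0]*b[0][0] + a[0][1]*b[1][0], a[0][0]*b[0][1] + a[0][1]*b[1][1]],
            [a[1][0]*b[0][0] + a[1][1]*b[1][0], a[1][0]*b[0][1] + a[1][1]*b[1][1]]]

def mat_pow(m, n):
    # exponentiation by squaring: O(log n) matrix multiplications
    result = [[1, 0], [0, 1]]  # identity
    while n > 0:
        if n & 1:
            result = mat_mul(result, m)
        m = mat_mul(m, m)
        n >>= 1
    return result

def fib(n):
    # F(n) is the [2,1] entry (0-indexed [1][0]) of [[1,1],[1,0]]^n
    return mat_pow([[1, 1], [1, 0]], n)[1][0]
```

Python ints are exact, so `fib(100)` returns the full 21-digit value with no
rounding.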

~~~
ntenenz
The following defines the Fibonacci recurrence relation (in matrix form):

[F(n+1); F(n)] = [1 1; 1 0]^n * [F(n); F(n-1)]

or

[F(n+1); F(n)] = [1 1; 1 0]^n * [1; 0] if we define F(0) = 0 and F(1) = 1.

However, we can diagonalize [1 1; 1 0] using its eigenvalues and eigenvectors
(see here:
[http://www.wolframalpha.com/input/?i=eigenvectors+of+%7B%7B1...](http://www.wolframalpha.com/input/?i=eigenvectors+of+%7B%7B1%2C+1%7D%2C+%7B1%2C+0%7D%7D)).

For any diagonalizable matrix A = PDP^-1, where P is the matrix of eigenvectors
and D is the diagonal matrix of eigenvalues, A^n = PD^nP^-1. Try expanding
A^n if you don't believe it. Using this trick, you have reduced the power of a
matrix to powers of two scalars, namely the eigenvalues.
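
Carrying that through for the Fibonacci matrix gives the familiar closed form.
A quick numerical sketch in pure Python (the eigenvalues of [1 1; 1 0] are
(1 ± √5)/2; the function name is mine):

```python
import math

# eigenvalues of [[1, 1], [1, 0]]: the golden ratio and its conjugate
sqrt5 = math.sqrt(5)
phi = (1 + sqrt5) / 2
psi = (1 - sqrt5) / 2

def fib_eig(n):
    # F(n) recovered from A^n = P D^n P^-1; for this matrix the
    # [2,1] entry works out to (phi^n - psi^n) / sqrt(5)
    return round((phi**n - psi**n) / sqrt5)
```

Note that this version works in floating point, so unlike the exact BigInt
one-liner it loses precision for large n.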

~~~
gaze
From this you'll end up with the closed form in terms of the golden ratio.
Might as well use that.

~~~
ntenenz
Yup, you will. It's nice to show where that comes from though. :)

------
andrepd
What does this offer me that Numpy doesn't already offer, with the advantage
of not having to learn a new language?

~~~
boromi
Also, NumPy code looks terribly ugly for numerical computing when compared
with Julia (which looks like Matlab, but the language doesn't suck like Matlab does).

~~~
stared
Example? I keep hearing this, but mostly I get such arguments from people who
don't know NumPy (and e.g. use np.array where np.matrix would make the code
readable).

(I mean, not everything is perfect, but are there side-by-side vector
computations, which are nice in Julia, but not ugly in Python?)

~~~
dagw
It's a bit of a matter of taste, but many people coming from languages like
Matlab prefer the terser coding style of, for example,

    A\X'

over the more verbose

    np.linalg.solve(A, np.transpose(X))
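
Both spellings solve the same linear system. A small NumPy sketch (the
matrices A and X here are illustrative, with A square and nonsingular):

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 3.0]])
X = np.array([[1.0, 0.0], [0.0, 1.0]])

# Julia's A \ X' corresponds to solving A @ Y = X.T for Y
Y = np.linalg.solve(A, X.T)

# sanity check: A @ Y reproduces X.T
assert np.allclose(A @ Y, X.T)
```

`X.T` is the usual NumPy shorthand for `np.transpose(X)`, which closes some of
the verbosity gap, though the `A\X'` form is still more compact.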

------
leni536
Right now I'm using the Armadillo C++ library for linear algebra. It's quite
fast (it can use OpenBLAS too, and it does some optimizations with template
metaprogramming) and quite convenient to use. Julia seems really interesting
though.

------
amelius
Is it possible to make one of the items in the matrix a variable? (And then
e.g. put the matrix in an equation, expand it, and solve for the variable).

~~~
micro2588
You can do this using the SymPy.jl package. As a good chunk of matrix
functionality is parametric on element type, you can do matrix operations on
SymPy's symbolic variables.

See example 6 in:
[https://github.com/andreasnoack/andreasnoack.github.io/blob/...](https://github.com/andreasnoack/andreasnoack.github.io/blob/master/talks/2015AprilStanford_AndreasNoack.ipynb)
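
Since SymPy.jl wraps Python's SymPy, the idea looks like this on the Python
side (a small sketch; the matrix and the equation are illustrative):

```python
from sympy import Matrix, symbols, solve

x = symbols('x')

# a matrix with a symbolic entry
M = Matrix([[x, 1], [1, 2]])

# put the matrix in an equation (here: det(M) = 3) and solve for x;
# det(M) expands symbolically to 2*x - 1
solutions = solve(M.det() - 3, x)
```

Any matrix operation that stays within SymPy expressions (powers, inverses,
determinants) can be expanded and solved the same way.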

