Klapaucius witnessed the first trial run of Trurl's poetry machine, the Electronic Bard. Here are some of the wonderful poems it instantly composed to Klapaucius's specifications:
A love poem, lyrical, pastoral, and expressed in the language of pure mathematics. Tensor algebra mainly, with a little topology and higher calculus, if need be. But with feeling, you understand, and in the cybernetic spirit.
Come, let us hasten to a higher plane,
Where dyads tread the fairy fields of Venn,
Their indices bedecked from one to n,
Commingled in an endless Markov chain!
Come, every frustum longs to be a cone,
And every vector dreams of matrices.
Hark to the gentle gradient of the breeze:
It whispers of a more ergodic zone.
In Riemann, Hilbert or in Banach space
Let superscripts and subscripts go their ways.
Our asymptotes no longer out of phase,
We shall encounter, counting, face to face.
I'll grant thee random access to my heart,
Thou'lt tell me all the constants of thy love;
And so we two shall all love's lemmas prove,
And in our bound partition never part.
For what did Cauchy know, or Christoffel,
Or Fourier, or any Boole or Euler,
Wielding their compasses, their pens and rulers,
Of thy supernal sinusoidal spell?
Cancel me not -- for what then shall remain?
Abscissas, some mantissas, modules, modes,
A root or two, a torus and a node:
The inverse of my verse, a null domain.
Ellipse of bliss, converse, O lips divine!
The product of our scalars is defined!
Cyberiad draws nigh, and the skew mind
cuts capers like a happy haversine.
I see the eigenvalue in thine eye,
I hear the tender tensor in thy sigh.
Bernoulli would have been content to die,
Had he but known such a squared cosine 2 phi!
Often only Hessian-vector products or Jacobian-vector products are required, and these can be computed via more standard autodiff techniques, usually far more efficiently than computing the Hessian or Jacobian directly.
Also, for models with many parameters, the Jacobian and Hessian are usually impractically large to realise in memory (the Hessian alone is N^2 in the number of parameters).
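To make this concrete, here is a minimal sketch (in JAX, chosen purely for illustration) of a Hessian-vector product computed by forward-over-reverse autodiff. The loss function `f` and the vectors are made-up toy examples; the point is that the full N×N Hessian is never materialized.

```python
import jax
import jax.numpy as jnp

def f(w):
    # Toy scalar loss: sum of fourth powers of the parameters.
    return jnp.sum(w ** 4)

def hvp(f, w, v):
    # Hessian-vector product via forward-over-reverse autodiff:
    # push the tangent v through the gradient function, so we
    # differentiate grad(f) along direction v without ever
    # building the full Hessian matrix.
    return jax.jvp(jax.grad(f), (w,), (v,))[1]

w = jnp.array([1.0, 2.0, 3.0])
v = jnp.array([1.0, 1.0, 1.0])

# grad(f)(w) = 4*w**3, so the Hessian is diag(12*w**2)
# and the product with v = ones is [12., 48., 108.]
print(hvp(f, w, v))
```

The cost is roughly a small constant multiple of one gradient evaluation, regardless of N, which is why HVPs are the workhorse behind many second-order and curvature-aware methods.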
Nevertheless, the symbolic tensor calculus approach is very appealing to me. For one thing, it could make it much easier to see, in readable symbolic notation, what the gradient computations in standard backprop look like, and it could perhaps make it easier to implement powerful symbolic optimizations.