> This leads to the selection of a hybrid design that composes digital hashing with low precision vector-matrix multiplication (intended for photonic acceleration) to produce HeavyHash. HeavyHash is an iterated composition of an existing hash function, i.e. SHA256, and a weighting function such that the cost of evaluation of HeavyHash is dominated by the computing of the weighting function.
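To make the quoted construction concrete, here is a minimal sketch of one HeavyHash-style round in Python: SHA-256, then a low-precision random matrix-vector product (the "weighting function"), then SHA-256 again. The matrix size, bit widths, and mixing step are illustrative assumptions, not the published parameters, and the real scheme iterates this composition.

```python
import hashlib
import random

N = 64                      # vector length (assumed, not from the paper)
rng = random.Random(0)      # fixed seed -> a fixed public "random" matrix
W = [[rng.randrange(4) for _ in range(N)] for _ in range(N)]  # 2-bit weights

def heavyhash(data: bytes) -> bytes:
    """One hypothetical round: hash, weight, hash again."""
    d1 = hashlib.sha256(data).digest()
    # unpack the 256-bit digest into 64 4-bit values (a low-precision vector)
    v = [(d1[i // 2] >> (4 * (i % 2))) & 0xF for i in range(N)]
    # the "weighting function": matrix-vector product, truncated to 4 bits;
    # this is the step intended for a photonic accelerator
    p = [(sum(W[r][c] * v[c] for c in range(N)) >> 6) & 0xF for r in range(N)]
    # mix the product back into the digest and hash once more
    mixed = bytes(d1[i] ^ (p[2 * i] | (p[2 * i + 1] << 4)) for i in range(32))
    return hashlib.sha256(mixed).digest()
```

Verification is cheap either way: a validator recomputes the same pipeline digitally, so the asymmetry is only in the *energy cost* of the matrix step at scale, not in who can check the result.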
What is the weighting function? How do we verify that the result is valid? There have been other attempts to make proof of work more capex-sensitive (especially memory-hard variants, like birthday-paradox constructions), but they all suffer from the same problem: the fact that you can verify the result means you can also brute-force the outcome, and often that tradeoff works out in practice.
Without knowing the specifics it's very hard to say whether this particular proof-of-work algorithm rules out an energy-inefficient brute-force approach that would make the energy problem just as bad. My intuition is that of course this won't work, as a matter of "no free lunch": the cost of securing a coin will end up equal to the value of keeping it secure.
I guess we'll have to wait:
> Beyond these intuitions, the specifics of the algorithm and a detailed proof of its security will be published in a separate manuscript. 
>  Michael Dubrovsky and Marshall Ball. Towards optical proof of work; oPoW. Unpublished Manuscript, 2019.
No; you cannot brute-force the outcome in any realistic sense. For example, a Cuckoo Cycle proof consists of 42 n-bit edge indices that together form a cycle in a random bipartite graph on 2^n + 2^n nodes, typically with n >= 29. Brute-forcing over all possible 42-element subsets of the 2^n edge indices would take well beyond the heat death of the universe. It's way easier to brute-force the 256-bit private keys of all bitcoin balances...
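A quick back-of-envelope check of that claim, counting the 42-edge subsets of 2^29 edge indices and comparing against a 256-bit keyspace:

```python
import math

n = 29
# number of candidate 42-element subsets of the 2^n edge indices
subsets = math.comb(2**n, 42)

# strictly larger than the 256-bit private-key space
assert subsets > 2**256

# log2(C(2^29, 42)) ~ 42*29 - log2(42!), which is well over 1000 bits
print(math.log2(subsets))
```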
However, it's easier to make this work on top of a more radical shift in hardware. At the basic level we are just using simple random matrix-vector multiplications. Of course, photonics or other low-energy analog approaches have to win in the market for this operation, and that will be tested empirically (though, as we discuss in the paper, there has already been a ton of investment in moving this kind of processing to analog hardware).