But in practice, at some level an informative or even quasi-normative convention has to creep in if you want to pin down the claim "computer X can run program Y reasonably well." It's a qualitatively different problem, but it remains a real one. Not one we face in the early days of the system, though.
I am happy to solve most but not all of any problem...
But it's a distinction you have to make. Concrete ≠ obvious.
Other things that are sometimes in specs and sometimes not:
Is the intermediate state of a running Hoon program specified by the code? (It matters for someone writing a debugger!)
Is how the compiled program handles data at runtime specified? (It matters for a cryptographer!)
Are the errors produced by the compiler specified? (It matters for an IDE!)
Are the compiler's limitations specified? May an implementation relax them, and is it barred from imposing new ones? (It matters for anyone trying to make a faster compiler!)
In general the answer to your questions is "yes and no." Well, really it's no, except for one fairly common case: when we want to catch compiler errors without crashing out of the system, we virtualize Nock within itself. This virtualization is also where we add an extra operator, 11, which dereferences the global namespace. But the fact that "virtual Nock" is just a virtualization stack within a single Nock interpreter is not semantically detectable.
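The shape of that trick can be sketched in a few lines. The following is a toy illustration, not real Nock: it models only a tiny invented subset (opcodes 0, 1, 4), and the `mock` wrapper, namespace function, and result tagging are assumptions made up for the sketch. The point it shows is the structure: the "virtual" interpreter is just the ordinary interpreter run inside a trap, with one extra operator that reaches into a namespace.

```python
def slot(noun, axis):
    """Fetch the subtree of `noun` at tree address `axis` (1 = whole noun)."""
    if axis == 1:
        return noun
    parent = slot(noun, axis // 2)
    head, tail = parent          # TypeError if parent is an atom: a crash
    return head if axis % 2 == 0 else tail

def nock(subject, formula):
    """Toy pseudo-Nock: only opcodes 0 (slot), 1 (constant), 4 (increment)."""
    op, arg = formula
    if op == 0:
        return slot(subject, arg)
    if op == 1:
        return arg
    if op == 4:
        return nock(subject, arg) + 1
    raise ValueError("unknown opcode")

def mock(subject, formula, namespace):
    """Virtualized interpreter: traps crashes instead of propagating them,
    and adds an extra opcode 11 that dereferences the global namespace.
    Note it is built *on top of* nock(); nothing inside can detect that."""
    try:
        op, arg = formula
        if op == 11:
            return ("ok", namespace(nock(subject, arg)))
        return ("ok", nock(subject, formula))
    except (TypeError, ValueError):
        return ("crash", None)
```

Usage: a crash that would have torn down the outer interpreter comes back as a tagged value, and opcode 11 works only inside the virtualized layer.

```python
ns = {"x": 42}.get
mock((1, 2), (0, 3), ns)     # ("ok", 2): slot 3 is the tail
mock(5, (0, 2), ns)          # ("crash", None): slot into an atom, trapped
mock(0, (11, (1, "x")), ns)  # ("ok", 42): namespace dereference
```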