Also, why do you consider the introduction of these new bytecodes as breaking changes for the JVM? Backward compatibility has always been one of the highest priorities for the Java architects.
No and yes.
No, it doesn't really matter for dynamically typed languages (just use Object for generic types).
Yes, it matters for statically typed languages. One of the factors in the abandonment of Scala.NET was the difficulty of building an interoperable advanced type system on top of a fully reified environment.
IMO, full type erasure (like C/C++) is the right way to go. Java erasure is weird only because it didn't do it completely, not because they did it at all.
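To make the erasure point concrete, here's a minimal sketch (class name `ErasureDemo` is just illustrative): in Java, the type parameter is gone at runtime, so two differently parameterized lists share the same runtime class.

```java
import java.util.ArrayList;
import java.util.List;

public class ErasureDemo {
    public static void main(String[] args) {
        List<String> strings = new ArrayList<>();
        List<Integer> ints = new ArrayList<>();
        // After erasure both are plain ArrayList at runtime:
        System.out.println(strings.getClass() == ints.getClass()); // prints true
    }
}
```

The "weird" part is that erasure is incomplete: the same JVM keeps reified element types for arrays (`new String[0].getClass() != new Integer[0].getClass()`), so the two worlds coexist.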
At which point any call to the underlying environment fails with a type error. Sorry, your List&lt;Object&gt; is not and will never be a List&lt;int&gt;, and trying to pass it as one won't work. Either you provide a way to construct a generic type with the right type parameters in the dynamic language, or you accept that you can only inter-op with a limited subset of the underlying environment.
Of course that also gets ugly. You now have to deal with the fact that you have not just a List but several incompatible List&lt;T&gt; floating around, and appending the contents of one list to another suddenly raises the question of what sort of list to return: List&lt;Int&gt;, List&lt;Double&gt;, List&lt;Number&gt;, or an error? In a type-erased context the answer is always the same: you return a List&lt;Object&gt;.
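The erased answer above can be sketched in a few lines of Java (the `concat` helper is hypothetical, not from the thread): since element types don't exist at runtime, merging heterogeneous lists needs no overloads and no type dispatch.

```java
import java.util.ArrayList;
import java.util.List;

public class MergeDemo {
    // In an erased world the "what element type do I return?" question
    // disappears: at runtime everything is just a list of Object.
    static List<Object> concat(List<?> a, List<?> b) {
        List<Object> out = new ArrayList<>(a);
        out.addAll(b);
        return out;
    }

    public static void main(String[] args) {
        List<Integer> ints = List.of(1, 2);
        List<Double> doubles = List.of(3.0);
        List<Object> merged = concat(ints, doubles);
        System.out.println(merged.size()); // prints 3
    }
}
```

In a reified environment (like the CLR) the same operation has to pick a concrete `List<T>` to instantiate, which is exactly the choice the comment is pointing at.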
This line of argument is disproven by all the dynamic languages that appeared and failed on the CLR.
I would sincerely doubt they're considering breaking back-compat in Java 10 (that is, making it so older bytecode targeting something <10 cannot run on 10+).