I agree - echoing the sentiments of another commenter here, I feel like one of the tenets of Java is backwards compatibility. While the change doesn't affect functionality, it can turn code that previously had O(1) space overhead into code with O(n) overhead, since every `.substring()` call now copies its characters instead of sharing the original array. This is probably a Bad Thing.
Conversely, for people new or somewhat-new to the language, the change probably makes sense from a principle of least surprise. From the start, you're taught that Strings are immutable objects, so you probably understand that `.substring()` produces a new String instance. Not having the original memory freed when you remove all references to the original string would likely be puzzling at first.
In this respect, the Java/Oracle folks likely decided that the "parsing/tokenization" use case (where you make lots of substrings from a large original string, so sharing the underlying character array pays off) was rarer than the use case of "just pulling a small substring from a much larger one and then discarding the large one," and optimized for the latter.
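To make the trade-off concrete, here's a minimal sketch of the second use case. The class name and the use of `String.repeat` (Java 11+) are my own choices for illustration; before Java 7u6, the small substring below would have shared the large string's backing `char[]`, keeping roughly a megabyte alive, and the `new String(...)` idiom was the standard workaround to force a copy:

```java
public class SubstringDemo {
    public static void main(String[] args) {
        // Stand-in for a large input, e.g. a whole file read into memory.
        String large = "x".repeat(1_000_000) + "needle";

        // Pre-7u6: 'small' shared large's char[], so dropping 'large' did not
        // free the ~1 MB of characters. Post-7u6: substring() copies, and only
        // the six retained characters survive garbage collection.
        String small = large.substring(large.length() - 6);

        // Classic pre-7u6 workaround: the String(String) constructor trimmed
        // the copy down to just the characters the substring actually used.
        String detached = new String(small);

        System.out.println(small);    // "needle"
        System.out.println(detached.equals(small));
    }
}
```

Note that after the 7u6 change the `new String(...)` copy is redundant, which is exactly why code written around the old behavior now pays the copying cost twice.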