However, it's not quite unequivocal. Windows still uses UTF-16 in the kernel (or, strictly, an array of 16-bit integers, but UTF-16 is a very strong convention). The code page mechanism often lets the Win32 API perform the conversion back and forth (via the "A" function variants) instead of your application doing it.
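To make that concrete, here is a minimal sketch from the .NET side, assuming a Windows desktop environment: the "A" entry point of a Win32 function accepts text in the active ANSI code page (the marshaller does the conversion for you), while the "W" entry point takes UTF-16 directly, matching the kernel's representation. `MessageBoxA`/`MessageBoxW` and `GetACP` are real Win32 functions; the wrapper class name is just for illustration.

```csharp
using System;
using System.Runtime.InteropServices;

class CodePageDemo
{
    // "A" entry point: expects strings in the active ANSI code page;
    // the interop marshaller converts .NET's UTF-16 strings on the way in.
    [DllImport("user32.dll", EntryPoint = "MessageBoxA", CharSet = CharSet.Ansi)]
    static extern int MessageBoxA(IntPtr hWnd, string text, string caption, uint type);

    // "W" entry point: takes UTF-16 directly, no conversion needed.
    [DllImport("user32.dll", EntryPoint = "MessageBoxW", CharSet = CharSet.Unicode)]
    static extern int MessageBoxW(IntPtr hWnd, string text, string caption, uint type);

    [DllImport("kernel32.dll")]
    static extern uint GetACP();   // reports the active ANSI code page

    static void Main()
    {
        Console.WriteLine($"Active ANSI code page: {GetACP()}");

        // Same .NET string: the A call is converted to the code page by the API layer,
        // the W call passes the UTF-16 data through unchanged.
        MessageBoxA(IntPtr.Zero, "héllo", "ANSI entry point", 0);
        MessageBoxW(IntPtr.Zero, "héllo", "Unicode entry point", 0);
    }
}
```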
AFAICT, it's not only "internal representation". .NET strings are defined as a sequence of UTF-16 code units, including the Char type, which represents a single UTF-16 code unit. I can't imagine how such a change could be implemented (other than changing the internal representation but converting on every access, which would be nonsense, I think).
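You can see that the UTF-16 code unit is baked into the public contract, not just the internals, with a short example (the class name here is only for illustration; `char.IsHighSurrogate`, `char.IsLowSurrogate`, and `char.ConvertToUtf32` are standard .NET APIs):

```csharp
using System;

class Utf16CodeUnits
{
    static void Main()
    {
        // U+1D11E (MUSICAL SYMBOL G CLEF) lies outside the Basic Multilingual Plane,
        // so it occupies a surrogate pair: two UTF-16 code units.
        string clef = "\U0001D11E";

        Console.WriteLine(clef.Length);                    // 2 — Length counts UTF-16 code units, not characters
        Console.WriteLine(char.IsHighSurrogate(clef[0]));  // True
        Console.WriteLine(char.IsLowSurrogate(clef[1]));   // True
        Console.WriteLine(char.ConvertToUtf32(clef, 0));   // 119070 (0x1D11E)
    }
}
```

Any change to the underlying representation would have to keep `Length`, the indexer, and `Char` behaving exactly like this, which is why converting on every access is the only (unreasonable) way out.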