Technically, it's probably GarageBand that's wrong here, but I've changed AutoMIDIFlip so that GarageBand should be happy.
What was happening is that the original file is a Format 0 MIDI file. Internally, a Format 0 file only ever has one track, so all the instruments have to go into one track. That's okay because in MIDI, the channel numbers are stored per-note, so it still manages to work out. Each channel can only play one instrument at a time, though you can play multiple notes at once.
It seems that when GarageBand opens a Format 0 file, it automatically splits out that one track into (I'm guessing) 16 tracks, one for each channel. It also sounds like GarageBand makes no distinction between tracks and channels, so you can only have one instrument per track.
Prior to the change I just made, AutoMIDIFlip would always set the format field in the header to 1, so that it would output a Format 1 file; this was necessary so that it could add the three empty attribution tracks to the file. Everything else should have worked as before, but when GarageBand opens a Format 1 file, it will keep the layout of the tracks as specified in the MIDI file rather than splitting them into channels.
Because of this, and because GarageBand doesn't have the distinction between tracks and channels, GarageBand was losing the instrumentation data that was in the original MIDI file.
I've changed AutoMIDIFlip now so that it doesn't set the format field differently; if you give it a Format 0 file, you'll get a Format 0 file back. Because you can't add more tracks in that kind of file, it will append a short attribution - "(auto-flipped with automidiflip.com)" - to the title of the song instead.
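For anyone curious where that format field actually lives: it's in the MThd header chunk at the start of the file, as a big-endian 16-bit value. Here's a minimal sketch (my own illustration, not AutoMIDIFlip's code) of reading it:

```python
import struct

def midi_format(data: bytes) -> int:
    """Read the format field (0, 1, or 2) from a Standard MIDI File header."""
    # MThd chunk: 4-byte ID, 4-byte length (always 6), then three
    # big-endian 16-bit fields: format, number of tracks, division.
    assert data[:4] == b"MThd", "not a Standard MIDI File"
    fmt, ntracks, division = struct.unpack(">HHH", data[8:14])
    return fmt

# A minimal Format 0 header: one track, 480 ticks per quarter note.
header = b"MThd" + struct.pack(">IHHH", 6, 0, 1, 480)
# midi_format(header) → 0
```

So the earlier behaviour amounted to flipping those two bytes from 0 to 1 while appending tracks; the fix leaves them alone for Format 0 input.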
Again, thanks for the report!
Do you know why it's possible to have different channels on the same track in MIDI? It's kind of cool to save channel information per-note, but do people really use this to have multiple instruments on one track?
Like GarageBand, it seems most DAWs associate a channel with a track rather than with individual notes. What was the use case originally?
It didn't take long for multi-output MIDI interfaces to appear, but until they did, everything went out of a single port, and the software was written to suit: the channel setting on each track was used to select which playback device responded.
Technically, the single line went through multiple MIDI Thru connectors, but it was still the same stream of information.
Events in a track in a Standard MIDI File are stored more or less the same way they'd be sent down the wire, and that includes the channel information. The sender doesn't need to have any concept of channels in order to send the events stored in an SMF.
(Yes, this does mean that you can have notes on different tracks that are on the same channel. That's actually useful sometimes when using a DAW, especially for solos and drum parts.)
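To make that concrete, here's a toy example (ignoring delta times and running status, so not a real SMF parser): each channel-voice event carries its channel in the low nibble of its status byte, which is why one track can freely mix channels.

```python
# Hypothetical event list for a single SMF track: (status, data1, data2).
track_events = [
    (0x90, 60, 100),  # Note On, channel 0 (a melody part, say)
    (0x99, 38, 100),  # Note On, channel 9 (drums, by convention)
]

for status, key, velocity in track_events:
    command = status >> 4      # 0x9 = Note On
    channel = status & 0x0F    # 0..15, stored per-event
    print(f"command={command:#x} channel={channel} key={key} velocity={velocity}")
```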
MIDI messages tend to be 3 bytes. For example, a "Note On" message is 0x9Z 0xKK 0xVV, where 9 is the "Note On" command, Z is the MIDI channel (since it's 4 bits, you can have 16 channels), KK is the key/note (7 bits, so 128 values) and VV is the velocity (likewise 128 possible values).
Excluding system messages and the Channel Pressure command, which is only 2 bytes.
The most significant bit is always 0 for data bytes. If it's 1 (as in the first byte of the message), the byte is treated as a command (status) byte.
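Putting those rules together, here's a small sketch (my own, just for illustration) that validates and decodes a Note On message byte by byte:

```python
def parse_note_on(msg: bytes):
    """Decode a 3-byte Note On message into (channel, key, velocity)."""
    status, key, velocity = msg
    # The status byte has its top bit set; data bytes never do.
    assert status & 0x80, "first byte must be a status byte"
    assert not (key & 0x80) and not (velocity & 0x80), "data bytes are 7-bit"
    # High nibble 0x9 = Note On; low nibble = channel (0..15).
    assert status >> 4 == 0x9, "not a Note On message"
    return status & 0x0F, key, velocity

# parse_note_on(bytes([0x93, 0x3C, 0x64])) → (3, 60, 100)
```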