It's interesting that when we think "post-touch" we still think about files and folders ("move the folder where I'm looking"). Maybe we won't need to think about organization at all -- the computer can handle that for us and just play the song we want, pull up the project we want, play the video we want.
Look at the terrible touch-driven applications and you'll probably find that a goodly number of them assume the same interaction paradigm you'd find with a keyboard and mouse. The excellent touch applications make good use of the facts that you have more than one finger, that you can perform gestures with them, that fingers get in the way of things on screen, and that you don't want to tap multiple times to make something happen (whether that's finding a file or anything else).
It'll take a shift in thinking to make post-touch effective (whether that's gesture, vision tracking, thought, etc.). If we think of a computer as a desktop with folders on it, or as an 80x24 terminal, then we're looking at it the wrong way (an extension of the "if you see a stylus, they blew it" principle, I guess).
Well, in this case “folder” could be substituted with a generic “item”, I guess. We still have items: songs, projects, videos. And to let the computer know which items we want, we somehow have to organize them. This is where experimentation happens in UI, but it's all based on “moving” items between folders (playlists, etc.). It's familiar to us from the real world, which makes it convenient and easy to learn.
There are, though, UIs where item organization can happen without any user input, like Genius playlists in Apple's iTunes. This might be the future, I guess.
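To make that last idea concrete, here's a minimal sketch of how a Genius-style feature might pick "similar" items automatically, with no folders or manual playlists involved. Everything here is an assumption for illustration: the song names, the idea of representing each song as a small feature vector (tempo, energy, valence), and nearest-neighbor ranking as the similarity method -- real systems like Genius are far more sophisticated.

```python
import math

# Hypothetical example: each song is a made-up feature vector
# (tempo, energy, valence), roughly normalized to 0..1.
library = {
    "Upbeat A": (0.9, 0.8, 0.9),
    "Upbeat B": (0.85, 0.75, 0.8),
    "Ballad A": (0.3, 0.2, 0.4),
    "Ballad B": (0.25, 0.3, 0.35),
}

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def auto_playlist(seed, library, size=3):
    """Return the `size` songs closest to the seed song --
    no manual organization by the user required."""
    ranked = sorted(
        (title for title in library if title != seed),
        key=lambda title: distance(library[seed], library[title]),
    )
    return ranked[:size]

print(auto_playlist("Upbeat A", library, size=2))  # → ['Upbeat B', 'Ballad A']
```

The point isn't the math; it's that the user only expresses intent ("more like this song") and the computer does the organizing.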