Not really. Think of the experiment as a very, very high-speed camera. They can't store every frame, so they try to capture just the "interesting" ones. They also store some random ones that can be used later as controls, or in case they realize they've missed something. That's the whole job of these various layers of algorithms: recognizing interesting frames. Sometimes a new experiment basically just changes the definition of "interesting".
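Roughly, the selection logic looks something like this toy sketch (not real LHC code — the thresholds, field names, and prescale value are all made up for illustration): a cheap first-level cut, a more expensive high-level check, and a random 1-in-N "prescaled" passthrough that keeps an unbiased control sample.

```python
import random

def l1_trigger(event):
    # Level-1 analogue: a cheap cut, e.g. total deposited energy
    return event["energy"] > 50.0

def hlt_trigger(event):
    # High-level analogue: a costlier "interestingness" check
    return event["n_muons"] >= 2

def select_events(events, prescale=1000, rng=None):
    """Keep events passing both trigger levels, plus a random
    1-in-`prescale` sample of everything else as a control."""
    rng = rng or random.Random(0)
    kept = []
    for ev in events:
        if l1_trigger(ev) and hlt_trigger(ev):
            kept.append(("signal", ev))
        elif rng.randrange(prescale) == 0:
            # randomly retained event, usable later as a control
            kept.append(("control", ev))
    return kept
```

The point of the random branch is exactly the "in case they missed something" part: because it's selected independently of any physics criterion, it can later be used to check whether the "interesting" definition was biased.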
The fact that we're building theories on what survives of mostly-discarded data has scientific implications. Most people who hear that the LHC proved something probably don't realize that a preprocessor threw away the vast majority of observations first. That layer of interpretation could introduce errors.
I wonder how much independent review went into that step.