Hmm, discrete only? So you can't take averages, state the law of large numbers or the central limit theorem, discuss convergence or completeness, do linear regression, etc.
You can't even multiply a random variable by a real number. Hmm ....
You'll be in trouble as soon as you consider a sequence of independent random variables, say, as in coin flipping.
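For what it's worth, the coin-flipping case is easy to see numerically. Here is a minimal Python sketch (the function name `running_average` is mine, not from the OP) of the law of large numbers for fair coin flips: the average number of heads converges to 1/2 as the number of flips grows.

```python
import random

random.seed(0)  # fixed seed so runs are reproducible

def running_average(n):
    """Flip a fair coin n times; return the fraction of heads."""
    heads = 0
    for _ in range(n):
        heads += random.randint(0, 1)  # 1 = heads, 0 = tails
    return heads / n

# By the (strong) law of large numbers, these averages
# converge to 1/2 as n grows.
for n in (100, 10_000, 1_000_000):
    print(n, running_average(n))
```

Making "converges" precise (almost surely, in probability, etc.) is exactly where the measure-theoretic machinery earns its keep.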
Right. And if you don't want to get involved in what really are the measure-theory foundations of probability, then fine. Nearly all of statistics has been done this way.
But if you do try to give the measure-theory foundations, as the OP did, then at least don't make a mess of it.
If you don't want to get the measure theory right, then, sure, just leave out the foundations and start with events and random variables. The measure-theory foundations are so powerful, so robust, and so general that in practice it's tough to get into trouble. One place you can get into trouble: in stochastic processes, take the union of uncountably infinitely many points, call that an event, and ask for its probability. Okay, then, don't do that.
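A one-line sketch of why that uncountable union causes trouble, using the standard example of a uniform random variable on $[0,1]$:

$$P(\{x\}) = 0 \ \text{for every } x \in [0,1], \qquad \text{yet} \qquad P\Big(\bigcup_{x \in [0,1]} \{x\}\Big) = P([0,1]) = 1.$$

Countable additivity lets you add probabilities over countably many disjoint events, but no additivity rule can extend to arbitrary uncountable unions: here you would be "summing" uncountably many zeros and getting 1.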