Sorry, but there are multiple reports pointing to pilot confusion. What you're presenting is exactly what Boeing leadership presented: an oversimplification of the problem, leading to a poor understanding of the risk and of the necessary mitigations.
MCAS had a built-in delay; it wasn't a continuous command. It would push the nose down, then periodically disengage and allow pilots to trim the nose back up before re-engaging. That kind of intermittent feedback is difficult to resolve in real time, especially for an untrained pilot under stress.
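To make the ratcheting effect concrete, here's a toy model of that intermittent cycle. This is not Boeing's actual control law; the function name, increments, and units are all illustrative assumptions. The point is only that when the automated burst exceeds the pilot's recovery during each pause, trim walks steadily nose-down even though the pilot "fights back" every cycle.

```python
# Toy model (NOT real MCAS logic): an automated system applies nose-down
# trim in bursts, pauses, then re-triggers while the faulty sensor still
# reads a high angle of attack. All numbers are illustrative.

def simulate(cycles, auto_increment=2.5, pilot_recovery=1.5):
    """Return the trim history; negative = nose-down. Illustrative units."""
    trim = 0.0
    history = []
    for _ in range(cycles):
        trim -= auto_increment   # automated burst of nose-down trim
        trim += pilot_recovery   # pilot trims back during the pause,
                                 # but reacts late / under-corrects
        history.append(trim)
    return history

history = simulate(5)
# Each cycle the pilot recovers less than the system commands,
# so the trim ratchets steadily nose-down: [-1.0, -2.0, ..., -5.0]
```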
But to the larger point, what you're relating actually bolsters my point. The design philosophy was poor: they didn't fully understand the interaction effects of the system (including software, hardware, people, and the environment). With that simplified mental model, they thought software was an easy fix to their problem, and they didn't follow through with the necessary risk mitigations. Those include:

- feeding a redundant AOA sensor into MCAS (which their own hazard analysis already required),
- properly characterizing MCAS as having the potential to cause a 'catastrophic' mishap,
- training their pilots (which they deemed unnecessary because it was the 'same' airframe, despite different handling characteristics), and
- an appropriate understanding of the human factors that govern its use.
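The redundant-sensor mitigation can be sketched in a few lines. This is a hypothetical illustration of the principle, not Boeing's design: the function name, disagreement threshold, and AOA limits are all made-up values. The idea is simply that the automation cross-checks two independent sensors and stays passive when they disagree, so a single failed sensor can't trigger the nose-down command on its own.

```python
# Hypothetical sketch of the mitigation the hazard analysis implied:
# require two independent AOA sensors and inhibit the automated command
# when they disagree. Threshold and names are assumptions, not Boeing's.

DISAGREE_THRESHOLD_DEG = 5.5  # illustrative disagreement limit

def mcas_may_activate(aoa_left_deg, aoa_right_deg, stall_aoa_deg=14.0):
    """Activate only if both sensors agree AND both indicate high AOA."""
    if abs(aoa_left_deg - aoa_right_deg) > DISAGREE_THRESHOLD_DEG:
        return False  # sensors disagree: distrust the data, stay passive
    return min(aoa_left_deg, aoa_right_deg) > stall_aoa_deg

# One failed sensor reading 22 deg while the other reads 4 deg
# no longer triggers the nose-down command:
mcas_may_activate(22.0, 4.0)   # -> False (disagreement inhibits MCAS)
mcas_may_activate(15.0, 15.5)  # -> True  (both agree on high AOA)
```

With only a single sensor feeding the system, that first call has nothing to cross-check against, which is precisely the gap the hazard analysis was supposed to close.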
If we erroneously simplify our mental model and claim it's "just software" and an "easy fix," we miss all of that.