Conventional programming languages are one answer. They associate programs with text. Some believe there is another way, by associating programs with diagrams. A more abstract example: machine learning associates programs with parameters and weights.
In some weird way, I feel these are all skeuomorphisms. We choose text because that's how we comprehend literature. We choose diagrams because we are visual. We choose ML because we mimic how our brains work.
We don't, however, try to understand what "thought" is, and work backwards to form a representation of it.
For example, consider thinking of programs textually. Text assumes a beginning, an end, and an ordered sequence between them. But in this small programming example, is there a well-defined ordering?
a = 0;
b = 1;
c = a + b;
Since the first and second lines can be switched without changing the result, in some sense Text itself does not do Thought justice.
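One way to make the switching argument precise: the data dependencies between statements form a partial order, not the total order that text imposes. A small sketch (the statement encoding and helper names are my own, just for illustration) enumerates every ordering of the three lines and keeps the ones where each statement's reads have already been written:

```python
from itertools import permutations

# Each statement maps to (variables it reads, variable it writes).
stmts = {
    "a = 0": (set(), "a"),
    "b = 1": (set(), "b"),
    "c = a + b": ({"a", "b"}, "c"),
}

def is_valid(order):
    """An ordering is valid if no statement reads a variable before it's written."""
    written = set()
    for s in order:
        reads, writes = stmts[s]
        if not reads <= written:
            return False
        written.add(writes)
    return True

valid = [order for order in permutations(stmts) if is_valid(order)]
for order in valid:
    print(order)
```

Of the six possible orderings, exactly two are valid: `a = 0` and `b = 1` may appear in either order, but `c = a + b` must come last. Text forces us to pick one of these linearizations arbitrarily.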
Visual representations like the one in this video also have their shortcomings. The most obvious is that a monitor is 2D, and Thought is not 2D. Going to 3D won't help either: Thought is not only spatial and temporal. For example, how would you represent concatenating strings visually?
I think the more interesting question is: how can we accurately represent thought?