

Touch-based WP7 programming environment and language by Microsoft Research - lukencode
http://research.microsoft.com/en-us/projects/touchstudio/

======
6ren
It's nice that it has built-in touch and cloud support, and it's nice that you
can code on-the-unit, using the touch interface. But it seems even less visual
than existing visual languages (which are mostly academic/teaching
experiments; I don't mean VB - though its form creator comes close).

Could there be an entirely different approach to programming, built around the
touch-screen from the ground up? Using its strengths (e.g. analog 2D input)
and skipping its weaknesses (e.g. no digital symbols from keystrokes) -
instead of trying to cram the old ideas into the new? Let it grow in its own
way, I say.

A starting point might be visual form creation; lines for bindings, for
transitions, for calls; multiple fingers for looping. Or, perhaps even better,
to be like the _diagrams_ we sketch to represent coding ideas! Instead of
adapting our most natural and intuitive expression (diagrams) to coding in
text, why not work directly with diagrams, now that we finally have the
appropriate tools?

Diagrams aren't suitable for all coding tasks, but I recommend fitting a
technology to the tasks where it naturally excels.

 _disclaimer_ I haven't played with the tech (no WP7), just going by the
webpage. Lest I offend with my ridiculously idealistic demands (which of us
has created a fundamentally new approach to coding? Not I): this really shows
that MS is pro-developer (while Apple is pro-consumer), and just _starting_ to
use multi-touch as a programming fundamental is the most important step in
this quest.

~~~
mks
I am a visual programming skeptic (or, better said, a pragmatist). I've seen
quite a few visual environments where, at the end of the day (once you'd
learned the language), it was faster to switch to text mode. Visual
programming was great for learning the language, though.

The main problem visual programming encounters is the sheer number of options
you have when you are entering code in text form.

Just consider that on almost any new line of code you can type a control
structure, a variable declaration, an assignment, a function/method call, etc.
And it is almost always faster to type "if" or "for" than to click through
some decision tree to get to the command you need.

In Touchqode we try to overcome this limitation by using a prefix tree when
showing code suggestions. E.g. when you have the methods showTime(),
showDate() and showLocation(), you can type "s", press "tab", and you see the
longest common prefix of the methods starting with "s" - so you would see
"show". Then you can click it and see the actual method names.
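That "tab" step can be sketched in a few lines of Python. This is just an illustration of the behavior described above, not Touchqode's actual code; a flat list of candidates stands in for a real prefix tree, and the method names are the ones from the example:

```python
import os

def complete(typed, candidates):
    """Return the longest common prefix shared by all candidates
    that start with `typed` (one "tab" press in the example above)."""
    matches = [c for c in candidates if c.startswith(typed)]
    if not matches:
        return typed  # nothing matches; leave the input unchanged
    return os.path.commonprefix(matches)

methods = ["showTime()", "showDate()", "showLocation()"]
print(complete("s", methods))  # -> show
```

A real prefix tree gives the same answer in one node walk instead of a scan, but for a handful of visible suggestions the scan is equivalent.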

Needless to say, templates for structured commands (e.g. "if") are probably
necessary for programming on mobile devices.

I have a few experimental features for touch programming in mind, so I hope
I'll get to implementing them sooner rather than later. If anyone has
interesting ideas, I'll be glad to discuss them.

~~~
6ren
I agree; yet I do think there is likely a subset of coding tasks suitable for
a visual approach (especially if a multi-touch screen gives other benefits
specific to the task). The only thing missing is knowing what those tasks
are...

1) One way to address the choice explosion is (highly) domain-specific
languages, with limited choices. I guess this would be closer to
scripting/macros than true programming, but still useful.

2) Could the number of options be constrained in some other way? Or perhaps
chosen in a way that's more intuitive to touch, such as pointing at a
graphical representation (of something that naturally has one)? OTOH, while
that's good for nouns, it's less good for verbs. What about gestures for verbs
(like sign language)? Maybe not for the specific choice, but to narrow it down
to a family of choices?

3) And the bizarre idea of using the strength of multi-touch to make choices,
like a chording keyboard.

Is a keyboard really intrinsically better for coding than pen and paper?
(maybe - there's linguistic power in symbols lacking in pictures).

BTW: A prefix tree is useful, but not specialized to multi-touch...

~~~
mks
Just two ideas off the cuff:

\- have your methods and variables put into "categories". Then you could
quickly choose a category and be presented with a limited set of choices
within it

\- code snippets: templated pieces of code (control structures, database
access...) where you would just choose the variables/methods to fill in the
template
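The snippet idea can be sketched in Python. The snippet names and hole structure here are made up for illustration; the point is that the user only ever picks identifiers, never types syntax:

```python
from string import Template

# Hypothetical snippet table: each template has named holes
# ($cond, $var, ...) that the user fills by choosing identifiers.
SNIPPETS = {
    "if": Template("if $cond:\n    $body"),
    "for": Template("for $var in $seq:\n    $body"),
}

def fill(name, **choices):
    """Expand the named snippet with the user's chosen identifiers."""
    return SNIPPETS[name].substitute(choices)

print(fill("for", var="item", seq="items", body="process(item)"))
# for item in items:
#     process(item)
```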

~~~
wladimir
Good ideas. Basically, you want to make the language as high-level as
possible.

Everything that can be generated or inferred from a high-level description
should be, for example using templates or snippets. Maybe the whole text-based
idea of programming needs to be side-stepped.

For example, represent the program as a kind of (multilevel) graph. After all,
graphs are easier to manipulate with a touch interface than text...

Interesting stuff.

------
jmah
Might someone with a WP7 phone be kind enough to make a video of this? I'm
super curious.

------
mks
It has some nice UI ideas - I particularly like the calculator-like editor.

Environments with limited text-input ability get a significant advantage from
statically (or at least strongly) typed languages - code autocomplete is a
real killer weapon when editing text.

As a side note, something similar can be achieved on Android phones using
Touchqode [1] and the Scripting Layer for Android [2]. It does not have a
specific language, but you can use any of the supported ones: Python, Ruby,
BeanShell (Java), Rhino (JavaScript).

Disclaimer: I work on touchqode

[1] <http://www.touchqode.com> [2] <http://code.google.com/p/android-scripting/>

------
underwater
Duck Duck Go gets a bit of love in the sample applications. There is a sample
app that talks to their API.

Interesting to see how they've shoehorned an editor into the touch-screen
interface. As an end user it's nice to have a little more control over my
device.

------
pavlov
The editing UI with the keywords as a calculator-style interface in the bottom
half of the screen is an interesting '80s flashback. It reminds me of
LucasArts-style adventure games with a grid of action verbs below the game
view.

Also, those home computer keyboards where each key had both a letter and a
BASIC keyword printed on it:

[http://ewyse.files.wordpress.com/2011/03/acornelectron_top.j...](http://ewyse.files.wordpress.com/2011/03/acornelectron_top.jpeg)

That was a pretty clever way to make the programming language quick reference
available at a glance.

------
abtinf
This probably dates me and marks me as unhip, but...

When I first read the headline, I thought "Why would anyone add a touch
interface to WordPerfect 7?"

------
zdw
First thought: Wow, the syntax looks like AppleScript.

Looks quite compelling though - hopefully they'll tie in the PIM and GPS
features. Imagine being able to ask your phone how long you spent physically
at work during the week, or who the top 5 people you emailed in the last month
were.
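That "top 5 people you emailed" query is just a count over recent sent mail. A hedged Python sketch, assuming the phone exposed sent messages as plain (recipient, timestamp) pairs (no such API is described on the project page; the record shape is made up):

```python
from collections import Counter
from datetime import datetime, timedelta

def top_contacts(sent_mail, days=30, n=5):
    """sent_mail: list of (recipient, timestamp) pairs for sent messages.
    Returns the n most-emailed recipients within the last `days` days."""
    cutoff = datetime.now() - timedelta(days=days)
    recent = [to for to, when in sent_mail if when >= cutoff]
    return Counter(recent).most_common(n)
```

Given three recent mails to "alice" and two to "bob", `top_contacts` would return `[("alice", 3), ("bob", 2)]`.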

------
josch
it's probably not there yet: <http://www.ycombinator.com/rfs5.html>

