You spend all this time ramping up on a language, toolchain, library, etc., that you will eventually be unable to leverage beyond a certain point, since it is not one used by everyday programmers. The delta in "quality learning" between a teaching language and a non-teaching language would have to be astronomically high, imho, to offset the difference in capability and the opportunity cost of learning a skill applicable to a wider problem set. Teaching something that is just a slightly better Python or a slightly better Clojure for the purpose of illustrating computer science concepts seems like a disservice to students, vs the "real thing", which they could carry for the rest of their lives into future projects ad infinitum.
This does not preclude Pyret from being a useful language for general-purpose programming; it just means that language design decisions are driven foremost by pedagogy. For example, Pyret trades performance for mathematical familiarity by using exact representations of rational numbers as its primary number type. Languages like C, Java, Python, and Rust are all distinguished by different priorities that have guided their design, too.
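Python can opt in to something similar with the stdlib `fractions` module, though unlike Pyret it is not the default number type; a minimal sketch of the difference:

```python
from fractions import Fraction

# Binary floating point cannot represent 0.1 or 0.2 exactly,
# so simple school arithmetic produces surprising results:
float_sum = 0.1 + 0.2
print(float_sum == 0.3)               # False: float_sum is 0.30000000000000004

# Exact rationals behave the way math class says they should:
exact_sum = Fraction(1, 10) + Fraction(2, 10)
print(exact_sum == Fraction(3, 10))   # True: exactly 3/10
```

The pedagogic point is that the exact behavior is the default in Pyret, whereas in Python a beginner has to know to reach for `Fraction`.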
I disagree with your characterization of any language as being the "real thing", because it privileges some languages above others, but let's run with it for a moment: a systems programmer might call Rust or C the "real thing", in contrast to Python, and perhaps they would even consider the delta "quality performance". A sysadmin might call Python the "real thing" in contrast to C, and consider the delta "quality productivity". Assuming learning one programming language does not preclude you from learning another, why shouldn't a computer science education call Pyret the "real thing", and ask for justification of the "delta learning" of moving to a language that isn't designed with pedagogic interests in mind?
(Disclosure: I am an occasional developer of Pyret, and this comment might not represent the views of the rest of the team. If skrishnamurthi or jpolitz replies, read their comment instead!)
You have drawn privileged lines of your own across programming languages by suggesting that languages designed for education are better in the education context, and that languages designed for engineering are less fit for education. Do you think there is a pedagogical outcome difference arising from programming language difference?
So how much is that difference? How does one demonstrate the additional pedagogical value? Conversely, how does one demonstrate the relative disadvantages of using Python in education? Is the difference from Python -> Pyret an interesting barrier to students of computer science?
Meanwhile, the argument about the cost of maintaining tools and environments, and the argument about the inertia of moving to a completely different ecosystem, both stand unchallenged.
PS: What pedagogical literature guides the pedagogically-focused design of this language? I couldn't find any on the website.
2. Python does not offer a neat integrated testing story.
3. Python does not have a type-like annotation mechanism along with a type-checker and a type system that is not extremely complex. For some people who want to teach introductory programming, this really matters.
4. Python does not have a clean functional event-loop construct, which is central to the pedagogy we use in many applications of Pyret.
5. Python does not offer a typed interface to external data. Pyret's table support is designed with types in mind, but you can also use it just fine without types. [Quick table demo: https://twitter.com/PyretLang/status/773605473824145408]
6. Pyret runs entirely in the browser. With a Stop button in the IDE. There's tremendous complexity involved in making that run. Just about no other language does that in the browser. That's a really valuable feature to a beginner whose programs may go haywire.
That's not all, but that's a start.
I've talked to a super-smart Hack Reactor graduate who was blown away by the concept of static typing when I explained it to him in detail; it was simply irrelevant to his understanding of functional programming, data structures, and algorithms, subjects with which he was familiar at quite a high level.
On 2), what's missing in Python beyond the unittest package?
On 4) and 5), are these things that you introduce early in your curriculum? They seem like quite niche aspects of programming. I've found that most Python programmers haven't had cause to use an event loop in production, for example; only really if you're implementing a web server or networking stack, something that frameworks usually abstract away from you.
Because programmers are always aware of types. You may not annotate that the function's `licensePlate` parameter is a string, but if you are thinking of taking a substring or string-appending it, of course you're aware of the type.
If a beginner is only vaguely/intuitively thinking that the licensePlate is a string (sometimes confusedly thinking it's a number), then there'll be problems. So allowing a beginner to mention a type is important.
So the question is more: when learning to program, do you require every name to be annotated with a type? Some people feel this is overkill, and others think it helps, for CS1. And languages with implicit conversion-rules can confuse some beginners, while converting explicitly can annoy others.
Once somebody has internalized the rules, then yes, of course you want the language to make "obvious" conversions and/or declarations for you (e.g. true-ish expressions, and (for Java) auto-boxing and auto-toString'ing). But just because experienced programmers find it indispensable doesn't mean it's appropriate for beginners.
(Btw, in addition to learner-vs-competent programmer, there is a separate issue, about the strength of the students: some people pick up programming-way-of-thinking easily, and others need MUCH more guidance before they become competent. Different language-choices might be appropriate for those two audiences, as well.)
Furthermore, we use types as a central notion in our program design curriculum. The structure of the data (aka, "types") provides a hint as to the structure of the program. This helps students who are stuck get unstuck. That is, they are most helpful to the students least able to get started, which seems like quite a valuable place to focus.
Of course, having annotations and enforcing them statically are two different things. That's why Pyret lets you check annotations dynamically instead. And that makes it even simpler and more intuitive to introduce the concept.
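As a rough analogy (this is not how Pyret is implemented), dynamic checking of annotations can be sketched in Python with a small decorator; `check_annotations` and `shorten` are invented names for illustration:

```python
import inspect

def check_annotations(fn):
    """Hypothetical decorator: enforce parameter annotations at call time."""
    sig = inspect.signature(fn)
    def wrapper(*args, **kwargs):
        bound = sig.bind(*args, **kwargs)
        for name, value in bound.arguments.items():
            ann = fn.__annotations__.get(name)
            # Only check simple class annotations (str, int, ...):
            if isinstance(ann, type) and not isinstance(value, ann):
                raise TypeError(
                    f"{name} must be {ann.__name__}, got {type(value).__name__}")
        return fn(*args, **kwargs)
    return wrapper

@check_annotations
def shorten(license_plate: str) -> str:
    return license_plate[:3]

print(shorten("ABC123"))  # fine: the annotation holds at run time
# shorten(42) would raise TypeError here, at the call, not at compile time
```

The beginner-facing idea is the same: the annotation is a statement about the data that gets enforced when the program runs, without needing a full static type system up front.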
2. Direct integration into the syntax of the language makes a huge difference. There's no library and additional complexity. For rank beginners, every new "moving part" is a source of problems. This is why most Python curricula do not emphasize writing tests early. In our curricula, students begin with writing examples of their program's behavior — expressed as tests — and use that to help figure out the way the program must work. (Just like you use examples to help figure out a more general structure in other disciplines.)
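For comparison, here is roughly the minimal shape of the same example-first workflow in Python's `unittest`: an import, a class, a method, and a runner all arrive before the first example (`double` is an invented stand-in function):

```python
import unittest

def double(n):
    return n * 2

class TestDouble(unittest.TestCase):   # a class, only to hold the examples
    def test_examples(self):           # a method, only to run them
        self.assertEqual(double(2), 4)
        self.assertEqual(double(0), 0)

if __name__ == "__main__":
    # exit=False so running this as a script doesn't call sys.exit()
    unittest.main(exit=False)
```

Each of those pieces (imports, classes, methods, the runner) is a "moving part" in the sense above, whereas Pyret attaches the examples to the function itself in a `where:` block.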
4. Event loops are central to reactive programming, i.e., animations and games. HtDP 2nd edition begins with writing an animation. The Bootstrap curriculum is focused around teaching students to write a game — but in the process learn algebra. It's niche to you because you have a particular model of programming education, which may well resemble what people were teaching in the 80s. That doesn't mean curricula haven't moved on. And we're talking about education, not production, which are two different things.
5. Combine the earlier points for this one.
Whenever making "I haven't seen", "I don't see", etc. comments, it may be helpful to keep in mind Paul Graham's Blub Paradox [http://www.paulgraham.com/avg.html].
I understand people have difficulty with pointers and such, but not types. Without that knowledge you can't go anywhere.
From my point of view there is absolutely no advantage in hiding types when you always end up using them. Better to face the fact and teach them early on.
Once a student has core concepts, then they have something they can build on with other languages that have more issues (potentially complex) stemming from practical problems. This prevents needing to learn two complex things at the same time.
Does Pyret have easier-to-understand errors around syntax and typographical mistakes than Python? Most real languages won't search for identifiers similar to a misspelled one and make suggestions; does Pyret do anything like this? (Other teaching tools do.)
The value of an education-oriented programming language is a boost to X education outcome relative to competing alternatives, and that "avoiding specific complexities" would be the mechanism by which to achieve your outcome.
(Pyret intervention) -> (educators don't struggle with language complexities) -> (education outcome)
But first I want to know that there's even a difference between (Pyret intervention) and (Python intervention) on (education outcome).
Let me provide an example of a language complexity that hinders education, presuming education cares about things common to many languages, like functions, types, control flow, classes, and objects.
In C++ there is a strong separation of declaration and definition; this is required for silly, technical, and mostly historical reasons. Teaching all of the concepts needed to make sense of why the compiler cannot elide definitions included twice is not useful when trying to teach about anything other than C++.
Another example: in Java there is much boilerplate, and an implicit package scope when scope specifiers are omitted. Most languages need little boilerplate, and in order to understand why the implicit package-level specification is bad (or good, if you disagree with me), one must already understand the concepts of public and private.
For each language I know I can highlight such things. There is plenty of room for simplified teaching languages then extra courses to cover language specific things.
None of this should be taken as a general defense of how programming is taught. I just mean that whatever problems there are, seem not to stem directly from the use of education oriented languages. I even think that classes focusing on specific languages should more closely resemble the real world, but not all people are ready for such detail.
You're telling me that:
(Pyret intervention) -> (educators not struggling with complexity) -> (education outcome)
was so bad you think I'm dishonest? And you think that's the "only way" you can interpret the situation? And just in case you're wrong -- sorry?
1. Proposing a causal relationship: (independent variable) -> (dependent variable)
2. Proposing complexity: (independent variable) -> (mediating factors) -> (dependent variable).
This is a fucking good start to any policy conversation. Where else do you find this online?
This discussion clearly has too many factors to be argued by formal proof. We cannot prove our premises nor prove an argument formally sound. Therefore we must rely heavily on induction, as many trades do; both software development and teaching are trades in many regards, each often requiring action without the time or ability to formally prove anything, relying instead on large-scale informal experimentation. The very existence of unit tests demonstrates how clearly practical programming cannot be formally proven, and the existence of the whole fields of psychology and sociology ought to serve as a basis for why teaching is impossible to formally prove. This seems so obvious I hadn't felt the need to state it previously.
I presumed you capable of understanding and knowing this. I inferred that someone choosing to represent the argument in such a way must see that the situation is too complex to break down like this, and must either be misrepresenting the facts or not be competent to discuss this topic. Because I presumed your competence, I felt you must be misrepresenting the situation; I now see I was mistaken.
For example, Fiona Phelps has researched gaze aversion in multiple contexts, one of which is pedagogical, and there she proposes an intervention for educators and discusses outcomes. She is interested in making a causal link between intervention and outcome. It should also be noted that the causal model she discusses is simple, discussable, falsifiable, affirmable, and has few factors and few measurement instances.
Discussing simple causal models is standard professionalism in pedagogical policy literature... or any other empirical field involved in policy.
Is it that scary to say that a Pyret or Python intervention should have a relationship to CS graduation rates, upper division performance, GPA, or some other stated policy outcome?
Not stating your intervention outcomes allows you to go statistical fishing.
One can say the same thing about iPads in schools. It's not wrong to ask what outcome variables the school intends to affect with an iPad intervention. Maybe the effect is there, but we can't begin discussion unless the proposed intervention is willing to state its predicted outcomes.
Why are you *not* pushing for a discussion of outcomes?
Honestly, looking at the examples on the Pyret main page makes me think that this isn't a good approach at all. I have been in software engineering for many years now and worked with quite a few languages, yet I find the code practically incomprehensible, and the language looks very complex.
So likely someone who "learned" using this language will have a lot of trouble actually working with real languages, assuming that he actually managed to learn this language as his first, which is rather unlikely I believe.
I was also involved in CS education, in first-semester programming courses. And the language is really the last problem there (C was used, though I think the dept. is moving to Python for that course). Python especially is a good language overall and is already widely used in CS education.
In short, I don't see why the world needs this, and I actually think that using it would be actively harmful.
Simply looking at text in a different language is not going to impart magical understanding. Things you haven't actually used in a non trivial program will seem complex at the beginning. And sure, you can rationalize that away by treating this like a special case, but it's just a general bias in our brains. And depending on your background, certain languages may seem more complex than others due to simply lack of exposure.
I'd rather propose putting it up to a damn test. Compare the computer science knowledge of outgoing students to other students in similar circumstances who were taught using a practical, heavily-used-in-industry language. Use a short test maybe, along with interviews to get at qualitative unknowns. The results of that could then be used to pick the best strategy to teaching going forward.
But also, in general, why not try to solve the problem better, without pre-supposing all the sunk costs and existing ideas? And not everything needs to be on the freaking altar of industry either. It's not like your college implementation of heap sort is gonna be mainlined in a company and used forever more. What're you optimizing, the couple of weeks it'd take for someone who already knows this to learn python? You already saved it probably by skipping the b.s. until it actually became relevant to learn about.
It's pretty clear that I was specifically referring to the language at hand and not to general idea/concept.
> ... needs this ... using it ...
> yet I find the code practically incomprehensible and the language looks very complex
> I was also involved in CS education [..]
> And the language is really the last problem there
I mean, why isn't it possible to design a language that is both quite fit for education & production? Is Python vs. Pyret really that big a difference in education outcomes?
This is important for professors and TAs, educators, or enthusiast parents who wish to teach their children. All have to master a toolset and environment prior to teaching others, and maintaining the environment is also an ongoing cost. Some of this cost will assuredly be transferred to students, so we want to be sure that a "pedagogical" language without production traction has gains so powerful that they offset the losses.
- has library support for the usual stuff: strings, math, web, database, encoding, etc.
- is written in a lower-level language (like C, Rust, C++), is self-hosted (Go), or is a mix of those (Dlang, Perl6)
- is actually used to build applications, has some sort of community around it (Rust, Perl, Python, Lua etc.) or some backing (Swift, Java, Go)
(Disclaimer: I haven't used Pyret.) But maybe the language allows the developer to work out problems differently. With that in mind, perhaps asserting that the language was built for education [so as to imply that it'll make you a better programmer in general, even in 'bloated or expert' languages] doesn't necessarily shed light on its actual benefits, but I'll check it out later on to find out for certain.
It may also be useful to know that Pyret exists as part of a larger project and community. We're using Pyret to develop curricula for Bootstrap (http://www.bootstrapworld.org/), which integrates computing curricula into existing middle and high school classes, like math and physics.
Part of the idea is to integrate computing into these settings to reach students who may not self-select for computing.
A lot of Pyret's design, like a dead simple animation framework (https://www.pyret.org/docs/latest/reactors.html#%28part._to-...) and interfacing with real spreadsheets (https://github.com/brownplt/pyret-lang/wiki/TM028-Pyret-and-...), supports these low-friction uses of computing in these curricula.
Having these features and concepts baked into the language, supported by the IDE, and easily accessible goes a long way to building something that a non-CS teacher can pick up and use in an existing class.
A lot of us hackers here today cut our teeth on BASIC, even me as a 90's kid. I remember the joy at having automated away rote chemistry or physics homework, and in hindsight realise that writing a working BASIC program made me learn stuff better and deeper and quicker than the homework I avoided.
I have a four-year-old now, and the other day she asked if I could count to a thousand million. I answered "Nope, but I'll show you Mr. Fortran, he can." Give kids a DO loop, some arrays and WRITE statements and they will conquer the world. For learning, it's hard to beat the simplicity of
    if (mod(i, 1000000) == 0) then
      write(*,*) i/1000000, " Million"
    end if
> The docs you link for the "dead simple animation framework" assumes I understand templates; that's hardly dead simple
The idea of "templates" comes with a lot of baggage from C++-land. If we were doing this with C++ templates, I'd agree, because the errors are abysmal and the types are difficult to write.
As an example, we teach students to pick "the type of their reactor's state" as a useful pedagogic step in physics simulations, which is almost always a (maybe nested) tuple of numbers. But writing out the types (the instantiation of "a" in to-draw, on-tick, etc) is a useful and quite concrete activity. Since we have them working with types in their contracts already, they're equipped to think through data definitions that way.
That kind of type-based reasoning is a kind of thinking that isn't just loops and arithmetic; it also needs language and curricular support, and it is something we try to support early and often.
> if you did the same with Pandas in Python...
That said, tables do more than work with just "half-baked" Google spreadsheets. The interface is generic to support other formats – the Google Sheets import is a library, and a Pyret/JS programmer can write library code to import from other sources (this is of course work, and won't magically appear, but the language supports more than Google Sheets by design). In addition, tabular data is built-in to the language enough to use it for _testing_ example tables (and functions over tables) with the same primitives as testing other functions. And the animation framework can spit out tables that trace execution of the animation, so students can generate their own data from simulations.
All of these things live together inside the same language that has other features we've found useful for beginners, like more authentic math (not just bignums but rationals), easy image creation, example/testing support, carefully-crafted error messages targeted at beginner vocabulary, and so on.
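As a stdlib-only Python sketch of what a hand-rolled "typed interface to tabular data" involves (all names here are invented for illustration, not Pyret's table API):

```python
import csv
import io
from dataclasses import dataclass

@dataclass
class Row:
    name: str
    score: float

def load_rows(text):
    """Parse CSV text into typed Row records, converting each field explicitly."""
    reader = csv.DictReader(io.StringIO(text))
    return [Row(name=r["name"], score=float(r["score"])) for r in reader]

rows = load_rows("name,score\nada,91.5\nalan,88.0\n")
print(rows[0])                        # Row(name='ada', score=91.5)
print(rows[0].score + rows[1].score)  # 179.5
```

The conversions, the record type, and the import plumbing are all on the programmer here; the claim above is that Pyret builds this kind of typed tabular access into the language.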
> I have a four-year-old now, and the other day she asked if I could count to a thousand million. I answered "Nope, but I'll show you Mr. Fortran, he can."
That's just awesome :-)
> The idea of "templates" comes with a lot of baggage from C++-land. If we were doing this with C++ templates, I'd agree, because the errors are abysmal and the types are difficult to write.
You're spot on. I suspect most people familiar with template-ish expressions come from C++, where you encounter quite a few projects that are "templates all the way down" (looking at you, OpenFOAM).
Is there a way to make a donation to this project?
Teaching languages/OSes/compilers/toolchains (if done well) lower the barrier to entry while providing a conceptual transition path to learning 'real' stuff later on the job. Plus, the 'real' thing that we may be teaching in school now may not even be relevant in 4 years in industry when students get out into the working world, and they will have to pick up new languages/toolchains/etc. constantly on the job anyways. There is definitely a good argument for authenticity, but also one for paring down extrinsic complexity to illustrate core computer science concepts. e.g., try explaining parsing using full-blown C++ as a case study -- that seems pretty daunting. But one of these teaching languages (if done well, of course) is much easier to write parsers for.
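To make the parsing example concrete: a complete tokenizer and recursive-descent evaluator for arithmetic fits in a few dozen lines of Python, which is hard to imagine assigning as a first parsing exercise in full C++ (the grammar and names below are my own sketch):

```python
import re

def tokenize(src):
    """Split source text into number and operator tokens."""
    return re.findall(r"\d+|[-+*/()]", src)

def parse_expr(toks):
    """expr := term (('+'|'-') term)*  -- returns the evaluated value."""
    val = parse_term(toks)
    while toks and toks[0] in "+-":
        op = toks.pop(0)
        rhs = parse_term(toks)
        val = val + rhs if op == "+" else val - rhs
    return val

def parse_term(toks):
    """term := factor (('*'|'/') factor)*"""
    val = parse_factor(toks)
    while toks and toks[0] in "*/":
        op = toks.pop(0)
        rhs = parse_factor(toks)
        val = val * rhs if op == "*" else val / rhs
    return val

def parse_factor(toks):
    """factor := NUMBER | '(' expr ')'"""
    tok = toks.pop(0)
    if tok == "(":
        val = parse_expr(toks)
        toks.pop(0)  # consume the closing ')'
        return val
    return int(tok)

print(parse_expr(tokenize("2 + 3 * (4 - 1)")))  # 11
```

Each `parse_*` function mirrors one grammar rule, which is exactly the mapping a parsing lesson wants students to see without the language itself getting in the way.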
However, this doesn't exclude it from being a Real Language. Scheme is both a very popular teaching language (perhaps the prototypical teaching language, being used in SICP), and has seen actual use, with a dedicated community around the various implementations of the language. For a teaching language, there are a lot of things built atop it. Like, say, this site (PG's Arc runs atop an old version of Racket, from back before Racket diverged quite so much from Scheme, and was still called Scheme).
>"full of awkward inconsistencies, warts, and good-enough-for-nows"
I cannot agree with "full". But is there a better teaching language? I do not think so.
For that matter, so does Scheme; it just has fewer of them. Which has its problems.
Oh, and Racket is in no way as good for Real Work as CL. It doesn't have the libraries, and it sure as heck doesn't have perf.
>including Racket's commercial users.
A language doesn't need maturity to have commercial use. Rust had commercial adoption pre-1.0, and that adoption is increasing. Rust is also nowhere near mature.
Racket being built in part for education is also undeniable. Look it up.
Racket's library support vs. CL's is highly debatable. Neither is ideal, but both are "good enough," so let's leave it at that.
As for Racket's perf, that's just a fact. Look at Racket, then at SBCL. Then at Racket. Then at SBCL. Sadly, it isn't SBCL. But if it started doing native code compilation, maybe it could be a bit more like SBCL.
This is also true of Scheme: Some of the Schemes are much faster than Racket, for many of the same reasons (many of them aren't as fast as SBCL, as I understand it, although I haven't run the benches, so I can't be sure).
This doesn't make Racket a bad language by any stretch. I don't happen to like it, but that's just preferential: there's nothing unambiguously bad about it AFAICT. In fact, it's quite well designed.
>a CL fanatic
Crap. You got me. All that time I spent arguing with lispm that yes, Scheme is a Lisp, explaining to people what Scheme is and why it's cool, explaining that yes, Hygienic Macros are a good idea, and no, that doesn't just mean declarative macros ala syntax-rules, there are imperative systems that do the same thing, discussing the differences between Racket and Scheme, and why I think they matter, but you can choose either, depending, writing CHICKEN Scheme macros, barely tested (because I was busy), on my cellphone, and so on, was just a ruse. I've really been a CL user all along. /s.
While there are things I love about CL (and I've even considered moving to it for various reasons, despite it being an uglier language - although in some ways also a more practical one), and things I hate about the Schemes - CHICKEN in particular (familiarity really does breed contempt! At least a little...), I still think that Scheme is a beautiful language.
While I don't think I'm a fanatic for anything, if I am a fanatic, I'm a Scheme fanatic, not a CL one. I have, as I've mentioned above, argued a lot with CL fanatics on This Very Forum.
Idiot, I'd take with a grin. Incompetent, with a smile (I may well be). Possibly misinformed? I'll downright admit it.
But a CL Fanatic? That's not just a stretch, it's flat-out wrong.
You realize who you said that to?
I think he knows the background of Racket quite well. ;-)
The converted ones are usually the most fanatic.
Everything you mention here is a consequence of age, not intent for real use.
An old teaching language would be just as bad if those are your only criteria.
This is not entirely unusual, even today. I was curious about one of the toolchains for phone app development. For one particular tool, the tutorial for "Hello World" was many pages of instructions: Installing the tools, creating an empty application, setting its characteristics, adding code, building, etc. Even as an experienced programmer, I found it to be rather forbidding.
Another possible reason for a teaching language might be to excite the interests of particular age or social groups. For instance, Logo was oriented towards making interesting graphics. Scratch eliminates the need to learn syntax, and has easy-to-program graphical output.
On the other hand, some languages come pretty darn close to that old BASIC in terms of the ease of getting started, and one of those languages is Python. It seems to me that the sheer size of Python shouldn't be a deterrent to using it for teaching: Just ignore the advanced features at first.
Every language you name and love and use had a growth path. This, ironically, includes Python, which also evolved from ABC: a teaching language.
Each level allows you to think to the limit of that world. Only then do you see where a new trait fits and how it relates to others. You don't just drown in heavy manuals right off the bat.
Teaching languages that are even modestly successful in that role tend to achieve significant use outside of teaching as well, so that tends not to be true. Lots of languages that end up having broad general use were created with a narrow or specific motivating use, and teaching is no less legitimate than any other motivating use. Pascal, for instance, was a teaching language, and for quite a while it had enormous industrial uptake.
Even Pascal grew up into Object Pascal. Yes, it is still used in production.
(f - 32) * (5 / 9)
Beginners frequently have issues with grammar; that is, they are pretty much unaware of it while trying to figure out the many other things they have to learn about programming. Even in SQL, I forget that it must be quite confusing for beginners to parse out the meaning of the `*` in something as simple as:
    SELECT mytable.*, (a * b) AS myproduct,
           COUNT(*) AS the_count
    FROM mytable ...
Scheme mitigates the spacing issue with s-expressions. Pyret attempts to mitigate it by enforcing that operators should be separated from their operand with at least one space.
(Disclosure: I am an occasional developer of Pyret.)
But Pyret is using more traditional syntax, where binary operators go between their operands, e.g. `2 - 1`, which means using hyphens in identifiers is a hazard as the parent comment describes.
The number of people who forgot even the most simple syntax rules towards the end of the semester, even after having shown them during each lab session, was surprisingly high.
You tell them that the computer is super dumb and pedantic, and they understand this. So you tell them the rules (a semicolon at the end of a line, perhaps) and they understand. Then an hour later they're staring blankly at the screen, wondering why they got the exact same error again.
Obviously not all students are like this, and almost all did get it after a while, but you'll be repeating your statement to them enough times that you wish the language didn't have whatever wart is causing the confusion.
I found Java particularly bad, because you have to tell beginners to ignore 90% of the hello-world program (class ... public static void main ...): you have to tell them "hey, ignore class, public, static, void, etc. for now, we will cover it later", which isn't satisfying at all and in my experience caused more confusion still...
Yes, you've got to treat it as a learning experience, but the students usually see it as a roadblock and a cause of frustration. Often one that doesn't make sense to them, and they question why it's even a thing. ("Why can I use '_' in identifiers and not '-'?" Not too hard to explain, of course, but it's yet another bit of incidental complexity to them.)
It's also worth noting that Pyret requires spaces between all math operators, not just subtraction. In other words, 3+4 is not valid; you would need to type 3 + 4. So at least it's internally consistent in that sense.
Good to know that Pyret requires spacing between the other operators. That does help a lot, and an argument could be made that it forces folks to not do `a-b` when they mean `a - b`.
(- a b)
Like in a Lisp infix macro: (infix (a + b * c) / d).
This kind of thing is quite habitable: we hardly lose sleep over (3.4) the list of one float versus (3 . 4) the dotted pair of integers.
I have spent quite some time chasing bugs of the type (- a -1), where the second minus ended up there because of either a copy/paste error or a brain fart, the original expression having been a - 1...
Another Lisp-ism: gratuitous abbreviations, like 'lam' instead of 'lambda'. Given its teaching claim, that's two layers of understanding required of students: knowing what a lambda is, and connecting that 'lam' stands for lambda.
Ada, for instance, specifically considered and then rejected case sensitivity, because they had at least one study saying case sensitivity was a source of errors.
I don't know how well instrumented the Pyret courses are, but I would be very surprised if they have studies on this.
This therefore is perfectly natural; non-programmers are used to using spaces to delimit words and this is simply doing the same for tokens.
Hyphens are commonly used to join words into compounds in English; underscores are not. The latter are used in many programming languages because disallowing hyphens in that role, when they also serve as an operator in other contexts, simplifies automated parsing (and allows syntax that supports more compact expressions; there was also a time when bytes of source code had a greater cost than they do now).
Maybe Pyret isn't for beginners, and it's intended to teach people who already have some basic knowledge more advanced concepts like functional programming. That's fine.
But to a total beginner, the syntax of Pyret is definitely a step back in terms of readability.
In what world is:

    fun square(n :: Number) -> Number:
      n * n
    end

more readable than:

    def square(n):
        return n * n
I have a bias here, as I help teach an introductory course using DrRacket, and many of my colleagues are very aware of Pyret, some even helping to develop it. In one semester, however, we have students with a full understanding of recursion, linked lists, many other common data structures, anonymous functions, functions as data, and understanding of map/filter/fold and how to use them. The class is specifically aimed at people who do not have prior programming skills, and works very well in my experience. Yes, it is a lot, but it builds an incredible base, and if taught to build on itself slowly, is actually very minimal conceptually. The optional features in Pyret are probably allowed with that in mind.
One of its weaknesses, arguably, is that it doesn't look like much else out there in common use, since it's a Scheme. Pyret's syntax is an attempt to make the same ideas translate more easily to other languages, such as Python or Java.
That is the beauty of Lisp, and especially Racket: every thought is explicitly inside a bracket.
I am a self-taught programmer who started with Z80 assembly, Pascal, etc. I mostly work with R nowadays, and I was struggling to get to the next level with R. I picked up a book and learned Racket, and the convergence of R and Racket was totally unexpected. I had missed all the Scheme influence in R; most R programmers in the past didn't use that part of the language. So as a learner, those brackets are golden, especially for learning concepts.
It seems like it'd be easy to define a Typed Racket-derived #lang where you had to write:
(define (sum a b) : [-> Integer Integer Integer]
  (where (= (sum 0 1) 1)
         (= (sum 2 2) 4))
  (+ a b))
 I, personally, love parenthetical syntax.
The same applies to programming languages. The fact is most programmers learn popular languages. Languages that deviate from popular languages tend to become less readable.
So the answer is no. If you are accustomed to a syntax, it naturally follows that languages adopting those syntactic features will initially seem more "readable" to you, because things only become readable once you learn how to read them.
fun square(n):
  n * n  # party in here
end

// no longer having fun here
When these people talk about easy-to-learn, they don't mean what a lot of people in this thread mean. Def / fun is such an easy thing that it's a non-issue. Anyone can learn programming language syntax pretty quickly.
It's the semantics that's hard. When you first start learning, the things that actually trip you up and that you can't just look up in a cheatsheet in two seconds are things like: Unexpected behavior, bad error messages, hidden rules, hidden complexity, complicated control flow, code that doesn't do much but must be there anyway, arcane things that you must do to make the compiler happy.
In that way, you don't have to assume that beginners are stupid. A lot of these things happen because languages are badly designed, or because they evolve to have too much cruft over time in an effort to be general purpose and catch up with the times. For learning, this just gets in the way. With better tools, learners can tackle much more complex concepts much quicker. Once those take place, you can introduce the arcane weirdness of other programming languages, and it makes a lot more sense.
>>> def x = 3
  File "<stdin>", line 1
    def x = 3
          ^
SyntaxError: invalid syntax
Does that matter? I'm used to Scala where `def x = 4` is just as good as `val x = 4` as far as a beginner's concerned.
n * n
square n = n * n
Welcome to the Real World. Please make different entities look different.
"Return" seems meaningless.
So that is the "real world" function declaration (if this phrase makes any sense for a math notation convention) for a nonprogrammer.
Is that what you mean? Because then I agree...
You know n is a number and the result is a number. You don't need to "type check in your head".
Remember Pascal? Being explicit is good for education and for regular sanity.
P.S.: I like Python very much, but all the time I'm wondering, "OK, what is the meaning of this code? Let me look elsewhere so I can remember what this is supposed to return."
This also happens to me with F# (type inference is weird: the compiler knows, but it hides from me what it knows!).
I often hear people say that functional programming is an advanced concept. I think students are able to learn functional programming as their first model of computation, before they are even exposed to Turing-machine computing. It's not as if you have to first learn C, then some Java, and only after you become familiar with OO do you have enough knowledge to grasp functions as first-class citizens.
So, for a curriculum that starts with those analytical methods and then adds concrete programming on top, the Pyret approach is probably both simpler and more productive: it requires, and gives concrete effect to, elements of the analytical work that Python would neither require nor provide a convenient way to express cleanly. The two-way correspondence between code and analysis is simply better.
It's probably not particularly useful to discuss the quality of a pedagogically-focussed language outside of the context of an approach to pedagogy.
Also, try code samples in here: https://code.pyret.org/editor
Also, scroll down to see the comparison vs other languages: http://www.pyret.org/#examples
Some thoughts on syntax
We believe indentation is critical for readable code, but we don't want the whitespace of the program to determine its meaning. Rather, the meaning of the program should determine its indentation structure. Indentation becomes just another context-sensitive rule.
Unambiguous syntax (the reason for explicit end delimiters) means you can copy-and-paste code from email or the Web, and its meaning won't change. Your IDE can help you reindent code without worrying that doing so will change the meaning of the program.
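Python is the opposite case: whitespace determines meaning, so re-indenting can silently change a program. A small illustration (function names are made up for the example):

```python
def count_positives(items):
    n = 0
    for x in items:
        if x > 0:
            n += 1
    return n

def count_positives_reindented(items):
    # Identical tokens, but 'return' was re-indented into the loop,
    # so the function now returns after the first iteration.
    n = 0
    for x in items:
        if x > 0:
            n += 1
        return n

print(count_positives([1, -1, 2]))             # 2
print(count_positives_reindented([1, -1, 2]))  # 1
```

With Pyret's explicit `end` delimiters, the second program simply could not arise from a paste-and-reindent accident.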
I've found it wonderful and productive for low-level bits of functionality, especially working on small data structures and primitives. With Emacs set up to reload the namespace and run all the tests on every save, it's a great feedback loop with very little cognitive overhead.
It breaks down a bit for more involved tests, as you say, and it can be taxing knowing when to refactor, how much test setup code to include along with the unit under test, and also once you start splitting things into other namespaces, where to look to get a full view of a module's functionality.
- avoids the duplicate class hierarchies you get with xUnit,
- avoids having to expose class internals for testing,
- makes tests an integral part of code, instead of an external add-on.
It has all the basic constructs of both functional and object-oriented programming in plain English, without a bunch of syntax to worry about. I personally think it's the most readable language, and it's also quite intuitive, imo. Lastly, there's a fantastic community, and instant help is available for all skill levels, especially beginners, in #python on freenode IRC.
Pyret... isn't very readable, doesn't seem very intuitive, has somewhat odd syntax, and generally seems like a terrible choice for learning. I doubt it has the quality resources Python has (community, docs, tutorials).
I don't get it.
I linked to a perspective on where Pyret fits in in another similar answer:
We also may have different definitions of what a "beginner" needs for community and help.
Pyret is used in middle and high school classes (along with the undergraduate level). Neither those students nor their teachers are likely to know what IRC is or how to seek help on it. We do a lot of curriculum development, and professional development workshops in the summer are a major way we interact with these teachers and set them up with a high-bandwidth way to communicate with us over the year (which is direct email or mailing lists).
Prompt and detailed help is certainly available for teachers (who then help their students), and it's coming from the folks who designed the curriculum and the language together. Those mailing lists are public (pyret-discuss  and bootstrap-teachers ), and anyone is welcome to ask away.
Of course, Pyret doesn't have the huge community, Stack Overflow presence, documentation maturity, etc. of Python (or other major languages) for general programming. That takes time and consistent effort, but the lack of it doesn't negate the reasons Pyret is good for its current use cases.
It has mostly Python syntax. Some bits inspired by Standard ML, Ocaml, F# and Haskell. Nothing odd here. End marker prevents accidental scope overrun and is from Pascal I think.
Community is hard to get for any new language. If this got popular, there would be no such problems.
You're arguing that Python is Good Enough. But then, so was assembly.
says the programmer using significant whitespace
runs for cover
Curious to know what would you have done differently. I'm working on a programming language based on OCaml with Ruby-like syntax.
* |> rather than ^ for the pipe operator (more consistent with several other languages)
* [list| 1, 2, 3] rather than [list: 1, 2, 3] (just for better visual distinction, though the current syntax does look pretty clean and uncluttered)
* having a separate code block for methods of individual variants rather than interspersing them with data definitions (because i think it's important that the basic data definition be as uncluttered as possible)
if you'd like some syntax feedback on your language i'd be happy to take a peek :)
it's the separator aspect i was getting at with my proposal though - visually it's [ datatype | elements ] whereas the current one looks more like the colon is part of the first element - [(list : 1), 2, 3]
Frankly, I'm really trying, and I'm having a very hard time suffering from the "list: 1" confusion you claim. I admit, that's just me.
Also, `|` as a separator suggests a kind of symmetry. But the two sides are not symmetric at all. Whereas in English, `:` does not imply symmetry.
Well not really, we need better teachers and methods for teaching programming and computing. A good language on its own is not gonna lift the interest.
I think most of the Pyret developers would agree wholeheartedly! To this end, the Pyret development team works closely (and in some cases, overlaps) with Bootstrap , a nation-wide program to teach programming to middle school students. The feedback we get from these teachers (and the process of teaching teachers) directly informs language design.
Furthermore, at Brown University, where many of Pyret's developers have ties, Pyret is dogfooded on two courses: an "Accelerated Introduction to Computer Science"  and "Programming Languages" .
(Disclosure: I am a developer of Pyret.)
Racket is great! If you're already deep into Racket, I wouldn't say stop to try Pyret unless you particularly have the goal of learning more languages, or there's something about Pyret that really gets you interested.
We taught a programming languages course that uses Racket back in 2012, but all the material is still online:
So if you're looking for online content that we recommend, that could be an interesting next step that makes use of the momentum you have in following Racket.
The Pyret book PAPL is free online, and it's possible to dive in and follow that if you're curious about learning Pyret, though it doesn't have the infrastructure of assignments with autograders, etc, that a MOOC would have:
I started reading HtDP2e before finding the edX course. The MOOC helped reinforce what I was reading. I realized that learning to program through Racket is a superior experience compared to any other (popular) language people are teaching (or just coding on screen). I spent too much time watching people code. People attempt to teach online, but most of them lack traditional pedagogy. They know how to code, but they don't necessarily know how to teach.
The only exception I would make to above (as far as what I know right now), is Harvard's CS50.
This may run counter to the 'work smarter, not harder' ethos I see here but I feel that programming competency in practice is much more a function of literal hours writing code than any other factor.*
*with the caveat that competency tends to follow a sub-linear growth curve and people vary significantly with respect to their growth rate so this statement is more accurate when comparing individuals who are early in their growth curve aka beginners.
A backend developer probably faces the preferred backend language (Java / Python / Ruby / PHP / whatever), SQL, and has to have at least a nebulous understanding of HTML, CSS, and JS.
A devops person has to face a shell language, the Unix tools mini-languages, have enough Python or Ruby chops to handle things like Ansible / Salt / Chef, and likely some SQL to keep an eye on the databases.
This does not include smaller languages like regexps, or narrow-use languages like XSLT, JSON Schema, etc.
There's no way you learn the One Practical Language That Matters and can be limited to it. You could in past, with a Spectrum or a TRS-80 and Basic, but the times have changed.
Therefore I believe that it is productive to start beginners off with a language which is:
1) as approachable as possible
2) as useful as possible
Such that the value of that first, most difficult learning experience is maximized and the challenge is minimized.
Most mainstream languages have very similar semantics overall, differing wildly from FP-like approaches. So learning a mainstream-ish (imperative-ish, OO-ish) language will enable mastering other mainstream languages with greater ease.
If my first language is Lisp-1944, I'll likely have a hard time learning C or Java. If, on the other hand, my first language was C, I'll likely have much more and quicker success with Java, Python, C#...
OTOH if your first language was Lisp (or even Python, to a lesser extent), and it was competently taught, the syntax of Java or C# would feel alien at first, but you'd find out that many of the concepts are already known to you; chances are, you even have an idea how they work under the hood.
(Disclaimer: I majored in embedded systems, my first languages were Fortran IV and PDP-11 assembly.)
Also, Pyret can be used to make things – Pyret's compiler is written in Pyret!
As the language matures, the experience of writing small Pyret programs that grow into real applications will only get more authentic.
Just please don't label it as "educational" or "academic" or industry devs will avoid it out of pure macho-ism :)
And find a way to marry it to a popular ecosystem, maybe a transpiler to js (for nodejs ecosystem) or python or go. 'Cause it really looks like a language I'd enjoy writing production code in if I could just `require` some of my favorite libs and hit the ground running!
Industry devs should be suspicious of Pyret's performance right now, though that will improve as time goes on. We've been spending most of our effort supporting our student and teacher users, so things that really build the traditional out-of-the-box language experience aren't where a typical application developer would hope.
I think that's fine since we've made our intentions and goals clear (the educational audience and a coherent curricular design), and would be less fine if we were offering Pyret as a general-purpose language today. Offering it as general-purpose will become more and more reasonable as time goes on.
* concurrency, is there any nice built in syntax like Python async? Any kind of threading? Multiprocessing support?
* error handling, how is it done? Where are exceptions?
* standard and file io, string operations, serialisation?
* no builtin higher math types (matrix etc.), is math done on decimal floating point numbers?
* foreign functions, interfacing with other languages, embedding?
Pyret's number representation is one consequence of this: Pyret numbers are stored either as exact rational numbers or as explicitly inexact "rough" numbers (written with a leading ~), which are floating point.
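For comparison, Python's `fractions` module gives the same exact-rational behavior that Pyret uses for its default numbers:

```python
from fractions import Fraction

# Exact rational arithmetic: no representation error
exact = Fraction(1, 10) + Fraction(2, 10)
print(exact)                      # 3/10
print(exact == Fraction(3, 10))   # True

# Binary floating point cannot represent 0.1 or 0.2 exactly
print(0.1 + 0.2 == 0.3)           # False
```

The difference is that Pyret makes the exact representation the default, which matches the arithmetic students already know from math class.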
Pyret has file I/O, but not in its online development environment, since the web page does not have filesystem access.
(Disclosure: I am a developer of Pyret!)
> One of the enduring lessons from the Racket project is that no full-blown, general-purpose programming language is particularly appropriate for introductory education. By the time a language grows to be useful for building large-scale systems, it tends to have accumulated too many warts, odd corners, and complex features, all of which trip up students. The journal paper for DrScheme (the old name for DrRacket) explains this in some detail.
In that respect, the closest fellow travelers of us Pyreteers are the Racketeers (see how that works?). In fact, the first version of Pyret was merely a #lang in Racket. Nevertheless, Pyret represents a departure from Racket (for now and for the near future, at least) for several reasons:
That said, Racket is indeed a fantastic next language after Pyret. I myself constantly switch between the two.
Racket is the most fun I have had in Programming and the more I "get it" the more I enjoy it. I can't say that for any other language.
* metaprogramming - templates, macros, generic programming?
One of the frustrations of using Racket—which is macros all-the-way-down—is debugging. When you see an error message or use a stepper-debugger, you want it to be in terms of the code you wrote; not in terms of the code generated by some complicated macro which was glued together by a sleep-deprived graduate student.
One of the graduate students here in the PLT group at Brown, justinpombrio, is working on resugaring: taking a sequence of evaluation steps in terms of the macro-expanded program and presenting them back to the user in terms of the surface syntax of the language. (More on this at his website, http://justinpombrio.net/) We're just starting to prod at implementing Pyret's existing syntactic sugar in this manner, which will be a first step toward a proper macro system.
Some would say Pyret's syntax is an improvement over that of OCaml. (Others might not.)
A critical difference is that Pyret does not force static typing. The type-checker is optional, so beginning students, in particular, do not need to deal with the overhead of type errors and the vagaries of type inference. But Pyret's design, and pedagogy, aim to push students towards an ML style of programming with ML-like types (e.g., the language does not have untagged unions).
In fact, type inference in Pyret is done not from the code but from the tests the programmer has written. This is a very different philosophy of not just inference but program development.
I would also like to note that we have a new feature where types can be inferred if you provide tests in a "where:" block attached to a function. This infers a function's type solely from the examples of usage provided.
If you have any other questions feel free to ask!
But does Pyret always translate an ADT to a class hierarchy behind the scenes? This example sure looks like it does:
data Animal:
  | elephant(name, weight)
  | tiger(name, stripes)
  | horse(name, races-won)
end

fun animal-name(a :: Animal):
Additionally, Pyret's type system internally tracks each of the different constructors as a refinement on the type. With the animal example this would be represented internally as something like
if is-elephant(x): ...
The programming languages course in question (https://www.cs.swarthmore.edu/~jpolitz/cs91/s15/) indeed used Pyret, and that course is a heavy functional programming implementation course (writing interpreters, type inference, GC, etc).
That's using Pyret for more complex programming, and is the upper end of where we use it pedagogically right now. Ditto for CS019 (the advanced introduction to data structures), and CS173 (PL) at Brown. In all of those cases, students _are_ expected to have some prior programming experience, so they get a faster introduction to Pyret.
In the other direction, Pyret is in active use at the middle and high school level in math, cs, and physics classes, and this is where much of our active design work is tailored (tables, reactors, etc have some explicit curricular goals they target). They have scaffolding and workbooks written for their grade level, use features suitable for the context they are in, and so on.
The use in CS91 in particular is
(1) appropriate in the first place, since many functional languages could work in that setting,
(2) eating my own dogfood, and
(3) an excellent opportunity to discuss design decisions of a language that students are learning in a PL course, while it is still having design decisions made about it!
I was actually a little flabbergasted when you said it looked like "odd text gibberish", but then I took another look at the first example and saw it.
| node(value, left, right)
I'm relatively flexible when it comes to syntax, but clearly syntax is a huge hurdle that almost always gets in the way of adoption. I understand Pyret isn't looking to be the next Python, but for an educational language I would argue it's not living up to Python's principle of least surprise. It's great if you know Haskell, OCaml, Python, and other languages, but it's not so great if you're an unsuspecting student who cut their teeth on Java or Python.
I suspect most people could pick up the syntax pretty quickly, but I suspect most people still need some hand-holding on the first page. Most programmers still don't know what this short paragraph means:
> Pyret allows for concise, expressive, recursive data declarations. Type annotations are optional and can be added incrementally, to serve a variety of pedagogic styles and curricular needs.
I have fond memories of learning to program as a kid. Concepts like trees, nodes, and data types have no place in introductory programming for kids. The initial idea to teach is the notion of getting the machine to follow a set of instructions. IMHO even functions should not be forced until that notion of writing instructions for the computer to follow is mastered. Imperative is the word I'm looking for. You need to start with that. Then move on to loops, arrays (or lists, but don't try to explain the difference for a while), and functions. You don't want to require any of the concepts that we all take for granted just to get started. Type systems? Structures? Objects? Those need to be introduced through examples where they are used to solve specific problems.
I have some examples of how an absolute beginner may think.
I was learning about BASIC and had worked up (yeah up) to using FOR loops, IF and GOTO. Then I read about the PLOT(h,v) function which would light up a pixel on the screen - awesomeness was about to ensue. But I had the notion that - as the instruction said - I had to pass the variable h and v to that function, not whatever pair I choose. That would mean copying values from other variables (i.e. h=x, v=y, plot(h,v) and then h=i, v=k, plot(h,v)). Fortunately that notion got cleared up very quickly, but it illustrates the point that a noob may take something quite literally and not understand the simplest abstractions that we all take for granted.
On another occasion I wanted to write Pac-Man, so I had variables (we didn't even have structures available) for the player, and I figured I'd use x1,y1, x2,y2, x3,y3, x4,y4 to track the positions of the ghosts. I could see in advance that I'd have to replicate the ghost logic 4 times using different variables in each instance (or write a subroutine and copy variables around like I suggested above). What I wanted was to put the logic inside a loop (for n=1 to 4) and reference the ghost variables as xn, yn, but that wouldn't work because those are just different 2-letter variables. So I asked someone if there was a way to solve this and was pointed to arrays. The notion that a construct existed that matched exactly what I wanted to do was amazing, and I felt like I really must understand something if my ideas about what should be possible were already there. Try to imagine that type of discovery compared to being presented a bunch of such concepts and being expected to understand from the start.
When I got to college I figured I already knew programming. Hah! The data structures course was incredible, I could already see how all of that stuff could be applied to different things. It made perfect sense because I could already imagine how to use it in all kinds of situations.
Even the most simple concepts need to be taught slowly at first, and preferably to solve a specific problem. If your language requires too much boilerplate, type definitions, or anything else it's too much.
So when I said this language looked like a bunch of gibberish, I was looking at it through the lens of a kid who doesn't know shit about any of this. Because for some reason I can remember what it's like to think that way.
Also, nobody said Pyret is for kids (defined as pre-teen). It's not.
If you want a good teaching language which will receive substantial adoption, do this:
* syntax-wise, use curlies and semicolons like C, Java, Perl, JS, etc.
* use many of the good function names that Perl uses.
* have all variables be refs to objects, like Python does.
* use nice normal lexical scoping like Scheme et al
* this shouldn't have to be mentioned, but provide data literals for lists, maps, and sets
* keep it small, and written in C, like Lua.
* resist the temptation to complicate things by adding this really amazing advanced feature that it's just got to have to distinguish itself.
* plan for it to become used for general purpose programming, as it very likely will.
And some "don'ts":
* don't worry about performance, that can come later
* don't worry about lack of libs --- your implementation is in C, and you should get good ffi down the road. And with the substantial adoption you'll see, libs will come later anyway.
* don't have it be on the JVM, LLVM, or any other existing VM. Keep it simple. Even an interpreter is fine for now.
That's it. That's all you need to do for amazing success in a teaching language that will also see serious adoption.
That said, no one does this. Presumably because if you're skilled enough to implement it, you have some neat but obscure features you'd like to add, or a unique syntax you'd prefer, which ... turns users off and keeps adoption low.
Some of these lead to endless bikeshedding (e.g., curly syntax) without much of a way to resolve it. Pyret has one position on these, others may have others.
And some of these are just plain invalid in some contexts (e.g., implement it in C). Our target audience is browser-bound: many of the schools we work with cannot install software on their desktop (so no compiler, IDE, etc.). Implementing in C is therefore a non-starter.
What is the weirdness and/or pet features you see in Pyret?
For the language I described above:
* it's like JS but with warts removed
* it's like C or C++, but higher level
* it's like Java but doesn't require types or the JVM
* it's like Perl but without the context rules and other zaniness
* it's like Python but with better names, more familiar syntax, and better scoping
(It's even a little like Scheme but with more conventional syntax, and regular lists, maps, and sets.)
Students can use it in your class, then if they want to pursue programming in earnest, can very easily segue to any of those common languages above. Not only that:
* They can easily re-type their classwork programs in any of the above langs, and not feel like their teacher had them using something that was only applicable to their particular class.
* They can show their classwork programs to any JS/C/C++/Java/Perl/Python programmer and it will be easy for that person to make sense of what they're looking at (with no previous experience with the lang described above).
I have no experience with Pyret, and don't want to cast any judgement on it or the people working on it. Maybe it's great and will benefit students even when they move on from their class which uses it.
Seems like language implementors are like carpenters: once you're a skilled pro, you're not interested in making a plain simple bookshelf. It's got to have dovetail joints, beveled edges, countersunk fasteners with plugs over them, light sanding between multiple coats of finish... then it's great, but not affordable and I'm back to looking for a simple bookshelf.
About the only new thing here is the testing, documentation and contract support and it is great to have.
In reality, most people in the world struggle with abstract mathematics and find it much more intuitive to think about a program as a series of steps and in terms of control flow. Programming beginners want to draw nice pictures on their computer, not struggle with understanding and implementing some recursive function.
This is of course not to say that recursion is not an important concept or that functional languages have no merits. They are however complicated abstractions over simple concretions that are much easier to understand for a beginner.
As a concrete counterpoint of _friction_ between what students may have as background and what many languages do that relates to language choice:
Many math teachers we work with use the word "variable" to refer to a name that can change per-instantiation of a function or expression by substituting a value. But it doesn't make sense mathematically for x to be 5 "now" and 6 "later". So writing

  x = 5
  x = 6

in sequence directly conflicts with the intuition those students bring from math class.
Math variables and traditional (mutable) computer science variables are very different things. This provides an _immediate_ tripping point in vocabulary if you want to, say, teach computing in a math or physics class (or leverage that background). So we make a conscious choice in Pyret to dissuade multiple definition of the same variable, or mutability of a variable, to hew close to definitions we can leverage without creating extra conflict and confusion between concepts.
That particular point may differ in different contexts, and deeper studies are definitely needed to figure out what kinds of "notional machines" are easiest or best for students to build up. There may even be more to say about addressing this particular example. But this is a little perspective on concrete reasons for some of these choices.
Math is useful not because it's abstract, but because it's concise and descriptive.
Pattern matching is a much more natural way of thinking for a human than tracing the execution flow; did you ever write Basic on 8-bit computers?
It seems that Pyret is much closer to OCaml (there's a direct comparison in the examples), but with a less mathy syntax. It's also not statically typed. So it should feel much more like Scheme, or, well, Python: you can easily write imperative code, with explicit loops when you want to, etc. But it gives you the nicer data model, the one that combines the natural feel of "objects" and the conveniences of Hindley-Milner-ish types.
Since they made the first stone tools and cave drawings, humans have always defaulted to "how to" reasoning and not "what is" reasoning. You may have the opposite intuition for the simple reason that 99% of what you read, write, or see on TV is produced by the 0.01% of people with decent communication skills, and a larger share of them have better-developed "what is" thinking. Also, "what is" knowledge is easier to communicate in compact form.
Also, most people who tend to learn programming are "maker" types, like me, with a huge bias towards process-based intuition: for example, for us, intuitively, an "ellipse" is firstly "the abstract equivalent of what you get by drawing a curve constrained by a string anchored at 2 pins" (http://www.sems.und.edu/FrameA/SolarEclipses/ellipse.jpg), not "the set of points whose sum of distances from 2 fixed points is constant", and a "bubble sort" is "what you get by walking over an array and for each..." etc.
And the obvious companion to "how to" or "procedure based" reasoning is extensive mutability because... this is how the physical world seems to work at our level: even writing an equation on paper is actually just "mutating the position of particles of ink from a pen to precise distinct locations on the piece of paper", and results in "changing the information stored at particular 'addresses' of the paper. Even pointers ended up being used not because they are a very smart or appropriate concept or anything, but because they map obviously to the intuition of "pointing at things scribbled on a piece of paper or a blackboard"...
But yeah, "how to" reasoning scales horribly, creates huge misunderstanding and communication problems, and is extremely hard to debug, and we should start moving software engineering past the "cave drawing" stage... but this doesn't mean we should ignore where we're starting from :)
The rest of the people were more obsessed with "what steps to follow to plant my seeds in the ground so as to increase the probability of crop yield" or "what steps to take when preserving food so as not to get really sick" or "what steps to do in what order when building my house so it does not fall down on me at the next storm" etc.
Oh, and lineage is pretty much retrospective "how-to knowledge" anyway: it describes what the sequences of actions were that resulted in someone existing today: "X1 and X2 had a baby", then "the son of X1 and X2 married Y0" and "moved to village Q" after "fighting in war Z" etc. History is pretty much recorded procedure.
Not until we got to philosophy, geometry and other branches of math did we really have clear "what is" knowledge... (Except maybe for religion, and no wonder that smart priests and monks were pretty damn good at math.)
As an example, go to any remote village and ask "where is X", and what they'll inevitably tell you is "what actions to perform to get from here to place X". (The infuriating part is that some modern humans kept this thought pattern, and good luck getting directions to anywhere from them... thank god for the "what is" knowledge provided easily by Google Maps :) )
I don't mean to throw it out, but I try to be skeptical of obvious intuition. I just feel like people were saying "the Earth is obviously flat" (and there are plenty of other examples). And from a practical point of view, for a lot of people, the Earth effectively was flat. I'm certainly not saying "FP-Master-Race!" But I try to be wary of the fact that history is important but imperfect. Imperative programming is terrifically important, but that does not necessarily mean it is inherently more humane.
One of the links under "set sail":
I like Elixir's approach better, where your test cases live as documentation examples right above the function implementation. It's fantastic and free: the `mix test` command runs those examples as tests, in addition to your regular tests.
You put your examples under an "## Examples" header in the function's @doc string. Oh, and these examples go into your auto-generated documentation as well. Ridiculously effective and free. You feel bad not using them.
Once we've taught students about function calls and values, it's a small jump to write an example with "is" in the middle. The syntax errors, static errors about unbound identifiers, and dynamic behavior all act the same as everywhere else in the program, so it's less friction to write the first test.
For getting off the ground writing that first test, I've been really happy with what Pyret lets us do. There is more involved in getting better testing/reporting options as systems scale up (we have lots of test-only files for the compiler), but pedagogically, the testing block infrastructure has worked well.
You cannot do that easily with Elixir-style tests.
Scripting and functional programming go amazingly well together, as long as it's (mostly) dynamic typing. In general, types really just get in the way when scripting or prototyping or trying to do anything fast, especially when you don't know the types of data you're going to be working with ahead of time, e.g. when it comes from a file or from an external API, etc.
But what is BinTree in the following example on the website? Is it a type? Is it a function, or constructor, or what? I'm confused.
data BinTree:
  | leaf
  | node(value, left, right)
end
I've found the opposite of that claim to be true. Types are absurdly helpful in prototyping or trying to do anything fast, especially when exploring data coming from a file or an external API.
Anecdotes aside, BinTree in the example is both a data type and a type testing function ("detector") of signature `Any -> Bool`. See http://www.pyret.org/docs/latest/Declarations.html#%28part._... for details.
The documentation seems to verify my intuition: http://www.pyret.org/docs/latest/A_Tour_of_Pyret.html#%28par...
node(1, leaf, leaf)
node(2, leaf, node(1, leaf, leaf))
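For intuition, here is a rough Python analogue (hypothetical, just to illustrate what the data declaration introduces, not how Pyret actually implements it): variant constructors, a type, and a membership test.

```python
from dataclasses import dataclass
from typing import Any, Union

# Hypothetical sketch of what Pyret's `data BinTree: | leaf | node(...) end`
# gives you: constructors for each variant plus a detector function.

@dataclass(frozen=True)
class Leaf:
    pass

@dataclass(frozen=True)
class Node:
    value: Any
    left: "BinTree"
    right: "BinTree"

BinTree = Union[Leaf, Node]  # the type
leaf = Leaf()                # Pyret's `leaf` is a ready-made value

def is_BinTree(x: Any) -> bool:
    """The detector: Any -> Bool, like using BinTree as a predicate in Pyret."""
    return isinstance(x, (Leaf, Node))

tree = Node(2, leaf, Node(1, leaf, leaf))
assert is_BinTree(tree) and is_BinTree(leaf)
assert not is_BinTree(42)
```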
Object Pascal is the best language I know for education.
Programming education must first be about understanding algorithms, I think.
However, learning something old or not useful in the real world can be harmful. The best, I think, is to learn modern tech. The development world evolves too, and other skills become more relevant.
No monads, applicatives, etc. No category theory. Just immutability, laziness, pattern matching, and type classes. I feel like that makes as much sense as developing an entirely new language that will most likely only be used by complete novices. Conversely, one can grow with a language like Haskell for an entire lifetime. Just my two cents.