you lieDown; position: (table onTopOf)
you lie prone on the table
But look at your smalltalk example: does that look like English syntax? First, the word order is wrong, and second, "onTopOf" already is a concatenation. It doesn't really support the idea that your native language shapes your programming.
Oh, sure; morphophonological processes do tend to mess up everything. (This is why Valentine doesn’t give a morpheme-by-morpheme gloss of the example I included.) This is why I don’t claim that such a programming language will be _exactly_ like human agglutination — only that it will have a very different structure to our languages. The complexities of human languages will always need to be simplified and regularised in order to create a programming language.
> But look at your smalltalk example: does that look like English syntax? First, the word order is wrong, and second, "onTopOf" already is a concatenation.
I agree that Smalltalk syntax doesn’t look at all like English syntax, at least in terms of word order. (If anything, it looks a lot more like Burmese.) And it is true that early languages like RPG and SNOBOL — and I should include assembly here as well — have little to do with any human language. But that’s not the central component of my claim. What’s important here is that, for most if not all modern programming languages, you can construct a syntax tree, like this:
[you [lieDown; position: [table onTopOf]]]

[[Lie down] [on [the table]]]
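The "every modern language gives you a syntax tree" point is easy to see concretely. Here is a rough Python analogue of the Smalltalk example (Python's `ast` module used purely as a stand-in; the names `you`, `position`, `on_top_of`, and `table` are invented to mirror the sentence):

```python
import ast

# The parser turns flat text into a nested tree of Call/Attribute/Name
# nodes, just like the bracketed trees above.
tree = ast.parse("you.position(table.on_top_of())", mode="eval")
print(ast.dump(tree.body, indent=2))
```

Running this prints a nested tree whose root is the `position` call on `you`, with the `on_top_of` call on `table` as its argument, mirroring the constituent structure of the English sentence.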
incorporated.noun - directional - instrumental - distributive - indirect.object - direct.object - specific.locative - general.locative - 1A.prefixes - STEM - adverb - diminutive - intention - ability - mood - deduction - modality - dubitative - hearsay - auditory - tense - consequence - sent.func = enclitics
> Obviously my primary target communities here are Cree communities that are looking for new (and exciting) ways to encourage students (especially in the K-12 grades) to use their heritage language as much as possible, and resist using English as their primary language.
But at the end of the day - This feels more like a display piece along the lines of art. Choices were personal, artistic, and spiritual. But I REALLY struggle to call this a programming language. To quote him:
> Where the output is generative and graphic. The generative aspect is crucial in the representation of the Indigenous worldview, because when the program ends whatever display was generated is destroyed (comes to end of life). And subsequent running of the program – though they may produce similar results will never be graphically identical to any previous execution. This mimics the “real” world equivalent of listening to a story from a storyteller – who might change it slightly each time, so the same story is never the same twice.
I'd say instead this feels more like NetLogo - It's a modeling environment that creates generative graphical output based on input, but is not capable of doing most of the things that I'd expect from any real programming language. Mainly - repeatability and precision.
Doesn't make it a bad choice, particularly if his goal is student engagement - but it's not a tool that I feel has much use outside of the very limited environment of teaching/storytelling.
I think it's fine for you to expect repeatability and precision from the languages you use if you are using them as tools. But not everyone uses programming languages that way, so I don't think a PL that is not repeatable and precise is any less of a programming language. It may not be a good tool, but as the talk argues, that's okay, as this is only one narrow view of programming languages.
And I'm not saying that because I think those things are inherently more valuable (they might be - I don't really know), I'm saying that because a programming language has to interact with hardware. And at least our current generation of computational hardware requires precision and repeatability or it doesn't work. And I don't mean doesn't work as in "fails". I mean the literal foundation of the space is built on logic gates - Small devices that have very specific, repeatable, and precise outputs for a given input.
The requirements for precision and repeatability are SO ingrained that it's genuinely hard for us to introduce real randomness to the process.
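The logic-gate point can be made concrete in a few lines (Python here, purely as illustration): a gate is just a fixed mapping from inputs to output, and every other boolean function can be composed from one universal gate.

```python
# A NAND gate as a pure function: one fixed, repeatable mapping from
# inputs to output, which is the precision/repeatability described above.
def nand(a: int, b: int) -> int:
    return 0 if (a and b) else 1

# NAND is universal: the other gates are just compositions of it.
def not_gate(a): return nand(a, a)
def and_gate(a, b): return not_gate(nand(a, b))
def or_gate(a, b): return nand(not_gate(a), not_gate(b))

# The truth table never varies between runs.
table = [(a, b, nand(a, b)) for a in (0, 1) for b in (0, 1)]
print(table)  # [(0, 0, 1), (0, 1, 1), (1, 0, 1), (1, 1, 0)]
```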
So stepping back a moment - I think the issue at hand is really how we define "programming language". I see them as tools to control hardware that has fundamental requirements (at least right now) around repeatability and precision.
This uses languages that do have those properties to create another abstraction layer which no longer provides them (or does provide them, but through a black box that hides or obfuscates how they're being applied in a manner that simulates not providing them, it's actually really hard to tell based on the content of the article)
It means that there's a fundamental gap between the capabilities of the two.
A programming language can program hardware. This is an application for turning stories into visual output written in languages that can control hardware (Go and C#).
It's a cool application in much the same way that NetLogo is a very nifty toy to introduce beginners to language/concepts/terms that are used in programming.
So in that sense, I think the creator has mostly hit the mark he's going for. But it's not general - it can't step outside of that space. It's more akin to a domain specific language that has to operate within the context of his specific editor/application.
So just like Photoshop is not "a programming language", I don't really see this as a language.
For instance, I could write a language that has all the trappings you would expect from a PL: a parser, compiler, syntax, semantics, code gen, etc. But the execution of language constructs depends on the time of day the program is compiled. e.g. An if statement compiled in the morning doesn't behave like an if statement compiled at night. Would that be a good tool? No. Would it be a programming language? I don't see why not. It's a programming language in every sense except for an arbitrary constraint you've placed on it based on your particular expectations.
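A toy sketch of that thought experiment (Python as the host language; `compile_if` and the day/night rule are invented for illustration, not a real compiler):

```python
import datetime

def compile_if(cond, then_branch, else_branch, compiled_at=None):
    # Hypothetical semantics: an `if` compiled at night swaps its
    # branches. The choice is frozen at "compile" time into the
    # returned closure, so the compiled program is still deterministic.
    hour = (compiled_at or datetime.datetime.now()).hour
    is_night = hour < 6 or hour >= 18
    if is_night:
        then_branch, else_branch = else_branch, then_branch
    return lambda: then_branch() if cond() else else_branch()

morning_prog = compile_if(lambda: True, lambda: "then", lambda: "else",
                          compiled_at=datetime.datetime(2024, 1, 1, 9))
night_prog = compile_if(lambda: True, lambda: "then", lambda: "else",
                        compiled_at=datetime.datetime(2024, 1, 1, 23))
print(morning_prog())  # "then"
print(night_prog())    # "else": same source, different semantics
```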
> But it's not general - it can't step outside of that space. It's more akin to a domain specific language that has to operate within the context of his specific editor/application.
There is no requirement for a PL to be generally applicable. Domain specific programming languages are in fact programming languages. Moreover, I would also argue that any language you call a "general" programming language is in fact a "domain specific language". You have just defined your native domain as the "general" case, and any language outside of your native domain a "DSL".
For example, "general" languages like C are in fact specifically tailored to the domain of imperative programming on a von Neumann computer architecture. When you take C out of its target domain - e.g. into parallel programming in a non von Neumann architecture, it suddenly becomes very cumbersome to express programs. Other languages you might call "domain specific" can very easily express programs in that domain. e.g. the dataflow language Lucid. People native to those domains would call those languages "general" and C "domain specific". It's all a matter of perspective.
But isn't this still a specific and repeatable behavior?
You're defining a language feature that I have no issue with here. I agree that it doesn't seem all that useful, but it's not at all in conflict with my definition.
> For example, "general" languages like C are in fact specifically tailored to the domain of imperative programming on a von Neumann computer architecture. When you take C out of its target domain - e.g. into parallel programming in a non von Neumann architecture, it suddenly becomes very cumbersome to express programs. Other languages you might call "domain specific" can very easily express programs in that domain. e.g. the dataflow language Lucid. People native to those domains would call those languages "general" and C "domain specific". It's all a matter of perspective.
I feel like this is really the heart of the discussion - If we are to assume that a language is to eventually be expressed on hardware that has been designed from the ground up to perform boolean logic, I don't see how we avoid the requirement that the language deal with boolean logic.
Lucid is fine by me - it was literally designed to be a disciplined, mathematically pure language. That it happens to use a different architecture than a central CPU and registers has little bearing on its ability to perform maths/logic.
Basically - Is this language not just a less capable subset of a "general" language? Because even the author has explicitly stated that it almost certainly won't be able to achieve even simple tasks such as parsing a document, and even a basic calculator was a "maybe".
So I can certainly understand that it may not be relevant to parse a file in some contexts/cultures, but I can't help but wonder how you can possibly hope to build a framework that explicitly avoids those concepts when the whole foundation has to be built on the things you're trying to avoid. The abstraction has to leak by default, or be inherently less capable.
Now - There may be some interesting room to consider hardware that isn't based on gates (AND/OR/NOT and all their various combinations) but this isn't that.
Which brings me back around to - isn't this just making the rules into a black box? They still exist, but they've been obfuscated in a way that makes them much less apparent? Handy for teaching, but ultimately limiting?
Depends, maybe it chooses the time zone to calculate night/day randomly.
> They still exist, but they've been obfuscated in a way that makes them much less apparent? Handy for teaching, but ultimately limiting?
Right, and that’s okay. Languages that are handy for teaching but ultimately limiting are still programming languages. Being good at parsing files and writing calculators is not the bar for being a programming language. HTML and CSS are still programming languages even if they’re not used to write parsers. Excel is still a programming language even if it’s not used to write servers. LaTeX is still a programming language even if you can’t easily write games with it. People don’t reach for C to write web pages, or budget their finances, or publish their manuscripts. This doesn’t make C less of a programming language.
Datalog, Coq, and Agda are three languages off the top of my head that are not even Turing complete, so you’re not going to be able to express all programs in them. If not being able to express a parser in Cree# makes it not a programming language, is Datalog not a programming language?
Coq is a limited language for theorem proving. Is it not still a programming language? Actually, now that I think about it, “general purpose” languages like C are ultimately limited by their Turing completeness to not be good languages for theorem proving. So this is another area where “general” has some caveats. In other words, Coq being “less capable” than C allows you to do things in Coq that you can’t do in a “general” language.
And that's enough.
Cree# itself as a general-purpose language based on C#/Java with Cree keywords
Ancestral Codes and wisakecak as multimedia versions of the language, what he calls the "digital storytelling apparatus." Here he's bringing in cultural logic from Cree, with programs as stories written to the Raven, as interpreter of the code, used to record and present stories from Cree elders, etc
And then the Indigenous Toolkit to help other communities build programming languages around their own traditions
le if (...)
c’est la vie
In what way do you perceive mainstream programming languages to be tied to cultural norms?
Well, the form of mathematical reasoning that's prevalent was actually first developed during the Islamic Golden Age. Most of Western math is actually imported from the East. Symbolic logic, too, is of Islamic origin.
If there is anything to the claim that it is “western”, it is in the way that “western” people regard it (possibly compared to how others regard it).
I don’t see how the idea that mathematical reasoning is inherently “western” (as a property of mathematical thinking, as opposed to as a property of “western”) could possibly stand up to scrutiny.
Sure, the symbols being used may be due to particular cultures, as well as a number of conventions (e.g. infix notation vs whatever, some minor choices made in some definitions, etc.), but, these are not inherent to mathematical reasoning.
It sounds like he's trying for a very comprehensive refactoring of the conceptual base of computing. No idea what'll come out of this, but I'm strongly rooting for it.
What's the point of using a computer to do something that a human can do better? I thought the whole point of automation in general is to free people from the tedious stuff; but storytelling is not supposed to be tedious.
Not to be confused with Jonathan Corbet of LWN.
One consequence of this is that the indigenous cultures that we’ve actually had the chance to study don’t really represent the pre-Colombian cultures that well, because by then, there was already substantial disruption from the infectious diseases and wildlife that spread throughout the continent well in advance of explorers and colonists.
> the available evidence clearly indicates that the demographic collapse was not uniform in either timing or magnitude and may have been caused by factors other than epidemic disease. ... Despite the trauma of conquest, Native Americans continued to have their own histories, intertwined with but not entirely determined by Europeans and their pathogens.
Since that was written, the evidence has swung even more strongly towards the idea that disease was intimately associated with the close, persistent contacts needed for the "conquest" and missionary activities of colonial powers.
Not to mention, similar epidemics were observed among indigenous southern africans and siberians during their respective colonizations. The Americas were unique in the scale and completeness of their disruption, but not in the mechanisms.
Otherwise, it seems that any “close, persistent contacts” would inevitably happen and inevitably lead to the same results.
You're arguing from the quote that 'it's other than disease' and then two sentences later that it was, but due to 'persistent contact needed for conquest'.
None of this adds up to a coherent argument.
If Aboriginals weren't dying en masse from disease, then what from? Because we have crude records of interaction. There were very few violent fights between Aboriginals and newcomers in Canada, for example.
And where is the evidence that Colonialists had 'consistent, closer contact' in the New World than in Africa?
I'm all for more nuanced history, we're learning stuff every day, but I think a lot of it is also speculative, and ideologically driven.
> And where is the evidence that Colonialists had 'consistent, closer contact' in hew New World, than in Africa?
In Africa, Europeans died off rapidly due to local diseases. Consequently, the early slave trade was centered in the islands off of Africa itself, and mediated by a mulatto class who were less susceptible.
The Canadian government forcibly took indigenous children away from their parents to be taught at religious schools where they'd be beaten if they spoke their indigenous language. The goal was literal cultural genocide and to erase indigenous nations and cultures from the continent.
This was not "inevitable" but instead an active policy goal the government pursued.
By the time a Canadian government even existed, most of the damage had already been done.
One might question why this policy was undertaken in Canada but not the African colonies. Perhaps because Canada’s indigenous population was already a minority. But how did that happen?
Things also would have played out differently if the Canadian government and/or colonial precursors hadn't engaged in active genocide.
Engaging in genocidal polices is of course not inevitable but an active policy choice.
It was an active policy choice, and an abominable one at that, but it was not the primary cause of the destruction of indigenous cultures. If it weren’t for the residential schools, Canada’s First Nations would still be a marginalized minority in their own homeland, displaced by English and French-speaking settlers. Without the wholesale depopulation of North America via infectious disease, English and French-speaking settlers would have never been able to come here in great numbers at the time they did in the first place.
Absolutely First Nations were to varying degrees shrunk massively from pre-contact highs, there's no debate, but the loss of cultural memory, art, music, and language really only happened very recently, in the last 100 years, and was directly related to government and religious orders imposing residential schools and literal government bans on cultural activity and organization (eg. Potlatch).
This is not ancient history. The potlatch ban only came into effect in 1885 and was only removed in 1951. Residential schools were only closed in the 1970s.
You can see evidence of this in NW Coast art, where post contact, pre 1900s, there was actually a renaissance in art production and development, as superior iron tooling made it easier than ever to make art, and creation of carvings for the tourist market opened up all sorts of new economic opportunities for First Nations people.
Then in the late 1800s the government imposes literal bans on cultural activity and brings in residential schools. Enormous loss of cultural memory occurs.
By 1969 no one on Haida Gwaii had raised a totem pole in living memory, but Robert Davidson carved one and raised it. He couldn't speak Haida. No one even knew what to do at a totem raising. Luckily there were a handful of old timers that vaguely knew enough to kick start this cultural revitalization.
If indigenous art and culture was already dead due to "inevitable" disease, then why would the government feel any need to actively try to ban it, police it and arrest indigenous people for taking part? So you can see here that no, despite population decreases, indigenous customs were quite alive and well. The destruction required a brutal government clamp down to try to snuff it out.
When you think about potlatches and totem poles, you should consider that these are the culture of a remnant of survivors of a vast cultural collapse that utterly eradicated entire civilizations across two continents. That doesn’t minimize their value; if anything, it makes them more rare and precious, and hence makes it even more fundamentally evil, if such a thing is possible, for the Canadian government to have attempted to destroy these things.
You could be right. You could be completely wrong, and neither of us have any ability to know.
What is measurable is what the Canadian government explicitly tried to eradicate in the last century.
As a point of comparison, there are 2.9 million registered Native Americans, and 5.2 million people who check that box, sometimes along with others, on the latest census.
By your very own evidence - Spanish and Portuguese utterly dominate Central and South America.
There are 650M people there; if only ~10% of the population speaks another language, that helps prove the case that 'languages from more advanced societies win'.
Even where there is relative parity, systems definitely favour the language of the more powerful entity.
All over Europe there are 'niche languages' dying out, which is sad, but it's materially the case.
Go to Nice, France, and the street signs are in 'Nicoise' - not French. In Monaco if you listen carefully you can hear 'Monegasque'.
The decline of those languages is hinted at in the fact my spellchecker doesn't even recognize those words, unfortunately.
That should be a signal as to how 'far apart' colonialists and aboriginals were with respect to development of cultural institutions.
The writing you see in this post was invented by a Canadian-English Methodist priest in the mid-19th century for the benefit of the aboriginals. The system, in current terms, is itself 'firmly colonialist' (I'm sure someone will cynically characterize it as a form of oppression).
That said, it'd be cool to see Cree keyboards.
In fact, making a 'Cree Keyboard' might have been a much more practical use of the author's time, and might have more materially affected young people's ability to learn Cree.
Come to think of it, there really should be such keyboards available ...
Mesoamerica had at least one family of complete writing systems (I'll call this Maya, although whether or not they invented it or adapted it from others is debated), and another proto-writing system that may well have evolved into a full writing system (the Aztecs, who at the time of contact appear to have been in the early stages of planning a conquest of the Maya). Andean cultures had a maybe-it's-a-writing-system-unlike-any-other, the quipus.
Of course, positing a less domineering conquest, it is very likely that cultures may well have developed their own indigenous writing systems via contact with Europeans--that is precisely what the Cherokee did. I doubt they would have stubbornly refused to pick up any writing systems.
> That should be a signal as to how 'far apart' colonialists and aboriginals were with respect to development of cultural institutions.
Yeah, Tenochtitlan had public zoos and museums, organized anthropology, universal primary education, ethnic quarters, professional sports leagues at a time when all of those concepts would take another few centuries to be 'invented' in Europe.
Oh, wait, were you suggesting that it was the Americas that was culturally backward?
> The writing you see in this post is invented by a Canadian-English Methodist Priest in the mid-19th century for the benefit of the aboriginals. The system, in current terms is itself 'firmly colonialist' (I'm sure someone will cynically characterize it as a form of oppression).
My understanding is that the modern indigenous groups see the use of the syllabics as less oppressive than being forced to use Latin, as the syllabics are designed to more closely match the language than using the Latin script.
Your post could be applied to the Vietnamese writing system, for example. Or to the many other languages that have adopted variants of the Arabic, Cyrillic, and Latin writing systems.
No society is frozen in time.
It's hard to do most advanced things without writing if you only have oral tradition, just like it's hard to do some things without Iron if you only have Bronze.
We refer to stages of cultural development i.e. 'Iron/Bronze/Steel' but we may very well use 'Oral/Written/Printing Press/Digital'.
The same limitations apply: if you can't forge things with Iron (like strong chariots/carts & ploughs), there's a variety of advancements you can't make. Likewise, if you don't have reading/writing, you're equally limited.
You can't do mass farming without high quality carts and ploughs, and you can't build schools without reading and writing. Without those things, you can't get very far.
The Maya and Aztec had writing in the form of hieroglyphics. The Inca had persistent communication via quipus' rope knots.
(I learned this from _Guns, Germs, and Steel_ which is a phenomenal book. I haven't done other research, though, so maybe the book isn't a good source.)
The Mayans had a complete, complex logosyllabic writing system. (I believe the syllabic components are more common than logographic components, but I'm not certain). Individual syllables (or logograms) could be combined into a single glyph block in a variety of ways. This writing system, I believe, is connected to Zapotec and epi-Olmec writing systems, but disentangling who created what and who borrowed from whom in Mesoamerica is challenging.
The Aztecs had what appears to be a proto-writing system, largely capable of only recording proper nouns (predominantly place names); most of the writing would instead be conveyed pictographically. Before the Aztecs, in Classical Mesoamerica, Teotihuacan (which was the major power in the Central Mexico Valley at that time period) appears to have never used any form of writing, despite having conquered Classic Maya city-states which were in full florescence of their writing systems.
Quipus originate at least as early as the Wari culture in the Andes, although (again) people only recognize the final Andean civilization, the Inca. Whether or not they are a writing system is debatable--it's known they encode more than just numeric values (such as place names), but whether they can convey enough information to be considered writing is unknown.
Post-contact, Sequoya developed a syllabary for the Cherokee language based only on the knowledge of the existence of the Latin alphabet (he couldn't read English or any European language, but he did have access to European-language materials--that's why several Cherokee letterforms look like Latin ones but have completely different meanings). Missionaries in Canada developed a syllabary for several aboriginal languages that remains in use by many Cree, Ojibwe, and Inuktitut speakers.
- 1491: New Revelations of the Americas Before Columbus https://www.goodreads.com/book/show/39020.1491
- 1493: Uncovering the New World Columbus Created https://www.goodreads.com/book/show/9862761-1493
GG&S is an interesting book, but extremely conjectural and ideological, and not well-sourced. Some of the evidence is distorted. Off the top of my head, he reproduces a table of grain yields, and when tracking down his sources for this, it turns out that he's omitted results that contradict his theory. The reasoning is sometimes shaky or circular: 'Why do we know X wasn't domesticable? Because it wasn't domesticated.' Etc. etc. I don't find his theory holds up particularly well.
I think the storytelling aspect is what he's going for, not just making C# with Cree keywords. Baskets are woven differently in different cultures. Is it possible to have a programming language that reflects a different culture or outlook? What would that look like? It's an experiment.