Cree#, a morphemic programming language with Cree keywords and concepts (esoteric.codes)
86 points by relatt on April 11, 2021 | 60 comments



As someone with an interest in both linguistics and programming, I’ve often wondered what programming languages would be like if they had developed from an agglutinating language like Cree rather than the mostly isolating languages of western Europe. English sentences and phrases consist mostly of isolated words joined together by syntactic words: ‘Lie down on the table’. This has carried over to the programming languages we use, which mostly consist of isolated keywords and operators joined together by syntactic rules. e.g. a ‘translation’ of the previous example into Smalltalk-ish pseudocode would look something like:

    you lieDown; position: (table onTopOf)
Contrast this to languages like Cree — or Ojibwe, since I can’t find a convenient Cree example — where this sentence becomes as follows, with the verb becoming the central element of the sentence and each word inflected to describe its relationship to the others (Valentine 2001):

      Zhngishin     doopwining.
    you lie prone  on the table
I’d be really curious to see how a native speaker of such a language would design a programming language — I’m not sure what it would look like, but I suspect it would look remarkably different to the ones we usually use. Corbett’s languages look like an interesting step in this direction, and I’d be interested in knowing more details if they’re available.


I doubt it. Natural language agglutination tends to have all kinds of nasty side effects, modifying the letters at the boundary, which makes parsing hard and therefore would have created problems writing a decent parser in the early days. COBOL famously tried to follow English syntax, but most other languages deviate from it. Look at early languages such as RPG or SNOBOL. They follow the machine instruction set much more closely than the English grammar.

But look at your Smalltalk example: does that look like English syntax? First, the word order is wrong, and second, "onTopOf" already is a concatenation. It doesn't really support the idea that your native language shapes your programming.


> Natural language agglutination tends to have all kinds of nasty side effects …

Oh, sure; morphophonological processes do tend to mess up everything. (This is why Valentine doesn’t give a morpheme-by-morpheme gloss of the example I included.) This is why I don’t claim that such a programming language will be _exactly_ like human agglutination — only that it will have a very different structure to our languages. The complexities of human languages will always need to be simplified and regularised in order to create a programming language.

> But look at your smalltalk example: does that look like English syntax? First, the word order is wrong, and second, "onTopOf" already is a concatenation.

I agree that Smalltalk syntax doesn’t look at all like English syntax, at least in terms of word order. (If anything, it looks a lot more like Burmese.) And it is true that early languages like RPG and SNOBOL — and I should include assembly here as well — have little to do with any human language. But that’s not the central component of my claim. What’s important here is that, for most if not all modern programming languages, you can construct a syntax tree, like this:

        ┌────────┴─────────┐
        │            ┌─────┴────┐
     ┌──┴──┐         │       ┌──┴──┐
    you lieDown; position: (table onTopOf)
This is a very English-like trait, in that English sentences also consist of isolated words organised in a syntax tree:

        ┌───┴───┐
        │    ┌──┴───┐
     ┌──┴─┐  │   ┌──┴──┐
    Lie down on the table
This structure is of course common crosslinguistically, but agglutinative and polysynthetic languages such as Cree are structured differently. In these languages, words are composed of a stem, surrounded by affixes expressing their relationships to other words. e.g. Koasati verbs (which are reasonably straightforward in this regard) have the following structure (Kimball 1985):

    incorporated.noun - directional - instrumental - distributive - indirect.object - direct.object - specific.locative - general.locative - 1A.prefixes - STEM - adverb - diminutive - intention - ability - mood - deduction - modality - dubitative - hearsay - auditory - tense - consequence - sent.func = enclitics
Due to this extensive marking, word order is extremely free; in many languages with this amount of marking, words within a sentence may be arranged in pretty much any order. Though linguists can still draw out syntax trees for these languages, it is unclear to what extent such an abstraction is useful: the affixes within a word have a rigid and non-hierarchical order, while the words themselves have few constraints on ordering. This is a characteristic unlike all programming languages I am aware of, and is what I refer to when I say that programming languages today are more like English and French than they are like Cree and Koasati.
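
To make the contrast concrete, here’s a rough sketch in Python of what an affix-slot construct might look like (the slot names are invented for illustration; this is not Kimball’s actual template): the affixes of each ‘word’ follow a rigid, non-hierarchical template order, but because every relationship is marked on the word itself, the ‘words’ of a program could be evaluated in any order:

    # Hypothetical affix slots, in rigid template order (illustrative only)
    SLOT_ORDER = ["directional", "instrumental", "direct_object",
                  "STEM", "mood", "tense"]

    def word(stem, **affixes):
        # Affixes may be supplied in any order in the call, but are
        # always realised in the fixed template order, like the
        # morphemes of a Koasati verb.
        slots = {**affixes, "STEM": stem}
        return "-".join(str(slots[s]) for s in SLOT_ORDER if s in slots)

    # 'Lie down on the table' as a single verb-centric word, roughly:
    print(word("lie", direct_object="table", directional="down", mood="IMP"))
    # -> down-table-lie-IMP

The point is only the shape: each ‘word’ carries its whole argument structure with it, so the ordering between words stops mattering.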


This is a very interesting post, thanks for sharing. If you come across anything, definitely give us an update.


I read the whole interview, and at no point did I really feel like I understood what this is trying to achieve. The closest I felt I got was this:

> Obviously my primary target communities here are Cree communities that are looking for new (and exciting) ways to encourage students (especially in the K-12 grades) to use their heritage language as much as possible, and resist using English as their primary language.

But at the end of the day - This feels more like a display piece along the lines of art. Choices were personal, artistic, and spiritual. But I REALLY struggle to call this a programming language. To quote him:

> Where the output is generative and graphic. The generative aspect is crucial in the representation of the Indigenous worldview, because when the program ends whatever display was generated is destroyed (comes to end of life). And subsequent running of the program – though they may produce similar results – will never be graphically identical to any previous execution. This mimics the “real” world equivalent of listening to a story from a storyteller – who might change it slightly each time, so the same story is never the same twice.

I'd say instead this feels more like NetLogo - It's a modeling environment that creates generative graphical output based on input, but is not capable of doing most of the things that I'd expect from any real programming language. Mainly - repeatability and precision.

Doesn't make it a bad choice, particularly if his goal is student engagement - but it's not a tool that I feel has much use outside of the very limited environment of teaching/story telling.


I encourage you to watch the talk by Amy J. Ko called "A Human View of Programming" that she gave at SPLASHCon a couple of years ago [0]. It makes the case that the predominant view of programming languages as math or tools is not the only valid view, and that by holding this narrow definition for programming languages, the PL community has left a huge design space untapped.

I think it's fine for you to expect repeatability and precision from the languages you use if you are using them as tools. But not everyone uses programming languages that way, so I don't think a PL that is not repeatable and precise is any less of a programming language. It may not be a good tool, but as the talk argues, that's okay, as this is only one narrow view of programming languages.

[0] https://medium.com/bits-and-behavior/my-splash-2016-keynote-...


At least right now - I do believe, fairly firmly, that repeatability and precision are requirements for a programming language.

And I'm not saying that because I think those things are inherently more valuable (they might be - I don't really know), I'm saying that because a programming language has to interact with hardware. And at least our current generation of computational hardware requires precision and repeatability or it doesn't work. And I don't mean doesn't work as in "fails". I mean the literal foundation of the space is built on logic gates - Small devices that have very specific, repeatable, and precise outputs for a given input.

The requirements for precision and repeatability are SO ingrained that it's genuinely hard for us to introduce real randomness to the process.

---

So stepping back a moment - I think the issue at hand is really how we define "programming language". I see them as tools to control hardware that has fundamental requirements (at least right now) around repeatability and precision.

This uses languages that do have those properties to create another abstraction layer which no longer provides them (or does provide them, but through a black box that hides or obfuscates how they're being applied, in a manner that simulates not providing them; it's actually really hard to tell based on the content of the article).

It means that there's a fundamental gap between the capabilities of the two.

---

A programming language can program hardware. This is an application for turning stories into visual output written in languages that can control hardware (Go and C#).

It's a cool application in much the same way that NetLogo is a very nifty toy to introduce beginners to language/concepts/terms that are used in programming.

So in that sense, I think the creator has mostly hit the mark he's going for. But it's not general - it can't step outside of that space. It's more akin to a domain specific language that has to operate within the context of his specific editor/application.

So just like Photoshop is not "a programming language", I don't really see this as a language.


Respectfully, these are all requirements you are projecting onto the design and implementation of programming languages based on your preferences. Like you said, you "see them as tools to control hardware that has fundamental requirements". Your definition of programming languages is predicated on your view of languages as tools. Yes, it is important for tools to be predictable, repeatable, and precise. But not everyone uses programming languages as tools, so there is no requirement for them to be tools for their applications. Just because the underlying hardware is a predictable machine, doesn't mean the language has to use it in predictable ways.

For instance, I could write a language that has all the trappings you would expect from a PL: a parser, compiler, syntax, semantics, code gen, etc. But the execution of language constructs depends on the time of day the program is compiled. e.g. An if statement compiled in the morning doesn't behave like an if statement compiled at night. Would that be a good tool? No. Would it be a programming language? I don't see why not. It's a programming language in every sense except for an arbitrary constraint you've placed on it based on your particular expectations.
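
As a toy illustration (the code and its behaviour are entirely my own invention, sketched in Python), the 'compiler' for such a language might do something like this:

    from datetime import datetime

    def compile_if(cond_src, then_src, else_src):
        # Daytime builds produce a normal conditional;
        # night builds swap the branches.
        if 6 <= datetime.now().hour < 18:
            return f"({then_src}) if ({cond_src}) else ({else_src})"
        return f"({else_src}) if ({cond_src}) else ({then_src})"

    # The same source text means different things depending on
    # when it was compiled:
    code = compile_if("x > 0", "'positive'", "'non-positive'")
    x = 5
    print(eval(code))  # 'positive' by day, 'non-positive' at night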

> But it's not general - it can't step outside of that space. It's more akin to a domain specific language that has to operate within the context of his specific editor/application.

There is no requirement for a PL to be generally applicable. Domain specific programming languages are in fact programming languages. Moreover, I would also argue that any language you call a "general" programming language is in fact a "domain specific language". You have just defined your native domain as the "general" case, and any language outside of your native domain a "DSL".

For example, "general" languages like C are in fact specifically tailored to the domain of imperative programming on a von Neumann computer architecture. When you take C out of its target domain - e.g. into parallel programming in a non von Neumann architecture, it suddenly becomes very cumbersome to express programs. Other languages you might call "domain specific" can very easily express programs in that domain. e.g. the dataflow language Lucid. People native to those domains would call those languages "general" and C "domain specific". It's all a matter of perspective.


> For instance, I could write a language that has all the trappings you would expect from a PL: a parser, compiler, syntax, semantics, code gen, etc. But the execution of language constructs depends on the time of day the program is compiled. e.g. An if statement compiled in the morning doesn't behave like an if statement compiled at night. Would that be a good tool? No. Would it be a programming language? I don't see why not. It's a programming language in every sense except for an arbitrary constraint you've placed on it based on your particular expectations.

But isn't this still a specific and repeatable behavior?

You're defining a language feature that I have no issue with here. I agree that it doesn't seem all that useful, but it's not at all in conflict with my definition.

> For example, "general" languages like C are in fact specifically tailored to the domain of imperative programming on a von Neumann computer architecture. When you take C out of its target domain - e.g. into parallel programming in a non-von Neumann architecture - it suddenly becomes very cumbersome to express programs. Other languages you might call "domain specific" can very easily express programs in that domain, e.g. the dataflow language Lucid. People native to those domains would call those languages "general" and C "domain specific". It's all a matter of perspective.

I feel like this is really the heart of the discussion - If we are to assume that a language is to eventually be expressed on hardware that has been designed from the ground up to perform boolean logic, I don't see how we avoid the requirement that the language deal with boolean logic.

Lucid is fine by me - it was literally designed to be a disciplined, mathematically pure language. That it happens to use a different architecture than a central CPU and registers has little bearing on its ability to perform maths/logic.

Basically - Is this language not just a less capable subset of a "general" language? Because even the author has explicitly stated that it almost certainly won't be able to achieve even simple tasks such as parsing a document, and even a basic calculator was a "maybe".

So I can certainly understand that it may not be relevant to parse a file in some contexts/cultures, but I can't help but wonder how you can possibly hope to build a framework that explicitly avoids those concepts when the whole foundation has to be built on the things you're trying to avoid. The abstraction has to leak by default, or be inherently less capable.

Now - There may be some interesting room to consider hardware that isn't based on gates (AND/OR/NOT and all their various combinations) but this isn't that.

Which brings me back around to - isn't this just making the rules into a black box? They still exist, but they've been obfuscated in a way that makes them much less apparent? Handy for teaching, but ultimately limiting?


> But isn't this still a specific and repeatable behavior?

Depends, maybe it chooses the time zone to calculate night/day randomly.

> They still exist, but they've been obfuscated in a way that makes them much less apparent? Handy for teaching, but ultimately limiting?

Right, and that’s okay. Languages that are handy for teaching but ultimately limiting are still programming languages. Being good at parsing files and writing calculators is not the bar for being a programming language. HTML and CSS are still programming languages even if they’re not used to write parsers. Excel is still a programming language even if it’s not used to write servers. LaTeX is still a programming language even if you can’t easily write games with it. People don’t reach for C to write web pages, or budget their finances, or publish their manuscripts. This doesn’t make C less of a programming language.

Datalog, Coq, and Agda are three languages off the top of my head that are not even Turing complete, so you’re not going to be able to express all programs in them. If not being able to express a parser in Cree# makes it not a programming language, is Datalog not a programming language?

Coq is a limited language for theorem proving. Is it not still a programming language? Actually, now that I think about it, “general purpose” languages like C are ultimately limited by their Turing completeness to not be good languages for theorem proving. So this is another area where “general” has some caveats. In other words, Coq being “less capable” than C allows you to do things in Coq that you can’t do in a “general” language.


> Doesn't make it a bad choice, particularly if his goal is student engagement - but it's not a tool that I feel has much use outside of the very limited environment of teaching/story telling.

And that's enough.


Yes I think it's both those things, and really a few different projects with overlapping goals:

Cree# itself as a general-purpose language based on C#/Java with Cree keywords

Ancestral Codes and wisakecak as multimedia versions of the language, what he calls the "digital storytelling apparatus." Here he's bringing in cultural logic from Cree, with programs as stories written to the Raven as interpreter of the code, used to record and present stories from Cree elders, etc.

And then the Indigenous Toolkit to help other communities build programming languages around their own traditions


Do you know whether Cree# is open source? Sorry if I missed this when reading the text too quickly, but I struggle to find more documentation or a code repository. I know some folks at my university who are working with indigenous languages and cultural heritage, this could be interesting for them.


That the keywords and symbols are Cree is trivial, and not that much different than when someone does the same thing with Klingon or Swedish Chef, but the concept of a programming language based on storytelling is interesting. It might never end up being a useful language for information infrastructure or corporate problem solving, but I suspect that if it gains much traction with Cree students (or anyone with an interest), they'll end up coming up with uses that we haven't thought of before, with our programming languages tied so deeply to our cultural norms.


A colleague is French-Canadian; I suggested localizing C to French; he proposed:

  le if (...)
  { }
  c’est la vie
  { }


Is it? The language appears to (at least somewhat) make use of syntax that is inspired by the specific way the Cree language works.


>with our programming languages tied so deeply to our cultural norms.

In what way do you perceive mainstream programming languages to be tied to cultural norms?


They certainly are tied to a certain mode of thinking rooted in mathematical reasoning originating in the West, though certainly influenced by other cultures. The thing is that the underlying computer architecture uses this reasoning, so it's difficult to see how you could program a computer without relying on that reasoning at some level.


> mathematical reasoning originating in the West

Well, the form of mathematical reasoning that's prevalent was actually first developed during the Islamic Golden Age. Most of Western math is actually imported from the East. Symbolic logic is also of Islamic origin.


Mathematical reasoning doesn’t care who discovered it.

If there is anything to the claim that it is “western”, it is in the way that “western” people regard it (possibly compared to how others regard it).

I don’t see how the idea that mathematical reasoning is inherently “western” (as a property of mathematical thinking, as opposed to as a property of “western”) could possibly stand up to scrutiny.

Sure, the symbols being used may be due to particular cultures, as well as a number of conventions (e.g. infix notation vs whatever, some minor choices made in some definitions, etc.), but, these are not inherent to mathematical reasoning.


It’s not just the keywords though; in the interview, he explains how it’s the logic of storytelling that makes up the programs and what the language is expected to perform as well


This sounds both like it's absolutely incredible and has a long way to go. I recognize the creative approach of "I'm here, I want to go there, let's take one step at a time."

It sounds like he's trying for a very comprehensive refactoring of the conceptual base of computing. No idea what'll come out of this, but I'm strongly rooting for it.


I see this as an attempt to implement a programming paradigm (i.e. storytelling) that's radically different from existing paradigms. How are others interpreting this project? What's its significance?


> This mimics the “real” world equivalent of listening to a story from a storyteller – who might change it slightly each time, so the same story is never the same twice.

What's the point of using a computer to do something that a human can do better? I thought the whole point of automation in general is to free people from the tedious stuff; but storytelling is not supposed to be tedious.


> Jon Corbett

Not to be confused with Jonathan Corbet of LWN.


Vaguely reminded of the Lingua::Romana::Perligata module for writing Perl in Latin.

http://users.monash.edu/~damian/papers/HTML/Perligata.html


I can imagine another timeline where the settlement of the Americas was more a blending than a displacement and there exist things like Cherokee and Navajo keyboards.


Both keyboards do exist. The Cherokee layout is included in OSX and iOS, while Navajo is available for download. You can find pictures of Cherokee keycaps in use, and I think there were also typewriters and certainly printed type in Cherokee.


None seem to be for sale on eBay or Amazon.


The cultural devastation wrought on the indigenous peoples of the Americas was tragic. But it was also largely inevitable. The colonization of Africa left many more Africans alive than the colonization of America because America was biologically isolated from Eurasia while Africa was not. The epidemics, particularly of smallpox, that depopulated the Americas were so brutal largely because the indigenous Americans had neither the biological nor the cultural adaptation to infectious disease that was common among Eurasians and Africans. Most indigenous causes of illness in the Americas were due to parasites rather than viruses and bacteria, allowing the evolution of cultural practices like having the entire extended family of a sick person keep them company and try to comfort them through their illness.

One consequence of this is that the indigenous cultures that we’ve actually had the chance to study don’t really represent the pre-Columbian cultures that well, because by then, there was already substantial disruption from the infectious diseases and wildlife that spread throughout the continent well in advance of explorers and colonists.


This narrative gets trotted out anytime this subject comes up. It's not reflective of the actual literature. Here's an older perspective from 2009 [1]:

> the available evidence clearly indicates that the demographic collapse was not uniform in either timing or magnitude and may have been caused by factors other than epidemic disease. ... Despite the trauma of conquest, Native Americans continued to have their own histories, intertwined with but not entirely determined by Europeans and their pathogens.

Since that was written, the evidence has swung even more strongly towards the idea that disease was intimately associated with the close, persistent contacts needed for the "conquest" and missionary activities of colonial powers.

Not to mention, similar epidemics were observed among indigenous southern Africans and Siberians during their respective colonizations. The Americas were unique in the scale and completeness of their disruption, but not in the mechanisms.

[1] https://doi.org/10.1007/s10814-009-9036-8


I suppose if you posit an alternate history where the entire rest of humanity completely left the Americas alone after discovering them and quarantined the entire Western Hemisphere until achieving a 20th century understanding of infectious disease and medicine, the demographic collapse would not have happened. The likelihood of this happening seems fairly remote to me.

Otherwise, it seems that any “close, persistent contacts” would inevitably happen and inevitably lead to the same results.


This is a wayward argument.

You're arguing from the quote that it was something other than disease, and then two sentences later that it was disease, but due to the 'persistent contact needed for conquest'.

None of this adds up to a coherent argument.

If Aboriginals weren't dying en masse from disease, then what from? Because we have crude records of interaction. There were very few violent fights between Aboriginals and newcomers in Canada, for example.

And where is the evidence that Colonialists had 'consistent, closer contact' in the New World than in Africa?

I'm all for more nuanced history, we're learning stuff every day, but I think a lot of it is also speculative, and ideologically driven.


I agree largely with your response, but

> And where is the evidence that Colonialists had 'consistent, closer contact' in the New World than in Africa?

In Africa, Europeans died off rapidly due to local diseases. Consequently, the early slave trade was centered in the islands off of Africa itself, and mediated by a mulatto class who were less susceptible.


No.

The Canadian government forcibly took indigenous children away from their parents to be taught at religious schools where they'd be beaten if they spoke their indigenous language. The goal was literal cultural genocide and to erase indigenous nations and cultures from the continent.

This was not "inevitable" but instead an active policy goal the government pursued.


> The Canadian government forcibly took indigenous children away from their parents....

By the time a Canadian government even existed, most of the damage had already been done.

One might question why this policy was undertaken in Canada but not the African colonies. Perhaps because Canada’s indigenous population was already a minority. But how did that happen?


If Canada's First Nations had been less susceptible to disease, things probably would have played out quite a bit differently.

Things also would have played out differently if the Canadian government and/or colonial precursors hadn't engaged in active genocide.

Engaging in genocidal policies is of course not inevitable but an active policy choice.


> Engaging in genocidal policies is of course not inevitable but an active policy choice.

It was an active policy choice, and an abominable one at that, but it was not the primary cause of the destruction of indigenous cultures. If it weren’t for the residential schools, Canada’s First Nations would still be a marginalized minority in their own homeland, displaced by English and French-speaking settlers. Without the wholesale depopulation of North America via infectious disease, English and French-speaking settlers would have never been able to come here in great numbers at the time they did in the first place.


No seriously this is not at all the case.

Absolutely, First Nations populations shrank massively from pre-contact highs, to varying degrees - there's no debate - but the loss of cultural memory, art, music, and language really only happened very recently, in the last 100 years, and was directly related to government and religious orders imposing residential schools and literal government bans on cultural activity and organization (e.g. the potlatch).

This is not ancient history. The potlatch ban only came into effect in 1885 and was only removed in 1951. Residential schools were only closed in the 1970s.

You can see evidence of this in NW Coast art, where post contact, pre 1900s, there was actually a renaissance in art production and development, as superior iron tooling made it easier than ever to make art, and creation of carvings for the tourist market opened up all sorts of new economic opportunities for First Nations people.

Then in the late 1800s the government imposes literal bans on cultural activity and brings in residential schools. Enormous loss of cultural memory occurs.

By 1969 no one on Haida Gwaii had raised a totem pole in living memory, but Robert Davidson carved one and raised it. He couldn't speak Haida. No one even knew what to do at a totem raising. Luckily there were a handful of old timers that vaguely knew enough to kick start this cultural revitalization.

https://www.cbc.ca/news/canada/british-columbia/haida-totem-...

If indigenous art and culture was already dead due to "inevitable" disease, then why would the government feel any need to actively try to ban it, police it, and arrest indigenous people for taking part? So you can see here that no, despite population decreases, indigenous customs were quite alive and well. The destruction required a brutal government clampdown to try to snuff it out.

Potlatch ban: https://www.ictinc.ca/the-potlatch-ban-abolishment-of-first-... https://www.sfu.ca/brc/online_exhibits/masks-2-0/the-potlatc...


I’m not denying or minimizing these policies. But your argument relies on a survivorship bias. You think, “well, I know that potlatches and totem poles are indigenous cultural practices, and the Canadian government banned those around the turn of the 20th century, so 100% of the lost cultural practices are due to the Canadian government”. But we don’t even know the cultural practices of the majority of indigenous people who were killed by infectious disease between 1492 and maybe 1800 or so. And we barely know how the cultural practices of their survivors were fundamentally altered by those changes.

When you think about potlatches and totem poles, you should consider that these are the culture of a remnant of survivors of a vast cultural collapse that utterly eradicated entire civilizations across two continents. That doesn’t minimize their value; if anything, it makes them more rare and precious, and hence makes it even more fundamentally evil, if such a thing is possible, for the Canadian government to have attempted to destroy these things.


OK, but if there are cultural practices lost between contact and recent living memory, that loss is unquantifiable.

You could be right. You could be completely wrong, and neither of us have any ability to know.

What is measurable is what the Canadian government explicitly tried to eradicate in the last century.


That blending largely occurred in Latin America, but languages rarely survive such things both in the Americas and generally throughout history. Communication is important enough that the language of whichever group is technologically superior becomes the lingua franca, and the other slowly disappears.


That undersells the situation in Latin America, to put it mildly. There are almost two million native speakers of Nahuatl languages in Mexico, six million Maya speakers in Central America, 25% of Peru speaks Quechua at home (around ten million in total throughout the Andes). Guaraní has official status in Paraguay, half of the population is monolingual in it, substantial numbers of people of partial or complete European ancestry speak it every day, we're talking about 6.5 million people.

As a point of comparison, there are 2.9 million registered Native Americans, and 5.2 million people who check that box, sometimes along with others, on the latest census.


I'm not sure what you mean? You named a handful of languages that have survived, most of which are still in decline even if they still have substantial numbers of active speakers. The only one that seems possibly positioned to 'win out' in the long run is Guaraní, and even in Paraguay Spanish is steadily making inroads.


You've basically proved the point of the commenter you are responding to.

By your very own evidence - Spanish and Portuguese utterly dominate Central and South America.

There are 650M people there; if ~10% of the population speaks another language, that helps prove the case that the language of the more technologically advanced group wins.

Even where there is relative parity, systems definitely favour the language of the more powerful entity.

All over Europe there are 'niche languages' dying out, which is sad, but it's materially the case.

Go to Nice, France, and the street signs are in 'Nicoise' - not French. In Monaco if you listen carefully you can hear 'Monegasque'.

The decline of those languages is hinted at in the fact my spellchecker doesn't even recognize those words, unfortunately.


I'm being downvoted here, I think mainly because frank admission of facts is considered mean. For reference, I'm speaking as a member of a tiny ethnic group that survived well into the 20th century, including a period of strong repression. What we couldn't survive was modernization. The need for work led to embracing the English language as the only way forward, and mass media, mobility, and exogamy have all contributed to extremely rapid cultural disintegration. This is the way the world works.


Aboriginals had no writing system so there would be no keyboards.

That should be a signal as to how 'far apart' colonialists and aboriginals were with respect to development of cultural institutions.

The writing you see in this post was invented by a Canadian-English Methodist priest in the mid-19th century for the benefit of the aboriginals. The system, in current terms, is itself 'firmly colonialist' (I'm sure someone will cynically characterize it as a form of oppression).

That said, it'd be cool to see Cree keyboards.

In fact, making a 'Cree Keyboard' might have been a much more practical use of the author's time, and might actually have more materially affected young people's ability to learn Cree.

Come to think of it, there really should be such keyboards available ...


> Aboriginals had no writing system so there would be no keyboards.

Mesoamerica had at least one family of complete writing systems (I'll call this Maya, although whether or not they invented it or adapted it from others is debated), and another proto-writing system that may well have evolved into a full writing system (the Aztecs, who at the time of contact appear to have been in the early stages of planning a conquest of the Maya). Andean cultures had a maybe-it's-a-writing-system-unlike-any-other, the quipus.

Of course, positing a less domineering conquest, it is very likely that cultures may well have developed their own indigenous writing systems via contact with Europeans--that is precisely what the Cherokee did. I doubt they would have stubbornly refused to pick up any writing systems.

> That should be a signal as to how 'far apart' colonialists and aboriginals were with respect to development of cultural institutions.

Yeah, Tenochtitlan had public zoos and museums, organized anthropology, universal primary education, ethnic quarters, and professional sports leagues at a time when all of those concepts would take another few centuries to be 'invented' in Europe.

Oh, wait, were you suggesting that it was the Americas that was culturally backward?

> The writing you see in this post was invented by a Canadian-English Methodist priest in the mid-19th century for the benefit of the aboriginals. The system, in current terms, is itself 'firmly colonialist' (I'm sure someone will cynically characterize it as a form of oppression).

My understanding is that the modern indigenous groups see the use of the syllabics as less oppressive than being forced to use Latin, as the syllabics are designed to more closely match the language than using the Latin script.


Sure, Central American aboriginals did, but essentially none of the North American aboriginals (i.e. the Cree, specific to the article) did.


Or, more plausibly, these societies without writing systems might have developed their own writing systems eventually.

Your post could be applied to the Vietnamese writing system, for example. Or to the many other languages that have adopted variants of the Arabic, Cyrillic, and Latin writing systems.

No society is frozen in time.


That they 'would have, maybe in 1000 years, developed a writing system' is definitely true, but the fact is that at contact they did not, and even after hundreds of years of trade and interaction and the availability of Western literature they still did not; that they did not is a meaningful measure of cultural evolution.

It's hard to do most advanced things without writing if you only have oral, just like it's hard to do some things without Iron if you only have Bronze.


And FYI, the same thing would apply to any culture that lacks a writing system i.e. 'Vietnam' as you said.

We refer to stages of cultural development i.e. 'Iron/Bronze/Steel' but we may very well use 'Oral/Written/Printing Press/Digital'.

The same limitations apply: if you can't forge things with Iron (like strong chariots/carts & ploughs), there's a variety of advancements you can't make. Likewise, if you don't have reading/writing, you're equally limited.

You can't do mass farming without high quality carts and ploughs, and you can't build schools without reading and writing. Without those things, you can't get very far.


> Aboriginals had no writing system so there would be no keyboards.

The Maya and Aztec had writing in the form of hieroglyphics. The Inca had persistent communication via quipu rope knots.

(I learned this from _Guns, Germs, and Steel_ which is a phenomenal book. I haven't done other research, though, so maybe the book isn't a good source.)


Charles Mann's 1491 is a better book than Guns, Germs, and Steel for anything about the pre-Columbian Americas.

The Mayans had a complete, complex logosyllabic writing system. (I believe the syllabic components are more common than logographic components, but I'm not certain). Individual syllables (or logograms) could be combined into a single glyph block in a variety of ways. This writing system, I believe, is connected to Zapotec and epi-Olmec writing systems, but disentangling who created what and who borrowed from whom in Mesoamerica is challenging.

The Aztecs had what appears to be a proto-writing system, largely capable of only recording proper nouns (predominantly place names); most of the writing would instead be conveyed pictographically. Before the Aztecs, in Classical Mesoamerica, Teotihuacan (which was the major power in the Central Mexico Valley at that time period) appears to have never used any form of writing, despite having conquered Classic Maya city-states which were in full florescence of their writing systems.

Quipus originate at least as early as the Wari culture in the Andes, although (again) people only recognize the final Andean civilization, the Inca. Whether or not they are a writing system is debatable--it's known they encode more than just numeric values (such as place names), but whether they can convey enough information to be considered writing is unknown.

Post-contact, Sequoya developed a syllabary for the Cherokee language based only on the knowledge of the existence of the Latin alphabet (he couldn't read English or any European language, but he did have access to European-language materials--that's why several Cherokee letterforms look like Latin ones but have completely different meanings). Missionaries in Canada developed a syllabary for several aboriginal languages that remains in use by many Cree, Ojibwe, and Inuktitut speakers.


Thanks for the reference. Sounds like an interesting book! I see Charles Mann has a couple of books. Have you read 1493 also?

- 1491: New Revelations of the Americas Before Columbus https://www.goodreads.com/book/show/39020.1491

- 1493: Uncovering the New World Columbus Created https://www.goodreads.com/book/show/9862761-1493


It's on my to-read list, but I haven't made time for it yet.


I assume the person you're responding to is talking more about the aboriginal peoples of the United States.

GG&S is an interesting book, but extremely conjectural and ideological, and not well-sourced. Some of the evidence is distorted. Off the top of my head, he reproduces a table of grain yields, and when tracking down his sources for this, it turns out that he's omitted results that contradict his theory. The reasoning is sometimes shaky or circular: 'Why do we know X wasn't domesticable? Because it wasn't domesticated.' Etc. etc. I don't find his theory holds up particularly well.


[flagged]


The creator is Cree himself [0].

I think the storytelling aspect is what he's going for, not just to make C# with Cree keywords. Baskets are woven differently in different cultures. Is it possible to have a programming language that reflects a different culture or outlook? What would that look like? It's an experiment.

0. http://joncorbett.ca/default.html


Don't let past misuses of Cree culture by outsiders restrict their ability to use those paradigms as they see fit. Just because someone in Hollywood thought that smoke signals made a good trope for any movie involving first nations doesn't mean that smoke signals are inherently racist or offensive when used in an appropriate context.


Please don't take HN threads into flamewar.

https://news.ycombinator.com/newsguidelines.html



