Coding - the new Latin (bbc.co.uk)
138 points by soitgoes 2062 days ago | hide | past | web | 72 comments | favorite

It is important, but I prefer Eben Moglen's analogy:

"Software is what the 21st century is made of. What steel was to the economy of the 20th century, what steel was to the power of the 20th century, what steel was to the politics of the 20th century, software is now. It is the crucial building block, the component out of which everything else is made, and, when I speak of everything else, I mean of course freedom, as well as tyranny, as well as business as usual, as well as spying on everybody for free all the time."


Check the total number of steel workers as a percentage of the work force in 1900 versus a hundred years later. It's far smaller.

My personal lesson from this is that computer software jobs in general might soon have the hourly pay of PHP coders on Elance. (Then the cycle of few people studying computer science will repeat... like ten years ago.)

What makes you think "soon"?

It's obvious that at some distant point in the future programming will cease to be the hot thing to do, but what makes you think that might happen soon enough for us to see it?

I'm not sure that steelworkers were ever "the hot thing to do." The downfall of the steel industry in the US is multifaceted, but it mainly boils down to two things: importing steel from outside the US is cheaper than producing it domestically, and technological improvements in building with steel have greatly reduced the number of people needed to construct things with it.

Take from that what you will about the software industry, but just remember that everyone either gets replaced by someone cheaper or technology takes their job away.

Steel workers were never highly paid. Engineers were, and continue to be.

Who here studied Latin?

Well, I did Latin for six years in high school (Sydney Boys, fyi). I find this title, "Coding - the new Latin", a bizarre way of inducing people to learn to code.

Latin is dead. Very dead. Back in my day Latin was billed as "The Fastest Growing Language", but that never transpired.

Now if you want to read Caesar, Virgil, Catullus et al., then fine, learn Latin. But Latin is no longer the ticket to an elite private members' club that it once was.

Latin - is also a language that you exclusively read. There is so little creative element to it. One furthers oneself in Latin by re/interpreting existing works.

Coding - is a creative craft where you take instructions and make something that never existed before.

It would be better to strap coding onto something intensely creative:

  Technical Drawing
  Metal Work
  Fine Art, Sculpture

I believe you're missing the point. Here are some historical uses of Latin:

- Obviously it was the language of the Roman Empire;

- It is the basis for many European languages;

- It was the language of Christianity and the Papacy for well over a millennium (tying in somewhat historically with the conversion of Constantine). This went so far as Roman Catholic services being held in Latin until, IIRC, the 1960s;

- Prior to the printing press, books were the province of monks and the like so being educated (in Western Europe) involved speaking Latin;

- In England, Latin was the language of court and the law for centuries. There are many Latin terms--even in American law--for a reason;

- Latin was the language of science. Element names in chemistry, species names in biology and so on have Latin roots;

- For centuries, Latin was the lingua franca of the ruling class of Europe.

The short of it is that being able to speak, read and write Latin in medieval and Renaissance Europe was power. More to the point, if you didn't know Latin, you were essentially excluded from many things (as many people were). Knowing Latin had as much power and influence as literacy in general.

So what the author is saying is that the ability to "speak" the language of computers, the ability to communicate with them and make them do things, is the 21st-century version of the thrall once held by Latin.

Many have spoken about how expensive it was to launch a dot-com company in the 90s and how that cost has essentially decreased by two orders of magnitude in a decade. One might have thought this would make engineers less valuable as our skills would've been commoditized to some degree.

The converse has happened. While other costs have largely vanished, more businesses became possible that never were before. There seems to be an insatiable demand for engineering talent, at least for the short to medium term.

Last century the dream was the 21st century would be about space exploration and an otherwise "Star Trek" like future. Instead I believe the story about the 21st century will be what incredible things we do with computers. Genetic engineering, curing diseases, artificial intelligence, you name it.

This may go so far as to essentially stratify society. It's hard to overstate just how important computers will be in the next century (IMHO).

"I believe you're missing the point."

"Coding - the new Latin" is a marketing slogan. And if it breeds misunderstanding, or has to be explained, then it's not a good marketing slogan. I design software and write code. My initial reaction to the phrase was "so you're telling people it's hard and they should give up before they get started?" After that, I decided I had to read the article to get the meaning.

Perhaps it's a dead ringer of a slogan in the UK. But it won't get kids a-codin' in my corner of the globe.

If it breeds misunderstanding or has to be explained to people who aren't in the target demographic, that's neither here nor there. They don't strictly need you to understand any more than they need cocker spaniels to. From what I can tell, the people they're trying to reach are British educators and education policy-makers, and if you're not one of them, I feel pretty sure their relationship with Latin is very different from yours.

It is a little harsh on me to say that I "missed the point". I was not short of understanding of the historical significance of Latin. I led with the fact that I studied Latin for 6 years. During this time all your reasons were expounded ad nauseam.

I want to avoid getting into any flamewar with someone whose StackOverflow karma is measured in exponents. (I made no point on the importance of training people to program computers, the second theme of your comment.)

What I was saying is that, for the last hundred years, I believe Latin has involved nearly ZERO speaking.

There are some very scholarly Latin speakers, with an English/Vatican pronunciation divide. But they are the exceptions.

Yes, there is Vatican City, but I think we can both agree that it would be best if there were more people who can program computers than the population of the Vatican (832).

I don't think the OP is comparing programming to Latin today, but to Latin during the Renaissance.

Yes Latin isn't spoken, and nothing is created in it today, but that wasn't true hundreds of years ago.

Programming is like Latin 400 years ago.

There are actually a fair few people who practice spoken Latin for pleasure. As for the pronunciation divide, that's a quirk we inherited from German.

I agree with your factual statements, but my immediate reaction to “coding is the new Latin” is “does this person mean that coding is a skill which is fairly useless for its own sake, but signals to other people that you have mastered a complicated formal system and therefore deserve admission to the club of the elite?”

That's exactly what I thought.

I also think you're missing the point, but not in the sense that cletus talks about. Learning Latin rewires your brain, i.e. it opens your eyes to certain modes of communication and expression that you didn't know existed. I don't know Latin, but I learned Ancient Greek during my Linguistics MA, and was struck by how some Linguistics concepts are "implemented" in that language. My monolingual English classmates' minds were blown by things like noun cases and the vocative. That is why you learn those languages (arguably Sanskrit, Chinese, Arabic, or any other language with a rich history also fits the bill).

In this sense, learning Latin or Greek is similar to learning Lisp: sure you'll not use it in your daily life (I know, many will differ on that), but, boy, does it open your mind up.

Unfortunately, these languages are generally taught by English majors, not Linguistics or CS majors, and so the focus is on the literature as opposed to the cool grammar. The goal of my two semesters of Greek was just as a prelude to reading Plato, so everything was jammed down your throat. I was surprised to learn that our TA, who was writing his PhD dissertation on Ancient Greek drama, didn't know how to say "it's raining" or the name of the color green. Think of learning English just to read Shakespeare!

> Coding - is a creative craft where you take instructions and make something that never existed before.

Mathematics - is a creative craft where you take a formal system (a structure purely of the mind) and create something (a new theorem, an addition to the structure) that never existed before.

That is what mathematics is. That is why there are math majors. If it were all rote nonsense, computers would be doing all of it by now; computers are cheaper than people for rote nonsense, after all.

So, to tie this back to the topic, teaching programming correctly would be much easier if we taught mathematics correctly.

Making advances in math indeed requires creativity. But actually creating something new and expanding mathematics is a Very Hard Problem. It doesn't lend itself to "instant creation" like programming does. Sure, you can apply formulas in many areas of business and life, and you can create simple formulas for these things-- doesn't take a math major.

As to your last sentence, I completely agree.

When I think about all of the problems with the education system I am not sure that "teaching coding" would top that list. The bottom line is that coding is a profession which is not suited to everyone. I remember being a TA at university in the late 90s when IT was booming. Everyone wanted to learn to program. Once they learned enough to realize it meant sitting in front of a computer alone all day - most moved on to something else. Few people have the concentration / (lack of :-) social aptitude. I believe most people have the intelligence.

Basic computer skills need to be taught. Coding? It should possibly be a required semester in university for all majors. ...beyond that? Does everyone really need to know about recursion?

One proposal, which I'm somewhat sympathetic to, is that a general cognitive skill of being able to think procedurally or computationally is important in the 21st century, even if the craft of being able to program is more specialized.

Some papers on that:




     Does everyone really need to know about recursion?
Does everyone really need to know the basics of calculus? Or how about genetics? Does everyone really need to know French? I had classes for the above in my high-school. In fact I can't think of any class I took in high-school that isn't useless to a large portion of high-school graduates.

And I can't think of many examples that are more useful than grasping the concept of recursion.

I think everyone needs to be at least introduced to those fields so they have some shot of figuring out what they might like while learning something that might serve them. Also, if you replace "calculus" with "statistics" and "French" with "English," I would say yes.

You're not disagreeing with me here.

Maybe they do, or maybe they don't. In many cases, I'd argue that it's more about teaching how to think and learn, rather than "an understanding calculus will be required for everything you do in the rest of school / career / life"

Well, judging from that perspective, I don't see how anyone can argue against teaching programming in high school.

Remember that a lot of people who don’t code as a profession end up doing it as part of their work (just as a lot of people who do not have “writer” in their job titles, and who didn’t major in English, end up doing a lot of writing for their jobs).

One of the most horrifying pieces of Perl I had to work on was written by a biologist who knew just enough Perl to translate certain information from File Format A to File Format B. Or at least, he thought he knew just enough Perl....

A lot of businesses depend on complicated Excel spreadsheets whose authors don’t realize that they are programming.

> Does everyone really need to know about recursion?

Yes, and I'd argue indirection as well. If you teach coding without recursion, you're just teaching basic logic coupled with arithmetic, and we have classes for that already. (As an aside, I think iteration should be introduced as a special case of recursion.)
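For what it's worth, a minimal Python sketch of that idea (names are mine, purely illustrative): the loop variable becomes a parameter and the loop test becomes the base case, so every count-controlled loop is tail recursion in disguise.

```python
# Summing 1..n both ways: the for loop and the tail-recursive
# function maintain exactly the same state (n and a running total).

def sum_to_iterative(n):
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def sum_to_recursive(n, total=0):
    if n == 0:                                  # loop test -> base case
        return total
    return sum_to_recursive(n - 1, total + n)   # next iteration -> recursive call

print(sum_to_iterative(100), sum_to_recursive(100))  # 5050 5050
```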

The answer to your question is a very easy YES! Think about it: isn't your question isomorphic to "do all students need high school algebra", "if you're not becoming a historian do you still need to know all those details from history class", and countless others? Some of these questions do have merit (how many among the HN crowd remember or care about how photosynthesis works, or that sin^2 + cos^2 = 1), but they miss the point: those topics are there to open your brain to ideas, mental weight lifting if you will. In this sense you most definitely need to teach recursion and coding concepts.

Coding (especially for high schoolers) doesn't have to be "sitting in front of a computer alone all day"; in fact, if this is how it's presented, that's totally the wrong approach. That comes at the pro stage. At the early stage you have to stress the creative and social aspects of coding.

Everyone who spends their day on a computer would get a productivity boost if they learned to code.

Everyone who spends at least part of their day interacting with programmable devices would have their lives improved if they learned to code.

Now... how many people in the industrialized world don't spend at least part of their day interacting with programmable devices?

> Does everyone really need to know about recursion?

Recursion is taught in gym class through the game of dodgeball. Consider the following C-like pseudocode:

    void throw(int *ball, int hits)
        if (hits < NUMBER_OF_PLAYERS)
            throw(ball + angle(), hits + *ball);
        else
            return;
The question then becomes, do people need to learn specific computer languages, or is teaching the underlying concepts in abstract ways enough?

I thought recursion was taught in music class, through the song about the old lady who swallowed a fly.
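For fun, a minimal Python sketch of the song's recursive structure (the truncated animal list is mine): each verse recurses through every animal swallowed so far, which is what makes the cumulative song a natural example of recursion.

```python
# The "why she swallowed it" chain recurses all the way back down
# to the fly, which is the base case.

ANIMALS = ["fly", "spider", "bird", "cat"]

def chain(i):
    if i == 0:                                   # base case: the fly
        return ["Perhaps she'll die"]
    return ([f"She swallowed the {ANIMALS[i]} to catch the {ANIMALS[i - 1]}"]
            + chain(i - 1))                      # recurse down the menu

def verse(i):
    return [f"There was an old lady who swallowed a {ANIMALS[i]}"] + chain(i)

print("\n".join(verse(2)))
```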

I would argue that dodgeball is better expressed as an iterative algorithm. Consider the following pseudocode:

  def dodgeball(players, balls)
    until players.all? { |p| p.is_hit? }
      throw(nearest_ball(balls), nearest_player(players))
Most people think like this. Dodgeball teaches while loops, not recursion.

Edit: I began writing this before a different post appeared with the same point. Alas!

I'm sorry to be a jerk, but I found your code pretty tough to decipher. Here's my thought process:

  NUMBER_OF_PLAYERS could be, say, 6.
  the caller calls: throw(ball, 0);
  0 < 6, so throw(ball+angle(), 0+dereference(ball));
So, what is ball? It's either an integer and you're just passing its address around for no particular reason, or it's an integer array and you're using angle() to move around inside the array somehow? Then what exactly is stored in that array, the positions of people? Let's imagine it's 1s and 0s to denote a person or not.

After making that leap of logic, your code starts to make sense. You're adding "some amount" to the address, and then whatever is at that address (1 for hit, 0 for nothing) to hits, and passing them into the next invocation, as long as it's less than the necessary number of hits. Could easily be a while loop, but I'll get to that in a minute. I have some other points:

  - Ball isn't a great name for the list of the positions of the players.
  - It isn't clear that ball is an array.
  - There's a pointless 'else return'.
  - You never address the "I don't have a ball" situation!
This code requires explanations of arrays, pointers, and addresses before you can talk about recursion. I'd express how to play dodgeball to a new programmer as follows:

  throw(angle, enemy_positions):
    # This function takes an angle and a list of
    # the spots where other people are, removing
    # them from their spot if it registers a hit.
    # Returns true if it's a hit, false if a miss.

  find_ball():
    # Tries to find a ball; returns the number
    # which you found -- 0, 1, or 2.

  load_enemy_team():
    # Returns a list of the spots where other players are

  enemy_positions = load_enemy_team()
  players_on_other_team = enemy_positions.length()
  ball_count = 0 # all balls start in the middle!

  while players_on_other_team > 0:
    if ball_count > 0:
      angle = get_throw_angle() # user inputs this
      hit = throw(angle, enemy_positions)
      ball_count -= 1 # the ball is gone once thrown

      if hit:
        players_on_other_team -= 1 # they're down a guy!
    ball_count += find_ball() # reload
That should cover it - though I'm sure there's problems with my code too. :)

The biggest point I want to make is that this example is trivially easy to represent without recursion. While recursion can be used in place of any loop, a non-programmer's mind will likely arrive at iteration first for tail-recursive examples. And once someone understands how to solve a problem one way, teaching them a totally different way can be tough.
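To make that concrete, here's a small Python sketch (function names are mine) of the mechanical rewrite: the tail call becomes "update the variables and go around again".

```python
# A tail-recursive counter and its while-loop twin. The recursive
# call's arguments are exactly the loop body's state updates.

def count_hits_recursive(throws, hits=0):
    if throws == 0:
        return hits
    return count_hits_recursive(throws - 1, hits + 1)   # tail call

def count_hits_iterative(throws, hits=0):
    while throws != 0:
        throws, hits = throws - 1, hits + 1             # same state update
    return hits

print(count_hits_recursive(10), count_hits_iterative(10))  # 10 10
```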

I personally think people do pretty well at understanding how recursion applies with respect to exploring a maze, which is almost as universal an experience as dodgeball.
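As a rough illustration, here's a minimal Python sketch of recursive maze exploration (the maze layout and names are made up): at each open cell, mark it visited and try the four neighbours; dead ends simply return, which is the backtracking people do intuitively in a real maze.

```python
# Depth-first maze walk: S is the start, E the exit, # a wall.
# Visited cells are marked "x" so we never loop.

MAZE = [
    "#########",
    "#S.#....#",
    "#..#.##.#",
    "#....#.E#",
    "#########",
]

def solve(maze):
    grid = [list(row) for row in maze]
    start = next((r, c) for r, row in enumerate(grid)
                 for c, ch in enumerate(row) if ch == "S")

    def explore(r, c):
        if grid[r][c] == "E":               # found the exit
            return True
        if grid[r][c] in "#x":              # wall, or already visited
            return False
        grid[r][c] = "x"                    # mark visited, then recurse
        return any(explore(r + dr, c + dc)
                   for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)))

    return explore(*start)

print(solve(MAZE))  # True
```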

> Ball isn't a great name for the list of the positions of the players.

From a code maintainability point of view, yes. I would never write code like this for a real application. However, for analogizing the game in code, I have to strongly disagree.

Ball is not a list of players. It is, just as in the game, a pointer to a location in space. Players may or may not occupy that space. If the player does, a hit occurs. The hit test is based on the intersection of the location in space and the occupation of that space.

I chose to represent my code in a C-like manner because it makes that spatial representation easy to write. People are not thinking about the game in terms of lists and objects, of that I am sure.

That kind of attitude (I can/could program, but it's boring sitting alone) will change soon enough, when you can't feed yourself unless you program all day. The only debate is over how soon that occurs.

I think that's a horrible phrase. The first thing that comes to mind is: "Coding isn't dead!", then: "But coding has direct benefits, not hand-waving indirect benefits like 'you will get better at English!'" I get that it's meant to convey that if you can't code even a little, you're an uneducated person, much like in older times people who didn't know Latin (and Greek) were uneducated. That's a fine sentiment to have, I agree coding is more or less a fundamental skill these days that many are ignoring, but using Latin as the metaphor isn't the way to go. It needs to be as important as learning to write.

I think the metaphor goes deeper. Knowing Latin (and Greek) opened doors to greater learning, providing access to classical writing and history. Indeed, the Renaissance was built on Latin and Greek study. It also was a common language that united the educated elite across Europe.

That's true. Yet my issue remains: how many people know how important Latin really was in its day? A good metaphor that you want to resonate with the broad populace should actually resonate with the broad populace. I don't think I'm alone in thinking "What's a dead language got to do with coding?" as a first thought, and I studied Latin for 3 years. As an example of something I think is easier (nearer) for people to grasp, let's add a new R! Reading, 'Riting, 'Rithmetic, and 'Rogramming.

Apparently Latin is a very popular subject in schools. If that popularity is genuine, it refutes the idea that children are always looking out for the easiest option, which bodes well for CS in schools because (if it's done correctly) it will be challenging. Personally I have my misgivings about the phrase "The New Latin", but I have strong objections to the "Hey kids! Everything is fun, fun, fun!" approach which is so often the alternative.

I actually think Latin's popularity is, at least in part, due to its easiness, at least at my former high school; I'm extrapolating to other schools since the AP test doesn't require speaking it. I mentioned in a cousin comment that I took Latin for three years; I also took French for five. French was a lot harder. Merely speaking it, and understanding it when it's spoken back with different accents and at different speeds, is a lot harder than just reading or writing it, which is all I had to do for Latin. (We learned the simple pronunciation rules and read some things aloud, but that's it.) Yet for the universities around the area requiring two foreign-language credits from high school, Latin was good enough.

Latin was funner for me in some ways though since it has a pattern-matching or puzzle-solving feel to it at times with all the word endings and no reliance on word order. I don't think it's as neat as Arabic but it's a somewhat similar feeling. I agree that overselling things as fun is a great way to kill any fun that might have existed.

> Hey kids! Everything is fun, fun, fun!

Programming is fun if it's taught well, which is why it must always be taught poorly, if at all. After all, education and play are two mutually-exclusive concepts for humans, like they are for all somewhat-intelligent mammals.

I don't think the slogan is talking about the value of knowing Latin today (or a few years ago); it's about the value of knowing Latin in the Middle Ages. In the Middle Ages (in Europe), knowing Latin meant you had access to most of the culture that existed. Not knowing Latin meant you didn't. In that sense, it would be awesome for coders if coding were the new Latin. Alas, it's not. You don't need to know how to code in order to access culture nowadays. You maybe need to know how to use a computer, but there's a long distance from that to knowing how to code.

I think it's a great analogy--there is little like programming for opening your mind up in certain ways. Big ideas from computer science--more than those from any other single subject--have changed the way I see the world.

Of course, I am highly involved in computer science, so its effect on me is disproportionate. However, I think that everyone would benefit from at least a cursory understanding of the big ideas behind CS, particularly abstraction. If I had to choose one, single concept that has affected me more than any other it would be abstraction. Apart from this, the highly logical programming mindset is also healthy. Some of the discipline that comes from writing your ideas in a form even a computer can understand is invaluable.

Additionally, programming is a creative endeavor with a very low barrier to entry: this is imperative for certain types of people. In my arrogance, I view myself as a relatively creative individual; however, I am also fairly lazy. I do not think I would have pursued engineering or art nearly to the extent I did programming (I was exposed to all three at relatively early ages) simply because they required so much more. To build something, I would need materials, tools and space; the same is naturally true of drawing a picture. To program all I needed was a computer, and since they were common by the time I was in elementary school, this was not an issue. This allowed me to make cool stuff without going out of my way.

Computer science really is something that opens the mind. I fervently believe it should stand with subjects like math and literature, not just for practical reasons but because it is immensely valuable for personal development.

This. I often get stuck in communication with people because I tend toward abstraction and they toward being more concrete-bound. I often get annoyed with people when they get hung up on stuff which is just irrelevant detail to me. "No, disregard that, it's not important to the general principle here, you see..."

But lately, I am gaining some appreciation for the detail-focused mindset. It's just two ways to look at things.

Why is it that coding is such a rare (relatively speaking) skill?

Sure, there are some hard concepts in coding, but for loops and conditional statements are hardly more difficult to understand than a lot of the math which gets taught in junior high.

Speaking of that, why is it essential for young people to learn algebra but not to learn for and if?

The simplest way would be to shove computing under the maths curriculum. Teach if with guard statements; skip for and teach recursion instead. By the time the kids get to sixth form, have a mathematical computation course where you teach them to apply their knowledge to a language like Haskell, which mirrors mathematical notation quite well.
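As a quick illustration of that fit (in Python rather than Haskell, but the shape is the same), the textbook definition of factorial transcribes almost symbol-for-symbol into a guarded recursive function:

```python
# The maths-class definition
#     n! = 1             if n = 0
#     n! = n * (n - 1)!  otherwise
# read off directly as code: the guard is the "if n = 0" case.

def factorial(n):
    if n == 0:
        return 1
    return n * factorial(n - 1)

print(factorial(5))  # 120
```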

I would probably create a whole new class called "thinking 101" or something. It should teach:

-numeracy: how much is 1 million/1 billion/1 trillion, etc.

-common cognitive biases.

-logical fallacies.

-basic science.

-basic coding.

-De Bono's 6 thinking hats.

-how the brain works.

-meditation (ie not thinking).

-how to learn stuff faster and better.

-how to remember stuff: memory palace and the like.


I could think up a HUGE list of stuff in this genre that would be very helpful.

During sixth form college (ages 16-18, in the UK) one of the philosophy lecturers started a course with the same sort of goals. It covered critical thinking, written communication, logic, a bit of psychology etc.

The 12 or so hours I spent in those lectures were among the most valuable hours I invested in anything, ever.

Yeah, it's an obvious patch to society with tremendous potential for trickle down benefit. Why isn't it being done?

Honestly? It would likely be too much fun.

There is still, to this day, a large number of people who think education needs to be repetitive rote drill in order to be... real, or legitimate, or even 'useful' by some warped definition of that concept.

It's tied into the notion of hazing, or "If I had to waste my years in school tied to a desk memorizing stuff I don't use, so should you! Builds character!"

And, finally, the idea that if the next generation does it, too, maybe your time doing it wasn't simply wasted.

Sounds like you are reinventing the ExPhil:


Interesting. Do you have any idea what kind of topics the course teaches? Is there a list online?

I'd definitely be interested in reading the full list :) Any more suggestions?

It depends on what you are interested in. Thinking better in general?

Yes, definitely. I don't know the right term / description of what I have in mind though. Maybe "meta-cognition"?

Loving this idea.

I would probably just go with Python or something. Teach them variables, for loops, if statements, printing stuff to the screen, stuff like that.

That way they get a good notion of what programming actually is.
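Something along those lines might look like this (a toy times-table example; the details are mine):

```python
# Variables, a for loop, an if statement, and printing: enough to
# show what a program actually is.

times_table = 7
results = []
for i in range(1, 11):
    product = times_table * i
    if product % 2 == 0:
        parity = "even"
    else:
        parity = "odd"
    line = f"{i} x {times_table} = {product} ({parity})"
    results.append(line)
    print(line)
```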

I'm only now getting into functional programming. I would not have grokked it very well when I was like 12 years old.

Maybe functional programming would be easier to grasp had you not learned imperative programming first.

I never had a problem with this, and my first few languages were BASIC, assembly and C. LISP came naturally to me a few years later.

I think the problem is that people are /used/ to imperative programs, and aren't given real practice in functional programming.
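A small Python sketch of the contrast (the task is arbitrary): the same computation written imperatively and then functionally.

```python
# Sum the squares of the even numbers, both ways.
from functools import reduce

numbers = [1, 2, 3, 4, 5, 6]

# Imperative: mutate an accumulator step by step.
total = 0
for n in numbers:
    if n % 2 == 0:
        total += n * n

# Functional: describe the result as a pipeline of transformations.
total_fp = reduce(lambda acc, n: acc + n * n,
                  filter(lambda n: n % 2 == 0, numbers),
                  0)

print(total, total_fp)  # 56 56
```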

There are also FP zealots, some of whom maintain that learning IP first causes damage. I think this attitude is toxic and doesn't actually help their "cause" (as if good tools needed a "cause" -- good tools should just be good tools).

Well, FP is more elegant. But sometimes you just need a chainsword.

I will not be silenced by the functional agenda, dammit. Sometimes the right tool is imperative programming :p

Algebra is simple. It conforms to a small set of rules that are easily understood, often intuitive, and have few special cases. Problems are small, self-contained, and have clearly-defined parameters. They don't suddenly change on you halfway through the test, they don't have to account for unexpected input, and they don't have to interact with other problems.

Real-world computer programs exist in a large set of arbitrary, often unintuitive rules that are both absolute and malleable, and change between languages and platforms. They have complicated requirements that seek to solve real-world problems that are often ill-defined. They have to cope with unexpected input, hostile input. They are large, comprising hundreds, thousands, even millions of lines of code. They have a multitude of paths that a given piece of input might take, they display emergence.

Coding is not just a simple math problem. It requires clear, detailed reasoning about complex systems operating within arbitrary sets of rules and an unpredictable world.

Related questions might be "Why do people have trouble doing their taxes by hand?" and "Why do people have such a poor understanding of the legal system?". Also, chess.

Algebra is as simple as the syntax of a programming language. Applying programming to solve real world problems is as hard as applying algebra to solve real world problems - and there is also only a very small group of people able to do the latter effectively.

The syntax of a programming language is actually simpler than algebra. Algebra is not particularly intuitive.

And, indeed, that's part of the problem. You really can't claim to be teaching anything of value if all you teach is programming-language syntax. One has to teach people to write some kind of actual program. And that's like teaching prose writing, except that schools and teachers have spent literally thousands of years learning to teach prose writing whereas programming was invented in living memory.

My experience of IT education in the UK was genuinely atrocious. During secondary school we made a spreadsheet in Excel and a couple of Word documents. The closest thing to programming was a picture of traffic lights we had to "programmatically" operate. For a single hour-long lesson. And that is it. We were lucky if our computers even turned on.

The problem is that teachers are often completely computer-illiterate. This is a chicken-and-egg problem; you won't teach proper, interesting computing activities without competent teachers.

Although true in general, some subset of students will figure out interesting things, and then show them to others, if you at least put them in an environment where that's possible. My middle-school computing class was not particularly well taught, but the curriculum included a few simple things in Hypercard, and Hypercard was the kind of environment where students who finished the official assignments early could find all sorts of other cool things to do in it.

That does also require having free time. There's a trend lately towards assuming that any free time students have is wasted time in which they could be learning instead of goofing off, which I'm not sure is the right way to look at it.

We programmed abstract data structures in Pascal. I think we implemented quicksort. I guess I was lucky?

Fun story: The first time I was truly speechless in CS in high school was when my teacher, looking at my code, told me that, for a boolean variable foo, it's not necessary or reasonable to write

  IF foo = TRUE THEN // whatever;  "=" is comparison in Pascal BTW
but rather you can just write

  IF foo THEN // whatever
It seems trivial, even stupid now, but at the time it was an eye-opener.

I remember going a step further than using VBA macros in Access at least once...

Using LOGO in Primary School was far better.

I've spoken to both of my principals and my ICT teacher about introducing a solid computer science course and scrapping "ICT". As a student and an avid programmer, I find it horrible that we don't get a chance to truly leverage the power of computers.

The reason I don't think CS is being widely adopted is that many students have been conditioned to accept ICT as a good computer course and have no clue what you can achieve with proper training and a computer. My classmates aren't interested in CS because they think it's way too difficult and has no tangible effect on society. On top of that, both of my previous ICT teachers had no clue how to program: those days were spent editing movies and making animations in Flash.

Glad to see the powers that be finally take some initiative to solve this problem...

And it looks like they've found what could be a great slogan for their campaign. "Coding is the new Latin," says Alex Hope...

To echo a number of the other comments here (and to try to respond to some of the criticisms of those comments):

I think that's a terrible slogan.

The point of a slogan is to be immediate - something appealing that gets the point across in a catchy and unambiguous fashion.

I think that the associations (and slogans are about associations) Latin has for most people are those of a language that is presently irrelevant and of no practical purpose.

It may have once been really significant, and it may have played an important role in the world becoming what it is today, but such details are not the things that the term "Latin" immediately invokes for most people, and what it immediately invokes is what matters for a slogan.

Being a great analogy does not make it a great slogan.

Ewnay atinlay? ixnay!
