Google Go: A New Programming Language That’s Python Meets C++ (golang.org)
377 points by snewe on Nov 10, 2009 | 219 comments



My word, a whole bunch of what I'm reading is exactly what I have/had intended to implement in my toy language, prism. Bit-specified integer (and real) types, increment/decrement operators as statements (prefix only!), hell, even using "var <X> ..." for variable declarations. The guiding principles are divergent though. And the bit about deduced type relations (http://golang.org/doc/go_lang_faq.html#inheritance) leaves me with mixed feelings.

I'm actually happy! It means my stupid little language has some validity :P

For those who wish to deride it: http://code.google.com/p/prism/


Well, in the interests of actually furthering discussion:

I feel as if string and map simply could have been part of the standard library. My solution to this for prism was to have a section of the standard library (which doesn't exist yet) marked as "integral", to effectively have an implicit "import string, map from std.__integral__" at the top of every translation unit.

The absence of operator overloading seems to contradict their claims of improving the visual "flow" and feel of programming. There are cases where savvy operator overloading works wonders: vectors/matrices/quaternions, strings, complex numbers, etc. Being able to write my_string + my_other_string is far more elegant than strcat(my_string, my_other_string). Consider what happens when you have four or five strings you'd like to concatenate!
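A quick sketch of the difference in Go, whose built-in + already concatenates strings; concat here is a made-up stand-in for a strcat-style function:

    package main

    import "fmt"

    // concat is a hypothetical strcat-style helper, for comparison only.
    func concat(a, b string) string { return a + b }

    func main() {
        name := "gopher"
        // With an overloaded +, several pieces read naturally:
        fmt.Println("Hello" + ", " + name + "!")
        // Without it, the calls nest awkwardly:
        fmt.Println(concat(concat(concat("Hello", ", "), name), "!"))
    }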

The choice of CSP reflects Google's mentality: Their problems lend themselves very well to CSP. Other domains (in my opinion) work better with a STM model. But that's clearly a design choice, and I think it's fine!


I'm not a huge fan of "+" for string concatenation as "+" is, traditionally, commutative; I like Lua's "..". That being said I like "+" more than "strcat".


Operator overloading has good points and bad points: we're erring on the side of simplicity for now.

However, you can write "abc" + "def" in Go and it does what you expect.


=> "egi"?


Thankfully strings in Go are completely distinct from arrays -- immutable, and with baked-in UTF-8 support.


I'm sorry if this is a stupid question, but where'd you get that? I don't get it. :(


"abc" + "def"

a + d = e (1 + 4 = 5 = e)

b + e = g (2 + 5 = 7 = g)

c + f = i (3 + 6 = 9 = i)

I think he was making a joke about how + with strings doesn't necessarily mean string concatenation.


I think the point is even better than a joke: there is a danger in allowing things like, e.g., operator overloading (or redefining methods in the standard library), because programmers have different feelings about what the code "obviously" means.

If in your language, + means string concatenation by convention, then + darn well means string concatenation. If in your language + means "contextually appropriate depending on what arguments you pass to it", then

"abc" + "def" => "egi"

doesn't strike me as obviously wrong.

More broadly, consider a language in which we teach initiates that "Plus adds things in the way you'd expect!"

map = {}

map2 = map + {"key" => "contents"}

What are the contents of map and map2 now? Uh oh. More worrisome:

map = MemcachedWrapper.new

map2 = map + {"key" => "contents"}

What just happened?


This whole line of reasoning reminds me of the Java3D API. Java, of course, takes the approach that only the designers of the language were smart enough to know when to abuse operator overloading (e.g. with strings), so for the Java3D math API, they couldn't have nice syntax for vector arithmetic. Instead of nice operators for addition (+) and accumulation (+=), they just have English, and they want it to be terse, so the accumulation method is .add, and there is no addition method for vectors. It's not terribly confusing, but seeing vec1.add(vec2) rather than vec1 += vec2 is ugly, and you still have to read the documentation to know what the .add method does.

Avoiding operator overload doesn't avoid poorly named methods, nor does it avoid the need for documentation. All that it does is force programmers to do ugly things when dealing with math, and there is a lot of math-based programming out there.
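To make that concrete, here is the same tradeoff sketched in Go, which also lacks operator overloading (Vec3 is a hypothetical type, not any real API):

    package main

    import "fmt"

    // Vec3 is a hypothetical 3-vector; with no operator overloading,
    // all arithmetic has to go through named methods.
    type Vec3 struct{ X, Y, Z float64 }

    func (a Vec3) Add(b Vec3) Vec3 {
        return Vec3{a.X + b.X, a.Y + b.Y, a.Z + b.Z}
    }

    func main() {
        v1 := Vec3{1, 2, 3}
        v2 := Vec3{4, 5, 6}
        v1 = v1.Add(v2) // instead of v1 += v2
        fmt.Println(v1) // {5 7 9}
    }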


I'd like to make two points in response...

1) I did say 'savvy'. Sort of because I don't get to use that word enough, but also because it is possible to abuse operator overloading. String concatenation with the addition operator isn't going to surprise anyone, even if you come from another language (well, it might, but the choice of the + operator isn't an unintuitive one). Furthermore, nobody's jumping at things with well-defined operators, like vectors/matrices, etc. I've written very little outside of a maths library that used operator overloading, but it's a godsend for the cases where it's appropriate... probably a truism.

2) They're just functions. You can look them up (y'know, assuming you can), you can optimise them, and you can document them. A programmer encountering an operator doesn't have to rely on their interpretation alone, and indeed shouldn't. They should know what that operator does. Now of course, if the operator has absolutely no contextual reason for existing, or is surprisingly divergent from its apparent purpose... well, see point 1.

As for your example, how do you do a greater-than-or-equal comparison on strings? :D


And he could have expected 18AB to point out that those strings wouldn't necessarily be interpreted as strings in a math context.


Oh, element-wise addition. Somehow I didn't figure that out. Thanks.


:P Nice one, made me smile


I always thought a huge benefit of Lisp's S-expressions is that the lack of infix operators saves a lot of time and energy arguing over whether/when they should be overloaded.


I'd have to disagree with the title "Python meets C++". There are very few features that have been brought over from either of those languages.

It looks like C with some very carefully chosen additions: memory safety, concurrency and maps.


I agree. From the FAQ:

"Go is mostly in the C family (basic syntax), with significant input from the Pascal/Modula/Oberon family (declarations, packages), plus some ideas from languages inspired by Tony Hoare's CSP, such as Newsqueak and Limbo (concurrency). However, it is a new language across the board."

Two of the designers were Rob Pike and Ken Thompson, both instrumental developers of Unix/C and Plan9/Inferno. Thompson created the B programming language that C was based on.


Don't forget Russ Cox, who I believe stuck with Plan 9 longer than either Rob or Ken.

I'm excited to see CSP in Go. I've been advocating its use for concurrency since I learned about Plan 9's channels, but most people paid no attention.


Russ still maintains 9vx and Plan 9 from User Space.


From the Tech Talk PDF, it is mentioned that Robert Griesemer, Ken Thompson, and Rob Pike started the project in late 2007. Ian Lance Taylor and Russ Cox joined in 2008.

And the goal, as they describe it, seems to be the efficiency of a statically typed, compiled language with the ease of programming of a dynamic language, but with:

Safety: type-safe and memory-safe.

Good support for concurrency and communication.

Efficient, latency-free garbage collection.

High-speed compilation.

They are aiming at a lightweight type system with no implicit conversions and no more 0x80ULL.

A good starting point is their FAQ page and also the tech talk at youtube: http://www.youtube.com/watch?v=rKnDgT73v8s


I'd say that name-based interfaces are indeed Pythonic.

> Any type that has the named methods [...] is said to implement the interface.

For instance, in the tutorial, the cat function takes a reader interface. fmt.Print and fmt.Println will output whatever a type's String() "method" returns. This is what Python does, e.g. "print x" is equivalent to "print x.__str__()".
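A sketch of the Go side (Temp is a made-up type, but fmt really does pick up a String() method this way):

    package main

    import "fmt"

    // Temp is a made-up example type.
    type Temp float64

    // Because Temp has a String() string method, fmt's printing
    // routines call it automatically, much like Python calls __str__.
    func (t Temp) String() string {
        return fmt.Sprintf("%.1f degrees", float64(t))
    }

    func main() {
        fmt.Println(Temp(21.5)) // prints: 21.5 degrees
    }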


That's not a Python thing - it was present in Smalltalk, for example.


Well, if you're looking for features that Python was the first language to have, then I don't think Python has any "Python things". Garbage collection is still Pythonic, nonetheless.


Is it really useful or meaningful to claim every feature that Python has is "Pythonic", though? Are dicts "Pythonic"? Awk had them a decade earlier. Is garbage collection "Pythonic"? Lisp had it long ago. Dynamic typing? Generators? Etc.

To my understanding (and admittedly, while I've been using Python since around 1.5, I've largely abandoned it for Lua), 'Pythonic' is usually used to refer to the clearly preferred way of doing something that may have multiple implementations - Python's community makes explicit the kind of convention that C programmers observe informally, such as "for (i = 0; i < N; i++) {".

Using a dict rather than an int-indexed list is Pythonic, for example, because that's just the way that it's done in idiomatic Python, and the community encourages having one preferred idiom for common tasks. I don't think the concept applies to language features such as garbage collection.


A much more interesting addition is the interface type and values.


Syntax looks like Pascal meets C++


As any pythonista would say: "from __future__ import braces"


Garbage collection also... which is maybe the most "Pythonic" thing they've worked in. Look at all those semicolons and curly braces!


I had a friend who was worried about the overhead of interfaces (are methods called by hash lookup on name or by something like a vtbl?), so I compiled a simple example program to test how dynamic dispatch works: http://scott.wolchok.org/vtbl.go

The binary is at http://scott.wolchok.org/6.out . If you search for deadbeef, you'll find a sequence of instructions ending in an indirect jump, so it is decidedly not some kind of string comparison.

The function in the compiler that generates calls through interfaces seems to be cgen_callinter in go/src/{6,8}g/ggen.c, but I can't decipher the terse code therein. Would be interested to learn more about Go method dispatch, especially as regards dynamic linking -- does Go have it? What if I create a new interface that encompasses objects in a library built before I created that interface? Do I have to rebuild the library?


I was ready to be dismissive until I read who was behind it. This is part of the Bell Labs legacy. Kickass.


I also suspected bell labs the moment I saw the page... not sure where that idea came from. Maybe from this? ;) http://plan9.bell-labs.com/plan9/img/plan9bunnyblack.jpg http://golang.org/doc/logo-153x55.png

[edit: creepy thought... Go logo == the Glenda bunny being thrown at something?]


Perhaps flying? :)


How come who was behind it is important but who says what on HN is not?

One could have extra information right beside each submitter/commenter's name (e.g., ... founder, Google employee, PhD in ..., etc.).


There are some usernames I will recognise, and their comments do matter more to me. Also, if someone does say something interesting, you can often learn more about them from their profile page should you wish to.*

And, yes, it would amuse me for a couple of days if everyone with/working on a PhD were encouraged to append their thesis titles after their usernames.

* though yes, for people I don't recognise who are not posting particularly interesting comments, it does not matter who posted them. Such as this comment I assume, anyone could write it.


Imagine someone commenting on Google but you don't take his/her comment seriously and post a reply indicating that this person is mistaken.

You later find out from his/her profile that this person is a Google employee and is in a position to know what he/she commented about. It might be too late by then to take back what you wrote.


Too late for what? This is the Internet. You're allowed to make an ass out of yourself sometimes.

I remember when I came on here and criticized Delicious and got in an argument with joshu because I didn't know who he was. If anything, I'd like more anonymity, because the people I do know I sometimes tiptoe around.


There's also follow-through and prolonged effort, which matter for a project in a way they don't for points in a discussion. On HN, the idea is that each comment should be considered on its own merits. At the opposite end of the spectrum, the project should stand on its own merits. However, those merits are likely to be derived from the choices of the person(s) working on the project, so the person(s) involved in the case of a not-yet-finished project are relevant.


You could say that who is behind it is important precisely because you don't need to add extra information beside their names: their names stand on their own.


If a commenter on HN says something about Google, wouldn't you take his/her comment more seriously if you know that he/she is a Google employee?


Anyone could just say they are a Google employee. You'd have to solve that problem too.


As soon as I got to the channels portion of the tutorial, I knew it was Rob Pike's baby. He did a tech talk a while ago about parallel languages, which was very interesting: http://video.google.com/videoplay?docid=810232012617965344&#...

EDIT: after watching some of the tech talk, it's evident that while Rob Pike is involved, it's not entirely his baby... He is, however, involved, and there are definitely ideas from Newsqueak in the language.


You probably know what happened to NeXT legacy.. =)


It's tainted because of that. I became dismissive as soon as I saw who was behind it...


Intriguing. Mind if I inquire about your thoughts on transistors, another Bell Labs creation?


We're talking about the UNIX and C heritage. Hardware is irrelevant and I'm surprised you received any up-votes at all. Bell Labs may produce some good things, but programming languages isn't one of them.


Doubt it will be successful, no beards: http://www.youtube.com/watch?v=rKnDgT73v8s


Ken, iant and I have beards thank you :) Rob, gri, Russ and Kai are sadly lacking.


Any chance you could please ditch the brace requirement?

You're already marking up the code for humans using indentation; don't repeat yourself by having the compiler use a different parsing mechanism (and make developers mark up separately for their colleagues and the compiler).


> Any chance you could please ditch the brace requirement?

Wearing braces is optional. It's only the beards that are mandatory.


Let's please not ditch the brace requirement.


Could you provide either some supporting arguments or a response to the parent post?


Sure. The 'indentation is enough' argument ignores that braces offer redundancy and are unambiguous.

They're like guard rails on the highway, they make totally explicit what is going on.

The 'but you're writing for other humans to read' argument is imho not 100% the truth: you're writing for both the computer and other humans to read.

Humans like the indentation, the braces make it explicit.

The computer doesn't care about the indentation, it just looks at the brackets.

Drop a bracket and you get an error, drop a couple of spaces and your program will happily continue to work, only not in the way you intended.

It's funny how even in languages that don't use 'begin/end' markers (such as Python) there are extraneous characters, such as the () pair when calling a function (surely there are ways to do without that?), or when making a list or a dictionary.

Some things come in pairs, and the beginnings and endings of blocks are such things.

In Pascal it was BEGIN and END, which seemed overly verbose; in C it was { and }, which seems about as minimal as you can go before you become invisible, and when you get invisible you lose something.

Try cutting and pasting some Python examples from online fora and see how well it works without.

And try figuring out from a Python program that has been created on a box with alternate tab settings what its intended function was (yes, I know there are tools for that situation).

Block begin and end markers have meaning; they are 'there', and even if you don't strictly speaking need them, they cost next to nothing and make me feel a lot more like things are spelled out to mean exactly what I'm reading.

I should probably start collecting Python examples in the wild on the web where the indentation is clearly messed up so the code no longer works the way it is intended; this is a lot more common than you'd want. It is also very frustrating when learning a language.

So, please keep those braces.


Redundancy is an argument against braces, not in favor.

'They're like guard rails on the highway, they make totally explicit what is going on.'

Using this logic, perhaps you should add some ASCII art to your source to denote blocks a third time? This is also a common practice. Do you feel the same about duck typing?

'The 'but you're writing for other humans to read' argument imho is not 100% the truth, you're writing for both the computer and other humans to read.'

Yes, it's true that the computer also parses the code. You still haven't provided an argument for why separate blocking formats are required.

'they cost next to nothing and make me feel a lot more like things are spelled out to mean exactly what I'm reading.'

The cost is evident: there are many cases of indentation not matching braces and vice versa, and the inconsistency between the two causes confusion between the language and the humans trying to debug issues.

Tell me you've never looked for an unmatched brace in code you can otherwise parse?


Redundancy is an argument against braces, not in favor.

I disagree. I know that the word "redundant" sometimes carries a negative connotation, but the truth is that it is a mere property whose value depends upon the context and associated tradeoffs.

Spacecraft have redundant systems, but those systems are far from superfluous.

Human languages have built in redundancies. Generally they are there for ensuring communication still happens in a noisy environment.

My music CDs have redundancy in the encoding, and thankfully at that: after years of accumulating scratches I could still rip them without loss of quality. Redundancy was a positive here too.

Redundancy is a tradeoff, not an automatic downside.


Agreed about context, but let's be clear: spacecraft and music CDs have redundancy to prevent loss.

What loss are you preventing by requiring blocks be delimited twice?

Have you considered that inconsistencies due to repetition are frequently themselves cited as a cause of data loss?

Ever had code that looks good to you but doesn't parse because a machine is expecting an extra bracket?


I'm no rocket scientist, but I'm pretty sure that not all redundancy in spacecraft is aimed at preventing data loss. If you look closer you'll see redundant gyros for attitude control, redundant sources of power (solar panels, fuel cells) and probably numerous redundancies in environmental subsystems on the ISS (air, water, waste), and even redundant explosive bolts for stage separation.

I think jacquesm already mentioned the value of the redundancy above: you will never have a situation where a change in indentation leaves you with a runnable, but semantically-different program when you pasted code from a forum or from a machine with different tab settings. Note how the missing-bracket case in your last question is superior to this: a missing bracket offers fail-fast detection of the problem. If you indent a line wrong your program may still run, but differently: indent the last line of a block at the wrong level and it may happen after a loop rather than inside a loop.

The indentation is for the person. The braces are for the compiler. I like this, personally. I understand how some may prefer the python way (it has a superficial elegance, after all). But I personally don't like the downsides that jacquesm mentioned.


'you will never have a situation where a change in indentation leaves you with a runnable, but semantically-different program when you pasted code from a forum or from a machine with different tab settings.'

In that situation, with any computer language, you'll have an unreadable program. In Python, you'll need to fix the readability before the program will execute - which is a win to me.


I'll admit I don't know what magic the Python interpreter is capable of. I was imagining cases where an indentation change does not break readability:

    for foo in bar:
        foo.fizzle()
    baz.wibble()

...versus...

    for foo in bar:
        foo.fizzle()
        baz.wibble()
...how would it know that I wanted to wibble the baz every time I fizzle a foo, as opposed to wibbling the baz only after all foo have been fizzled?


Because you've told it to by making your data look correct.

Python executes your data the same way you yourself parse it: by how it appears. Brace-requiring languages execute your data using different rules than you use to parse it.

Redundancy of logic is universally known to be a poor engineering practice.


Tortoise: I'd like a bicycle helmet, please.

Achilles: Why do you want to wear a helmet?

Tortoise: So that I won't hurt my head if I bump it.

Achilles: You won't bump it.

Tortoise: Suppose I do.

Achilles: You won't bump your head.

Tortoise: But suppose, hypothetically, that I fall off my bike and bump my head. What would protect it from injury?

Achilles: Your head is not injured, because you didn't bump it.

Tortoise: But that's the very basis of my hypothesis!

Achilles: Yeah, I'm going to just ignore you said that, and tell you again that you won't bump your head.

Tortoise: How do you know?

Achilles: Because you made sure that it didn't come into contact with anything.

Tortoise: Good night, dear Achilles. May you encounter others like you.


Sorry, poor Tortoise, you're putting words into Achilles' mouth.

Fortunately, Python gives you:

* A helmet. It won't parse unblocked code, just like any other language.

I'm not sure why you think blocking a second time with brackets would improve things or avoid this risk. But hey, if you want to wear a second helmet, that's cool. May you encounter others like you. :^)

* A mirror where things are as large as they appear (the blocking looks like it executes), unlike other languages where these can get out of sync and cause problems. If you enjoy this risk, that's awesome. High five.

* A handy warning light should you accidentally stray too close to the wall (Python will tell you that your readability is broken at the same time it tells you your blocking is - and you only need one fix). If you'd prefer not to be told about this, prefer two places to fix things, and enjoy receiving badly written code, that's your prerogative, and who am I to say anything other than that it's not mine.

Achilles: Nice bike and helmet.

Tortoise: No!

Achilles: Er, ok. Could you explain?

Tortoise: I need to get a helmet!

Achilles: You're already wearing one.

Tortoise: But what if I fall?

Achilles: You're wearing a helmet, which will protect you.

Tortoise: But I need to get a helmet.

Achilles: No you don't, you're wearing one. Oh and by the way, you need to be visible at night. The helmet's reflective, so you probably don't want to cover it with another ...

Tortoise: You're not listening to me!

Achilles: I am listening, but I'm also disagreeing. Are you listening to my disagreement?


I was going to answer you until I got to the 'ascii art' bit; clearly you are not taking this seriously.

I apologize for wasting your time with my attempt at putting into words what I was thinking, at your request no less.


I am taking it seriously, else I wouldn't have bothered replying to a blank 'i disagree' post with a polite request that you contribute to the discussion. The question is quite genuine - where in your view does the need for redundancy stop?


Braces have a very practical level of redundancy: they cost literally next to nothing in terms of effort and reduce a whole class of errors to 'impossible'.

Let me give you a simple example, in two pseudo-languages, one with and one without braces:

   if a > b
     a = a + 5
     b = b / 2

   if (a > b) {
     a = a + 5
     b = b / 2
   }
vs

   if a > b
     a = a + 5
   b = b / 2
or

   if (a > b) {
     a = a + 5
   b = b / 2
   }
Even though the text is similarly corrupted (for whatever reason), the redundancy in the second example makes it clear that something is amiss; you could even lose the second brace and still at least know that all is not kosher in that segment.

That's not 'ascii art', that's functional.

Redundancy definitely has its uses.

I'm going to do some digging later on and find a bunch of examples of broken code due to cut & paste problems, some of it on the django site, no less.

That's something that would never happen in C or any other 'overspecified' language.

And regarding 'finding that matching bracket': that is exactly what they're for. If you can't match the bracket, chances are that you really do have a problem in the code.


If you've spent any time with Python, it should be immediately clear that:

   if a > b
     a = a + 5
     b = b / 2
is quite different from:

   if a > b
     a = a + 5
   b = b / 2
Without needing a second markup for the compiler to emphasize the point.


Of course it is clear that they are different; that was not the point of the argument...

The point is that you can't know which one was meant to be written there in the first place, when all that changed is that some literally invisible characters are gone.

It's like with comments and code: if a comment says 'add two volumes' and I see that the code does not do that, then I have a hint that something is not correct.

It could be the comment or it could be the code, but there is something smelly there.

Redundancy = good.


'all that changed is some literally invisible characters are gone.'

Er, no. That's not the extent of the changes at all: the very visible code has shifted to the left. I refuse to believe you cannot see that with your own eyes.

'The point is that you can't know which one was the one that was meant to be written there in the first place'

In any language at all, the lack of indentation makes it quite clear the intention is different.


You've lost sight of jacquesm's premise, which is that 1) the intention is not accurately reflected by the indentation, and 2) the mismatch does not break the syntax of the language.

You asked what benefit might come from the redundancy of a curly brace, and this was a scenario in response.


When the intention is not reflected by the indentation, you need to fix that, in every programming language.

Python also has the handy benefit of telling you this immediately.

jacquesm's statement about the visible difference is obviously false.

However I don't think it's doing anyone any good to keep contributing on this topic. All I was doing was making a suggestion to improve something. The troll who posted 'no' with no arguments has a bunch of supporters, I'm being moderated down to nothing. You're not responding to my points, and you seem to think I'm not listening to you, though I'm trying to understand what you're saying. I give up, this isn't worth it.


I'm only guessing, but the downmods may actually be because of disingenuous rhetoric. Example: "you need to fix that, in every programming language."

This problem only exists if indentation is significant in the language as a way of demarcating blocks. This is exactly the reason for the suggested syntactic redundancy. A reasonable person can accept the tradeoff, but it's not reasonable to insist that the tradeoff isn't there at all.

Edit: the proposed scenario is readable code. It is, in fact, disingenuous to ignore that. It isn't candid to claim apathy, either.


Needing to have readable code is a problem in every language.

If you find that disingenuous, fine, I don't care anymore.

Edit: you consider badly indented code readable? You must be great fun to work with. :^)


In my defense, Python would agree with me that it is readable: it would happily execute the line in the wrong block scope because a single, invisible character was present or missing.

I bet you meant "semantically correct" instead. Yes, every program's semantics need to be correct in order for the program to be correct. That's a tautology, and it's a deflection from the subject of the tradeoff analysis that you requested upthread.

You're probably a bundle of joy to work with, too. :^)


No, I mean readable. For example:

   if (a > b) {
     a = a + 5
   b = b / 2
   }
...is not considered readable in a brace-requiring language, because its appearance doesn't reflect what's executed.

Ie:

* Python and similar languages would execute the code consistently with how it looks. This is good.

* A brace-requiring language would execute the code in a way that is not consistent with how it looks. This is not good.

Scroll up: I didn't request any analysis, I made a suggestion. Someone posted a blank 'no', and I asked them for a more civil explanation of their views.


...and I asked them for [an] explanation of their views

Yeah, that's the request I was referring to.

Python and similar languages would execute the code consistently with how it looks. This is good.

It's good and bad. You win something, you give up something. This is the nature of a tradeoff. It's not like Python lives in an alternate universe where tradeoffs don't exist. But I'm beginning to think its advocates live in one where tradeoffs are invisible in broad daylight.


'Yeah, that's the request I was referring to.'

Having an opinion thrust at you does not equate to soliciting said opinion.

It's your responsibility to clearly state the benefits of what you're advocating. You don't seem to be able to do that.


Is it possible for you to have an argument without resorting to the personal?


This thread started with a contentless troll post. You've said things in this post that are provably untrue, and the other poster has accused me of being disingenuous. You've also moderated simple, globally accepted truths like 'unreadable code needs to be fixed' past zero without replying.

I suggest you examine your own courteousness.


It's funny how even in languages that don't use 'begin/end' markers (such as Python) there are extraneous characters, such as the () pair when calling a function (surely there are ways to do without that?)

The () when calling a function with no arguments is not extraneous -- because Ruby lets you omit that, it needs yet another function type to disambiguate when you want a reference to a method (total of 4: Methods, Procs, Lambdas, and Blocks).

What is completely superfluous in the syntax is the colon preceding an expression suite -- you're already using a keyword and indenting the next line(s).


The only thing I HATE about Python is the mandatory indentation. The problems with it are:

1. It mixes format with logic. Don't we all know this is one of the greatest evils one can commit?

2. Indentation is a bitch to parse compared with braces. It violates Occam's razor of software engineering -- simpler is better. Unnecessary complexity is the second greatest evil in software engineering.

3. Indentation is neither symmetrical nor logically unique. Therefore you can't delete all the formatting from a Python program and expect a formatter to get it right. With symmetric braces a formatter always knows exactly what logical level it should be on by counting the closed braces; that's not possible with Python's goofy system.

4. All that tiresome debate and worry over tabs versus spaces.

5. One can't really say "braces are redundant, not indents" or "indents are redundant, not braces". Redundancy is commutative. I would say braces are for computers, indents are for humans; it is simple to go from braces to indents but not the other way around.

Just a note -- I will always pick Python over any other scripting language. I just think the indentation is a serious mistake; there are so few other bad choices it tends to irk me that much more.


Indentation is logic. Repeating logic is one of the greatest evils one can commit. http://en.wikipedia.org/wiki/Dont_repeat_yourself

Re: parsing: read each line; if its indent is greater than the current indent level, start a new block, otherwise close blocks until the levels match. (A sketch of this loop appears below.)

Tabs were considered harmful before Python; they're still considered harmful now.

You could add braces to Python quite easily; I think this has already been done in jest a few times...
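A rough sketch of that grouping loop in Go (spaces-only input assumed; a real tokenizer would also handle tabs, blank lines, and bad dedents):

    package main

    import (
        "fmt"
        "strings"
    )

    // blocks turns leading-space indentation into explicit INDENT/DEDENT
    // markers, roughly what Python's tokenizer does.
    func blocks(lines []string) []string {
        var out []string
        stack := []int{0} // open indentation levels
        for _, line := range lines {
            ind := len(line) - len(strings.TrimLeft(line, " "))
            for ind < stack[len(stack)-1] {
                stack = stack[:len(stack)-1]
                out = append(out, "DEDENT")
            }
            if ind > stack[len(stack)-1] {
                stack = append(stack, ind)
                out = append(out, "INDENT")
            }
            out = append(out, strings.TrimSpace(line))
        }
        return out
    }

    func main() {
        src := []string{"if a > b:", "    a = a + 5", "b = b / 2"}
        fmt.Println(blocks(src))
        // [if a > b: INDENT a = a + 5 DEDENT b = b / 2]
    }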


My main gripe with Python is that it's impossible for autoindenters to work correctly. Consider this code:

    if x == foo:
        # do things
        if y == bar:
            # do something
        # do other stuff <-- at what indent level should this line live?
    # get on with life
You and I know what that means given its indentation, but an autoindenter can't know what to do with the fifth line. It gets worse when you're moving blocks of code into different indent levels.

Now, it's true that in this simple case, you could rewrite the snippet starting with:

    if x == foo and y == bar:
        # ...
    # get on with life
but when blocks get long, this becomes less practical. And your autoindenter still doesn't know when to stop indenting and get on with life.

As a result, in my Python code, I use otherwise gratuitous pass statements all over the place: ugly, but it does the trick.

    if x == foo:
        # do things
        if y == bar:
            # do something
            pass
        # do other stuff <-- at what indent level should this line live?
        pass
    # get on with life
My hypothetical ideal language (the one that's not lispy in outward appearance, anyway) would use braces but would also not run if any inconsistencies of indentation are found.

This would be allowed:

    if x == foo {
        # stuff
        # more stuff
    }
But this would not:

    if x == foo {
        # stuff
       # more stuff <-- BAM!
    }


http://golang.org/doc/go_faq.html#Concurrent_programming "Do not communicate by sharing memory. Instead, share memory by communicating."


Chrome. Closure. Go.

It says great things about Google that these all clearly started as projects that they assumed nobody would ever have to Google. (Although http://www.google.com/chrome is now #1 for [chrome]).


Is there a reason why so many languages are nearly impossible to search for? (C, C++, Go, etc.) I mean, surely someone must have figured out that this is not the best naming trend quite a while ago.


Our kind is so dominant on the net that our interests eclipse the original concepts. Try googling Java, Python, Lisp or Ruby - not much about Indonesia, snakes, speech impediments or gems to be found.


Rich Hickey picked the name Clojure partly because the name is easy to search for.


Perl was chosen in part for the same reason. Long before Kibo made searching for your name on usenet famous (how many here remember Kibo?), in the early days (late 80s until the volume got too high) Larry Wall used to monitor all mentions of Perl on Usenet.


Not only do I remember Kibo; I was killfiled by Kibo ...

(I feel old, now. Time to think about learning a new language.)


I haven't heard that name in a very long while.



Have you ever tried to find anything about Io (http://iolanguage.com) through google? Impossible. Any projects I think up in the future are going to have to be named from this angle: how easy will it be to google for?


"io language" or "io programming language" work just fine. Of course I don't know what I'm missing, but I assume everyone who writes about Io uses one of these words in their text.


Those are nothing compared to D and R (relatively unknown) and my favorite: Io.


funny, I can always find malbolge.


Searching "go programming language" (without the quotes in the search) seems to work for me.




Hem... at some point you will need to search for more specific things.

"package management for go programming language" or "XML parser go programming language" probably won't be so useful, and it is sure cumbersome to type.


Maybe it's a branding exercise in disguise. You can't search for ["Chrome", "Closure", "Go"], but for the first two you quickly get what you're looking for if you associate the company name: "Google %s".


It's legally the safest and easiest route. If your company name is already trademarked, you don't need to worry about someone tacking a common word on the end and stealing it. This is why so many Microsoft products have such generic names -- and all of them include at least one prior trademark. Google is at least trying to keep a flat namespace instead of Microsoft's nested hierarchical mess. If nothing else, it would help prevent duplication of efforts because of potential naming conflicts.


> This is why so many Microsoft products have such generic names

Actually, in their case, I think it's more that they want to choose names that are a mix between a commonly used word and one similar to an already existing product. Wordperfect --> Word, Netscape Navigator --> Explorer, OO.o XML --> OOXML, etc.


One more - jscript, which happens to be very similar to JavaScript, but has a number of differences.


Yup. Their standard "Embrace, extend, and extinguish" routine.


Hm, it could be a self-challenge - make web search good enough to know what I'm talking about when I look for Go?

I'd expect: the game, then the language. Certainly not go.com


Since I saw this the other day:

http://duckduckgo.com/?q=go&v=


It seems another programming language named Go! already exists:

http://news.ycombinator.com/item?id=935545


Ok, a language for engineers, not for people who want to be truly expressive with the code they write. This is my impression from looking at the examples; there is something odd about the way the code looks.

Now from the technical point of view, there are a few interesting points like the threading approach, but in general the language does not appear to provide great new ideas, nor to be a language where consolidated ideas are put together very elegantly.

It seems that for systems programming the "better C" one could need is different from this, while for general programming what is really missing is something like Ruby or Python made as fast as C and with low memory usage, by means of the right sacrifices to the language, while still retaining most of the abstraction level.

Go is not as powerful as C++, not as high level as Java, not as raw as C for low level stuff. I don't like it.


Neither C++ nor Java implement Hoare's CSP, or have any primitives fundamental enough to implement them in terms of.

Go is exactly what you're asking for in your third paragraph -- it seems you were blinded by the curly braces...


Yes, CSP is a good abstraction that Go implements, but programming languages are not just a matter of features like call-cc, tail calls, closures and so on. These are very important things, but they're not enough.


The language breaks with a tradition established in the past 15 years, which says that every new language must be at least a few hundred times slower and less memory efficient than everything that came before it (I know there are exceptions).

Go is simple and fast. It has garbage collection, it facilitates parallel programming and it does away with lots of useless OO cruft. At first sight I like it.


  hg log --rev 0:4 --template '{date|isodate}\t{author}\n\t\t\t\t{desc|firstline}\n\n'
  1972-07-18 19:05 -0500	Brian Kernighan <bwk>
  				hello, world
  
  1974-01-20 01:02 -0400	Brian Kernighan <bwk>
  				convert to C
  
  1988-04-01 02:02 -0500	Brian Kernighan <research!bwk>
  				convert to Draft-Proposed ANSI C
  
  1988-04-01 02:03 -0500	Brian Kernighan <bwk@research.att.com>
  				last-minute fix: convert to ANSI C
  
  2008-03-02 20:47 -0800	Robert Griesemer <gri@golang.org>
  				Go spec starting point.


AT FIRST GLANCE:

Annoyances: braces; variable declarations more verbose than in C

Flaws: no discriminated unions; no generics or parametric polymorphism

Doubts: no exceptions; no macros (as far as I can tell)

    Edit: I take back the verbosity claim. := does inference
    Edit2: interfaces replace generics, but are 
        type-checked at runtime, apparently:
        http://golang.org/doc/go_for_cpp_programmers.html#Interfaces


No macros indeed. Union types will probably happen sooner rather than later.


No macros is pretty annoying, but note the packages go/ast and go/parser; the standard library includes a parser for the language, which should make writing tools operating on Go significantly easier.
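For instance, a minimal sketch against the current go/parser API (details may shift while the language is young):

    package main

    import (
        "fmt"
        "go/parser"
        "go/token"
    )

    func main() {
        src := "package main\nfunc main() {}\n"
        fset := token.NewFileSet()
        f, err := parser.ParseFile(fset, "example.go", src, 0)
        if err != nil {
            panic(err)
        }
        // Walk the top-level declarations of the parsed file.
        for _, decl := range f.Decls {
            fmt.Printf("%T\n", decl) // *ast.FuncDecl
        }
    }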


Yeah. Languages with no control flow other than "call" and "return" were ok for the 80s... but we know better now. Just add continuations and you get functions, coroutines, exceptions, loop exits ("last", "next", etc.), and so on for free.

I am not sure why I'd use this language with no libraries when I could use a better-designed language with no libraries instead.


There are other control flow options available. For example they offer the following simple example of what they call goroutines:

    ch := make(chan int);
    go sum(hugeArray, ch);
    // ... do something else for a while
    result := <-ch;  // wait for, and retrieve, result
You can actually set up any complex mess you want of functions talking to other functions through channels. This gives you coroutines, threading, etc. Looking through the spec, it looks like they have anonymous functions and closures as well.

While I agree that they need to add exceptions, they do have some interesting things in there already.


It ain't CPAN yet, but I think "no libraries" is an overstatement: http://golang.org/pkg/


On Edit2: interfaces don't replace generics, they're more like a strong duck-typing system (if that makes any sense).

Generics are being worked on (according to the tech talk), but they haven't determined the best way to do them yet.


Exceptions are definitely needed.


"Go has its own model of process/threads/light-weight processes/coroutines, so to avoid notational confusion we call concurrently executing computations in Go goroutines."

I like this not just because it's CSP reborn -- new languages should be introduced with their own delightful disambiguating nomenclature.


I like it because it rhymes.


Garbage collection, type inference, built-in maps and arrays, immutable strings, concurrency support. Very nice!

Best of all, no header files!
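Most of that list fits in a few lines (my sketch, not from the announcement):

    package main

    import "fmt"

    func main() {
        counts := map[string]int{} // built-in map type, no imports needed
        counts["go"]++             // missing keys read as zero
        s := "hello"               // := infers the type
        // s[0] = 'H'              // compile error: strings are immutable
        done := make(chan bool)
        go func() { // concurrency is built in
            fmt.Println(s, counts["go"]) // hello 1
            done <- true
        }()
        <-done
    }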


As a random aside: it's possible to build C++ projects without header files, and get your source files to automatically work out their own dependencies and all that good stuff.

I did it with a horrible C++ preprocessor hack. It was fun.


And for C, see makeheaders http://www.hwaci.com/sw/mkhdr/



Despite the large amount of enthusiasm for language design, modern mainstream programming languages don't fall far from the C tree. The best that Microsoft, Sun, and Apple have to offer are just variations on that theme, with the addition of predictable object models and conveniences like garbage collection.

Hey, give MS some credit for F# here! And it's got out of the box support in Visual Studio 2010.


Nice article, but calling Go not innovative because it has a C-like syntax is a bit naive.


Journalists journalist, coders code.


There's a shitload of coders who write well.

* Kathy Sierra

* JWZ

* ESR (okay he's a bad coder, but still)

* Zed

* _Why

* The MIT Introduction to Algorithms instructors who clearly spend time making their lectures entertaining as well as informative.

* And yeah, most of the staff at Ars:

DRY dictates not having separate markup for the compiler and your colleagues - so yes, in that respect, the unnecessary braces in Go are not innovative.


"Journalists journal, coders code." FTFY


It seems like Ars Technica also have their own shortened links...

http://arst.ch/9uw


You have to love a project where the designers are sitting on irc fixing build bugs as they come in. (No, this isn't unique for opensource projects, but it's always great to see.)

Thanks for giving me something to do with all my free time for the next couple of weeks.


For my doctorate I did everything in CSP and Occam, so it's wonderful to see the channel communication methods defined for this language.

http://golang.org/doc/go_mem.html

If you do things via synchronized channel communication then it eliminates (mostly) the need to spend all your time with mutexes and the like, because the channel sync does that work for you. At the beginning it'll probably take people time to get their heads around it, but my recollection was that it was a game changer once you understood how the synchronization worked.
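For readers new to the style, a minimal sketch (mine, not from the Go docs): one goroutine owns the state, and all access goes through channels, so no mutex appears anywhere.

    package main

    import "fmt"

    func main() {
        deposits := make(chan int)
        balance := make(chan int)
        go func() { // this goroutine alone touches total
            total := 0
            for {
                select {
                case amount := <-deposits:
                    total += amount
                case balance <- total:
                }
            }
        }()
        deposits <- 50
        fmt.Println(<-balance) // prints 50
    }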


I use C when I have a program with tight timing and/or memory requirements. I see that this is a GC language, so am I right in assuming that it's not suitable for soft or hard realtime systems?


Most of Google's infrastructure is soft realtime, so GC delays have definitely been a concern. The memory footprint for a simple hello.go on ARM using 5g is currently ~70k with the optimizer not yet enabled. I'm planning to do some work on an AT91SAM7 with 256 KB of flash, and it should fit OK. You might be able to squeeze it into a smaller space if you drop some of the runtime.


Assuming the OS threads/preemptive multitasking are playing nice (i.e. letting me devour timeslices at my leisure), how well do you think the GC would work when I need to land events in a 15 microsecond window at 40 Hz on a normal desktop PC? I'd be crunching a decent amount of math and discarding/creating objects between events, but total memory utilization would not be going up or down.

Sorry if that's kind of specific, it's some of the limitations in one of the current problems I'm working on :)


Given that the realtime GC is not done, nobody really knows. However, the design goal is for the GC interrupt time to be linear in the stack size of your goroutine (with small constants). Thus you should have at least some control over event latencies through program structure.

Like I said earlier, there is significant interest inside Google to make it possible to write really low latency services in Go. I personally think 15 usec is reasonable to ask for.


That's very very cool, thanks for the info


GC pauses are currently unbounded, so I wouldn't use it for that sort of thing, no.


Right on, good to know, thanks :)


Erlang is used for 'soft real time' applications, and it does GC.


I wonder if Go is what will "replace" Python internally at Google? ( w.r.t. http://news.ycombinator.com/item?id=933493 )


I can't find any sort of reference to a Google sponsorship. Am I missing something?


We (the Go team) all work at Google :) See also http://www.youtube.com/watch?v=rKnDgT73v8s


Thanks Adam.


It's a very promising language, and the tutorial's not bad.

The main quirk I see right now is that the syntax feels more like a shuffling of characters, rather than actually simplifying anything. For instance, they save parentheses in for loops (great idea!), yet suddenly I need a useless "func" before every function. It also seems like there needs to be a default type for arguments, either "int" or "string", for it to feel truly simplified.

A feature I would really, really like to see in a language is the notion of automatically printing lone strings, or automatically printf()-ing lone string format lists. It's a little irritating after all this time that one must "echo" or "print" or System.out.println() or fmt.Printf() everything (much less import standard libraries to do it), when it is such a common task.


wow... first impression is this is my dream language... Can't wait to start messing with this...goroutines yay...I could be wrong but this looks really nice.


The home page touts "safety" as a feature of the language, but all it takes to crash a program is concurrent access to the built-in map: "uncontrolled map access can crash the program", according to the FAQ.

Interesting so far, some syntax ugliness and the lack of macros aside, but not yet good enough for many systems programming projects, especially given the state of the garbage collector. GC is the hardest part of implementing large-scale, concurrent, memory-intensive applications (e.g., database servers). Java spent 18 years perfecting its GC into the current state of the art, which still sucks for these applications.

I hope they can come up with a concurrent garbage collector that can work on > 16 GBs of RAM with sub ms max latency :)


It is intended to be absolutely safe: dereferencing an invalid pointer doesn't result in "undefined behavior" -- it results in your program halting every time. You can't accidentally succeed or allow for buffer overflows, which is all that's important (especially since Go doesn't have exceptions).

Hopefully they can rip off the design of GHC's amazing parallel garbage collector (it runs in its own pthread), and ditch the current mark-and-sweep.


Maybe if we're lucky Azul will go broke and Google will buy up the talent.


Looking good. I always thought something like this would be LLVM-based; this is surprising.


They said in the FAQ that they "felt" LLVM was "too large" and "too slow". I would have liked to see this statement backed up with data.


Link to Ken Thompson's earlier C compiler paper: http://doc.cat-v.org/plan_9/4th_edition/papers/compiler (search for 'Performance').


Yes, I saw that (and took their word for it). It is interesting that they seem to care deeply about compile-time speed (I guess given that Go is a fresh start in so many ways, people thought it would be nice to have super fast compiles for quick turnaround and testing).


After reading "The Case for D" a little bit back, what these leaves me wondering is not, "Why this over C or C++?" but "How does this compare to D?"

After a quick glance, D piqued my interest more. D seems to fall a bit nearer to the C++ side of the fence than Java, but also boasts C++-level performance and relative safety.

http://www.ddj.com/hpc-high-performance-computing/217801225

Neither of them will replace C or C++ in the near-term simply because those languages are so entrenched, but if I were looking for a language just to play around in for potential future use from these families, at this point it'd be D.


Looks interesting! I feel that an improved C is what's mostly lacking in the programming language landscape nowadays.

So Google's now got its own OS (two, actually), its own browser, and its own programming language. Anything missing yet? Their own editor / IDE?


They actually maintain at least four browser codebases: Chrome, Android's Webkit, Picasa (uses Gecko), and their server-side renderer.

They have at least three product OSes: ChromeOS, Android, and The Google Search Appliances.

This is at least their second new programming language project: there's also Rob Pike's Sawzall. They have numerous in-house implementations of existing programming languages: V8, Closure, GWT, Dalvik, Unladen Swallow, plus all the extant ones they maintain forks of.


May I ask what you mean by an improved C language? Safe pointers? I think pointers are the biggest advantage of C. Of course, they can (and occasionally will) become trouble at some point, but you won't get that much flexibility and portability in any other language. Not trying to say that creating a new language is a bad idea or anything, but I just don't see how this language will make my life better.

P.S. No offense, but it's amazing how some people are such Google fans.


Bram Moolenaar (vim) is a Googler.


...and has his own experimental C language replacement: http://www.zimbu.org/


It's like several people got the same idea... http://ooc-lang.org/ is also an improved C really.


Their own editor/IDE?

I wouldn't imagine this as something that they'd release.


Oh, you never know. They release a lot of the not-business-specific infrastructure they improve, which an editor undoubtedly is.

I'd argue their secret sauce, and the parts they wouldn't want to release, are:

* the insane scaling stuff

* all the intelligent things they do with the insane amount of data they have, and the insane rate at which they can process it


A lot of this looks very nice. One very simple thing I noticed missing was a binary integer literal syntax. In a systems programming language, it would be nice to be able to write bit sets without shifts sometimes. Just use the obvious '0b10100101' syntax.

Also, I think the zero-prefix convention for octal literals is very confusing and possibly error-prone, but it seems to be used even in modern programming languages. Is there a good reason to keep using '0765' instead of something like '0o765'?
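For illustration, the suggested spellings (not valid in Go as announced, though later versions of Go adopted exactly this syntax):

    package main

    import "fmt"

    func main() {
        flags := 0b10100101 // suggested binary literal (today: 0xA5)
        perm := 0o765       // suggested octal prefix (today: 0765)
        fmt.Println(flags, perm) // 165 501
    }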


"However, in Go, any type which provides the methods named in the interface may be treated as an implementation of the interface"

I've been wanting this in C# for a while now. Maybe one day...


Yeah, I'd kill for Duck Typing in C#.


IronPython is duck typed and runs on the DLR. The same should be possible for a future C#...


I never knew what duck typing referred to. Thanks.


(possibly redundant)

It's based upon the phrase: if it looks like a duck and quacks like a duck, it's a duck.

In this case, if the object has the methods of the duck interface, treat it like a duck object.
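In Go terms, a sketch:

    package main

    import "fmt"

    type Duck interface {
        Quack() string
    }

    type Mallard struct{}

    // Mallard never declares that it implements Duck; having the method
    // is enough. This is the statically checked analogue of duck typing.
    func (Mallard) Quack() string { return "quack" }

    func main() {
        var d Duck = Mallard{}
        fmt.Println(d.Quack())
    }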


Some of the style reminds me of Alef from Plan 9; in particular the communication channels, but also a fair chunk of the syntax.


That will be because that's what it's based on.


It does certainly smack of "Not-Invented-Here Syndrome," but compared to the Dilbert-like project management culture of my job I rather admire it.

I've just been getting into generic programming with C++ (Stepanov's new book, awesome). So the fact that they don't have templates or operator overloading (the two critical pieces of STL style) was a bit of a downer. That being said, the interfaces concept reminded me of generic functions (in CLOS). Sounds pathetic, but I'll probably give it a try just because it's from Google -- though I'm quite skeptical. With the really interesting work going on in Haskell, Clojure, Fortress, etc., it seems a shame to stay so close to C.


Have a look at some of the names behind it and see if you stick to your 'Not Invented Here'.


I don't think you understand what "Not invented here syndrome" means. See: http://en.wikipedia.org/wiki/Not_Invented_Here

What I mean is that Google always has to do things the Google way whether that's an improvement or not. Whether this turns out to be a valuable contribution to the field will play out over the next few years. We'll see. But a company that makes their own DB, their own phone, their own programming languages, etc. is pretty much a prototypical "not invented here" company.


I was familiar with the term, but thank you for your attempt at educating me.

Google has some unique problems; they also have a unique position in that they have probably pooled the largest amount of IT talent worldwide.

That gives them clout enough to do these things. Just like Erlang came out of a phone company (why couldn't they have used some existing language?) and Apple made their own phone (I'm sure they could have afforded a Nokia or two), Google can and does have the budget to branch out into other fields.

That's not 'NIH'. NIH is doing something even though better alternatives exist; for instance, writing your own bookkeeping software when you're a mid-sized company and you should be spending your time on serving your customers instead of writing bookkeeping software.

Google even makes their own servers, at their levels of scale a lot of things that do not make sense for smaller companies actually improve the bottom line.

Google suffers from many defects, bad customer support would be the top of my list, but NIH isn't one of them, at least not in this case.

That's almost like accusing Xerox in their heyday of having NIH, and I'd hate to think what the world of software would look like today if it hadn't been for that.


"Reinventing the wheel we invented earlier"?

(That's not a criticism—language design shouldn't always be in the revolutionary mode.)


So how long before we start seeing craigslist ads that ask for 5 years of Go experience?


2 weeks, tops.


not before the end of the world (2012)


Perhaps time travelers exist, and this is just them accidentally blowing their cover?


I am curious if someone could explain how a project in Go would compare to a project in Python or Ruby + a fancy array of custom C modules? This implies that it is possible to implement the nicer features of Go (for example concurrency support) in the modules supporting these languages. Or instead making kickarse fast compilers for higher level languages?

Is it too simplistic to say that at one end you have expressive languages designed for humans and at the other you have low-level languages designed for speed?

edit: not meaning to insult humans who like fast languages.


I am not so much interested in Go as an implementation platform as I am in the idea of a bridge for web programmers to real C.


I like C, and maybe it is Good Enough, but don't you think C has some real flaws that could be fixed in a newer language?


I think there's ~30 years of C code that isn't going away and that is going to remain the bottom layer of most programming stacks, and it's good for devs to be able to drop into it.


That's a great point, and I won't even look at a new language anymore unless it has C linkage. Which Go does (as do D, OCaml, and many more...).
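
For the curious, a minimal cgo sketch of that C linkage (build details, like the #cgo directive, may vary by platform and toolchain version):

    package main

    /*
    #cgo LDFLAGS: -lm
    #include <math.h>
    */
    import "C"

    import "fmt"

    func main() {
        // Call libm's sqrt directly from Go.
        fmt.Println(C.sqrt(C.double(2)))
    }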


There will always be a need for a bottom layer language, but 20 years from now, will it still be C? I hope at some point the world will migrate to a newer language.


Anything you could write a native filesystem or device driver in would be semantically quite similar to C, even in the ideal microkernel fantasy.


Why would you hope that, if the old language is doing just fine for the purpose?


'Just fine' is relative. The state of California still does all its financial transactions on a mainframe running COBOL; the cost of maintenance rises every year, but thanks to our budget deficit nobody wants to write the check for the cost of replacement.


I would like to see Google "eat their own dogfood" before I invest in Go. Or do they already use Go in production?


I agree as to "investing", but as to experimenting with and trying out, I don't see why not. And the site running on Go seems to be puttering along nicely...


golang.org runs Go.


I meant a real business application like GMail or similar.


Do you want your email server to use a language that even the developers admit isn't yet ready for production?


No, I did not write that, but since there is no serious Go application inside Google, their management could very easily "pull the plug" and no longer support this experiment.

I would probably fear that, if I would invest time in learning Go.

I don't understand why the team released Go to the public at this early stage of development, or what they expect from the developer community.


You have reversed the sense of olaf's point.


Apparently, Go style is to use tabs (see go-mode.el). This is not Pythonic.


There's also gofmt, which automatically applies the full house style using tabs.

If anyone can bring about a tab renaissance, it's bwk, ken, rob, rsc, and gri. These titans wrote your operating systems and your programming languages. They all wrote their own editors (not sure about gri) -- hell, the Go tests use ed!

Tabs have been lost in the wilderness for too long; it is about time they were reborn!


Yeah, this was most disappointing (I'm otherwise giddy). I hoped source tabs were dead and buried.


I am just wondering how the same link got posted twice: http://news.ycombinator.com/item?id=934140 ... Both point to http://golang.org.



import "fmt"

what advantage does this have over

import fmt

such that a new language would choose this syntax?


It's a shorthand for:

import fmt "fmt"

Where the quoted fmt is an import path. Not making it a string literal would create a potential clash between characters used in filesystem paths and the Go parser.
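
For example (a sketch; the alias name is arbitrary):

    package main

    // The quoted string is an import path, which may contain characters
    // (like '/') that a bare identifier can't; the optional identifier
    // in front rebinds the package to a local name.
    import (
        format "fmt"
        "math/rand"
    )

    func main() {
        format.Println(rand.Int())
    }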


Could it have something to do with allowing unicode in the source?


It allows module names with spaces/other special characters, for two.


It keeps the grammar simple, for one.


IMO, that's no reason to make me type two extra chars every time I import.


You're a programmer, right? So you type 70-100 WPM, right? That's roughly 5-8 chars per second, so we're asking you for 0.25 to 0.4 seconds per imported module, or maybe 5-10 seconds per program.

I do see the reason to design a language efficiently, but I emphatically disagree with optimizing away small numbers of characters. Typing is a basic skill for the job, and the overhead is dwarfed by the text of the program (not to mention that we're making everyone type braces and semicolons).


If it's a conscious tradeoff decision, doesn't it need a positive, however small, to counter that negative, however small? That positive might be "close to C++ syntax," but I am wondering if there is any other.


The main selling point is compile time: the explicit quoted path tells the compiler exactly which compiled package to read, with no header re-scanning.

Sweet...


And finally, C 2.0, sponsored by Google. Good move against Apple.

It (Go + Native Client) might become the next major platform. Just think: a low-level (and easier to code in than C[++]) JavaScript alternative.


What the hell??? Didn't they come out with a programming language last week already or some such? I don't see the point exactly. I mean, I can learn it, but then what? It's not going to be installed anywhere where I can do work, it's not going to have any third-party libraries, and it's not going to be supported by any decent editor for a while (I use vi, throw your rocks).

Google: what is the point? First it's browser plugins, then browsers, then two OSes, now programming languages? How about a nice new processor architecture a la PowerPC? It's better and faster, you know. Maybe they'll also invent new people to be their user base. I can imagine that intro video: "Welcome to GPeople - the new Google interaction experience. When you interact with a GPerson you don't need to talk. They already know what you are thinking and will insert relevant advertisements into their conversation with you. The advertisements will be unobtrusive and will target your innermost fears..."



Google has a ton of really smart people using their 20% time on projects they find interesting. I think they've shown great restraint in what they release publicly. I'm sure there are a ton of very cool new things being worked on inside the company.


Can someone give me some examples of where this might be used (and to do what)?


This new programming language from Google is generating quite a buzz on the web right now. To be honest, I always get excited about tools that promise to speed up and simplify development. But most of the time, those trendy tools just turn out to be yet more time wasters. One language I find to be [almost] ideal is Scala. I believe that Scala will replace Java. Also, writing code for the JVM gives you a huge advantage: you can create OSGi bundles. IMO, the OSGi framework is the most mature module system. So I think there's nothing good about this language except that it was created by Google.


I am not interested in really learning another programming language, and from the looks of it, the productivity claim is questionable. Not only can many of the examples they show be done in fewer lines of code in other languages, but you also do not need to re-learn the languages you already know. Many programming languages offer the features they mention as useful, so why learn yet another one unless it truly offers something useful that does not already exist? As one person stated, it has braces. I like braces for the purpose of organization, but I dislike variable declarations because they only serve to clutter the code and sometimes decrease productivity. Programming languages need to become more productivity-oriented, and I do not think Go accomplishes this goal. Developers want to be creative with their code, but they would probably rather be more creative with their projects instead.


This is Hacker News; learning new programming languages, and learning and thinking about them, is part of what hacking is all about (at least for me it is).

If you think that programming languages are a 'done deal' then try to ignore the last 15 years or so and see what you're left with.

Sure, we're not much closer to the holy grail of programming in absolute terms, but new languages (Erlang, Clojure, Ruby, Python, to name a few) have definitely incrementally changed the perspective we have of programming, and have opened up more fruitful ways of building certain classes of applications.


When I stated "I am not interested in really learning another programming language," I should have worded it more clearly: "I am not interested in learning this particular programming language because it does not seem to be a step above existing languages. It seems outdated before it has even been released." Simply because something has big names behind it does not mean the product is automatically superior to others. I intend to learn other languages, of course, but I do not intend to waste my time.



