Hacker News

AS3 is an implementation, not a standard.

Large parts of AS3's feature set actually were considered for standardization, back in the days of the "ECMAScript 4" standard (in quotes because it was never released). That standard was abandoned, due to differences some dismiss as politics and others claim were due to implementation and complexity issues.

reply


My first thought on hearing the name "libOS" was that someone had gotten Linux running on an exokernel, which is in some ways a variation on the microkernel theme.

Linux on a microkernel has been done before. The MkLinux project is one example, though it also doubled as a port to PowerPC. I haven't heard of any exokernel attempts, though.

reply


Classes != OOP. JavaScript has had OOP, not quite since its inception, but quite close to it; it merely follows a different paradigm when doing so. In fact, ECMAScript 6's notion of classes is explicitly defined as being not much more than syntactic sugar over the existing paradigm.
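
To make that concrete, here's a minimal sketch (the names `Point` and `PointClass` are just illustrative) showing that the class syntax produces the same prototype-based objects the older spelling does:

```javascript
// Prototype-based OOP, available long before ES6:
function Point(x, y) {
  this.x = x;
  this.y = y;
}
Point.prototype.norm = function () {
  return Math.sqrt(this.x * this.x + this.y * this.y);
};

// ES6 class syntax: sugar over the same prototype machinery.
class PointClass {
  constructor(x, y) {
    this.x = x;
    this.y = y;
  }
  norm() {
    return Math.sqrt(this.x * this.x + this.y * this.y);
  }
}

// Both produce objects whose methods live on a prototype,
// and a class is still just a constructor function underneath:
console.log(new Point(3, 4).norm());      // 5
console.log(new PointClass(3, 4).norm()); // 5
console.log(typeof PointClass);           // 'function'
```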

As for functional programming, JavaScript has had that in some form for years. The addition of tail calls in ECMAScript 6 is an improvement for people who want to use this paradigm, but the language has had most of the old mainstays since at least ECMAScript 5, and some (like closures) date back further than that.
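
For example, closures and higher-order functions (the ES5-era mainstays mentioned above) have long supported this style; a quick sketch:

```javascript
// A closure: the returned function keeps access to `count`
// long after makeCounter has returned.
function makeCounter() {
  let count = 0;
  return function () {
    count += 1;
    return count;
  };
}

const next = makeCounter();
console.log(next()); // 1
console.log(next()); // 2

// Higher-order functions over arrays, standard since ES5:
const total = [1, 2, 3, 4]
  .filter(n => n % 2 === 0)  // keep 2 and 4
  .map(n => n * 10)          // 20 and 40
  .reduce((a, b) => a + b, 0);
console.log(total); // 60
```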

reply


Website != API. A cleverly-written API can exist in the same space as a Website (even the same URLs, if you differentiate by things like Accept: headers), but they're not the same thing.

It's perfectly acceptable for Websites to include, and even require the use of, JavaScript. It's also perfectly acceptable to offer a JavaScript client for your API. In fact, you pretty much have to do that if you want your Website to work with the API anyway. What's not OK is for an API to require downloading and using additional JavaScript while the API is being used.

As a contrived (and somewhat ludicrous) example, let's say that I queried "http://foo.com/bars" for some kind of collection. It's OK to return the collection. It's even OK to use an HTTP redirect if the collection actually resides elsewhere: HTTP 302, perhaps, with a Location of "http://foo.com/bazes". What's not OK is to return a line of JavaScript reading "window.location = 'http://foo.com/bazes';", which might work in a browser but wouldn't work for most other clients.
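
A sketch of why the distinction matters for generic clients (the response objects here are simplified stand-ins, not any particular HTTP library's types):

```javascript
// What a client might receive when fetching /bars.
// An HTTP-level redirect is machine-readable by any client:
const httpRedirect = {
  status: 302,
  headers: { location: 'http://foo.com/bazes' },
  body: ''
};

// A JavaScript "redirect" only means anything to a client
// that downloads and executes scripts, i.e. a browser:
const jsRedirect = {
  status: 200,
  headers: { 'content-type': 'text/javascript' },
  body: "window.location = 'http://foo.com/bazes';"
};

// A generic client can follow the first without evaluating anything:
function nextUrl(response) {
  if (response.status >= 300 && response.status < 400) {
    return response.headers.location || null;
  }
  return null; // no HTTP-level redirect to follow
}

console.log(nextUrl(httpRedirect)); // 'http://foo.com/bazes'
console.log(nextUrl(jsRedirect));   // null -- the redirect is invisible
```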

reply


XSLT was an interesting language that unfortunately tried to solve the wrong problem. What we needed was AWK for the DOM. Instead we got a vision where XML would be used to transform other forms of XML into yet OTHER forms of XML, and the result was a mess.

I think there's still a place for an "AWK for the DOM". The XML extensions to gawk are interesting, but not really what I'm talking about; it works at a lower level (more like an "AWK for SAX"). jQuery comes closer to what I'm talking about, but I still wouldn't really call it a clean match.
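
To show what I mean, here's a purely hypothetical sketch of the pattern/action idea: the node shape (`{ tag, text, children }`) and the `rules` format are inventions for illustration, not any existing library's API:

```javascript
// "AWK for the DOM", sketched: walk a tree, and for each node,
// run every action whose pattern matches -- like AWK's
// /pattern/ { action } lines, but over nodes instead of records.
function walk(node, rules, out) {
  for (const rule of rules) {
    if (rule.pattern(node)) rule.action(node, out);
  }
  for (const child of node.children || []) {
    walk(child, rules, out);
  }
  return out;
}

// A stand-in document tree:
const doc = {
  tag: 'html',
  children: [
    { tag: 'h1', text: 'Title', children: [] },
    { tag: 'p', text: 'Hello', children: [] },
    { tag: 'p', text: 'World', children: [] }
  ]
};

// Roughly the AWK program:  /p/ { print $0 }
const paragraphs = walk(doc, [
  { pattern: n => n.tag === 'p', action: (n, out) => out.push(n.text) }
], []);
console.log(paragraphs); // [ 'Hello', 'World' ]
```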

reply


That's a good idea - keep XPath but lose the XML based syntax.

Edit: In fact, lose the XML completely and go with JSON and JSONPath.

reply


It's not so much portability as optimization. Valid asm.js is also valid JavaScript, so it can run in V8 unmodified; it just runs without the asm.js-specific optimizations.
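
For instance, a toy asm.js-style module (following the spec's annotation conventions; the module itself is made up for illustration) is still an ordinary JavaScript function, so an engine without asm.js support just runs it as plain code:

```javascript
// A minimal asm.js-style module. The "use asm" directive and the
// |0 coercions mark it for ahead-of-time compilation in engines
// that support asm.js; everywhere else it is ordinary JavaScript.
function AsmAdd(stdlib, foreign, heap) {
  "use asm";
  function add(a, b) {
    a = a | 0; // coerce to int32
    b = b | 0;
    return (a + b) | 0;
  }
  return { add: add };
}

const mod = AsmAdd(globalThis, {}, new ArrayBuffer(0x10000));
console.log(mod.add(2, 3)); // 5, with or without asm.js support
```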

reply


The HTTPS-only folks mean well, and I support it as a stopgap solution, but it is useful only in that it can probably be implemented more quickly than IPSec-everywhere (or, if IPSec proves to be unsuitable, then some successor standard with the same goal of encrypting all traffic).

The latter, however, should be preferred as a permanent solution. The Web is by no means the only part of the Internet that needs to be secured.

reply


IPSEC and HTTPS work at different levels. With IPSEC, your computer can be sure it's talking to the computer at 198.51.100.1 and not to any other computer. With HTTPS, your browser can be sure it's talking to www.example.gov and not to any other web server. Both work equally well against passive eavesdroppers, but they authenticate different things and so will work differently against active attackers.

reply


In other words, IPSEC is useless without DNSSEC, and that isn't getting universal adoption anytime soon.

reply


Oh, no, it's useful for what it's designed for: to protect communication between two computers. If I have IPSEC protecting the connection between my desktop and my internal DNS server, and between my desktop and my database server, I know that the connection to my database server is protected by IPSEC.

It doesn't protect the mapping between a computer name and an IP address, but that's not its job.

reply


I think it's more like IPsec hasn't happened because it's a huge hairball of complexity which requires kernel-level configuration on every client and full end-to-end support for two new IP protocols and a UDP key management service.

In contrast, TLS requires using a new client library and works just about everywhere. All of the work people have been doing to switch to strong crypto everywhere and deploy things like perfect forward secrecy? Imagine how quickly that'd have happened if it required everyone to install a kernel update.

Until IPsec becomes easier to use (something as simple as checking that a socket is actually secure used to be shamefully under-documented), the best way to think of it is as a potential replacement for proprietary VPN protocols. Anything which cares about security will still need TLS on top of that, so most people will simply use only TLS.

reply


No, IPSEC is useless because the ISPs broke the end-to-end nature of the internet which IPSEC assumes as a starting point.

So, IPSEC is a non-starter.

reply


It's worth noting that HTTP(S) has broadened beyond the Web in the web-browser sense. Most native mobile apps, and lots of APIs used by desktop apps, use HTTP to get their job done. It definitely doesn't cover everything, but I think it's fair to say that HTTP is basically the protocol of our lives right now.

reply


It makes sense to me that depressed people are just as accurate at judging time as non-depressed people. In cases where depression has a sudden onset, that "slower" rate of time passing might take some getting used to. But eventually you WOULD get used to it, and start to judge the passage of time from that frame of reference.

There's no reason that couldn't be just as accurate as a non-depressed person's judgment. It's just a matter of relearning the skill. Or learning it in the first place, for those of us whose sense of time wasn't so acute to begin with.

reply


It strikes me that perhaps the simplest way to kill plagiarism is to simply require a handwritten rough draft. Take the copy/paste convenience out of cheating, and far fewer students will cheat: writing your own thoughts is a lot less tedious than hand-copying someone else's. It also has the bonus of ensuring that at least some minimal editing has taken place, over the course of writing the paper a second time.

Students with disabilities that prevent writing a draft by hand (actually missing the dominant hand or not having use of it, arthritis in the wrists, carpal tunnel syndrome, or whatever) will need accommodations, of course. No solution will catch everyone. But it should still radically shrink the pool of copied (and copyable) papers.

-----


I'd wager that most people here would go ahead and write their paper on their computers, then recreate a plausible handwritten rough draft afterwards.

And that seems about as difficult as recreating one for an essay you didn't write.

-----


I write about 50 times faster on keyboard than in handwriting. Why should I be penalized in this fashion?

-----


A reasonable alternative would be to require a hardcopy draft with visible editing marks. Going to paper at least once helps a lot in the editing process.

-----


Hardcopy or not, making students iterate over their work, with feedback at each step, is the right way to do this. Make submitting drafts and responding to feedback part of the grade. The only problem is that this is a lot of work for the instructor, who increasingly is an adjunct with little experience, running all over town to teach six courses just to make ends meet.

-----


low tech is sometimes the best tech ;)

-----


It is true that few students have thoughts that have not already occurred to others. This is not the important part of honest research. The important part is that the student actually HAVE these thoughts, and actually EXPRESS them, rather than just copy/pasting someone else's expression of the idea without ever thinking about it themselves. It is the process that matters, not the outcome.

In other words, buckle down and do the damn work. Laziness is not a virtue when it comes to thinking.

-----
