
It seems likely that, simply, times have changed. There was a time when being on the same platform as the deployment environment was super important, but nowadays the tooling has gotten so much better that it matters a lot less. The proportion of people still writing C code on a day-to-day basis has dropped... well, to pretty much a rounding error.

The bigger issue is really that ARM servers aren't that much cheaper than x86 servers today, and it's very likely a lot of that cost difference is just Intel's artificial market advantage, which would disappear if ARM actually started becoming a threat (and which has already started to erode with AMD becoming one). Phoronix ran synthetic benchmarks of AWS's A1 instances against the C5 Intel and C5A AMD instances [1]; the A1s are nothing special at all, even with price taken into account.

Maybe that'll change in the future, but with AMD now competitive and pushing Intel into high gear, it's hard to say that ARM will have any effect on the server market in the short term.

[1] https://www.phoronix.com/scan.php?page=article&item=ec2-grav...




> There was a time when being on the same platform as the deployment environment was super important

Which is also interesting because there was a time before that when being on the same platform as the deployment environment was sometimes considered nigh impossible, such as the early days of the "microcomputer" revolution, when a lot of software was written on big-iron mainframes to run on much more constrained devices (C64, Apple II, etc.). It's interesting to compare the IDEs and architectures of that era and see how much cross-compilation has always happened. There doesn't seem to be a lot of computing history where the machine used to build the software was the same machine intended to run it; the modern PC era seems to be the unique inflection point where so much of our software is built and run on the same architecture.

(A lot of modern tools, such as VMs like the JVM and CLR, exist because of the dreams and imaginations of developers who directly experienced those earlier eras.)

It's interesting how that tide shifts from time to time, and how easily we forget what that was like, forget to notice the high-water marks of previous generations. (Even as we take advantage of it in other ways: we cross-compile to mobile and IoT devices today that we'd have no way to run IDEs on, and would rather not run compilers on directly.)


I know some software was written on minis to run on 8-bit computers, but I have a hard time imagining that as the norm. My Apple II dev rig was two computers: one running the development tools and one for testing. They were two because running my software on the development machine wasn't possible without rebooting, and loading all the tools took 30 seconds - a painful eternity in Apple II terms.


It was quite common in the game industry.

As confirmed by multiple interviews in RetroGaming Magazine, almost every indie that managed to get enough pounds to carry on with their dream invested in such a setup when they started going big.


For consoles, it's natural - they don't have any self-hosted development tools, and the machine you write your code on is largely irrelevant. Early adopters also benefit from the maturity of the tools on other platforms in the time before native tools are developed.

This may be more common in game studios, but was not mainstream in other segments.


It was quite common on C64, Amstrad CPC and ZX Spectrum.

Games were developed on bigger systems and uploaded to them via the expansion ports.


I write Java, but I seriously doubt that ARM has a comparable JVM; it's probably slow compared to x86. Cross-platform in theory, not so much in practice.


My understanding is that there's a pretty good proprietary JVM for ARM (optimising JIT and all), but that the FOSS stuff (including OpenJDK) is well behind, and as you say, can be expected to perform nowhere near as well as the AMD64 counterpart.

> Cross platform in theory, not so much in practice.

I'm optimistic that the OpenJDK folks would rise to the challenge if there were anything to play for. Writing a serious optimising JIT for modern ARM CPUs would doubtless be no small task, but it wouldn't be breaking the mould. I believe the situation is similar for RISC-V, currently.
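(For what it's worth, here's a quick sanity check of what you're actually running on. These are standard JVM system properties; the class name JvmInfo is just a stand-in:)

    public class JvmInfo {
        public static void main(String[] args) {
            // Standard properties: which CPU arch and VM this process runs on.
            System.out.println("os.arch: " + System.getProperty("os.arch"));      // e.g. amd64, aarch64
            System.out.println("vm.name: " + System.getProperty("java.vm.name")); // e.g. OpenJDK 64-Bit Server VM
            System.out.println("vm.info: " + System.getProperty("java.vm.info")); // e.g. mixed mode
        }
    }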

*Googles* But wait, there's more: 'Graal'! Shiny new JIT engines are on the way, and ARM support is in there. Hopefully they'll perform well (flags to try it are sketched below). [0] [1]

[0] https://github.com/oracle/graal/issues/632

[1] https://github.com/oracle/graal/pulls?utf8=%E2%9C%93&q=is%3A...
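If you want to kick the tyres, my understanding is that on a JDK 10+ HotSpot build that bundles Graal via JVMCI, you can swap the JIT in with experimental flags, something like this (MyApp is a stand-in for your own main class, and flag availability depends on the build):

    java -XX:+UnlockExperimentalVMOptions \
         -XX:+EnableJVMCI \
         -XX:+UseJVMCICompiler \
         MyApp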


Wait, aren't virtually all Android apps written in Java?


Android uses Dalvik (and nowadays its successor, ART), not the JVM. The language is Java and the standard library is mostly Java-compatible, but the runtime is different. And I'm talking about server loads; I don't think Dalvik is very good for those tasks (but I might be wrong, it's an interesting question).


*Remembers Jazelle DBX with a wry smile.*


Falls squarely into the 'cute but pointless' category.

Java is intended to be used by optimising JVMs. Java bytecode is rarely optimised -- that's left to the JIT. Using the Jazelle approach, where is the compiler optimisation step supposed to occur? Nowhere, of course! You'd be far better off with a decent conventional ARM JVM.
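To illustrate, a minimal sketch (the class name JitDemo is made up): javac translates code like this almost literally into bytecode, and all the real optimisation is deferred to a JIT, which Jazelle doesn't have:

    public class JitDemo {
        // javac emits this loop nearly 1:1 as bytecode: no unrolling, no
        // strength reduction. An optimising JIT (e.g. HotSpot's C2) does that
        // work at run time; Jazelle would execute the naive bytecode as-is.
        static int sumSquares(int n) {
            int total = 0;
            for (int i = 0; i < n; i++) {
                total += i * i;
            }
            return total;
        }

        public static void main(String[] args) {
            System.out.println(sumSquares(1000)); // inspect with: javap -c JitDemo
        }
    }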

If you're on a system so lightweight that this isn't an option, well, you probably shouldn't have gone with Java. (Remember Java ME?)

[Not that I've ever actually worked with this stuff, mind.]


It's still the case that environments should be as close as possible. It's just easier to achieve now because the number of environments has shrunk significantly.

Nowadays you will be running on a CentOS/Debian server or a Windows desktop, on an AMD64-compatible CPU. Not so long ago there were dozens of Unix and Linux variants with significant differences; it was impossible to support half of them.


> but nowadays the tooling has gotten so much better that it matters a lot less

I think that's the point. Porting to platforms with a strong tooling and usage base, even in a different sector, is OK and safe. The problem is when you try something like x86 -> Itanium, which could take some time to stabilize.



