The Developer Experience Gap (redmonk.com)
95 points by mooreds on Oct 14, 2020 | 56 comments



> developers are forced to borrow time from writing code and redirect it towards managing the issues associated with highly complex, multi-factor developer toolchains held together in places by duct tape and baling wire. This, then, is the developer experience gap. The same market that offers developers any infrastructure primitive they could possibly want is simultaneously telling them that piecing them together is a developer’s problem

The past had a solution for this: the system administrator. He could design and implement robust and simple systems to provide infrastructure and tooling for developers to deploy their code. But this isn't "cool" any more. You have to use dozens of self-service bleeding-edge (read: unproven and unstable) tools bought from the players who have the most money to invest in tech conferences, blog posts, astroturfing, evangelists, etc., and pretty much nothing left to pay their own developers.

For example try telling people they would be better off with their code being packaged and delivered by a Debian package instead of a container image - see how fast they consider you a crazy preacher from the past.


Sysadmins were supposed to “go away” after transitioning to the cloud. Instead, what I’ve observed is that companies hired more and more developers who wound up trying to solve problems sysadmins solved long ago, with little attention from developers, so many of those solutions are being reinvented or rediscovered. This is one reason we wound up with so many management systems like Puppet, Chef, Ansible, etc.

But the market for sysadmins is completely soaked in corporate IT professionals who really only have maybe 20% skill coverage with what smaller tech companies want or need. When it comes to delivering software, companies have found they want / need a strong sysadmin knowledge domain combined with a strong software development background - this is the rise of the “devops” and SRE disciplines.

What really bums me out is that long ago the term sysadmin applied to software developers as well - people on Unix systems who wrote C and other systems code and used bash to string software together fast.

So in a lot of senses, the creation of bash is a form of the developer-sysadmin construct done quite well. But today, with our fractured, mercurial software ecosystem, we see so many competing tools that largely do things 90%+ the same way in practice, for hardly any real advantage over each other. Every option can easily manage hundreds of machines in a matter of seconds or a few minutes, after all.

But there’s a cultural problem that’s become a bifurcation into engineer castes over time. Go to conferences, social networks, and meet-ups for developers vs. sysadmins - the tone and tenor of topics is completely different. Developer conferences have a smattering of deeply concerned talks, but a majority are bright, cheery, and exploratory in tone, and the side conversations match up. Sysadmins, oftentimes stuck in cost centers, are as a rule grumpy and even belligerent, and the side conversations remind me more of the hallways of a VA hospital than Bell Labs.


> For example try telling people they would be better off with their code being packaged and delivered by a Debian package instead of a container image - see how fast they consider you a crazy preacher from the past.

Where I work we create Debian packages, then build an image that installs them from the Dockerfile.

In our defense, our test systems are physical, our CI/CD chain is containerized, and production will move from physical to containers soon.


System administrators are still around outside of Silicon Valley. And we still do that thing of managing servers, build systems, and helping developers get their code from git to production (usually by setting up an automated process).


How do Debian packages isolate? Can you make sure that, e.g., app1 won't replace app2's config file with its own (if, say, both app1 and app2 need ImageMagick with different policies)?


I believe those changes are tracked; there are commands that will tell you what package a file is from (I think it might be apt-file). It's been a while since I used apt in depth, but I do know that pacman tracks that and will log error messages if a file already exists. For example, when config formats change, if I have /etc/example.yml created already, pacman will throw an error and put the new file at /etc/example.yml.pac-new (or a similar scheme).

> (if, say, both app1 and app2 need ImageMagick with different policies) ?

Debian typically handles that by building fat binaries/libraries with all the possible options enabled. In cases where there are multiple incompatible options, I'm not an expert, but I believe you create a virtual/meta "package" that can be satisfied by installing any one of the incompatible binaries. Pacman handles it similarly. Gentoo is an oddball in that it tracks a set of system flags and enables/disables features in builds based on which flags you have set. It's able to do that because it compiles the packages itself, so you can choose the options at install time.


> For example when config formats change, if I have /etc/example.yml created already, pacman will throw an error and put the new file at /etc/example.yml.pac-new

But you understand that this means Debian doesn't handle the issue, right? Like, cool, I got an error, but I still can't make my two programs run concurrently; back to Docker it is. Also, some programs may generate config files or stuff in /var at runtime, so apt can't even warn.


Err, because it's the job of the package manager to prevent this exact situation from happening...

Packages are not permitted to overwrite data files or configuration files (conffiles) belonging to other packages. dpkg will abort the installation or upgrade of any package which tries to usurp a file owned by another package, at least without an explicit Replaces declaration allowing it to adopt those files. Enforcing consistency and ownership with a central database is the entire point of managing the system with a package manager.

This isn't even new. This is 25+ year old technology at this point.


Yes, and this is a behaviour that prevents various use cases, hence we need different tools with better isolation, hence Docker.

> Enforcing consistency and ownership with a central database is

Assuming a lot of things, very wrongly. My distro maintainer's idea of consistency may be very different from mine - it's not the system that matters, it's the freakin' apps, because running them is the only reason we buy computers, and it happens that you sometimes need to run two apps which would be entirely incompatible on the same system. (To take my past life in music production, for instance: you really, really want to keep using the same version of a piece of software for a given project, but you can have a dozen different projects in flight which all require different versions of the software used. Try installing 12 different versions of GIMP or Ardour on Linux without something like AppImage :))


I really do wonder how often this happens.

Like, I know that Python is particularly prone to this, and if you have multiple languages needing C/C++ based libraries then I guess this is a concern.

But how many people have experienced this to an extent that containerisation seems like the right solution most of the time?


I’m convinced that the real reason is all the cruft in packaging systems and opinions among their domain experts that are specific to the needs of distro maintainers. “Docker Build” could produce DEB packages as a target and they would be fine 99% of the time.


They wouldn't. There's far more to a Debian package than simply aggregating a bunch of files into a container and calling it a day.

Making sure it's all correctly split into runtime, library, development, documentation, debug etc. parts, and ensuring that each part has the correct dependencies upon other packages in the system is a much more involved task. But it's this part that adds most of the value compared with simply building from source.

Making a "package" is easy. But the real point of packaging is integration with the wider system, and that part requires actual effort. Docker doesn't even attempt it. Docker is easy and convenient primarily because it pretends that problem doesn't even exist. It does the easy 10% of the job while ignoring the 90% of the work that takes the time and effort.


This might be a misunderstanding... they would be fine in 99% of the cases where containers are currently used, not where packages are currently used.


Why wouldn’t containerization be the right solution?

A “container” is an archive (e.g., tarballs) of the contents, a hash chain of the construction, and some config data about cgroups and namespaces to run it with. Turn off the parts you don’t want — I usually turn off virtualized networking, for instance.

It’s a more lightweight package that doesn’t have versioning problems and doesn’t require crazy installation scripts — what benefit is traditional Debian packaging supposed to have?
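
To illustrate that namespace part with a rough sketch of my own (assuming Linux and sufficient privileges; the image/tarball and cgroups pieces are skipped entirely), "turning parts off" is just a matter of which namespace flags you pass when launching the process:

    package main

    import (
        "os"
        "os/exec"
        "syscall"
    )

    func main() {
        // Start a shell in new UTS, PID, and mount namespaces.
        // No CLONE_NEWNET flag: the child keeps the host's network,
        // i.e. "virtualized networking turned off".
        cmd := exec.Command("/bin/sh")
        cmd.Stdin, cmd.Stdout, cmd.Stderr = os.Stdin, os.Stdout, os.Stderr
        cmd.SysProcAttr = &syscall.SysProcAttr{
            Cloneflags: syscall.CLONE_NEWUTS | syscall.CLONE_NEWPID | syscall.CLONE_NEWNS,
        }
        if err := cmd.Run(); err != nil {
            panic(err)
        }
    }

Run as root, the shell gets its own UTS, PID, and mount namespaces but shares the host's network stack.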


My current developer experience:

1. after code review etc. a change is merged to master
2. the CI tests are automatically run
3. IFF the tests pass, the change is auto-deployed to staging
4. promoting a deploy to production is a push-button or a CLI command
5. adding 3rd party services such as New Relic / DataDog / Papertrail / Bugsnag is a push button or single CLI command
6. even things like upgrading to a new version of Postgres is distilled down to a few CLI commands

I think Heroku is not alone in providing this level of ease, but that's what I've been using.

I would argue that there is not a "gap" but there is actually "a thing that you can totally have if you're willing to pay more money and give up some customization or control," which makes it roughly analogous to the iPhone.


Or if you are already down for paying extra and want customization, then, you know, hire a competent Linux admin.

It's not actually that hard to set up a self-hosted solution that gives you the Heroku workflow.


I hope Heroku's pricing comes more in line with offerings like DigitalOcean. The difference between the $5 and $50 Heroku instances is negligible compared to almost any VM provider.

Dokku provides a nice, open-source Heroku-like experience. I've had a good experience so far using Dokku on DigitalOcean or Hetzner VMs.

http://dokku.viewdocs.io/dokku/


I have set up all of this at my work and we only have two EC2 instances. It wasn't that hard.


> I think Heroku is not alone in providing this level of ease, but that's what I've been using.

I do. At least for the vast majority of my experience, for webapps nothing comes close in terms of dev happiness.

You do have to give up that control and stay within the rails, though.


This matters ... I think.

There is something there fighting to come out - something about software literacy (I claim we will all read and write software like we read and write language).

There are two types of "publish" - the kind I do here and on social media that is not proofread, just flung out. It includes emails to friends and texts with shopping lists - and it is easy and simple and enabled by the platform

and there is publishing - a book, an article, etc. It has hurdles (and higher expectations).

Most times writing software feels like the latter - QA and proofreading, tests and slow release cycles.

Actually getting my thoughts down takes time - it takes reading around and consideration and hell it has friction.

So yes, we need faster ways for people to deploy software, and to sandbox it so the blast radius is limited - ensure my code to switch the lights on might leave my hall light blazing for a week's holiday but is not going to turn the microwave on or reset the fire alarm.

But the code to do that must be of the higher quality, the professional tier.

It is crazy to expect JK Rowling to go through the same process to send a tweet as to publish a book, but it is also crazy to hold that tweet to the same standards. And frankly it's crazy for the tweet to get the same kind of audience. (No, this is not a comment about the content of any of her tweets - just an example after buying books for my daughter.)

Same goes for software. Different release standards, different quality standards, different sandboxes.


This comprehensive, developer-focused ecosystem has been promised again and again. The challenge is that those metaphors take a long time to create and are fragile. A new need arises or the paradigm shifts, and all that equity evaporates.

In the early 2000s there were promises that we were on the verge of graphical programming: environments where anyone could stitch programs together visually. They turned out to be more trouble than they were worth. Anything sufficiently complicated needed a proficient software engineer. That engineer inevitably did away with the cumbersome graphical tool, preferring to just write the code. The tasks that non-software engineers were able to work on were of questionable value to begin with.

Personally I prefer the patchwork. Emerging from the Java/.NET bubble was a breath of fresh air for me. I considered the IDE to be constraining: it did 90% of what I needed, but I also needed to do the other 10%. Simplicity always trumps completeness, and composition is always better.


Another example is the early implementations of the Visual Studio GUI designer. It was much easier for them to produce this using new magic code that we didn't have to interact with, but which quickly became passé and required some understanding when debugging. Once they were able to produce forms using code, the old method just went away and everything changed again.

I don't feel that we lack the tooling to be as slick as we want; we lack the motivation to invest the time and money to make things slicker. It worked at Apple, not because of integrated tooling, but because of a dictator-style management that forced people to collaborate in the name of quality. Perhaps what we are uncomfortable with is the idea that high quality requires that level of control!?


UX overall is under-appreciated. It's very hard to measure the "soft" attributes of humans (the effects of aesthetics, chaos/consistency, ambient stressors, physical exertion, etc. on mood), hence it's harder to gain consensus. It's unfortunate that it has such a tremendous impact on humanity, yet we have such bad instruments to measure it.


I don't think it's that we lack measurements; it's that it is very subjective. I am used to Android, and when I tried an iPhone I couldn't get it to work because I didn't know all the little tips and tricks, despite many Apple fans saying that "clearly" the iPhone UX was the most superior.

The truth is that there is probably not much in the UX that is fundamentally better or worse apart from the animation side of things and possibly the higher res screens in some instances, most of it is just different.


How quickly we forget the Danger Hiptop, which had internet access, phone, music, and even an app store years before the iPhone was released. And it worked very well, considering the 2G data networks at the time. It was quite a successful product.

What made the iPhone revolutionary was the input method (and of course all the UI built on top of that).

I kinda glazed over after that intro.


You say "we forget", but the overwhelming majority never even heard of a Danger Hiptop. Which is essential, because the OP isn't writing historical reference materials; they're providing an (apt, IMHO) _analogy_ about trends and popular / common experiences.

Thus I kinda inferred the OP meant "... which device crossed the threshold into mainstream awareness". So your "glazed" dismissal seems a little harsh.

That said, I think you're right to make note of the Hiptop (TIL!), and I agree that it was the touchscreen (albeit also not truly the first of its kind) that was a transformative experience for the masses. I'll never forget the first time I interacted with an iPhone.


While I understand what you're trying to say, I still think it's incorrect. The Hiptop (branded as the T-Mobile Sidekick) was very popular in the US - it was widely advertised on TV and "all the cool kids" had one. Literally kids - it was popular with a younger generation, and advertising played that up.

The hiptop unquestionably crossed the threshold into mainstream awareness at the time. It's just been forgotten because the iPhone overshadowed it.

On a personal note, I had a few generations of Hiptop before I switched to the 1st gen iPhone. Yes, my first experience with the iPhone was memorable... but still not as memorable as my first Hiptop. "You mean I have the internet everywhere??" That was incredible in 2002. We just take it for granted now.


You say that, but... what were you going to do with mobile internet in 2002, exactly?

The connected devices at the time were all still struggling with the early-adopter chicken-and-egg issues of an emerging network of services. There weren't "apps", and web sites had not centralized themselves. And cost, speed, and service coverage remained very limited. You couldn't justify a data plan just for MapQuest when you could print it out at home. Most kids, myself included, wouldn't be getting any phone for a few years yet. The mobile phone's purpose in the 2000s was served well by what was in feature phones: call, text, Snake, maybe email.

In contrast, the iPod, a rough contemporary with the Hiptop, addressed something more immediately compelling with a two-sided, integrated marketplace when paired with iTunes. There was a value proposition in that since not everyone was or wanted to be a savvy song pirate, and you could buy singles instead of albums. Internet speeds and access were ready for that use case.

Apple's success at the time, both with the iPod and with the iPhone, rested on timing and quality of integration, which returns again to what the article alludes to - we have a lot of early-adopter developer services, and some of these are in a position to become more like an iPod/iTunes. But I don't think the article goes deep enough in recognizing that even the iPhone was capitalizing on underlying infrastructure developments to channel them through a specific product and service mix.


You're vastly underselling the value of mobile internet in 2002 (and through the period up until the iPhone launch in 2007).

Email was (and still is) the killer app, and it worked great. Instant messaging worked great - and at the time, Maxis (where I worked) ran on AOL IM the way people use Slack now. Danger's web browser actually worked through a proxy that chopped up pages and served them in a reduced format, which was effective in 2002 but was a liability by 2007. You could even edit your contacts online with full syncing!

The Hiptop was popular and "quality of the integration" was better than anything Apple had produced right up until the iPhone. However, by 2007 the data networks had gotten fast enough that the Hiptop's web browser was feeling a bit antiquated, and navigating web pages with a scroll wheel was cumbersome. The capacitive touchscreen was a major leapfrog, and coming up with all the UI behaviors to leverage that was a major feat - Apple deserves a lot of credit! But maybe not quite as much credit as the OP suggests.


Sidekick (hiptop) let you interact with your AIM contacts for basically free, at a time when many people were still paying 30 cents apiece for SMS messages. They even had a data only plan.

I knew a bunch of deaf kids around that time and they all were enthusiastic Sidekick users.


In February 2003, I bought a Nokia 3510.

https://en.wikipedia.org/wiki/Nokia_3510

What could I do with 2003 Internet?

Download J2ME applications from Swisscom and Sunrise, including some crufty map applications based on cell position.


Maybe it's because I'm European, but this is the first time I'm hearing about the Hiptop.


> It's just been forgotten because the iPhone overshadowed it.

Or maybe because it never successfully left the US and reached the rest of the world.


There was my Palm that ran Windows and was touch-based.


Yup, and a lot of the Danger DNA found its way to Android as well. The Hiptop had a Java VM, app store, push(!) IMAP/POP3 email, media support and more.


Or the Symbian and Windows CE/Pocket PC phones, before it.


Indeed. And let's not forget the Palm Treo https://en.wikipedia.org/wiki/Palm_Treo

It was out well before the iPhone, sold internationally, and had installable 3rd-party apps (proper apps, not just the "web" apps that the 1st gen iPhone had).

Apple fanatics always seem to think Apple created the first smartphone as we know it.


Not only that, the iPhone's base security model was actually one of the main selling points for Symbian OS S60 v3.

"Symbian OS platform security model"

https://www.usenix.org/system/files/login/articles/73507-li....

While the USENIX paper is from 2010, the 3rd edition was released in 2005, 2 years ahead of the iPhone.

And the now super fashionable Widgets were already a thing on Symbian, back in 2010.


The answers to all these things cost money to develop. Lots of it. And developers are on the whole a bunch of cheapskates who led the rush to demanding everything be free of charge, all the while bemoaning the fact that their customers were starting to feel that way too.

So while we do occasionally get nice things, they're paid for by "investor story time" instead of the people who use them, so they will go away. Either the companies go broke, or they're swallowed up by a behemoth who may steward it well for a while, but will nonetheless eventually want their billion back.


The problem explored by the author is quite an interesting one. Developer experience is indeed fragmented, varies depending on the toolchains chosen, and is not always good.

However, I have to disagree with the iPhone analogy. The consumer mobile phone market doesn't have any similarities with the developer tooling and software platform market. For one, Apple has always preferred to set trends and _tell_ people how to use their devices rather than provide openness and flexibility. Developers thrive on flexibility - shepherding them to the one true way of doing things, even if it provides a better experience, would unleash the inner forces and wrath which made open-source a thing. Also - the problem of "listening to music while taking a call" has vastly reduced complexity compared to anything done by developers. I don't think you can apply the same learnings. Each market and each customer segment require completely different business models and the iPhone bundling and "do few things right" strategy just isn't right for such a vast ecosystem of developers who are all doing so many different things. Yes, they are all deploying code to production. That's not a market. And to see the differences, compare the SDLC at a regulated bank to that of a 50-person startup with proper automation - you'll find they are light years apart. Not because the banks suck as devs, but because they are in a different market.


Don't we all want silver bullets?

My perspective is strongly influenced by system dynamics: money and time in -> experience out. If we look at the whole space and then seed it with lots of money and huge time pressure, we get condensation around those kernels. These will be island solutions, and they will be very good, as lots of money was available. But no money can buy enough calendar time to coordinate over distance. In the good old times (TM) we had larger-scale solutions, but those also took years to develop and the outcome was often less than perfect. At the moment we have a more component-based architecture (and as the components were built in semi-isolation, integration is the lowest common denominator).

Where does this lead us? Innovation on the semiconductor layer slows dramatically as the financial cost of the next generation explodes; speed-up has stalled and density improvements have slowed. This gives us more time on the software level. The fragmented software landscape makes it harder to do first-one-takes-it-all. Larger, consolidated integrations may again be doable as more calendar time is available.


Java has all of the listed criteria.

1) Comprehensive. Maven provides comprehensive coverage, from archetypes (project starter templates) to dependency management, build, version archiving, everything. It is written in Java as well.

2) Developer native. Eclipse is all-encompassing. It provides deep integration with everything via plugins and a great out-of-the-box experience with different default plugin packages. There are Eclipse packages for developers of Java, JEE, Rust, C++, and Python. There are even third-party distributions of it, like NVidia's Nsight plugins for CUDA development.

3) Elegant. This is very subjective, but the Java API was elegant enough that Google decided to steal it for Android and be sued for billions of dollars.

4) Multi-Runtime. "Write once, run anywhere"

5) Multi-Vendor. OpenJDK is GPLv2+CE. There are numerous vendors who provide Java: Amazon Corretto, IBM OpenJ9, AdoptOpenJDK, and most Linux distros have their own package-repo distributions of one or more of these as well.

With the added bonus that Java has memory safety, has been the dominant language for the past two decades, and stays current by adding trending features like functional programming syntax.


I was a Java developer for approximately 15 years, and am generally a fan (and actually a huge fan of the JVM), but I'd still like some of whatever it is that you're smoking:

2. Eclipse is your suggestion here??!! I could never imagine willingly using Eclipse again. Heck, years ago I was willing to pay hundreds of dollars a year for Jetbrains products specifically to avoid Eclipse.

3. Google's decision to use Java for Android had nothing to do with the "elegance" (or lack thereof) of the language, and everything to do with tapping into a huge community of existing developers and libraries.


I'm working on an application that's written in JavaScript, Java, and Python. It uses Maven to build. I have it under git version control with git submodules. I don't have to context switch in Eclipse; I can do all of that without leaving the IDE at all. I can start a project, download all the dependencies, hover over methods to see javadocs, and I even have a terminal tab if I need anything like npm on the command line. All without leaving Eclipse.

The fact that you can do the same thing in your tool of choice, IntelliJ, further refutes the author's point 2. You and I arguing about which we prefer is beside the point being made by the author.

Elegance is subjective. The fact that there is a "huge community of existing developers" means Java is objectively more approachable to more people. Yet another way to look at it as elegant.

I don't smoke. You've reinforced my points in favor of Java.


The Lisp Machine was the iPhone in this analogy. Granted it was built in an era before the web, Google, git, or cloud computing. But the degree of thought put into integrating its tools was astounding. The system anticipated what you needed at any given step and gave it to you almost before you asked. I would love to put that team back together and tackle modern development.


> The first device to compress – successfully, at least – a mobile phone, a computer with internet access and the 21st century’s equivalent of the Walkman into something that would easily fit into a pocket, the sheer breadth of its capabilities was without precedent.

Sigh. There were other such phones before the iPhone. But I guess if we repeat it long enough, someday we will start believing it.


The thesis is that quality integration is a game-changer. To ignore the difference it makes is to ignore the entire point of the article.


Sure, but they were much more limited. I had many of them.

On paper the Nokia N70 I was using when the iPhone came out could do everything the iPhone did and more. But it was essentially unusable by comparison. All three aspects of the iPhone were dramatically superior to prior devices.


The developer toolchain for the cloud this author is looking for is CDK: https://github.com/aws/aws-cdk

* Self updating code pipelines? Yep: https://aws.amazon.com/blogs/developer/cdk-pipelines-continu...

* Cross-cloud compatibility?

Yep: https://aws.amazon.com/blogs/developer/introducing-the-cloud...

YepYep: https://github.com/awslabs/cdk8s

YepYepYep (CDK for Azure): https://www.youtube.com/watch?v=0q89VbEA9I4


Steve is not alone in his line of thinking. If you've experienced something better than what exists today, you constantly question why the industry hasn't come to the same realisation. It took Apple to really say: hey, this is where mobile is going, here's the iPhone, nothing else matters. I think right now we're in the pre-iPhone era where we're still hand-crafting this stuff.

Certain developer experiences are drastically improving. The frontend (as fast as it moves) now has the likes of Netlify and Vercel. It's about saying: we just need to focus on frontend, so give me the solution to that. I think backend is the same, but it also highlights that there's no de facto standard for backend development in the cloud. When that's finally realised we'll see solutions that cut away all the non-essentials. Someone mentioned Heroku and one-click spin-up of dependencies. Great, be everything to everyone; that's why Heroku is barely successful. Anything that was ever compelling was incredibly narrowly focused, opinionated, and resonated heavily with devs.

My take: Go-based microservices written using gRPC and consumed as HTTP/JSON APIs. A platform that offers access to the underlying primitives for building those types of services and a complete lifecycle for build, run, and manage. Look at Twilio, look at Stripe, look at Segment, look at SendGrid. These are the companies of the future, built in the Cloud. For the backend, it's simply about building APIs and having platforms that enable that. Which means you need auth, config, storage, pubsub, etc.
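
As a rough illustration of the HTTP/JSON side of that shape (a sketch of my own, not taken from the projects linked below; the gRPC service definition and gateway plumbing are omitted, and the /greet endpoint and its types are made up):

    package main

    import (
        "encoding/json"
        "log"
        "net/http"
    )

    // GreetResponse stands in for what would normally be a
    // protobuf-generated message behind a gRPC gateway.
    type GreetResponse struct {
        Message string `json:"message"`
    }

    func main() {
        // A single HTTP/JSON endpoint; in the gRPC setup described above,
        // a gateway would translate this request into an RPC call.
        http.HandleFunc("/greet", func(w http.ResponseWriter, r *http.Request) {
            name := r.URL.Query().Get("name")
            if name == "" {
                name = "world"
            }
            w.Header().Set("Content-Type", "application/json")
            json.NewEncoder(w).Encode(GreetResponse{Message: "hello " + name})
        })
        log.Fatal(http.ListenAndServe(":8080", nil))
    }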

Here's my piece https://github.com/micro/micro/blob/1166d15eff2015e32a9ed793...

Here's my effort https://m3o.com


Heroku has very wide support; you can deploy most types of applications super easily, including Python/Java/Ruby/Go. Pretty much all languages follow a basic model of build and run, though how the build is done varies.

It's great and it's very successful. The only reason Heroku is not more widespread is because they're charging $50 per unit (roughly a CPU or a GB of memory), which makes it prohibitively expensive for most workloads.


The iPhone was a bigger jump in the US than elsewhere. AFAIK a big part of this was that data was expensive and slow in the US.

I assume there exist lots of good solutions and technologies for many problems, they just aren't mainstream. Many times the barriers for adoption are not technological.


Most of the solutions to the "developer experience gap" have already been built inside massive tech companies like Google. Smaller companies have their own rendition of this. Which is to say, we have a specific stack on a specific cloud provider and we use a specific developer workflow with a specific language and framework. And because of this we're all highly productive because we don't have to think about the tooling, we don't argue over design decisions or new things and when we see limitations we start to investigate the next dependency.

Pragmatically, companies form opinions which differ very much from the "developers need choice" mantra of the industry itself. Choice is the enemy of productivity. Google maintained a single platform called Borg that ran self-contained binaries and required you to use the Google-3: C++, Java, or Python. Internally the majority of systems looked the same, and everything was consumed via RPC APIs using Stubby, which is now open source as gRPC.

Explicit decisions and doing less makes us all more productive. The iPhone did that for mobile. Cloud is waiting for an industry standard.


The iPhone was not the first mobile device with internet, not by a long shot.


> It’s an experience that is taken for granted now, but relative to the competition then it was revolutionary. Call comes in while music is playing. In response, the iPhone fades the music out while fading the ring in. The call can be taken seamlessly, and once ended the music fades back in. Simple. Obvious.

So much romanticization, but lest we forget, it also had no copy/paste. What's more important? What's the better UX?


Apple History Lesson

I was the Apple Forums technical administrator at the time the iPhone 1 came out. All the hardware competitors said in print that Apple couldn't make a phone. I was pleasantly surprised at how very, very few bugs/problems were reported on the forums.

Yeah, but no copy and paste initially.

(The forum databases ran on 3 PowerPC minis running MySQL with statement-based replication. Those were later upgraded to Intel minis. We didn't use any endian- or ISA-specific column types like float, so the upgrade was seamless.)



