In the good old *nix days the system provided pretty much everything you needed: C, Perl, man pages, and well-documented header files. Throw in a couple of good books on programming, and you could do a lot.
That's still the case, but today's software development reality feels different. There are endless frameworks, APIs, devops methods, scalability concerns, and n+1 standards to integrate with and keep up to date on. Knowing your language is only about 25% of the battle, and few projects are developed in a vacuum.
It's not possible to keep track of all those things, so endless internet searches become the norm.
The web is so saturated that it's hard to turn off the internet when you want to put in some distraction-free hours of coding. There will always be something you don't know.
My guess is that OS programming (Linux etc) has not been affected that much by this overflow of redundant tools, and you can stay productive offline?
It's been a long time since I've done that kind of work, unfortunately. I still have projects on which I can be productive offline, most recently in Rust ('cargo doc' is really amazing for this kind of thing).
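For anyone who hasn't tried it, a minimal sketch of that workflow (run from inside any Cargo project):

```shell
# Build HTML docs for the current crate and all its dependencies,
# using only what is already in the local cargo cache (no network).
cargo doc --offline

# Same, but also open the generated docs in a browser,
# served straight from disk.
cargo doc --offline --open
```

The generated pages land under `target/doc/`, so once the dependencies have been fetched while online, the whole reference stays browsable offline.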
It could be rose-tinted glasses, but I'd love to find a way back to the distraction-free coding days. Focus is by far my biggest battle these days, especially when I run up against seemingly bizarre deficiencies (most recently, the lack of unsigned 64-bit int support in Postgres).
Hitting these kinds of walls almost always causes me to go wandering through distraction sites because I can't bring myself to build a bridge over yet another small discrepancy.
I was a programmer in the 80s. We had MS-DOS and Turbo Pascal (with its damned fine documentation, in an actual book, and help files).
The hardware sucked, but we didn't have to worry about anything other than putting characters on a screen, eventually VGA graphics if we got fancy. There was one platform, the IBM compatible PC, and that was the only one I worried about.
We didn't have Git; we had PKZIP and a stack of floppy disks with labels like "Source Backup v42 2/3/89". If we made a mistake, we had to manually revert.
I wrote "OverSeer" - a program that managed the inspection of Fire Extinguishers, using hand-held computers and barcode scanners from the Norand corporation of Iowa. Eventually it was adapted to other uses. It was written in Turbo Pascal. All the libraries were my own, including one to do cooperative multi-tasking.
Our customers were in Northern Illinois, and I was the programmer/tech.
Ask me anything. ;-)
[Edit - Extended description of the stuff I wrote]
I used to write pseudo-code that didn't compile for one pass, working out logic, then work through turning it into code on a second pass. Generally, stuff I pseudo-coded seemed much less buggy than stuff I wrote straight in as compilable, presumably because I didn't have much besides program-logic to concern myself with. Translating it to something compilable was pretty mechanical, since I wrote something in the ballpark; I just didn't worry about syntax at all.
It wouldn't be a problem - as you state in the question - coders used to do that before the web.
This is a rare (?) occasion where old dogs would likely adapt more quickly due to having done it before. Not only will their old muscle memory kick in, they will be more comfortable with the situation.
The curious thing I've noticed over time is that in the 90s as a demoscene coder I remembered a very good proportion of the Amiga hardware registers and how they worked, very rarely had to look stuff up. These days I have to look up things that I use fairly regularly. It may be a function of age of course, but I have a sneaky suspicion that reading stuff from paper manuals and the extra effort required to look things up made me remember them better.
Sure, but I still remember that $dff100 was BPLCON0 (at the time I knew what each of the 16 bits did without having to look it up). 5 years ago I did a ton of Python programming, but I would need to look up how to do a for loop now.
More documentation was available at your fingertips because it wasn't assumed that you had an internet connection and search engine always at your disposal. Also heading into a datacenter to physically plug in a crash cart without a laptop was fairly normal.
I've been remote for years and tend to live in rural areas with spotty connections. I also travel frequently. I can usually stack up a week or two's worth of offline work just in case connections are terrible. Some of it's documentation and writing. But a lot of it is technical stuff or mock ups. Maybe I can't take a task all the way to completion without ci/cd or additional documentation, but I can keep making progress at least.
* I download documentation sites for offline use with wget
* I set up virtual environments, containers, and dependencies prior to going offline.
* Abstract away any libraries that I might need to add or whose semantics I might need to look up, handwaving over the details.
* If there's a dependency that I might have to patch, or upstream code I might have to read, I typically vendor things with submodules locally
* Run local instances of infrastructure for testing integrations, and of tools that expect you to be online all the time, such as config management tools, helm charts, docker repos, and secrets management.
* Have testing plans that don't require online validation:
  * for code, unit tests
  * for IaC stuff, templating it out and diffing or manually eyeballing
* Keep the logic of ci/cd pipelines in code or scripts instead of platform-specific configuration files (within reason).
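A couple of the steps above can be sketched as shell commands (the URLs and paths here are placeholder examples, not real projects):

```shell
# Mirror a documentation site for offline reading.
# --mirror enables recursion with timestamping; --convert-links rewrites
# links to work locally; --page-requisites pulls CSS/images too.
wget --mirror --convert-links --adjust-extension \
     --page-requisites --no-parent https://docs.example.com/

# Vendor an upstream dependency as a git submodule so its source can be
# read and patched without a connection (repo URL is a placeholder).
git submodule add https://github.com/example/libfoo vendor/libfoo
git submodule update --init --recursive
```

Both commands obviously need a connection to run, which is the point: they're part of the prep work done before going offline.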
It doesn't have to be a technical solution. Explicitly planning out your work days to the minutia and accounting for all the tools you'll need will get you pretty far.
Now planning around times in-between access to power is an entirely different game.
I don’t see the issue… I already seldom check Google (maybe once a month). I have the official documentation of the language I work with (C++), and that covers most of the questions I might have on a day-to-day basis. Everything else I need comes from my education. I really don’t see why people should rely so much on Google!
Depends what I'm currently doing. This past week I've been implementing some trivial features in JavaScript and C, and so haven't had to look up anything.
But a few weeks ago, I had to implement something in DQL, which I'd never used before; their documentation is terrible, and Google didn't help that much either.
The one formerly known as GraphQL+-
I might just be dense (in fact, I'm sure of it) but it was not a pleasant experience, even if I did get the job done.
Yeah, a couple of good books and possibly some source from my older projects and I'm set. I still regularly buy reference material, not as much nowadays, but it's nice to have it available in a pinch.
Same as before Google. Books, text files, and collected code snippets. During school, the SWAG collection for Pascal was eagerly downloaded from BBSs when new editions came out.
Depends on what I'm doing. If I'm working on my usual BE stuff, I'd probably go for years - I can't remember the last time I googled something about this. If you ask me to do frontend work I'd probably get stuck within a few days.
Pretty far, actually... However, without the package index, the documentation of other libraries and imported modules, and an IDE, I'd work much more slowly.
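If the missing package index is Python's, one common workaround (sketched here; the requirements file and directory names are just examples) is to pre-download everything while still online:

```shell
# While online: download the packages and all their dependencies
# as wheels into a local directory.
pip download -r requirements.txt -d ./wheelhouse

# Later, offline: install from that directory only,
# never contacting the package index.
pip install --no-index --find-links ./wheelhouse -r requirements.txt
```

The same pattern (a pre-populated local mirror plus an "offline only" install flag) exists for most package managers.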
I find myself using the Google --> Stack Overflow work pattern less and less.
Used to know somebody who contracted with national security and had to do this. They said it wasn't so bad with paper manuals! The worst part for them was not being able to use outside libraries without going through extremely extensive checking processes.