I'd never done anything with FORTRAN before, so I spent a couple days reading about it and looking into how it works, fearful that I'd muck everything up trying to fix it.
I opened the code and found that it was the most elegantly written software you could wish for, with fantastic comments and structure. I changed two numbers and was done, buying us another 30 years or so of rock-solid service.
A lot of junk and spaghetti code gets written (or copied/pasted together) today because there are few hardware-based constraints in much of modern app and system dev.
I work with a lot of embedded engineers and have the utmost respect for those that work on safety-critical, legacy systems.
I have worked for a significant time in my career as both an embedded engineer and a backend engineer, and in general I find that the backend code is way easier to read, maintain, and extend. Embedded code is seldom properly tested, and most of it is written in C, where people can abuse a library's contracts or the preprocessor. It is not uncommon to find functions that are over a thousand lines in the embedded world. Compare this to Rails, where there are a ton of standards, short and succinct functions, and good support for testing.
I guess it depends on your definition of "quality code". If you mean code that is dependable and will do one thing well on one platform for the rest of time, then embedded code could be considered high quality. I would debate that these items have more to do with the binary than the code though. If you mean code that conveys how a program works well to other programmers, is under test, and follows some standard structure, I would pick modern app development as typically being much higher quality.
The only thing that resource constraints force is less code, and especially fewer dependencies, because you run out of space.
The Toyota "unintended acceleration" court case was a flagship example of bad embedded code that we rarely get to see.
Of course, the antipattern on enterprise code comparable to thousands of global variables in C code, is to have thousands of columns in hundreds, or thousands even, of database tables, all intertwined and used only God knows where.
It's one of those Rashomon situations where I'm not sure we can ever be entirely sure but it seems to have stopped.
When the punishment for writing code that doesn't work is a long wait, you learn to write small pieces and test individually.
My experience with Legacy code has been the exact opposite of this. I've seen some really well written and documented legacy code but almost never in resource / performance sensitive areas.
People inevitably seem to accept trade-offs that sacrifice readability and maintainability for efficiency.
I don't know, your mileage may vary with this one. I have had to make code less readable to make it more efficient more often than I've been forced to find a more elegant way because the clever hack was too slow.
That all said, modern Fortran is an entirely pleasant language. Just stay away from the legacy.
Maybe in the general case, but there are so many exceptions they're almost the rule. Apps like JIRA and Electron-based software constantly have slowdowns from bloat that pretty clearly swamps the hardware constraints and would be avoided with better coding.
Also, I see classic games that are emulated or transformed on "fast" hardware that nevertheless have input lag. I've seen this on the Wii console for SNES games, and on an in-flight emulator that ran Pac-Man. Plus, TVs that upscale make Dance Dance Revolution have too much lag to be playable.
This will be bad even if they are all written in the same language, if they were not also all written in the same time frame.
With ancient FORTRAN systems, generally you need to learn the version of FORTRAN it was written in (FORTRAN 77, Fortran 90, etc) and enough about the problem domain to understand what they were trying to do.
With the kind of systems we are making now, the hapless maintainer will not only have to learn the specific ancient language version and the problem domain, but also whatever now-forgotten framework was in fashion at the time the thing was written.
There's also the idea that it's not that it was bad when it was written 30 years ago, but that 30 years of patches by people who didn't write the original code mean it's no longer clean code.
Now I work on newer software and I frequently see people making changes without spelling out the rationale, dates, author, etc. And it makes it harder to track what has happened and what the code actually does. When I tell people to comment it correctly, they often counter with how much time can be saved by just writing clean code.
Ever moved from Github Issues to Jira/Trello or vice-versa? How about Bitbucket to Github (as I did for my current company)? That's a whole lot of context potentially being lost, either in tickets or PR discussions.
How much of that lost context would have ever been used anyway is up for debate, of course. In my experience the answer tends to be "rarely, but sometimes".
Entropy. It's not our friend. It's why we can't have nice things. Code bases suffer from bit-rot over time.
Minimalistic UIs are great if you're trying to make it simpler for outsiders.
If you have people who understand how to use the software, you can actually cram the UI full of everything you need in a single place.
What I cannot dig is how old software prints things to screen. Touch a new dimension, let's recompile the data and re-load the list/entire screen right now! We've never done analytics or studied our user base, so the most popular functions are buried 10 levels deep in a menu that re-loads every time you lose your scroll spot! Etc.
If you have spent time in corporate software developed in the Win 95 days (cough Oracle CRM cough) you won't see intentionally brutalist design. You will often see MacGyver-level hacks stacked through the roof, and software that loads so slowly you can re-load Gmail 3 times before it finishes.
Older UIs written in Visual Basic, Delphi, Powerbuilder, or even for text-mode interfaces did not do this, because the frameworks supported refreshing only the changed data.
Look at how emacs or vi works over a 1200 baud dial-up connection. Surprise, it does, because it only redraws the parts of the screen that are changed. These problems were solved in the 1970s and 1980s.
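The underlying idea is simple enough to sketch. Here is a toy illustration of damage-tracked redraw in general (an assumption about the technique, not how curses or emacs literally implement it): keep the last frame you drew, and on each update send output only for the rows that changed.

```python
def changed_rows(prev_frame, curr_frame):
    """Return (row_index, new_text) for each row that differs from the last draw."""
    updates = []
    for i, (old, new) in enumerate(zip(prev_frame, curr_frame)):
        if old != new:
            updates.append((i, new))
    return updates

# Example: a 3-row "screen" where only the middle row changed.
prev_frame = ["File  Edit  View", "hello world", "status: OK"]
curr_frame = ["File  Edit  View", "hello there", "status: OK"]

# Over a 1200 baud link you'd emit one cursor-move escape sequence plus a
# dozen bytes for the changed row, instead of retransmitting the whole screen.
print(changed_rows(prev_frame, curr_frame))  # [(1, 'hello there')]
```

Real terminal libraries go further (tracking damage per cell and batching cursor movement), but even this row-level diff is why a full-screen editor stays usable on a slow serial line.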
The web browser has been, for most of its existence, a really bad way to deliver user interfaces. The zero-deploy nature of it was very powerful though, so people suffered through it. Only in the last few years have browser-based interfaces approached the abilities of native clients.
After the latest changes to Gmail I have high hopes that it soon will catch up in terms of slowness :)
It's not like I don't get it. I do. Just not 100% of the time.
Don't forget the 10,000 packages you need to find because they no longer have a feed available! They'll have to be rewritten.
The generation from the 70s and 80s will forever be the best technological generation the world will ever see. We had serious constraints that needed elegant solutions. I remember as a kid looking at Z80 manuals to learn how to save a few bytes so that code would run, or the 640K Microsoft limit that caused the code to be very concise, with every bit thoughtfully used.
Now that the sky is the limit we have cookie cutter solutions built on other pieces that "work" which really are just rewritten code in some new language or framework. This is now the new norm and it will never change.
In the era of abundant hardware resources, I like to think that this sense of craftsmanship hasn't been abandoned, but rather transmuted: transmuted into the ability to make things as readable, accessible, and maintainable as possible.
The Redis, Phantom (Thrift proxy), Kafka, and PostgreSQL codebases are things I regularly consult as role models of clean, pragmatic code.
The explosion of technologies since the elegance-by-necessity-of-optimization era does not necessarily mean that the capability of engineers has declined.
I don't think those are comparable to complexities in IT that exist today.
I think the constraints and lack of “here you go” resources many starting programmers dealt with, even for toy applications, out of necessity in the 70s and 80s are better preparation for the attitude necessary for dealing with the complexities in IT today than the learning conditions today.
Which isn't to idealize it; it was also a hard onramp that drove off lots of people who would have done well at many real-world problems where they weren't soloing without support, and who might, with experience, have still developed into great solo practitioners, too. But the people I've encountered who came through that 70s/80s start tend to be, IME, on average more willing to slog through and learn hard stuff in new areas than people who came up through later, easier onramps.
Though that may also be in significant part survivorship bias, as the 70s/80s crew will have had to have stuck around the field longer, usually, and it may be people with that flexibility are more likely to stay in technology past the half-life of whatever was current when they got in.
Business is great at wasting money on software, I just wish they'd waste it in a way that benefited people.
Stupid, no. Crappy code, yes. Part of the reason people wrote crappy code then is that programming techniques and best practices have come a long way since. The other part is that most people writing code then were inexperienced coders.
Of course, plenty of crappy code (most?) is written today, mostly because of inexperienced coders. It's not because they're stupid. After all, who would make a 20 year old a general?
That hasn't changed in the last few decades and it won't change in the next few decades either.
Good luck with them lasting 10 years.
It was probably still a valid use case for comms, if not within the application. Code was still being built on x25 and oh-so-slow modem links.
Yet, we didn't hear about left-pad until what? 2016?
If it were shit craftsmanship, it wouldn't still be running.
- Roman architecture
- Browning designed firearms
- Savile row tailoring
- Old ships
- anything your great-grandfather owned
I had a look at some Roman houses in Ostia Antica, and the really striking thing is how little the techniques of architecture have changed. They could do some impressive architecture from time to time, but on the whole, they built stuff about the same as your average cowboy builder would today, minus a bunch of legal requirements (fire proofing, etc).
Technology advanced too slowly for a producer to assume they'd have a new model to sell in the near future.
Crappily made whatever will break the moment you stop maintaining it.
When actually entering code, their focus can then be on the fairly straightforward translation of their notes specifically into FORTRAN, and on adding good comments for those who deal with the code later. Most of their creative energy at this stage can go into those comments.
Part of this is that it was still common in the '80s to either not have interactive access to the computer the code was for, or for such access to only be via shared terminals in a computing center away from your office. You needed to arrange things so that when you actually got to a terminal, you were efficient.
Since then, we've almost always had access to our own private computers that are powerful enough to run at least test versions of whatever we are working on, even if it is ultimately meant for some big server somewhere else. Now we can sit down and start coding while still designing the program in our head. And so we do, even if sometimes it would probably be better to separate design and coding.
I think this also might have something to do with why BASIC was good as a teaching language, as was noted recently in some other HN discussions. BASIC, especially with some of the limits put on it to fit it in some smaller computers, was constraining and painful enough to deal with that people quickly learned that trying to do that while also trying at the same time to figure out the design and algorithms they needed was way too hard. They naturally learned to separate figuring out how to do something from coding it.
We might hope that at some point in the early 2100s, somebody will notice that none of the nuclear missiles work anymore, and with any luck, the engineering knowledge to create new ones will have been lost.
I have not thought of FORTRAN as legacy, because whenever I need to look at very old FORTRAN code I am looking at mathematical structures such as matrices and vectors. At that point I am not thinking of FORTRAN anymore but of another language (such as linear algebra) which is a few hundred years old but still so crisp in a world of numerical computing.
HOWEVER: I cannot say the same about ABAP or Java that initially began to take shape in an enterprise system 20 years back. Then I have to go through reams of code that is truly legacy, hard to navigate through years of modifications, dependent heavily on local contexts, personal preferences, etc.
Downvoters don't seem to understand this point, that was the idea in the old days. That hardware was for men.
Well they're in for a surprise if they think modern user experience designers would fix that.
The software is pretty simple, and mistakes were almost always process problems associated with complex exemptions and classification.
The other set of problems were related to small market sample size for some corner cases. You can see this at play with Zillow.
Software is used as a boogeyman for leadership and process failure. A Clipper system from 1985 is a great solution. The problem with it is that it's highly likely that the needs of the business have changed over 30 years, and the software has not. That just reflects the lack of capital investment and process improvement typical in municipal government.
It’s easier to say “OMG, Clipper from 1980!” Because that is better than “we are a bunch of idiots”.
In the town that I grew up in, the process prior to 1980s computerization was a 100 year old paper process that was essentially the same as the computerized process — maybe better as the internal controls were better. Some of the properties in that town had deeds dating back to the Dutch colonial period, and in many ways the 1650 process was similar to the 2019 process, except they used percentage of rents and a levy of crops based on market value.
In my business systems consulting business, I think I have seen exactly this issue almost more than any other issue that I've encountered... across industries and business sizes.
Software gives you that convenient, opaque, ill-understood rug you can sweep any old systemic (non-technological sense) problem under. And it's probably so common because it works. Those people trying to get to the root of organizational problems more often than not don't have sufficient understanding of what role technology plays in their business as to challenge people blaming the system... so many dollars and so much time get dedicated to solving the wrong problems this way....
To be fair, most, if not all, business systems are terrible, but they aren't usually the core problem by the time a company gets around to bringing someone like me in to "fix/replace" the systems. Makes my job a little harder than it should be, to be honest.
Software is often used as an attempt by one group in an organization to impose control, often by imposing process, on another group. Software and software salespeople are often pawns or MacGuffins in an internal power struggle.
It sucked not having transactions and referential integrity, but I liked not having to second guess a query planner :-)
Systems didn’t scale past small workgroup LANs very well, though.
<Aliens Guy>User Experience</Aliens Guy>
However, as you mentioned, a modern user experience isn't the end-all. We even did a migration from an MS Access database application to a new platform where it was required to be a 100% look-alike. The whole move was done to modernize that application, yet those doing the work got handcuffed on day one.
One issue holding back upgrades in most cities is that their IT departments tend to be a real mess, with one or two SMEs per system/platform that isn't PC-related, and even then it's all maintenance mode anyway because it just works.
However, the real issue is likely that infrastructure maintenance, which includes IT and other not-sexy items like sewer and road repair, loses out to ribbon-cutting opportunities for new buildings, roads, and bridges. Then throw in the incredible debt for employee benefits and pensions most cities have, and it's no wonder so much falls by the wayside.
RPG was not my cup of tea, but the results on the screen looked much like the XBase stuff on PCs I was doing the previous half decade, so I guess people were happy to schlep their working code onto newer, cheaper machines.
Regardless, defending it as standard behavior means you're arguing it has -average- UX. The OP was in favor of it having -good- UX (well, 'clean, customizable and nothing is hidden away', and I was just pointing out that doesn't necessarily mean it's good).
People that have years of experience using the older interface are going to have a tough time making the transition.
Training is usually lacking, and a newer UI does not guarantee that mistakes won't be made during the learning curve.
I doubt most UX designers have ever even used a curses based interface. They are just disgusted by seeing so much monospace text, and are compelled to tear it down and replace it with something they are comfortable with (And usually the only thing they know how to use).
I had to clean up after some UX monkeys a few years ago, who "upgraded" a warehouse inventory system. The warehouse used 30 year old embedded MSDOS based portable Symbol terminals and a SCO UNIX server in the back end. They also used Wyse terminals in the office for the admin interface and data entry. Everything booted in seconds, and the portable terminal/scanners lasted for days, and never had a problem.
They decided to replace it with React Native on Android touch screen terminals (this was a refrigerated food warehouse). The hardware couldn't go even a whole day, and the touchscreens were not at all appropriate. The software was slow and unreliable, and had led to costly mistakes.
Speaking to the designer, he seemed far more concerned about how it looked than about adding back essential features. He kept joking about how backward things had been, and how heroic he was for helping these simpletons see the light. His only justification for several things was that it was "trending" and he didn't want his client to be "left behind".
They also started by gutting essential features, and adding them back on a rolling basis, when they got around to it.
In my opinion, a big part of the problem is the culture of UX. It seems to have amplified the stereotypical arrogance that people sometimes associate with IT people, and combined it with smug fashionistas.
And they just assume that older technology must be inferior, because it is old. They use obsolete as a synonym for old.
In the span of a year, I had UX consultants tell me that password masking is obsolete, and that there should be one field for passwords, no wait, two fields but masked again. Decisions are not really based on logic, they are based on fashion trends, and intense fear of not being hip.
Maybe in some way this makes sense in the modern app ecosystem; that is possible. But it should be relegated to crummy listicle apps, and these types of people should be kept well away from any software that does anything important.
As for the warehouse I mentioned, Android and React Native are not the right technologies. But maybe this firm didn't want to tell the client that they should go somewhere else.
I think this is a key point. Beyond the fact that their brain is still developing, most people under 25 (or with only 1-3 years in tech) simply don't have the breadth and depth of experience necessary to build an appreciation for reliability, much less the ability to identify reliable components or design reliable solutions for various environments.
I've seen CEOs give the reins of their app to some UX person, because they want to impress the hip young UX expert, who knows just enough tech to appear to be a wizard.
Software is art, fashion and politics.
Or you can do '5, 3, 12, <customer number>'
Try changing the work flow of the 60 year old lady in accounts payable who's been using that for the last 25 years, she will never bring you in brownies again.
Not to mention no fat web client interacting with a server in the cloud is as fast as interacting with the server in the basement of your own building.
We couldn’t find the issue for the life of us. Eventually, we sent someone down to the shop to see what was happening. We realized that the field workers weren’t waiting for the screen to update and were tabbing through using muscle memory. I had inadvertently switched a field and the tab order around.
And then there was the time that a new user was complaining about not being able to see a field but all of the old repairmen didn’t complain. Someone hadn’t actually tested the colors on the screen outside in direct sunlight where everyone was working. The new screen was just like the old one and all of the other repairmen just did it by memory.
I'm sure those technicians went through a few new hires who just could not or would not put up with not being able to see what they were inputting.
Thank you, kind sir, for fixing what actually broke.
And the effort spent making the transition may not be worth the reward.
This assumes that you're not also trying to make it a mobile app. Many of the more (in)famous public redesigns in recent years went for "open" and "spacey" UX because they erroneously tried to make their main design a mobile-first design, when that wasn't their core business need.
(Often, a second application can be built that satisfies mobile needs. Many uses of reddit or gmail, like myself, are more than happy to use a separate mobile app for consuming data.)
"You'll get used to it"
"It's the latest trend, you have to do it."
The Internet Archive redesign is a great example of this.
I recall, working as a technician years ago, we used a text-user-interface ticket system from the early 90s. While it wasn't the prettiest system, it did its job, and one could navigate it quickly once they had committed the keystrokes to muscle memory.
We thought about replacing it with more modern software but after evaluating a few of the options found none of them to be particularly compelling. If anything, we appreciated what our “old” system could do even more.
They then replaced them with Windows touchscreen-based machines, where just paging through the stations to find your destination by name took as long as it took to buy your ticket with the old machine. Add to that all your typical touchscreen issues (calibration drift, etc.).
I've been to around 30 countries worldwide and never anywhere seen machines as fast to use as the old ones for repeat users.
Your old train ticket machines were apparently great for locals who were familiar with them and had memorized information like the zone code, but they were surely horrible for anyone else. By contrast, on my last trip to Germany, the Deutsche Bahn ticket machines weren't the fastest things in the world to use, but for tourists from other countries they worked pretty well, since they had dozens of different languages you could choose to use the machine in, and had the ability to look up destinations, which is surely a lot easier than trying to figure out some arcane zone code.
In short, an interface that's highly efficient for an extremely experienced user probably isn't going to work well for someone who's never used the system before, and vice-versa. For a train ticket machine, it seems to me you want something that's easy for newcomers; you can't assume everyone is a regular user of the system. However, a lot of these comments are talking about things like warehouse inventory systems; these are things that only a relatively small number of employees will be using, and they're going to be experts or working towards that, and efficiency is the goal, not newbie-friendliness.
It's not an either/or. You need to support both sets of users well.
Sure, if 90% of interactions are going to be by newbie users, design the system for them. But for something like a train ticket system, maybe 50% of users may be newbie visitors, but only 10% of interactions are by them. It doesn't seem like the right call to cater to newbies if it means making the remaining 90% of non-newbie interactions noticeably worse.
That's the kind of user-hostile decision-making that annoys and infuriates people. It's basically saying it's better to bother a hundred other people than have one person bother you.
In my comment I said that you need to support both sets of public transit users well, and that it's not either/or. In many cases it's a mistake to make the dumbed-down newbie interface the only interface, because that could make your system noticeably worse for most users.
Personally, in cases like this, I think some kind of touch-screen guide to the efficient expert interface is probably the right balance. Make it easier for newbies to scrape by (because that's the best they'll ever be able to do), but give frequent users a path to a better experience rather than holding them back.
> that's probably worse than making things slightly easier for your regular visitors.
I think we're talking about residents, not visitors, in this case.
>That's the kind of user-hostile decision-making that annoys and infuriates people. It's basically saying it's better to bother a hundred other people than have one person bother you.
I don't understand this comment. If someone wants to buy a train ticket and the system doesn't even support their language, and they can't understand it, literally the only option they have is to seek help. I don't have to be able to read German to use the Deutsche Bahn train-ticket systems; they have options for English, French, Spanish, and even many other lesser-used languages like Polish, Czech, etc. If your city has a lot of international visitors, you either need to have a system that they can use easier (by supporting their language), or you need to hire a bunch of multi-lingual agents to work full-time at all the stations, or have some kind of translating service available for your employees to call to help these people. Guess which one is cheaper?
I think we're talking about different things there. I was focusing more on the language-agnostic interface design (e.g. hand-holding but slow vs fast after a learning curve), but you seem to be focusing more on internationalization and language issues.
For example, there are regional tickets that allow you to take just about all the regional trains for a day with a group of people. You can find this ticket if you poke in your destination (the default prompt), select it, and then it gives you the option to use this or other types of tickets. Or you go ahead and select "all tickets", type NDS, skip options and pay. Takes less than a minute.
This sounds more like a bad UI. A good UI covering all use cases would let you do both, and a UI not able to provide choices like that is a bad one.
That's putting it mildly. I make over $100k where my three bedroom house cost me $165k. I guess the weather is worse than the bay area, but I have a 10 minutes commute to an affordable house.
I work on crappy legacy code (and some nice shiny new stuff or I'd go insane) for what would be considered in SF a crazy salary (but by UK standards and particulary where I live, decent).
Quality of life matters as well: Mon-Fri 9-5, with overtime being a once- or twice-a-year thing.
It usually seems to me that the people who favor the low-CoL lifestyle are people who value having a giant house. I want more money for things like foreign travel; I don't care about having a giant house.
We built better software for cities and ran into this over and over and over again. They're not incentivized to change.
Citizens don't care or know enough to raise hell about city operations. The employees who do the work can't advocate for themselves and/or feel no reason to risk their jobs to make their own lives better (there's no reason to be more efficient.) Leadership cares mostly about press releases—blockchain or AI or whatever buzzword sounds better to them than the incremental changes that will actually contribute to fixing cities.
It's frustrating, bleak, and the inverse of inspiring.
I tried for 2.5 years before losing hope. My co-founder carried on another year before running into the same. The best way to get people what they need is to bribe them—e.g. work through an intermediary who takes a cut and delivers the rest to a political campaign—which we refused to do.
Local governments affect most Americans' day-to-day far more than the national political shitshow and this only becomes more true as populations further concentrate in cities. And yet, they are utterly soul-sucking morasses of internal and external politics, misaligned incentives, and bureaucracy. They drive out the people who can and genuinely want to effect change.
It's a damn shame.
The effect of that is that if something is running well, often the most rational solution is to leave it running. Yes, it might not be the most efficient solution, but it might still be more cost-effective than losing 3 months of work due to conversion to a new system, in addition to all the extra cost, training, downtime, ...
In the current ecosystem more and more also seems to be license based. Do you really want to change a decently running system with just occasional maintenance cost for an untested system that will change all existing processes and in addition ups your annual cost by quite a bit? Any time you change a system you probably have to change some additional systems too as suddenly things don't work together anymore....
Cities are often stretched for resources ranging from (good) personnel to financial to physical space to ... It's probably in many cases the rational decision to stick with how it is if the system mostly works.
To give one other concrete example, I know of a country that chose a great new software solution to provide information exchange between medical providers. The tool runs well, and they got a long-term commitment and a safe provider. BUT they then found that there's literally no one except the provider itself who can train all the users, as the UI, process, and existing training materials are all copyrighted (or otherwise protected, maybe through contractual clauses) by the provider. That means they found a gigantic additional cost even when they thought they'd done everything right in the choice of software.
Again, risk can be bigger than the reward.
"We’re dealing with an irrational public who wants greater and greater service delivery at the same time they want their taxes to be lower,"
There's nothing irrational about wanting government to take advantage of the same technological advancements that businesses use to improve service while cutting costs.
When I order from Amazon, I see real-time inventory, get immediate order confirmation, near real-time tracking as the package is handled (and actual real-time tracking for some orders, where I can see where the delivery vehicle is), and delivery to my doorstep in 2 days (or sometimes the same day).
In contrast, when I ordered a dog license from my county, I had to mail (FAX was also accepted) a paper form, and then 6 weeks later I received a metal plate in the mail (just a numeric ID on it, no personalization). The only way I knew they even received my order is because my check cleared the bank 2 weeks after I mailed it (no credit cards allowed).
I don't expect my county animal services department to build an order and fulfillment system that rivals Amazon's, but there are thousands of these offices across the country; they could all cooperate on one system and save the money spent dealing with these paper forms.
Bias: I am working on revamping some systems used within California which feed data in from the county offices to the state office. We are almost done replacing a birth certificate / health survey system from the 80s so that the same database is used state-wide, rather than batch updates between hospitals, counties and the state running separate datastores / app instances.
Then we have to make some less drastic updates to some other similar (purpose) systems later this year...
I work for a medium-size company, and our expense reporting system is 100% paper or Excel driven. We could go to Concur, but it would probably cost more than our existing system does, and may not pay for itself within the lifetime of the product.
Sticking with your example, you'd probably search for "dog" and receive actually useful results that range from relevant forms to complex guidelines.
The UK's portal has been discussed on HN before and in my opinion it's something other countries should emulate.
Competition is no guarantee of good customer service -- cellular companies have some of the worst customer service, but almost everyone has 2-4 cell companies to choose from.
The most common complaint I hear from people who've switched from the legacy software to the modern one is that their end-users' productivity is greatly reduced by the point-and-click interface. The older software's TUI had a very steep learning curve, but almost like learning Vim, once they developed a muscle memory for the necessary keystrokes, they could access and update information much more quickly.
The user then requested and got a single input box where he would type in all the numbers just the way he did before.
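A minimal sketch of what that combined input box might do behind the scenes: split one typed line into the fields the form would otherwise collect one at a time. The field names and order here are illustrative assumptions, not details from the original system.

```python
# Hypothetical fields a data-entry form might collect; the real
# system's fields and order are unknown.
FIELDS = ["account", "amount", "date"]

def parse_entry(line: str) -> dict:
    """Split a single free-form entry line into named fields, in fixed order."""
    parts = line.split()
    if len(parts) != len(FIELDS):
        raise ValueError(f"expected {len(FIELDS)} values, got {len(parts)}")
    return dict(zip(FIELDS, parts))

# One line of typing replaces tabbing through three separate widgets.
print(parse_entry("10442 150.00 2020-01-15"))
```

The point is that a fixed field order lets a practiced user keep their old muscle memory instead of clicking through a GUI.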
The video store where I grew up had a similar text based interface to some floppy located database for rentals. Amazingly fast and some override button to force invalid entries when someone else had messed up or something was registered wrong.
Binaries ran straight from a floppy with no fancy OS to interrupt the very simple yet manually tedious things it did. Many apps today are way overcooked for what they do.
BART, I wonder if they’ve upgraded their system software to something more modern? They certainly could add more intelligence (system health, wear, etc.) and begin automating more aspects, so it could make sense to upgrade here, but I’m sure there are lots of places where upgrades are not necessary and the old software is good enough, so long as it’s still supported by a vendor.
I'd say that's especially true in places like power plants. The old stuff is impossible to hack remotely because it's not on any networks. The rush to put everything on the network has made these types of systems vulnerable and places are only now trying to figure out how to secure them.
Not everything needs to be rewritten in React using Google’s Material UI design standards, but my god, our public software really sucks.
Even as the article talked about the tax assessors office... I know someone who bought a condo, but SF can’t collect property tax because their office is so backlogged. So instead they told the owner to make sure they save the money because in a couple of years they’re going to come asking for it.
Turns out SF Muni and Clipper Card (and possibly Clipper Card's billing vendor?) are two/three completely independent organizations. Everything is passed between them in batches with up to a several-month delay, with batches often arriving out of order. This makes auditing nearly impossible and is what made me think something sketchy was happening. It's a miracle it works at all.
You're putting the cart before the horse there. It's the transit agencies themselves that advocate for the lack of prices. BART, for instance, held out on implementing TransLink/Clipper for ages because they wanted their own BART-only cash purse on the card. Clipper is still fairly new and I don't think the fare rules have changed much, if at all, since its inception.
IME they're able to collect property tax, no problem.
"SF can't collect the full and accurate amount of property tax in a timely manner from properties that were recently reassessed because their local assessor office is backlogged due to use of old and inefficient software."
Which is basically what this whole article is saying.
Of course, telemetry is the first thing that comes to mind about modern software.
Me, I'd probably go with a static executable with a terminal UI and a flat file sqlite database that could be copied by mortals, and hope that whatever I set up as a backup regime would survive future migrations to new hardware.
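As a sketch of the "flat file sqlite database that could be copied by mortals" idea: the entire datastore is one ordinary file on disk, so backup is just copying that file. The table name and schema below are illustrative assumptions.

```python
import sqlite3

# One portable .db file on disk; backing it up is a plain file copy.
conn = sqlite3.connect("licenses.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS licenses (tag INTEGER PRIMARY KEY, owner TEXT)"
)
conn.execute("INSERT OR REPLACE INTO licenses VALUES (?, ?)", (1042, "J. Smith"))
conn.commit()

for row in conn.execute("SELECT tag, owner FROM licenses"):
    print(row)
conn.close()
```

Because SQLite is an embedded library rather than a server, the static executable carries the whole stack with it, which is exactly what makes future hardware migrations survivable.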
And there's nobody that can help them fix anything. So upgrade, you say? Then they'd have to start from scratch. And they long ago laid off the accounting experts, tax lawyers and business planners. They have button-pushers and data entry folks. Nobody left who knows where the data comes from, where it goes, or who all uses it.
Rumor is the company has been trying to re-write the control software to run on Windows 7 for the last 10 years (name rhymes with spaghetti)
We interact with it using little data packets (ASCII, sent from a VB app) of coordinates to tell it what to do.
Insane amounts of money would be lost if that machine goes offline for a day. We can't just replace it unless it's 100% compatible. Some of our bigger competitors have 10 of these machines.
There's no real benefit of upgrading them, though. We'd still run the whole damn machine behind its own ASA and on its own network. The users don't care. And our programmers wouldn't be happy to change it.
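For flavor, here is a hedged sketch of what a "little ASCII data packet of coordinates" might look like on the wire. The framing (STX/ETX control characters, comma separators, three decimal places) is a guess at a typical industrial convention, not the machine's actual protocol.

```python
# Assumed framing: STX <x>,<y>,<z> ETX, all plain ASCII.
STX, ETX = "\x02", "\x03"

def build_packet(x: float, y: float, z: float) -> bytes:
    """Frame three coordinates as a plain-ASCII packet."""
    body = f"{x:.3f},{y:.3f},{z:.3f}"
    return (STX + body + ETX).encode("ascii")

print(build_packet(12.5, 0.0, -3.25))
```

A protocol this simple is part of why such machines live so long: anything from a VB app to a Python script can speak it, so the controller never forces a client-side rewrite.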
As long as the computer is well-isolated, the biggest ongoing risk here is replacing old and failing PC components when necessary. But the same might also be true of the industrial systems themselves.
DOS with green-screen terminals
Not a horror. If it works, why mess with it?
hand-coded apps booting on home-brew hardware
Also not a horror. Bespoke hardware is everywhere, as are hand-coded apps. AI does not yet build software.
Still not a horror. Perhaps you just don't have a lot of experience with legacy systems.
Can you still get parts for said machine? Do the disk have bit errors? Is the RAM faulty?
Things fail, sometimes in non-obvious ways. It's important to be able to fix your existing system. Often times that means having a plan to upgrade to a new system.
Sure, why not?
Worst case scenario for a DOS machine is that you run it in DOSBOX or a similar VM.
I contract for a few companies that supply local governments in the UK. I regularly have to fill out stringent checklists about IT security provisions. PCI-DSS and beyond.
So I always chuckle when we meet them on-site and they're running an XP install with IE8, plastered with toolbars. These are machines I would sooner throw into the abyss than try to rescue. They'll make me jump through hoops but there they are inputting and manipulating citizen data on machines that are almost certainly part of a botnet.
Tablets are also increasingly a problem. As a b2b-service webdev, I still see a lot of first gen iPad Minis in active service in enterprise. Last patched mid-2016. Oh well.
The next big worm attack is going to be devastating to organisations like this.
I used to work for a large oil company. Their IT guys were pretty adamant about security. Right before I left the company, they purchased another smaller distributor. All their guys had ipad minis and same thing, were last patched in 2016. The IT guys right away were telling the execs, "Yeah, they're all getting Win10 laptops, fully patched and locked down to use only our software. No way we're going to keep using these."
This raises so many questions.
The entire industry knows this doesn't work, but there's no political will to fix it because it's easy to attack an opponent for spending money with no defined goal, or for waiving the anti-corruption measures that drive a lot of the regulations that require defining everything up front.
Project officials did not enforce proper code development practices, and there was insufficient testing of the software.
Sheesh, I wonder how much of the budget was burned on inflated admin fees?
That's some serious scope creep going on.
I truly do wonder exactly how much software you could produce if you had 2.2 billion dollars to do it.
Governments are getting taken for a ride.
What was the original budget?
Were there cost overruns?
Who got the tender, and why?
Where was the complexity that made it a $100M, decade long project?
You'd have to pay me a lot to touch that with a 100 foot pole, or I'd have to be a pretty down on my luck subpar developer to touch that. But hey, to the lowest bidder we go!