My company is finally upgrading away from a product written in a 4GL. The product probably started out on Unix but was ported to Windows decades ago. It has both web and classic VB front ends.
All the source code is available, and theoretically I could make changes and compile it up. The language itself is basically just plain procedural code but with SQL mixed right in -- somewhat like dBase or FoxPro but worse. I think the compiler produces C code, which is then compiled with a C compiler, but it's been a while since I looked into it. It requires a version of the Korn shell for Windows as well.
The other good reason Google has is that it puts them entirely in control of the lists. If they don't want Chrome to block ads on Google properties, they can opt those properties out of the block lists.
It's basically free money. You get money from people who subscribe and nothing from people who don't.
Since so many companies are simultaneously doing this app-powered, subscription-based value extraction, there is no competitive disadvantage to doing it.
I grew up before the Internet and we still craved connectivity with our computers. I remember dialing into BBSes and playing turn-based text games and it was amazing. It was also the best way to get software; my computer would be pretty boring if the only software I had was what I created myself or purchased in a box.
I also could have done so much more with all my computers, from my Commodore 64 to my 286, if I had had the vast information resources that are available now.
I think the difference is that in the days of the nascent internet, connecting with people meant much more than it does now. You dial into a BBS or log into a MUD and you have a small-ish community of real people that you can develop relationships with. Modern internet connectivity almost means the opposite: all the major services are oriented toward moneymaking, nothing is genuine, there is no sincerity, most behavior is motivated by accumulation of worthless social capital.
So, the society that you craved connection with no longer exists now that you are able to connect. This is another thing that, seemingly, has to be rebuilt from the ground up locally.
I got started with a 1200 baud modem, back in the late '80s. I miss the local community found on BBSes and the early, text-oriented Internet providers. There seems to be no replacement for that at all. Any "local"-oriented subreddit, Discord, etc. is full of bots and spammers.
I've significantly improved the performance of queries by undoing someone else's #5 when it wasn't strictly needed. Sometimes breaking a query into many smaller queries is significantly less efficient than giving the query optimizer the entire query and letting it find the best route to the data.
If you've done #5 without doing #6, you'll likely never notice that you're doing something suboptimal. My advice is to avoid premature optimization: do things the most straightforward way first, and only optimize if needed. Most importantly, don't code in SQL procedurally -- you're describing the data you want, not giving the engine instructions on how to get it.
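To illustrate with a hypothetical customers/orders schema (tables made up for the example): the procedural instinct is to fetch a list and then loop over it with one tiny query per row, so the optimizer only ever sees fragments of the real question.

    -- Procedural style: the application loops, issuing one small
    -- query per customer; the optimizer never sees the whole problem.
    SELECT id FROM customers WHERE region = 'EU';
    -- ...then, for each returned id, the application runs:
    SELECT SUM(total) FROM orders WHERE customer_id = ?;

    -- Declarative style: one statement describing the data you want,
    -- so the optimizer can pick join order and indexes globally.
    SELECT c.id, SUM(o.total) AS order_total
    FROM customers c
    JOIN orders o ON o.customer_id = c.id
    WHERE c.region = 'EU'
    GROUP BY c.id;

The second form gives the engine the whole question at once, which is exactly what lets it find the best route to the data.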
I've had the opposite experience. Every attempt to adapt processes to an off-the-shelf tool has been a subpar experience for everyone. I've brought a few of these in house as fully custom software, and the end result has been a better user experience and faster changes. If we can buy something that fits the need, we will. But if we can't, we build. And we build a fair bit.
I disagree with "the organization is likely less unique than you think". If you're big enough you will have unique requirements that nobody else has. I'm in the middle of that now, on a project to install some industry-standard software that runs the whole business, and we have to customize it and build custom integrations into it. I wouldn't want to build software this complex in house, but if I were given the resources and tasked to do it, I could, and it would be better.
Scenario 1: You're doing something that every other business is doing. E.g. ERP/accounting, sales, contact center, etc.
Scenario 2: You're doing something few other businesses are doing. E.g. your actual customer business, creative, etc.
(1) is amenable to making your process fit software, to good results. (2) is usually a train wreck.
Unfortunately, figuring out if your thing is scenario 1 or 2 is non-trivial.
Canonical example: EMR/EHR systems in healthcare. You'd think they'd all be the same... but there are so many integrations with other systems and/or different sorts of specialists that a real-world implementation has substantial functionality gaps (papered over with custom work).
My impression is that most people don't understand just how awful most commercial business software actually is.
One thing our business does that every other business does is vacation and overtime tracking. We have a custom in-house application for that, and we've yet to find a commercial replacement that is, in any way, half decent. For most payroll/HR systems, this is merely an add-on feature and doesn't get much attention.
For overtime, integration with our financial system allows overtime to be charged to the correct files, and this is something that nobody does (or does well). Doing just this little bit probably makes the project pay for itself.
Commercial business software manufacturers are isolated from the users, and in a way isolated from consequences, as long as they fulfil the contractual obligations (which practically never include a 'make the users happy' stipulation).
Besides the lack of attention, you also have gargantuan legal requirements you need to integrate, which change all the time -- sometimes several per country.
One advantage of building in house is that you're only building for your own company. This is significantly less work than building commercial software for multiple clients (which I have also done). I can't overstate how much less work this is and how much of a better experience it can be for users.
As an example, for calculating annual vacation entitlement, we have some pretty complicated rules. But every company in every country has their own set of rules so most HR software doesn't bother calculating it -- you just figure it out manually and input it every year for every employee. But because we have just one "client", our rules are just code that we can change as needed, and that code can arbitrarily use whatever information we have. This saves HR a ton of manual work. But this only works because it only needs to be one set of hard-coded rules.
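As a rough sketch of what I mean (the rules below are made up for illustration; ours are more involved), entitlement can literally just be a function you edit when policy changes:

    #include <stdio.h>

    /* Made-up entitlement rules, purely illustrative. The point is
       that with a single "client" this is ordinary code you edit,
       not a generic rules/configuration engine. */
    static int annual_vacation_days(int years_of_service, int is_manager)
    {
        int days = 15;                         /* base entitlement */
        if (years_of_service >= 5)  days += 5;
        if (years_of_service >= 15) days += 5;
        if (is_manager)             days += 2; /* negotiated perk */
        return days;
    }

    int main(void)
    {
        printf("%d\n", annual_vacation_days(7, 0)); /* prints 20 */
        return 0;
    }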
The problem when building it yourself is that this is usually done by "generic" developers who discover edge cases (often together with the requester) that threaten the whole model.
A company doing payroll for us ("us" being a multinational company) asked for a "typical payroll" to start with. Fortunately we had experienced people on the pay side who rejected that company because of this question (for one, they should know; for two, they should know that "typical" will cover maybe 60% of the cases -- I thought that was an exaggeration until I discovered the reality of calculating pay in France).
A good company specializing in "pay" or "vacation" (which are very regulated over here) will know the "typical" case and the edge cases.
> has their own set of rules so most HR software doesn't bother calculating it -- you just figure it out manually and input it every year for every employee
Beg to disagree. This is the complexity that large ERP firms handle, and why Oracle, Salesforce, etc. are expensive to implement. They figure out the commonality (if any) and build for it. Then they add on features specific to the countries they target, and then they add the ability to configure for your own situation (to a certain level).
PeopleSoft did this for payroll and workforce administration, which is part of how they cornered the market for HCM.
I think most moves to EMRs were insufficiently disruptive, e.g., electronic orders recapitulate the old paper orders without using the opportunity to reduce ambiguity and insert constraints to prevent common errors.
The problem is that the paper form is the interface, because the ecosystem was designed around it.
10 years ago or so, I asked a health insurance company why a specific digital form could only list up to 16 diagnostic codes (otherwise a duplicate form had to be created for the additional codes).
They thought about the question for a second, then said that's how many were on the paper form the digital system had been created from (35+ years ago).
> If you're big enough you will have unique requirements that nobody else has.
Definitely agree, and that is very interesting to hear. I only mean to speak to my experience, and what I saw in a lot of cases was unique requirements that were due to the organizational equivalent of tech debt (i.e., things like "our books are organized in this unique way because we acquired another company many years back but kept their stuff separate because it was the path of least resistance at the time").
That’s a legit issue. What’s the value of changing a deep-seated process?
I have definitely seen examples on both sides of that question. Especially in a place like a public university with multiple collective bargaining agreements. The unions aren’t going to accept significant change without some sort of cost.
Typically, since processes are built around the system, nobody understands the actual business needs.
> If you're big enough you will have unique requirements that nobody else has.
The problem is when EVERY process is that way.
Yes, each business has unique aspects to its processes, but when every process is heavily customized by people who have no business designing processes and applications, organizations start to hamstring their flexibility and scalability.
It's like hand-making a Ferrari with completely custom parts when what you need is a Toyota Camry. If you aren't gonna race with it, it's a waste of money.
I have been working in the ERP space for over a decade, and almost ALL customers have no criteria for when to customize or when to keep a customization. They don't do any cost-benefit analysis or any strategic planning beyond building a customized tool for the thing immediately in front of their nose.
Personally I hate software customizations. I prefer to build from scratch using proper software development tools than use whatever piss-poor customization system commercial software typically provides.
I recently sold my director on building some software from scratch rather than trying to combine two commercial products into some Frankensteined solution. The motivation for Frankensteining it was that we are paying for both services, so we should use them (and at least one is very customizable). After 6 months of failing to get these systems to play together in an acceptable way, I finally asked a team member to spend 3 days building a mockup of a new app. From there it was quickly approved, and we hope to roll out next month. I only wish I had done that sooner; we would have had way more time for development.
Request headers aren't going to do anything. Browser settings, maybe. If browsers were not owned by advertising companies, they'd just disallow this tracking and that would be the end of it.
This also solves nothing. It's up to the ethics of the company how they choose to group "none", "essential", and "all", and what kind of server-side tracking they do anyway. It's no harder to do the wrong thing with the current system, but at least the headers would be invisible to the user.
Alternatively: only allow the website to set cookies if it presents headers with the different options, in a standardized way, so the user can choose to pre-set a preference and not be bothered with the cookie nag modal.
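Something along these lines (header names entirely made up here -- no such standard exists):

    GET / HTTP/1.1
    Host: example.com
    Consent: essential              <- browser sends the user's pre-set choice

    HTTP/1.1 200 OK
    Consent-Options: none, essential, all
    Set-Cookie: session=abc123      <- honored only if "essential" permits it

The nag modal disappears because the negotiation happens in the protocol instead of the page.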
I would think you only have to generate enough legacy mouse messages that your application works. I'm probably missing some catch here, but I know plenty of applications that simulate input.
(Calling them "legacy" messages seems weird -- this is the normal, current way that input messages work for regular applications.)
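For reference, injection itself is just SendInput; the system then synthesizes the usual WM_MOUSEMOVE / WM_LBUTTONDOWN / WM_LBUTTONUP messages for whatever window is under the cursor. A minimal sketch:

    #include <windows.h>

    /* Move to the center of the primary monitor and left-click.
       MOUSEEVENTF_ABSOLUTE coordinates are normalized to 0..65535. */
    int main(void)
    {
        INPUT in[3] = {0};

        in[0].type = INPUT_MOUSE;
        in[0].mi.dwFlags = MOUSEEVENTF_MOVE | MOUSEEVENTF_ABSOLUTE;
        in[0].mi.dx = 65535 / 2;
        in[0].mi.dy = 65535 / 2;

        in[1].type = INPUT_MOUSE;
        in[1].mi.dwFlags = MOUSEEVENTF_LEFTDOWN;

        in[2].type = INPUT_MOUSE;
        in[2].mi.dwFlags = MOUSEEVENTF_LEFTUP;

        return SendInput(3, in, sizeof(INPUT)) == 3 ? 0 : 1;
    }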
Even if you do generate legacy input events, it is worth noting that the kernel still knows that they're not real input events. It seems to keep track of the current event on a per-thread basis (maybe it's stuffed into the TIB or something, but it could also just not be visible; I didn't search too deeply). The knock-on effect of this is that sometimes DefWindowProc won't do the expected thing if you feed it a synthetic event, which could lead to strange behavior. You can of course use CreateSyntheticPointerDevice[1] and try to emulate it like that, but obviously then it knows that the events are coming from a synthetic pointer device (which is also system-wide and not just per-application!). So it feels like whichever way you go, behavior will be slightly different than expected.
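Incidentally, one place this bookkeeping is even observable from user mode is GetCurrentInputMessageSource (Windows 8+), which reports whether the message currently being processed was injected. A sketch, error handling elided:

    #include <windows.h>

    /* Inside a window procedure: check whether the mouse message being
       handled came from real hardware or was injected (e.g. SendInput). */
    LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wp, LPARAM lp)
    {
        if (msg == WM_LBUTTONDOWN) {
            INPUT_MESSAGE_SOURCE src;
            if (GetCurrentInputMessageSource(&src) &&
                src.originId == IMO_INJECTED) {
                OutputDebugStringA("synthetic click\n");
            }
        }
        return DefWindowProc(hwnd, msg, wp, lp);
    }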
Will this matter? I'm not quite sure.
I think Microsoft's WebView2 component enables Mouse-in-Pointer mode using the undocumented syscall for doing it per-HWND, but apparently Raw Input is per-entire-process, so if you aren't generating "legacy" events, will a WebView2 component in your process function as expected? Maybe if you go the synthetic pointer route, but again, that's a pretty big can of worms in and of itself.
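For context, Raw Input registration looks roughly like this; note that the registration applies to the whole process even though WM_INPUT is delivered to the hwndTarget you specify -- hence the per-process concern:

    #include <windows.h>

    /* Register for raw mouse input. Sketch only. Passing RIDEV_NOLEGACY
       instead of 0 is what suppresses the WM_MOUSE* "legacy" messages
       discussed above, process-wide. */
    BOOL register_raw_mouse(HWND hwnd)
    {
        RAWINPUTDEVICE rid;
        rid.usUsagePage = 0x01;   /* HID generic desktop controls */
        rid.usUsage     = 0x02;   /* mouse */
        rid.dwFlags     = 0;      /* or RIDEV_NOLEGACY */
        rid.hwndTarget  = hwnd;   /* WM_INPUT goes to this window */
        return RegisterRawInputDevices(&rid, 1, sizeof(rid));
    }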
Windows input is indeed very tricky. Seems like there's only one way to figure out some of these answers. Perhaps I should add RawInput to my test application and find out what the interaction is.