First, there was no guarantee you could get useful data out of the programs of the PC and Mac era if they didn't explicitly support getting data out. I've got tons of sheet music in a Windows 3.1-era notation program that I can't get out, because it's stored in a binary format and there's no built-in exporter.
Second, APIs for web services actually do support lots of this. I have a very tiny script taking all my Foursquare checkins and sending them to a private Slack, for instance. I wrote the script. I didn't need approval from Foursquare or Slack.
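For a sense of scale, a script like that really is tiny. Here's a minimal sketch using only the standard library; the token and webhook URL are placeholders, and the message format is my own invention, but the endpoints (Foursquare's v2 checkins API, Slack's incoming webhooks) are the real ones:

```python
import json
import urllib.parse
import urllib.request

FOURSQUARE_TOKEN = "YOUR_OAUTH_TOKEN"                        # placeholder
SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/XXX"   # placeholder

def checkin_to_message(checkin):
    """Format one Foursquare checkin as a Slack webhook payload."""
    venue = checkin.get("venue", {}).get("name", "somewhere")
    when = checkin.get("createdAt", 0)
    return {"text": "Checked in at %s (ts=%d)" % (venue, when)}

def fetch_checkins(token):
    """Pull recent checkins from the Foursquare v2 API."""
    qs = urllib.parse.urlencode({"oauth_token": token, "v": "20180101"})
    url = "https://api.foursquare.com/v2/users/self/checkins?" + qs
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    return data["response"]["checkins"]["items"]

def post_to_slack(message, webhook_url):
    """POST a JSON payload to a Slack incoming webhook."""
    req = urllib.request.Request(
        webhook_url,
        data=json.dumps(message).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)

# Wiring it together (makes network calls, so commented out here):
#   for c in fetch_checkins(FOURSQUARE_TOKEN):
#       post_to_slack(checkin_to_message(c), SLACK_WEBHOOK_URL)
```

That's the whole thing, and again: no approval needed from either company.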
I wonder if the author has actually tried using any APIs. The fact that some '90s desktop apps were easily scriptable and some '10s web services aren't scriptable has nothing to do with them being desktop apps or web services.
(Not to mention most '10s web services being scrapable. I don't remember any screen-scraping tools for Windows 3.1 or classic Mac UIs that were anywhere near as good as the web-scraping tools we have today.)
I don't know why you keep lumping the Mac in with the PC; on the Mac -- not counting AppleScript, the myriad of data translation tools, etc -- there was rich copy+paste support that worked between applications.
> I've got tons of sheet music in a Windows 3.1-era notation program that I can't get out, because it's stored in a binary format and there's no built-in exporter.
So write a conversion tool that parses the file format; if it's sheet music, the format isn't going to be too hard to decipher. How is this any different, other than in scale, from the "web APIs" you claim solve the author's complaints?
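To be fair about what "write a conversion tool" entails: the skeleton is short. This is a sketch against a purely hypothetical chunk-based layout (a magic number, a note count, then pitch/duration pairs); NoteWorthy's actual format is not documented here and would need real reverse engineering:

```python
import struct

def parse_notes(blob):
    """Parse a *hypothetical* binary sheet-music format:
    a 4-byte magic string, a note count (uint16, little-endian),
    then one (pitch, duration) byte pair per note."""
    if blob[:4] != b"SONG":
        raise ValueError("bad magic")
    (count,) = struct.unpack_from("<H", blob, 4)
    notes = []
    for i in range(count):
        pitch, duration = struct.unpack_from("BB", blob, 6 + 2 * i)
        notes.append({"pitch": pitch, "duration": duration})
    return notes
```

The hard part, of course, is figuring out the layout in the first place, not writing the parser.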
> Second, APIs for web services actually do support lots of this.
No, they don't. They require writing code, which requires that you actually be able to, you know, program.
In addition, even being able to use these "web service APIs" requires that you give all your data, control of your applications, and the ability to get work done to a 3rd-party service.
That's not a replacement for a desktop application.
> The fact that some '90s desktop apps were easily scriptable ...
Not some. Almost all. Even when they didn't explicitly support it, there was generally a way to get what you wanted done, done, thanks to scripting language integration with standard OS-level APIs upon which the applications relied.
> (Not to mention most '10s web services being scrapable. I don't remember any screen-scraping tools for Windows 3.1 or classic Mac UIs that were anywhere near as good as the web-scraping tools we have today.)
You mean the classic Mac OS where you could literally hit "Record" in AppleScript Editor and produce a surprisingly workable script based on your actual UI actions in the applications?
Or modern Mac OS and iOS, where things like VoiceOver give you semantic access to the entirety of the UI?
I appreciate your optimism. NoteWorthy Composer was first released in 1994; the last time I looked at it was around 2011. It looks like that year (after I'd stopped looking, because I was no longer singing with the group that used binary .nwc files), MIT's music21 project started piecing together the beginnings of a parser. (Also, sometime around 2007, the software's authors voluntarily added a text export format. That doesn't count, because it depends on the goodwill of the software's authors, just like the web services being decried.)
If it took fifteen years for someone to decipher the format, and it only started being even slightly parseable in the modern web-API era, when we've supposedly left scripting behind, I don't think you can claim either that it "isn't going to be too hard to decipher" or that we're losing anything in our modern era.
(Also, like, my memory of software in the '90s is that in several cases the authors went out of their way to add obfuscation or anti-tampering features to their data formats.)
> How is this any different, other than in scale, from the "web APIs" you claim solve the author's complaints?
I didn't claim that they solve the author's complaints. I claimed that it makes no difference whether the software is web-based or local.
Your Mac vs. PC distinction is much the same. Some web services are pretty good at this; some aren't. If this is a criticism of the PC platform, phrase it as a criticism of the PC platform, not the web.
> No, they don't. They require writing code, which requires that you actually be able to, you know, program.
How is "program" different from "script"?
I don't know what kind of application the author is talking about, or what kind of support he needs. Here on Windows, I can automate things just fine.
Some apps, like the OS components, the web browser, MS Office, Visual Studio, and Photoshop, have their own APIs exposed, documented, and supported. For other apps I can use the OS-provided UI Automation API if I need to.
In addition, the industry has shifted towards open file formats and network protocols. This makes it possible to process another app's data in a way that was impossible back in the Amiga days, when most data formats were binary for performance reasons.
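As a concrete illustration of the open-format point: once an app exports plain JSON, any script can pick its data apart with nothing but the standard library. The playlist export below is made up, but the pattern is what matters:

```python
import json

# A hypothetical playlist export from some media app.
EXPORT = """
{"playlist": {"name": "Road trip",
 "tracks": [{"title": "A", "seconds": 201},
            {"title": "B", "seconds": 145}]}}
"""

def total_seconds(doc):
    """Sum track lengths from the (hypothetical) playlist export."""
    data = json.loads(doc)
    return sum(t["seconds"] for t in data["playlist"]["tracks"])
```

Try doing that against a proprietary binary blob from 1993.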
Average end users aren't doing all of these things, but that's not a technical problem; it's just demographics and marketing. Old computers were expensive equipment targeted at scientists and engineers. Modern computers are cheap; they're for everyone from 5-year-olds to 100-year-olds who use them to post selfies and watch cat videos. You cannot teach them all programming; they have better things to do with their lives.
Even if you don't, I assume that you are a programmer based on context and your comment, which means that the applications you use were designed for people who know how to program and therefore are more likely to be interested in scripting. Also, as a programmer, you're more likely to be using a lot of free software tools even if you're not running a libre system.
Take iTunes, for instance. On Windows, I'm under the impression that it's not scriptable at all. On OS X there's AppleScript, but it's somewhat underpowered (back in my Apple days I tried scripting it and found that some metadata fields, namely Album Artist, were inaccessible from AppleScript). It's also woefully underdocumented.
A smartphone is for checking the weather, reading, looking up places to eat, movies, requesting an Uber or figuring out what bus is coming next, and so on. If you're not using your smartphone for that stuff and instead are scripting your music collection, why even leave your house? Just stay at home and use a computer.
I'd like to be able to use it for more. It is, after all, a computer.
Also, truth be told, since I got a smartphone, the muses just haven't spoken to me with an idea of something to script, that's interesting enough to capture my attention.
Lack of a decent keyboard and editor are barriers for me. I need a larger screen simply to see what I'm doing, in my old age. And most of what I do with scripting, nowadays, has to do with things that are not on my phone anyway, such as processing big files and interacting with measurement instruments.
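The "big files" part, at least, is the classic streaming pattern: one pass, constant memory. A sketch, with an invented whitespace-separated log format where the last field is numeric:

```python
def stream_stats(lines):
    """One pass over arbitrarily large input: count non-blank lines
    and sum a numeric last field, without holding the file in memory."""
    count = 0
    total = 0.0
    for line in lines:
        fields = line.split()
        if not fields:
            continue
        count += 1
        total += float(fields[-1])
    return count, total

# Usage with a real file (a file object iterates lazily, line by line):
#   with open("huge.log") as f:
#       n, s = stream_stats(f)
```

Doing that comfortably still wants a real keyboard and a real screen.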
What's happened instead is that I got a tablet with an OS that's scriptable, and that has a detachable keyboard. Now, all I have to do is wait for those damn muses to speak up. ;-)
Unfortunately installing Python manually has not gotten much easier, though it's certainly doable. You can even use Tkinter, but that is beyond the scope of this comment.
As far as going cross-platform across desktop and mobile, that's a difficult challenge. Java used to be the go-to if you wanted to be platform-agnostic, but that obviously won't cut it if you want to support mobile. You can check out Qt; I only use it on Linux, but it apparently supports Windows, Mac OS X, Android, iOS, and Windows Phone (maybe even more).
Edit: Smartphones and tablets are simply not general-purpose computers. I don't want a desktop-type OS on either my phone or my tablet, and I certainly don't want a tablet OS on my desktop; that defeats the purpose. Smartphones and tablets are good at certain things that require _direct user input_. If you really want a smartphone/tablet with desktop capabilities, there are vendors that sell that stuff. I just can't see a need for it.
What you're missing is, for many of us, scripting is fundamentally how we interact with computing devices in a convenient way. I don't know if I want "a desktop type OS" on my phone or tablet, but I definitely (!!!) want an OS that allows me to easily and flexibly compose disparate pieces of functionality across contexts.
And as far as development goes, it's pretty much a necessity.
(Admittedly, my tablet is a Raspberry Pi duct-taped to an LCD screen, but I would have the same expectations of a 'proper' tablet. (Which is a major factor in why I am using a Raspberry Pi duct-taped to an LCD screen.))
Maybe he is talking about end-user software that can be scripted? But even that isn't true for devs.
Take banking as an example: pulling up your bank balance or making an online payment involves nowhere near the computing power of actually balancing the transactions at all banks across the world each night. You get a tiny sliver of a tiny report of the computing that happens in the background.
Entertainment, personal tools, communication - those are moving mobile. But that is just the surface.
We've had a shift back to centralized data storage and processing.
There were a few decades - the 1980s, 1990s, and 2000s - where the canonical master of a data set might have been local to the user, but cloud computing has moved data back to DCs for centralized storage & processing. And now we are using very smart terminals.
That doesn't invalidate the claim that scripting is dying. Sure, we tech folk can (and do) script all kinds of things, even more so in the enterprise IT world, which practically lives and dies by PowerShell. But back in the '90s in particular, there was such a thing as "power users", who would script the crap out of their systems. That really doesn't exist so much anymore. Now the people in that grey area between end users and coders are more likely to be professional business analysts. The days of a true end user creating their own custom scripts to merge data and functions from disparate apps are getting rare.
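The kind of "merge data from disparate apps" job a '90s power user would script is still only a dozen lines today. A sketch, joining two hypothetical CSV exports (the column names are invented) on a shared `id` column:

```python
import csv
import io

def merge_by_id(contacts_csv, orders_csv):
    """Join two CSV exports (a hypothetical 'contacts' app and
    'orders' app) on a shared 'id' column."""
    contacts = {r["id"]: r
                for r in csv.DictReader(io.StringIO(contacts_csv))}
    merged = []
    for order in csv.DictReader(io.StringIO(orders_csv)):
        contact = contacts.get(order["id"], {})
        merged.append({"id": order["id"],
                       "name": contact.get("name", "?"),
                       "item": order["item"]})
    return merged
```

The capability never went away; the population of people who reach for it did.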
(Keep in mind, HN is an entire community of edge cases... of course we all can spit out exceptions to this trend.)
Now, I do not think this is a bad thing. On the contrary, most of those people were very hard to support, and the ones who did it well and were easy to support often ended up moving into IT/analyst roles anyway. I think we are in a better place today, with a much larger population of coders and analysts, so end users can just tell us what they need and then focus on their actual jobs.
But the claim that end user scripting is dying... Yep, I do agree with that much.
On Linux machines I don't have to install because it's already there :P
i.e., ARexx, and the stuff macOS does (or used to do) with traditional desktop apps.
Can you script copy and paste, or move the cursor? It will be used for nefarious purposes; block it. Can you install and run apps as a user from your home directory? That's a security hole; it should be disabled. Can you read windows' titles and move windows around? That's an information leak; it should be plugged. I like a secure system as much as anybody, but I am worried that this will interfere with my pleasantly automated computing life.
But this article is about something else, so I will stop the rant here.
Given the author's background along with this paragraph, I'm going to go ahead and guess that he's referring to iOS.
But if this is the optimal UI for highly competitive environments, why isn't it also the optimal UI for uncompetitive environments like retail?
1. Self-selection. Highly competitive environments attract extremely intelligent people. Abstractionists can assemble a breadth of tools that systematize their work, while workers in fields like retail are generally less adept at that.
2. A constriction on the UI is similar to a schema constriction. Aside from the occasional "General Notes" field on an intake form, the interaction can be structured more reliably than it could be for a person at a terminal shell. This matters partly because less-skilled people can't be trusted to make higher-level decisions like data structure.
I don't think #2 is a hard limit: UIs can be designed around the limitations of human intelligence, and systemic educational changes could make CLIs friendlier to the general public. But there is a local maximum that is more easily reached with software like Tulip*, and so that is what we have now.
Never thought writing WoW ui add-ons would prepare me for a 6 figure job.
Even in that context, the article is a pile of fear-mongering horse shit.