The death of scripting (sicpers.info)
23 points by eclw on Aug 23, 2015 | 50 comments

This is just historically and non-historically inaccurate.

First, there was no guarantee you could get useful data out of the programs of the PC and Mac era if they didn't explicitly support getting data out. I've got tons of sheet music in a Windows 3.1-era notation program that I can't get out, because it's stored in a binary format and there's no built-in exporter.

Second, APIs for web services actually do support lots of this. I have a very tiny script taking all my Foursquare checkins and sending them to a private Slack, for instance. I wrote the script. I didn't need approval from Foursquare or Slack.
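A glue script like that can be only a few dozen lines. Here is a rough Python sketch of the shape such a script takes; the checkin field names and the separation into a formatting step and a webhook post are illustrative, not the commenter's actual code (Slack's incoming-webhook JSON payload is real, but treat the rest as assumptions):

```python
import json
import urllib.request


def format_checkin(checkin):
    """Turn one Foursquare-style checkin dict into a Slack message string.

    The 'venue'/'name'/'city' keys are illustrative, not the exact API schema.
    """
    venue = checkin.get("venue", {}).get("name", "somewhere")
    city = checkin.get("venue", {}).get("city", "")
    return f"Checked in at {venue}" + (f" ({city})" if city else "")


def post_to_slack(webhook_url, text):
    """POST a message to a Slack incoming webhook as a JSON body."""
    body = json.dumps({"text": text}).encode("utf-8")
    req = urllib.request.Request(
        webhook_url, data=body,
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

The point stands either way: no approval process, just a personal script gluing two services together.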

I wonder if the author has actually tried using any APIs. The fact that some '90s desktop apps were easily scriptable and some '10s web services aren't scriptable has nothing to do with them being desktop apps or web services.

(Not to mention most '10s web services being scrapable. I don't remember any screen-scraping tools for Windows 3.1 or classic Mac UIs that were anywhere near as good as the web-scraping tools we have today.)
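To give a sense of how low the bar is today: even the Python standard library alone can pull structured data out of a page, no special tooling required. A toy sketch (real sites would call for requests/BeautifulSoup and tolerance for messier markup):

```python
from html.parser import HTMLParser


class LinkExtractor(HTMLParser):
    """Collect every href from the <a> tags in an HTML document."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def extract_links(html):
    """Return all link targets found in an HTML string, in document order."""
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links
```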

> First, there was no guarantee you could get useful data out of the programs of the PC and Mac era if they didn't explicitly support getting data out.

I don't know why you keep lumping the Mac in with the PC; on the Mac -- not counting AppleScript, the myriad of data translation tools, etc -- there was rich copy+paste support that worked between applications.

> I've got tons of sheet music in a Windows 3.1-era notation program that I can't get out, because it's stored in a binary format and there's no built-in exporter.

So write a conversion tool that parses the file format; if it's sheet music, the format isn't going to be too hard to decipher. How is this any different, other than scale, than the "web APIs" you claim solve the author's complaints?
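For what "write a conversion tool" typically means in practice: binary reverse-engineering usually starts with struct-style fixed-size record parsing. A hedged sketch, with a note-record layout invented purely for illustration (it is not NoteWorthy's actual format):

```python
import struct

# Hypothetical note record: 1-byte pitch, 1-byte duration,
# 2-byte flags, little-endian. Invented layout for illustration only.
NOTE_RECORD = struct.Struct("<BBH")


def parse_notes(blob):
    """Parse a run of fixed-size note records from a binary blob."""
    usable = len(blob) - len(blob) % NOTE_RECORD.size  # ignore trailing partial record
    notes = []
    for offset in range(0, usable, NOTE_RECORD.size):
        pitch, duration, flags = NOTE_RECORD.unpack_from(blob, offset)
        notes.append({"pitch": pitch, "duration": duration, "flags": flags})
    return notes
```

The hard part, of course, is discovering the layout in the first place, which is exactly what the reply below disputes.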

> Second, APIs for web services actually do support lots of this.

No, they don't. They require writing code, which requires that you actually be able to, you know, program.

In addition, even being able to use these "web service APIs" requires that you give all your data, control of your applications, and the ability to get work done to a 3rd-party service.

That's not a replacement for a desktop application.

> The fact that some '90s desktop apps were easily scriptable ...

Not some. Almost all. Even when they didn't explicitly support it, there was generally a way to get what you wanted done, done, thanks to scripting language integration with standard OS-level APIs upon which the applications relied.

> (Not to mention most '10s web services being scrapable. I don't remember any screen-scraping tools for Windows 3.1 or classic Mac UIs that were anywhere near as good as the web-scraping tools we have today.)

You mean the classic Mac OS where you could literally hit "Record" in AppleScript Editor and produce a surprisingly workable script based on your actual UI actions in the applications?

Or modern Mac OS and iOS, where things like VoiceOver give you semantic access to the entirety of the UI?

> So write a conversion tool that parses the file format; if it's sheet music, the format isn't going to be too hard to decipher.

I appreciate your optimism. NoteWorthy Composer was first released in 1994; the last time I looked was around 2011. It seems that year, after I stopped looking (because I was no longer singing with the group that used binary .nwc files), MIT's Music21 project started scraping together the beginnings of a parser. (Also, sometime around 2007, the software's authors voluntarily added a text export format. That doesn't count, because it depends on the goodwill of the software's authors, just like the web services being decried.)

If it took fifteen years for someone to decipher the format, and it only started being slightly parseable in the modern web-API era where we've supposedly left scripting behind, I don't think you can claim either that it "isn't going to be too hard to decipher" or that we're losing anything in our modern era.

(Also, like, my memory of software in the '90s is that in several cases the authors went out of their way to add obfuscation or anti-tampering features to their data formats.)

> How is this any different, other than scale, than the "web APIs" you claim solve the author's complaints?

I didn't claim that they solve the author's complaints. I claimed that it makes no difference whether the software is web-based or local.

Your Mac vs. PC distinction is much the same. Some web services are pretty good at this; some aren't. If this is a criticism of the PC platform, phrase it as a criticism of the PC platform, not the web.

> No, they don't. They require writing code, which requires that you actually be able to, you know, program.

1. https://zapier.com/

2. How is "program" different from "script"?

Scripting isn't dying; it's doing great. For example, every competent systems engineer needs to be comfortable with scripting: bash/ksh/zsh on *nix, PowerShell and VBScript on Windows.

I don't know what kind of application the author is talking about, or what kind of support he needs. Here on Windows I can automate things just fine. Some apps, like all the OS components, the web browser, MS Office, Visual Studio, and Photoshop, have their own APIs exposed, documented, and supported. For other apps I can use the OS-provided UI Automation API if I need to. In addition, the industry has shifted towards open file formats and network protocols. This makes it possible to process another app's data in a way that was impossible in the early days of the Amiga, when most data formats were binary for performance reasons.

Average end users aren't doing all those things, but that's not a technical problem; it's just demographics and marketing. Old computers were expensive equipment targeted at scientists and engineers. Modern computers are cheap; they're for everyone from 5-year-olds to 100-year-olds who use them to post selfies and watch cat videos. You can't teach them all programming; they have better things to do with their lives.

Every application I use daily is scriptable. My OS is scriptable. My shell is scriptable. My text editor is scriptable. I hate to be negative, but I call BS.

Do you run a free software focused system like GNU/Linux? They're much more scriptable than OS X and Windows because (1) a lot of programs embrace the Unix interoperability conventions and (2) free software means that if someone felt that a program's scripting facilities are lacking they can submit a patch.

Even if you don't, I assume that you are a programmer based on context and your comment, which means that the applications you use were designed for people who know how to program and therefore are more likely to be interested in scripting. Also, as a programmer, you're more likely to be using a lot of free software tools even if you're not running a libre system.

Take iTunes, for instance. On Windows, I'm under the impression that it's not scriptable at all. On OS X there's AppleScript, but it's somewhat underpowered. (I remember back in my Apple days I tried scripting it and found that some metadata fields, namely Album Artist, were inaccessible from AppleScript.) It's also woefully underdocumented.

Windows has PowerShell, which offers an amazing level of power over the system. Everything is an object; everything has properties that can be manipulated. I don't look after Windows infrastructure at all, but the people I know who do rave about the level of control they get, and some of the stuff they do makes me jealous. My understanding is that the GUIs for Exchange etc. have essentially become skins on top of it, and that's one of the changes allowing Microsoft to start promoting GUI-less versions of Windows for cloud servers and the like.

On OS X, even Spotify is scriptable.

Is Spotify on iOS scriptable?

Why do people keep bringing up mobile devices like they're some kind of trump card? If you really want to script a mobile device, install whatever Linux distro you want on an Android phone and go to town. Congratulations, you've just created a tiny underpowered laptop with a shitty screen and an even shittier keyboard.

A smartphone is for checking the weather, reading, looking up places to eat, movies, requesting an Uber or figuring out what bus is coming next, and so on. If you're not using your smartphone for that stuff and instead are scripting your music collection, why even leave your house? Just stay at home and use a computer.

> A smartphone is for checking the weather, reading, looking up places to eat ...

I'd like to be able to use it for more. It is, after all, a computer.

Do you use scripting on your smartphone or tablet?

I've definitely looked into it. I bought, with enthusiasm, a tablet with the first version of Android that supported Python. Installing the necessary pieces and editing programs both seem super awkward, and my beloved Tkinter hasn't been ported. I know there are ways, but if it's not more convenient than just walking back to my desk and doing it on a PC, then it won't happen.

Also, truth be told, since I got a smartphone, the muses just haven't spoken to me with an idea of something to script, that's interesting enough to capture my attention.

Lack of a decent keyboard and editor are barriers for me. I need a larger screen simply to see what I'm doing, in my old age. And most of what I do with scripting, nowadays, has to do with things that are not on my phone anyway, such as processing big files and interacting with measurement instruments.

What's happened instead is that I got a tablet with an OS that's scriptable, and that has a detachable keyboard. Now, all I have to do is wait for those damn muses to speak up. ;-)

You should check out QPython or QPython3 from the Play Store. It includes the Python binary, an editor, and a terminal emulator, all bundled up. It's so easy to set up that a child could do it.

Unfortunately installing Python manually has not gotten much easier, though it's certainly doable. You can even use Tkinter, but that is beyond the scope of this comment.

I'll definitely check it out. My dream is to be platform independent, i.e., to be able to write / edit / run the same program on any device. So far so good with Windows and Linux, including Raspberry Pi, but my phone feels left out.

The aforementioned QPython includes the Kivy libraries. Kivy is cross-platform across Linux, Windows, Mac OS X, Android, and iOS. It's a Python library but seems mostly oriented towards games/graphics. That's a drawback for me, but some might be interested.

As far as being cross-platform across desktop and mobile goes, that's a difficult challenge. Java used to be the go-to if you wanted to be platform-agnostic, but that obviously won't cut it if you want to support mobile. You can check out Qt: I only use it on Linux, but it apparently supports Windows, Mac OS X, Android, iOS, and Windows Phone (maybe even more).

I'll check those things out. I only need to make any of them work once, because I'll put a wrapper around my most frequently used functions. And I only used Tkinter because it was sitting there, ready to use, before I even learned how to install packages.

If you're looking at your smartphone or tablet and wanting to script it, put it down and find yourself a desktop or laptop.

Edit: Smartphones and tablets are simply not general-purpose computers. I don't want a desktop-type OS on either my phone or my tablet. I certainly don't want a tablet OS on my desktop. That defeats the purpose. Smartphones and tablets are good at certain things that require _direct user input_. If you really want a smartphone/tablet with desktop capabilities, there are vendors that sell that stuff. I just can't see a need for it.

Android smartphones and tablets can most definitely serve as general-purpose computers (for developers on the go, at least). My daily driver is loaded up with the latest bash, python2, python3, gcc, openssl, ssh, busybox, etc. I'd never carry around a laptop, but my phone is always on me, so I can get right to work anytime I have a few spare minutes. Just pull up a terminal and ssh into a development box, or work locally. It's as easy as can be. You can even run a full Debian image with a desktop environment like XFCE in the background if need be. I usually just ssh in, but you can also connect over VNC for desktop use (this is generally best on a tablet). It works really well for me, anyway.

Why? Especially if you ssh into a workstation. That seems to support my point...

Because it's just not practical (or desirable) to carry around a laptop everywhere I go, whereas my phone is always charged and on me. Nor do I always have a reliable network connection.

"If you're looking at your smartphone or tablet and wanting to [use] it, put it down and find yourself a desktop or laptop."

What you're missing is, for many of us, scripting is fundamentally how we interact with computing devices in a convenient way. I don't know if I want "a desktop type OS" on my phone or tablet, but I definitely (!!!) want an OS that allows me to easily and flexibly compose disparate pieces of functionality across contexts.

The only reason this isn't possible for most people is because the people who sell that hardware have made it difficult, on purpose.

I use scripting all the time on my daily-driver Android phone. I think I'd have a hard time listing all the uses as they are so numerous. I have init scripts to do essential stuff at boot (like mount encrypted volumes) and other scripts for backing up. I also have scripts that execute upon trigger events (e.g. notify my desktop over ssh if X occurs). Scripting may not be absolutely necessary, but it sure is immensely useful.

Then as far as development goes, it's pretty much a necessity.

I script my phone at least as much as I scripted my phone in the '90s. Probably more.


(Admittedly, my tablet is a raspberry pi duct-taped to a LCD screen, but I would have the same expectations of a 'proper' tablet. (Which is a major factor in why I am using a raspberry pi duct-taped to a LCD screen.))

I wonder what he's talking about. I recently created a script that builds our whole cloud-based test environment from scratch: it creates the servers, downloads and installs the platforms, and builds and deploys our software.

Maybe he is talking about end-user software that can be scripted? But even that isn't true for devs.

I think he's talking about how scripting is less and less possible on Mac/Windows, and practically impossible on phones, which is where the bulk of computing will be done in the future.

Be careful there - the bulk of computing is always in the background - the consumption of the results of that computing may move towards phones.

Take banking as an example, pulling up your bank balance, or making an online payment, is nowhere near the computing power of actually balancing the transactions at all banks across the world each night. You get a tiny sliver of a tiny report of the computing that happens in the background.

Entertainment, personal tools, communication - those are moving mobile. But that is just the surface.

History lesson: It never was possible on 3270 smart terminals, either.

We've had a shift back to centralized data storage and processing.

There were a few decades - the 1980s & 1990s & 2000s - where the canonical master of a data set might have been local to the user, but cloud computing has moved data back to DCs for centralized storage & processing. And we are using very smart terminals.

Even in those years, the client/server model was the norm. Except for people using MS Access, or similar fiascos, there was always a centralized data store.

That doesn't invalidate the claim that scripting is dying. Sure, we tech folk can (and do) script all kinds of things, even more so in the enterprise IT world, which practically lives and dies by PowerShell. But back in the '90s in particular, there was such a thing as "power users", who would script the crap out of their systems. That doesn't really exist so much anymore. The people in that grey area between end users and coders are now more likely to be professional business analysts. The days of a true end user creating their own custom scripts to merge data and functions from disparate apps are getting rare.
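The kind of thing that '90s power user did, merging exports from two different apps, is still only a few lines of scripting. A sketch, under the assumption that each app can dump CSV with a shared key column (the function name and column names are illustrative):

```python
import csv
import io


def merge_on_key(csv_a, csv_b, key):
    """Join rows from two CSV exports on a shared key column.

    Rows from csv_a win on conflicting columns; rows of csv_a with no
    match in csv_b are kept as-is.
    """
    rows_b = {row[key]: row for row in csv.DictReader(io.StringIO(csv_b))}
    merged = []
    for row in csv.DictReader(io.StringIO(csv_a)):
        other = rows_b.get(row[key], {})
        merged.append({**other, **row})
    return merged
```

Whether a non-programmer end user would ever write this today, rather than ask an analyst, is exactly the parent's point.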

(Keep in mind, HN is an entire community of edge cases... of course we all can spit out exceptions to this trend.)

Now, I don't think this is a bad thing. On the contrary, most of those people were very hard to support, and the ones who did it well and were easy to support often ended up moving into IT/analyst roles anyway. I think we're in a better place today, with a much larger population of coders and analysts, so end users can just tell us what they need, then focus on their actual jobs.

But the claim that end user scripting is dying... Yep, I do agree with that much.

> scripting is less and less possible on Mac/Windows

[citation needed]

One of the first things I install on any Windows machine I'll be using regularly is Python. It helps "unlock" a lot of the computer's potential (through scripting, obviously).

On Linux machines I don't have to install because it's already there :P

I got the feeling they were mourning GUI-environment macro-type scripting, not bash/python-type console scripting.

i.e., ARexx, and the stuff Mac OS does (or used to do) with traditional desktop apps.

Just from reading the title, I thought somebody had finally put into good prose what I've been thinking for years: the push towards more secure systems will kill scripting.

Can you script copy and paste, or move the cursor? It will be used for nefarious purposes; block it. Can you install and run apps as a user from your home directory? That's a security hole; it should be disabled. Can you read windows' titles and move windows around? That's an information leak; it should be plugged. I like secure systems as much as anybody, but I'm worried that this will interfere with my pleasantly automated computing life.

But this article is about something else, so I will stop the rant here.

"And so it’s sad to see scripting die out as the popular platforms for application development fail to support it. Instead of the personal control of the script – I will take this information from that app, and put this part of it in that app – we have the corporate control of the API. This app maker and that app maker are BFFs, sign in here to let them share everything. After all, they know best."

Given the author's background along with this paragraph, I'm going to go ahead and guess that he's referring to iOS.

Lately I've been thinking about optimal UIs for highly competitive environments (military, day trading, fraud reduction), and I've come to the view that the majority of these interfaces should be extensible command-line interfaces; given the plethora of easy-to-install open-source software, I largely think the CLI of choice should be a Linux terminal shell. Some I could also see being delegated to a spreadsheet, or a mixture of a spreadsheet and a terminal shell, but in general I think SQL wins against SUMIF + macros + Solver.
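To make the SQL-beats-SUMIF claim concrete: a SUMIF-style aggregation over every category at once is a single query against an in-memory SQLite database, and unlike a macro it composes with joins and filters. A minimal sketch (table and column names are made up for illustration):

```python
import sqlite3


def sum_by_category(rows):
    """SUMIF equivalent: total the amount column per category, for all categories."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE trades (category TEXT, amount REAL)")
    con.executemany("INSERT INTO trades VALUES (?, ?)", rows)
    cur = con.execute(
        "SELECT category, SUM(amount) FROM trades GROUP BY category")
    return dict(cur.fetchall())
```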

But if this is the optimal UI for highly competitive environments, why isn't it also the optimal UI for uncompetitive environments like retail?

Two reasons:

1. Self-selection. Highly competitive environments attract extremely intelligent people. Abstractionists can assemble a breadth of solutions that systematize the work, while workers in fields like retail are generally less adept.

2. A constriction on the UI is similar to a schema constriction. Aside from the occasional "General Notes" field on an intake form, the interaction can be structured more reliably than it could be for a person at a terminal shell. This matters partly because less-skilled people can't be trusted with higher-level decisions like data structure.

I don't think #2 is a hard limit. UIs can be designed around the limitations of human intelligence, and systemic educational changes could make CLIs friendlier to the general public; but there is a local maximum that is more easily reached with software like Tulip*, and so that is what we have now.

* http://tulip.io/

"NeXT had services, in which applications could publish menu items that became available in other applications where the two were using the same data." -- am I mistaken, or doesn't Android have a similar concept in "intents," which allow different apps to send data to each other?

Yeah, that sounds like Android intents (http://developer.android.com/training/sharing/send.html) and content:// style URIs. There are all sorts of really cool IPC (inter-process communication) mechanisms across the various OSes, such as Unix domain sockets, D-Bus, and ØMQ.
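Of the IPC mechanisms listed, Unix domain sockets are the simplest to demo: in Python, a socketpair hands you both connected endpoints in one process. A minimal sketch (real intent-style messaging would of course run across separate processes):

```python
import socket


def roundtrip(message: bytes) -> bytes:
    """Send a message across a connected socket pair and read it back."""
    parent, child = socket.socketpair()  # two connected endpoints
    try:
        parent.sendall(message)
        return child.recv(len(message))
    finally:
        parent.close()
        child.close()
```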

I think scripting is alive and well. I've seen self-described non-programmers do all sorts of interesting things with Excel that I would have done with BASIC in 1981.

Scripting is experiencing a huge resurgence, just look at the prominence of Lua in the ML community.

Never thought writing WoW ui add-ons would prepare me for a 6 figure job.

I think the OP is bemoaning a paradigm shift - cloud computing, with all the implications - without fully understanding it. The best modern applications are built of APIs, and they are composable. It's practically my personal working definition of "best" (certainly popularity isn't the key). Perhaps the author had a bad experience with a few popular services.

It sucks that GUI scripting is dying on the vine on Mac, but it still sucked even when Apple was trying to support it with AppleScript. It was always kludgey. BeOS seems like the only OS that made apps that were by default scriptable.

All very true. But what to do about it? The hubris, that is.

Can you provide an example of this hubris?

Utter garbage click-bait and FUD. "Scripting" in the form of writing code in high-level programming languages is not going away anytime in the foreseeable future.

That's not the definition of scripting the article was referring to. The article was referring to automating tasks, which is usually, but not necessarily, accomplished with "scripting languages".

Powershell is fantastic for that kind of thing in the Windows environment...

Which is absolute garbage considering how heavily that very thing happens.

Then that's still very, very easy and doable with the Windows Task Scheduler, crontab, and many, many other applications built around task automation and orchestration.

Even in that context, the article is a pile of fear-mongering horse shit.
