In non-technical companies I see turf protection by IT departments as part of the root cause of this. They set themselves up as wizards who are the only ones able to make the computers do their magic and make the task seem much more difficult than it is. After all, if journalists, lobbyists, customer representatives, and such were able to write their own SQL queries, then Bob the SQL Guru™ would be out of a job. Plus, you would have to give these unwashed masses access to the systems.
I'm being a little harsh to IT departments in my caricature, but my experience has been that 90% of them make the other 10% look bad. IT at my wife's company is particularly bad: if she wants data out of their database, she has to request a database report from a group in Kuala Lumpur (she is in Dallas). She basically has to write the SQL query out in plain English in the request to get exactly what she wants, because from what I can tell that group is just an English->SQL translator. If the data isn't what she wanted, she has to do it all over again. So it takes her days to get what she could get in minutes with her own MySQL client. For other tasks I have offered to show her how to use Python to automate things, but those conversations always end with "IT won't let us install anything". In that kind of environment I'm not surprised that people with a natural reluctance to programming have that reluctance amplified.
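For a sense of scale, here is roughly what one of those multi-day report requests looks like as a self-serve query (a hypothetical sketch - the host, table, columns, and credentials are all made up, and it assumes a read-only MySQL account plus the mysql-connector-python package):

    # Hypothetical example: schema and credentials are invented for illustration.
    import mysql.connector

    conn = mysql.connector.connect(
        host="reports.example.com", user="readonly", password="...",
        database="sales")
    cur = conn.cursor()
    # The plain-English request "total order amount per region since Jan 1"
    # becomes one query she could tweak and re-run herself.
    cur.execute(
        "SELECT region, SUM(amount) FROM orders"
        " WHERE order_date >= %s GROUP BY region",
        ("2015-01-01",))
    for region, total in cur.fetchall():
        print(region, total)
    conn.close()

The point isn't this particular query; it's that the turnaround drops from days to minutes once the person who knows what she wants can iterate on it directly.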
Then when that person leaves, or confidential data gets out, or an OS upgrade screws up the ad-hoc system they created, who's responsible? IT. IT now has to learn about, repair, and support a system it didn't know about or budget for. It's even worse when a non-IT area hires its own programmer who thinks IT is "protecting its turf", and that dev builds some skunkworks thing without any consultation.
If you've ever managed corporate IT, you know how these little systems come up, and you learn why IT wants to control them: the average person has their own job to do, is learning computers on the side, and learns only enough to make something that barely works.
So a little bit of training early on could indeed be a good thing. It would solve many issues. :)
> So a little bit of training early on could indeed be a good thing. It would solve many issues. :)
What we are talking about in the context of the article is a lot of training. It would give non-technical people the ability to go beyond something that barely works. It would increase the number of independent micro-systems, unless IT departments are willing to start really listening to their customer base.
This is unfortunately the kind of thinking I see in people I know who are tasked with system administration. They take pride in their work, setting up infrastructure to run unnoticed in the background, but I've noticed a tendency to simplify the job to the extreme by making sure said infrastructure can't really be used. After all, if no one uses your system, they won't break it and you won't have to fix it. Users, instead of being customers, become adversaries.
I don't know anyone who does that on purpose, but I see the tendency to drift there unconsciously. Restrict this, limit that, block everything. Principle of least privilege. It all makes sense from a security POV, but when applied internally, users have to fight with support just to get their job done. I especially dislike this when it happens in an educational context (schools, universities, hackerspaces) - computer systems there should serve as opportunities to tinker, learn and explore. Strict limitations significantly hinder the usefulness of the infrastructure while saving admins only a little work.
IT groups that aren't large enough to be organizations, say 0.5-3 people, don't seem to have this issue as often or as badly.