The First Time I Used Programming At Work (justinkan.com)
222 points by maccman on May 16, 2012 | 85 comments

There are also many people in biology research who do these kinds of repetitive tasks on a daily basis for hours without realizing that much of it could easily be automated.

While we're sharing horror stories, a shining example from my experience is a friend who spends days analyzing western blots from experiments: it comes down to identifying a set of 10 rectangular black patches on a clear white background and measuring their average brightness. The standard solution is a software package that lets you open each .jpg file one by one (there is a folder with hundreds); you draw little rectangles and it tells you the average brightness inside. Then you write the numbers down in an Excel spreadsheet one by one, alt-tabbing back and forth.

After a brief facepalm moment I opened MATLAB, wrote ~30 lines of code and processed all images in the folder automatically, dumping all average brightnesses into a giant matrix in 1 minute of runtime. Several days of work saved that could go to curing cancer instead of filling spreadsheets.
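The MATLAB script itself isn't shown; a rough sketch of the same batch idea in Python (the image representation and rectangle coordinates here are placeholders — in practice you would load each .jpg with an imaging library) might look like:

```python
# Sketch of the batch-brightness idea: images modeled as 2D grayscale
# arrays, rectangles given as (top, left, height, width).

def mean_brightness(image, rect):
    """Average pixel value inside rect = (top, left, height, width)."""
    top, left, h, w = rect
    total = sum(image[r][c] for r in range(top, top + h)
                            for c in range(left, left + w))
    return total / (h * w)

def process_folder(images, rects):
    """Return a matrix: one row per image, one column per rectangle."""
    return [[mean_brightness(img, rect) for rect in rects] for img in images]
```

The whole "giant matrix" then falls out of a single nested loop over files and rectangles.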

At one of the first computer conferences I went to (HAL2001) there was a talk on "wetware hacking". The guy giving the talk was a neuroscientist, I believe, and he was literally begging the programmers in attendance to give his field a try, since they had so many programmable tasks and no one able or willing to do them. Unfortunately he fell short of giving advice on paths into the field if you aren't already in academia.

The problems these people are solving, though, are definitely things that I and a lot of my friends would find interesting problem sets. I think it's just a matter of making them accessible.

I work as the IT guy at a biological institute (fixing Windows boxes).

I don't know for sure what repetitive tasks the graduates there do, but I am pretty certain that there are some.

But there simply is no structure in place that would allow me to work for them. I could do it pro bono, but honestly I don't have the time for it. On the other hand they simply don't hire programmers to automate tasks, that's just not part of their budget.

And of course, the reason it's not part of their budget is because they are wasting money on boring stuff.

Have lunch with some of them. Schmooze with them the next time they call you for a hardware/software fix. Point out a single, simple way they could do something more easily by scripting it. Then recommend that the next time they want someone to do some research, they instead hire an undergrad who can write code. Undergrads are cheap, and will get them thinking about what they can automate.

These "wet" labs usually do not have many people who are enthusiastic about computers in the programming, text-only sense. Usually there are one or two people who know some stuff, and they do all the technical work for everyone else. It's not uncommon to find people who are actually quite computer-phobic in these labs. The general interest in computers is preferring Macs to PCs, and that's about as far as it goes. If you start talking up some programming solution to the wrong person, you may actually alienate them. The problem extends to much of the software that is written, too. There are some command-line suites that do batch processing, e.g. for sequences, but much of the software is point-and-click. And that's the problem. No one expects the typical lab user of these programs to be versed in using the command line and doing batch jobs. Everything is mouse-driven. Even if the back end is powerful, the front end is always a slow GUI.

This just slows everything down.

Another shining example of this sort of thing is audio engineering. You have programs with GUIs designed to look like physical pieces of gear. You move the mouse around to turn knobs, flip switches, drag sliders, and adjust settings. Extremely cumbersome. But this is how they expect users to be comfortable. Then companies started selling physical consoles with real knobs, switches and sliders to control the software. It's comical.

The touch screen trend is not going to help people get comfortable with the command line. And the key to really fast computing is using a command line and batch processing. There is no way around it. When you are pointing and clicking, you are just sending a signal that began as a text instruction in the code anyway. Why not just start with the text command and skip the point and click?

It's comical.

Only to someone who's never mixed a song.

Professional audio engineers do their work by FEELING the position of the equipment, making adjustments, and LISTENING to the results. And they expect the feedback to be instant -- just like playing a musical instrument.

Now put them in front of a command line and imagine if they had to type out their intents, listen to the results, and then make tweaks to their typed-in textual program to effect some change in the audio output. You'd end up with a bored, pissed, unproductive audio engineer. The sliders and switches made digital audio professionals MORE productive, not less -- even more so when they became controls on a physical console with digital outputs, as it put controlling the digital gear back in the analog realm a musician is already familiar with.

I find it also interesting that lab environments and music are where plenty of programming takes place by people who are not occupational programmers -- and much of it is visual, in environments like LabVIEW and Max/MSP. This is the future of programming. It is why we should listen to those who say we are too bound to a textual representation for our programs and that in order for the craft to evolve, a richer representation must be embraced.

You can't turn two knobs at once in a live performance with a mouse.

Did you share your solution? Imagine them applying your method to some test cases and getting the same results, within minutes, for what used to be days of work.

Thing is, the programming in this case is trivial; but if it never clicks for those guys that what they're doing is easily automatable, they won't go looking for a solution anyway.

It's only trivial for someone who has used MATLAB, taken a machine learning course, or otherwise has similar experience. I'm sure the biology people know this is automatable; it's the "easily" part that's the problem.

Actually, I don't think I agree. It's quite fascinating but the friend I mentioned is not very good at telling the easily automatable tasks apart from those that aren't.

On a later occasion the friend came to me again with a different problem to solve, but this time it involved spotting dead cells in a mix of cells, in images taken with a microscope. They had to measure the ratio of dead cells. The distinction involved properties of texture and size that could be easily explained, and I could make it with near-perfect accuracy with little training, and yet it would be much, much harder to automate this task. The cells were tightly packed, there were scratches and noise on the image, brightness and contrast variations, distortions... It would involve training a model, computing features, cross-validation, etc. That did not seem obvious at all to them, and it took a while to explain. I'm not even sure if they got it. After all, both tasks are very repetitive and boring, so what's the difference?
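A toy illustration of the classify-by-features pipeline described above — the feature values and labels here are entirely made up, and a real system would also need segmentation, many more features, and cross-validation:

```python
# Extract a couple of hand-crafted features per cell (e.g. size, texture
# score), then label each cell by the nearest class centroid in feature space.

def centroid(points):
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(len(points[0])))

def nearest_centroid_classify(train, sample):
    """train: {label: [feature vectors]}; returns label of closest centroid."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    cents = {label: centroid(vecs) for label, vecs in train.items()}
    return min(cents, key=lambda label: dist2(cents[label], sample))
```

Even this trivial version makes the contrast visible: the blot task needs no training data at all, while here you cannot even start without labeled examples.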

To address your point, you're right that I wouldn't expect my friend to do what I did if they had only read a "learn how to program in x days!" book, or even several similar books over a few months, or if they knew how to write simple scripts in Python or something. But what I would at least hope for is that they realize that this kind of task is very easily automated, and have a very basic understanding of what it would involve. Perhaps labs should come with computer-scientist hacker floater technicians?

Exactly. It's probably not even that common for biology groups to have a programmer doing work for them, so they don't often get that chance. That's why we need more interdisciplinary research groups / teams.

Sorry, I didn't mean trivial like "my mom can do it" - but like karpathy below understood.

Something I find amusing is that the ability the author demonstrates here, to notice that a task is repetitive and has a programmatic solution, is itself a skill that has to be learned, and one that's different than programming.

Once, when I was working on a group project in an operating systems class, we modified our kernel malloc and free routines to print the relevant address every time they were called, to do some simple leak-checking. The result was a file of a few thousand lines like "malloc 0x94583900", "free 0x34739A4", etc. One of my partners looked at the file and said some variant of "oh no - this is going to take hours to match up". Apparently it hadn't occurred to him, someone who'd been programming professionally for years and who only minutes before had been tweaking the internals of a process management function in a multithreaded x86 operating system, that there was an easier avenue available to him.
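Matching those lines up is itself a few lines of script; a sketch in Python of a leak-checker for that log format (the function name is my own):

```python
# Leak-checker for traces like "malloc 0x94583900" / "free 0x34739A4":
# any address malloc'd more times than it was freed is a leak candidate.

from collections import Counter

def find_leaks(lines):
    counts = Counter()
    for line in lines:
        op, addr = line.split()
        counts[addr] += 1 if op == "malloc" else -1
    return sorted(addr for addr, n in counts.items() if n > 0)
```

Run over a few-thousand-line trace, this answers in milliseconds what would have taken "hours to match up" by hand.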

I have a darn similar story, writing a virtual memory manager for OS design class. I was the only one in class who even thought of instrumenting the thing, and then wrote a Python program to read the traces and tell me when frames got swapped to or from the wrong disk pages.

I was also the first one in class to actually finish the project.

There is another side to it, when a programmer looks for a technical solution unnecessarily (when all you have is a hammer, everything looks like a nail).

It's a mindset that's rarely taught: if you have a problem, and you're not writing code to find the solution, you're probably wasting effort.

"I spent 10 hours making a system to perform this 1-hour task in 5 minutes!"

Note that this only pays off if that task occurs frequently enough to offset the upfront programming cost.
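The break-even arithmetic is straightforward; a small sketch using the figures from the joke above:

```python
# After how many repetitions does automation pay for itself?
# Each run saves (manual - automated) minutes; the upfront cost is repaid
# once those savings exceed the time spent building the tool.

import math

def break_even_runs(upfront_min, manual_min, automated_min):
    saved_per_run = manual_min - automated_min
    return math.ceil(upfront_min / saved_per_run)
```

For the quoted case — 10 hours (600 minutes) to turn a 1-hour task into a 5-minute one — the tool pays off on the 11th repetition.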

Yes. But you can also waste lots of time creating code, when just doing it quick, dirty and manual is faster.

I agree.

But I'd add that even tasks that only require a quick-and-dirty solution can frequently benefit from quick-and-dirty automation. A few throw-away lines in an interpreter environment like irb, for example, can often shave several minutes off certain tasks, and throwing a little bit of code into such tasks can certainly make some otherwise dull processes more interesting.

I remember the first time I used programming at work too. I worked in a marketing agency as a PC operator. The two tasks I had were:

* print enormous text files with tables showing survey data.

* put a long list of .gif files each one on its own .ppt file (yes, as ridiculous as it sounds).

For the first problem I wrote two little Pascal programs. The first measured the files and calculated the best settings for the printer (it used to be a trial-and-error process). The second added page numbers to the files, so when the printer got stuck (or the pages got mixed up) it was easy to resume instead of starting over.
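The Pascal source isn't shown, but the page-numbering trick can be sketched in a few lines of Python (the page length and marker format are assumptions):

```python
# Stamp a page marker after every fixed-size block of lines, so a jammed
# print job can be resumed from the right page instead of restarted.

def number_pages(lines, lines_per_page=60):
    out = []
    for i, line in enumerate(lines):
        out.append(line)
        if (i + 1) % lines_per_page == 0:
            out.append(f"-- Page {(i + 1) // lines_per_page} --")
    return out
```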

For the second problem, I wrote an MS Office macro.

I stopped programming at the office because one of my coworkers asked me to stop because "we are all going to be out of work" (sic).

That is when I learnt the importance of knowing how to code.

"one of my coworkers asked me to stop because 'we are all going to be out of work'."

This is the argument of the anti-technologist, sometimes called the "Luddite fallacy" [1].

Somehow these people think that the time freed by automating menial and repetitive tasks can't be used to do more productive or additional tasks, or heighten the ability and/or education of the workers.

[1] http://en.wikipedia.org/wiki/Luddite_fallacy

While that's true, if you look at how the surplus of productivity generated by technology was distributed among the members of society in the last 30 years, you will notice that, maybe, she was right.

Who benefited from the increase in productivity I generated with the automation I made? The owners of the company.

I automated an entire department down to a single person (me). This was during a severe downturn for the business, with layoffs left and right. I didn't quite feel I was putting others out of work, as two jumped ship and one was retired out (comfortably).

I even significantly improved timeliness and accuracy, and I absorbed a major change in inputs that likely could not have been accommodated under the old system. I turned one of the high-priced consultants on to a reporting package that greatly improved their life feeding upper management endless varieties of ad hoc reports. Etc.

And... this senior management, brought in from outside due to the share price, set up separate reporting channels using their people, and eventually laid me off.

These days, with so much top-down control, it's often a matter of who you know more that what, and how you are perceived. I was very good at what I did, but from their perspective, I was a "grunt" and a replaceable cog.

In the mainstream work environment, you can indeed work yourself right out of a job, especially if perceptions of you don't match a profile that Management is used to respecting and promoting.

I know I often come across as cynical, here. And certainly, I bring my own faults to the table. But... experience has shown me that these days, still, image often outshines talent in the job market.

When I envision "Everyone can program." this sort of thing is what comes to mind. Not people building cathedrals out of notepad.exe and a python REPL.

Even little nuggets from programming, gifted in the right way can radically change things for non-programmers. I introduced a friend to regular expressions and Textpad so he could cleanup the output from a news reporting system he used (excess line breaks, weird characters, that sort of thing). It turned a manual task that took him the bulk of his time into something that took seconds.

When I showed him a related tool that could hunt out and tag terms of interest using the same concept (phone numbers, email addresses, etc.) it completely changed his life. He ended up using those two little kernels of wisdom to mostly automate a previously entirely manual research process and now leads a team of 20 researchers in his field.
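The exact Textpad patterns he used aren't given; in Python's `re`, the two tricks — collapsing excess line breaks and tagging terms of interest — might be sketched like this (the `<email>` tag format is made up):

```python
# Two small regex utilities of the kind described above: text cleanup,
# and marking up items of interest found in free text.

import re

def collapse_blank_lines(text):
    """Replace runs of 3+ newlines with a single blank line."""
    return re.sub(r"\n{3,}", "\n\n", text)

def tag_emails(text):
    """Wrap anything that looks like an email address in <email> tags."""
    return re.sub(r"[\w.+-]+@[\w-]+\.[\w.]+", r"<email>\g<0></email>", text)
```

The same `re.sub` pattern generalizes to phone numbers or any other term of interest by swapping the expression.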

He's not a programmer or even a scripter, but he uses regexes pretty much every day in his work.

In other words, "Everyone can script", rather than "Everyone can program". Where by "script" I mean to write a program, but for ad hoc tasks, and on a relatively small scale.

I use the word "program" pretty loosely. To me, a script is a program.

But for what it's worth, I think a working knowledge of the GNU utilities is worth more to most people than most programming languages. (If you're on a system that supports *nix utilities.)

Ideally it's not an either or thing.

I know that a script is a program. That's why I defined “script” as “write a program […]”. I included my definition of “script” because I was worried about people misinterpreting me… but it seems people misinterpreted me anyway.

With my comment, I wasn’t trying to contradict your original comment. I was just trying to reword the concepts to be more specific. “Everyone can script” was meant to be a more specific version of “Everyone can program” that might convey the concept better to those who shared or at least understood my definition of “script”.

How would you recommend gaining that knowledge?

This free book:


The best way I know short of immersing yourself in a community of *nix geeks for a long period of time.

If you were looking for something more grandmother or time friendly, I'm afraid I can't help you.

There is really no difference though. Whether you 'script' away a mundane task with AppleScript or an elaborate C 'program' you still wrote down a series of instructions that were run by a machine. That's the definition of programming in some dictionaries.

I didn’t mean that “script” was an alternative to “program”. I meant that it was a subset of “program”. I tried to convey that by defining “script” as “write a program […]”. So we already agree.

O HAI! It's Slashdot 2001 again.

Historically there was a meaningful distinction between scripting and programming: A script is interpreted, so you trade off slow execution for no compilation. A program is compiled, so you get fast execution but slow compilation.

Today (as in 2001), both compilation and execution are so fast that for most use cases it doesn't really matter which you do. JITs further wash out the distinction.

My definition of scripting is slightly different.

Scripting, as I learned, is being able control a host program in not-so-trivial ways with an embedded programming language, where the host isn't really dedicated to running programs. Consider Lua embedded in World of Warcraft, JavaScript embedded in the browser or even mod_PHP for Apache.

It's all still programming though.

What’s the difference? I can’t see one.

I think that people are trying to coin separate terms to differentiate the amount of knowledge one needs to complete the task. E.g. writing some automation in Python is easier than writing the same automation in C due to a lack of needing to understand things like character arrays and malloc. In reality, it's all programming, just via different interfaces.

> Where by "script" I mean to write a program, but for ad hoc tasks, and on a relatively small scale.

By that, I meant that a “script” is a specific type of program. “Scripting” is a subset of “programming”. Any type of program is a “program”. To be a “script”, a program must be for an ad hoc (one-time) task, and small relative to other programs. At least, according to my idea of the definition – though it is unclear what the “official” definition is.

I fail to see the difference.

My mother was a computer science professor and was active with ACM's SIGGRAPH for most of my early childhood, but somehow I got through high school never learning how to write code. I did, however, learn that laziness can sometimes be a virtue.

In my second job out of college, I was running the sound board for a national talk radio show. We broadcast three hours a day, and I was tasked with getting recordings of each day's shows uploaded as a podcast with the commercial breaks edited out. Being far too lazy to want to do this by hand, and recognizing that we broadcast on a "hard clock" (a commercial break would run every day from 2:06:55 to 2:10:25, for example), I decided it was time to learn how to let a computer do the work for me.

Fortunately, a friendly coworker who was a hybrid of broadcast engineer and sysadmin pointed me toward bash scripts and cron jobs. Figuring it all out was a frustrating couple of weeks, but I certainly didn't regret it.
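The original bash scripts aren't shown; one way to sketch the hard-clock idea is to compute which segments of the recording to keep, given the fixed break times, and then feed those to an audio cutter from a cron job (times in seconds, function name hypothetical):

```python
# Given a show length and a fixed schedule of commercial breaks,
# return the (start, end) segments of audio to keep.

def keep_segments(show_len_s, breaks):
    """breaks: list of (start_s, end_s) commercial windows."""
    segments, pos = [], 0
    for start, end in sorted(breaks):
        if start > pos:
            segments.append((pos, start))
        pos = max(pos, end)
    if pos < show_len_s:
        segments.append((pos, show_len_s))
    return segments
```

Because the clock never moves, the break list is a constant, which is exactly why a dumb nightly cron job can do the editing.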

A few years later, among other duties, I was responsible for the audio archives of a very old musical organization. Mostly this meant knowing how to make decent transfers of DATs, LPs, 1/4" reel to reels, etc. We found ourselves with over 20,000 tracks of original recordings in .wav format and the metadata about those tracks in an unconnected database.

After far too many searches in the library database, followed by finding the desired file by hand, I got busy scripting (and perusing posts on HN that have helped immensely along the way). That project has evolved into a searchable website containing audio, video, photos and print materials of everything in our catalog.

Next week, I'm on my way to Silicon Valley to interview for a programming gig at a company doing very exciting things. I definitely didn't see it happening this way, and maybe it's not the best career move for me, but my life wouldn't be the same if I hadn't realized my laziness could be rewarded by learning how to code.

Good luck next week.


When I was 16 I had a summer job as an intern for my Physics professor. Our main task was to stir these huge vials of very viscous liquid for hours on end.

So I went to the Chemistry lab, got some mechanical stirrers, plopped them in, and played video games all summer. Good times.

Recognizing that automation is possible is very hard. A quick tip, if you can find a pattern in a task, you can probably automate some or all of that task.

My first internship after college had me taking technical tables in old Word documents, and rewriting them as old-style HTML tables. These documents were often thousands of pages of tables, and there were hundreds of such documents.

It took me a while to realize that, over a few dozen tables, I had come up with a manual system that was just a series of copy/pastes of various HTML snippets in and around the copy/paste of the table contents in a text editor. At the rate I was going using my copy/paste method I was processing tables at about 2-3x the rate of the other interns who were hand typing the markup in the table (the average was about 20 tables per day, I was doing a solid 50-60).

Thinking for a minute, I whipped up a script to do the work and my metrics went through the roof. All I had to do was copy/paste the table contents (I couldn't come up with a way at the time to automate that bit of tedium), then run the script. I went from 50-60 tables a day to 6,000-8,000. My boss, who fortunately thought it was pretty cool, had me train the other interns and we finished the project months ahead of schedule.
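The script's details aren't given; the paste-to-HTML step could be sketched like this, assuming the copied table contents come through tab-separated:

```python
# Turn pasted tab-separated table text into plain old-style HTML markup.

def to_html_table(pasted):
    rows = [line.split("\t") for line in pasted.strip().splitlines()]
    body = "".join(
        "<tr>" + "".join(f"<td>{cell}</td>" for cell in row) + "</tr>"
        for row in rows)
    return "<table>" + body + "</table>"
```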

So even if a complete solution wasn't to be found, automating even part of it was a massive boost to productivity.

I am trying to use programming at work, but the thing is seen as mere "IT" and gets no consideration at all. Which is weird. Since we are sharing horrors: at my company (a multinational) it is budget time right now. There is an Excel file with 500 tabs (500!) going around to collect the budget for each cost center.

I mean, when you start clearly overusing a tool, if you call yourself a manager you must realize it. You can't keep saying that your job is not IT when, at the end of the day, your productivity and others' depends on it. Somehow I'm starting to think that the way you do things is as important as the things you are doing.

Bottom line: we will come up with a budget, everybody will work well over 40 hours per week, and there will be lots of input and logic mistakes (most of them will be corrected, others not). But they will be able to check the 'budget done' radio button, and they will be proud of this achievement. I won't, because I will feel stupid spending time doing this kind of thing in 2012.

I've been interning at a power company since high school (I'm a junior in college now -- EE/CpE major) and one of the things I did was taking log data for a bunch of events in csv format, filtering the irrelevant data, and formatting it to make it more readable by humans. Since I had to repeat the process a bunch of times, I spent an hour or so writing an Excel macro that let me select a few examples of what was relevant and then proceeded to do all the hard work. I told my supervisor about it and she had me write up documentation for using the macro and then shared it with everyone in the department to save everyone's time.
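The Excel macro itself isn't shown; the same filter pass over csv event logs can be sketched with Python's csv module (the column names here are hypothetical):

```python
# Keep only the relevant rows of a csv event log, ready for reformatting.

import csv
import io

def filter_events(csv_text, keep_types):
    """Return the rows whose event_type column is in keep_types."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [row for row in reader if row["event_type"] in keep_types]
```

The select-a-few-examples interface the macro offered amounts to building `keep_types` interactively instead of hard-coding it.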

That was one good manager!

Excel can be an amazing tool when automated. I am amazed that more companies don't train their employees on task automation (Excel or not).

A few years ago I had a situation while working on a circuit board design. The board used huge FPGAs with over a thousand pins each. Creating and managing the schematic and PCB symbols for these chips took days of agonizing work. I finally had enough and wrote a tool that used Excel for data entry and editing. After about a month of work we could take an FPGA from data sheet to EDA files inside of 30 minutes. The tool most definitely paid for itself many times over as it saw use over several years.

That's the one thing that is outstanding about MS Office: The ability to automate across tools is fantastic.

At this place where I'm currently working they have this legacy system which logs transactions, and sometimes there are lines missing from the logs.

Early on these guys figured out they could recreate the missing lines using the data they did have.. the problem is that it's the most horribly tedious process imaginable, involving lots of calculations and comparisons.

One day one of the guys runs into a problem where 3-4 of these transactions hadn't been logged properly. I didn't have much to do that day so I opened up Visual Studio and got to work. Field by field I figured out what the calculations were supposed to be and then put them down in code.

Saved this guy hours of work that day, and we save ourselves time every time the problem has cropped up since.

Oh to be a programmer.

The IT manager that didn't believe in him is the type of person who can't embrace technology, and when you don't embrace technology, or at least try, you end up getting fucked. She could have told him to prove it, show an example, etc., but instead she automatically dismissed it, and he got the better end of the deal by getting to be lazy for a few days. If she had looked at, commended, and appreciated what he spent effort trying to improve, it would have had a better outcome for both parties; instead, he was the only one that got to take advantage of it.

Yep. I got sick of entering the same data six times in spreadsheets at work. I made an Access database. Added VBA to make it better. Ended up hitting the limits of VBA, so I made a full client app in VB.NET. Ended up importing data from a remote system using an AutoIt script. I was still an admin assistant; no one would promote me.

That was the beginning, I never looked back.

Wonderful post. I love that he solved the problem, and since he received no reinforcement, kept the success to himself and did what ever he wanted while the program did his work. Win X 5

I had a very similar experience! I used to work holidays for the family mail order business.

While we could easily process up to 400 small orders a day (because the warehouse was well organised), we maxed out at shipping around 6 large trade orders per day. Each one took an hour, and the courier collection was at 3pm.

After getting through a particularly manic afternoon, I decided to have a look at why they took so long.

I realised it took 25mins just to turn the mailorder spreadsheet into an invoice to mail out with the parcel.

I was dying to automate it! I made a macro to remove the empty product lines, format the layout, background & colour, position the delivery address, insert the company logo and information and generate an invoice number and mark it as paid.

I even future-proofed it so that it would still work even if we added new products to our order form.

I stuck a big smiley face button on the tool bar. I tested it out, invoices now took around 4 seconds. The screen sat there with a beautifully formatted invoice, cursor flashing patiently for me to enter a payment reference and hit print.

Basically that one small change doubled the trade order capacity of the business, which was a LOT of income. It's an awesome feeling :D

That was about 8 years ago, I think they still use it now!

The sad part is that they probably went back to doing it by hand once he left the company.

Also amazing that "normal users" consistently don't even want help with automating their IT tasks. Maybe they are worried about losing their jobs?

There's also the issue that manual processes tend to have a consistent rate of small mistakes, whereas automated processes are more likely to have very occasional gigantic fuckups without warning.

Or they just don't know what is automatable? Or they are worried they won't understand the new system?

In that story the boss didn't want the automation, and I've had the same experience whenever I offered to automate tasks for somebody else.

Really I think we just need to use the term Automation for this, rather than "coding" or "programming".

My take on this latest Jeff Atwood fiasco[1] is that yes, I certainly agree with him that "coding" should not be lumped in with the same concepts as general literacy and numeracy.

But certainly automation using a computer should be, and there are a range of ways available to teach people how to do that.


That's really a distinction of degree, though. It seems like the only real difference is whether you're programming Python, Photoshop, Excel, bash or whatever, and the power of the tools available to you. I'm not sure that whether it actually involves writing text to be interpreted is important.

I think it's no more a distinction of degree than algebra (which everyone is expected to know) is from calculus (which most high school graduates are exposed to), or from Fourier transforms and vector calculus (which 2nd-year university students are expected to know).

I think something like "basic automation" provides the same type of introductory computer literacy as algebra does for maths, and "learning how to program" is similar to a continued education in maths.

Different syntax/symbols, different levels of power, yep, pretty much the same deal. Different levels of math are, in a practical sense, different degrees of the same thing. There are different kinds of math and there are different kinds of programming/automation, sure, but it mostly feels like "this is harder/easier/more intense".

"Automation" may well be a good bait word for luring in unsuspecting non-nerds, but it's still lightweight programming, a subset. I think at this point we're in agreement.

Before I started studying computer science in university, I was working as a laboratory assistant in a paint coloring laboratory. One of my tasks was to do colorant characterizations for our custom-built application (the company had one programmer) so the software could approximate formulae for hardware stores.

This was really manual work. You needed to do different mixtures with the colorant and the paint to get an even spectrum of colors from light to dark. Everybody just started to do them randomly, calculating approximate mixtures, doing draw-downs, measuring and repeating until the result was good enough. I started playing around with excel macros and studied how Kubelka-Munk formulae worked. I even realized that there are lots of colorants using the same pigment, but with different strength.
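The single-constant Kubelka-Munk relation mentioned here links a measured reflectance R (between 0 and 1) to an absorption/scattering ratio K/S, which is roughly additive across colorants — that additivity is what makes formula prediction from mixture data tractable:

```python
# Single-constant Kubelka-Munk relation: K/S = (1 - R)^2 / (2R),
# where R is the measured reflectance of an opaque draw-down.

def kubelka_munk(reflectance):
    return (1 - reflectance) ** 2 / (2 * reflectance)
```

With K/S values in hand, colorants sharing a pigment differ only by a strength factor, which matches the observation in the comment above.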

Combining these numbers with some old measurements, I built a big Excel file with macros where you just type in the pigment data and it prints instructions for what to mix and how much. This saved several days in the process, and finally they let me learn Delphi and convert the Excel file into a real program.

I wouldn't want to read that code nowadays, but I think they're still using it in the laboratory.

Yeah, it's really useful to be able to automate tasks, especially when you're the intern that gets handed the boring task the higher ups don't want to do. An intern's job is basically to do the lowest level tasks that computers can't do. And yeah, sometimes (ok a bunch of times) your superiors have no idea how simple that task is on a computer. However, other times the task is just complex enough and happening rarely enough and your skills are not good enough to justify an automation scenario. Then it's really boring. I would say learn programming to raise your complexity bar on repetitive tasks.

Know your tools. I had a colleague who needed to xy-plot some data from an instrument dump in Excel. Problem was, the data and timestamps weren't one-to-one. He was looking at a few thousand rows of timestamps with almost random gaps between instrument readings. He'd psyched himself up to spend the next day right-clicking 'delete row' to get his data set. Select column, Go To Special, Blanks, right-click, Delete Rows. Seriously, the guy almost had tears in his eyes. I'm with the comment "learn how to automate using scripts" — it should be a core skill.
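What the Go To Special trick accomplishes can also be expressed in a couple of lines of script — drop the rows whose reading cell is blank so the data and timestamps line up:

```python
# Drop rows with blank readings from (timestamp, reading) pairs,
# the scripted equivalent of Excel's select-blanks-then-delete-rows.

def drop_blank_rows(rows, col=1):
    return [row for row in rows if row[col] not in ("", None)]
```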

I liked this post a lot. I notice that almost everyone mentions the time saved by automating tedious tasks, but not how such automation eliminates user error. For me, that's at least as important for maintaining peace of mind.

there was an assignment to print 10,000 random numbers.

a guy used excel, put a formula into cell A1, and dragged it to A10000. file-print ... presto ... wait for 200 pages or so

i wrote a one-liner in matlab and printed the result in about 20 pages

i just happened to know a little bit of matlab -- and that's all it took to 'beat' the competition by 1000%
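The trick is just reshaping the output so many numbers share a line. A sketch of the same idea (not the original MATLAB one-liner):

```python
import random

# 10,000 random numbers, 10 per tab-separated line: 1,000 lines
# of output instead of 10,000, so far fewer printed pages.
nums = [random.random() for _ in range(10_000)]
lines = ["\t".join(f"{x:.4f}" for x in nums[i:i + 10])
         for i in range(0, len(nums), 10)]
print("\n".join(lines[:2]))  # preview the first two lines
```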

The way I see it, you came out about the same, both getting it done in under five minutes. The Excel guy could have dragged from A1 to J1000 if he cared about page count.

I wonder if this would have worked for your assignment.


This is by far the best entry in the "learn/don't learn to program" series!

ice t would be proud of you http://www.youtube.com/watch?v=mAlzPgXb6rE

This is a pervasive problem, one of the prime examples of why knowing the basics about algorithms and programming is a great skill to have.

Many people do not think about whether they can automate certain things. Even if you tell them to, they will often not have the right mindset for it and be unable to recognize that something is automatable.

This is not the fault of those people, it’s a fault of education. Automation is not intuitive, it’s not something humans understand instinctively. To know what works and what doesn’t, to know what’s possible, people need to learn the basics about how to code. (Maybe some will even be able to do some of it themselves, while others can at least ask around for an actual implementation.)

No, it’s not the solution to the failing education system in the US. No, it’s not the best thing since sliced bread. It’s not the savior. What it is, though, is just a good idea.

> This is not the fault of those people, it’s a fault of education. Automation is not intuitive, it’s not something humans understand instinctively.

Actually, I think a large number of people who are good programmers figured out automation on their own, meaning the education system is almost entirely useless when it comes to this. We all have our own stories for this moment of enlightenment - mine was writing TI calculator programs to "automate" problem solving in middle school algebra.

Well, yes, "people who are good programmers". The point here, though, is that the others should get an idea about this too. There will always be people who don't understand even the easiest of procedures, but I am sure a large number of people would understand automation if the idea were given to them (at an early age). They would probably then think about it when the need arises, and maybe come up with some creative solutions. If we just leave it to the naturally good programmers, the solutions may be better, but their number will be lower.

I don't think it's algorithms specifically. Even a naive algorithm is going to be better than doing some rote task by hand.

That’s not really what I meant. Most people do not know what algorithms are in the first place. That’s what you have to teach. You would tell them that some of the most common tasks are searching and sorting, for example, and then show different ways of doing it. It’s not about picking out the best algorithm, it’s about showing which types of problems are usually solved with algorithms and also explaining some of them, maybe also in a second step how those algorithms can be combined to solve more practical and complex problems.

That way people learn what’s possible. But you are of course right, in most cases even a naive implementation is better than manual labor.
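To make the searching example concrete, here is the kind of side-by-side you might show: the obvious linear scan next to a binary search on sorted data (a minimal sketch using Python's stdlib `bisect` module):

```python
import bisect

data = sorted([91, 3, 57, 8, 23, 42, 15])

def linear_contains(xs, target):
    # Linear search: check every element, O(n).
    for x in xs:
        if x == target:
            return True
    return False

def binary_contains(xs, target):
    # Binary search on sorted data: O(log n) via bisect.
    i = bisect.bisect_left(xs, target)
    return i < len(xs) and xs[i] == target
```

Both give the same answer; the point of the lesson is that knowing the sorted-input assumption buys you the faster one.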

I think the most important part of this post is "Well, fuck that."

Update: Actually it's this part: "Well, double fuck that."

I had this same moment at work not long ago. My boss gave me a task to compare user permissions exported from a directory (not a database). The output was ridiculous, and it could easily take 10 hours of lining up and correlating to work through just 5 users. They had been doing this by hand for 6 years before I got there. Within a few minutes of her explaining it to me, I flat-out said "I don't want to do this, so here's my plan". 30 hours of Bash scripting later, I had the task completely automated, and we could run through 20 users in less than 2 seconds, with the output going straight to Excel.

You can't be afraid to tell your boss you think a task is ridiculous, but you'd better be prepared with a better solution.
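The core of that kind of comparison is just per-user set differences. A toy sketch (the "user:permission" export format here is hypothetical; the real directory dump was no doubt messier):

```python
# Hypothetical "user:permission" export, one pair per line.
old_export = "alice:read\nalice:write\nbob:read"
new_export = "alice:read\nbob:read\nbob:admin"

def parse(dump):
    # Group the flat dump into {user: set_of_permissions}.
    perms = {}
    for line in dump.strip().splitlines():
        user, perm = line.split(":")
        perms.setdefault(user, set()).add(perm)
    return perms

def diff(old, new):
    # Per-user sets turn "what changed?" into a set difference
    # instead of an afternoon of eyeballing rows.
    report = {}
    for user in set(old) | set(new):
        added = new.get(user, set()) - old.get(user, set())
        removed = old.get(user, set()) - new.get(user, set())
        if added or removed:
            report[user] = (sorted(added), sorted(removed))
    return report
```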

I once worked in an office where, at the start of every month, we'd get a cell phone invoice for 30-some phones listing every call made in the last month in an Excel document. One of the secretaries would then print out the whole thing and spend the next two weeks going through it with multiple colored markers, annotating who had which phone on which days and checking for different classes of violations of the cell phone usage policy.

Another secretary realized he could make it a lot faster by just copying and pasting the data into separate sheets for each person and then highlighting it in Excel. Then he wrote a macro that would automate the highlighting once you'd done all the copying and pasting. This cut the total workload down to a few days.

I was openly derisive of the whole thing and insisted that it was absolutely appalling that humans were spending so much time doing what was obviously machines' work, so I got assigned to fixing it. I ended up with an HTML+JavaScript application that would read the invoice file and monthly phone directories to automatically match names with numbers for every day, calculate policy violations, and write the results to a new Excel spreadsheet in the established format. It cut the human time involved down to 5 minutes per month, and the wait for results down to about an hour.

Once we had the phone records available on the first day of the month, demand suddenly rose for even more detailed and different kinds of analyses. So it turns out we still spent more than two weeks making reports, but only because new ones kept being requested now that they could actually be produced on demand.

I especially enjoyed the fact that he kept it on the down-low. Just use your super-power to do 4 days worth of work in 20 minutes and righteously goof off for the rest of the week. Win.

two words to explain the law firm's reaction:

billable hours

*credentials/disclaimer: I'm a programmer and I worked as a clerk for law firms back in college too.

The hours for this task weren't billed.

He means that the law firm makes money by billing hours; hence if you take less time to do a task, the firm earns less money.

Although in theory they could just raise their hourly rate, if their hours are now more productive.

Yeah, I'm the author and I know what billable hours are. My hours here weren't being billed out to the client.

Surely people aren't paying a lawyer's hourly rate for them to cut and paste around spreadsheets?
