Getting started is tricky, but my most profitable side work has been workshops and lunch-and-learns, which works well because I can reuse a lot of the same content between engagements.
Of course it has to be a specialization that enough people need, and are willing to pay for.
You also have to keep a lookout for new areas in case your specialty goes into decline. You don't want to be the Y2K mitigation expert in 2001, or the top performance expert for some legacy or discontinued computer architecture.
> This also applies to software development. If an e-commerce application you are building will bring in $500,000.00 in sales in the next year, then charging 10% to 20% of this amount is acceptable. The same goes with feature implementation. If your feature will save them $200,000.00 in the next year, price with that in mind vs the hourly rate.
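For what it's worth, here's a minimal sketch of what that rule of thumb works out to. The $500k revenue figure and the 10-20% band come from the quote above; the function and its name are purely illustrative, not anyone's actual pricing tool:

```python
# Minimal sketch of the value-based pricing rule quoted above.
# The $500k figure and the 10-20% band come from the quote;
# the function and its name are purely illustrative.

def value_based_price(projected_value, low=0.10, high=0.20):
    """Return a (low, high) quote range as a share of the value created."""
    return projected_value * low, projected_value * high

lo_quote, hi_quote = value_based_price(500_000)
print(f"Quote between ${lo_quote:,.0f} and ${hi_quote:,.0f}")
# -> Quote between $50,000 and $100,000
```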
And how do you know what amount of money they'll make or save? Do you estimate it yourself? Do they tell you?
Some case studies would be enlightening, then: how to do this specifically, using the kinds of information a software consultant (or, indeed, a low-ranked development employee; all are advised to attach a dollar value to their work for later bragging/self-promotion purposes, after all) can readily access.
Maybe you can help them make or save even more.
First: $210k/yr is not necessarily really good money for a US software consultant. When consulting full time, please try to keep two things in mind:
(1) Your cost basis is higher than it was when you were a W2 FTE. If you're making $100k/yr in salary, your employer is paying substantially more than $100k/yr to keep you on staff; you have a "fully loaded" cost that includes not only infrastructure stuff like computers and office space and Google accounts and training and vacation and sick days, but also your benefits and a pretty substantial chunk of taxes, and a bunch of tax planning stuff that your W2 hides from you. You're now on the hook for all of that.
(2) More importantly: employers are on the hook for the fully loaded costs of their employees indefinitely. Well-run companies hire developers with the expectation of keeping them on staff with no fixed end date (run don't walk from any that don't). Which means that the decision to hire a freelancer versus a full-time employee is not simply based on rate; it's also based on the fact that the freelancer comes with a guarantee that the relationship can be severed the moment it's no longer valuable. That guarantee has a lot of value; "double your FTE rate" isn't even stretching it. If you're giving that up for free, by working at a rate comparable to what you'd be making in a good job, you're doing it wrong.
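To make (1) and (2) concrete, here's a rough back-of-envelope sketch. The $100k salary comes from the example above; the overhead share, the billable-day count, and the exact premium multiplier are assumptions for illustration, not benchmarks:

```python
# Back-of-envelope for points (1) and (2). The $100k salary is from the
# comment above; the overhead share, billable days, and premium multiplier
# are assumptions for illustration only.

salary = 100_000
employer_overhead = 0.35          # assumed: payroll taxes, benefits, gear, etc.
fully_loaded = salary * (1 + employer_overhead)   # ~$135k/yr to keep you on staff

billable_days = 180               # assumed: you won't sell every working day
severability_premium = 2.0        # the "double your FTE rate" idea, as a multiplier

day_rate = fully_loaded / billable_days * severability_premium
print(f"Fully loaded: ${fully_loaded:,.0f}/yr -> day rate roughly ${day_rate:,.0f}")
# -> Fully loaded: $135,000/yr -> day rate roughly $1,500
```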
I don't think it ever makes sense to work hourly. I've written a ton of posts here about why that is; here's a link to the one people seem to like the most: https://news.ycombinator.com/item?id=4103417
What I can say with almost 10 years' remove from that post is that I was, if anything, underselling my position on this. When I was beating the drum on not doing hourly work, I was at Matasano, and we had a day rate (you couldn't buy work from us in increments of less than a day). After that, we started another consultancy, where our minimum billable increment went up... uh... substantially from that. You can do week rates, and not sell in less than 1-week increments; you can do month rates; you can do more than that.
The classic dumb argument about hourly versus not tends to devolve to debates about the pitfalls of fixed-rate work. I don't advocate for project rates (I'd do a project rate, I guess, if it made sense; I'm not religious about them). Rather: I think you should provide your customers with a proposal for a total cost for the project based on an estimate of the number of days (weeks, months) you think it'll take, and a SOW for a T&M project with available prorated overages if it takes longer. Then do your best to deliver according to your estimate; if you blow the estimate because you screwed up, eat the overage; if you blow the estimate because your customer didn't get you access to the systems you needed to work on until 3 weeks after the kickoff, they eat the overage. Nobody has ever pushed back on me for this.
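As a minimal sketch of the mechanics described above, with made-up numbers; which side eats the overage is just a flag here, standing in for the who-caused-it judgment call:

```python
# Made-up numbers illustrating the proposal structure described above:
# a total built from a day-rate estimate, with prorated T&M overages
# attributed to whichever side caused the slip.

day_rate = 1_500
estimated_days = 20
proposal_total = day_rate * estimated_days      # the number that goes in the SOW

actual_days = 25
overage_days = max(0, actual_days - estimated_days)
overage_cost = overage_days * day_rate

client_caused_overage = True    # e.g. system access arrived 3 weeks after kickoff
client_pays = proposal_total + (overage_cost if client_caused_overage else 0)

print(f"Proposal ${proposal_total:,}, overage ${overage_cost:,}, client pays ${client_pays:,}")
# -> Proposal $30,000, overage $7,500, client pays $37,500
```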
When I spelled this out on HN back in like 2010, people responded as if it was black magic. I think what's really happening is that people who run serious consulting firms just don't write a lot of HN comments, because I know of lots of big firms that work this way.
This right here, I once waited 3 months for a company to get their stuff together for me to work on it. I had other projects going on, but you should be paid for waiting time as well, because you won't be taking on as much other work if you expect to get a huge project any time now.
Didn't the clients expect the whole 8 hours in this case?
Another class of freelancers/consultants that's missing from this discussion (hourly rate vs other): long-term/indefinite flex-time workers. No estimates or promises, just pay-as-you-go for several years or more. With an hourly rate in this setup, you can work 5 hours one week, 0 hours the next, 35 hours another... So it's a kind of very flexible employment arrangement disguised as consulting.
It's something they are expected to do for their clients, so within reasonable limits doing it to their clients is expected and shows a serious professional attitude.
For example, a defense attorney defending a DUI charge will likely charge a flat fee. They will also charge according to what it is worth to you.
e.g. If you were Palo Alto's most successful real estate agent, and were at risk of losing your driver's license, you would be charged accordingly when you showed up at the doorstep of a prominent defense attorney.
It's your day, not theirs. Accounting for number of hours spent in a day on client work is just getting back to hourly charging.
For the most part, this stuff is honor system.
As a user I want to use PWAs.
As a developer I prefer to program PWAs.
But as a contractor, nobody wants them. Everyone is happy to use React Native and spend hours or days debugging their build each time they do a minor upgrade to Xcode.
I could advertise myself as a PWA specialist. But I've yet to see evidence that anyone would care.
I make a very comfortable salary these days, but I worry about only having a single income stream.
Other places to look:
* Ask former colleagues if they need any off-hours help.
* Ask people you meet at meetups if they are looking for help. (If you attend meetups.)
* Look at the 'who's looking for a freelancer' posts here.
* Look at fast growing companies in your area or network and ping folks there. They may want only FTEs, but they may also be open to contractors.
* Look for part time opportunities. For example, there's a big need for training in certain technologies (AWS, k8s, terraform), and that can be a nice base for a consultant.
If you are doing any moonlighting while employed, check your employment contract and ensure it is allowed.
> One of the best pieces of advice I’ve acted on is deciding to specialize.
How do you choose what to specialize in? I have experience with quite a few things, but I don't know whether any of them would feel the same once I'd actually specialized in it.
The author himself seems to be an expert in React + React Native + GraphQL.
Maybe the selection disappears on Firefox because their custom context-menu is rendered on top of the selection, thereby "obstructing" the selection by a few pixels?
Reader view is toggled by F9.
Since then the world has changed quite rapidly when it comes to norms around writing, pronouns, and gender, so perhaps "they" fills the same role for younger writers.
Calling that an "alternation" rather than a "substitution" gave me a different perspective, though.
I'd often find Alice/Bob examples in (programming-related) books, but seeing "she" instead of "he" in situations where gender isn't important wasn't that common in my experience.
Some other commenters pointed out that the practice is quite old in English. My use of English is mostly limited to reading technical information and blog posts, so I can only assume that the alternation's unpopularity is (was?) mostly confined to technical articles and blog posts.
That's funny. When I said "off-putting", I had exactly that in mind.
What? She? I must've zoned out.
Maybe the irritation's worth it for the effect that effort's having on culture. I really don't know. In the end, it's not that big a problem for me, just a little irritating, so I simply hope it's doing something actually-helpful for someone and don't worry about it too much.
Why would using she be any different than he? It's like Alice and Bob of oh so many quantum thought experiments but without Bob. If there is only one protagonist you have to go 50/50.
"Everything does not need"
But in my head it sounds better as
"Not everything needs"
Is the latter not acceptable English? Or is it something regional? In my language, it is customary to negate individual nouns, but some English speakers seem to only ever negate the verb.
"Not everything needs..." is definitely acceptable English, and sounds more-correct to my ears. It would also be fluent to write that sentence as "Some things do not need...".
As a logician, I find the phrasing "Everything does not need..." to make me a bit queasy, and I would prefer if it were regarded as unacceptable.
HOWEVER, it's not unacceptable. For example, a famous aphorism is "all that glisters is not gold" (often diluted, in recent centuries, to "all that glitters is not gold"), which is surely not intended to claim that gold never glitters. So I guess I have to tolerate hearing it the illogical way.
Fear not, the exact same thing happens in other languages. It must just be that you don't hear much of, for instance, Swedish: "Allt som glittrar är inte guld." (Logically it should be "Inte allt som glittrar är guld", just like in English.)
> Same goes for the inverted usage of "all but", which imo _should_ mean "everything except for" but seems to mean "almost". I wonder what causes this.
That doesn't feel "inverted" at all to me. You just need to interpret the "all" as it must have been intended: "the whole"; "all the way there" -- then "all but" quite logically becomes "not quite the whole way there".
Like, if a piece of software was almost, but not quite completely, finished ==> "the software was all (the way to) but (not quite arrived at) finished."
Maybe it helps to see the similarity with this related form: Almost all the bugs were fixed, but one was left ==> "All but one of the bugs were fixed." (The bug-collective was all but vanquished. :-)
However, some people use the first incorrectly when they really mean the second. This happens often enough that you could argue the incorrect usage has now become idiomatic as well.
All that glisters is not gold—
Often have you heard that told.
(Also Tolkien, in the Riddle of Strider, in Fellowship. Imagine a comparison where Tolkien is the less-distinguished writer; how often does that happen?)
The former example seems characteristic of non-native speech, but it is also completely understandable.
"Everything does not need" emphasises everything.
"Not everything needs" emphases not.
It's subtle, and probably subconscious, but it has an effect.
One reason I'm participating in HN is to hone my written English, so I very much appreciate you pointing this out :)
My native tongue (Finnish) is not in the Indo-European language group, so I'm occasionally blind to these sorts of patterns that are perhaps more obvious to speakers of other Indo-European tongues.
I realize how my initial comment might sound bitter as the topic is quite controversial, but I'm more curious than anything.
Seeing such "unusual" placement of pronouns a few times in a short period of time made me wonder what's the deal.
Which, if you're trying to avoid gender stereotypes, leads to awkward he/she constructions, conspicuous switching between male and female personas, etc. The typical approach in style guides these days is to use the singular 'they'. The second person got rid of its separate singular and plural pronouns as well, so it's not unreasonable.
Culture wars, but a much more gentle version than the current variety. There's definitely an agenda here as it wasn't done in any kind of unthinking organic fashion, especially given long-established norms (which weren't always male). Rather like replacing BC with BCE.
Some future historian will probably look at an AP style book over time and pass judgement on us all. Dunno how it will look. Maybe they'll think of the singular use of 'they' as implying that humans of this era had multiple personalities.
There are people, today, who will fault you for not using 'he' or 'she', since 'client' is singular and 'he' and 'she' (unlike 'they') are too.
If you insist on being gender-neutral, you have to write "he or she", which quickly gets tiresome. Or rewrite without using pronouns, which can also be awkward.
I learned that "he" is gender-neutral if there is no other context, and that's the way I continue to write.
"And whoso fyndeth hym out of swich blame, They wol come up..."
That your English teachers were misinformed prescriptivists isn't the language's fault, and shouldn't be used as a ruler by which you measure the writing of others.
Comments like this are why I love Hacker News.
Why do you think it's supposed to be a "step towards reduction of sexism"?
Don't you? Then what do you think it's for?
I find it refreshing to read, but I also doubt it plays a major role in the 'reduction of sexism'.