I'd only add (or rephrase other comments to the same effect) that slow growth is not only about staying within your capacity, it's also about learning what you're doing so well that you can systematize your processes for maximum results with as little speculation and risk as possible.
You could call that "being efficient," I guess, but I think I mean it in the macro sense, not looking for nickel and dime efficiencies.
Aha, I've often vaguely suspected as much with certain well-known food columns. I wrote it off to incompetent cooks, but it makes more sense that they were just making it up as they went along.
One that sticks in my mind was a recipe that called for a dozen whole cloves, where really you'd just use one or two. If any.
A dozen whole cloves, used in the way the recipe called for, would make pretty much any dish painfully inedible.
The handful of times I've dumped a bunch of time and effort into an at-home screening assignment, it took far longer to do well than whatever the allotted time was made out to be. Emphasis on the "do well" part.
My (unscientific) sense is that these things are really just screening for people willing to jump through unreasonable hoops.
I feel like this is a mismatch of delivery expectations rather than a problem with take-homes. I explicitly tell our candidates that I don't expect them to put together a polished solution in the couple of hours they have for the take-home. That's fine. The point is to merely get an idea of someone's ability.
Counterpoint: what idea can be had from rushed / time-constrained / deliberately careless work examples?
Any junior can copy / paste some functional yet poorly done code. Or make ChatGPT come up with something, I suppose, now that that's a thing.
Experiences vary, but every job I've taken and liked had no coding tests of any kind. That part of the interview was more of a discussion about an example piece of code, with no requirement to come up with new code.
I'm a firm believer that you evaluate exactly what you test. If you are asking someone to rush to deliver an algorithm, you are testing their ability to rush to deliver an algorithm. If you actually want to know whether they can deliver a polished solution, you've asked the wrong question.
In my case, I actually do want to know how quickly people can pick up a new problem and get started writing some code, and that's how I evaluate what they deliver.
> Any junior can copy / paste some functional yet poorly done code. Or make ChatGPT come up with something, I suppose, now that that's a thing.
I have two thoughts about this.
1. If people can deliver, they can deliver.
2. If you come up with an original problem (rather than lazily copy-pasting some fizz-buzz type problem), you'd be surprised how many people fail to apply any basic problem solving.
> a discussion about an example piece of code
I do find that it is useful to keep code exercises short, but ask a few questions about their solution as a follow up. People often struggle to explain their solution if they just copy/pasted.
It might be better to spend some free time cultivating a side project or two where you can exercise your dev skills without endangering your income. Who knows, it might take off and turn into your own company down the road.
Once you give up salary and seniority, though, those are gone with the odds stacked against getting them back.
And there is the cynical (but non-zero) chance that many people will assume you were demoted for poor performance / behavioral issues / insert your own negative perception, instead of a voluntary move.
Tread cautiously, I'd say, and don't do anything to endanger your current position and income.
A tale as old as (Internet) time - I've seen this cycle happen, too.
Tangentially, I have to wonder to what extent misapplication of Agile, and similar, project management processes is to blame.
You'd think that most relatively simple sites, like the ones we're talking about here, ought to be planned once and built once, but something about the mindset that the goalposts can be moved during planning and development seems to drag everything out at length.
I think that's pretty backwards. Non-agile is how you get these things being rebuilt every other year because by the time it's built the requirements that were originally gathered are obsolete.
Just a thought, not a ding on the idea. What kind of controls are there or will there be re: plagiarism scenarios?
For example, a user sends in an academic draft, represents it as their own, and does so with the intent (explicitly voiced or not) of using the service to get past automated plagiarism software / filters?
If you're familiar with academic writing, it's a pretty plausible thing that could happen.
Or is that type of abuse-prone material automatically turned down? Couldn't tell from the site.
That's certainly a possibility, but I'm not sure Foster would necessarily be the best tool for a use case like that. Our contributors tend not to directly re-write specific content. Most of the input happens at the idea level. A lot of GPT-3 tools like Wordtune would likely be more helpful in finding new language to express the same idea.
It might help with credibility and adoption rates to make it explicit on the site that your editors aren't going to work on material that looks like it's being used, or could be used, for plagiarism or related "no outside help" rules.
For non-STEM writing - your typical business and humanities-oriented material - you'd be surprised how frequently this issue comes up with instructors, teaching assistants, etc.
That's good to know, thank you. Our background isn't in academia, so this wasn't necessarily on our radar. Will keep it in mind as we further develop our content policy.
I'd say it's easy to start spending and keep spending the bucks if you're aiming to duplicate a conventional print run from a large press, with no expense spared, but it isn't necessary.
Sites like Canva will take care of cover art, and print-on-demand presses like Blurb let you start with a very reasonably priced short run and will hook you up with a variety of distribution networks.
You can quite literally publish a professionally packaged book for a few hundred dollars or less.
There's also Amazon, but I've never tried it so can't comment on it.
Cogent point. I've worked with outsourced and offshore teams quite a bit.
The attempt to arbitrage lower wages based on geography for highly skilled roles has almost invariably, over any meaningful length of time, run up against increased organizational costs and managerial overhead for all kinds of reasons. You name it.
My general impression is that it tends to work out as a "Rob Peter to pay Paul" kind of scenario - a wash or a net negative.
There are always stories where, when everything went really well for everyone involved, it worked out as an advantage, but those - in my view - are the exception to the rule.