> Say to yourself, if I had to ship this literally four hours from now, what would I do? How could I fake it so that it appeared done? Do that. Then do it again. Then do it again until it's sort of really done.
Seriously, the only practical things that work are exercise and sleep. If you have to push through, exercise an hour per day and sleep at least 6 hours during your ordeal. The exercise will clear your brain fog like nothing else can. Also, moderate your caffeine intake during this period - your adrenals are already shot from stress and exhaustion; no need to make it worse.
meh, exercise and sleep are great, and they certainly help, but sometimes the job itself is the core problem... Doesn't matter how fit and well rested you are if you're sick of doing the same thing day after day after day after day...
This is exactly the analogy I use with developers who tend to over-engineer.
In a perfect world, we'd be guaranteed that our code will never need to change. In this world, abstraction is worse than worthless - it is a cost with no benefit.
In the real world, there are risks. Bugs will need to be fixed, requirements may change, back-end systems may be replaced or scaled, and so on.
Building things the "right way" is exactly like buying insurance against those future risks. Business-savvy developers know how to estimate the probability and cost of the risks this code will face over its lifetime, and weigh them against what it would cost right now to build in extra code to mitigate those risks.
Thus, choosing the right level of abstraction and engineering overhead to add to a particular feature should be a series of decisions and tradeoffs, not just a determination that it's always better to build the "right way".
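To put rough numbers on that insurance analogy, here's a back-of-the-envelope sketch; every figure is invented purely for illustration, and the point is the shape of the comparison, not the numbers.

```python
# Toy expected-cost comparison for the "abstraction as insurance" trade-off.
# All numbers are made up for illustration; substitute your own estimates.

cost_to_abstract_now = 8            # hours of extra work to build the abstraction today
prob_change = 0.3                   # estimated chance the requirement actually changes
rework_without_abstraction = 40     # hours to handle that change if we skipped the abstraction
rework_with_abstraction = 10        # hours to handle the same change if the abstraction exists

expected_cost_skip = prob_change * rework_without_abstraction
expected_cost_build = cost_to_abstract_now + prob_change * rework_with_abstraction

print(f"Skip the abstraction: expected cost {expected_cost_skip:.1f} hours")
print(f"Build it now:         expected cost {expected_cost_build:.1f} hours")
# With these made-up numbers it's nearly a wash (12.0 vs 11.0 hours), which is
# the point: the answer falls out of the estimates, not out of a blanket rule.
```

Drop the probability to 0.1 and skipping wins easily (4.0 vs 9.0 hours); raise it to 0.7 and building wins (28.0 vs 15.0). The value is in being forced to make the estimate at all.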
In my experience, developers tend to be overly confident in their ability to predict and mitigate future risks, and suffer severely from hindsight bias. They rarely lament the extra abstraction layer they built and never needed, but are quick to point out when modifying a feature will cost more because of weak abstraction.
Even under those perfect conditions, the right level of abstraction may let the system be built with less code, and faster.
That being said, I agree that it's always a trade-off; I've done many refactorings where I threw an abstraction layer out instead of adding more, and the result was code that was faster and easier to maintain. It's definitely not good to invest in abstraction blindly.
In a perfect immutable system, I'm still going to want my code to be as DRY as possible, because I don't like typing the same thing over and over again.
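A toy before/after, purely hypothetical, of the kind of repetition I mean:

```python
# Hypothetical values so the snippet runs on its own.
cpu_pct, ram_pct, disk_pct = 42.5, 63.0, 87.2

# Repetitive version: the same formatting typed out three times.
print(f"CPU:  {cpu_pct:>5.1f}%")
print(f"RAM:  {ram_pct:>5.1f}%")
print(f"Disk: {disk_pct:>5.1f}%")

# DRY version: the shared pattern lives in one place, even if the
# requirements are guaranteed never to change.
def report(label: str, pct: float) -> None:
    print(f"{label:<5} {pct:>5.1f}%")

for label, pct in [("CPU:", cpu_pct), ("RAM:", ram_pct), ("Disk:", disk_pct)]:
    report(label, pct)
```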
In this perfect hypothetical world, you wouldn't eschew abstraction; you'd just know exactly what abstractions to use to perfectly model the problem you're solving. Those ideal abstractions would be different, since the problem is immutable.
Ask yourself this: in this perfect world, would you code with a magnetized needle and a steady hand? Why not?
Not sure how the magnetized needle analogy applies here. The point of the hypothetical is to avoid developing your own extra layers of code for future reuse. Avoiding those that others have developed for you already would not make sense.
As to the rest of your point, yes. Anything in service of pure development speed would be desirable. If you could abstract and reuse something faster than you could implement it in some other way (e.g. copy/paste), then it would make sense to do so.
Suggestion: you should now build a second app which generates a stream of single-use URLs that are invalidated as soon as they are accessed, and which proxy through to an original base URL. Then hook the two apps together and profit.
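A minimal sketch of what that second app could look like, assuming Flask and requests; the route names, token scheme, and upstream base URL are all made up for illustration:

```python
# Sketch of a single-use URL proxy: mint a token, serve the upstream content
# once on access, then invalidate the token. Names and URLs are hypothetical.
import secrets

import requests
from flask import Flask, abort, request

app = Flask(__name__)
UPSTREAM_BASE = "https://example.com"   # hypothetical original base URL
_tokens = {}                            # token -> upstream path (in-memory for the sketch)

@app.route("/mint")
def mint():
    """Return a single-use URL that proxies to ?path=... on the upstream."""
    path = request.args.get("path", "/")
    token = secrets.token_urlsafe(16)
    _tokens[token] = path
    return request.host_url + "u/" + token

@app.route("/u/<token>")
def use_once(token):
    """Serve the proxied content once, then invalidate the token."""
    path = _tokens.pop(token, None)     # pop invalidates the URL as soon as it's accessed
    if path is None:
        abort(404)
    upstream = requests.get(UPSTREAM_BASE + path, timeout=10)
    headers = {"Content-Type": upstream.headers.get("Content-Type", "text/plain")}
    return upstream.content, upstream.status_code, headers

if __name__ == "__main__":
    app.run(port=5001)
```

The `dict.pop` keeps invalidation simple within a single process; a real deployment would want a shared store such as Redis so the one-shot guarantee holds across workers.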