This is an economically rational decision. But it is also a bad one.
It's a good way to build non-scalable applications. Because once you scale the application, at some point the computer's time becomes more expensive than the developers' time. Of course, that cost is an economic externality for the development shop, so why should they care?
Edit: I am not sure the word "scale" is obvious. There is Google-like scaling, where you run the software on many machines in house. But there is also Microsoft-like scaling, where many users run the software. Collectively, they have to pay the cost and waste the energy.
No one's saying "let's do stuff the stupid way!" - they're saying hey, let's focus on getting users before fleshing out the technical details of the what-if-we-actually-make-it scenario. Or, as the classic saying puts it, don't put the cart before the horse.
Yes exactly! The point is you don't build a scalable app until you have a scalable business idea. The first iterations of a startup are about testing different ideas, not about building sustainable architecture.
Unless you are building a technical work of art, it's all about finding the right product.
Some decisions need to be made early on. You may be fine with MySQL, but from the start you need to think about decoupling the different parts of the process and about the implications of delays at the interfaces, even if those delays aren't there yet (see the sketch below).
Most likely you'll never reach Google scale, but you'll be glad you made those decisions as the application grows more complex and you don't have to re-test every part of it for each tiny change.
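To make that concrete, here's a minimal Go sketch (the UserStore interface, the names, and the 200ms deadline are all made up for illustration): the business logic depends on a small interface and passes a context, so you can start on MySQL and swap the backend later, and a delay at the boundary surfaces as a timeout instead of an unbounded stall.

    // Hypothetical storage boundary: callers depend on this contract,
    // not on MySQL directly.
    package store

    import (
        "context"
        "time"
    )

    type User struct {
        ID   int64
        Name string
    }

    // UserStore decouples business logic from the concrete database.
    // A MySQL-backed struct and an in-memory fake can both satisfy it.
    type UserStore interface {
        GetUser(ctx context.Context, id int64) (*User, error)
        SaveUser(ctx context.Context, u *User) error
    }

    // LookupName sets a deadline at the interface boundary, so a slow
    // backend fails fast instead of stalling the caller indefinitely.
    func LookupName(ctx context.Context, s UserStore, id int64) (string, error) {
        ctx, cancel := context.WithTimeout(ctx, 200*time.Millisecond)
        defer cancel()
        u, err := s.GetUser(ctx, id)
        if err != nil {
            return "", err
        }
        return u.Name, nil
    }

Since everything above the interface only sees UserStore, most of the app can be tested against an in-memory fake without touching the database, which is exactly the "no re-testing every part for each tiny change" payoff.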
This is a relatively narrow scenario. Some applications "already made it". I'm helping build an IIoT solution that can process a really ludicrous number of measurements per second. If the code is optimal, we have a pretty positive impact on the company's bottom line. If it's not, we'll become famous for being the first group to unwillingly enter Top500 territory.