
PHP vs Node.js... Let's be honest: Node.js couldn't kill PHP. Why? - clubminsk
https://belitsoft.com/php-development-services/php7-vs-nodejs#1
======
claudiulodro
#1 reason the article doesn't mention: It's incredibly cheap to host PHP.
Hosting companies push WordPress and other PHP-based hosting heavily and make
it super easy for anyone to get a live site going with PHP.

Last time I launched a node app, it took me a full day of messing about in
AWS. Last time I launched a PHP app, it took me less than a half hour at
Siteground.

~~~
moomin
TBH, PHP is viable for pretty much exactly as long as WordPress survives and
uses it. If that changed, I wouldn't fancy PHP's chances.

~~~
paulgrimes1
I disagree. I've been using PHP for about eleven years now, ever since the bad
old days of PHP 3, and I have never used, and generally advise against using,
WordPress. PHP will be around long after WordPress.

I do concede that if PHP went away, WP would somehow survive, like some kind
of post-nuclear cockroach.

~~~
exclusiv
I'm in the same boat. I cranked a lot of things out with CodeIgniter back in
the day. ExpressionEngine is superior to WordPress, but if you like either I'd
suggest Craft CMS now and moving forward. For general apps I use Laravel or
Lumen, and ReactPHP for scraping. And don't forget The League [1]. Lots of
great PHP stuff out there.

[1] [https://thephpleague.com](https://thephpleague.com)

------
segmondy
Node was not built to kill or replace PHP, so why should it? I mean, even C++
didn't kill or replace C. Both languages can coexist; I like them both.

------
combatentropy
I maintain a mountain of PHP. What attracts me to JavaScript is its syntax:

PHP:

    $fruit = array('apple', 'banana', 'orange');

    $fruit_colors = array(
        'apple' => 'red',
        'banana' => 'yellow',
        'orange' => 'orange'
    );


JavaScript:

    var fruits = [ 'apple', 'banana', 'orange' ],

        fruit_colors = {
            apple: 'red',
            banana: 'yellow',
            orange: 'orange'
        };


That's like 20,000 fewer keystrokes and much easier to read. My favorite new
feature in PHP is the short array syntax.

Even for those less minimalistic than me, I think the draw of Node is the
language: the consistency of having the same language on the front end and the
back end.

Much ado is made about which one is faster. But I would say roughly 99% of web
apps are CRUD apps with a few users per hour, where the bottleneck is the
database anyway.

For me what PHP has going for it is my familiarity with it. Plus, despite all
the nitpicking articles, it is rock solid, especially compared to Node. And
compared to Node, PHP is a hyper-organized library of everything you need,
just an arm-length away.

~~~
dagn
A few months ago I was tasked with porting an internally used Node.js script
to PHP 5.6. The code made heavy use of nested async callbacks and other
patterns seen almost exclusively in JavaScript codebases. I thought I'd have
to rewrite most of it from scratch, but the PHP code ended up nearly identical
to the original JavaScript. Line for line, ignoring minor syntactic
differences (like the $ in variable names), these two ~400-line scripts ended
up being 98% identical.

Anonymous functions, closures, async calls, syntax short-hands... PHP has it
all.
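
The kind of code described above can be illustrated with a tiny sketch (entirely made up, not the actual script from the comment): nested callbacks and anonymous functions that map almost one-to-one onto PHP 5.3+ closures, which also support anonymous functions and `use` captures.

```javascript
// Hypothetical async lookups in callback style; in PHP the same shape
// works with closures passed as arguments.
function fetchUser(id, callback) {
  setImmediate(function () {
    callback(null, { id: id, name: 'user' + id });
  });
}

function fetchPosts(user, callback) {
  setImmediate(function () {
    callback(null, [user.name + ': first post']);
  });
}

// Nested callbacks, the pattern the comment says ported nearly verbatim.
fetchUser(1, function (err, user) {
  if (err) throw err;
  fetchPosts(user, function (err, posts) {
    if (err) throw err;
    console.log(posts[0]); // prints "user1: first post"
  });
});
```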

------
erikpukinskis
PHP uses an imperative programming model and has a very clear URL->source file
mapping. This means you can look at the URL, find the source file, start at
the top and trace your way to the problem.

Node (and Rails, and Java and lots of others) use a declarative router. The
app boots. Shit happens. Routes exist. Middleware is involved. There's no way
to know what code is implicated in a given URL.

If you're a pro and you can afford to spend a year or so learning framework
internals while not getting much done, you develop a sixth sense for where in
the codebase an error is likely to be, but you need to spend many weeks and
months scratching your head to get there.

PHP is much more beginner friendly in this way.

~~~
jayflux
When was the last time PHP was used in this way? Don't people use frameworks
these days, such as Symfony, Silex, Laravel, etc., and then point the web
server at the entry point, like with any other language?

~~~
krapp
> Don't people use frameworks these days, such as Symfony, Silex, Laravel, etc.?

People _should_. Composer (with PSR autoloading) and nikic's FastRoute get
you most of the way to a really nice, lightweight meta-framework, for
practically nothing. There really is no excuse for the "URL points to a file"
model to be a thing anymore.

But a lot of legacy PHP code doesn't do that, and likely a lot of newbie code
doesn't either, because it follows old tutorials, and that model is faster and
simpler.

Not even WordPress uses routing, AFAIK.

~~~
mysterydip
Honest/naive question: why not? Plenty of MVP or weekend projects can get by
easily with url=page. If it's faster and simpler, what's wrong with doing it?
Surely there's room for both on the internet.

~~~
krapp
Having a router allows you to not have all of your files be web accessible, by
using a single point of entry and a whitelist for all possible paths. Most
projects which map PHP files to URLs have everything in the web root.

If it's just a small brochure site with a few pages then it's no problem, but
forums and larger projects built like this can leak information and expose
vulnerabilities when PHP files that weren't meant to be accessed directly are:
pages that perform SQL queries expecting variables that don't exist when
accessed directly, old config files, text files, open directories, etc.

But "url=page" is still basically routing using the query string, and IMHO so
is access control using .htaccess. Any system where URLs are validated and
where they don't directly point to files on the server counts.
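
The single-entry-point-plus-whitelist idea can be sketched in a few lines (a toy dispatcher with made-up routes and handlers, not any real framework):

```javascript
// Toy front controller: every request goes through one entry point,
// and only whitelisted paths are dispatched. Routes are hypothetical.
const routes = {
  '/':      function () { return 'home'; },
  '/about': function () { return 'about'; },
};

function dispatch(path) {
  const handler = routes[path];
  if (!handler) return '404'; // anything not whitelisted is rejected,
  return handler();           // so stray files are never reachable
}

console.log(dispatch('/about'));       // prints "about"
console.log(dispatch('/config.php'));  // prints "404"
```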

~~~
combatentropy
> Having a router allows you to not have all of your files be web accessible

Having a router may make it easier, but it's not the only way.

You can just put your libraries, passwords, etc., in some directory outside
the document root:

    /var/www/example.com/public
    /var/www/example.com/lib

Then you use "include" to get them.

~~~
krapp
Having worked on projects with includes within includes within includes, and
dealt with global variables defined in one file, used in another and reused
elsewhere, I'd say it's definitely not the _best_ way, and it tends to destroy
encapsulation.

If you can keep it under control, though, it's of course no different from
includes in C/C++ (it's conceptually similar to the preprocessor's #include).
But modern PHP prefers autoloaders, so you never actually have to write
include statements to begin with. You could just as well define those
directories and include them through a Composer setting.
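
To make that concrete: a composer.json autoload section along these lines (the `App\` namespace and `lib/` directory are made-up examples) maps a namespace to a directory, after which `composer dump-autoload` generates the loader and classes are found on first use:

```json
{
    "autoload": {
        "psr-4": {
            "App\\": "lib/"
        }
    }
}
```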

~~~
combatentropy
> includes within includes . . . global variables

Yeah, that's a pain. You have to exercise a little discipline. Of course none
of my code has that ;)

With hand-typed routing, can't you accidentally define two routes that
overlap, so some URLs could match both? And the only reason a request goes to
one and not the other is something arbitrary, like the order the routes are
defined in the file? Like if you defined this route:

    /a/b/c


Then you slept, added a bunch of code, came back in three months and put this
in:

    /a/*/c


Now you have two routes that match the same URL. And maybe they're separated
by several lines of code, so that the mistake is hard to spot.

You can't do that with a filesystem: you can't put two files in the same place.

~~~
krapp
> The only reason it goes to one and not the other is something random like
> the order the routes are defined in the file?

To be fair, that's not random; that's pretty explicit. And if you're
optimizing for speed, short-circuiting at the first match is a good idea.

But if you're using a router, chances are you're passing the segments as
arguments to some function or method anyway, so the routes /a/*/c and /a/b/c
should both wind up returning the same content when the second segment is 'b'.
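
The first-match-wins ordering being discussed can be made concrete with a toy matcher (made up for illustration; here '*' matches exactly one path segment):

```javascript
// Ordered route table: the first pattern that matches wins, so
// definition order decides which of two overlapping routes handles a URL.
const routes = [
  ['/a/b/c', 'specific handler'],
  ['/a/*/c', 'wildcard handler'],
];

function match(url) {
  const segs = url.split('/');
  for (const [pattern, name] of routes) {
    const pats = pattern.split('/');
    if (pats.length === segs.length &&
        pats.every((p, i) => p === '*' || p === segs[i])) {
      return name; // short-circuit at the first match
    }
  }
  return null;
}

console.log(match('/a/b/c')); // prints "specific handler"
console.log(match('/a/x/c')); // prints "wildcard handler"
```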

------
chairmanmow
I can't take this article seriously when it opens with a Stack Overflow
developer survey of back-end technologies that lists JavaScript at 54%
happiness, and then further down lists Node.js at 14% happiness. I guess the
other 40% were using io.js.

Apparently people are also happier using Angular on the back end, which makes
no sense. Also, pretty sure WP still runs PHP on the back end. So much
inaccuracy here.

------
tmzt
I have successfully run Node.js applications on a hosting account that
supported FastCGI for Rails applications.

Unlike normal Node.js deployments, the process is started and managed by the
provider's web server, and applications can be uploaded using a standard FTP
client.

Here's a similar library: [https://www.npmjs.com/package/node-fastcgi](https://www.npmjs.com/package/node-fastcgi)

~~~
clubminsk
Lucky with the provider?

~~~
tmzt
This was a couple of years ago, but the provider was Bluehost. While I had
shell access and a compiler, I only used it to compile the node binary and
make the main JS file executable with a shebang.

------
lioeters
Due to the ubiquity and ease of hosting PHP, and the current shift to
full-stack JavaScript frameworks, I predict a long transition period during
which various "bridges" will be built to interface between the two. Mainly I'm
thinking of the PHP side exposing content via REST APIs, with a more modern
frontend written entirely in JS.

~~~
jbrooksuk
Where I work we use PHP 7.0/7.1 with the Laravel framework and Vue.js as the
front end. Works well for us!

~~~
lioeters
I'm curious, how do you reconcile server- and client-side rendering to be the
same? Specifically, I'm thinking about initial page load/reload and route
transitions. Do you share templates/partials between PHP and JavaScript?

------
raarts
'complement' not 'compliment'

~~~
clubminsk
thank you

------
partycoder
There are still people working on Visual Basic 6 as well. That doesn't mean
there are no viable alternatives.

~~~
cordite
Having faced VB6 professionally: there are viable alternatives, but end users
don't buy your software because of what technology you made it with, or
originally made it with. They buy it if it works and does what they need. It
comes down to viable transitions and the resources needed to make them happen.

Indeed, over the last six years companies have become more willing to try out
new technologies, like Node, or even Docker! But only places that can truly
develop decoupled from the technology already in use can jump on board with
these things. The transition cost for adding another technology when decoupled
is orders of magnitude lower than migrating an existing, functioning,
production-proven product.

I have my own opinion on NodeJS, but I give it credit for the tooling it has
outside of production server uses.

~~~
partycoder
You seem to imply Node.js is not a viable production server solution, but you
should be more explicit about why. I am curious about your perspective.

PHP as a project includes a templating engine, a scripting language, and a lot
of native code built without a modern approach.

Node is a more scoped project. It doesn't implement its own scripting
language, and it doesn't include a templating engine or a lot of the other
stuff PHP does. It focuses on doing a single thing well. The native part of it
is libuv, which uses a consistent approach: non-blocking asynchronous I/O.

Many other projects also build on top of libuv and v8, and any improvements
coming from those integrations ultimately benefit node.

In comparison, PHP is one large project encompassing everything. As a result,
I wouldn't expect PHP to move as fast. It's a larger project with lots of
backward-compatibility concerns to take care of.

I know that PHP has well-known large deployments, but none of those run on the
reference implementation. Node.js does have large deployments running on its
default implementation.

~~~
cordite
This perspective is only from my limited experience with Node, writing and
interacting with existing Node software.

Node is great if you want shared state; there is no overhead to access state
introduced by many users of your system. Making a game server of sorts seems
like a good fit. It is also built on proven technology, battle-tested by
millions of users independently each day.

But with the convenience of trivial access to shared state, there is no
isolation. A metaphor from the Erlang community is that it's tolerable to drop
a one-on-one phone call, but not to take down the entire switch and everyone
connected through it. I have experienced existing software with space leaks
that needed to be restarted every week to remain operational in production
with tolerable latency and resource footprint. That doesn't seem sane to me.
(The example here is a channel pub/sub server.)

Indeed, PHP for large deployments requires tuning or alternative
implementations (e.g. HipHop) to be reliable and fast at scale. But one thing
PHP does that Node does not is assume isolation. Isolation is not a guarantee,
but if one page crashes, most of the time others are not directly impacted. If
one PHP process runs out of memory, same story. If one tries to starve the
scheduler, it will eventually time out and be killed.

Your point on it having a legacy of many integrated parts is very true, and I
share your conclusion on not expecting it to move fast.

Many developers are coming around to the shared perspective that simple
software should have a simple, extensible core, and that it is okay to have
multiple implementations for a class of features (e.g. templating). This model
of independent, reusable parts is where we will see fast-moving technology
grow.

~~~
partycoder
First of all, what you seem to be criticizing are server frameworks such as
Express, rather than Node. Node does not impose a paradigm for how requests
are handled in a web server; it only provides the building blocks for a web
server framework. In theory, it is possible to create a library that forks the
process for each request, if you so prefer. I personally think that might be a
bit inefficient.

Then, there are ways to mitigate the problems you describe. I know this
firsthand as someone who has been behind substantially large deployments.

While leaking resources like memory, file handles, and connection handles is a
problem, there are ways to find and fix those leaks. With a bit of discipline
and rigor you can be safe. But a lot of people in the Node community disregard
anything that cannot be sold, and usually end up creating problems they cannot
get out of, then make those problems worse by adding workarounds to the
problems they created.

With a little bit of good practice (unit testing, profiling, load testing,
code review, etc.), something you can achieve while still having work/life
balance, by thinking like an engineer instead of discarding every software
engineering book you read at school, and by paying attention to what you do
and not allowing excessive complexity, you should be in a comfortable position
to fix a bug related to leaks.
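
One low-tech piece of that discipline can be sketched: sample heap usage around a suspect workload and watch for growth (a contrived leak and a rough smoke test, not a substitute for a real heap profiler; all names here are made up):

```javascript
// Contrived leak: handler state is accumulated and never released.
const retained = [];
function leakyHandler(req) {
  retained.push({ req: req, big: new Array(1000).fill(0) });
}

function heapUsedMB() {
  return process.memoryUsage().heapUsed / (1024 * 1024);
}

const before = heapUsedMB();
for (let i = 0; i < 10000; i++) leakyHandler({ id: i });
const after = heapUsedMB();

// A real check would force GC between samples (node --expose-gc) and
// repeat over time; this only shows the measurement pattern.
console.log('heap grew ~' + (after - before).toFixed(1) + ' MB');
```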

The problem is imprudent people who want to boost their careers by pushing
features that do not implement non-functional requirements, to win the favor
of product stakeholders. That approach may work on the frontend, and may work
in some contexts, but not in backend Node software. That's the easiest way to
have a cascading failure of servers at 100% CPU/100% memory usage, beyond
repair.

Usually those people come in one flavor: people who think that because they
know JavaScript, they can write server software. That's not enough. To write
server software you need to know network protocols, operating systems, memory
management, algorithms, databases, distributed systems, etc.

------
igtztorrero
Asynchronous, that's the difference. Node.js is asynchronous; that's powerful,
but it's a nightmare to control in a CRUD database API. We tried to switch
from PHP to Node.js in 2016, but async was the difference.
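
For what it's worth, the callback problem described here got much more tractable once async/await landed in Node. A sketch with a stubbed-out database client (the stub stands in for a real promise-based driver; the table and helper names are made up):

```javascript
// Stub "database" returning promises, standing in for a real driver.
const db = {
  query(sql, params) {
    return Promise.resolve([{ id: params[0], name: 'widget' }]);
  },
};

// A CRUD read written sequentially with async/await: no nested callbacks,
// and errors propagate through ordinary try/catch or rejected promises.
async function getItem(id) {
  const rows = await db.query('SELECT * FROM items WHERE id = ?', [id]);
  if (rows.length === 0) throw new Error('not found');
  return rows[0];
}

getItem(7).then(item => console.log(item.name)); // prints "widget"
```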

------
pteredactyl
Why the clickbait? Use the right tool for the job.

