I'm pretty sure it depends largely on the type of business. Businesses where software makes them money tend to have high code quality and developers that care about that sort of thing. Businesses where software (or IT as they'll call it) is a cost center tend to be like you described.
I'm also pretty sure there is a strong correlation between PHP and that type of environment. It's much less common with places that use Ruby/Python/etc. Probably for the reasons Paul Graham wrote about with Python vs Java: http://www.paulgraham.com/pypar.html
What are examples of companies that have a verifiable 'great software' culture and chose PHP because they prefer the language?
Most engineers never exit the "whining" phase, though. But those who do make truly good mates to work with.
However, at some point it will become obvious that you can/can't effect change in a meaningful timeframe. And if the latter, you should move on.
The sad truth is that often the bad type of programmers the author describes are unwilling to learn, and are especially reluctant to take lessons from a new/young colleague.
> I've made sure that my code was secure
> has just always been embedded into my programming philosophy
> So their environment was broken to me.
> I've began to realize that I'm a much better programmer than all of them
Good code quality and security are things everyone needs a bit more of, but be careful not to fall into dogmatic thinking. They are paying you to do tasks the way they want them done. If you can show them a "better" way they might take your advice, but you should never expect it. Programming is more than just a job skill to me, so I always have to remind myself about the line between "my code", built on my own time, and "their code", which I am being paid to build and maintain.
> I showed them OWASP, and they have never heard of it.
> I mentioned MVC pattern, and the manager didn't know what that was.
This is common. Never hearing of something is not bad. No one can keep track of all the tools available, learn them well enough to use them, and still accomplish what they need to do. No matter how much time you spend trying to keep up, you won't.
Also, your coworkers, and manager, might already know what will and won't be approved. So, they are only informing themselves of things that are likely to be important to their tasks.
> I've began wondering if all work environments are like this.
 There are many definitions of better. (Time, money, complexity, maintenance, ROI, etc.)
 Some managers / employers want you to mix the two concepts, but, in my experience, this is just encouragement to focus more attention on the tasks they are giving you. You need to draw a line.
Maybe with respect to "code quality", but it would be unethical to write code with blatant security flaws, even if the organization paying you says "that's just the way we do it here".
But, just some advice for you in the meantime: don't be that developer who insists on your standards, no matter how much you feel yourself to be right. If you do that, you're going to be miserable and you won't get a good reference for the next job.
But more importantly, you should take this opportunity to learn about everything in the job that isn't coding. Like teamwork, communication, respect.
Success at the job is not about doing what's right in an abstract sense; it's about doing things that decision makers perceive as valuable. If you feel strongly that they're doing the wrong thing, then you're first going to have to change what they perceive as valuable.
In your programming career, you will often be employed by people who are less mentally agile than yourself. But you're going to have to learn to respect what they know and do well - understand the customer, forge relationships. Computers are so fucking efficient these days that even naive solutions have a lot of value sometimes, so deal with it. (There's always some other programmer who would be appalled at the shortcuts you take; you're not special.)
TL;DR: If you want respect, you have to give respect first.
Another thing I've noticed, and it has happened more than once: I'll write something, spend all day on it, then come to work the next day to find out that the manager rewrote it, claiming "simplification" of the code (which is in fact not true) or claiming my code had a "bug". These bugs aren't security bugs, or anything for that matter; they're small user-permission things that I was never told about and that aren't documented. I don't care if I can't work on something, but if you want to do it, then do it. Don't ask me to do it and then rewrite it, because I could have been working on something else in that time.
That is totally legit, if there is a naming convention in place on the project and the developer violated it. If it's just his personal caprice, then yeah it's dubious, but better an idiosyncratic dictator than no standards at all.
> then come to work the next day to have found out that the manager re-wrote it claiming "simplification" of the code, which is in fact not true
This is extremely common. You've already documented how this shop has no good procedures, so people waste a lot of their time rewriting stuff.
I hate jobs where I have to be someone else's extra hands. It's a sign of poor organization on every level.
That said - I hope as an open source programmer you've learned not to get attached to your code.
Much better to explain the bug/problem and let them fix it.
Kids these days. I once worked on a system where the size of the switch statement broke the compiler. I believe the limit was 64k lines in a switch, although my memory is a little hazy. Of course you couldn't actually reasonably edit a single file with that much code, so it was split into several files, each of which was still thousands of lines long, but at least you could load the smaller files without crashing emacs with out-of-memory errors.
default: printf("unhandled action\n");
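The giant-switch story above has a classic alternative worth mentioning: a dispatch table, where handlers are keyed by action and there's a single fallback instead of a 64k-line switch body. A minimal sketch in Python (all action and handler names here are made up for illustration):

```python
# Dispatch-table sketch: handlers keyed by action name, with one fallback
# playing the role of the switch's default: case.

def handle_open():
    return "opened"

def handle_close():
    return "closed"

HANDLERS = {
    "open": handle_open,
    "close": handle_close,
}

def dispatch(action):
    handler = HANDLERS.get(action)
    if handler is None:
        return "unhandled action"  # the default: case, table-style
    return handler()

print(dispatch("open"))      # opened
print(dispatch("teleport"))  # unhandled action
```

New actions become one-line table entries instead of more switch cases, which also sidesteps compiler (and editor) limits on a single enormous function.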
My first thought is to have meetings where you and your team make a list of the problems you're facing (is logging not enough? is the disk filling up quickly? is some part of the code getting hard to maintain? etc.). Don't go overboard. People who are much more senior than you in this organization have pride. Imagine you worked your way up and did most of the coding, and suddenly a new hire called your code shitty, ugly, and non-functional; that hurts. Also, some people fear breaking the system. Schools in particular hate it; they don't care about the code, and students don't care either. If the school's portal is down for more than an hour, someone is going to get an ugly call from some executive admin.
I can say testing, MVC patterns, all that shit (excuse my French here, in your manager's hypothetical voice) are the lowest priority. What are your priorities? Prove the SQL injection. Then fix it. Is there documentation for how to set up a server running the software you're working on? Is it done with a shell script? If so, does it work?
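"Prove the SQL injection, then fix it" can be demonstrated in a few lines. A minimal sketch using Python's stdlib sqlite3 module (table, column, and function names are hypothetical; the same concatenation-vs-parameterization contrast applies in PHP with PDO):

```python
# Demonstrating SQL injection and its fix with parameterized queries.
import sqlite3

def make_db():
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
    conn.execute("INSERT INTO users VALUES ('alice', 'student')")
    return conn

def find_user_vulnerable(conn, name):
    # BAD: user input concatenated straight into the SQL string.
    return conn.execute(
        "SELECT * FROM users WHERE name = '" + name + "'").fetchall()

def find_user_safe(conn, name):
    # GOOD: a parameterized query; the driver treats input as data, not SQL.
    return conn.execute(
        "SELECT * FROM users WHERE name = ?", (name,)).fetchall()

conn = make_db()
attack = "' OR '1'='1"
print(len(find_user_vulnerable(conn, attack)))  # 1: injection matched every row
print(len(find_user_safe(conn, attack)))        # 0: no user literally has that name
```

Showing a manager that the first query returns every row for a crafted input is usually far more persuasive than citing OWASP at them.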
Testing and new development patterns shouldn't be your priority. They are cultural. You can't build Rome in a day unless you have all the people behind you.
Scrum style, sure, but do it slowly. Also, this may sound extreme, but sometimes you need to play a bit dumb. Don't show off. Some people can't face the fact that they're out of their league... sometimes playing dumb, or doing things a little slowly, can make things go smoother.
I get the impression that this guy thinks of himself as "a coder". He wants to write code all day long. He wants to write the best code in the world. He wants to write code that's robust and tested and modern and secure and clever...
...and all for the sake of writing code that's interesting to him.
In the real world we have to write code that enables users to do things. That's it. Usually that also means writing code that's tested, secure, robust and so on, but for the sole purpose of making users better able to use our software.
We write things that are secure so that users don't lose their data/privacy/money.
We write things that are tested so that users don't face errors and bugs when they use our software.
We write things that are robust so that users can use our software whenever they need it without it falling over all the time.
If I were the manager of his team and he came to me suggesting "dependency injection of the database object into a class", I'd say no too without a good reason - the entire team would need to learn the new code, any code that relies on the old system would need refactoring, and so on. But if he came to me and said "If we used an injection pattern we'd be better able to switch to a failover server, we'd be able to add features faster, and it'd enable us to use a better security model", I would listen.
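For concreteness, here is a minimal sketch of the injection idea being pitched above: the class receives its database object instead of constructing one, so a failover connection, or a test double, can be swapped in without touching the class. All class and method names here are hypothetical, in Python for brevity:

```python
# Dependency injection sketch: the repository is handed its database
# object rather than creating it, so callers decide which one to use.

class UserRepository:
    def __init__(self, db):
        self.db = db  # injected: this class never knows which DB it got

    def count_users(self):
        return self.db.query("SELECT COUNT(*) FROM users")

class FakeDb:
    """A test double -- one concrete payoff of injection."""
    def query(self, sql):
        return 42

repo = UserRepository(FakeDb())
print(repo.count_users())  # 42 -- no real database needed for the test
```

Swapping `FakeDb` for a real connection (or a failover one) is a one-line change at the call site, which is exactly the selling point the hypothetical pitch above leads with.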
A big part of development is actually "sales" - if you can't sell your ideas to your team, you'll not get to implement them, even if they're the right way to do things.
When you apply for your next job be careful to ask about their coding practices. Ideally you should be able to find their lead devs on github or somesuch and view their code to get a sense of how good their work is.
Your challenge, if you choose to accept it, is to hack the social environment: get them all slowly to come to meetups and hackdays and stuff and engage with the rest of the industry. They'll hear the buzzwords and learn, and change will happen. It will take a while...
One of the reasons open source is good is that you associate your code with your name. You take personal responsibility for it. At a company where developers have come and gone over time, that doesn't happen.
Now, how you approach this matters. You can either become one of those who don't care, you can start to roll the stone uphill or you can seek greener pastures. It's all up to you. The people you work with sound like they don't care for things as long as it works. Do you want to become like that?
To the author of the article: move higher. Whenever you feel like you're in the top third of programmers at the place you work, it's time to move on.
Scala and Clojure are relatively new languages but they have pretty good web frameworks, which tend to be closer to RoR than the J2EE stuff.
C/C++ is indeed used very rarely - it's just too dangerous and unwieldy.
OP should definitely move away from PHP though.
Can we just stop pretending that a language cobbled together in 10 days has good parts except by chance?
In addition, the language wasn't developed in 10 days. Its first design was. And it borrowed from earlier languages.
LOOOL. You're joking, right? On which planet do you live?
And I could just as easily complain about you: why did you choose this aged stack for your first programming job in 2014? The stack is still very popular and there's nothing wrong with it, but there must be some reasons you chose it that I don't know.
But I won't complain.
Beware of structuring and refactoring too much, or writing low-ROI tests. You must find balance.
I feel it, OP.
Seriously though, how can you start generalizing and making assumptions about working as a professional programmer when you're not even a year into your first job?
Try your best to change things, but don't keep trying for so long that you burn out and become miserable. Some programming jobs exist just to separate the wheat from the chaff.
How was your interview process in comparison to the quality of people you're working with now?
(We're looking for engineers at my company if you're interested..)
Try this on php.net.
[Edit: I tried to download the stable php release via https. Try to guess how that worked out]
You know, I have a sneaking suspicion that every programmer thinks this way.
Which is also why changes are discouraged - no need to mess with something that works unless absolutely necessary.
Not the best policy, but if it works - it works :-).
Even if they don't have any formal coding standards, a company can ship good products as long as they have at least a few decent programmers.
Ad hoc / beta testing can fill in for unit testing. Bad coding style and low modularity won't necessarily make a project slower or worse, just harder to maintain. Lack of version control isn't necessarily a major problem if there are daily backups of the source code and each part of the system is developed by only one person.
No tech company or startup would ever consider things like that sane practices, but many companies do get away with it. Especially if what they're programming isn't the company's primary product or service, like in the case of a university.
You won't walk straight out of university into heaven; you'll have to learn to work in environments like this.
Blame your degree for not telling you how the real world works. Don't blame legacy code, because it's something that exists no matter how unicorny the world is.
Negotiating with managers and project owners is even more important than being able to write testable code.
So take your skills and apply them (iteratively) to your current job, or go work at a startup where everything is hip and trendy. But remember you're at the bottom of the ladder right now, and you're the one making things better for companies and yourself.
Also remember the golden rule:
No one wants to pay to refactor code that works into more beautiful code that works.
Why am I even here.
Yep, it's pretty common (and sad).
and no, it doesn't make you a better programmer.