
What I mean by "everyone on the team plays by the rules" is:

- all pointer access done via smart pointers

- std::vector instead of native arrays

- Use std::vector::at() instead of std::vector::operator[]() unless profiling shows a relevant performance increase

- std::string instead of char*

- References instead of pointers for mutable parameters

If you write C-style code in C++, which is of course possible, then C++'s safety advantages over C go out the window.

When I get to decide, the continuous integration build is always done with all warnings enabled, warnings treated as errors, and static analysis tools.

The developers can do the local build as they prefer, though.

std::vector isn't "safe." If you're using a std::vector::iterator and someone appends to the end of the vector, your iterator may be invalidated.

std::string isn't safe either. It's easy to create references to strings that no longer exist, for example by returning a const reference to a string and later deleting the string.

Smart pointers aren't safe either -- partly because of cycles, partly because of references to smart pointers, and partly because you inevitably have to convert them to something else to use them.

I've been using C++ for years and I've debugged all these problems.

> I've been using C++ for years and I've debugged all these problems.

Me too, my first C++ compiler was Turbo C++ 1.0.

They are a lot safer than the direct pointer manipulation idioms of C, which make it so easy to write insecure code that can explode at any moment.

What the STL offers might not be 100% as safe as what the Pascal family of languages offers, among others, but it is still a lot better than plain C idioms.

The problems you describe are quite easy to spot if a static analyzer is made part of the build.
