That said, IMHO, there is no "clean" C++ code. There are C++ codebases that use different styles, and their quality is more or less context-sensitive.
Personally I felt the best tutorials for C++ were actually two other programming languages: Scheme and F#.
Scheme drove home the idea that it's totally fine to model things based on the shape of the data, and that you don't need to create type-based abstractions around everything.
F# then demonstrated how a language with a type system is supposed to work.
The problem with C++ is that the language is so verbose that unless you have an abbreviated model in your head of how a type-based language can be used to solve problems well, you will get lost in trivial C++ minutiae.
So, personally, I generally ask "Am I solving a Scheme-like problem or a Standard ML-like problem?" and then try to apply C++ as simply as possible.
Several academics have built careers out of patching C++/Java/C# with concepts that make them less fragile in a collaborative context.
In my opinion design patterns are not a fundamental concept; rather, they provide a common vocabulary for collaboration around idioms that are part of the language and more or less invisible in e.g. Scheme or F#. But if one is diving into C++ it's probably worth being familiar with them.
If your C++ code is "so verbose", you are Doing It Wrong, most likely by doing it the old way. C++98 was verbose.
So: verbose in terms of the spec that describes what the code actually does, and in the mental chatter I have with myself while writing it ("should I return a ref, a smart_ptr, a value, or should I use move semantics..."), whereas in F# it's just a reference and the garbage collector will deal with it. So I can skip the mental chatter entirely.
a) displays a way to put together a non-trivial C++ application
b) displays several extremely useful graphics-related algorithms and patterns
c) is documented at a level of quality few publicly available codebases match - so get the book first and then start reading the sources
Interesting - can you talk a little bit about why those specific design choices are questionable from a modern C++ POV? How would you do it differently, given the problem & domain PBRT covers?
I ask because my mental "sweet spot" in terms of understanding and writing C++ is approximately C++11. It's pretty easy for me to go into symbol shock when I see some of the latest stuff in C++17 and C++20... and I suspect I'm not the only person that happens to.
A much better approach is to allocate large chunks of memory at one time and run through the arrays of floats directly. Instead of tracing and shading one path ray all the way through - finding out where it hits, running the BRDF, looking up textures, casting another ray, etc. - it is much better to do each stage in large chunks.
Going from data structures holding lots of virtual pointers to something like I described above can be a substantial improvement in speed and give a lot of clarity at the same time.
Many times programs with a lot of inheritance end up with their execution dictated by types calling other types' functions, which makes the control flow implicit and buried rather than linear in a top-level function. It becomes similar to callback hell, where you are trying to track down how the program gets to a certain place. Often it takes adding breakpoints and walking back through the call stack to reverse-engineer it, rather than reading a single top-level function with a clear sequence of steps.