You're right, but the divide isn't small vs. large. OO works well for artificial situations (classic case: UI frameworks) and poorly for real-world situations (classic case: person isa contractor vs. person isa employee).
As soon as you start modeling real-world situations (like Bill being an employee while also working a late shift as a contractor), you're in the world of complex stuff that single inheritance, multiple inheritance, or anything short of insanely complex relationships requiring tens or hundreds of underlying entities to enact just can't handle.
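To make that concrete: the usual escape hatch is to stop subclassing entirely and attach roles by composition (the "party/role" pattern). Here's a minimal TypeScript sketch of that idea; all the names (Person, Role, EmploymentRole, ContractRole) are illustrative, not from any particular framework:

    // Model "employee" and "contractor" as roles a person holds,
    // rather than as subclasses of Person. This lets Bill hold
    // both roles at once, which inheritance can't express cleanly.

    interface Role {
      readonly kind: string;
    }

    class EmploymentRole implements Role {
      readonly kind = "employee";
      constructor(public employer: string, public salary: number) {}
    }

    class ContractRole implements Role {
      readonly kind = "contractor";
      constructor(public client: string, public shift: string) {}
    }

    class Person {
      private roles: Role[] = [];
      constructor(public name: string) {}

      addRole(role: Role): void {
        this.roles.push(role);
      }

      rolesOfKind(kind: string): Role[] {
        return this.roles.filter((r) => r.kind === kind);
      }
    }

    // Bill is an employee by day and a contractor on the late shift:
    const bill = new Person("Bill");
    bill.addRole(new EmploymentRole("Acme Corp", 60000));
    bill.addRole(new ContractRole("Acme Corp", "late"));

Which is exactly the point: the moment the real world shows up, the "Employee extends Person" examples OO was sold on get thrown out and replaced with plumbing like this.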
IMO one reason for these fruitless ideological arguments (is OO good? is it bad?) is that OO was originally sold using these real-world examples. Which anyone who's worked on a real-world system knows is BS.