I rarely meet a CS major who will accept that small things will stay small. They will frequently talk you into more complex data structures, assuring you that you are just too stupid to realize your problem will suddenly need to scale by several orders of magnitude. Do they teach you in CS-101 to interpret '+' as an exponential operator? It often feels that way.
Are they fresh graduates? It is very important to understand the workload distribution before optimizing anything. Even if small things sometimes do get large, optimizing for the small case can often yield large gains, since small inputs tend to occur very frequently. And complex data structures are usually worse in the small case...
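To make the small-case point concrete, here is a toy sketch in Python (my example, not the original poster's): for a handful of elements, a plain linear scan over a list is competitive with a hash-set lookup, because the set pays hashing overhead on every probe while the scan just walks a few contiguous slots.

```python
import timeit

# Hypothetical workload: membership tests on a tiny collection (N = 8).
small = list(range(8))
small_set = set(small)

def scan(x, xs=small):
    return x in xs          # O(n) linear scan over a list

def probe(x, s=small_set):
    return x in s           # O(1) expected hash lookup in a set

# Both give identical answers on this workload.
assert all(scan(i) == probe(i) for i in range(16))

# Timings will vary by machine; the gap between the two is typically
# small at this size, which is the whole point.
t_list = timeit.timeit(lambda: scan(7), number=100_000)
t_set = timeit.timeit(lambda: probe(7), number=100_000)
print(f"list scan: {t_list:.4f}s  set probe: {t_set:.4f}s")
```

The asymptotically "worse" structure is fine here; only profiling against the real workload distribution tells you whether the fancier one ever earns its constant-factor cost.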