Photos of your dog are not going to trigger it. Someone would need to engineer 30 photos of your dog, each tweaked to hash to a particular target value, then convince you to save them to your device and upload them to iCloud. And even then, some portion/abstraction of the dog photos would need to convince a human reviewer that they were looking at CSAM.
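To see why "tweaked to hash to a particular value" is even possible, here's a minimal sketch using an average hash (aHash) — a much simpler stand-in for NeuralHash, not Apple's actual algorithm. Perceptual hashes are built so that visually near-identical images collide, and that same tolerance is what an attacker nudges pixel values against to steer an image toward a target hash:

```python
# Toy perceptual "average hash" (aHash) on an 8x8 grayscale grid.
# This is a stand-in for illustration only, NOT NeuralHash: real attacks
# use gradient methods against the neural network, but the principle --
# small pixel tweaks that preserve or steer the hash -- is the same.

def average_hash(pixels):
    """pixels: 8x8 grid of grayscale values (0-255). Returns a 64-bit int:
    each bit is 1 if that pixel is brighter than the image's mean."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits

# Two "photos": the second is the first with a tiny uniform brightness tweak.
original = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
tweaked = [[min(255, v + 2) for v in row] for row in original]

# The tweak is invisible to the hash: both images produce the same 64 bits.
print(average_hash(original) == average_hash(tweaked))  # True
```

Because bits only flip when a pixel crosses the brightness average, an attacker has a lot of slack to perturb an image without changing its hash — or, run in reverse, to perturb an unrelated image until its hash matches a target.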
The more likely path to trouble is legal NSFW material that's been engineered.