1. Software. The phones run complex, low-assurance software written in unsafe languages on inherently-insecure architectures. A stream of attacks and leaks has come out of these. The model for high-assurance designs was either physical separation with a trusted chip mediating, or a separation kernel with user-mode virtualization of Android etc., so security-critical stuff ran outside that. There was strong mediation of inter-partition communications.
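The mediation idea above can be sketched in a few lines. This is a toy illustration only, not any real separation kernel's API: partition names, message types, and the whitelist are all hypothetical, and a real kernel enforces this below the OS, not in application code.

```python
# Toy reference monitor for inter-partition messages, in the spirit of the
# separation-kernel model: default-deny, explicit whitelist, one choke point.
# Partition names and message types here are made up for illustration.

ALLOWED_CHANNELS = {
    # (source partition, destination partition): permitted message types
    ("android_vm", "crypto"): {"sign_request"},
    ("crypto", "android_vm"): {"sign_response"},
}

def mediate(src: str, dst: str, msg_type: str, payload: bytes):
    """Forward a message only if (src, dst, msg_type) is whitelisted.
    Anything not explicitly allowed is rejected (default-deny)."""
    allowed = ALLOWED_CHANNELS.get((src, dst), set())
    if msg_type not in allowed:
        raise PermissionError(f"denied: {src} -> {dst} ({msg_type})")
    return (dst, msg_type, payload)

# The untrusted Android partition may request a signature...
print(mediate("android_vm", "crypto", "sign_request", b"data"))
# ...but cannot, say, ask the crypto partition to export key material:
# mediate("android_vm", "crypto", "export_key", b"") would raise.
```

The point is the shape, not the code: every cross-partition message passes one small, auditable check, so compromising the Android side doesn't grant arbitrary reach into the rest of the system.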
2. Firmware of any chip in the system, especially boot firmware. These are privileged, often thrown together even more carelessly than the rest, and might survive a reinstall of other components.
3. Baseband standards. Security engineer Clive Robinson has detailed many times on Schneier's blog the long history between intelligence services (mainly British) and carriers, with the former wielding influence on standards. Some aspects of cellular stacks were designed outright to facilitate their activities. On top of that, the baseband has to be certified against such requirements, which gives extra leverage, since no certification means lost sales.
4. Baseband software. This is the one you hear about most: attackers hack the baseband software, then use it to hack the rest of your phone.
5. Baseband hardware. One can disguise a flaw here as leftover debugging logic or whatever. Additionally, the baseband has RF capabilities that we predicted could be used in TEMPEST-style attacks on other chips. Not sure if that has happened yet.
6. The main SoC is complex, without much security. It might be subverted or attacked; even absent subversion, it might just be a low-quality counterfeit. Additionally, the MMU or IOMMU might fail due to errata. The old MULTICS evaluation showed that sometimes one can just keep accessing the same thing all day, waiting for a logic- or timing-related failure to allow access. They got in. More complex hardware might have similar weaknesses. I know Intel's does, and Intel fights efforts to get specifics.
7. Mixed-signal design ends up in a lot of modern stuff, including mobile SoCs. A hardware guru who taught me ASIC issues said he'd split his security functions (or trade secrets) between digital and analog circuits so that analog effects were critical to operation. This slowed reverse engineering: his digital customers couldn't even see the analog circuits with digital tools, much less understand them. He regularly encountered malicious, or at least deceptive, behavior in third-party I.P. that used similar mixed-signal tricks. I've speculated before about putting a backdoor in the analog circuits that modulates the power draw to enhance power-analysis attacks. There's lots of potential for mixed-signal attacks, and it's little explored.
8. Peripheral hardware can be subverted or counterfeit, or have problems similar to the above. Look at a smartphone teardown sometime to be amazed at how many chips are in it. There's analog circuitry and RF in there as well.
9. EMSEC. The phone itself is often an antenna, from my understanding. There are passive and active EMSEC attacks that can extract keys, etc. Now, you might say "Might as well record audio if they're that close." Nah: in many designs they get the master secret, and then they have everything. EMSEC issues here were serious in the past: old STU-IIIs were considered compromised (master key leaked) if certain cellphones got within about 20 feet of them, because the cell signals forced secrets to leak. I can't know how much this problem has gotten better or worse with modern designs.
10. Remote update. If your stack supports it, this is an obvious attack vector if the carrier is malicious or compelled to be.
11. Apps themselves, if the store review, permission model, and/or architecture is weak. It's debatable how weak, except for architecture: definitely weak. Again, better designs in niche markets used separation kernels, with each app split between untrusted stuff (incl. GUI) inside the OS and a security-critical part outside it. That would require extra infrastructure and tooling for mainstream use, though, plus adoption by providers. I'm not really seeing either from mainstream providers. ;)
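The app split in point 11 is easy to sketch. This is a hypothetical illustration, not any shipping design: HMAC stands in for whatever crypto a real product would use, and the two classes stand in for what would really be separate partitions under a separation kernel.

```python
# Hypothetical sketch of splitting one app across a trust boundary:
# the untrusted side (OS/GUI) never touches the key; the trusted
# partition answers only one narrow kind of request.
import hashlib
import hmac

class TrustedPartition:
    """Would run outside the OS; owns the secret, exposes one operation."""
    def __init__(self, key: bytes):
        self._key = key  # never crosses the partition boundary

    def sign(self, message: bytes) -> bytes:
        return hmac.new(self._key, message, hashlib.sha256).digest()

class UntrustedApp:
    """GUI and app logic inside the OS; can request signatures, not keys."""
    def __init__(self, trusted: TrustedPartition):
        self._trusted = trusted

    def send(self, message: bytes) -> tuple[bytes, bytes]:
        return message, self._trusted.sign(message)

app = UntrustedApp(TrustedPartition(b"device-master-secret"))
msg, tag = app.send(b"hello")
```

The design point: compromising the untrusted half gets an attacker the ability to request signatures while it's running, but never the key itself, which is a much smaller loss than the flat single-process app model used in mainstream stores.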
That's just off the top of my head from prior work trying to secure mobile devices or hardware. My mobile solution, developed quite some time ago, fit in a suitcase due to the physical-separation and interface requirements. My last attempt to put it in a phone still needed a trusted keyboard and enough chips that I designed it (never implemented) around a Nokia 9000 Communicator. Something with modern functions and form factor that deals with all of the above? Good luck...
All smartphones are insecure. Even the "secure" ones. I've seen good ideas and proposals, but no secure-ish design is implemented outside maybe Type 1 stuff like the Sectera Edge. Even it cheats, from what I can tell, with physical separation and robust firmware. It's also huge, thanks to EMSEC and milspec requirements. A secure phone will look more like that or the Nokia. You see a slim little Blackphone, iPhone, or whatever offered to you? Point at a random stranger and suggest they might be the sucker the sales rep was looking for.
Don't trust any of them. Ditch your mobile, or at least make sure the battery is removable. Don't have anything mobile-enabled in your PC. Avoid wireless in general unless it's infrared, and even then it should be off by default.