The transition point was when Intel moved the DRAM controller and PCIe root complex onto the CPU die, merging in the northbridge and leaving the southbridge as the only separate part of the chipset. The disappearance of the Front Side Bus meant Intel platforms no longer had a good place for an integrated GPU other than on the CPU package itself, and it took years for Intel's iGPUs to catch up to Nvidia's 9400M.
In principle, Nvidia could have made chipsets for Intel's newer platforms, where the southbridge connects to the CPU over DMI (essentially four lanes of PCIe), but Intel locked third parties out of that market. Even if they hadn't, there wasn't much room for Nvidia to provide any significant advantage over Intel's own chipsets, except perhaps by undercutting some of Intel's product segmentation.
(On the AMD side, the DRAM controller moved onto the CPU starting in 2003, but there was still a separate northbridge providing AGP/PCIe, connected to the CPU by a relatively high-speed HyperTransport link. AMD dropped HyperTransport from their APUs starting in 2011, and from the rest of their desktop processors with the introduction of the Ryzen family.)
Nvidia's chipset line for Intel motherboards started with the Pentium 4. There may have been relationship issues between the two companies that prevented Nvidia from entering the Intel chipset market sooner using a derivative of their Xbox chipset, but none of that has anything to do with what ended the nForce chipsets for Intel.
I don't know whether locking out third-party chipsets was market savvy on Intel's part or a case of shooting themselves in the foot, weakening their own ecosystem.