I believe the decline of OpenStack is inherent in the type of technology. I don't believe that type of "infrastructure plumbing" software lends itself to high-speed, high-quality innovation in the "open source" model. Some reasons:
1) It's boring infrastructure software: Because it targets "data center" infrastructure, it's not glamorous or interesting enough to attract programmers who don't interact with that world. Other successful open source projects, like LAME MP3, ffmpeg, and the Linux kernel, have programmers who can experiment on laptops and ordinary users who can download and play with them.
2) No abundant source of high-quality code contributions: The enhancements you want contributed to OpenStack should come from companies running complex private cloud operations. They would battle-test the code, and that knowledge would trickle down into the open source contributions.
However, a company facing the prospect of a complex private cloud would more likely choose Amazon AWS or Google Cloud Platform instead of tackling OpenStack. This removes a subset of potential OpenStack contributors. OpenStack is so complex to run/customize/maintain that it requires building up an internal staff with skills similar to AWS/Google/Facebook's datacenter engineers.
If the company does decide to run OpenStack, it's likely they won't contribute any source code back. (Same as the Linux situation, where most users don't contribute source code.) This removes another subset of potential contributors.
Probably the best "private cloud operating systems" are developed by Google/Facebook/Amazon/MS for their own datacenters, and those 4 companies are not contributing to OpenStack. Wal-Mart has sophisticated datacenter operations and may be the largest OpenStack user, but their contributions to OpenStack do not match the new innovations AWS announces at re:Invent or Google announces at Cloud Next. Yes, Red Hat is a sponsor of OpenStack, but their internal cloud operations are not stressed with customer-facing challenges like AWS/Azure/GCP, nor do they manage IT complexity like Wal-Mart.
Sometimes, commercial versions of plumbing software do lose to open source projects. One example is Microsoft's Dryad "distributed computing" framework losing to open source Hadoop, even though MS had a 7-year head start. However, Hadoop+HDFS is still not as fast or reliable as Google's internal distributed MapReduce. (Maybe the idea of buying a hundred expensive Windows Server licenses to run a 100-node Dryad cluster contributed to the market ignoring it.)
Based on the history of how certain open source projects fall behind proprietary counterparts (such as OpenCL not matching NVIDIA CUDA's cutting-edge performance, or OpenGL not matching MS DirectX features), OpenStack seems to be on a similar trajectory of "not as good as the proprietary alternative".
From my perspective, companies that decide to run OpenStack do contribute code back, but the process of getting code accepted is so long and arduous that it's not hard to see how groups would just give up and merge upstream changes into their own fork.
Core reviewers have to accept your code, and they are on no timeline to do so. Months can go by without a review.
I think that's a slightly different situation, because OpenCL is just a spec with no real teeth. NVIDIA/MS have a pretty tight grip over the market in both hardware and software, and creating a CUDA-like spec does nothing meaningful if there is no alternative source for hardware and software. Everything, including drivers, libraries, compilers, etc., is closed, whether it's CUDA or OpenCL. So aside from some vague promise of portability, OpenCL doesn't really open anything.