There's a general class of problems that data structures solve: you have a collection of records, and you're trying to find [1] a record, or a collection of records, matching some criteria. In essence, you have a database of some kind, so it should not surprise you to learn that databases (and things that are databases but not normally thought of as such--filesystems, for example) rely very heavily on these more "exotic" data structures internally. Indeed, a database is basically a heavily-optimized data structure, usually tuned for the bursty nature of disk access [2]. And given that implementing these data structures tends to be a rich source of bugs, if you can offload the work to a database implementation, you probably should.
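To make that concrete, here's a minimal sketch (Python with SQLite; the table and column names are just made up for illustration) of what offloading the find-by-criteria work to a database looks like:

```python
import sqlite3

# Hypothetical "records" table; the point is that the criteria-based lookup
# is described declaratively and SQLite maintains the index structure
# (a B-tree internally) instead of us hand-rolling one.
db = sqlite3.connect(":memory:")  # or a file path for on-disk storage
db.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, name TEXT, score REAL)")
db.execute("CREATE INDEX idx_records_name ON records (name)")
db.executemany(
    "INSERT INTO records (name, score) VALUES (?, ?)",
    [("alice", 9.5), ("bob", 7.2), ("alice", 3.1)],
)

# "Find the records matching some criteria" becomes a query the engine
# plans against its own internal structures.
rows = db.execute(
    "SELECT id, score FROM records WHERE name = ?", ("alice",)
).fetchall()
print(rows)  # e.g. [(1, 9.5), (3, 3.1)]
```

SQLite maintains the B-tree behind that index for you, which is exactly the bug-prone part you'd rather not write yourself.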
I will also point out that when you view a data structure as an implementation that answers various queries, you can start building tools that automate the implementation of complex data structures given the queries you want to ask of them. I fully expect to see these tools start to become available over the next decade or so, given the progress in program synthesis I see in the academic literature.
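As a rough, hand-written illustration of the kind of code such a tool would generate (the query shapes and every name here are hypothetical, and it skips the hard parts like updates and correctness proofs), imagine declaring the queries you need and getting back a structure with matching indexes:

```python
from bisect import bisect_left, bisect_right

# Hypothetical "query spec": the lookups we want this structure to answer.
QUERIES = [("eq", "user_id"), ("range", "timestamp")]

class AutoIndexed:
    """Builds one index per declared query shape: a hash map for equality
    lookups, a sorted list for range scans."""
    def __init__(self, queries, records):
        self.records = records
        self.hash_idx = {}    # field -> {value: [record positions]}
        self.sorted_idx = {}  # field -> sorted list of (value, position)
        for kind, field in queries:
            if kind == "eq":
                idx = {}
                for i, r in enumerate(records):
                    idx.setdefault(r[field], []).append(i)
                self.hash_idx[field] = idx
            elif kind == "range":
                self.sorted_idx[field] = sorted(
                    (r[field], i) for i, r in enumerate(records))

    def find_eq(self, field, value):
        return [self.records[i] for i in self.hash_idx[field].get(value, [])]

    def find_range(self, field, lo, hi):
        keys = self.sorted_idx[field]
        start = bisect_left(keys, (lo,))
        stop = bisect_right(keys, (hi, float("inf")))
        return [self.records[i] for _, i in keys[start:stop]]

recs = [{"user_id": 1, "timestamp": 10},
        {"user_id": 2, "timestamp": 30},
        {"user_id": 1, "timestamp": 20}]
store = AutoIndexed(QUERIES, recs)
print(store.find_eq("user_id", 1))            # both user 1 records
print(store.find_range("timestamp", 15, 35))  # timestamps 20 and 30
```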
[1] Or insert, delete, or update. But usually you already have the record at that point, and you just want to update the data structure's representation of that record.
[2] You can apply the same techniques to cache access and even vector register usage for purely in-memory data structures. Same problem, just with different sizes of disparity.
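To make [2] slightly more concrete, here's a toy sketch of the blocking idea applied at cache-line rather than disk-page granularity. It's Python, so it won't actually show the cache effects (everything is a boxed object); treat the block size as purely illustrative:

```python
from bisect import bisect_left

BLOCK = 16  # hypothetical fanout; in C you'd size this to a 64-byte cache line

class BlockedSet:
    """Toy ordered set using the same blocking idea a B-tree uses for disk
    pages, shrunk to cache-line-sized blocks: a membership test touches one
    small separator array plus one contiguous block, rather than chasing
    log(n) pointers to scattered tree nodes."""
    def __init__(self, keys):
        keys = sorted(keys)
        self.blocks = [keys[i:i + BLOCK] for i in range(0, len(keys), BLOCK)]
        self.firsts = [b[0] for b in self.blocks]  # one separator key per block

    def contains(self, key):
        i = bisect_left(self.firsts, key)
        if i < len(self.firsts) and self.firsts[i] == key:
            return True
        if i == 0:
            return False  # key is smaller than everything stored
        block = self.blocks[i - 1]
        j = bisect_left(block, key)
        return j < len(block) and block[j] == key

s = BlockedSet(range(0, 1000, 3))
print(s.contains(999), s.contains(500))  # True False
```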
>I fully expect to see these tools start to become available over the next decade or so, given the current progress of program synthesis I see in the academic literature.
Interesting. I did know about program verification efforts (trying to prove that programs are correct), but had never heard of program synthesis (assuming it is somewhat different from code generation). Do you know of any examples of the kinds of areas it can be applied to?
The major field where it's already being applied is automating data conversion--building the program that can take "John Doe" and turn it into "<name>John</name> <surname>Doe</surname>" just from examples. You can do similar generalization to automate things like constructing highly repetitive graphics (e.g., you draw a few teeth of a gear and the program does the rest). Superoptimization is the other main prong, where you combine synthesis and verification to get faster code that is still correct.
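For a feel of the by-example case, here's a deliberately tiny sketch that handles only this one split-on-whitespace pattern--nothing like the generality of real systems (Excel's Flash Fill is the canonical shipping example):

```python
def synthesize(example_in, example_out):
    """Infer a template from a single input/output example: split the input
    on whitespace, locate each token in the output, and treat everything
    between the tokens as literal text. (One fixed generalization strategy;
    real synthesizers search over many candidates and rank them.)"""
    tokens = example_in.split()
    template, pos = [], 0
    for i, tok in enumerate(tokens):
        idx = example_out.index(tok, pos)
        template.append(example_out[pos:idx])  # literal chunk
        template.append(i)                     # placeholder: i-th input token
        pos = idx + len(tok)
    template.append(example_out[pos:])
    return template

def apply(template, new_in):
    tokens = new_in.split()
    return "".join(tokens[p] if isinstance(p, int) else p for p in template)

tmpl = synthesize("John Doe", "<name>John</name> <surname>Doe</surname>")
print(apply(tmpl, "Jane Roe"))  # <name>Jane</name> <surname>Roe</surname>
```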