Sure, but algorithms that genuinely need a linked structure for performance are few compared to algorithms that achieve the same or better asymptotic complexity with a hash table and a vector, or whose input sizes are small enough that a flat structure's worse asymptotics are outweighed by the large constant factors that pointer chasing imposes on the linked alternative.
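To make the constant-factor point concrete, here's a minimal sketch (the container choices and sizes are just for illustration) that sums the same values through a std::list and a std::vector. Both loops are O(n), but the contiguous vector is typically far faster because traversal is a stream of cache-friendly sequential reads rather than dependent pointer loads:

```cpp
#include <chrono>
#include <cstdio>
#include <list>
#include <vector>

int main() {
    const int n = 1'000'000;
    std::list<int> lst;
    std::vector<int> vec;
    for (int i = 0; i < n; ++i) { lst.push_back(i); vec.push_back(i); }

    auto time_sum = [](auto& container, const char* name) {
        auto start = std::chrono::steady_clock::now();
        long long sum = 0;
        for (int x : container) sum += x;  // identical O(n) loop for both
        auto us = std::chrono::duration_cast<std::chrono::microseconds>(
            std::chrono::steady_clock::now() - start).count();
        std::printf("%s: sum=%lld in %lld us\n", name, sum, (long long)us);
    };

    time_sum(lst, "list");    // pointer chasing: each node is a separate allocation
    time_sum(vec, "vector");  // contiguous memory: prefetcher-friendly reads
}
```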
If the algorithm calls for a balanced tree, you'd end up re-implementing one across a bunch of (linked) hash tables or over an array treated like a heap (sketched below), either explicitly or, if one does not know what they are doing, implicitly. In either case you would have no performance advantage.
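Concretely, "an array treated like a heap" usually means the standard implicit binary heap layout, where the tree structure is pure index arithmetic and no pointers are stored. A minimal sketch of that layout (the same idea std::priority_queue is built on, not production code):

```cpp
#include <cstddef>
#include <utility>
#include <vector>

// Implicit binary min-heap over a contiguous array: the "tree" links are
// index arithmetic (parent of i is (i - 1) / 2), so there is no pointer chasing.
class MinHeap {
    std::vector<int> a_;
public:
    void push(int v) {
        a_.push_back(v);
        for (std::size_t i = a_.size() - 1; i > 0;) {
            std::size_t p = (i - 1) / 2;           // parent index
            if (a_[p] <= a_[i]) break;
            std::swap(a_[p], a_[i]);               // sift up
            i = p;
        }
    }
    int pop() {                                    // precondition: !empty()
        int top = a_.front();
        a_.front() = a_.back();
        a_.pop_back();
        for (std::size_t i = 0; 2 * i + 1 < a_.size();) {
            std::size_t c = 2 * i + 1;             // left child index
            if (c + 1 < a_.size() && a_[c + 1] < a_[c]) ++c;
            if (a_[i] <= a_[c]) break;
            std::swap(a_[i], a_[c]);               // sift down
            i = c;
        }
        return top;
    }
    bool empty() const { return a_.empty(); }
};
```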
If you know what you are doing, you may well realize a performance advantage.
For example, LevelDB (conceptually based on Google's Bigtable design) looks a lot like a balanced tree in its access patterns, but is a lot faster than a variety of alternatives. And it is faster in part because sorted data is stored contiguously, in key order, instead of in a data structure that produces random access patterns.
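To illustrate the layout point only (this is a toy sketch of the sorted-run idea, not LevelDB's actual SSTable format or API): keep key/value pairs sorted in one contiguous array, so a point lookup is a binary search and a range scan is a purely sequential walk through memory, exactly where the advantage over a pointer-based tree shows up:

```cpp
#include <algorithm>
#include <cstdio>
#include <string>
#include <utility>
#include <vector>

// Toy sorted run: rows kept sorted by key in one contiguous vector.
struct SortedRun {
    std::vector<std::pair<std::string, std::string>> rows;  // sorted by key

    // Point lookup: O(log n) binary search over contiguous memory.
    const std::string* get(const std::string& key) const {
        auto it = std::lower_bound(rows.begin(), rows.end(), key,
            [](const auto& row, const std::string& k) { return row.first < k; });
        return (it != rows.end() && it->first == key) ? &it->second : nullptr;
    }

    // Range scan [lo, hi): a linear walk over adjacent elements, i.e. the
    // sequential access pattern a pointer-based tree can't give you.
    template <typename F>
    void scan(const std::string& lo, const std::string& hi, F visit) const {
        auto it = std::lower_bound(rows.begin(), rows.end(), lo,
            [](const auto& row, const std::string& k) { return row.first < k; });
        for (; it != rows.end() && it->first < hi; ++it) visit(it->first, it->second);
    }
};

int main() {
    SortedRun run{{{"a", "1"}, {"b", "2"}, {"c", "3"}, {"d", "4"}}};
    if (const std::string* v = run.get("b")) std::printf("b -> %s\n", v->c_str());
    run.scan("b", "d", [](const std::string& k, const std::string& v) {
        std::printf("%s=%s\n", k.c_str(), v.c_str());
    });
}
```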