Right; the page https://the-algorithms.com/algorithm/fibonacci-numbers details a smart algorithm using matrix powers and then lists implementations for many languages. But the Haskell implementation is the dumb exponential one rather than the smart one shown.
Btw, another less cryptic way to generate the list of Fibonacci numbers is:
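Whatever snippet originally followed that sentence isn't shown here, so as a rough stand-in (sketched in Python rather than Haskell, for consistency with the other sketches below; the name `fibs` is mine), one straightforward way to produce the sequence is a small generator:

```python
from itertools import islice

def fibs():
    """Yield the Fibonacci numbers 0, 1, 1, 2, 3, 5, ... lazily."""
    a, b = 0, 1
    while True:
        yield a
        a, b = b, a + b

# Take the first ten numbers from the infinite generator.
print(list(islice(fibs(), 10)))  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
```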
There's a reasonable expectation that a repo named "cool algorithms" won't use extremely dumb exponential-time versions when linear-time ones exist. Also, in this case there's a logarithmic-time one that does repeated squaring of a 2x2 matrix.
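For concreteness, here is a minimal sketch of that logarithmic-time approach in Python (the helper names `mat_mul` and `fib` are mine, not the repo's or the linked page's): raise the matrix [[1, 1], [1, 0]] to the n-th power by repeated squaring, which takes only O(log n) matrix multiplications.

```python
def mat_mul(a, b):
    """Multiply two 2x2 matrices given as nested tuples ((p, q), (r, s))."""
    return (
        (a[0][0] * b[0][0] + a[0][1] * b[1][0], a[0][0] * b[0][1] + a[0][1] * b[1][1]),
        (a[1][0] * b[0][0] + a[1][1] * b[1][0], a[1][0] * b[0][1] + a[1][1] * b[1][1]),
    )

def fib(n):
    """Return F(n) via repeated squaring of [[1, 1], [1, 0]]: O(log n) matrix multiplies."""
    result = ((1, 0), (0, 1))   # identity matrix
    base = ((1, 1), (1, 0))
    while n > 0:
        if n & 1:
            result = mat_mul(result, base)
        base = mat_mul(base, base)
        n >>= 1
    # [[1, 1], [1, 0]]^n == [[F(n+1), F(n)], [F(n), F(n-1)]]
    return result[0][1]

print([fib(i) for i in range(10)])  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
```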
Arithmetic on bignums is not O(1); it depends on the size of the numbers. This is important when comparing, say, repeated matrix squaring against evaluating the closed form.
They both beat naive exponential time algos, of course.
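To make the bignum point concrete: F(n) has about 0.694 * n bits (log2 of the golden ratio), so the integers inside any exact method grow linearly with n and each multiplication costs more than O(1). A quick way to see the growth, using a throwaway linear-time helper (`fib_iter` is my own name):

```python
def fib_iter(n):
    """Plain linear-time Fibonacci, just to show how quickly the numbers grow."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

# F(n) has about 0.694 * n bits, so the bignums inside any exact method grow
# linearly with n, and multiplying them is far from constant-time.
for n in (1_000, 10_000, 100_000):
    print(n, fib_iter(n).bit_length())
```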
Yeah, I agree. I think it's because they pretty much allow contributions of algorithms from various people and the code-review standards are not super high. I recall at one point the BFS algorithm in Java actually used an ArrayList (roughly equivalent to std::vector) for the queue, which is silly because popping from the front is O(N) (see the Python sketch after this comment), so I had to submit a fix for that:
But I agree that basic low-hanging issues like this (and the exponential-time Fibonacci that another commenter pointed out) really prevent me from taking this repo very seriously.
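The queue pitfall is easy to demonstrate in Python terms (this is my own sketch, not the Java fix from that PR): using a plain list as a FIFO makes every pop from the front O(N), whereas collections.deque pops from the front in O(1).

```python
from collections import deque

def bfs_order(adj, start):
    """Return nodes of an adjacency-list graph in breadth-first order.

    Using a plain list with pop(0) as the queue would still be correct,
    but every dequeue would shift the whole list: O(N) per pop.
    deque.popleft() does the same job in O(1), which was the point of the fix.
    """
    seen = {start}
    queue = deque([start])
    order = []
    while queue:
        node = queue.popleft()
        order.append(node)
        for neighbour in adj.get(node, []):
            if neighbour not in seen:
                seen.add(neighbour)
                queue.append(neighbour)
    return order

adj = {"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": []}
print(bfs_order(adj, "a"))  # ['a', 'b', 'c', 'd']
```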
> Implementations are for learning purposes only. As they may be less efficient than the implementations in the Python standard library, use them at your discretion.
It looks like the AVL tree implementations take the lazy approach to updating balance factors, recursively descending to recompute the heights of the left and right subtrees over and over. The balance factor can be updated directly from the kind of rebalancing rotation being performed. I would not call that properly implemented, since it wastes a lot of cycles.
Once they're done with the rotation, they recompute the balance factor by calling something like height(left) and height(right), which recursively descend to measure the height on both sides. You don't need to do that: given the old balance factors and the rotation, you can compute the new ones directly.
(Some really bad implementations don't even store the old balance factor in the node and call a recursive height function to figure it out beforehand as well. I only skimmed the implementations I looked at, so I didn't check whether they were doing that too.)
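To illustrate the constant-time update (my own sketch, not code from the repo): with the convention bf = height(right) - height(left), the two nodes touched by a rotation can have their balance factors recomputed from the old values alone, with no recursive height() calls.

```python
class Node:
    """Minimal AVL node that stores a balance factor instead of heights."""
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None
        self.bf = 0  # height(right) - height(left)

def rotate_left(x):
    """Left-rotate around x; fix both balance factors in O(1), no height() calls."""
    y = x.right
    x.right = y.left
    y.left = x
    x.bf = x.bf - 1 - max(y.bf, 0)   # uses y's old balance factor
    y.bf = y.bf - 1 + min(x.bf, 0)   # uses x's new balance factor
    return y  # new subtree root

def rotate_right(y):
    """Right-rotate around y; fix both balance factors in O(1), no height() calls."""
    x = y.left
    y.left = x.right
    x.right = y
    y.bf = y.bf + 1 - min(x.bf, 0)   # uses x's old balance factor
    x.bf = x.bf + 1 + max(y.bf, 0)   # uses y's new balance factor
    return x  # new subtree root

# Quick sanity check: the right-leaning chain 1 -> 2 -> 3 becomes balanced.
a, b, c = Node(1), Node(2), Node(3)
a.right, b.right = b, c
a.bf, b.bf = 2, 1
root = rotate_left(a)
print(root.key, root.bf, a.bf)  # 2 0 0
```

The update rules fall out of comparing subtree heights before and after the rotation; the point is simply that nothing has to descend into the subtrees.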
Bummer that it’s missing so many languages for many common algorithms. My friend and I were just talking today about how you would do a (variable sized) sliding window in Scala in a functional way.
This needs more moderation. I searched for “Dijkstra” and got many separate results with only one implementation each, despite presumably being the same algorithm.
Common Lisp has a really tiny community despite showing up a lot on Hacker News (the shadow of Paul Graham talking it up years ago?). It's not even in the top 50 GitHub languages.
I think if you want to see an algorithm there in some language, you have to find out how to contribute that yourself. People starting projects like this don't know every language and don't have time.
So, will this finally provoke reform in the typical corporate software developer interview process? It’s been decades since using a library became the right answer to deploying a well-known data structure or algorithm.
The other day I was surprised to find that Python doesn’t come with linked lists and apparently there isn’t anything like an agreed upon `pip install linked-list` solution.
Curious why both of you need linked lists in Python? Are you doing a lot of list insertion operations or something, and allocations are bringing it to a crawl?
I'm quite tickled by the idea that you need a linked list because your allocator is spending too much time - traversing a linked list.
I thought one of the Advent of Code challenges needed a linked list, but other than that, I have never needed one.
I was just surprised, as I thought it was a basic structure, so it seemed like there ought to be a `from collections import linkedlist`, because Python feels like the kind of language where you can just do that without even consulting the docs.
Yeah for sure. I think your intuition is almost right. Python does have a "batteries included" philosophy, but it's also not usually clear or documented what data structures something is using. Like I presume that Python's sets are syntactic sugar over dictionaries where the values are None, but when I skimmed the docs just now I didn't find any mention of the implementation details.
Python doesn't want you thinking in terms of data structures, just in terms of functionality. Which I think is unfortunate and limiting.
Correct (actually a linked list of blocks[1]). It supports insertion and removal from both ends in O(1), which is very useful. However, it does not support insertion and removal from the middle (via an iterator) in O(1), so it's not a complete replacement for doubly-linked lists.
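Assuming this is about collections.deque, a quick sketch of what that means in practice: both ends are O(1), while positional insertion is supported but still costs O(n).

```python
from collections import deque

d = deque([2, 3, 4])
d.appendleft(1)   # O(1) at the left end
d.append(5)       # O(1) at the right end
d.popleft()       # O(1)
d.pop()           # O(1)
d.insert(2, 99)   # supported, but O(n), unlike splicing a node into a linked list
print(d)          # deque([2, 3, 99, 4])
```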
Linked lists are popular because they are useful in teaching basic data structures and complexity analysis. They can be compared to array lists.
But in real life the extra cost of allocating a node is way higher than simply copying over a block of memory, at least for most realistically sized workloads, which are small.
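A rough way to check that claim on your own machine (a sketch only; `LLNode`, `build_linked`, and `build_list` are hypothetical names, and the numbers will vary by interpreter and workload): compare building a contiguous Python list, which occasionally copies a block of memory as it grows, with building a hand-rolled linked list that allocates one node object per element.

```python
import timeit

class LLNode:
    """Hand-rolled singly linked list node: one object allocation per element."""
    __slots__ = ("value", "nxt")
    def __init__(self, value, nxt=None):
        self.value = value
        self.nxt = nxt

def build_linked(n):
    """Prepend n items to a linked list (n node allocations)."""
    head = None
    for i in range(n):
        head = LLNode(i, head)
    return head

def build_list(n):
    """Append n items to a contiguous list (amortised O(1), occasional block copies)."""
    out = []
    for i in range(n):
        out.append(i)
    return out

N = 10_000
print("linked:", timeit.timeit(lambda: build_linked(N), number=100))
print("list:  ", timeit.timeit(lambda: build_list(N), number=100))
```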