Why the indirection on the pricing page? Like, if a unit is 1,000 searches/month, why not just put that in the matrix?
Free = 10,000 searches/month.
Standard = $1 per 1,000 searches.
...
Pay as you go:
1,000 - 10,000 searches = Free
11,000 - 100,000 searches = $1.00 / 1,000 searches
The whole unit conversion really just adds a level of indirection that I don't understand. This was further confused by the units having different colored dots depending on the plan, making me think there were 3 different kinds of units.
A slider would be nice: let me slide it to what my search volume for a given month would be, and have it tell me how much that would cost, factoring in volume discounts.
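For what it's worth, the calculation behind such a slider is straightforward tiered billing. A minimal sketch (the tier boundaries and rates here are made up for illustration, only the first two come from the pricing excerpt above; Algolia's actual discount schedule differs):

```python
# Hypothetical tiered-pricing table: (cumulative searches covered, $ per 1,000).
# Only the first two tiers mirror the numbers quoted above; the third is invented
# to illustrate a volume discount.
TIERS = [
    (10_000, 0.0),       # first 10K searches/month free
    (100_000, 1.00),     # next 90K at $1.00 per 1,000 searches
    (1_000_000, 0.80),   # next 900K at an assumed discounted rate
]

def monthly_cost(searches: int) -> float:
    """Estimate the monthly bill for a given search volume."""
    cost, prev_cap = 0.0, 0
    for cap, rate_per_1k in TIERS:
        in_tier = max(0, min(searches, cap) - prev_cap)
        cost += in_tier / 1_000 * rate_per_1k
        prev_cap = cap
        if searches <= cap:
            break
    return cost
```

A slider UI would just call `monthly_cost()` on every drag and display the result.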
Additionally, this is a huge red flag:
> If you exceed your committed usage, there are overages that will be charged.
What are the overages?! Why does it not just slide back to pay-as-you-go pricing, like reservations for, say, EC2 work?
----
As an aside, we use Algolia to power some search features at Discord. This new pricing structure looks to be an order of magnitude more expensive (we fall under the "contact sales" usage here...). Luckily we're grandfathered in, or we'd have to consider putting a Cloudflare Worker in front of this and leveraging it to cache common hot queries to reduce cost.
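The hot-query caching idea amounts to keying a short-lived cache on the normalized query so repeated popular searches never hit the billable API. A minimal sketch in Python (in practice this would live in an edge worker; `backend_search` is a hypothetical stand-in for whatever calls the search API):

```python
import time

class QueryCache:
    """TTL cache in front of a billable search backend.

    `backend_search` is any callable taking a query string and returning
    results; in the scenario above it would wrap the Algolia client.
    """

    def __init__(self, backend_search, ttl_seconds=60):
        self.backend = backend_search
        self.ttl = ttl_seconds
        self.store = {}  # normalized query -> (expires_at, results)

    def search(self, query: str):
        normalized = query.strip().lower()
        hit = self.store.get(normalized)
        if hit and hit[0] > time.time():
            return hit[1]                    # cache hit: no billable call
        results = self.backend(normalized)   # cache miss: billable search
        self.store[normalized] = (time.time() + self.ttl, results)
        return results
```

Since search traffic is typically Zipf-distributed, even a tiny TTL on the top queries can cut billable volume substantially.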
Thanks for your feedback, we have some more work to do to make our pricing page clearer. We need to add a simulator to this page.
The reason for this indirection is that we still have to deal with data/records. It is unfortunately not possible to pay only for searches: imagine a use case that pushes 100 GB of data and performs only a few searches. Each unit gives access to 1K search requests and 1K records. The majority of users will effectively pay per search.
For SaaS use cases, we have a different pricing model where we price per GB with volume discounts.
There is a volume discount, so the more units you consume, the cheaper they are. And if you commit to a year, the volume discount applies to your yearly capacity, which gives you a significant discount. That is how you can have overages. Of course, if you stay on a month-to-month plan, there are no overages.
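My reading of the unit scheme described above (an assumption on my part, not official billing logic) is that one unit covers 1K search requests AND 1K records, so whichever dimension you consume more of determines how many units you need:

```python
import math

def units_needed(searches: int, records: int) -> int:
    """Units required when one unit covers 1K searches and 1K records.

    Assumption: the binding dimension (more searches or more records)
    sets the unit count; both are rounded up to the next thousand.
    """
    return max(math.ceil(searches / 1_000), math.ceil(records / 1_000))
```

This would explain why search-heavy users "pay per search" while a data-heavy, low-search user (the 100 GB example) still pays for record storage.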
I've been working on an open source alternative. It's dead simple to set up and run (including Raft-based clustering). It also integrates seamlessly with the Instantsearch.js library.
Hi there, I see you've posted your Typesense link multiple times. Just wanted to point out that your demo website has been broken for some days now. I keep getting "ERR_CERT_DATE_INVALID".
We (Streak) are in the same boat. Looks like we'd be paying approximately half a million dollars a month under their new pricing, which would be ~100x more than we are paying now. Haven't heard from our enterprise rep yet, but starting to get nervous...
Sounds like the new pricing is aimed at their ecommerce customers, given how much value they provide them; it doesn't seem to make sense anymore for SaaS use cases.
No worries, it will not be 100x the pricing. We will add a pricing calculator to simplify the projection.
Btw, for your use case we designed a different pricing that we call OEM pricing, which is simply based on the GB used and not the number of searches/records.
Also, you can keep your existing plan; we don't force anyone to move to the new pricing.
I don't know what the original prices are, but as someone who might like to use Algolia search instead of MVP search, the idea that pricing scales with our usage (e.g. free tier -> lowest paid tier) is very appealing. There are too many SaaS products where, when your usage jumps by 1 KB, suddenly they want $100/month instead of $0. Which is hard if you have a tight budget or want to keep a tight rein on your budget.
Not the GP, but I figure their point is as follows:
If I'm running an e-commerce website, I don't mind pay-per-search since those searches may turn into sales, so the cost is justified. My income scales with search count, and the Algolia price is part of user acquisition costs.
If I'm running a SaaS business, the search is a feature for customers who have already paid, so I don't see any further returns from the search being used. The more a client uses search, the less I'm profiting from having them as a client. They could potentially even cost me money to service them!
Interesting. This reminds me of Canada Post's address complete API. You can integrate it with your website or app to ensure the addresses users enter are valid within Canada and entered correctly. The pricing is around 5–10¢ per search [0]. At that price point, it only makes sense to use it for e-commerce if you are going to deliver physical goods to the customer and want to minimize the number of returned "address not found" packages. But if the pricing were lower, e.g. close to Google Maps' 0.3¢ per search [1], I imagine the list of potential uses would expand substantially.
Many local governments publish shapefiles (GeoJSON) that contain address information. You could parse out the addresses and have a fairly simple free option. Since this data is based on property taxes, these data sets aren't going to be missing parcels (or the tax collector would be missing potential revenue).
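Extracting the addresses from such a file is a one-pass walk over the features. A minimal sketch, assuming a GeoJSON FeatureCollection where each feature carries the address in a property (the property name, `"address"` here, varies by municipality and is an assumption):

```python
import json

def extract_addresses(geojson_text: str, prop: str = "address") -> list:
    """Pull normalized address strings out of a GeoJSON FeatureCollection.

    `prop` is the feature-property key holding the address; real municipal
    datasets name this differently, so it's a parameter here.
    """
    data = json.loads(geojson_text)
    return sorted({
        feature["properties"][prop].strip().upper()
        for feature in data.get("features", [])
        if feature.get("properties", {}).get(prop)
    })
```

Normalizing (strip/uppercase) and deduplicating up front makes the resulting set directly usable as a lookup table.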
That still leaves out a lot: unincorporated lands, First Nation reserves, small settlements where all the mail goes to a post office instead of individual buildings, etc.
Also, one of the main benefits of the Canada Post API is that it gives you the canonical address Canada Post uses to deliver packages. I don't think this always overlaps with the addresses municipalities use to assess taxes. For example, I can imagine that for an apartment building belonging to a real estate management company, there would be a single entry in the municipality's database (because there is only one owner who needs to pay taxes), but one entry per unit in the Canada Post database.
Yeah, my initial suggestion likely misses some corner cases. One case I am fairly confident it covers is (not sure if this is the correct term) mutual interest land ownership (aka condos). Each layer has the same shape, but they're stacked on top of one another; each layer is a billable portion of the interest (e.g., a single condo on the shared land).
A poor man's approach could be to query the publicly available data sets and use that data when there's a match. Then, only pay for queries that fall into the edge cases.
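That fallback approach can be sketched in a few lines. Here `open_dataset` and `paid_lookup` are hypothetical stand-ins for the free municipal data and the paid API (e.g. Canada Post), respectively:

```python
def validate_address(addr: str, open_dataset: dict, paid_lookup):
    """Check the free dataset first; fall back to the paid API on a miss.

    open_dataset: mapping of normalized address -> record (free, local).
    paid_lookup:  callable hitting the billable API (hypothetical).
    """
    normalized = addr.strip().upper()
    if normalized in open_dataset:      # free path: municipal/GeoJSON data
        return open_dataset[normalized]
    return paid_lookup(normalized)      # paid path: only for edge cases
```

If the free dataset covers most lookups, the paid API is only billed for the long tail.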
Does an e-commerce site search really need anything more than fuzzy matching on item names? It may not be perfect, but if it provides relevant results 90% of the time, you're paying a big premium for SaaS search that may only net you a small tail end of additional sales.
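The "good enough 90% of the time" baseline in question can be had from the standard library alone. A minimal sketch using `difflib` (the cutoff of 0.5 is an arbitrary choice for illustration):

```python
import difflib

def fuzzy_search(query: str, item_names: list, cutoff: float = 0.5) -> list:
    """Return up to 5 item names that fuzzily match the query.

    Uses difflib's sequence-similarity ratio; tolerant of typos but
    blind to synonyms, which is where hosted search engines add value.
    """
    return difflib.get_close_matches(query.lower(),
                                     [n.lower() for n in item_names],
                                     n=5, cutoff=cutoff)
```

This handles misspellings like "blutooth speaker" fine; what it can't do is rank by popularity, handle synonyms, or tune relevance per business metric.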
In particular for ecommerce sites, it pays to invest in search. A large percentage of transactions are initiated by search queries. You are correct that fuzzy matching will likely get you decent results. However, finding the right product 90% of the time versus 98% of the time makes a huge difference if you have a high volume of transactions. Synonyms and a better spelling model can massively improve your results and the resulting conversion.
But search relevancy is only one part of the equation. Think about a physical store and how the milk is usually in the back and a few high margin items are close to the counter or strategically placed. The same can be true for an ecommerce store. If the search engine has the ability to take business metrics like revenue and margins, or customer data like loyalty programs or brand affinity into account, you can much better optimise for your desired business outcome.
We just recently switched one of the largest ecommerce retailers in Australia over from Algolia to Sajari by doing the above and increasing their conversion rate by 10%.
This pricing change from Algolia makes it easier for low-volume search users or someone who is just getting started to use their enterprise features, but at the same time as the parent mentions, makes it an order of magnitude more expensive for mid-to-high volume users. Imo, this is a mismatch as there is already a trial in place for someone who's just getting started to try those features out.
For anyone interested in creating a search at scale, I would recommend checking us out at appbase.io (founder here :wave:) - we provide relevant search, analytics, and access control for search and are built transparently on top of Elasticsearch. It also doesn't hurt that we author some of the most popular open-source search UI libraries: https://github.com/appbaseio/reactivesearch
It is also less expensive and more accessible for mid/large volumes. We have a built-in volume discount in the pricing that significantly reduces the price at scale.
> What are the overages?! Why does it not just slide back to pay-as-you-go pricing, like reservations for, say, EC2 work?
I'd guess because they'd like to accurately model their compute requirements to try to get as low of a price as possible (e.g. via reserved EC2 instances).
If clients constantly and consistently shoot past what Algolia plans for, it might cause issues, or at the very least cause the company to spend more on on-demand instances than they intended.