There is Rotten Tomatoes for summarizing movie reviews and Metacritic for video game reviews, why hasn't a review aggregator for other heavily reviewed areas like consumer electronics caught on?
Well, Consumer Reports is still kicking. I don't think an aggregate review service does much good, though.
Reviews for physical items are super inaccurate. The average consumer doesn't have the money to buy 10 laptops and compare them, so they buy one and hold a biased opinion about it. In 5 years, when the shitty battery and defective hinge become apparent, the laptop is already off the market and the user isn't interested in reviewing it.
Besides, a ton of reviews are fake these days. You just can't trust them anymore. And for many products, the manufacturer cheapens them over time without telling the public. So a review from 2 years ago may not reflect the quality of today.
The internet used to be filled with nerds and people interested enough in things to seek information on the web. Now the internet is a marketing tool, filled with corporate influence and mostly garbage.
I really miss the old days of niche forums, and gaming communities with their own servers and forums. You were actually able to get decent and reliable information; now the internet is mainstream.
> The internet used to be filled with nerds and people interested enough in things to seek information on the web.
and/or actually do proper testing and share their results in a consistent manner, but the effort and time is simply no longer worth it. Quality content gets buried under 10-second video clips and spam sites, and your content gets copied by large websites. 10 years ago I spent up to 200 hours testing one video card, having built identical systems to do proper apples-to-apples tests. But all that time invested, even then, was barely worth it from a commitment point of view. And if you publish such an article now, your viewership is limited to a very select few. The new generation simply doesn't know how a computer actually works, whereas 30 years ago you had to learn more than just the basics to be able to operate it.
The real world used to be filled with these people too. They'd have brick and mortar shops to sell quality products, and customers could step in and get expert advice on what to purchase within their budget.
> In 5 years when shitty battery and defective hinge become apparent, the laptop is already off the market and the user isn't interested in reviewing it.
Yep, and quality can vary quite substantially from one year to the next. I mostly like my 2015 MacBook Pro 15" with an AMD dGPU, but many people hated the 2016 MacBooks with the butterfly keyboards that quickly broke and touchbar that would freeze up.
It's especially an issue for products with bad model names. Is my experience with the MSI GE72MVR Apache Pro-080 going to be insightful for anyone considering a current MSI laptop? No idea. There were tons of MSI laptop models back then too with who knows what quality.
Automated lab tests can't perfectly represent real world tests, but I'd still like to see more of them. I remember seeing a machine fold and unfold the Samsung Galaxy Fold around 119,380 times before half the screen stopped working. [1] While it's a sample size of one and not a perfect representation of real world use, it's a lot better than nothing. I'd like to see similar tests for opening and closing laptops, plugging and unplugging cables into ports, pressing keys on a keyboard, etc. Some things can't be simulated, such as long term battery health, but there's a lot that could be tested but isn't in nearly all product reviews.
Something that'd be expensive but that I'd like to see is long-term automated tests to see how frequently a machine crashes. The machine should browse the web, play games, and use commonly used software: Adobe CC, Office 365, G Suite, Slack, Zoom, VLC, ffmpeg, AutoCAD, Blender, Unity, Unreal, and various Docker instances, IDEs, compilers, runtime environments, local servers and databases, etc. It should have automatic updates on and reboot only when required for an update, though sleep and wake-up should be tested regularly. Then, one could analyze how stable of a machine it is.
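The analysis at the end could be sketched simply. Assuming each workload run were logged as hours of runtime plus an exit code (all names here are hypothetical, just to illustrate the idea), crash frequency and mean time between crashes fall out directly:

```python
from dataclasses import dataclass
from typing import List


@dataclass
class RunRecord:
    hours: float     # wall-clock hours the workload ran before exiting
    exit_code: int   # 0 = clean exit, nonzero = treated as a crash


def crash_stats(records: List[RunRecord]) -> dict:
    """Summarize a stability log: crash count, total hours, mean hours between crashes."""
    crashes = sum(1 for r in records if r.exit_code != 0)
    total_hours = sum(r.hours for r in records)
    mean_between = total_hours / crashes if crashes else float("inf")
    return {
        "crashes": crashes,
        "total_hours": total_hours,
        "mean_hours_between_crashes": mean_between,
    }


# Example: three logged runs, one of which crashed.
log = [RunRecord(10.0, 0), RunRecord(5.0, 1), RunRecord(9.0, 0)]
stats = crash_stats(log)
```

Feeding this real data would mean wrapping each application launch in a supervisor that records runtime and exit code; the hard (and expensive) part is that harness, not the math.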
Personally, I'd rather buy a slightly older machine that is proven to be stable than a brand new machine with better performance but questionable stability. Unfortunately, neither is currently an option for me, and with OS and driver updates, stability and performance can worsen at any moment with little (convenient) recourse (or in the case of phones, often no recourse at all). If my work tools were available on Linux or worked through Wine/Proton/etc., I'd probably try an immutable OS like Fedora Kinoite just so I'd hopefully have more stability. I could automate stress testing the drivers after updates to make sure they were safe. Unfortunately, depending on your hardware, it may never pass driver stress tests even on a clean install (even on Windows, which the machine was designed for!), so I may have to exclude certain tests.
It's easy to review entities that all exist in a single database.
Consumer electronics products are far harder to find categorized in a single place.
You also have to contend with the fact that movies are essentially immutable - if you watch the same edition as someone else you are seeing an identical product. Consumer products might be damaged, might be counterfeit, or might be incorrectly classified. All of these makes it really hard to build a single source of truth for reviews.
The bigger issue though, IMO, is the duration of the experience. A movie lasts for 1-3 hours. That's the extent of the experience, and thus all reviews are fairly constrained. Consumer electronics can last for decades, so it's very hard to know when a review should be left, how long a person needs with a product to feel ready to leave a review, whether it should be a "long term" review, etc.
> You also have to contend with the fact that movies are essentially immutable - if you watch the same edition as someone else you are seeing an identical product. Consumer products might be damaged, might be counterfeit, or might be incorrectly classified. All of these makes it really hard to build a single source of truth for reviews.
There are also things like the vendor replacing parts but keeping the same product name/SKU, or technically giving it a different ID but hiding it so far down the marketing materials that you're unlikely to be able to find it. This comes up a lot in the aftermarket ROM communities, where you have to say things like "this image works on SomePhone 6a+ but only the 2022 model!". Granted, movies can also have silent edits that are presented as if they're the same thing (looking at you, Star Wars), but I would argue that it's less pervasive and there are fewer versions to keep track of.
Even on Rotten Tomatoes, reviews of TV series are usually inaccurate, because critics only watch the first couple of episodes before writing their review, while a lot of shows take time for the plot and characters to develop.
It's because online reviews are a done deal. A review site is one of the easiest things a novice programmer can build. However, there's little money in the endeavor unless it's rigged in one way or another. The profit is in taking money from companies begging you to ban "trolls", outright manipulating aggregate ratings and how reviews are sorted, or waiting for some relevant company with a conflict of interest to buy you out (Rotten Tomatoes itself being an example of this). There is, of course, also the issue of deterring bots.
The solution to that is to strip out the profit, Wikimedia style.
Building a review site is the easy part; that isn't what's stopping anyone from competing (like a programmer building blog software to compete with WordPress, or an online store / shopping cart service to compete with Shopify: building it is the easiest part). Building it does nothing of consequence. Acquiring hundreds of thousands of high-quality product reviews is extraordinarily difficult, and then you have to keep them coming in forever at that high quality.
That these things are easy to build at a basic level poses absolutely zero challenge to WordPress or Shopify et al. I'm not exaggerating: it threatens them not in the least, because it's meaningless. It doesn't matter if someone can build an Uber clone in N months; they won't be able to do the actual hard part of competing with Uber.
Rotten Tomatoes worked because there was a major regionalized media industry (newspapers, magazines, and TV stations) that sponsored film critics, which created a rather large pool of dedicated expert film reviewers, a significant fraction of whom were sort of inherently credible.
Are there enough dedicated consumer electronic reviewers to bootstrap the same kind of thing in that space?
And of course, another problem you have is that with the collapse of the local news industry, Rotten Tomatoes has lost enormous amounts of credibility.
Exactly. Even for movies, the Rotten Tomatoes we knew is gone. New movies don't get as many reviews, and some of them are clearly biased. A good Rotten Tomatoes score doesn't mean a whole lot today.
Because there's more money interested in subverting or eliminating such efforts than there is in seeing them run well.
Ideology won't even save you there: it doesn't matter if you want to run such a site with editorial integrity and all the trimmings. The folks who want you silenced outnumber you and have more resources than you do.
I've tried to build this in the past with Looria.com, where we aggregated and summarized reviews from the most trusted sources, e.g. Reddit: https://www.looria.com/reddit
A couple of challenges:
- Astroturfing is everywhere
- The data sources, especially social media, have become more protective of their data
- Monetizing this is super hard. As an aggregator, you're always just the intermediary.
Vetted.ai is working on something similar, and they raised $14M in 2022. They are likely facing similar challenges.
Consumer Reports, Wirecutter, and The Verge each do a decent job of this service. IDK if an aggregator adds much value unless there are more critics/reviews to aggregate.
I wonder how valuable crowdsourced electronics ratings would be.
Most consumer electronics reviews are absolutely dire: products reviewed in skin-deep ways by people who don't really have the chops and/or resources to review them in any kind of in-depth and meaningful way. I don't know if aggregating a bunch of crap reviews would yield more value or insight.
Whereas with movies, the ultimate test is just whether or not a person enjoyed the movie. Whether or not the reviewer is a knowledgeable cinephile, I think there is value in aggregating that.
There is also the issue of... relative performance and long term performance. To really decide if e.g. a hard drive is worth buying you'd need to benchmark it against its peers and perform longer term reliability tests. Reviewing a movie doesn't have those kinds of constraints.
I suspect part of it is because movies and other forms of media are more likely to be 'shared' experiences than consumer electronics are. Like okay, some systems are probably popular enough with the population that this sort of review setup could work well (iPhones/iPads/Apple devices, video game consoles, high-end Android devices, maybe certain leaders in their market niches), but a lot of the time the market would be spread too thin to provide a reliable set of reviews for every product. Like, how many toasters exist on Amazon right now? Apparently about 740 from a quick check.
Add that to however many other brands and models aren't listed there, and I'm not sure you'd find enough reviews to make such a site worth it. Especially not professional ones, since even the likes of Which don't review every single device ever released.
It might also be surprisingly hard to find said products in such a database if it existed too, since often only the model number is slightly different, with the core name being identical across variations. So I suspect it'd be significantly more challenging for users to use than Metacritic or Rotten Tomatoes, where looking up something like 'James Bond' or 'Star Wars' or 'Marvel Cinematic Universe' will get a bunch of easy to understand results.
Unlike video games and movies, there's just not much overlap to aggregate for any given consumer product because there are just so many devices in every category.
The best case scenario seems like it would be PC gaming gear, for example, since there's so much coverage. But consider "best gaming mouse": the first handful of Google links all cover different mice. I'm not sure what that UI would look like if you tried aggregating this. And I think your aggregator would feel like a shallow passthrough rather than anything independently useful, much like those low-effort made-for-AdSense spam sites.
If you're fishing for ideas, I think first-party curation is far more useful and in line with what people want. Consider how https://www.logicalincrements.com/ works for PC parts, something I use every time I want to buy something PC related.
I don't want to compare a bunch of options. I want someone to filter down the selection for me.
The Wirecutter served this purpose for me for a long time. But I also check the Amazon reviews.
With any kind of purchase, I do the same thing as I do with news - I survey all of the sources and look for the outliers and also the common threads. Then I form my own opinion.
Wirecutter isn't bad per se, but they fail to mention that many of the brands they review are really just OEM/ODM products. Nothing fundamentally wrong with that, but when UGreen/Baseus/even Anker have products that are virtually identical to other "random name" brands, you don't need to spend that much on the name.
A Samsung tablet vs. an iPad is an actual comparison. An "ODM but logo 1" vs. "ODM from the same factory but logo 2" isn't really what average consumers think it is.
Once upon a time, Amazon was that review site. But as someone else pointed out, if you've got money and want to subvert reviews, you can do it. Also Amazon is so flooded with weird junk brands nowadays it's hard to find anything. Hey, want to buy a XOOTUSX phone? Me neither.
I usually find myself reading RTINGS for common hardware like computer stuff. For more niche things, like piecing together a live-looping, with-sequencing synth/music performance setup, I watch a lot of YouTube videos.
Because you can't release the same movie under a different title at each theater to avoid comparison shopping. This is exactly what happens with consumer electronics and mattresses.
It's a lot easier to review movies than it is products. A movie is a finite experience, and you have a large frame of reference for comparisons.
At what point do you review the product? Your relationship with the product will change over time, and will probably skew negative as it gets older.
There's also the relative exposure issue. I've seen probably a thousand movies in my life but have had like 3 air conditioners. I'm barely equipped to say what I thought about Lady Bird.
There are categorical differences between movies and appliances.
For example, movies last about two hours, while even a terrible appliance is likely to last about two years, and when a movie is over the watcher doesn't suddenly have a significant problem in their life. When an appliance fails, people usually do. So the incentives to write a review are different.
Or to put it differently, nobody buys fifty microwaves a year while many many people watch fifty movies a year.
Because consumer electronics manufacturers have long, long understood the need to (mostly) obfuscate pricing and (as a bonus) make comparisons/reviews impossible.
The thing you bought at big-box retailer A sporting brand B and carrying model number C? The exact same thing is sold by retailer D under brand E and as model F.
Too many brands and products. Movies are a relatively rare product compared to consumer electronics, which, as a whole, rarely distinguish themselves except on price. For the exceptional few that actually have superior components, there are already sites that review them.
I guess it’s because it would be hard to keep it updated. A review about the movie Inception is relevant today and will be relevant in 10 years. A review about the iPhone N will only be relevant for a short period of time.
Hmmm, and yet early reviews of The Birth of a Nation called it "a triumph".
To be fair those reviews are of interest today, just as the evolution of reviews of that work since are of interest.
In the same manner, tech historians (there are such beasts) will find reviews of various releases of historic tech to be of interest, just as here on HN there's interest in old articles about the performance of BBC Micros, Acorn Archimedes machines, Commodore 64s, etc.
There are price aggregators like idealo.de and Prisjakt in the Nordics. They are not focused on reviews, but they do have some. They could be platforms that aggregate reviews as well. But it's hard to keep them credible.
Because the same movie doesn't flood the market with hundreds of different names. Similarly, the content of a movie doesn't change from year-to-year while the name stays the same.
In the UK there is an organisation called Which? (which.co.uk) that runs a battery of tests on all sorts of things, from phones, cars, and insurance to ovens and chairs. They then give ratings.
You have to pay a subscription, but that in theory keeps them on the straight and narrow, and at least avoids them being an advertising system for Amazon or some other large conglomerate.
I think you may be missing how many SKUs big-name products have these days. Brands like Sony and KitchenAid have SKUs per major retailer to avoid things like price matching, and in many cases it's unknown whether they use similar parts.
They only review the initial experience with the product. If a car has an engine that explodes in a few years, or an electronic device loses updates a year after purchase and turns into a brick, CR will not capture this.
What do you mean? I have been a member for a number of years and they ask me either yearly or semi-yearly if I still own the product, if I would still recommend it and about any new major purchases.
That's not exactly accurate. Have you seen the pages and pages of red and black circles? They track the reliability of vehicles over time (5-10 years) broken down by problem area (transmission, electrical, etc). That's where the real value is.
The variety of products is enormous, and is deliberately made complicated by the vendors - is SHP65CM5N different from SHP78CM5N? Do reviews of one apply to another? When a new model code pops up every year - does it invalidate the old reviews? If you watch "Oppenheimer", and your pal watches it - you can be sure you watched the same movie, and can compare the notes, and tell your other pal who didn't watch it yet whether you liked it or not, and it'd be useful. But with every store having their own model code set and those rotating all the time and having a thousand options, your experience may not be comparable to somebody else's. There are review sites and recommendation sites, like Consumer Reports, which kinda approach it, but there's no way to make it as systematic as for movies or video games, because there's too much variety.