I tested one of their preproduction LIDAR cameras just before it was released and it was great. Their SR305 was also really good for close-range, high-precision applications, but they scrapped that model earlier this year.
For us it was the SR305 we were interested in as it had the specs we needed at the right price point. We had a meeting with the RealSense team about a year and a half ago and they strongly hinted at the SR305 being EOL soon which made it very hard for us to commit to RealSense for the project we were working on.
I'm quite surprised about this news though. Back then they suggested that LIDAR and stereo cameras would be their focus going forward. I can only assume this must be a recent move given what we were told back then and that the L515 is relatively new.
Huge shame about their facial auth products too. My understanding is that these are used in quite a few products, but they also use structured light tech, so I assume that's why that decision was made.
It's worth noting that the L515 isn't replaceable by the D455 in many use cases. The D455 is good, but doesn't have the same kind of range and precision the L515 has. Its depth data is also far noisier, as is often the case with stereo depth cameras.
My company built several successful products using TI’s mmWave sensors. If you’d like more info, send your contact details to mmwave@mailinator.com; I will be checking that mailbox today and tomorrow.
I'd recommend the Stereolabs camera. It's a very versatile piece of hardware, quite mature in development from what I've been able to tell so far, and has a great SDK. Doesn't have lidar as far as I know but fills the same niche.
"Consumer-grade" reliability really should be the highest-grade reliability. Cables get cantilevered by standing desks, rolled over by office chairs, bitten by dogs, chewed by babies, shoved in pockets and skiied with, and sat on.
The people who designed USB-C were probably lazy people who sat in office chairs all day and never got out enough to see what real consumers do.
I break about one USB-C cable a week. That never happened with headphone jacks, barrel connectors, IEC power connectors, and the like -- those are real consumer-grade stuff, especially IEC.
Definitely shouldn’t be anywhere near highest reliability. Professional equipment has to stand up to a lot more use—like a camera you use every day, made with more metal and with thick rubberized grips. If I have a personal camera, I’ll take care not to drop it. If I’m a professional photographer, the question is how often it gets dropped. If we’re talking cables, “professional” grade (to me) means the connectors are going to get hundreds of cycles per year, and the cables might have to snake across the floor and get trodden on five days a week. In general, “pro” grade stuff can be bigger, bulkier, more expensive.
Then there’s stuff built for public spaces. ATMs, books in the public library, turnstiles at the metro, etc. All designed to be in close contact with people who either just don’t care or are actively trying to damage it.
TBH I don’t know what the grade below consumer-grade would be.
I would say that the grade below consumer grade is “functional prop” grade.
There’s a type of consumer electronics available on Wish/AliExpress/etc., which is designed for people who want to appear at a glance to be using a certain expensive device, for status-signalling reasons. The devices only have to be real enough to allow you to pretend to be using them without drawing suspicion, and only have to be rugged enough to last until the device being mimicked isn’t fashionable to have any more (at which point you get a new knock-off aping the new cool thing.)
Surprisingly, these devices do work! They wouldn't be convincing otherwise. (How can you seem to be using the best new phone if you have to pull out a different phone to check your text messages? How can you seem to be playing a Switch or a PS4 if it doesn't turn on and show pictures and sound on the screen/a TV?) It's just that they work as terribly as you can imagine, given that costs are optimized to meet the minimum needs of someone playing pretend and no more.
Military people just sit in canopies and hit buttons these days.
Consumers are the ones that will actually have cables bitten, chewed, stomped, sat on by unfit, heavy civilian asses, jammed into wheels of office chairs, rammed into by strangers' hips in coffee shops when your connector hangs off the edge, dunked in cereal, splattered by kitchen grease, smashed into rocks at Yosemite, and shoved into bags while plugged in after a slow TSA checkpoint and needing to sprint as fast as possible to a departure gate.
Very different set of ruggedness requirements. Most cables of the 80s and 90s were completely fine in these civilian environments.
I break Anker PowerLine+ cables all the time, not sure what's better.
The connector design itself is shit, and the rubberized housing of the connector isn't mechanically supported the way any self-respecting consumer-grade cable's should be; all the torque transfers to the PCB, and that's horrible design.
I use IKEA nylon cables and they work perfectly fine.
There's no reason the torque has to transfer to the PCB. That's just poor design. My phone's USB-C port is sleeved by the chassis for example, and so is the one in my GaN charger. There is zero torque transfer to the PCB, the chassis or frame takes all of the load.
You can't rely on the cable to cancel the torque anyway: rubber yields, and the normal force at the housing will pull the connector out slightly, still allowing torque to be applied to the connector. You have no choice but to brace either the female port, the male port, or both to the chassis in some way.
I go outside a lot. I think most likely the people who designed USB-C just sat in offices all day and never bothered to go ask skiers, hikers, farmers, horseback riders, and bikers what their day was like.
There comes a point where you have to take responsibility for your lifestyle, and recognize that nothing lasts forever. I too have never broken a USB-C cable, but I don’t move that cable much. When you move things a lot, they break. C'est la vie.
No, I disagree, I think consumer products should be designed to withstand consumer lifestyles.
The designers should take responsibility for it, and I pay them for that.
Headphone jacks worked great until they made goddamn headphones plug into USB-C as well and then they started breaking. Technology to withstand consumer lifestyles did exist until USB showed up.
I can totally commiserate with you on the headphones. Not even kidding, at one point I suspected that the powers that be got rid of 1/8 inch jacks just to electrocute political dissidents and enemies of the state at the push of a button. Is it really so far-fetched? Thunderbolt can push 3 amps (please correct me if I’m wrong - I’m going by the USB-C spec), and 0.1 amps is lethal. Bzzzt!
Anyway, I’m not trying to get down on you - I was just pointing out that nothing is intended to last forever. I once considered buying a military grade reinforced laptop that was way under specced just because I wanted to take it wherever I wanted and be able to bang it around.
The truth is that electronics are easily damaged. Take fiber optic cables for instance. If you bend them ever so slightly too far they will stop working, because the throughput medium is glass. It’s just the nature of the medium.
Arguably, we are the most advanced machines on this planet, and yet if we slip and hit our head, our machinery could easily be put out of commission. I think it’s fair to expect machinery to fail outside of its intended use case. Have you ever bent a piece of copper wire back and forth and felt it get warm before it breaks? That’s just the nature of the material. It is what it is, you know?
I’m not saying that you’re wrong to want or expect sturdier constructs - because they are often available for a price (using less abundant materials). I’m just saying that if you want that it’s probably not going to be consumer grade, and it’s not going to be cheap.
I’d disagree pretty strongly that the majority of laptop buyers[0] are going out there and doing things. If anything, we’re trending more in a sedentary direction as a species.
Also, as an active person myself I’ve never had a problem with USB-C; I don’t use my laptop out hiking and fishing, and I’m kind of confused who in the world is riding horses or bicycles and also fiddling with their consumer electronics at the same time.
And while maybe USB-C is less durable than headphone jacks, I can also say that wired headphones suck when you’re being active. I can’t even count how many times a door knob ripped out my earbuds and sent me into a fit of rage. I switched to Bluetooth and never looked back.
0 - Of course the world’s poor are active, but they’re also not the target consumers for USB-C devices.
> I’m kind of confused who in the world is riding horses or bicycles and also fiddling with their consumer electronics at the same time
Please don't dismiss things just because you don't do them.
I go on long bike rides with navigation on, and that drains the battery quickly, so the phone needs to be constantly plugged into a portable battery pack while in my pocket. This is extremely common.
Honestly, I have no idea what Intel was doing playing with sensors and robots. I think the previous CEO was trying to diversify, seeing his failures in the CPU business, and rather than playing to the company's strengths he chased whatever hot tech was out there.
Yeah, the previous CEO basically announced a number of big bets - AI, 5G, VR, Autonomous Driving, and one or two others (can't remember off the top of my head). They were all associated with a number of big acquisitions, most of which ended pretty embarrassingly (Altera for 5G, MobilEye for Autonomous driving, Nervana for AI, etc.)
It followed the same familiar pattern: screw up the company over a long, protracted acquisition, try to gain market share through bundling, all whilst driving the core talent out of the business by completely failing to invest. This is a big part of where Intel's problems came from. They looked to diversify rather than deliver.
How is Mobileye embarrassing? They seem to be overwhelmingly the most common choice among automotive OEMs, and their growth rate is really solid already.
They bought MobilEye because MobilEye and Intel had signed a deal with BMW to produce autonomous vehicles. To quote their press release:
> BMW Group, Intel, and Mobileye are joining forces to make self-driving vehicles and future mobility concepts become a reality. The three leaders from the automotive, technology and computer vision and machine learning industries are collaborating to bring solutions for highly and fully automated driving into series production by 2021.
Needless to say - after a few years of toiling away, it turned out that this was not going to happen the way Intel had hoped. In fact, if you go and look at BMW's website about autonomous driving, it's almost comical that their list of achievements stops the year before that deal was signed.
I'm not saying MobilEye aren't doing decent stuff (and they've certainly done a fantastic job of staying independent of Intel), but they've done nothing like what was expected of them. It's also important to remember that the bundling and strategic acquisitions Intel makes often mask how successful they really are in a business - it'll often turn out a design win is more about bundling a bunch of services together than actually beating out the competition.
Sure, this isn't living up to earlier estimates, but everyone in this field has been way behind early estimates, often by years. I think Mobileye revenue was up >100% last quarter? Obviously pandemic recovery helps, but even without it, 50+% seems likely.
If they can keep having failures like that, investors are going to be really happy with the outcome.
The entire strategic partnership that caused Intel to buy MobilEye disintegrated and now MobilEye are trying to resurrect it with a Chinese car manufacturer on their own.
> I think Mobileye revenue was up >100% last quarter?
As I say, they're doing fine. It's nothing like what they promised to do, and the company has restructured in a way that clearly indicates they'll be spun out in the next 24 months (like McAfee). But the premise of the "big bets" wasn't shuffling at the edges; it was meant to be the future of Intel, and that's clearly failed at this point.
RealSense camera technology was used in laptops for Windows Hello the same way as Intel NICs and Intel GPUs and Intel audio controllers and Intel SSDs were used by their traditional segment.
It'd be a bit silly for Intel to be successful in all of those spaces and not ask "and what would it look like to sell this part without bundling our CPU".
This isn’t the first time Intel has flirted with things outside their core competence. They dabbled with SSDs, mobile phone modems, etc. It’s just something Intel does. Why? Don’t know.
All companies need to flirt with things that might be good. Some of those things will turn out so great that you leave your original company behind, while others will be duds that you drop, and still others will be small side businesses that make you a bit of money in down times but otherwise are just barely worth doing. Intel used to be a memory company that flirted with making CPUs. I don't recall when they left the memory business (I think in the 1980s).
The sad thing is that Intel had a real advantage in RealSense... they had good processors in there (especially with the Movidius chips) that were well integrated (e.g. they didn't have to support a fancy SDK for all users - just the internal RealSense group). Doing the same 3D processing on an ARM or Intel CPU gets 1/10th the frame rate at much more power.
Thing is, my personal perspective (haha) is that single-shot stereo is a dead-end tech. There simply isn't enough information to infer depth in the difficult situations, range is severely limited, and every frame it basically re-calculates all of the information from the previous frame from scratch, making it wildly inefficient.
Of course, it works well as a tech demo and in controlled environments, and it's easy to understand and then brute-force with custom hardware. The problem is that it sucked the life out of better techniques which might actually work in the real world.
Can you elaborate on the benefits of what Intel built here as opposed to doing the processing on host machines? My company uses these in some products, but I never investigated the market or trade space for this type of device.
Yeah! There is a lot of processing to do the disparity calculation... it's basically doing lots of correlations with different amounts of x shift and finding the delta-x that has the highest correlation. This delta-x disparity then gets converted into a z distance.
So Intel made a custom processor for this that's really good at the correlations. The original one was the D4, and later they used their Movidius chip. Both have lots of multiplier-accumulate silicon, so they can do the computations in parallel. Their architectures are also set up for convolution (which re-uses a lot of data) rather than random access (like a CPU does), so they could feed these math engines without a lot of data transfer -- this makes it more power efficient. You could do something similar in an FPGA, but dedicated silicon is going to be faster, cheaper, and use less power.
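To make that concrete, here's a toy brute-force block-matching sketch in Python/NumPy (simplified and hypothetical - nothing like what the D4 silicon actually runs, just to show where all the multiply-accumulate work comes from):

    import numpy as np

    def disparity_map(left, right, max_disp=64, block=7):
        # For each pixel in the left image, try every x-shift up to
        # max_disp and keep the shift whose block matches best
        # (lowest sum of absolute differences).
        h, w = left.shape
        half = block // 2
        disp = np.zeros((h, w), dtype=np.float32)
        for y in range(half, h - half):
            for x in range(half + max_disp, w - half):
                patch = left[y-half:y+half+1, x-half:x+half+1].astype(np.int32)
                best_sad, best_d = np.inf, 0
                for d in range(max_disp):
                    cand = right[y-half:y+half+1, x-d-half:x-d+half+1].astype(np.int32)
                    sad = np.abs(patch - cand).sum()
                    if sad < best_sad:
                        best_sad, best_d = sad, d
                disp[y, x] = best_d
        return disp

    # disparity -> depth: z = f * B / d (focal length x baseline / disparity),
    # e.g. f = 640 px, B = 50 mm, d = 16 px  ->  z = 640 * 50 / 16 = 2000 mm

Even this naive version is on the order of H x W x max_disp x block^2 multiply-accumulates per frame - a very regular workload with heavy data reuse, which is exactly what convolution-style silicon is good at and general-purpose CPUs are not.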
Tech companies don't know what to do with all their money (although Intel perhaps a bit less so).
Google's core is search & advertising. But they are also in email, video streaming, messaging & video chat, cloud computing, mobile phones, gaming (Stadia), self-driving cars, drone delivery, quantum computing, and probably many other things that I can't think of.
Edit: Yes I know it’s technically Alphabet, but in practical terms, it’s Google
> Tech companies don't know what to do with all their money (although Intel perhaps a bit less so).
Lidar has high computational needs, which directly influence accuracy and both spatial and temporal resolution.
Post-processing steps also have high computational needs, such as inferring structure, doing coregistration, handling point density, etc.
Also, practical applications often depend on meeting constraints such as low power consumption and small size.
If a company such as Intel managed to leverage their know-how to provide lidar hardware that was competitive in both performance and price, then they might as well have developed a machine that prints money.
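Some back-of-the-envelope numbers on that (my assumptions: 1024x768 @ 30 fps, which I believe is the L515's advertised full-resolution depth mode, and 16-bit depth values):

    # Rough data-rate arithmetic for an L515-class lidar.
    width, height, fps = 1024, 768, 30       # assumed full-res depth mode
    points_per_sec = width * height * fps    # ~23.6 million points/s
    bytes_per_point = 2                      # assuming 16-bit depth values
    mb_per_sec = points_per_sec * bytes_per_point / 1e6
    print(f"{points_per_sec / 1e6:.1f} Mpts/s, ~{mb_per_sec:.0f} MB/s raw depth")

And every post-processing step (deprojection to XYZ, registration, normal estimation, ...) multiplies the per-point work on top of that, which is why power and size budgets get eaten so fast.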
I get SSDs and modems, this allows them to leverage their fabs. Maybe even CCDs. But 3D computational geometry, camera lenses and LIDAR? And drones? This seems a bit out-of-line to me.
I thought that already happened? I think they are just selling what they have now but stopped development. Keyence, FLIR, and a few other established machine vision companies have stereo systems, but nothing like RealSense, which was trying to make stereo more of a mainstream application.
They aren't officially EOLing the stereo cameras yet, but their assurances about how long they'll remain in production have been very hand-wavey.
Does FLIR actually have a stereo solution? Back when they were Point Grey they sold the Bumblebee, but all of the stereo processing was done on the host machine.
Yeah, I guess that's obsolete too. My experience with stereo is that there's so much work in calibration and ensuring you get reliable correlation results between regions from the two cameras that plug-and-play solutions are very hard.
I have a D435 and it's 100% plug and play. It works very reliably. You can recalibrate it by pointing it at a wall, but I found it wasn't ever necessary.
So what is this? In this thread and previous threads I read many comments about people being hit by Intel dropping the axe on products that are actually great. Why don’t they capitalise on these great products and actually diversify?
I have no direct knowledge, but Intel has had a history of struggling in everything but x86, until recently, when x86 itself has struggled.
Of course it's management, but there is a particular kind of management that spawns around highly technical, highly profitable industries where engineers originally took the company to prominence.
Basically, engineers in management will be displaced by sociopathic MBAs in management who will use Machiavellian social isolation combined with financing tricks.
In a way this goes back to a classic high school quote: popular people are popular because they spend all their time being popular. Nerds aren't popular because they spend their time learning.
So the "cool businesspeople" swoop in, isolate and eject the engineering-focused management with quick fix finance tricks and the usual middle management dirty pool.
Meanwhile the engineering-focused managers, invariably doing no-good-deed-goes-unpunished things like good engineering and good-for-the-company work... well, those people's days are numbered.
Then it devolves into a two-caste system: the engineers thrown enough money to keep the moneymaker ship on course and flowing, and the management caste partying on the decks.
This is not an environment conducive to moving into new markets.
In our area, I think the company I work for is the only one with a bunch of those sensors. It’s niche, and HN in general is niche, so I think they’re not selling millions of their LIDAR sensors. Still, for those that rely on them, it’s sad.
I have a D435 and it is pretty great. Easy-to-use SDK, high quality depth information, high frame rate. One of the few true USB-3 UVC cameras too. Unfortunately it's really expensive; combined with limited applications and heavy processing requirements, I wouldn't be surprised if they discontinue those too.
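For anyone who hasn't tried it, the SDK really is about this simple to get going with - a minimal pyrealsense2 sketch (the 640x480 @ 30 fps profile is just a common default, adjust for your camera):

    import pyrealsense2 as rs

    # Start a depth stream on the first connected RealSense camera.
    pipeline = rs.pipeline()
    config = rs.config()
    config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
    pipeline.start(config)

    try:
        frames = pipeline.wait_for_frames()
        depth = frames.get_depth_frame()
        # Distance in meters at the center pixel.
        print(depth.get_distance(320, 240))
    finally:
        pipeline.stop()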