I work in Automotive Manufacturing IT. Something left unsaid about IIoT is that when the business needs something, they go and buy it. There is no plan for IIoT; sometimes there isn't even a plan for IT.
All of your pretty analytics won't do jack if all you get back is -999. Things get complicated the closer you get to the metal, so systems are installed and configured to work, and then you back away slowly and don't breathe on it.
Eventually you have a plant full of things that look very similar from the front office, but once you dig in, all of the interesting stuff is completely different, and different in different ways.
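To make the -999 point concrete: before any analytics run, someone has to scrub the sentinel codes that downstream systems will happily average into nonsense. A minimal sketch in Python; the field names and sentinel values are hypothetical, every site has its own set:

    # Minimal sketch: drop sentinel/error codes from raw readings before
    # feeding them to analytics. Field names and sentinel values are
    # hypothetical examples; every site has its own.
    SENTINELS = {-999, -9999, 65535}

    def clean_readings(raw_rows):
        """Yield only rows whose fields carry real measurements."""
        for row in raw_rows:
            if row.get("temp_c") in SENTINELS or row.get("pressure_kpa") in SENTINELS:
                continue  # device offline or faulted; don't let -999 skew the averages
            yield row

    raw = [
        {"temp_c": 72.4, "pressure_kpa": 101.3},
        {"temp_c": -999, "pressure_kpa": -999},  # sensor dropout
    ]
    print(list(clean_readings(raw)))  # only the first row survives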
I currently work in foodservice equipment, and I'll concur with this.
The problem with IoT isn't the backend. There are plenty of companies that figured out the servers, ingestion, security, dashboarding, etc.
The problem is all the nodes. Customers aren't going to replace equipment with newer systems just because it has IoT capabilities, which means you're attempting to retrofit machinery with sensors and connectivity. Or else you wait until the major chain customer has refreshed every single piece of equipment in every store. Set your calendar for 7-10 years and check back in.
For retrofitting, every single case is different, it's custom, and 90% of the time it's not easy. And, no, slapping a Raspberry Pi to the side of a milkshake freezer isn't the answer. Some products like Helium are closer, but an array of open-collector GPIOs isn't the answer either.
The only way to win here is to be highly vertical and close to your customers, not only in business knowledge but in actual integration with the equipment makers. I certainly don't see GE, MS, AWS, Google or anyone else really making the commitment to that kind of stuff.
I cannot see any IoT revolution coming. Let's set aside the consumer market entirely, which is already dead, and focus on industrial applications.
Going from $10 to $1 a sensor didn't trigger any change. Not surprising, when yearly maintenance, battery replacement, shipping, installation and network fees run at circa $50 per sensor per year, and those economics didn't change. Big data didn't change anything either, as the cost of storing the info was already peanuts before.
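To spell out why the 10x hardware price drop doesn't move the needle, here's the arithmetic as a tiny Python sketch; the recurring figure is from the comment above, and the five-year lifetime is my assumption for illustration:

    # Back-of-the-envelope TCO per sensor. Recurring cost from the comment above;
    # the five-year deployment lifetime is an illustrative assumption.
    hardware_old = 10.0   # $ per sensor before the price drop
    hardware_new = 1.0    # $ per sensor after
    recurring = 50.0      # $ per sensor per year (maintenance, battery, install, network)
    years = 5

    tco_old = hardware_old + recurring * years   # 260.0
    tco_new = hardware_new + recurring * years   # 251.0
    print(f"old: ${tco_old:.0f}, new: ${tco_new:.0f}, "
          f"saving: {100 * (tco_old - tco_new) / tco_old:.1f}%")  # ~3.5%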
>> For retrofitting, every single case is different, it's custom, and 90% of the time it's not easy. And, no, slapping a Raspberry Pi to the side of a milkshake freezer isn't the answer. Some products like Helium are closer, but an array of open-collector GPIOs isn't the answer either.
>> 90% of the time it's not easy
Why?
And maybe the right approach is a marketplace?
Say I'm someone who has created a node for fridge X in my local town. Let me upload designs for hardware (or maybe standard programmable hardware, whatever makes sense), software, cables, documentation, maybe verification, and sell them to anybody, easily.
And when someone orders those designs, the marketplace ships them everything fully working, with simple installation instructions?
Quite a bit of industrial stuff is one-offs, where you have the only one(s) in existence. I've made devices for industrial use where the customer only needed 1-5 of them.
Exactly. If you've never debugged code with a logic analyzer, you can't understand what a hellacious bespoke PITA embedded realtime software can be. Personally, I like this stuff, but it's not everybody's cup of tea, and while your experience is transferable to the next problem, your work product often is not.
It's also not usually running on state-of-the-art chips and architectures. Think about hooking into an 8051 microcontroller (circa 1989) with on-chip RAM and EEPROM and no spare I/O.
For those types of devices, do you think a board built around something like a PSoC chip (a fully programmable analog + digital + MCU chip), or something improved along those lines, would give enough programmability to cover plenty of such devices and help implementers?
The hardware might be fine. The software is already harder: it needs adaptation, if not a rewrite from scratch. With sensors, it's fitting them to the unique mechanical circumstances of the one-off machine. And both of these you can't scale, can't reuse.
For that specific customer, off-the-shelf stuff would not last more than a day or two, as it had to survive -20C to 80C temperature swings in the space of a few minutes, along with the enclosure being rated to IP69K. The single-board computer we used was off the shelf, then modified to survive the conditions inside the enclosure (e.g. adding conformal coatings).
Making electronics survive those conditions is expensive and most people don't need that level of protection.
I have this feeling you're used to an open ecosystem (like web design) where everyone shares code and designs somewhat freely.
In my line of work, that's the complete opposite of what goes on. Ecosystems are closed for a number of reasons including patents, warranty, and liability.
Unless you work directly with the equipment maker and their circle of suppliers and customers, getting full approval along the way from every party (including the end customer who has no idea what's inside that black box), you're pretty much going to be on the outside.
>> Unless you work directly with the equipment maker and their circle of suppliers and customers
What does this mean? Being a certified field engineer, with explicit permission to do IoT retrofitting? And if so, why?
And assuming the equipment manufacturer is interested in a way of doing this, wouldn't they be interested in scaling it, and maybe in some profit and other benefits (we'll probably find some benefits along the way)?
Companies that manufacture and sell industrial equipment shy away from any third-party modifications or additions to their units.
First off, it will invalidate the warranty and cause many headaches for service personnel. Customers depend on that warranty and service network and many times it's part of the purchase contract.
Secondly, a lot of equipment is validated and approved by the customer before installation. For example, any piece of kitchen equipment that goes into a McDonald's has been validated and tested in their corporate kitchens before it's even allowed to approach an actual store. Modifications and retrofits are, again, not allowed without their permission. So unless you have a business relationship with McDonald's beforehand, you won't get the time of day from them.
Thirdly, there's no profit motive in it for the manufacturer. I've worked on systems where IoT subsystems can help with predictive maintenance (thereby minimizing emergency service calls and expensive downtime), but the cost of that system typically can't be passed on in the final price to the customer.
> All of your pretty analytics won't do jack if all you get back is -999. Things get complicated the closer you get to the metal, so systems are installed and configured to work, and then you back away slowly and don't breathe on it.
The risk aversion that some sysadmins feel towards changes is absolutely nothing compared to what floor managers want out of their technology. They want to never touch it, ever. They want it to work, and they never want to have to think about it. This is perhaps the polar opposite of what IIoT provides today, requiring constant configuration and security patches. And GE and others are still somehow wondering why the tech isn't taking off.
> They want to never touch it, ever. They want it to work, and they never want to have to think about it. This is perhaps the polar opposite of what IIoT provides today, requiring constant configuration and security patches.
Indeed, I would extend that to all technology in general today --- software that seems to need constant updating and is in a perpetual state of flux, leaving you wondering what else is going to get broken with the next update, etc. Anything web-based has a particularly high churn. It's actually quite unsettling even for someone who isn't running an industrial process but just wants to use their computer to get stuff done.
The lifetime of a lot of industrial process equipment is measured in decades, and maintenance/downtime is very infrequent. Continuous reliability is more important than latest and greatest.
Only moments ago, a new Windows 10 update killed my VirtualBox, forcing me to also upgrade VirtualBox to the newest available version. Fortunately the VB folks seem to care a lot about backwards compatibility - I'd be royally pissed if I had to also rebuild the guest system from scratch on a Sunday morning...
The point being, I see why constant updates are a necessity for anything Internet-connected - but it doesn't make me hate auto-updates any less.
> Continuous reliability is more important than latest and greatest.
For me, as a user and developer, this holds too, but what can one do. ¯\_(ツ)_/¯
They are prohibited from touching it because vendors of equipment that is part of the assembly line have contract clauses that can bankrupt them within hours if equipment fails and cannot be repaired or replaced within two hours, so the vendors in turn forbid any changes.
I think the "scaling" is not about the software platform or connectivity etc, but that it's hard enough to transfer what you learned from the sensor data of a gas powered turbine to the sensor data of a steam powered turbine, and of course even harder or impossible to the pump motor in an ice cream manufacturing plant.
And there are millions of different pumps, motors, gears and whistles, and their parameters and environment are so specific to the industry and the actual machine.
It does not scale to offer the same thing "horizontally" across all industries. So they don't offer generalized "industrial" IoT to power plants and ice cream manufacturers alike, but focus on a few industries they know well.
You can of course call yourself horizontal across all industries if your offering ends at the software platform or some other building block, but that is apparently not what industrial clients are buying. They buy a whole solution.
I know for a fact that they've built some utter shite IoT hardware as well. Can't give details, but it's a sensor that gives data which can be used to detect early on the failure of a GE $very_expensive_machine. One of our clients bought hundreds of sensors, and after two years only ~30% of the sensors are providing sensible data. The rest are suffering from constant power failures, or just showing constant data, or even measurements decreasing with time (which is not physically possible).
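Those failure modes (flatlined output, readings that only ever go down) are exactly the kind of thing a cheap sanity-check pass catches before the data reaches anyone's dashboard. A sketch in Python, with illustrative tolerances and made-up sample windows:

    # Sketch of per-sensor health checks for the failure modes described above.
    # Tolerances and window sizes are illustrative assumptions.
    def is_flatlined(samples, tolerance=1e-6):
        """True if the reading never moves more than `tolerance` over the window."""
        return max(samples) - min(samples) <= tolerance

    def is_only_decreasing(samples):
        """True if every sample is <= the previous one (implausible for this metric)."""
        return all(b <= a for a, b in zip(samples, samples[1:]))

    def classify(samples):
        if is_flatlined(samples):
            return "flatlined"
        if is_only_decreasing(samples):
            return "monotonically decreasing (physically implausible)"
        return "ok"

    print(classify([3.2, 3.2, 3.2, 3.2]))       # flatlined
    print(classify([100.0, 98.5, 97.0, 90.2]))  # monotonically decreasing
    print(classify([3.2, 3.4, 3.1, 3.5]))       # ok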
> Since launching its industrial IoT effort five years ago, GE has spent billions selling the internet of things to investors, analysts and customers.
> spent billions
Does that seem realistic to anyone? I'm sure they've spent a lot of money, but billions? Even if it's a single "billion" that's basically 1/25th of GE's total worth.
According to this site [1], GE spent $393 million on advertising in 2014. So I would say billions on any one product line over 5 years is improbable. Maybe half a billion is believable.
My company was founded 10 years back; we focus on CNC machines specifically. Recently, with one of our clients, we've been working with three leading IIoT solutions, some mentioned in the article (we feed the data from the machine into their platforms). Six months into the project it is really evident that domain knowledge is key to success. I believe there is an expectation that analytics, machine learning, big data etc. can be applied without that knowledge and results can still be delivered. The problem is that automation equipment has not been designed to deliver data at a scale where this technology can be applied successfully; domain knowledge is still required to make it a success. Shameless plug: www.cncdata.co.uk
Yours might be the most salient point: "Let's just sprinkle it with magic IoT dust, abracadabra, profit!!!" is evident cargo cult, but still manages to find traction.
Since we are talking about IoT and the market, I would like to ask you guys to share some thoughts.
I am actually focusing quite a lot on IoT in general.
I was lucky enough to land a quite interesting contract to manage a LoRaWAN architecture, I learned every trick of one open source implementation[1] (big shout out to brocaar who is the author) and now I am considering selling my expertise as a product.
So: providing, for a flat monthly fee, all my knowledge of running such a system, plus my expertise with LoRa in general.
Then I would segment the market by the number of devices a user needs:
1. Extremely cheap (50€/month), sharing resources with other tenants and best-effort email support, for fewer than 50 remote nodes
2. Quite expensive (5000€/month) for a highly available, scalable and isolated architecture (plus guaranteed support), for whoever needs to manage more nodes.
Do you believe this is a viable solution, or should I focus on something different?
You might be a bit early (especially in the USA), but I think helping people implement LoRa will be a massive market. (Bigger than setting up WiFi networks for companies back in the day.) It covers a unique niche between GSM and WiFi that just couldn't be served until now.
I think your segmentation is way too rough. There is a ton of space between the farm with a few hundred nodes to manage and Nest with a few million nodes.
People are just learning best practices to manage a few thousand web servers. Managing a few thousand brickable devices with KBs of RAM is more challenging.
You definitely have a point on the segmentation, however, there was a rationale behind it.
It seems to me quite unlikely that someone will buy only the software and not the hardware; I believe that most of these systems will be installed by some vertical company that sells both hardware and software. And I honestly don't have the knowledge, nor the capital, to set up a vertical company around, say, agriculture.
It comes down to understanding who my customers should be, and I'm afraid they won't be the final users of the system.
My customers are technical; I provide them with digital data that they then have to consume somehow, and that's not an easy task...
My thoughts are that no one will invest in hundreds or thousands of nodes until they have seen and bought and used a dozen - and the supplier of the prototype will almost certainly be the supplier for the rest.
For many companies the sales channel for this might be the IT agency (a Postlight type of company) - so I would suggest selling your expertise to those guys first.
Postlight is an NYC-based agency that does "initial digital MVPs" or similar - taking companies to a starting point for some new venture. They do a good-ish podcast - search iTunes.
It's not clear from the article what doesn't scale -- is it that businesses stretch themselves too thin or is it that making a platform that encompasses any and every use case is not a viable business model or is it that the software technology itself doesn't scale? Anyone know?
It sounds like it's trying to say that creating a walled-garden IIoT ecosystem doesn't work, and that IIoT products are so specialized and application specific that success in one part of the market doesn't usually translate into success in different areas of the IIoT market.
The whole point of equipping existing installations with IoT sensors is to do data science that lets you improve on some important metric, but for that you need to have deep domain insight as well as the technical insight to wrangle the data. So the customer knows the domain but lacks the technical resources, and the vendor often doesn't understand the domain. That's why you need to do vertical solutions in this market.
(I work on a vertical solution for IoT-driven data science in the facility management sector.)
> It's not clear from the article what doesn't scale…
This is the main problem with the article IMO.
Many IIoT technologies and platforms will be used across vastly different industries. Others will be industry-specific. As they always have, industry-specific vendors will integrate industry-specific elements with more "scalable" elements to provide complete solutions for specific customers.
OTOH, every complete IIoT solution will be bespoke. Sales, marketing, and integration will not be scalable.
> OTOH, every complete IIoT solution will be bespoke. Sales, marketing, and integration will not be scalable.
This is what Palantir learned, and what Oracle semi-avoided in their positioning:
Your scalable solution is worthless without last-mile integration, and that integration is always bespoke.
Is the IIoT pushing for interface standards, a la SQL?
That would seem the main way to scale and extract the kind of revenue they're looking for. Create a standard (and therefore market) for smaller fish to perform the integration, such that it lifts the data up a level and products can be built that operate against diverse sources.
>Your scalable solution is worthless without last-mile integration, and that integration is always bespoke.
Yep. And always has been.
Spreadsheets are a great metaphor for this ... or an artist's canvas. They're both capable of amazing things being done on/in them ... but only with the application of custom effort.
Branding and philosophy. SCADA is meant to do something in particular; IIoT is meant to be a cloud of devices to do whatever you need. But at the end of the day, you still have to do something in particular.
It's not a dumb question at all. It is THE question. The real answer is probably that the IIoT is just SCADA-over-IP. But no one will ever get rich giving an honest answer.
Companies such as Green Hills Software and Lynx Software Technologies have real-time OSes + middleware that have been used in all kinds of industrial deployments.
Maybe the problem is that General Electric is a mega-corporation instead of a company with tons of experience developing embedded software in many markets. It helps to really understand the domain and what users want when developing solutions. Also, only so much can be re-used. Each embedded project is partly a custom job with the amount of customization depending on the use case.
Samsara/Sansara has the connotation of meatspace in Sanskrit and derived languages. Buddhist philosophy says that life in meatspace is suffering, but the word isn't limited to that.
GE spent a shit-ton of money trying to jump into the smart grid business too. They had commercials on television, but never really any products that went anywhere in the market. GE's failures shouldn't be overly generalized.
In hardware, if too much of the product has to be customized or there is too much variation, you can only win by being Oracle, charging exorbitant amounts to a small set of customers. The market in industrial IoT is not nearly so concentrated, and so the best strategy is probably to pick some lucrative subfield and expand around that. Try to make products that cover everything and you will just get killed by unit economics. But there is no gigantic, homogenous market like "all humans with a disposable income" that the iPhone got to cover, so IoT will always be somewhat competitive.
If you have a specific system at hand, you can say whether it counts as IoT or not, because on the whole all IoT systems have some commonalities. But you cannot build a platform for it, because the parts are so very different. The most common element is that there are some programs that send data over a network, which is then collected and analysed. The general-purpose platform for that already exists and is called "the internet". But as you might have guessed: we already have that.
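And that common core really is tiny. A minimal sketch of the "program that sends data over a network" half, using only the Python standard library; the collector URL and payload fields are made up, any HTTP endpoint that accepts JSON would do:

    # Minimal sketch of an IoT node pushing a reading to a collector over plain HTTP.
    # The endpoint URL and payload fields are hypothetical.
    import json
    import urllib.request

    def push_reading(device_id, value):
        payload = json.dumps({"device": device_id, "value": value}).encode("utf-8")
        req = urllib.request.Request(
            "https://collector.example.com/ingest",  # hypothetical collector endpoint
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req, timeout=5) as resp:
            return resp.status  # 200 if the collector accepted the reading

    # push_reading("pump-07", 3.4)  # uncomment once a real collector exists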
I believe we'll see more interesting solutions in industrial IoT when 5G becomes more widespread and industries can start touching problems that lower-latency and higher-bandwidth wireless can tackle.
With LTE you get less than 5 ms latency, while 5G is promising less than 1 ms. However, I think those numbers are under ideal conditions. AFAIK, there's no latency guarantee. Since I'm not familiar with the landscape, I'm having a hard time understanding what kinds of problems are bottlenecked by a hypothetical ~4 ms.
The bandwidth improvements are huge, but there's no mention of guaranteed speeds, nor do they mention whether the quoted speeds apply to both upload and download. It's also important to remember that mobile data is incredibly expensive.
The problem should be the ability to sift through and recognize patterns in the massive amounts of data collected, in a manner that provides value to end users. The problem of data (IO) collection/remote collection is long solved. This is what happens when the focus is wrongly placed on how the data is collected rather than what to do with the data when you have it.