Why Are Ultrasound Machines So Expensive? (maori.geek.nz)
322 points by grahar64 on Dec 21, 2016 | 182 comments



A few days ago we visited a company that makes avionics for small private planes. I looked at the hardware: a simple touch screen and a tiny Linux computer with the usual sensor connections. I thought I could build this myself in a weekend for maybe $300 tops, which is an order of magnitude lower than their selling price. However, what I learned was that the thing is expensive not because the hardware or software is expensive, but because of the QA and certification process. Regulations require avionics to work in all kinds of weather, across a wide range of temperatures and pressures, while still being accurate, to have a million hours of run-time testing without any crashes, and to survive extreme shocks, vibrations and G-forces. Someone is, after all, betting their life on your device. Now suddenly you have to care about the strength of every solder joint, the specs of every transistor and the reliability of every screw. Add on to this the customer support, marketing, warranties, legal expenses, returns and other typical overhead of production. If I had to make a device compliant with all of this, it wouldn't come cheap. Making a hobby device to demo to school kids is quite different from making a device that helps people make life-and-death decisions.


Same is true of software though.

A company I worked for had a new website built (their big customer facing domain). It was just a Drupal theme, responsive, absolutely nothing fancy.

An old friend of mine guessed $20k, maybe $30k tops if you were being crazy, and I thought even that was high. Back in our consulting days we probably would have quoted lower.

They brought on a company for $1 mil; it ended up 6 months late and at $1.6 mil.

Gotta love those project managers and "status update meetings" to burn the cash.


The issue is support.

Most companies won't buy unless you can guarantee a support contract for at least two years with new features being included in it. This is why software prices just skyrocket easily.


There was no support; that was to be handled by the in-house IT dept. 60-day "warranty".


Recounting my experiences from the biggest customer (the federal government), everything costs a million plus because you have to have so many bodies to support it. Aside from the obviously unnecessary committee meetings in which a panel of stakeholders spend four hours debating the minutiae of a website (font, colors, stock photo selection, the usual), you have to have an "architect" to navigate the bureaucratic process of getting DNS entries, database tables and backup systems hooked up because you can't possibly bring your own.

You have to have at least one developer who makes sure that its authentication uses CAC, X.509 or at least Active Directory as its authentication source. You have to have at least one specialist developer who ensures that logs and audit trails are DCAA compliant.

You have to have at least one developer to ensure that the website is Section 508 compliant.

So on and so forth, and that's not even counting however many developers you need to actually build the basic functionality they've hired you to build.

On top of that, you need to have a proposals team capable of submitting a proposal to an FBO request for proposal, a contracts lawyer to ensure that you're meeting the contract terms with the technical solution you're providing, an accounts payable & accounts receivable team to team up with the contracts lawyer to figure out how to prove to the government that you actually delivered the module or widget they're insisting is deficient so that they can slough off paying you for as long as possible because they're in continuing resolution, a veteran / minority / woman / disadvantaged person with ownership stake to be eligible for program 8A opportunities, a project manager to actually manage the work, a program manager to try and upsell the opportunity and keep their ear to the ground to get the insider details on potentially up and coming work that they can pass to the proposals team, and a subcontractor with some kind of past qualifications in government work that you can leverage to ensure the government that you are, by proxy, also qualified to do that kind of work for that kind of government agency.

And of course, there are the myriad competing interests of the various stakeholders, a Herculean degree of risk aversion, and the fact that everybody wants you to fail. That includes the customer, who didn't sign on for a risky project that will likely fail and so feels duty-bound to ensure that it does fail, and especially your "support team" -- the people you have to ask for DNS entries, database tables, etc., who are actually competitors of yours that also bid on the work but failed to win it. They just want you to fail so the contract can get re-bid and they have another shot at winning it.

Of course, that's not the worst part - the worst part is when a project appears likely to succeed despite all the attempts at ensuring it doesn't. Then everybody and their mom becomes an "active" stakeholder in your project, because it's a chance to hitch their wagon to a success they can put on their resumes to get promoted... they want you to customize your project to suit their needs so they can put their stamp on it and it provides a clear and measurable benefit to their division / department, even though this work is clearly ad hoc, not in the contract, and they're extremely unlikely to want to pay you for it because of that.


When it comes to maintaining code that affects financial data and customer information, the certification can seem pretty extreme to people not familiar with the risks. Two teams of auditors (internal and external) can be involved for some work. Even after it deploys, you watch new changes like a hawk.


As I said, the website had nothing like that. It was about 40 static pages of product info/business info, using Drupal.

Completely boring nothing-special website.


Exactly.

This realization you had is precisely one of the reasons for which I was very vocal in a recent thread [1] about a startup that wants to (watch out, projectile vomiting coming) make manufacturing "orders of magnitude" better.

The problem with a lot of these ideas is ignorance of what real product development and manufacturing entail in the context of real-life applications, regulatory constraints, liability and more. They think that because you can iterate code fast while sipping a latte at Starbucks, the same "logic" can be applied to manufacturing and, voilà, "orders of magnitude" better manufacturing.

Reality, as you have come to realize, is often far more complex than machining a few pieces of metal, throwing together a microprocessor board and writing code over a weekend. And every industry is different.

[1] https://news.ycombinator.com/item?id=13177786


> Add on to this the customer support, marketing, warranties, legal expenses, returns and other typical overhead of production.

This is often why startups and SMEs can sell at lower cost too. The overheads are lower. I work for a medical devices company and our main competitor is more expensive than us and keeps having layoffs (or so it is rumoured). It's hard for them to cut prices for similar products when they have 10x the staff.


Businesses love regulation. It really keeps the upstarts out of the game if they need twenty different certifications to get their product to compete.


This can be true, but there's a more immediate reason they love them: the perception that the industry is safe leads to greater sales volume and much cheaper sales cycles.

As an example, consider restaurant sanitation regulations. If restaurants are frequently unsafe, people will learn pretty quickly that it's better to eat at home. But if restaurants are generally seen as safe, people will eat out a lot more. That benefits all restaurant owners.


Or you could just not regulate medical devices, and if someone dies from a lethal dose of radiation from an X-ray machine then shrug they shouldn't have used that model I guess.


Is that really the only other choice here? Knocking down straw men will also not get us where we need to be.

Regulation can harm at least as much as it can help. Consider that regulation is based on ideas that people think might help a situation, then add in some partisanship. When coming up with ideas, how likely is it that an idea is not going to be a good one vs it will do some good?

Perhaps it would be better if there was some way to incrementally evaluate regulations. Perhaps if they had a defined lifetime to force re-evaluation in light of actual experience. This would prevent at least one negative aspect of regulation - locking into practices that were thought to be safe and maybe were, but are no longer the best/safest way to do things.


You asked, "When coming up with ideas, how likely is it that an idea is not going to be a good one vs it will do some good?"

My guess is about 99.9% of ideas are bad.

More to the point, many regulations expire or must be periodically re-evaluated by law.


'Notice and comment periods' ensure that large businesses are heard, without any risk of new or not-yet-founded competitors being taken into account.


ADS-B transponders are a good example of this. They are mandated for most aircraft by 2020 and I've read/heard that the cost can be in the thousands.


> A computer that can run a MHz frequency transducer is easy and cheap these days, e.g. a raspberry pi’s GPIO pins can run that frequency.

This is ridiculous. Transmitting needs a good amount of energy for a high number of channels (several dozen, and quite often > 100), and at a high frequency. And if you are driving a 10MHz transducer, you will for example drive it with an 80MHz numeric signal (at least when using a low number of levels, which you often want to in order to keep a high efficiency for the relatively high power TX)... Citing a raspberry GPIO pin to do that shows that guy does not know what he is talking about.

Reception is also not trivial at all, if you want decent quality. You also sample at a rate greater than the centre frequency.

Of course you might be able to construct an amateur toy low-end ultrasound machine, but it would be of no clinical value (and of limited value for a lot of other purposes too). Also without extensive measurements, you should not use your resulting machine on any living thing...


Power isn't a problem, because at that frequency any kind of amplifier can be used. Overall IO speed usually is, for the simple fact that very few chips are specifically designed for that kind of thing, but it isn't impossible. An AM335, an FPGA, and many DSP chips will handle that level of processing for <$20. The Raspberry Pi can't be used, but you could use a BeagleBone Black if you tried hard enough (although it is also a poor choice). Regardless, direct sampling ADC at 80 MHz or less is far from an impossible task.

In the end I would say the writer is right and you are wrong. He is painting with an extremely broad brush, but his guesses are dead on the money. You should look at the links he provides at the bottom; there are exceptionally good DIY and open-source ultrasound machines.

Making the transducer is very, very difficult but he doesn't pretend it isn't.


> direct sampling ADC at 80 MHz or less is far from an impossible task

The Sonix linked below has 32 40MHz 10bit ADC channels, which is a factor of 64 off from the 80 MHz figure if one were to assume 8 bit, and places us firmly into the realm of several-hundred-dollar ADCs.

Not impossible, but I'm pretty sure that's where much of the cost is hidden.


Those kinds of figures are better than nothing, but are exactly what I had in mind: somewhere between toy systems and what good-quality ones are able to provide. 80MHz is actually not mandatory, but 40MHz starts to be really on the low-end side, I think, though for low-frequency probes that should still be quite good. Of course, if you stick to simple B-mode you can get something visible with pretty much anything. 32 channels seems quite low, though, but yeah, better than nothing.

If you start to drive your TX path with a "small" (depending on the number of channels) FPGA or CPLD, yeah, this is more feasible than with Raspberry Pi GPIOs; this is even how some good ultrasound machines are built (depending on the volume).

In the end the cost of the hardware will be not far from the production cost of a "conventional" ultrasound machine. Because that industry has "low" volumes and very high R&D costs (especially for machines intended for diagnostic use), the difference between production cost and market price is very high. If you can eliminate part of that difference because you operate in another context, that's one way to drive the price down.

So yes, it should be possible to build one with limited capabilities for a few k$ per unit. (Spending varying amounts of money on R&D, depending on precisely what you're doing.)


> Because that industry has "low" volumes and very high R&D costs (especially for machines intended for diagnostic use), the difference between production cost and market price is very high.

This is true, but it is only half of the story: the market price is also high because the elasticity of demand for medical devices is really low. Hospitals charge their customers a lot of money, and their willingness to pay is also high.

I've developed software for customers working in the legal industry and it's the same kind of market: these people pay a lot for really simple stuff because they have money and want to get things done.


Hm, on the buyer side, it depends on the country, but in some it is actually very competitive. So smaller players have a hard time surviving simply because of their lower volumes but similar R&D costs... Plus, an ultrasound exam is not too expensive in some (most?) countries.


10 bit, 32 channel, 80 Msample ADC is <$300 [1] at prototype quantities, and <$150 at production quantities. 12 bit, 64 channel, 80 Msample ADC could be done with 4 of these [2] for $620.

An AM5716 or 5726 would be $30-$40 (although buying one is a little trickier). That's 7-8 processors in one, plus 1-2 DSPs. Plenty of power to handle all the processing required. Even a 128-channel device should only be ~$2000 in parts (minus the transducer). One with limited capabilities should be <$500 (again, minus transducer). Much less if you make a few more compromises in the ADCs.

We can make SDRs for incredibly low prices nowadays; the only real difference between that and ultrasound is the number of channels. The only reasons ultrasound machines cost more than 5 grand are economic inefficiencies. The engineering has been solid for a long while now.

[1]: http://www.digikey.com/product-detail/en/linear-technology/L...

[2]: http://www.digikey.com/product-detail/en/texas-instruments/A...


You probably don't need 10 bits, but yeah, 32 channels are needed. I wonder how older machines did that, whether it was done using analog processing.

I wonder if there's an equivalent of 'synthetic aperture radar' in ultrasound. Then maybe you don't need 32 channels.


Very old machines did some analog processing. Then it switched to digital processing on dedicated hardware. New machines can do all the processing in software, running on conventional hardware.

There is all kinds of fun stuff you can do with ultrasound, and some of it is similar to 'synthetic aperture radar'. However, 32 channels is still quite low. Probes for humans commonly use ~100 channels.


Do they actually sample the baseband digitally?

I would expect that they are looking at the return as an amplitude modulated and phase-shifted signal at the carrier frequency. They remove the carrier frequency and then look at the (much slower) modulation frequency.

I'm not sure what you gain from looking directly at the carrier frequency...


The relative bandwidth in ultrasound is immense compared to radar etc., where a relatively narrow bandwidth is used. As an example, one of our probes back in university days was something like 5 to 30MHz. It doesn't save much if we mix it down by that 5MHz: instead of a 60MHz ADC we would need 50MHz. Not worth it.


You don't necessarily need the baseband (I'm not even sure if you ever need it) & depending on the mode you can tune the amount of bandwidth needed.


Do you actually need to do baseband sampling? It might be cheaper to bandpass and downconvert the received signal with an analog network first.


You need the phase information. Actually, that might be a better way of simplifying the information to be sampled - a set of phase comparators.


Phase information is preserved when downconverting. But resolution certainly will depend on bandwidth. I read that ultrasound propagation delay is ~650 ns / mm... this is something like 500 kHz for 1 mm (using the rise-time * bandwidth = 0.35 approximation).
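
A quick sanity check of that arithmetic (rough numbers only, same 0.35 rule of thumb):

    # rise-time * bandwidth ~= 0.35 rule of thumb
    t_r = 650e-9            # ~650 ns of one-way propagation delay per mm
    bw = 0.35 / t_r         # bandwidth needed to resolve a 1 mm feature
    print(f"{bw / 1e3:.0f} kHz")   # ~538 kHz, i.e. the "something like 500 kHz" above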


It's definitely one of the most expensive parts, but it's still a factor of 100 cost difference from parts to product. Even assuming there are 10 parts that each cost a few hundred dollars (processor, memory, transducer, ADC, amplifier), that's still a 90% profit margin before R&D and other overhead. It's very high.


You don't need to sample at the carrier frequency for Doppler measurement though. Time of flight... I don't think so either.


> Citing a raspberry GPIO pin to do that shows that guy does not know what he is talking about.

Agreed, but I wouldn't rule out the possibility that a crazy individual with sufficient hardware and software signal processing experience could make it work, where "work" allows for a significant tradeoff or two compared to a modern commercial machine.

DAC is probably the easy part -- use a LFSR and maybe some jellybean filters to get spectral content where you need it. 10MHz is slow enough that you could probably phase it in software if you handwrote everything in bare-metal assembly, paid attention to interrupts and DRAM, etc, etc. At first I thought you'd need a coprocessor (and in that case it might as well be a proper Spartan 6) but perhaps it can be done with a dedicated core. ADC is the probably the harder part. You can't kick the can down the road anymore. Both cost and value are tightly coupled to 2^(bit depth)*(sample rate). If you try to make a shitty ADC out of GPIO pins, you just wind up with a shitty ADC, you can't really hide the flaws and play up the strengths like you did with the DAC.
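
To make the LFSR idea concrete, a minimal sketch (the taps below are a standard maximal-length 16-bit polynomial; everything else is just illustrative): clocked out of a pin, a sequence like this has a broad, roughly flat spectrum that cheap analog filters can then shape.

    def lfsr16_bits(n, seed=0xACE1):
        # Galois LFSR, polynomial x^16 + x^14 + x^13 + x^11 + 1 (feedback mask 0xB400)
        state = seed
        bits = []
        for _ in range(n):
            bits.append(state & 1)   # output bit is the one shifted out
            lsb = state & 1
            state >>= 1
            if lsb:
                state ^= 0xB400
        return bits

    seq = lfsr16_bits(65535)         # one full period of the maximal-length sequence
    print(sum(seq), len(seq))        # 32768 ones out of 65535 bits: nearly balanced, noise-like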

Or maybe I had it backwards and you could use a good DAC and some sort of sigma-delta like trick to make the shitty GPIO - ADC work.

Fun stuff to think about, but I suspect the real innovation will happen when someone in hardware realizes that DAC/ADC/FPGA technology has slowly and surely advanced far enough that they can do to the ultrasound market what the DS1054Z did to the oscilloscope market.


For our first child, I was considering building a DIY ultrasound machine, since I had several friends doing their final MSc project on ultrasound hardware and software. I got as far as setting up a shopping list, where I assumed a high-end laptop and Matlab with all toolboxes for free, together with some code from my friends. Still I ended up north of $3k for hardware, and several hundred projected work-hours. A lot of the cost comes from needing 50-100 transducers to get a useable image.

OTOH, the single-transducer Doppler ultrasound devices that are used in early pregnancy to hear the baby's heartbeat can be had for $30 on eBay.


> A lot of the cost comes from needing 50-100 transducers to get a useable image.

For-parts ultrasound arrays (which are usually broken because of obsolescence, frayed wires, cracked cases, bent connectors, etc, rather than destroyed ceramics) cost tens of dollars on ebay. But I agree with your overall point: hardware hacking ain't cheap. A $3k bench is north of desperation territory but still a fraction of what entry-level EEs get at the company I work for, and we're not in any line of business that would be considered "performance" by test equipment standards.


> a 10MHz transducer, you will for example drive it with an 80MHz numeric signal

Nope, they basically use a series of pulses at 10 MHz and rely on the frequency response of the transducer to turn it into a nice wave packet.

Source (look at pulseShape):

http://www.ultrasonix.com/wikisonix/index.php/Texo_Parameter...


If you want nice stuff on a probe with an analog center freq of 10MHz, you can do PWM (in various forms) with a higher freq. Now 10MHz is not the lowest probe freq available; it depends on the usage. Higher freq gives better resolution, but has worse penetration.


I think you are missing the point.

A 10MHz transducer will be resonant at 10MHz. If you drive it with a square wave at 10MHz, it will naturally respond best at frequencies that are 10MHz, and will respond very poorly to frequencies that are not 10MHz. Thus, a 10MHz square wave driving a 10MHz transducer will produce a pretty good 10MHz sine wave sound signal. This does depend on how sharp the transducer's resonance is, but it shouldn't be that hard.

Ignoring that, creating a 10MHz RF sine wave isn't that hard using classical analog techniques: wifi chips in modern computers create a 5MHz carrier wave that can be amplified and doubled with some cheap off the shelf parts.
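
A rough way to see the resonance argument numerically: drive a high-Q bandpass (standing in for the transducer's resonance) with a 10MHz square wave and compare harmonic content before and after. The Q value and sample rate here are arbitrary illustration numbers, not taken from any real probe.

    import numpy as np
    from scipy import signal

    fs = 200e6                                   # simulation sample rate
    f0 = 10e6                                    # transducer centre frequency
    t = np.arange(2000) / fs
    drive = signal.square(2 * np.pi * f0 * t)    # naive GPIO-style square drive

    b, a = signal.iirpeak(f0, Q=10, fs=fs)       # crude stand-in for the resonance
    out = signal.lfilter(b, a, drive)

    def third_harmonic_ratio(x):
        spec = np.abs(np.fft.rfft(x))
        f = np.fft.rfftfreq(len(x), 1 / fs)
        return spec[np.argmin(np.abs(f - 3 * f0))] / spec[np.argmin(np.abs(f - f0))]

    print(third_harmonic_ratio(drive))   # roughly 1/3: the square wave's 3rd harmonic
    print(third_harmonic_ratio(out))     # much smaller: the resonance strips the harmonics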


You typically do short pulses in most modes. You can use some form of PWM to apodize on the edges, when needed.


> You also sample at a rate greater than the centre frequency

To be clear, a sampling rate that is only greater than the center frequency isn't very useful. You need the well-known Nyquist rate, twice the maximum frequency, to avoid aliasing.

https://en.wikipedia.org/wiki/Nyquist%E2%80%93Shannon_sampli...


This is not strictly true. You can reproduce a band-limited signal as long as the bandwidth of this signal is smaller than half the sampling frequency.

For example, a signal that only has energy in the range of 9MHz to 11MHz has a center frequency of 10MHz, but a Bandwidth of 2 MHz. You could sample, and reconstruct, the signal perfectly with a sample-rate higher than 4 MHz. Any content between DC and 9MHz that would be present would distort the sampled waveform, though, and create artifacts.

https://en.wikipedia.org/wiki/Undersampling

Of course, nowadays most often one just uses a very fast ADC, and one can then choose a suitable sub-band by decimation (in an FPGA or ASIC), a kind of digital frequency filtering.

Undersampling was often used in older RF gear (and maybe it still is for reaching the highest frequencies? I don't know what the current state of the art is).
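
A minimal numeric illustration of the bandpass-sampling idea (nothing ultrasound-specific; the tone and sample rate are just picked to land in a valid undersampling zone for a 9-11MHz band):

    import numpy as np

    f_sig = 10e6    # a tone inside the 9-11 MHz band
    fs = 4.4e6      # sub-Nyquist sample rate, valid for this 2 MHz-wide band
    n = 4096

    t = np.arange(n) / fs
    x = np.cos(2 * np.pi * f_sig * t)

    spec = np.abs(np.fft.rfft(x * np.hanning(n)))
    f_alias = np.fft.rfftfreq(n, 1 / fs)[np.argmax(spec)]
    print(f"{f_alias / 1e6:.2f} MHz")   # ~1.20 MHz: the band folds down predictably,
                                        # provided everything outside 9-11 MHz is filtered out first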


Yes.

This is essentially like using a mixer in RF circuits. I'm pretty sure it's still used in many variations. Generally speaking you can't go high enough in sample rate to allow every RF signal (possibly in the GHz) to be sampled directly. It's always some sort of frequency shifting + filter. The key is to filter out frequencies outside the band before sampling, and the band does not need to start from DC (0Hz).


Note the "to avoid aliasing" in the comment you are responding to.


You can also embrace aliasing if you are kind of crazy. Look for "compressed sensing" stuff. Although I'm unsure if this would be useful for anything on ultrasound systems.


Compressed sensing really isn't the same thing - and to do it properly you need random sampling. Still, it's interesting stuff!


Not sure why downvotes. Parent is describing exactly the issues with undersampling that grandparent wanted to avoid.


Your actual sampling rate will be higher than the Nyquist if you are trying to process data from an array, as the sampling rate required to detect phase shifts depends on the sampling period (which is determined by how quickly you want to 'move' the array).


> Citing a raspberry GPIO pin to do that shows that guy does not know what he is talking about

I think it's pretty clear that he doesn't, but it's not like he's claiming to be an expert.


An RPi costs about $30. A cheap FPGA Dev board with enough I/O capability to drive a transducer array costs about $300, and requires someone familiar enough with Verilog/VHDL/etc to actually program the thing. I'd guess he's off overall by at least an order of magnitude in any of his cost estimates, even if all the labor is free.

I'm sure such a device can be made much more cheaply than existing solutions, but the author doesn't seem to have a realistic idea of the difficulty.


Also, the FPGA is not sufficient. You need something with both power capability and decent frequency capability, to drive pulses with some decent energy. On dozens of channels.


I think you could do something like using a BeagleBone with its PRUs to handle moving all the data somewhere fast enough, in combination with a cheap FPGA to drive the transducer(s) and do a little initial signal processing to reduce the data rate a little.

This is a little more complex, but the cost is going to be of the same order of magnitude as a Pi.


The AM335x isn't designed for data acquisition like this. I should know: I'm doing a project right now with 16 Msamples running through a beaglebone. The PRUs don't really like to write directly to memory very much, and hacky methods are required to prevent segfaults. There are other chips with officially supported DSPs that work much, much better for this sort of thing.


Wouldn't you normally mix down on the receive side for better SNR? Doppler should be pretty negligible, so if your LO tracks the output the receive side could probably be an audio ADC.

Power on the output side is the big one for sure, however. Also I'm not sure the transducers cited are anywhere near big enough.


Mixing is only useful when the change in the signal is slow. In ultrasound you're operating around single wavelengths so you can't mix down.


FMCW systems would be slowly changing around the central wavelength. It wouldn't work for pulsed systems, though - I'm not sure what's more common.


I like free software as much as the next guy, but saying "if there were free software for this project, the cost would come down" ignores that developing that software is expensive and someone needs to pay for it.

From an economic perspective (and this article is about economics), companies release software when they hope that the cost of the freeloaders is outweighed by the benefit additional contributors will provide. (Or that broad adoption of the software has some other benefit to the opening company.)

Generally, the more broadly useful the software is, the more economic sense it makes to open the source.

I see no evidence that there would be any economic benefit for any company opening the source of their very specialized, very expensive software here.

So yeah, if it were free, it would cost less. OK then.


There's also the fact that most (all?) free software licenses explicitly lack a warranty and (to the extent they can) disavow any liability for problems in the product. This is a good thing (you'd be crazy to contribute to a free software project if it weren't the case), but it doesn't fly in the world of medical devices. Nobody will use a product if there's not a liability chain they can point to if it breaks and kills somebody.


That liability chain is always going to lie with the manufacturer of record, and always going to depend on them doing their homework. Typically a 3rd party library vendor isn't going to indemnify you completely either.


The medical institution that used the device on you is liable in that situation.


Right, so no medical institution in their right mind would buy that device. Paying a little more up front for what is effectively a permanent insurance policy is a no-brainer for any large organization.


> Paying a little more up front for what is effectively a permanent insurance policy is a no-brainer for any large organization.

As opposed to paying a little more up front to a third party insurance agency or offering to pay more to the OEM for an optional insurance plan? I don't see your point.

A medical institution will be paying for the insurance one way or another no matter what. It's either fronted by the institution itself, or in the cost of the equipment that they purchase.


Do you have an example of an OEM or insurance company that will provide that policy?


Well put. I'm trying to imagine a hospital asking a third party to insure a homemade medical device. Would never happen.


By that logic, no company would be the source of any original development, because that means that they are the last stop in the liability chain. Free software could, therefore, be treated the same as in-house developed software.


No, because the profits exceed the cost of liability for commercial software.


Bob makes some software and releases it open source as is.

Company X is considering using Bob's software to release medical device Y. If the cost of developing Bob's software in house > the risks of using Bob's software, then they can choose to use Bob's software. They can also audit Bob's software, release a patch, and then use that version, saving money and reducing risks.

Note, it's Company X, not Bob, that's taking on liability, but Company X also gets the profit from selling device Y.


Don't most open source agreements ban taking code and re-using it in for-profit enterprises?


The open source definition actually requires allowing that. Some but not all licenses force you to keep the open source license for any changes you make and distribute to the original. However, this doesn't mean giving away the hardware for free, only the software.


No, your cellphone for example probably has open source code.

A common restriction is to release the source code when distributing the software. But, that's not really an issue with medical devices as there is a physical device not just software involved.


As others have pointed out, open-source software is not necessarily non-commercial. Open-source software can be sold for any dollar amount the author sees fit. The author simply must furnish a copy of the source upon request from any recipient of the binary, and cannot charge any more than it costs to fill that request.

The commercial breakdown occurs because each recipient also has the right to redistribute the source code and binaries, so Original Author Alice sells to Bob for $100, Bob sells to Charlie for $50, Charlie sells to Dan for $10, and Dan posts it online for everyone to get for free. Even if Dan's site goes offline, someone will create a mirror. Open-source licenses make this perfectly legal. Thus, open-source authors do not have an enduring market for selling their software.

The loophole for commercial OSS companies is usually something called dual-licensing, enabled by the unique "copyleft" provisions first introduced in the GPL.

Copyleft means that the license is infectious. Any code linked against GPL code also becomes GPL. If someone links against code that they cannot legally make GPL or refuses to distribute the linked code under the GPL's terms, they have violated the GPL and could be sued by copyright holders to enforce compliance, stop distribution, and/or seek damages.

This infectious element is why some people and companies are very cautious about the licenses on the open-source projects they use. Some household names have had some close calls by incorporating GPL code without fully understanding the ramifications (and some household names may be in hot water over this soon, as GPL violations are not entirely uncommon).

Copyleft is great for most pure open-source projects since it means that everyone has to share back not just their changes to the software, but also the stuff they build on top of your project. However, because open-source software usually doesn't sell well (as discussed above), it means that people who want to sell their software commercially cannot use any GPL code anywhere in their software -- unless the copyright holder also makes that code available under a non-GPL license that won't infect the linked software.

This allows people who want to use your code as a foundation or library in a commercial package to pay for a commercial, non-infectious license, and it allows people who don't need that to use the GPL version, which requires that their code becomes free too.

Dual-licensing is the way that many open-source software companies have survived and tried to harness the best of both worlds. TrollTech, who made Qt until they were acquired by Nokia (and then spun off after the Microsoft liquidation), is one such company that lived many years off the dual-licensing model.


Thanks. That's a really good explanation about dual-licensing.


This may just be a reiteration of other commenters, but a for-profit enterprise would just buy liability insurance, and then bake the cost of that liability insurance into the sales price.


Free software is used in medical devices where it makes sense. Sometimes it saves money.

The cost often really isn't from the writing of software, but typically from the regulatory, QA and hardware development. Which is all dwarfed by validation if you need to do a PMA, but typically devices like this will be a 510k. Software development directly is a bit more expensive than in some areas, mostly because you really need to follow engineering practices (e.g. traceable V&V, full documentation, test & review practices, etc.). Usually that isn't the dominant cost of bringing a device to market.


> developing that software is expensive and someone needs to pay for it.

or, you know, someone donates some of their time to write it. that's usually how OS works.

think about how much research the person writing the article has done. if you could buy a $25 transducer off ebay, they could have put that time into writing the software.


> someone donates some of their time to write it. that's usually how OS works.

Sure, those high school and university students building up their github portfolio for job interviews.

The majority of serious open source projects are backed by hardware or consulting companies, and usually fade away when that support is no longer there.


Maybe, but for hobbyist projects? IDK, seems a bit condescending or at the very least defeatist.

And who knows, the hobby project might transform into a company. Does that make it an instant "serious open source project"? Got to start somewhere.


You're changing the topic. We're not talking about a general open source project with a potentially large userbase. So what you're essentially advocating is that we have to wait till a few hundred people serendipitously have enough time to dedicate to their hobby of writing control software for medical devices. And after that they have to validate the code, test it and ensure its correctness.

All the popular open source projects succeeded because developers got paid. This fantasy of large scale complex projects with unpaid experts writing code, developing tests, and creating documentation, etc is just that, a fantasy.


> You're changing the topic.

Oh really? Let's see now.

> We're not talking about a general open source project with a potentially large userbase. So what you're essentially advocating is that we have to wait till a few hundred people serendipitously have enough time to dedicate to their hobby of writing control software for medical devices. And after that they have to validate the code, test it and ensure its correctness.

Not at all. A start would be a proof of concept device with barely working software. Maybe even not that, maybe the first PoC uses an oscilloscope for visualising the data. Then someone takes e.g. a Beagleboard and dumps the data via USB into a small Python script. V2 might add a bit of a colour map with matplotlib, maybe a GUI or just live updating Jupyter notebook. That's a start.

Why does everybody assume you'd want to replace the medical devices? Did we even read the same article? The authors even say that

> "[c]reating any device for medical purposes can be incredibly expensive, but this ignores all the other uses that ultrasounds can have in education, imaging, sports training and just for fun".

Except for sports training, do we need medical certification or perfect accuracy? No. So why is it so hard to believe that one person could knock something together in a weekend if the transducers were available? You know, for fun, out of curiosity?


>Not at all. A start would be a proof of concept device with barely working software.

And anyone can already do that. Creating a prototype is like 5% of the work. Nobody really cares about it unless you can ACTUALLY use it for ACTUAL stuff. I work in industrial automation and I have several hacked-together prototypes where I'm running some scripts or software on a micro-controller. And those prototypes do "cool" stuff for a fraction of the cost. But there is absolutely no way my customers would ever, ever consider using a prototype PLC for anything, not even testing. So I don't think you're getting it. This isn't Linux where your end users are mostly software/technical people. Normal laypeople are not interested in testing experimental stuff like this.

>Why does everybody assume you'd want to replace the medical devices? Did we even read the same article? The authors even say that

Because it would be unethical to produce medical diagnostic devices (or even call them that) without proper code review, validation, and so on.

>Except for sports training, do we need medical certification or perfect accuracy? No. So why is it so hard to believe that one person could knock something together in a weekend if the transducers were available?

Okay so yeah I totally believe you can put together a janky POS that barely works and is unreliable. But without proper validation the results such systems produce are entirely useless.

> You know, for fun, out of curiosity?

Yes, I agree. You can do _ANYTHING_ for your own amusement, fun or curiosity. BUT... if you want to make something that is useful to other people you have to do a bit more work. And that work doesn't come cheap.


I think it undervalues the donated time to say the effort would not be expensive


No. Actually most useful OSS is supported by government funding or big companies.


>I see no evidence that there would be any economic benefit for any company opening the source of their very specialized, very expensive software here.

Hmm, maybe if you can find a way to lock in users so they always have to come to you for the hardware. One way to do that: on a project that sees a lot of source-code churn, you can tightly couple it with your services/hardware. Any competition that's downstream will find it hard to keep up with your commits, in addition to having to patch in support for different hardware/services.


If all manufacturers collaborated on a single software project, they could reduce duplicated efforts. That would lower costs for all manufacturers.


That is an argument for private collaboration, not for broad open sourcing.


Private collaboration among competitors is usually frowned upon as illegal collusion. Contributions to a public open source project wouldn't face the same issues.


yes, this is why we have no free open source SDR software, no gnuradio, no osmocom, no rtlsdr, not to mention lack of cheap SDR hardware .....


companies release software when they hope that the cost of the freeloaders is outweighed by the benefit additional contributors will provide

Lols, is this how you view open source?


I have seen enough companies weigh opening their source-code to know that is exactly the calculus many of them use. I don't pretend that a single sentence summary captures every detail and consideration though.

That sentence has nothing to do with my own opinion of free software.


How do you imagine a company evaluates the decision to open their software?


There's a lot of nitpicking about the details going on in this thread, and it's very interesting to read.

But I think the gist of the post is that the hardware cost of a high end ultrasound unit is rapidly decreasing due to advances in ADC technology and the general trend of moving more and more functionality into DSP that used to require specialized analog circuitry.

For example, I recently bought a 20MHz spectrum analyzer and oscilloscope with built-in tracking generator for $145. Gear of similar quality would have cost tens of thousands of dollars just a decade ago.

https://www.amazon.com/gp/product/B018XD6Z5O/ref=oh_aui_sear...


That's really nifty. I've got a Serious Business (DSOX-3014A) scope but could never justify a spectrum analyzer. I've got the signal gen, so I've thought about rigging up their API to do a simple sweep but never got around to it.

For filter work that'd be near perfect.


How's the UX on that? I kind of prefer having physical knobs and buttons, but I can imagine having better analysis and capture outweighs it.

Maybe I should get one of those and a USB DJ controller for input.


The UX is pretty good. The utilities seem to have originally been created for the Chinese speaking market but there are English language versions of the software that are very easy to use.

Good idea about mapping some knobs to the UI.

I've used it to measure harmonics when testing some low pass filters, and the readings match some other gear I own.


> For example, I recently bought a 20MHz spectrum analyzer and oscilloscope with built-in tracking generator for $145.

Nice. Does it work under Linux too?


I'm not sure. I resorted to buying a copy of Windows 10 to use with certain software and utilities that seem primarily supported on Windows.

The Windows software seems to be under active development, and I was able to Skype chat with one of the engineers before I bought it to ask a few questions.


I do synthetic aperture radar signal processing for a living. I make the SAR pictures you see (or use for your research). I think a big part of the cost is the signal processing. You need fast A/D converters too. The software to make a steerable ultrasound beam, though, is not trivial by any means. Also, you have multiple sound speeds to account for in layered media, where you don't know where the layers occur or what their sound speed is. Another really hard problem.


Yeah, putting together the software to generate an image is not like building a database or webservice. It requires engineering and science knowledge well outside the domain of computer science.


You can't program what you don't know.

I actually chose biology over computer science because of problems like this[1]. Now, I don't think I have all the knowledge necessary to build an ultrasound myself, but at least I have the ability to read the literature and make sense of it, and I can understand the language radiologists and doctors use to describe an ultrasound.

I don't think the programming would actually be that hard. It's basically sonar for people. There are tons of builds of devices that use time-of-flight to produce images. I think you could actually get something reasonable working pretty quickly if you had access to testing apparatus and a radiologist.
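
As a toy illustration of the time-of-flight part (a single channel and a single assumed speed of sound -- nothing like a real beamformed B-mode, the numbers are only illustrative):

    C_TISSUE = 1540.0   # m/s, nominal speed of sound in soft tissue

    def echo_depth_mm(round_trip_s):
        """Round-trip echo time -> reflector depth in millimetres (A-mode style)."""
        return round_trip_s * C_TISSUE / 2.0 * 1000.0

    # e.g. echoes arriving at 26 us and 65 us sit at roughly 20 mm and 50 mm of depth
    for t_echo in (26e-6, 65e-6):
        print(f"{echo_depth_mm(t_echo):.1f} mm")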

I don't have the skills to build the ultrasound machine myself, but I'm not an EE. I don't think the programming is a huge barrier though.

Quick edit: I would probably try emulating a system with physical lenses first. It seems like an easier problem. There was an article on Hackaday a while back about a guy who built a phased array radar in his garage, but it seems harder than the physical lens version:

http://hackaday.com/2015/04/07/build-a-phased-array-radar-in...

One quote from the article which I think is pretty relevant:

"If you are willing to trade acquisition time for cost you could implement a much less expensive near-field array using switching techniques"

And here's an article on a DIY Ultrasound development kit:

https://hackaday.com/2016/04/12/a-developers-kit-for-medical...

[1] I specifically wanted to do bioinformatics, but the field pays poorly, and also requires an advanced education.


I can assure you it's not sonar for people. The signal processing is way more complicated than traditional sonar. Sonar also has roughly 10 channels; ultrasound transducers have 10x more. The array is generally 2D and not planar either, whereas sonar is generally a 1D planar array.


I just looked at the Wikipedia page for SAR. I think what you are doing requires a lot more signal processing than an ultrasound. At its most basic, an ultrasound is really just a 1D graph of signal intensity. You should look at some old ultrasounds; it's pretty obvious how low the resolution is.

I'm sure you know more about signal processing than I do, but trust me when I say a simple diagnostic ultrasound is a pretty rudimentary bit of kit. Most medical imaging is pretty simple actually (not accounting for signal acquisition). Radiologists are trained to read fairly abstract charts, and they want as little processing as possible. Imagine if a CT machine tried lining up images, rather than presenting the raw slices. That might make sense for mapping data, but if you were trying to diagnose a displacement of something, like a broken bone, having the image "fixed" wouldn't do you much good.

That's part of the reason why older ultrasound images of babies are so inscrutable to the casual observer. Since the technician is slowly sweeping a 1D or 2D array by hand, the printed image ends up looking pretty weird if the baby moves. An ultrasound can be 100 dB inside the womb[1], so the baby tends to start moving when the ultrasound is performed. The horrible images aren't much of a problem, because the images they give to the parents aren't really used for diagnosis. They use the monitor for that purpose. If there is something the tech wants to explore further, they just look at that area some more.

Based on my limited knowledge of SAR, it seems like the processing is way more important because you are working with data that has been captured in the past.

Edit: Edited for clarity, and added source

[1] http://www.popsci.com/scitech/article/2002-01/hey-turn-down-...


You can have a basic ultrasound with only one channel. The following patent is from 1985, and has a pretty good overview of the field at the time. It appears that most if not all ultrasound transducers at the time were large single-channel instruments.

https://patents.google.com/patent/US4446395A

This patent, from 1989, indicates that most ultrasound transducers are either single element or linear arrays. It was the earliest patent I could find with a cursory look that had a 3-dimensional array.

https://patents.google.com/patent/US5027820A/en

Regardless, even Wikipedia suggests that most of the arrays used for medical imaging use either a single element or a phased array:

To generate a 2D-image, the ultrasonic beam is swept. A transducer may be swept mechanically by rotating or swinging. Or a 1D phased array transducer may be used to sweep the beam electronically. The received data is processed and used to construct the image. The image is then a 2D representation of the slice into the body.

3D images can be generated by acquiring a series of adjacent 2D images. Commonly a specialised probe that mechanically scans a conventional 2D-image transducer is used. However, since the mechanical scanning is slow, it is difficult to make 3D images of moving tissues. Recently, 2D phased array transducers that can sweep the beam in 3D have been developed. These can image faster and can even be used to make live 3D images of a beating heart.

https://en.wikipedia.org/wiki/Medical_ultrasound#Sound_in_th...

Point being, I think you are wrong. I'm not an EE, so I can't speak towards signal processing, but I am a biologist by training, and I don't see a clear reason why sonar principles wouldn't work. We are basically a bag of salt water.

Also, I am familiar enough with ultrasound to be sure that models with only a single transducer are very common. Hospitals and the like might be using the fancy-pants multi-dimensional arrays now, but the units we used to image things in college were definitely not multi-dimensional. For one thing, they were older than the patent that demonstrated multi-dimensional arrays.


A phased array is made up of lots of channels.

In SAR, the speed of the medium doesn't change; in ultrasound it changes every few mm.


TL;DR there isn't enough demand for specialty materials fab (PZT thin films for MHz-range transducers) to benefit from economies of scale.

Other stuff (software, controllers) is also expensive, but it's probably a fraction of the transducer cost so buyers tolerate it in order to get manufacturer support/service contracts


One piece missing from the TL;DR: cost of compliance. Any medical device carries huge liabilities, and can easily bring a company down if it malfunctions. Probably the only other field that comes close to it is avionics.


Exactly this. Same for pharmaceuticals. This is why a "$5 pill" in Africa is $100 in the US. Medical device makers get sued practically every Tuesday. Anytime someone dies, the chain of liability is a very long one.


Eh, that "$5 in Africa" pill is often also "$10 in Canada or the EU." Oversight/certification costs money but that's only a small part of the explanation for why pharmaceuticals sometimes have far higher retail prices in the USA than in other developed countries.


The pill is $5 in Africa, because it costs less than $5 to sell it in Africa. It costs $100 in the US because some Americans can afford to pay that much for one pill, and the rest can use manufacturers' coupons or collectively-bargained pricing agreements to pay less.

Oversight, regulation, and certification are what prevent some applecart-upsetter from barging into the market, charging $12/pill without any haggling. Instead, the incumbents can be tipped off by captured regulators that someone who won't play ball is on his way, so they can temporarily lower their price to $5/pill until the new jerk runs out of funding and dies. Then it's back to $100/pill. Heck, make it $1500/pill, pour encourager les autres.


No one pays $100/pill. It's "that much" in America because the manufacturer anticipates that the insurance companies are going to expect a big discount. Pricing it at $100/pill allows them to say "Well, for you Mr. Blue Cross, you do indeed have massive volume, we will offer an amazing 95% off and sell the pill to your very lucky customers for $5." Since it is now illegal not to possess health insurance in the United States, "no one" will be paying an uninsured rate.

However, companies aren't selling these pills directly to consumers, they go through pharmacies and doctors. If you don't have insurance or your insurance doesn't cover the medication, these pharmacies and doctors usually have multiple options at their disposal to increase the affordability of the medication, whether it's a manufacturer-provided financial hardship program, substituting the same medication from a different manufacturer (i.e., "generic version please"), assistance signing up for government-provided medical benefits like Medicaid, or something else.

Doctors do the same thing. Their "cash price" is a hyperinflated joke that exists only to anchor their negotiations with the insurance company. If you don't have insurance, you can and should ask about options to slash the sticker price. You can often get an instant 50% reduction just by asking.

The people who really get screwed are the people who have some type of billing snafu and end up with a cash price account in collections. This will haunt your credit report for at least 7 years and will make your life unpleasant in other ways, but even at this point you can usually negotiate a large discount. Such snafus can happen for various reasons and are unfortunately not rare, as you might guess from the ridiculous complexity of the system already described.

If the medical provider sues, it's possible a judgment could be entered against the individual for the cash amount that no one is ever expected to pay anyway. That is the worst case scenario, and it's bad, but even then most Americans are not stuck. They can file for bankruptcy protection and have the matter settled. In most states, bankruptcy will not require a person to surrender property that is needed for their daily maintenance, and some states have very strong homestead exemptions that ensure a homeowner will not be forced to sell his/her primary residence.

Of course, all of this is a massive disaster, but it should be known that in real life, virtually no one pays $100/pill. :)


Of course no one pays that much, directly. 90% of America cannot afford that all at once.

So the trick is to get Americans to pay $10/pill at the point of purchase, and an additional $3 taken out of every paycheck, before taxes, whether they get the pill or not.

The US healthcare system is a cesspool of interlocking scams and cons. Those who genuinely want people to be healthy, and for sick people to get well, are constantly at war with those who operate under the assumption that a person will hand over everything they own (and maybe even some stuff that other people own) for a decent chance at not dying before they're ready to go--and then still charge a little extra to help someone die when they are ready.


>The US healthcare system is a cesspool of interlocking scams and cons.

Yeah, I totally agree. It badly needs rectification. The ACA just cemented the issues afaict.


If a tricorder ever becomes real, and it has ultrasound, that would boost the economies of scale.

http://tricorder.xprize.org


The reasons ultrasound machines are expensive are not technical; it's mostly low volume and the high overhead of complying with all the rules and regulations.

You can make a useful ultrasound machine with just one transducer (e.g., to measure blood flow through the heart using Doppler, or using mechanical scanning).

You don't need a fast CPU for processing; just downconvert to the audio range and then an 80286 is fast enough.

Source: wrote embedded software for one of those machines back in the day. 16 kB for everything, from keypad debouncing to GUI.
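
(For a feel of why the audio range is enough: the Doppler shift from moving blood is only a few kHz. A back-of-the-envelope with an illustrative 5MHz probe and ~1 m/s flow -- not figures from the machine above:)

    f0 = 5e6      # Hz, transmit frequency (illustrative)
    v = 1.0       # m/s, blood velocity, roughly peak arterial flow
    c = 1540.0    # m/s, speed of sound in tissue

    f_doppler = 2 * v * f0 / c        # ignoring the beam-to-flow angle (cos term)
    print(f"{f_doppler:.0f} Hz")      # ~6.5 kHz -- comfortably within the audio range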


You are confusing several different things under the vague notion of "ultrasound machines". Doppler mode is useful, and cheap machines dedicated to that can and do exist. However, even a basic machine able to do only a poor B-mode needs vastly more processing power -- and vastly more channels.


Processing power is not a problem these days. Maybe if you feed all channels at ultrasound frequency into the ADC and then try to do all the processing in software. But you don't have to; it's oftentimes easier to have hardware (FPGA) pseudorandom signal generators and correlators do the bulk of the signal processing for you.


It depends. An FPGA able to do that is a big one, quite expensive. Today, and even more so tomorrow, you can afford to do that kind of computation in a CPU and/or a GPU.

Regardless of who is doing the beamforming, the TX and RX analog parts are intrinsically quite expensive with at least a dozen channels.


It isn't low volume per se: lots of women get (medically unnecessary) 3D ultrasounds when pregnant to get "a nicer picture" at independent clinics.

A lot of it is regulatory, but not completely. Other issues include 1) expensive software integrations for EMR and billing reasons, 2) the fact that most machines have multiple wands to attach, and 3) the fact that billing codes tend to support the use of expensive machines rather than cheap ones if a doctor wants to make money -- which is probably related to billing codes and other non-embedded software.


I've been researching diagnostic medical equipment a lot recently, and it seems like the biggest barrier to homebrew medical diagnostic equipment is attitudinal. Discussions about building your own equipment are almost immediately sidetracked by FUD about how dangerous it is.

Proper medical devices have loads of safety features! They have isolated power supplies! They are tested in harsh environments! They fail in a predictable manner! There are regulations that need to be followed! New devices are still expensive because they are better!

Yes, electricity can kill you.

Yes, improper medical advice can kill you.

Yes, malfunctioning diagnostic equipment can lead to an incorrect diagnosis, which, yes, can kill you.

Yes, medical regulations exist and protect us from harm.

While a homebrew machine would not be able to compete with the latest and greatest, I'd hazard that even rudimentary diagnostic equipment could save thousands of lives a year in the developing world. These technologies are not new -- the medical ultrasound has been around for more than 60 years, and the EKG has been around for nearly 100. It seems insane that cost is still such a barrier for the machines used for medical diagnostics, when the price of other technologies has fallen so much in the same period of time.

I bought myself a Rigol DS1054Z, and I realized that I paid $400 for an oscilloscope that would have cost millions of dollars 30 years ago. I thought about the experiments I had done on neurons using the 50 year old oscilloscopes as part of my degree, and I realized that an ECG/EKG can be replicated pretty easily with an oscilloscope. It turns out, building an ECG is pretty trivial. It's not a 12-lead ECG, but it's also something I built out of parts I had on hand.

I don't see why other medical technology should be any different. Yes, unregulated medical technology is dangerous, but the risk doesn't seem to outweigh the potential benefit. If the parts to build these devices are cheaper and more accessible than they have ever been, and the equipment needed to build, test and calibrate the devices is cheaper and more accessible than ever, it seems like the devices themselves should be cheaper and more accessible than ever. I think there is a place for a $20k ultrasound, but when you live hours or days from one, a cheaper option could save lives, even if its primary purpose is directing people to get a follow-up with the more capable machine.


I think one of the problems of "but this is just a small engineering project" is precision, reliability, quality-- whatever you want to call it.

There is interpretation that goes into reading an EKG, an xray, lab results. If there's any unreliability whatsoever, that will impact treatment and outcomes.

If there's any unreliability in your EKG readout, that can be the difference between diagnosing a minor heart attack and diagnosing simple chest pain.

The same goes for diagnosing a cancerous tumor from a benign one. There's already ambiguity and interpretation; if there's any imprecision in the measuring device, that's going to lead to bad outcomes-- both overtreatment and undertreatment.


Oh, definitely. I don't want to suggest the added inaccuracy is a trivial problem. It just seems like access to a diagnostic test with a lower resolution is better than no test at all.


In the abstract, I agree -- there are many situations where low res is better than none.

But given the amount of interpretation (it truly is an art, not a science) that goes into reading an EKG and, more importantly, a tumor x-ray, I think in this situation low res is worse than none, because it's worthless (you can't tell anything from it) and it still costs you something.

If you do try to do a cancer diagnosis on a poor x-ray, you are basing cancer treatment on the flip of a coin.

No cancer? Well, you might fill them up with expensive chemotherapy, which is itself carcinogenic. Actually has cancer? Well, you've just set them loose with an organic time bomb in their system.


Yeah, it's definitely an interesting question. I took two full years of biomedical ethics, and I think you could make a pretty strong claim for either side.

I personally think that performing a basic reading of an ultrasound or an ECG/EKG would be fairly simple with some training, but I also have a lot of experience reading raw data. I have a few friends who are Radiologists, and I'm going to have to ask them what they think. I'm really not sure what they'd think about all this, and I'm definitely open to the possibility that I'm massively overestimating my ability to interpret even "simple" diagnostic data.


I'd be interested to hear what you find out. I have a buddy who basically lost his police force job because of a mis-read EKG, which caused him to be taken off of his ADHD meds, which doctors wouldn't re-prescribe to him, because they thought he was an addict trying to score. He has a real case of ADHD. He wound up losing his job and his condo-- everything, basically.

But I digress...


From a quick look on eBay, you can pick up a veterinary ultrasound machine for between £800 and £2000. Are these materially different from medical ones, or is the cost difference down to certification?


At least in Germany, the ultrasound machines that most vets have are second-hand from (human) physicians and look older. I have yet to be at a doctor's office in Germany that didn't have an ultrasound, even general practitioners ("Hausarzt"). The concept of sending a patient off to an ultrasound technician for a routine scan seems baffling to German physicians.

Sources: asking the two vets I've brought birds to here, own visits to doctors


Simple and somewhat cheap ultrasound machines do exist. I'm not an expert on this market, but you can probably source some from China.

R&D costs will be lower, and they will be notably inferior, with fewer capabilities than high-end systems; you can probably also keep costs down if you skip some certifications.


Low-volume test equipment is always expensive; see also: oscilloscopes, spectrum analyzers, etc.

You just don't see the volume that makes economies of scale kick in over your NRE (non-recurring engineering) costs.


Good point. I used to work at a mobile startup that subsequently got acquired and burned - the run rate for a company of ~140 people was astronomical.

Radio engineers are expensive and rare, scopes and even faraday cages are expensive, specialist software is expensive, handsets were a fortune when you had to buy hundreds of them.

We got sold for (I think) about $330 million, then you look at social media startups that go for a couple of billion. Nobody wants to do stuff that changes the world for a reason.


I would hope that this type of hardware cost can be offset by the fact that these devices don't suffer obsolescence the way other digital devices do, which need to be replaced because there are no longer 802.11b networks around, etc.


Yes and no. At work we have a scope that will let you save traces onto a 1.44 MB floppy disk. Other people have test gear that runs DOS or Windows XP.


There's also the issue of calibration and recertification. Some industries can skip it but I'm guessing it's a hard requirement in the medical field.


In many ways they're worse, especially specialized test equipment and protocol analyzers, because they're designed to evaluate the very thing that goes obsolete.


And yet oscilloscopes went down in price by a lot in the past couple of decades.


A scope with 1 GS/s sampling is $250 these days.


The FDA process is also extremely expensive. Not saying it justifies cost, but it is a significant upfront investment of time. Additionally, updates to the device over time have to go through approvals again.


One of the linked pages in the article illustrates to me the insanity currently surrounding health care and the FDA:

https://www.quora.com/Why-is-the-Philips-Lumify-ultrasound-f...

One of the responses to the question is a respectful, reasonable post; the other responds with the typical protectionist FUD that permeates this area. The second asks about the costs of misdiagnosis by people using it inappropriately, but ignores the lost rewards of appropriate uses that are curtailed, and the costs of overpayment due to lack of competition to determine what level of training and payment is actually appropriate.

The original article might be naive about some of the technological challenges associated with an ultrasound machine, but I think that's missing the point.

When a response to "why can't anyone buy an ultrasound machine" is to disingenuously reply "because you have to have the FDA ensure that it's working correctly and people aren't running around killing each other with it," it puts huge constraints on innovation and growth in this area. I can go buy a crowbar and kill people with it, so why can't I buy an ultrasound and use it to study muscle movement, or for education, like the author of the posted article is noting?

Plenty of technically sophisticated open-source efforts could exist, but they can't happen if there are arbitrary and unnecessary prohibitions on them. Maybe if the FDA said "hey, go to it" people would realize it's too hard, but maybe they wouldn't. We'll never know as long as there are unnecessary restrictions in the way.


> When a response to "why can't anyone buy an ultrasound machine" is to disingenuously reply "because you have to have the FDA ensure that it's working correctly and people aren't running around killing each other with it," it puts huge constraints on innovation and growth in this area. I can go buy a crowbar and kill people with it, so why can't I buy an ultrasound and use it to study muscle movement, or for education, like the author of the posted article is noting?

Nothing is stopping you making, buying, operating a toy ultrasound, so long as you make clear that it's a toy and not to be used in human health.


> FDA ensure that it's working correctly

This isn't really how it works. The FDA will expect you to be able to produce evidence that you know it works correctly, but they're not responsible for (or, IMO, capable of) determining whether devices work correctly. You make claims about what a device does, and you substantiate the claims with evidence. In addition, you create design and production controls that help avoid and mitigate device defects. The FDA can review your controls and defect records. The FDA will focus in particular on defects related to your device's hazards. Interestingly enough, many medical devices' hazards are related to the harm you might cause by accidentally dropping the device on, or shocking, a patient or caregiver. While misdiagnosis-due-to-defect is a serious hazard, it can be mitigated to some extent by relying on the skilled physician/technician operating the device.

IMO the controls are good practices that most businesses would follow anyway; they just get special attention because of regulatory control. But it's true: the cost of audits and other items related exclusively to regulation isn't free, and does add some effective cost to the device.

> huge constraints on innovation and growth in this area.

I think fear of the regulation being overly burdensome does limit innovation here. Is it a net win? I'm not sure. IMO the government could mitigate this by offering DARPA-challenge style grant competitions and marketing regarding the scope of their regulatory domain.


I'm going to be a cardiology fellow at Stanford in July; we use ultrasound often, both for informal bedside exams and for diagnostic echocardiograms. I've used handheld devices like the Lumify and Vscan, as well as the large, tractor-sized Epiq machines. Similar to what is written in the article, my impression is that the actual hardware, specifically the transducer, is quite expensive to manufacture. The software and processing power continue to get cheaper, but getting the best-quality pictures requires expensive transducers. I've actually been very, very impressed by the Lumify, and I think it is getting near, if not better than, the quality of the gigantic Epiq machines, primarily by having a very high quality transducer. This is indeed a hot area and, knowing people who are actively doing development in the field, there are people trying things like giant parallel transducer arrays over the entire chest for continuous 3D images, and other interesting ideas that are limited more by hardware than by processing power or imagination.


If you want to improve ultrasound, combine it with the positioning sensors of a VR system so the position of the sensor is known. Then you can do full tomography and build up a 3D model as the sensor is run over the body. For extra points, have alignment sticker targets you can attach to the body to track the patient if they move. Veterinarians would go for that.
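
For the curious, the core of that reconstruction is simple to sketch (this is only an illustration of the idea, not any shipping system; the pixel pitch, voxel size and pose format are all assumptions): every tracked 2D frame gets transformed by its probe pose into a shared world-space voxel grid and accumulated.

    import numpy as np

    VOXEL = 0.5e-3                                  # 0.5 mm voxels (assumed)
    GRID = 256
    volume = np.zeros((GRID, GRID, GRID), np.float32)
    counts = np.zeros_like(volume)

    def insert_frame(frame, pose, origin, pixel_pitch=0.2e-3):
        """Splat one tracked 2D frame into the voxel volume.
        frame: (H, W) image; pose: 4x4 probe-to-world transform from the
        tracker; origin: world-space corner of the voxel grid (metres)."""
        h, w = frame.shape
        ys, xs = np.mgrid[0:h, 0:w]
        pts = np.stack([xs * pixel_pitch, ys * pixel_pitch,
                        np.zeros_like(xs, float), np.ones_like(xs, float)], -1)
        world = pts.reshape(-1, 4) @ pose.T          # image plane -> world space
        idx = np.floor((world[:, :3] - origin) / VOXEL).astype(int)
        ok = np.all((idx >= 0) & (idx < GRID), axis=1)
        i, j, k = idx[ok].T
        np.add.at(volume, (i, j, k), frame.reshape(-1)[ok])   # accumulate intensity
        np.add.at(counts, (i, j, k), 1)                       # for averaging later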


> For extra points, have alignment sticker targets you can attach to the body to track the patient if they move

This is roughly how surgical navigation systems work. The leaders in this area are companies like BrainLab and Medtronic, usually using optical tracking sensors (e.g. from Northern Digital [1]), or sometimes electromagnetic ones, with accuracy in the 1-5mm range. For neurosurgery (the area I'm most familiar with), several companies offer tracked ultrasound integration, usually for live overlay and comparison with pre-operative MR or CT images.

> combine it with the positioning sensors of a VR system

For any folks interested in this, there is an active, excellent open source project called PLUS focused on ultrasound, tracking, and sensor data acquisition, as well as volume reconstruction [2]. There's also an associated 3d visualization ecosystem [3].

[1] http://www.ndigital.com/

[2] https://app.assembla.com/spaces/plus/wiki

[3] http://www.slicer.org


You can already get '3D' ultrasound, which is many individual images stitched together. It doesn't work if the baby moves around a lot. Expensive and not covered by health insurance.

I don't think vet budgets are very large or favour high tech solutions but maybe you know otherwise.


I had a 3D ultrasound done at this place - http://www.firstviewultrasound.com - not long ago. I guess expensive means different things to different people, but I wound up with a CD full of a few dozen images and some movies for somewhere in the neighborhood of $100, which I thought was a pretty good deal.


That is a good deal! A couple of years ago it was $500 but radiologists here are notoriously overpaid racketeers.


I wonder how GPUs are impacting medical imaging in this field. With a lot of post-processing done on a $500 card, can't you get pretty far?


Typical consumer VR hardware isn't close to accurate enough for this. These sorts of systems have been done, though, as has 3D ultrasound via sweeping. It works quite well for some applications (e.g. transesophageal).


If you have coarse positioning down to 1cm or so, and a bigger field of view than that with the ultrasound device, you can coarse align with the VR sensors and fine align by correlation. Somebody is probably doing it already.
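
A minimal version of that fine-alignment step is phase correlation between overlapping frames (illustrative only; real ultrasound registration also has to cope with speckle, tissue deformation and out-of-plane motion):

    import numpy as np

    def fine_align(ref, moving):
        """Estimate the integer (dy, dx) shift between two overlapping frames,
        refining a coarse pose estimate from the VR/tracking sensors."""
        spec = np.fft.fft2(ref) * np.conj(np.fft.fft2(moving))
        spec /= np.abs(spec) + 1e-9                  # keep phase only
        corr = np.abs(np.fft.ifft2(spec))
        dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
        if dy > ref.shape[0] // 2:                   # unwrap negative shifts
            dy -= ref.shape[0]
        if dx > ref.shape[1] // 2:
            dx -= ref.shape[1]
        return dy, dx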


You can, but it has a bunch of problems that are solved by better positioning.


I think this would be difficult for surface ultrasound as the surface (the patient) is deformable!

Varying degrees of pressure are used as part of the diagnostic process. For example, if you are trying to tell a vein from an artery, all things being equal, less force is required to compress a vein than an artery.

Typically ultrasound is used in an interactive way, not to generate static images for interpretation. This applies equally to diagnostics and procedural use.


Actually, ultrasound tomography is being used for some static imaging:

https://www.ncbi.nlm.nih.gov/m/pubmed/22194502/

It works via a different principle though.


It's probably worth mentioning that relatively low-cost, mobile ultrasound machines do exist. One company just recently received FDA approval for a wireless ultrasound machine. The target price seems to be ~$7k to ~$10k.

See news here: http://www.mobihealthnews.com/content/clarius-mobile-health-...

See company website here: http://www.clarius.me/


I think there are low end Chinese transducers that plug into a laptop that are $1500.

That said, the idea that an ultrasound machine is expensive is just laughable from a commercial/industrial cost perspective. OK, a mid-range ultrasound machine costs $50,000. $50,000!!!! Oh the agony!!! Oh please, a taxi costs $30,000 and I can get a taxi ride downtown for $8.

What's expensive is US medical pricing, where they charge $700 for an ultrasound when the machine itself has a capital cost of about $600/mo.
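
That ~$600/mo figure falls out of simple division, assuming roughly a seven-year useful life and ignoring financing, service contracts and the sonographer's time:

    machine_cost = 50_000
    print(machine_cost / (7 * 12))    # ~595, i.e. about $600/month of capital cost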


You got lucky. My insurance got charged 6k for an abdominal ultrasound. Of which I'm on the hook for 2k still.


Probably wouldn't be a good time for me to mention how much your dentist's x-ray machine costs.


TeleMed [1] is slightly higher-end and has a CE mark (IIRC in the $1-2k range for the uncleared/demo version). UltraSonix [2] used to be a relatively reasonable option (IIRC under $10k), but they were bought by BK and prices may have increased.

[1] http://www.pcultrasound.com/

[2] http://www.ultrasonix.com/


When cheap, unregulated Chinese clones of ultrasounds like this one (http://intelametrix.com/FullSite/) start coming out, that's when you'll know ultrasounds can be cheap.


Aside from the technical issues with the machine, ultrasound imaging is harder to perform (in terms of scanning technique) and much more difficult to interpret than a high-resolution MRI or even a simple X-ray. It takes significant training even to find specific organs, and distinguishing between, e.g., a gas bubble in the small intestine and a potentially dangerous foreign object or even a tumor takes a lot of education and experience.


What I find missing from this discussion is the safety aspect of wholesale ultrasound. Yes, we've been told they're perfectly safe. That was once said of x-rays as well, and physicians were routinely x-raying fetuses in the uterus up until the 1970s. That was before the evidence of birth defects from exposure to x-rays became so overwhelming that the practice finally stopped.

See http://sarahbuckley.com/ultrasound-scans-cause-for-concern and references quoted therein for a good overview of the current discussions on side effects of routine ultrasound screening, including tissue damage due to cavitation and hearing loss in fetuses.



The regulatory burden is not so high that it accounts for the cost of most USS machines - that's just a mirage created by the large corporates who want to defend their place in the market. This isn't just seen in medical ultrasound, but across healthcare.

The regulatory and evidence burden is hard - but it's not insurmountable and it certainly shouldn't stop change.


I don't have experience with medical ultrasound, but I have built a few basic ultrasonic systems around a PC or a single board computer for structural testing applications. If you're not doing phased array ultrasonics you can build a system for under $1000USD, possibly less if you can use PVDF instead of PZT. The expensive part is the software.


The CMUT or PMUT transducer array, supporting hardware, onerous regulatory environment, and price elasticity in Western medicine.


I like the question; it's worth asking as technology improves. There are lots of great comments too about the challenges, which are non-trivial. Generally my use of ultrasonics has been constrained to robotic localization and party tricks, but imaging is an interesting question too.

Arrays of sensors are still hard: driving them, reading them, and interpreting them. That particular engineering problem hasn't gotten that much easier.


I met a guy in Chennai, India who sold ultrasound machines some time back. On enquiry I learnt they don't sell any of the big brands because they cost millions and can't be afforded by most semi-urban or rural health facilities.

Instead, the one they sell is locally made and costs around 10K tops. He did claim there is not much tech in these devices, and I took him at his word then. It makes sense now.


What about emerging markets? If the machines are so expensive, those markets probably don't have access to them. Low-cost, low-performance medical devices that target emerging markets seem like a great area to invest massive amounts of money in, with a realistic chance of generating revenue.

Beware: just my thoughts, I really don't know much about the market or the money needed. I am probably wrong.


A company I co-founded, Shift Labs[1], is trying to do exactly that. The CEO, Dr. Beth Kolko, quit her academic post to found the company out of her frustration with low industry uptake of low cost medical devices emerging from academia.

[1] http://www.shiftlabs.com/


Having been involved in designing and building ultrasound systems for a couple of decades, I thought I'd comment on this in a blog post for those looking for a little more detail on the issues involved in building and selling ultrasound devices for medical use.

http://liesandstartuppr.blogspot.com/2016/12/why-are-medical...

A quick summary - they're not expensive; in fact, for what you get they are remarkably inexpensive. There's a huge amount of work that goes into them, and the author of the original piece simply doesn't know enough about the subject to realise he doesn't know what he's talking about.


Unfortunately, there is only a single line in the linked blog post regarding the economics of the situation: "made without the benefit of mass volumes (no millions of devices here)". The rest of the post is essentially meaningless, since it doesn't address the cost issue. The complexity in a sub-$1k mobile phone is far greater than in a typical ultrasound imaging device currently in the field, but the economics of mass production enable massive reductions in price.

In other words, you've provided some suggestions regarding the difficulty of the task, but have not in any way proven or convincingly demonstrated that sub-$1k ultrasound imaging devices are an impossibility.

From my POV, having also spent many years designing imaging systems and other devices to measure the human body (ultrasound, EEG, EKG, MRI), the biggest challenge to providing cheap consumer-grade tech here is the regulatory burden. If anyone wants to buy a sub-$1k ultrasound imaging device, there are plenty of unapproved models on Alibaba that work just fine -- provided you're willing to test the device yourself for safety, to ensure it meets your own risk tolerances.


Upvoted. I love it when someone who really knows what they are talking about chimes in.


Having started my career making military radars, I've always been amused by tourists who criticize the cost of products that are required to have extreme performance and reliability, in industries that are highly regulated to ensure compliance. The best simple explanation I've ever seen for the seemingly excessive cost was from an episode of The West Wing:

https://youtu.be/7R9kH_HOUXM

While an ashtray may seem trivial, this example shows that in life-or-death situations, every detail must be considered and doing so is not cheap.


One thing that is perhaps not obvious: the rapid iteration of constituent parts has driven a lot of consumer gear prices incredibly low, in part because vendors can rapidly rework device internals to capture cost savings from newly available parts.

In medical devices this doesn't really work, and actually becomes more of a problem than a benefit, at least currently. You cannot change the constituent parts of your device without a lot of work, so a rapidly iterating supply can cause you trouble with getting a stable supply of parts for several years. This is (only) one of the reasons that medical grade monitors are so much more expensive than you might expect.


I am wondering if anybody knows of an open-source project focused on designing the hardware and software for “recreational” ultrasound machines. I think that, like 3D printers (RepRap etc.), there would be a sizable group of hobbyists and tech-savvy individuals in the medical community who would be interested. I don’t know quite enough about the electronics to initiate something like this, but would definitely pitch in.


And if the cobbled-together ultrasound machine misidentifies a thyroid nodule, leading to it not being FNABed, you are looking at a possible death from thyroid cancer...


I think such a project would be very important for underserved areas in human and vet medicine. Having a portable ultrasound is invaluable in diagnosing many conditions.


Make a cheap ultrasound device for non-medical use, and let people use it at their own risk. This will still save many lives, especially in poor countries.


Products are sold for prices the market will bear, which is only loosely related to the cost of making them.

Or rather - if you sell a fixed number of items and could only reasonably justify a certain margin, then you don't want to make it cheaper.

If barriers to entry are high - then you sit on your cash cow.

This issue is very prominent in healthcare for both services and equipment.

It's a very costly problem.

Insurance companies want your bill to go up, not down, so they don't act as aggressively as they could to cut costs for small items. Hospitals - same.

Because of the vast costs associated with regulation, overhead, marketing and near monopolies on many products, combined with massive 'price inelasticity' on the part of the buyer (i.e. you'll pay 'whatever' to get fixed) - you get a problem.

My parents both worked in pharma; it's an industry flush with cash - they spend big on everything: offices, equipment, staff. They have a doctor's sense of entitlement - after all, they are 'saving lives'. And it is serious business; you can't hack your way through most of it.

So as the underlying expenses and regulatory costs go up - so do all the ancillary costs. Add that to the misaligned market incentives and price inelasticity ...

And you get unbelievably expensive healthcare.

I firmly believe you could train a smart person to do an x-ray, reset a bone, and put on a cast, and do it for under $1K. And it would probably cost $10K in a hospital, sans insurance.

Now - the first 'problem' in that scenario is that doctors are often paid to be good 'when things go wrong', and to get their yield way up (i.e. can't make mistakes) and both of those things are very expensive: you need to have 10 years of 'extra training' for the 1% of the time something weird happens.

Fair enough - but I still think many of those things can be parameterized.

Costs will not come down until there is an agent forcing it: the government, or preferably, another kind of provider.

Walmart has an approach to business like no other: they force their suppliers to open their books a bit, force their costs down - and then pass all the savings on to the consumer. It's something few understand. Their strategy is volume, and they have an ethos of sucking producer surpluses right out of the value chain.

If Walmart could feasibly get into the healthcare game on the low end, it could send waves right through the industry, which would be good.


>Products are sold for prices the market will bear, which is only loosely related to the cost of making them.

Anyone remember:

https://en.wikipedia.org/wiki/The_Hudsucker_Proxy

The calculation of costs and the determination of the retail price is IMHO a masterpiece.


> I firmly believe you could train a smart person to do an x-ray, reset a bone, and put on a cast, and do it for under $1K. And it would probably cost $10K in a hospital, sans insurance.

Imagine that your solution is a fraction of a percentage point worse than current treatment. Imagine there's a 0.1% increase in harm.

The English NHS sees 1m patients every 36 hours. In 2012 - 2013 there were 9 million ultrasounds.

https://www.england.nhs.uk/statistics/wp-content/uploads/sit...

0.1% of 9 million is 9,000.

I wouldn't want to tell those 9,000 people that their treatment was good enough, even though they were harmed.

And that 9,000 is just in England.

If you want to save money on ultrasound spending you probably want to reduce the numbers of ultrasounds being taken. Healthy pregnant women with no problems only need one ultrasound, but in some places they're offered very many more.

http://www.wsj.com/articles/pregnant-women-get-more-ultrasou...

> In 2014, usage in the U.S. of the most common fetal-ultrasound procedures averaged 5.2 per delivery, up 92% from 2004, according to an analysis of data compiled for The Wall Street Journal by FAIR Health Inc., a nonprofit aggregator of insurance claims. Some women report getting scans at every doctor visit during pregnancy.


The way I see it, the benefit of cheaper diagnostic technology is the difference between no access (or extremely limited) to that technology and access to an inferior but otherwise capable technology.


But the US drastically over tests.

It's better to just cut out those needless, and potentially harmful, surplus tests than to add more needless tests with greater risks of harm.


I agree to a certain extent. I think it would mostly be useful in developing areas where you might have someone who has training to make simple diagnostic calls (should this person be put on a 2 day bus ride to a hospital), but doesn't have the tools to make the diagnosis.

Basically, places where buying $10k of diagnostic tools wouldn't be tenable, either because of the price, or because they would get stolen or damaged before they could "pay off".

Places where I think this might be useful are places like Nepal, Sudan, Pakistan, India, Niger, Mongolia, etc. Places that have low development and population densities.

I think a DIY instrument would be especially useful in India, given the fact that it has low development levels but a lot of highly educated individuals and a strong central government.

That being said, this is all speculation. I don't know enough about ANY of those areas to say whether people there would actually find tools like this useful. I'm definitely not suggesting we start filling shipping containers with cheap instruments and shipping them abroad.


Alibaba sells fully featured, (usually) non-FDA-approved veterinary ultrasound machines for as little as $800 USD - this includes a monitor, control panel, etc. That is probably pretty close to commodity pricing for a specialised device.

There are also wireless ultrasound probes for around $800 USD - these could potentially come down in price too.

The reason commercial ultrasound machines cost $50-500k is that hospitals are paying for:

- a brand name with reliability behind it

- the ultrasound rep coming to demo the machine a few times before and after it is purchased, as well as bringing in some platters of food. The reps also bring additional machines along for courses on using the ultrasound, for intern teaching, etc.

- a support contract

The cheap, ubiquitous ultrasound machine seems like a great idea, and is useful in some settings (like ER especially), however there are some significant issues - probably best understood with the example of echocardiography.

Performing an echocardiogram (an ultrasound of the heart) is a highly specialised field with never-ending levels of complexity. Firstly, you are often dealing with inadequate images due to the patient's obesity or other anatomical factors, so less experienced operators get worse images, which can make interpretation impossible. Secondly, even when you do get good pictures, it is a very subjective area, and two operators will commonly have divergent results for the same scan. Thirdly, there are dozens and dozens of parameters which can be measured or calculated and which are used as surrogates for functional measurement of the heart - these are being proven or disproven, or becoming fashionable or falling out of fashion, over time.

Practitioners need to perform a certain number of echos per year to maintain base competency, and those with low numbers generally perform much worse than those who do echos every day.

A quick, goal directed, focused echo can yield useful results, and does work a lot better than a stethoscope, but then many would argue that allowing allcomers (physicians/ED docs/anaesthesiologists/ICU docs) to perform poor quality scans is a step back from having more specialised doctors performing fewer, high quality scans.

So overall, the issue is probably not that the machines are too expensive, it is that we have not worked out exactly who should be doing these scans. The truth lies somewhere between a very few people (ultrasound trained cardiologists/radiologists) and everyone, but we are not sure exactly where.

Overall I think that the technical advances here will not come from building cheap open-source ultrasound machines (although that does sound fun!), but from improving ultrasound machines' ability to acquire and interpret pictures themselves. This may mean having a remote telesonographer who guides and interprets a scan performed by a layman (e.g. a nurse), which would allow rapid, remote results without the telesonographer being present (and could also allow excess sonographers in some locations to serve places where they are scarce, or daytime sonographers on one side of the world to help scan patients at 3am on the other side, or outsourced remote Indian/Filipino sonographers for cost savings).

Alternatively, new tech could perform scans automatically (e.g. robotic arms, or a human operator guided by instructions or haptic feedback) and then do tech-guided interpretation - e.g. generate all the important data from the information given and present it at a level appropriate to the person requesting the scan.



