Contractor admits planting logic bombs in his software (arstechnica.com)
115 points by pkilgore on Dec 19, 2019 | 151 comments



In 2011 Tinley had refused to hand over the password to unlock the spreadsheet for editing when asked, claiming he was protecting his work product.

> For years, the spreadsheet would glitch, Tinley would be hired to come in, would "fix" it, invoice Siemens, and head out again. But that all changed in May 2016 when Tinley was out of state, and Siemens called again about the spreadsheet. The company had an urgent order it had to put through, it told Tinley, and it wasn't working properly again. Pushed, Tinley relented and handed over the password.

https://www.theregister.co.uk/2019/06/25/siemens_logic_bomb/


Must have been a brain freeze moment to hand over that password.


Sounds to me like they used Tinley being out of state as an opportunity to push for the password. I’m sure they were suspecting something at that point.


My guess is that they started suspecting something when he refused to hand over the password the first time, or at most the first time a problem occurred and he was the only person capable of fixing it.


Wow, does 6 months in jail seem a little severe? How does one even get someone prosecuted for this crime?

We hired a licensed plumber on 2 occasions - to install a sink and later a shower.

We just had a different plumber out because the sink was plugged up. He pointed out that the prior plumber had installed the sanitary tee upside down, basically guaranteeing it would eventually become clogged.

We also had him look at the shower because we couldn’t figure out how to get the screen out to clear the hair. Turns out the grate was also installed upside-down, and the screws holding the screen in are inaccessible. So there is no way to get it out without demoing the shower.

Should this plumber be sentenced to 6 months in jail?


I understand it depends on whether the FBI can show malice, planning and intent, and on the size of the damage ($42k in the Siemens case). So in your plumber case it's probably impossible unless there's a pattern of fraud rather than just one mistreated client. If a plumber actually tried to make a fraudulent business model out of such behaviour, then I could totally believe they might get jail time. Furthermore, if you sue the plumber yourself, that's a civil case, so jail time is not possible at all (the Siemens case was handled by the FBI).

I personally do not think that 6 months in jail for stealing $42k from somebody is so severe; to be honest, that's a lot of money.


Right.

If you stole a car (often valued at less than $40k), you could have gotten 1-9 years in the USA (depending on the value of the car).


If he's purposely making mistakes to get future work, then I don't think 6 months in jail is that bad. It's just plain fraud, isn't it? Not to mention an expensive inconvenience for all of his customers who have to deal with his shoddy work.

On the other hand, it's entirely possible the plumber made an honest mistake.


This sounds similar to planned obsolescence, so the moral is if you design it that way, as a manufacturer, you are ok. Definitely a grey area here.

Edit: After some thought, I see a precedent. My car has parts that don't become obsolete; they flat-out break, requiring never-ending service. Surely I can sue for fraud, as the auto company has the ability to use another means. (Devil's advocate)


Deliberately introducing errors in one's work to defraud an employer is not remotely close to planned obsolescence. Planned obsolescence is not the deliberate introduction of malfunctions, it is the engineering tradeoff between longevity and other characteristics.

For example, Apple noticed that most customers replaced their phones within 2-3 years. A lithium ion battery's lifetime is determined by its charging and discharge characteristics. Its life is extended if it is charged more slowly and neither charged nor discharged fully. But this increases charge time and reduces usable battery capacity - both of which are key selling features for smartphones. So Apple optimized the iPhone's battery management to deliver good charging and usable capacity, with the tradeoff that it would degrade significantly beyond 2-3 years of use.

This contractor's activities would have been analogous to Apple introducing code that compares the current timestamp with the timestamp from when the phone was sold and degrades performance when that difference exceeds a threshold.
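
Purely to make that hypothetical concrete (an invented sketch, not anyone's real code; the names and numbers are made up):

  # Hypothetical sketch only -- NOT real iPhone code; names and numbers are invented.
  import time

  SOLD_AT = 1_500_000_000              # assumed timestamp recorded at point of sale
  THRESHOLD = 3 * 365 * 24 * 60 * 60   # roughly three years, in seconds

  def should_degrade(now=None):
      """Return True once the device is 'old enough' to be throttled."""
      now = time.time() if now is None else now
      return (now - SOLD_AT) > THRESHOLD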


Weren't there some printer manufacturers doing almost exactly that with ink cartridges?


"Deliberately introducing errors in one's work to defraud an employer is not remotely close to planned obsolescence"

Perhaps, but it's two points on the same continuum.


No they are not. One is intentionally, secretly, and illegally introducing flaws to defraud an employer. The other is managing tradeoffs based on technical constraints.


I don't think that planned obsolescence is unintentional. If it's not completely secret, it's not admitted to either, and it's not illegal because it's not illegal.

Is your point that there isn't a continuum from legal actions to illegal actions? Isn't it possible, and common, for two instances of similar actions to be on the right and wrong sides of the law?


Planned obsolescence is not at all unintentional, it is highly deliberate. Every product needs to plan for its intended lifespan, from phones to nuclear reactors.

Planned obsolescence and what this contractor did are not on any sort of continuum. This contractor was not making a trade-off; he deliberately sabotaged his work. That is nothing like optimizing a phone's battery life for a certain number of years, or making similar tradeoffs.


I agree with you to the extent that companies design their products to become obsolete in a way that actually weighs all of the trade-offs, makes optimal use of the resources at hand, and does not intentionally shorten the lifespan of the product in an attempt to boost sales at the expense of generating additional waste and reducing the value of the finished product. Once you start doing that, you ARE being intentionally destructive.


What I think you're ignoring is that they are both examples of utilizing dishonesty for profit.

Planned obsolescence could in the abstract be just about making a trade-off between price and durability, but in all real instances it leverages the fact that the consumer can't reliably assess the ratio of cost to lifespan of a product. If the trade-off was transparent to the buyer, then competition would produce much more durable products, even if not infinitely durable.


> it's not completely secret, it's not admitted to either, and it's not illegal because it's not illegal.

No, your assertion is quite wrong. All engineered products have service lives, because they all have failure modes that are taken into account in the design process. If a phone's failure mode consists of the battery dying, then the battery will be designed to ensure it will likely work well for a minimum of X time following a typical usage pattern. Likewise, an airplane's failure mode is the fuselage failing due to fatigue, so it is designed to be flown X times (takeoff/landing counts, number of hours in flight), taking into account the likelihood of cracks forming and propagating. Targeting a certain service life in a design does not mean there is mischief in play. It only means engineering.

Thus obviously planting time bombs has absolutely nothing to do with engineering a product.


You're talking about 2 different things. The implication with "planned obsolescence" here is that the device becomes nonfunctional or less functional because the manufacturer simply designed it to fail after a while. See for example the Phoebus Cartel, which actually fined manufacturers whose lightbulbs lasted more than 1,000 hours.


> because the manufacturer simply designed it to fail after a while.

You're missing the whole point. Everything fails after a while. Everything. It's the engineer's job to ensure that it fails only after X amount of use, which can be expressed in time of normal usage. There are no time bombs, only the fact that the designers picked a short service life.


The point is not that tradeoffs are inherently unethical; it's that in practice, deceiving the customer about those tradeoffs affects how they are made. You go to the supermarket and they have to post unit prices so you can compare, although this is gamed as much as possible. But you go and look at shoes, and you can only guess whether the $100 pair is going to last proportionally longer than the $80 pair. For some things, like tires and cars, you can get lifespan information from others' experience and from testing, but even so, there is still a segment of the market buying the poor-quality products that is being deceived, in the sense that they might well choose differently if it were thrown in their face how they are being screwed.


> Perhaps, but it's two points on the same continuum.

They really aren't. All engineered products are designed to meet a target service life. Even today's houses are built to a design lifespan of 50 years. A consumer product is no different.


Saying all products have an engineered service life is beside the point. Planned obsolescence is not transparent. They don't tell you what those engineering decisions are - they don't put labels saying this item costs $X/year and this item costs $Y/year. Consumers would choose differently if they had trustworthy information for everything. Saying people are ok with that is like saying people choose to get melamine in their milk if they choose the cheapest.


It feels like you're being obtuse on purpose.

Surely you understand the difference between a part that naturally wears out over time due to friction and wear and tear, versus a part that's deliberately installed incorrectly to cause a malfunction?


Yes the plumber obviously went to upside-down plumbing school where he learned the profit-maximizing techniques of installing everything upside-down.


Or he made a mistake.

Is your argument that the programmer in the FA was tired after a long day and accidentally concocted a scheme to defraud his client of $40,000?


Nah he's probably just Australian


You're assuming the guy did that just to you. If he does the same in every installation, I'd say that six months is not enough.


It seems the logic specifically designed to fail at a certain date is the reason. If stuff breaks because of a screwup, or at least something that people can be convinced is a routine screwup, that's one thing. Actually creating a new module to make something fail is something else.

It's like if the plumber designed, built, and installed a device specifically to make your plumbing leak or clog or something at a specific point in the future, instead of just installing a normal and expected device incorrectly.

The law does tend to take intentions into account for crimes and punishment. Killing somebody by a freak accident is different from making a plan in advance to kill somebody and executing it.

This does make it feel rather odd that it's legal to DRM things though.


"It's like if the plumber designed, built, and installed a device specifically to make your plumbing leak or clog or something at a specific point in the future, instead of just installing a normal and expected device incorrectly."

Intent is what matters. The distinction you're describing just affects whether it's deniable.


Well yeah. Doing something that makes it easy to prove that your intent was malicious does tend to change things a bit.


No he shouldn't. We need to stop sending people to jail for all but the most heinous of crimes. He should be ordered to pay you back twice what it cost to install and repair.


It sounds like you're confusing criminal liability with civil liability.


No, I just don't think jails solve the problems besides costing taxpayer money. Maybe more things should be considered civil.


Does it matter? Practically, what seems more just to you: jail time, or restitution with additional penalties?


That would be interesting, the victim gets to decide the penalty (up to a limit of course). The perp would have to plead their case to the victim directly, not just the judge. No jury necessary unless requested.


Twice seems very low.


Just sued for the cost of putting the job right.


Unlike code, law considers intent.


6 months for fraud doesn't seem unreasonable.


That age-old saying comes to mind: “Never attribute to malice what can be sufficiently explained by incompetence.”


Also known as: "Any sufficiently advanced incompetence is indistinguishable from malice."

However, I tend to disagree with this approach when the final result is that the "incompetent" one stands to profit from their alleged incompetence or ignorance.


How about adding your judgement, based on the specific evidence, of the likely cause? One of:

1. Ignorance/Incompetence -- wasn't paying attention?

2. Gross negligence -- doing it wrong was somehow quicker and cheaper.

3. Fraud -- calculated to fail.

p.s. Dealing with a similar case of malfeasance myself just now (electrical). Looks to be about 20% #1, 80% #2.


Did the previous plumber install a timer on a pipe, ensuring it would leak every 6 weeks?


The "timer" was based on hair amount in water flow.


Any amount of jail time for non-violent crimes seems excessive.


Shows the importance of code reviews.

I wonder why this is illegal, but it's legal for hardware to deny service or even break stuff when it detects you're using something the vendor doesn't like (I'm referring to printers, but I also remember a case where a microcontroller would try to brick something when it detected a counterfeit cable).


>I also remember a case where a microcontroller would try to brick something when it detected a counterfeit cable

You might be thinking of the big FTDI scandal, where they published a driver update for their chips that would attempt to detect a counterfeit FTDI chip and, if it found one, reconfigure the counterfeit chip with a bogus VID and PID, thus rendering it essentially bricked. Bricking the cables using counterfeit chips wasn't the worst of it, though: some of the chips in question were integrated into expensive equipment, and rather than the manufacturer deliberately trying to use counterfeit FTDI chips, they may well have been victims themselves, unwittingly sold counterfeits while paying the fraudster full price.


Yes, I was thinking of FTDI :)


> the logic bombs Tinley surreptitiously planted into his projects caused them to malfunction after a certain preset amount of time

Probably the timed aspect of it is the issue. Selling something that purposefully breaks in that way is malicious. Unless he comes up with a truly fantastic excuse but it doesn't sound like he did.


>Probably the timed aspect of it is the issue. Selling something that purposefully breaks in that way is malicious.

An overflow of, say, a milliseconds counter would be just the thing. After all, a lot of software did this trick with the 2-digit year counter in the 20th century, guaranteeing that massive Y2K upgrade and contractor call. And the UNIX 32-bit seconds counter comes to mind too - those guys, I guess, were planning long-term for a very plush retirement.
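
For scale, the UNIX one really is a long game (a quick sketch, nothing from the article):

  # When does a signed 32-bit Unix seconds counter run out? (The "2038 problem".)
  from datetime import datetime, timezone

  max_seconds = 2**31 - 1  # largest value a signed 32-bit time_t can hold
  print(datetime.fromtimestamp(max_seconds, tz=timezone.utc))
  # 2038-01-19 03:14:07+00:00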


I think the intent to damage the systems is the issue, along with the fact that he confessed.


What was the case of the microcontroller detecting the counterfeit cable? I don't think I've ever heard of this.


Probably referring to FTDI-Gate, which is where FTDI shipped a driver that bricked counterfeit usb-serial converters.


I am not sure about a case with cables. But there was a semi-famous case of FTDI USB drivers purposefully bricking counterfeit FTDI chips.

Counterfeits were not that uncommon if you bought a cheap USB-to-serial or USB-to-TTL dongle online.


APC does this on UPSes. It’s an extremely bad practice that drives people crazy. They use a standard connector, like serial, RJ45, or USB, but with a non-standard pinout and give you a custom cable. God help you if you throw that cable in a box with other standard cables. And if you plug a standard cable into this non-standard port, the UPS panics and completely shuts down, including anything you have connected to it.

APC devices are generally pretty good, except for this infuriating and dangerous “feature”.

It’s almost 2020, and vendors still have this ridiculous idea that they can lock you into their proprietary ecosystem by doing stuff like this.


I actually know why this is a thing! Originally, APC UPSes used RTS/CTS flow control on the RS-232 connector for communication with the host PC. They wanted to maintain this compatibility (which used a nonstandard wiring) when they later added the Smart-UPS protocol.

That said, there's absolutely zero reason to keep maintaining this ancient and dangerous option.


I think they call this cable an RJ50 cable, although I'm not sure if it's the same one that you're referring to.


iPhones do this for the charging cable. They don’t brick as such, they just don’t charge. Sooner or later that gives you a brick.


I would prefer that cheap “charging” cables not turn into house fires thank you.


Network switches do something like this iirc.


It's probably covered in line 3,781 of the EULA for the printer's firmware.


Historically it wasn’t written down anywhere, but printer companies lost some lawsuits so now it’s actually written on the outside of the box.


...which are definitely enforceable right? I’m trying to remember why we put up with them as a society.


Because we have no effective replacement yet. Common sense, unfortunately, doesn't work for everyone.


Isn't it pretty common for a laptop to complain about a non-OEM power supply these days? Not to mention batteries, probably.


And coffee pods!


Apparently it was a password-protected spreadsheet.

Which seems like incredible incompetence on the company's part: to accept code in that format in the first place, and to not have demanded the password when the first issue arose.


On the one hand, yes, that's crazy. On the other hand, an argument can be made that companies accept proprietary software in binary form all the time, and this is no different!

Still a good laugh from the sidelines...


> On the one hand, yes, that's crazy. On the other hand, an argument can be made that companies accept proprietary software in binary form all the time, and this is no different!

It's a little different if only from a professional dignity perspective, lol. Also, every senior programmer I've ever worked with who makes use of third-party binaries will find a moment to say, "I'll just decompile it if it ever gives us any real trouble!"


Corporations have no professional dignity.


We deliver source to the customer, and include compiled binaries and installers as a (paid) courtesy.

Without the source, the customer loses the ability to switch software contractors, which is against their procurement rules. Even if you don't have the clout of being huge, not controlling the source for business-critical software is basically putting the supplier's gun up to your own head.

Companies that take binary-only delivery have obviously never hired anyone who could tell them they shouldn't do that.

Always get the source, or write it in-house.


I once worked for a company that had a spreadsheet for plant production planning. The spreadsheet was put together by a consultant, but it was not protected - the company adjusted it constantly to maximize productivity at the plant. The consultant made more than the plant manager because he was the only person who understood the spreadsheet enough to make meaningful changes as required, e.g. when a machine on the line was replaced with another having a different capacity.


That's the ethical way to do it. If you give a business everything they ask for in requirements, the end result is probably going to be absurdly complicated for anyone to edit even with documentation, training, and open code.


We offer escrow of our source as an option for our customers who have concerns or policies about mission-critical software. The escrow includes instructions about tooling needed & versions (Visual Studio and others), any third-party (Telerik) or open-source things we use, database creation scripts, etc. A couple have run DR drills where they pulled the escrow, went through the instructions, and produced executables.


The vast majority of companies rely on Microsoft and other proprietary software.


I was not discussing computing infrastructure, such as the OS or office productivity suites, but rather custom business logic that is specific to the company.


If you dump the XML and remove the <sheetProtection password=/> line you don't even need to ask for the password.
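
A minimal sketch of that, assuming a modern .xlsx (which is just a ZIP of XML) with plain sheet protection rather than whole-file encryption; the file names here are made up:

  # Strip the <sheetProtection .../> element from every worksheet of an .xlsx.
  # Sketch only -- assumes a modern .xlsx, not a legacy .xls, and no file-level encryption.
  import re, zipfile

  with zipfile.ZipFile("protected.xlsx") as src, \
       zipfile.ZipFile("unprotected.xlsx", "w", zipfile.ZIP_DEFLATED) as dst:
      for item in src.infolist():
          data = src.read(item.filename)
          if item.filename.startswith("xl/worksheets/"):
              data = re.sub(rb"<sheetProtection[^>]*/>", b"", data)
          dst.writestr(item, data)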


About 30 years ago, when I was just a kid, my parents would make me go with them when they went shopping. I would always go wander around the electronics section of the store (I'm thinking of Walmart, specifically) to play with the computers.

At that time, Windows 3.1 was the latest and greatest and is what was running on all of the "display" PCs. Unfortunately, the password-protected screensaver was almost always activated -- meaning you couldn't actually do anything on the PCs.

Luckily, those passwords were easily bypassed! All you needed to do was simply power cycle / reboot the machine. Once it started booting up, you would just hit CTRL-C to interrupt and terminate the "autoexec.bat" file. Next, change into the "\WINDOWS\" directory, and open up some .INI file [0], find the line where the password was set, and delete everything after the "=" sign. Finally, save your changes, exit the editor, and hit CTRL-ALT-DEL to reboot.

The PC would start up and launch into Windows as usual, but without any password protection for the screensaver.

(A slightly older version of myself may have, allegedly, then set his own passwords on occasion.)

[0]: "sys.ini", "windows.ini", something like that.


My recollection is that you can also stop AUTOEXEC.BAT from running by holding the shift key.


Warning: This being true, it can still be construed as "hacking" no matter how simple it is. Just because the barrier is easy to get over, it doesn't mean you're legally allowed to enter.

This person asserted the spreadsheet was his "work product". Presumably Siemens's lawyers found this convincing enough to be wary of hacking around the password.


I have a different view. Considering the company employed "Person A" to create a spreadsheet, why would it be a violation to employ "Person B" to gain access to it in the absence of "Person A"? The company owns all of the technology and employs all of the personnel involved, no?


> company owns all of the technology and employs all of the personnel involved, no?

No, he's a contractor. So what they own precisely should have been defined in the consulting contract.

It may have been perfectly legal for him to password protect the output, much like using an obfuscator - but the time-bomb stuff is fraud any way you cut it.


Wasn't he a consultant, not an employee?


Nice tip!

I found the following screenshots/guide showing that "sheetProtection" includes "algorithmName" and "hashValue" but no "password" attribute... https://www.excelsupersite.com/how-to-remove-an-excel-spread...


Haha really? That is hilariously insecure.


I surmise that the password feature is not meant for true security. It's not protecting the whole document, just the spreadsheet formulas and VB code. Requiring a password for changes to the formulas obviously prevents accidental changes, and makes it unambiguously clear that only some people in the enterprise (those who know the password) are 'supposed' to make these edits. Quite clever, as far as it goes.


Makes sense. The software needs to access that code for the spreadsheet to work, so if it were encrypted, for example, the spreadsheet would be unusable unless you knew the password.


I guess in any case, if you really care about security, you could just encrypt the file separately and only send someone the encrypted binary.


The Torx screw of presentations.


But then again, how would you make it secure? Clearly, the program itself must have full access to the file (formulas and all), otherwise you wouldn't be able to do anything with the spreadsheet (effectively turning it into a PDF). Short of encrypting the entire spreadsheet and sealing the key in a TPM, it's impossible to implement edit protection.


You must be crazy to use proprietary closed source software. Oh, wait...


Closed source software prepared for and used by only one client is a lot different from software distributed to and used by many clients.


I'm sure there are a lot of password-protected spreadsheets in the finance industry.


Yep. If it were me, I would hold the supervisors responsible.


Of course you would, developers are always blameless!


I read the DOJ link and it just states “intentional damage to a protected computer.”

If he had accidentally written sloppy code that happened to break periodically, would that have been illegal? I don’t fully understand what law he broke and how such a law would not also apply to the seemingly infinite cases of built-in obsolescence.


> If he had accidentally written sloppy code

The damage was intentional (i.e., not an accident).


Right. In this case he pleaded guilty and admitted to having done it. I am just curious as to which specific laws he broke and whether it is possible to inadvertently break these laws by being a horrible programmer.

If he had intentionally created a spaghetti code mess that just happened to break from time to time, would that be different? I assume intention is difficult to prove in court, but I am not an attorney.


> If he had intentionally created a spaghetti code mess that just happened to break from time to time, would that be different?

Clearly, yes. Here Siemens decided to go to court because it was obviously malicious code. With a spaghetti mess they would just have hired someone else to clean it up.


> I am just curious as to which specific laws he broke

18 USC 1030, or:

> > intentional damage to a protected computer


Of course, if he had just written bad spaghetti code with no boundaries and no tests that breaks everything from time to time, he'd have locked in the company, made other developers' experience a nightmare, and so on, and gotten away with it.

I think this might explain a few things I have seen or heard about in my life.


Well, only if it was password-protected bad spaghetti code; otherwise they would have had a better developer look at the spaghetti and unravel it. But he wouldn't have gotten any jail time.


I'm thinking more about a 'soft lock-in' situation: the developer writes bad code, but not so bad that you need to start over or stop using it. Just bad enough that only they will understand it, and other people who try to contribute will get mad and waste a lot of time and effort dealing with it, so in the end the bad developer continues writing their shitty code because no one else wants to get involved (and there are plenty of opportunities elsewhere).


I'm curious about that too. If you were a skilled programmer who studied common bugs, you could introduce those bugs intentionally but with plausible deniability if caught. Then again, in a smaller system that would be easier to detect. But in a multi-page web app or something with a big back end, you could definitely slip those in.


I once wanted to plant a logic bomb for a client, a startup that for months (years?) prioritized paying others over me. They had run up $30K in debt to me as they told me numerous times that the sky was falling and that they’d pay me as soon as the next money came in. They had just raised hundreds of thousands but paid their own salaries and a large empty office instead.

I knew I’d have the upper hand if the site suddenly stopped working. But I was afraid of some kind of “hacking law” about “exceeding access” or whatever (probably stupid, given what was realistic) and never did it. My only acceptable option was to do a DMCA takedown at AWS, because they had never signed a copyright assignment.

Anyway, long story short, I never got paid. I was too nice / scared. And the startup went out of business. Many of its investors were pissed. The usual.


It's probably for the best that you didn't go through with it and -- although it obviously sucks that you lost $30K in the process -- hopefully you learned a lesson from it that prevented it from happening again!


Malicious compliance by the contractor. Hilarious incompetence by Siemens.


How is this malicious compliance? He wrote code to intentionally stop working at certain times in order to defraud Siemens by getting them to pay for what is essentially the same work over and over again.


It is somewhat like DRM, but nobody goes to jail when books stop opening and old games break.


I would be curious how they came to realise what was happening.

Also, how were the contractor's changes not reviewed?

If the same engineer's work keeps throwing unknown problems down the line, the LAST thing I am doing is contacting them again.


Can you even "review" changes to spreadsheet code? I know that Office apps have some support for change management, but is it even up to this task?


There is actually a pretty nice diff/compare-two-spreadsheets feature inside of Excel.

Screenshot: https://support.content.office.net/en-us/media/9149c7e8-6f0c...


Seems to be dependent on what edition of Excel you have - I had a look for it and couldn't find it... (I'm using the up to date version from Office 365).


Open the Quick Access drop down menu and select More Commands.

In the Excel Options dialog box, select All Commands under Choose commands from.

In the list of commands, scroll down to Compare and Merge Workbooks, select it and click the Add button to move it to the right-hand section.


You need to use the Professional or ProPlus versions...or https://www.xltrail.com/


RubberduckVBA >> export all modules, classes, userforms >> send to git.

That was my release workflow. It worked for the 30k+ line application I built in VBA.


How about maintenance services that are lazy about reporting client issues to the main vendor? That's one of the undetected patterns, I believe.


>Tinley added code to the complex spreadsheets that "had no functional value, other than to randomly crash the program,"

I could say the same about some of the... less talented developers I've worked with in the past. Hanlon's razor might not apply in this case, but that's a scary thought given how the US justice system seems so inept at handling cyber crime.


Off topic, but does anyone else feel that the phrase “logic bomb” is too meaningless for the frequency with which it shows up in reporting these days?

It makes it sound more sophisticated than it is. What’s wrong with calling it malware? Or even better, simply criminal behaviour that happens to involve a computer.


Logic bomb is a term for a very specific subtype of malware, and it is quite informative and useful to use this term - it gives a proper impression about what this particular malware does and doesn't do.

It immediately suggests that it has a delayed action that creates a disruption after some time (and not right away); that it is hidden (as opposed to e.g. ransomware), that it's intentionally deployed there (as opposed to someone accidentally getting infected), that it's most likely not spreading itself automatically like a virus and that the damage isn't controlled in realtime like in a botnet, etc.

Simply saying 'malware' would not tell us this information, so it would be vague and inaccurate instead of using the appropriate terminology.


So the delay, or triggering condition, is the differentiator? Thanks, that actually makes a bit more sense. I skimmed the wikipedia page before posting and it wasn't immediately obvious.


I think the "delayed action" aspect is key to the use of the term "logic bomb" -- as in the phrase "a ticking time bomb waiting to go off" that we've probably all heard.

As PeterisP mentioned, the delayed action is integral to the concept of a logic bomb, as opposed to just invoking a piece of malware that immediately starts breaking things. Delaying the "attack" helps to hide its origin, making it harder to discover the cause of the problems.


Going even further off topic, but it's pretty neat that logic bombs exist in nature. Plasmids are circular DNA molecules found in bacteria. They can replicate independently of their host, and they can spread to other bacteria. They often provide useful functions, most famously antibiotic resistance, to their host. So there is a high degree of symbiosis.

Anyway, plasmids have these things called addiction modules to prevent the bacteria from eliminating them. They produce a long-lived poison and a short-lived anti-poison. If the plasmid is no longer around, the anti-poison degrades and the cell dies.


We really like PR in this field. We call making a copy of a file "piracy", as in piracy on the high seas. We call adding a password to an Excel spreadsheet a "bomb", as in a device designed for leveling entire cities and brutally murdering everyone nearby. We call adding restrictions to books and films "digital rights", kind of like the "bill of rights" that protects our country's core values.

The prosecutors and industries that coined these terms are very clever. For the petty crimes that they describe, they can turn the outrage up to eleven by comparing the most minor transgression to murder. In the case of DRM, the industry managed to convince people to buy new TVs, monitors, video cards, and cables... to protect their rights? Their right to be turned upside down and have the coins and bills shaken out of their pants, I guess.


Most things in computing come with a metaphoric or symbolic phrase: the desktop, windows, a mouse, bugs. It's not just limited to things you perceive as weaponized. Whatever they call piracy instead of 'piracy' will come to carry the same prosecutorial meaning when it is mentioned, because of the ad campaigns and the actual legal action taken in real life under its name.


I think the word piracy was not invented by tech people, but by rights holders who wanted to make it look as bad as real piracy. It was an amazing strategy for them.


As I remember it, the pirates called it piracy, and the rights holders preferred calling it theft (you wouldn't steal a car).


"Digital rights" feels quite appropriate and accurate though, and in fact should be considered a natural extension of the rights in the bill of rights.


I'm not sure that software which prevents you from attaching a debugger to iTunes is giving you rights.

The FSF seems to like calling it "digital restrictions management". I can get on board with that. (Every few years I edit the Wikipedia article to try and make that name stick, but it gets reverted immediately.)


To be precise, the "bomb" was what the password obfuscated, not the password per se.


I guess it depends on your definition, but something designed to go off (negatively) after a predetermined time and dork up the logic of a program seems apt.


Weapons don't have be sophisticated to be used as weapons. Even physical bombs don't have be sophisticated to be bombs.

"It wasn't an armed robbery, it was just unsophisticated criminal behavior that happened to involve a gun."


Hmm, is this a common turn of phrase in reporting these days? I can't recall that expression being used outside of some rather old science fiction.


I rarely see "logic bombs" in the news, but I remember the day the customer called angry on April 1st, complaining that their spreadsheet quit working.

It seemed unlikely to be an April Fool's Day joke, but I knew the code hadn't changed, so I couldn't understand why it would quit working. Root cause? The fiscal year in the spreadsheet started in April, which made April the "zeroth" month, and that triggered a logic failure because zero is treated as a false value.

It would have made a great hard-to-spot logic bomb... if I had planted it intentionally!
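
Boiled down, the failure mode looked something like this (the real thing was spreadsheet logic, so treat this as illustrative):

  # The "zeroth month is falsy" failure mode described above, boiled down.
  FISCAL_YEAR_START = 4  # fiscal year starts in April

  def fiscal_month_offset(calendar_month):
      return (calendar_month - FISCAL_YEAR_START) % 12  # April -> 0

  offset = fiscal_month_offset(4)
  if offset:   # bug: 0 is treated as false, so April silently falls through
      print("processing fiscal month", offset)
  else:
      print("whoops -- nothing happens in April")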


>$42k

Doesn't seem like he was a very good scam artist... That's not a lot of money to risk jail over.


He probably didn't know that he was risking jail.


Dilbert covered this back in 95

https://dilbert.com/strip/1995-11-13


> The parties in the case stipulated a total loss amount of $42,262.50

That's an oddly specific loss amount, especially the 50c


What’s odd about it being specific? Take the number of billable hours he spent “fixing” his logic bombs multiplied by the hourly rate for those visits.
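
(Purely as an illustration with made-up numbers: 563.5 billable hours at $75/hour comes to exactly $42,262.50, so an hours-times-rate total lands on the half-dollar quite naturally.)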


The loss also extends to any lost business resulting from the broken code, lost work, time on behalf of the people whose job it was to call him in, etc. These are very difficult to quantify exactly, and so the number of significant figures in the claimed amount looks odd. But the law doesn't allow you to say "oh, about 50 grand", so I imagine significant figure rounding in financial estimates isn't strongly emphasized anyway.


You have to be specific, as others have mentioned. And it's not that odd:

  * Contractor's time multiplied by their hourly rate
  * Whoever was overseeing the contractor for some percentage of their time 
    multiplied by their fully loaded hourly rate
  * Processing and payment of contractor's invoices
  * Discounts or coupons given out as a result of late orders due to this problem
  * Anything in any contract relating to late orders that was caused by this problem, even in part
You start multiplying fully loaded employee costs (including 401(k) matches, healthcare, etc) by fractions of a percentage for how often they deal with this person and it's not at all hard to end up with fractions of a penny at the end of it all.


All I can say is he must be a really good programmer if he needed to deliberately install logic bombs to make his software malfunction after a period of time. I've got my hands full just making things work properly in the first place!


Yeah, really, this is the first programmer I've ever heard of who lacked bugs to fix. There's always new stuff to work on.

Maybe he was only used for some older niche stuff that was going out of style and he was trying to cling to the past.


That or he's hired by people who don't peer review at all, as in he's the only developer.


Still, I think the point remains: having the time to even do this is astounding. I can't even keep up getting all my code to work properly in the first place as a mostly solo developer!


Maybe that's where he purposely introduces the bug: he keeps the broken version and times the working version. But yeah, I agree with you; it's interesting altogether. I didn't mean to sound dismissive or against your viewpoint, just adding possibilities.


Or just the usual periodic business logic changes.


That's because he's a boomer who probably knows asm. There were no wannabes in this field back in the day.


Why does the title not contain "to ensure he gets new work"? The original title was 2 characters too long for HN, but could have been edited to contain that information. For example: "Contractor admits planting logic bombs in software to ensure he’d get new work"


I wonder how they figured it out?



