The Wi-Fi only works when it's raining (predr.ag)
1147 points by bonyt 7 months ago | 324 comments



Let me see if I'm the first one to link to that classic story in the same vein, "I cannot send email further than 500 miles":

http://www.ibiblio.org/harris/500milemail.html

Or the Magic/More Magic switch

http://www.catb.org/jargon/html/magic-story.html

It's fun when physical reality meets the abstract models that we have built in our heads of these machines.


Oh man, this is one of my favorite lines of all time:

> "Anyway, I asked one of the geostatisticians to look into it--"

> "Geostatisticians..."

> "--yes, and she's produced a map showing the radius within which we can send email to be slightly more than 500 miles. There are a number of destinations within that radius that we can't reach, either, or reach sporadically, but we can never email farther than this radius."

I adore when experts use their expertise to analyze real-world things like this and provide ridiculously thorough explanations :-D


My favorite story of this nature, of an expert as an alien intelligence, was Feynman's calculations about the computer architecture of the Connection Machine:

https://longnow.org/essays/richard-feynman-connection-machin...

It's a few paragraphs, maybe too much to quote, but the bulk of it starts with:

> By the end of that summer of 1983, Richard had completed his analysis of the behavior of the router, and much to our surprise and amusement, he presented his answer in the form of a set of partial differential equations. To a physicist this may seem natural, but to a computer designer, treating a set of boolean circuits as a continuous, differentiable system is a bit strange. [...] Our discrete analysis said we needed seven buffers per chip; Feynman's equations suggested that we only needed five. We decided to play it safe and ignore Feynman.

Guess who was right.

The whole essay is worth reading, if you haven't yet.


That was a great read. It was also interesting to read about what happened to the company.

https://en.m.wikipedia.org/wiki/Thinking_Machines_Corporatio...

One thought I had while reading this was what areas of technology are still open to amateurs.


I hadn't seen that before, so thanks for posting it. What a great story!


It's a useful analysis. Nobody thought of router hops, but this pattern is pretty much what you'd expect, so it was a very good hint.


My recent version: I was playing a pinball game in an arcade. One particular ramp shot was registering earlier in the day and then stopped working.

Eventually I realized that the sensor is an optical beam, and the receiver happened to be in direct sunlight coming in through a window! So it was continuously receiving infrared and would never report the beam being blocked by a pinball. Sure enough, it started working again once the sun angle changed by a few more degrees.


I have an optical smoke detector that will give (very loud) false alarms if a sun beam can bounce off a windowsill onto it. It works great if the curtain is closed. Debugging that took a few early sunrises.


You missed an opportunity to cheat the machine by waving your hand between it and the sun. ;)


Heh, but not exactly. If I blocked the sun, the receiver still would have been picking up the real beam. I would have had to block the sun _and_ make the shot with a pinball at the same instant... which is just playing the game normally with extra steps.


Or the "Car allergic to vanilla ice cream" story [1].

[1] https://news.ycombinator.com/item?id=37584399


The 500 mile email story is one of my favorite reminders that, fundamentally, we're still governed by the laws of physics. It's funny, but it's also a reminder that, while networks might be very fast, the latency is still going to be governed by the speed of light.


If we are doing classic stories: Grace Hopper and her nanosecond of wire.

https://www.youtube.com/watch?v=9eyFDBPk4Yw

https://americanhistory.si.edu/collections/nmah_692464


Well, the laws of physics are what gave us radio in the first place.

Some of my favorite video documentaries are on how it was theorized and then slowly developed over years and decades until they finally got to spark-gap transmitters.

https://en.wikipedia.org/wiki/Timeline_of_radio

But just imagine listening to spark-gap Morse code radio broadcasts for years as an amateur, and then suddenly someone does a broadcast test of actual voice (and violin!). That must have been incredible to hear wirelessly.

On 24 December 1906, Reginald Fessenden made that broadcast. That was the leap that eventually gave us Wi-Fi.

https://en.wikipedia.org/wiki/Reginald_Fessenden


I actually like thinking about the exchange of physical information as a network propagation delay, and entanglement/coherence as a distributed consensus algorithm. They're kinda samey from a conceptual point of view (in my amateur opinion)


I forgot all about the 500 miles story. My favorite line:

> If the problem had had to do with the geography of the human recipient and not his mail server, I think I would have broken down in tears.


I had a customer who used a line of sight system for extending their network across part of a city.

I had a shortcut on my desktop to the weather for that town, ready for when they would inevitably call and blame our unrelated equipment for some problem.


I worked at a small, local ISP in the '90s that had a point-to-point link across the river, handling the dial-up traffic from the telecom company we partnered with.

Every few days, always at roughly the same time, all incoming dial up traffic would drop. A minute later, the customers could reconnect.

It took a while before we realized that one of the huge passenger ferries that docked a short distance upstream was the cause. When it arrived and departed, its chimneys, and possibly its bridge and highest deck, blocked the line of sight across the river.


I used to work in high-frequency trading. I had several tabs permanently open to the live weather radar feed for regions where we had microwave towers: the NE USA, the South of England, the Alps...


I used to be in an adjacent field and we used to joke about when the HFT guys were gonna get working on some neutrino detectors & sources to signal straight through the Earth. You could use them for science on the weekends!


Sounds awesome, to be able to send a signal directly through the whole Earth from point to point. It also sounds like the origin story of the X-Men, or of a new kind of cancer.


Reminds me of Hubble and the NRO satellites and how Hubble was an extra that they didn’t need.


Honestly I'm half-surprised no-one has tried this yet.


I'm curious to know where your towers were. Do you know if they still exist? Were your microwave antennae co-located on other operators' towers (e.g. those for VHF radio), or did your company have towers all to itself?


Without going into anything confidential - we had some of our own hardware, but generally rented capacity from firms like [0]. Some towers were custom built for HFT, some were shared with other types of users.

A famous blog post investigating some of the towers as an outsider, at [1], will be of interest to you.

If you want to guess where they are, get a globe, find the datacentres where electronic exchanges operate (it's not a secret: Chicago, New Jersey, London, Tokyo, Frankfurt, Zurich...) and draw the straightest possible lines between pairs of them. Microwaves don't cross the ocean.

[0] https://www.mckay-brothers.com/

[1] https://sniperinmahwah.wordpress.com/2014/09/25/hft-in-my-ba...


Is the microwave setup quicker than going through fiber nowadays? I only mean in terms of latency.


Traditional optical fiber has glass in the middle. The speed of light in glass is only about 66% of the speed in air, so microwave is always faster if you can get a reasonably straight path for both.

There now exists hollow-core fiber, where the light travels down an air gap in the middle, which is theoretically competitive with microwaves/lasers/etc. How much this is being used is a secret, but microwave transmission definitely hasn't gone away.
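
To put rough numbers on that 66% figure, here's a minimal back-of-the-envelope sketch in Python of one-way propagation delay over an assumed ~1,150 km straight-line Chicago-to-New-Jersey path. The distance and the hollow-core velocity factor are illustrative assumptions, not figures from the comment above.

    # Rough comparison of one-way latency over a straight-line path,
    # assuming a ~1,150 km Chicago-to-New-Jersey route (illustrative only).

    C = 299_792.458  # speed of light in vacuum, km/s

    def one_way_latency_ms(distance_km: float, velocity_factor: float) -> float:
        """Propagation delay only; ignores repeaters, regeneration and routing overhead."""
        return distance_km / (C * velocity_factor) * 1000

    distance_km = 1150.0
    for medium, vf in [("microwave (air, ~1.00c)", 1.00),
                       ("hollow-core fiber (~0.997c)", 0.997),
                       ("conventional fiber (~0.66c)", 0.66)]:
        print(f"{medium:32s} {one_way_latency_ms(distance_km, vf):.2f} ms")

On that assumed path, conventional fiber comes out roughly 2 ms slower each way than a straight-line microwave link, before counting the extra distance of real fiber routes and equipment delays.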


There was a site with stories like these somewhere, I sadly can't remember the URL any more.

I think the one that stuck out to me was the Soviet mainframe computer that would get weird bit flips almost every day, always at the exact same time. Somebody compared what was different about the days it didn't get bit flips on: it turned out those were the days on which a particular train didn't run, and the computer was very close to a railway station. What train was it, you ask? The one transporting the (definitely perfectly safe to eat, definitely not filled to the brim with nuclear radiation) cow meat from Chernobyl. The radiation was intense enough to cause bit flips, and I'm sure the quality of Soviet components didn't help here either.


I think this is the one you are looking for:

https://beza1e1.tuxen.de/lore/index.html

Also posted on HN a while back:

https://news.ycombinator.com/item?id=23005140

Edit: yep, here are your Crash Cows: https://beza1e1.tuxen.de/lore/crash_cows.html


Perhaps thedailywtf.com?


What was the one for Apple stories...?



As someone with very limited electrical experience, my immediate thought on the more-magic switch story was "the second terminal of the switch is probably grounded to the switch casing" when they explained it only had one connected terminal.

This is a very common thing in older automotive electronics, for example.


First thing I thought of too. Anyone know if there is a list of more articles similar to these three?


My wife has complained that OpenOffice will never print on Tuesdays!?

https://bugs.launchpad.net/ubuntu/+source/cupsys/+bug/255161...


A user was having a really bizarre problem: They could log in when they were sitting down in a seat in front of the keyboard, but when they were standing in front of the keyboard, their password didn't work! The problem happened every time, so they called for support, who finally figured it out after watching them demonstrate the problem many times:

It turned out that some joker had rearranged the numbers keys on the keyboard, so they were ordered "0123456789" instead of "1234567890". And the user's password had a digit in it. When the user was sitting down comfortably in front of the keyboard, they looked at the screen while they touch-typed their password, and were able to log in. But when they were standing in front of the computer, they looked at the keyboard and pressed the numbers they saw, which were wrong!


I have one first-hand story:

I did tech support via phone for a popular consumer computer brand. One particular call, a woman reported that her computer was restarting every time someone in the house flushed the toilet.

Long story short, her home was in the back-back woods, with the home powered by a generator. In addition to powering the computer, the generator was also the source of power for a water pump which would kick on to refill the toilet bowl whenever it emptied. And wouldn't you know that that water pump had a beefy coil in its motor and would brown out the entire house every time it started?


I have a similar one, with an automated monorail hoist. The engineer who started the job had ordered the monorail hoist with a control cabinet with Ethernet comms to tell it where to move (instead of just controlling the hoist directly from the main control cabinet). After days' worth of shenanigans trying to troubleshoot seemingly random comms drop-outs, I'd narrowed it down to only occurring when the hoist was being lowered under load, which led me to the Ethernet cable in the hoist cabinet that ran parallel to the motor cables from the hoist's 6 kW VSD. Whenever it lowered, the EMI was enough to nuke the Ethernet connection. Re-routed the Ethernet cable and after that it ran fine.


My personal example: VoIP phones stopping after the Asterisk server was up for 3 days.

Reason: the server had IPv6 turned on, and it steadily accumulated privacy IPv6 addresses. These addresses were all sent in a packet describing the supported media endpoints, using UDP.

And yep, eventually it overflowed the MTU and the phones couldn't handle the fragment reassembly.


Which distro was that? ... asking for a friend ...


Just regular Ubuntu. It was around 2009 or so.


Thanks. I've never run Asterisk on Ubuntu. FreePBX is CentOS-based, which is mostly what I've run Asterisk on. I only started to worry about Ubuntu around 2012.

That mad IPv6 address thing must have stuffed up more than just a VoIP negotiation packet. DNS switches from UDP to TCP when responses get too large.


DNS is affected, but differently. It's mostly DNSSEC signatures that cause trouble nowadays.

SIP is special because the signalling and media protocols are separate. So when a call is being established, the parties exchange their media endpoint locations. This necessarily means that the server has to list its IPs (or DNS names) so that the client can choose the best one. And as a quirk of SIP, it sends the entire set for each of its supported codecs.
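
As a toy illustration of the failure mode described above, here's some simple arithmetic showing how an ever-growing address list can cross the MTU. This is not real Asterisk or SDP output; all of the byte sizes are made-up ballpark assumptions.

    # Rough illustration (not actual Asterisk/SDP output) of why accumulating
    # IPv6 privacy addresses can push a UDP SIP packet past the MTU.
    # Assumed sizes are ballpark guesses for the purpose of the illustration.

    MTU = 1500                 # typical Ethernet MTU, bytes
    BASE_SIP_SDP = 700         # assumed fixed overhead of SIP headers + base SDP body
    BYTES_PER_ADDR_LINE = 60   # assumed cost of one connection line carrying an IPv6 address

    def estimated_packet_size(num_addresses: int, num_codecs: int) -> int:
        # per the parent comment, the full address set is repeated per codec
        return BASE_SIP_SDP + num_addresses * num_codecs * BYTES_PER_ADDR_LINE

    for day, addrs in [(1, 2), (2, 3), (3, 6)]:
        size = estimated_packet_size(addrs, num_codecs=4)
        print(f"day {day}: {addrs} addresses -> ~{size} bytes "
              f"({'fragments' if size > MTU else 'fits'} at MTU {MTU})")

The exact numbers don't matter; the point is that the payload grows with addresses times codecs, so a slow leak of privacy addresses eventually tips a single UDP datagram over the MTU and into fragmentation.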


"SIP is special".

Yes, I know, just like FTP 8) I do hope that whoever invented putting the control channel in a separate stream from the data is mildly discomforted. Mind you, all that stuff was invented a very long time ago, when trousers were a major trip hazard.

My go to fix is "symmetric RTP", which seems to have become a default over the last decade or two.


Here's one attempt I've seen in other HN comments at a shared "awesome list" of these sorts of stories:

https://github.com/danluu/debugging-stories


The "podcast that kills the car stereo" episode of Reply All is pretty funny: https://gimletmedia.com/amp/shows/reply-all/brh8jm


I remember one (might have been an HN-er's comment, dunno) about a computer restarting when the toilet was flushed. Turns out it was due to the voltage drop when a compressor turned on to refill the reservoir of the toilet.


That's why in rural locales with spotty power it pays to have a UPS on any electronics -- you might not benefit much from 15-30 minutes of extra power in a day long blackout, but it keeps everything happy when the voltage fluctuates.



https://500mile.email although I do wish it had more content!


Thanks for mentioning! I would love more submissions! I have a few stories in my backlog to read and vet, but not enough. I'll be going through this thread and adding more that haven't been added yet.


The Daily WTF is full of them.

https://thedailywtf.com


DNS responses sent over UDP are often truncated if the response is too large. This manifests itself as "machine unreachable if name > x characters" sort of errors when you have really long FQDNs.
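
As a quick way to see this in action, here's a small sketch using the third-party dnspython package (an assumption, as is the choice of resolver and name; any DNS library that exposes the TC flag would do): query over plain UDP without EDNS so the classic 512-byte limit applies, and fall back to TCP when the response comes back truncated.

    # A minimal sketch (assuming dnspython is installed) showing how to spot
    # UDP truncation: check the TC flag and retry over TCP if it is set.
    import dns.flags
    import dns.message
    import dns.query

    SERVER = "8.8.8.8"    # any recursive resolver; this choice is arbitrary
    NAME = "example.com"  # substitute a long FQDN with many records to see truncation

    query = dns.message.make_query(NAME, "TXT", use_edns=False)
    response = dns.query.udp(query, SERVER, timeout=3)

    if response.flags & dns.flags.TC:
        print("UDP response truncated, retrying over TCP...")
        response = dns.query.tcp(query, SERVER, timeout=3)

    print(f"{len(response.answer)} answer RRsets, ~{len(response.to_wire())} bytes")

Resolvers are supposed to retry over TCP exactly like this when they see the TC bit, but middleboxes that drop DNS over TCP are often where the weird "name too long" failures come from.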


There's the car/ice-cream/vapor-lock story. Oops, I ruined it.

https://www.snopes.com/fact-check/cone-of-silence/



The magic/more magic legend lives! I tell it to everyone experiencing a spooky troubleshooting session!


I love those stories plus their detailed explanations, because you can learn so much from them about technology, physics and even psychology.



I came here to see if someone posted that.


Boy did that [0] send me down a long rabbit hole

[0] magic-story


I don't know if this is fake, but it could be true. I had a similar situation working for a WISP around 2010.

Every night, for about 10 minutes, the connections from our HQ to a relay tower became flaky. At the time we were using two Mikrotik 5GHz cards and some large antennas.

You could sit in front of your computer and wait, a few minutes after sunset, for the monitoring alerts to start arriving. After a week of trying everything, including changing hardware (to the same specs), I was very disappointed with the thing and went out to smoke around sunset.

Then some huge lamps we had around the building switched on, based on a light sensor. Immediately I received the SMS alerts on my phone. I ran into the building, turned off the external lights and bingo: 0% packet loss.

It turns out that the building management had changed all of the external lamps the week before to new sodium-vapor bulbs. And for some reason, during the first 5 to 10 minutes with these lights on, they caused very high interference on the 5 GHz band.

Changed the lamps, problem solved.


> And for some reason

The ballast and the bulb itself are quite noisy in RF, and while they are heating up they are even noisier.


I am just surprised that line of sight issues weren't the first thing checked when you have a bespoke line of sight networking setup, especially when there was no local packet loss.


That whole article felt oddly empty. Like you could read the title and conclusion and be totally fine and satisfied.


Reminds me of an extremely similar case with a long-distance microwave link at a mobile telecom provider in Australia that I worked for. They relied quite heavily on microwave link chains, and this particular one was in northern Queensland, where fixed lines were hard to find and no engineers were locally present or aware of the changing environment.

Every weekday plus Saturday, from 7 to 3, the link would keep cutting out intermittently, then work fine for the rest of the day and on Sunday… A crane, building a new residential building, would operate during those hours right in the middle of the microwave path. Many weeks of theories and time wasted until someone had a chance to visit. :)


Amazing. Reminds me of the fact that militaries really don't want wind turbines in areas where good radar coverage is important (case in point: the Finnish Defense Forces anywhere near the Russian border); even though the blades aren't metal, they're still a source of noise and radar shadow.


I'm not at all knowledgeable about this, but: is it feasible (and if so, how hard?) or impossible to have some sort of live reporting from the turbines about the speed/position of their blades, feeding the radar system so it can ignore what it knows to be turbine noise/shadow, and therefore allow turbines there while still getting good radar coverage?


Well, you can't just ignore a radar shadow or noise. Just like GP's point-to-point microwave link couldn't just "ignore" the crane. You can't make a bad Wi-fi connection faster by just telling your computer to ignore the wall between you and the hot spot. A wind turbine is a solid obstacle that conceals stuff behind it, and even if a single turbine might not be a big deal, most commonly someone interested in harnessing wind power would want to build an entire farm, which would be much worse.


My (probably ignorant) thinking was that if there weren't propellers, just stalks, they could be arranged in a pattern such that radar setups at multiple locations could see through the gaps and, between them, have no dark spaces caused by the towers. That leaves the problem that the blades essentially block out an entire circle; but if the radar software knows the position of every turbine's blades (through a combination of turbines reporting the location/speed/acceleration of their blades in real time, and maybe modelling, so the radar system knows at least fairly if not very accurately where all the blades are in a field of them), then when a radar pulse bounces off one it can say "that doesn't count as a hit," leaving only non-turbine objects showing up in the UI that the radar setup outputs?

Is it that radar can't ping/receive at a high enough frequency to distinguish between "this fraction of a ms the blade was at that location, so we don't care about the radar hitting something" and "the next fraction of a ms the blade had moved and we still got a return from just behind it, so there's something there"? Or is there some other problem with the idea?


Radars work in the 3cm or 10cm ranges[0], so you'll always get a return from the towers and the blades, and that significantly reduces the energy that gets beyond the turbine to illuminate potential targets.

Reflections from targets also need to pass the turbine(s), reducing the return even more.

Beamwidths are significant, depending on the antenna configuration. You can have beamwidths of 0.5 degrees and more. That helps detection, but reduces target discrimination. It helps get the energy past the turbines, but you tend to have "wound down" the power output to allow for that. So it doesn't really help.[1]

You know where the turbines are ... knowing the exact positions of the blades won't help at the frequencies being used.

In short, it's complicated. Radar pulses are like HUGE lumps of energy flying through space, and delicate things like turbine blades chop them up and deflect them around, preventing them from getting to, and then back from, the targets.

[0] Marine radars, X-Band and S-Band. There are other ranges and bands.

[1] Yes, this is all extremely simplified and inaccurate, but I'm trying to give a sense of what's going on.


Thanks! I wouldn't have guessed the wavelengths were so large for radar, and just assumed that in the current era radar would have progressed to the point of being able to not send pings at the rotating blades, in sync with their rotations (i.e. always firing radar somewhere within the circle, but only at the spots where the blades are not), the same way machine guns were synchronized to fire past the propellers of WW2 (or WW1?) fighter planes. But of course a bullet and a radar setup aren't at all the same thing.


The details I give are for marine radar, where off-shore wind farms are a significant issue. But radar in general works on similar wavelengths, and a lot of clever processing can be done to "improve" the images you get back, especially when you have a lot of images that can be combined, rather than sweeping the radar around and only getting 20 to 120 sweeps per minute.

There are trade-offs between how often you can ping versus how far you want to see versus how much energy you can send out versus how big the "lumps" are, etc. The sums/calculations are pretty simple, but not obvious until you've seen them, and then they are.

But if you want to see something at 32NM ~ 60km, you have to send out a lot of energy, so the "lumps" need to be pretty big, especially as the returns drop off as a fourth power with distance.
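
For anyone curious where that fourth power comes from, this is a sketch of the classic monostatic radar range equation with placeholder numbers; the power, gain and cross-section values are arbitrary assumptions, not figures for any real radar.

    # The radar range equation, illustrating the fourth-power falloff the
    # parent comment mentions. Parameter values below are arbitrary placeholders.
    import math

    def received_power(p_t, gain, wavelength_m, rcs_m2, range_m):
        """Classic monostatic radar equation: same antenna transmits and receives."""
        return (p_t * gain**2 * wavelength_m**2 * rcs_m2) / ((4 * math.pi)**3 * range_m**4)

    P_T = 25_000.0      # transmit power, W (placeholder)
    GAIN = 1_000.0      # antenna gain, linear (placeholder, ~30 dBi)
    WAVELENGTH = 0.03   # 3 cm, X-band
    RCS = 10.0          # target radar cross-section, m^2 (placeholder)

    for nm in (4, 8, 16, 32):
        r = nm * 1852.0  # nautical miles to metres
        print(f"{nm:2d} NM: received power ~{received_power(P_T, GAIN, WAVELENGTH, RCS, r):.3e} W")

Doubling the range cuts the received power by a factor of 16, which is why seeing anything at 32 NM takes such big "lumps" of transmitted energy.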


Any idea to what extent the problems can or can't be solved by money vs. a limit on maximum quality of radar before needing to break the laws of physics to improve further?

E.g. are the US military saying "yep those are the issues we face too" or "haha our tech has more than enough energy to get around all these problems"?


Most of what I know in these areas is either something I can't talk about, or is surmise based on conversations between others when I was around. There's a lot to do with the laws of physics, but equally, many times people have found clever things to do with the data they have.

I wouldn't care to speculate further, but you might want to read about phased array antennas, and FMCW radar. There are some very deep rabbit holes here, and I certainly would not be your best guide.


Thanks again :)


Depends on the width of the radar beam. They are certainly not thin like a laser. It might not fit in the gap between the blades.


Even if you ignored the turbines themselves, there would still be a "shadow" behind the turbines though I think? Which means you'd have a blind spot every now and then (when the wind is blowing and the turbines are active) which could be exploited by an enemy


Even if you are looking at the turbines "edge-on," there's probably going to be a noticeable Doppler return as well.

Plastic drones with plastic propellers are still visible on radar because the tiny propellers spin super fast, so they light up like a Christmas tree on Doppler radar because the approaching vs receding velocities of the blades are so different.


Any chance you can explain to this idiot (me) the flaw in my thinking? https://news.ycombinator.com/item?id=39903084


Seems like another user with way more experience than me did an excellent job explaining the situation -- I enjoyed reading their answer!


Love that story.

For me it reminds me of most of the debugging I see at work: people coming up with theories and doing magic incantations on the interface, instead of reading the log files or the error description, which usually makes the error and the fix obvious in 10 seconds.


I've done a ton of low-budget analog hardware debugging, and the major problem with hardware debugging is each attempt to fix the problem takes a long time. If I had wanted to test every idea I had I could easily waste a week. Not to mention that I can't just run some automated test suite after the fact. For hardware, approaching debugging methodically is a necessity, not just best practice.

We don't typically have log files for hardware, but I'm always surprised when otherwise extremely intelligent people first try to debug by applying "fixes" that shouldn't have any causal effect on any weird observations we've gotten. I have no problem with people coming up with theories because each modification takes time, but each theory should ideally explain the data...


Reading log files with really obscure error messages might as well be reading a magical grimoire. Especially when the solution turned out to have nothing to do with the error message.


Hey, but that is not a discussion; that's just throwing an anecdote around to bring the other person down or make yourself feel better.


The funny thing is this comment could have just as well been made on the grandparent and would read just as well.


At that point, I'd have called a bush pilot to fly along the lines!


The title might've been a Fleetwood (the other kind of) Mac reference.

    o/~ Wi-Fi's only working when it's rainin'
        Players only stutter when they're buff'rin'
        Websites, they will page load oh so slooooww
        When the rain falls down, you can download


The unusual internet setup is pretty important information to bury a few paragraphs in. Once that was explained, it seemed like they should have started by checking that nothing was blocking the antenna, before tediously running around plugging the laptop into things, following cables, and checking power supplies on the networking equipment?

Hindsight is 20/20 but I correctly guessed the ending as soon as that information was added.


I thought the same, although I probably would not have immediately gone to the Wifi transmitter.

After a couple decades of debugging various internet issues the first thing I now do is check the 'source' works (i.e. plug a laptop into the modem directly, but with a different cable). If that works I go down 'the line' until something does not work. That usually finds the culprit quite quickly (and also stops me from messing with my router config when it's an ISP issue).

In OP's scenario, the moment you realise that the office internet is working fine and it's only the home internet that is having issues, the connection between the two would have been the obvious place to look next.

That being said, it's still a fun story, and still quite 'unexpected' that rain could be the determining factor in whether you'll have working internet or not.


I felt the same, but to be fair to the author, the information was probably buried in the back of their mind too. When you've had a setup like this for years, many of its details become invisible to you, part of the "obvious" background information that your mind doesn't bother bringing to the forefront.


Sure, but the title has you thinking "Why would this external weather event affect my in-home wifi?" when the answer is "because I have a weird thing outside of my house that makes the wifi work". I feel used.


As soon as he described the physical setup I was screaming "Tree!" internally.


Yes, it was IMHO too buried in the article.

At first (without the full context) I thought it was because rain blocked the external signal interference that was making the AP channel look busy.


[flagged]


I think it’s a fair comment. A lot of readers on HN are adept debuggers, and will start to analyze everything from the first paragraph. By burying the lede like that, it feels like wasted time, to have begun debugging before the (incredibly important) part about the unusual setup was revealed.

Seems almost implausible that the protagonist, with his technical know-how, did not think of this earlier...

Anyway, it’s a matter of storytelling, and that matters!


I may have just picked this comment to express overall frustration so for that I apologize.

But I don't know. Writing is something that comes in a flow. This wasn't some deliberate clickbaity thing by the author; they just wrote it in a way that made sense to them.

It also seems that the author themselves did not consider the setup at first, which happens, as sometimes we have tunnel vision.

You may criticize his abilities I guess, although overall it just felt like an account of things as they happened to the author, not considering how someone might be trying to guess things once they publish it.

So yeah, I don't know, I just feel like there's too much negativity sometimes. But maybe I overreacted.


FWIW I don't think you overreacted. It's not like anyone's making hbn read this story. It's like complaining about the movie Titanic that, because we know the boat sinks, it's not worth watching.

The alternate version of this post goes "I fixed my dad's Internet. The neighborhood's tree grew too tall and blocked the signal so I upgraded the 10 year old hardware. The end." How much less fun and interesting is that?


I'm not bragging, I'm just saying if you have one custom, specialized part in your setup that's particularly out of the ordinary and prone to failure, I'm surprised you wouldn't start there.

If you're e.g. running a piece of software with a crazy custom plugin that overhauls major functionality and then an update to the base software breaks everything, it shouldn't be TOO much of a mystery on where to start looking. When you add weird custom parts to a system, it tends to be a point of failure.

Perhaps the author just didn't remember that they had a custom setup like that, but it wasn't framed in the article like "suddenly I remembered..."; it was just stated as a given. And given that it was providing them particularly high-speed home internet access for the time, it'd be kind of a hard thing to forget?


(Author here) I didn't forget, it just didn't seem like the most likely problem.

Like I said in a reply to a sibling to your comment, the gear was ~10yrs old at the time and had been working fine until then. It was perched in a very inconvenient spot because it had to "look around the corner" of the building, so checking the line of sight wasn't just a case of looking out the window.

I went in order of "most likely to be the problem, weighted by how easy they were to check." This is a debugging strategy that has served me well, and I don't regret using it that time either.


But rain is more obviously related^ to a point to point link than aging hardware or some kind of bad update?

(^though of course the improvement is surprising! I assumed there were antennae involved, whether point-to-point or LTE or whatever, just from the title. The story to me was, from the outset, why it gets better rather than worse in the rain.)


Related, sure!

But it could be rain helping close a circuit on a rusty antenna connector port. Or rain improving the grounding of some neighboring circuit that otherwise drains through the metal scaffolding the antenna is attached to. Or rain attenuating a neighbor's own Wi-Fi that otherwise might have been aggressively transmitting on the same channel as our units.

The rain and the Wi-Fi devices were clearly related. How they were related, was not clear. Aging hardware rusts, breaks, gets yanked around or unseated or pulled out of the ground, or has water enter in places where it shouldn't be.

I was already running diagnostics on everything to figure out which devices might be faulty (local AP, local bridge unit, local antenna, remote antenna, remote bridge unit, remote switch, remote modem/router, upstream connection to ISP) so checking for "update gone wrong" was a 3 second job: I was already in the admin UI, so check logs, nope, no recent updates, done. I'd rather spend 3 seconds checking something that probably isn't the problem but I can know for sure in 3 seconds than risk climbing precariously up a scaffold 30ft in the air only to realize it was just something I could have solved at a keyboard instead.

Risk/reward. Low risk, low reward is okay too if it's super fast and already on the way.


I don't object to your order of debugging, but it was confusing to get that far in before realizing what "wifi" really meant in this situation.


Yes, it’s not a competition, but if you have a line of sight network connection and the network only works when it’s raining, the obvious thing to check is that line of sight.


(Author here)

I didn't think to check the line of sight because I was primed by the fact the bridge had been running fine for 10 years. With networking gear that old, it seemed more likely that a device/cable/power brick had just gone bad with age.

Also, the antenna is on some metal scaffolding propped out 6ft past the edge of our balcony, because it needs to "look around the corner" of the building. It's 30ft in the air, and checking the line of sight involved climbing up there. It certainly wasn't the easiest nor the likeliest thing to check, so I didn't check it first.

Multiple people in the comments just here on HN have mentioned having weird situations caused by routers that had gone bad. I imagine most of their routers weren't 10 years old when they started acting up. How old is your router?


Fair enough. My current router is less than a year old (my ex-wife got the house and the networking equipment in it).


There’s troubleshooting and then there’s the troubleshooting of the troubleshooting post-mortem. GP is just doing the latter.


Once upon a time, I owned a 1998 Volkswagen Wolfsburg Edition, a sleek and vibrant red car that turned heads wherever I went. As a city worker, I found it convenient to park my car at the train station and commute to work.

One particularly exhausting day, I trudged back from the train to the parking lot, eager to get home and unwind. As I approached my car, I noticed something peculiar—all the windows were missing. Panic gripped me, and I initially thought someone had vandalized my beloved vehicle. However, as I walked around the car, I couldn't find a single shard of glass on the ground. Upon closer inspection, I realized that the windows had simply been rolled down. Relief washed over me as I rolled them back up and drove home, putting the strange experience out of my mind.

Weeks passed, and the incident faded from my memory. Then, on a lazy Saturday morning, I sat on my back porch, sipping a hot cup of coffee and enjoying the tranquility of the day. Suddenly, the sky darkened, and a light rain began to fall. As the raindrops pattered against the roof, I heard an unexpected sound—the distinct whirring of car windows rolling down.

Perplexed, I set my coffee aside and hurried to the front of the house. To my astonishment, I found my Volkswagen's windows had mysteriously lowered themselves, allowing the rain to pour into the car's interior. It dawned on me that the windows' odd behavior must have been caused by a short circuit in the electrical system.

From that day on, I knew my 1998 Volkswagen Wolfsburg Edition was more than just a cool, bright red car—it had a quirky personality of its own, keeping me on my toes with its unexpected window antics.


Haha, I have a 2015 Opel Astra (I think they are sold as Vauxhall as well in some countries) and I noticed a similar thing one day: it suddenly had all windows lowered by itself, without me doing anything to cause it.

The first time it happened was at a music festival, after hauling a lot of camping gear from the car. I locked it using the remote key fob, put the keys into my pocket and hauled the last bunch of stuff to our camp. An hour or so later someone told me that my car had completely open windows, asking whether that was intentional. Of course it wasn't.

The next time it happened was at home, after shopping for groceries. I locked the doors, carried the box with the groceries into my flat, and when I finished unpacking them I looked out of the window and saw my car in the parking lot - with fully lowered windows. I thought it was a glitch in the firmware or whatever.

A few days later the same thing happened again: I shopped for groceries, carried them in, looked out of the window, and the car had lowered the windows entirely by itself. The same glitch twice within a few days? In almost the same situation? What are the chances of that?

Then it suddenly dawned on me.

I have quite a lot of stuff in my pockets, including the car key with the remote buttons. Whenever I'm carrying heavy stuff, boxes and such, there is quite a good chance of me accidentally pressing the "unlock" button not briefly, but for a few seconds. So I took my key fob, stood in front of the car, held down the button... and after five seconds of waiting, all of the windows lowered just a little bit, and after a few additional seconds they lowered completely.

Since then I know about an interesting feature of my car: it can remotely lower the windows for ventilation in summer.


Another classic:

Can log in while sitting down, can't log in when standing up.

I need to find the reference ...

Edit: OK, here's one version:

https://www.reddit.com/r/talesfromtechsupport/comments/3v52p...


I recall reading a variant of that where a terminal had a "print screen" button, and the claim was it would work when standing but not when sitting down (or was it vice versa).

In the end there were two print screen buttons, only one of them functional, and one of them more obvious when standing (or sitting).

These kind of stories are classic debugging parables that teach you to step back and consider what you may be assuming incorrectly when something absolutely doesn't make sense or seems impossible.


Based on the title, I expected this to somehow be related to "Office chair turns off monitors" (it wasn't, but that is also a good one).

https://news.ycombinator.com/item?id=21978004

[Edit: I see this one has also been mentioned a few times already in the thread]


I had experienced this one and felt like I was going mad, and then I guess the humidity changed and I gave up/stopped thinking about it. It wasn't until a couple years later I saw a post here about the gas cylinders causing monitors to blank.


There's a Mr. Bean episode in which his TV will only work when he's sitting next to it (where he obviously cannot see it). I think he manages to watch TV by creating a copy of himself out of his clothes next to the TV, while he's sitting naked in front of it.


Some years ago I put in a point-to-point wifi link for a family member, from house to garage "block". I specified a pair of Ubiquiti NanoStations, which are tiny, PoE powered and have a decent range.

The house end is inside a UK standard tiled roof: dense 3/4" tile, plus allow for the slats, so 1"+ of thick, dense material.

The other end is 20m away (LoS) and external mounting was forbidden. The garage block has foil lined Kingspan style insulation. I managed to mount that end near enough to a skylight window to work OK. I then daisy-chained an access point off it.

All was fine until the skylight was replaced with a metallised one. The signal just about worked until it rained, which was enough to nobble it.

When it got annoying enough, me and said family member plotted and I rocked up when someone was absent for the weekend. I moved the garage station to the outside. It now looks like a bird box. I put up a real bird box at the other end too. The fake box would get baked in the sun but the real one is always shaded.


Why was subterfuge required? Something is missing from this story.


If you are renting then you might not be allowed to install an aerial on the building. There are situations where you need council permission to install an aerial or it may be forbidden e.g. listed historic buildings.

Also see: https://www.townplanning.info/permitted-development/househol...

Just what I found doing a quick search.


My gorgeous sister in law laid down some initial constraints, back in the day. My initial solution worked (just) and was good enough.

It failed later and me and my brother may have conspired a little bit ...

Dealing with your nearest and dearest can be quite tricky. I think we got this right ... this time!


Look up WAF!

...

(Wife Acceptance Factor - far more important than Web App Firewall)


One of my most recent 'weird internet issues' was when I upgraded our 50mbit internet connection to 100Mbit and my laptop never really reached 100Mbit whereas my homelab easily got 100Mbit on speed tests.

It took me a while to realise the difference was that the homelab was physically connected and the laptop was using Wifi.

The laptop wifi was connected to the AP at ~1.2 Gbit and a different machine had the same issue. I decided to see what the internal network speed was and found that sending/receiving files to the homelab from a wifi device was also maxing out at ~90mbit.

This then steered me towards looking at the connection between the AP and the router, and I realised that the Wi-Fi AP was connected at 100 Mbit to the router instead of 1 Gbit. Turned out the cheap Cat 7 cable that I had randomly used to connect the Wi-Fi AP to the router, because it looked nicer than the existing cable, was not actually a real Cat 7 cable and only supported 100 Mbit. Changing the cable fixed the issue! Out of paranoia I decided to replace all of my ethernet cables with decent quality ones.

I don't even remember where that 'fake' cable came from; probably from some random Aliexpress appliance that I bought at some point. I've had similar issues with USB cables that I've amassed, where I forget where they came from and only realise later that they barely fulfil their purpose.
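
For what it's worth, this kind of silent 100 Mbit negotiation is easy to spot on Linux without extra tooling. A rough sketch, assuming the standard /sys/class/net sysfs entries (so Linux-only, and the "suspicious" threshold is just a heuristic):

    # Print the negotiated link speed of each interface so a 100 Mbit
    # negotiation on a supposedly gigabit cable stands out immediately.
    from pathlib import Path

    for iface in sorted(Path("/sys/class/net").iterdir()):
        try:
            speed = int((iface / "speed").read_text().strip())
        except (OSError, ValueError):
            continue  # wireless/virtual interfaces often have no meaningful speed entry
        flag = "  <-- suspicious?" if 0 < speed < 1000 else ""
        print(f"{iface.name}: {speed} Mb/s{flag}")

A wired gigabit interface showing 100 here is a strong hint that a cable or port, not the ISP, is the bottleneck.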


Ditto for video cables: once you get into 1440p/4K and high refresh rates or VRR, those HDMI 2.1 or DP 1.4 certifications start to actually matter.

What is worse is that the hardware tries hard to make it work even with a noncompliant cable, resulting in things like random flashes of black, or multiple random reconnections after plugging in the laptop.


Great, now you've got me checking our HDMI cables :) Is there an easy way to check what an HDMI cable supports?

I guess the same question stands for USB cables. I remember looking into that a while ago but did not find anything conclusive.


I'm getting confused reading your first sentence based on the rest of your comment; should homelab and laptop be swapped?


I don't think so (but I am a bit tired today, so maybe something I wrote makes no sense :)).

Homelab is physically connected to the router whereas the laptop is connected via Wifi. It was the cable from the Wifi AP to the router that was dodgy which caused the issue.

I noticed because the laptop would never reach 'true' 100Mbit download speeds whereas the homelab did.


I got this far:

"The office and our apartment were a few blocks away from each other..."

and figured it had to be a line of sight transmission.

I encountered this in summer of 1993 when the company I worked at installed infrared (I think) transmission across our two offices, separated by 250m. When the summer sun swept behind the transmitter in the northwest-ish, the wifi went out for about an hour each evening.


TIL about Fresnel Zones!

> Interestingly, objects outside the straight line between antennas can still cause interference! For best signal quality, the Fresnel zone between the antennas should be clear of obstructions. But perfection isn't achievable in practice, so RF equipment like Wi-Fi uses techniques like error-correcting codes so that it can still work without a perfectly clear Fresnel zone.

I wonder if other waves like pressure/audio waves also have a similar effect.

[1] - https://en.wikipedia.org/wiki/Fresnel_zone

(Side note, is this story old? 802.11n isn't particularly new enough to upgrade to.)
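
To make the quoted Fresnel zone point concrete, here's a small sketch of the first Fresnel zone radius formula from the linked Wikipedia page; the 500 m link length is an assumed example distance, not taken from the article.

    # First Fresnel zone radius, r1 = sqrt(lambda * d1 * d2 / (d1 + d2)),
    # evaluated at the midpoint of an assumed 500 m point-to-point link.
    import math

    C = 299_792_458.0  # m/s

    def fresnel_radius_m(freq_hz: float, d1_m: float, d2_m: float, n: int = 1) -> float:
        wavelength = C / freq_hz
        return math.sqrt(n * wavelength * d1_m * d2_m / (d1_m + d2_m))

    link_m = 500.0
    for ghz in (2.4, 5.0):
        r = fresnel_radius_m(ghz * 1e9, link_m / 2, link_m / 2)
        print(f"{ghz} GHz over {link_m:.0f} m: first Fresnel zone radius at midpoint ~{r:.1f} m")

So on a link a few blocks long, even a tree a few metres off the direct line between the antennas can sit well inside the first Fresnel zone.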


My mind was blown when I saw the 4F experiment, where a lens transforms an image into the Fourier domain. I'm not sure if it's related to the Fresnel zone (I think it is not or only very vaguely), but it's pretty amazing:

A widened beam of collimated light (i.e. parallel beams) is sent through e.g. a slide with some image printed on it. Using a lens placed one focal length away, it is focused down to a point (one focal length from the lens). One more focal length from that point, the beam will have reached its original width again, and another lens makes it parallel again, projecting it onto a screen placed one focal length from the second lens:

     |     ()    .    ()    |
    image lens point lens  screen
This will behave exactly as expected at first glance. The image will be visible on the screen (upside down IIRC) and if you hold a piece of paper into the point, you'll just see a single bright dot. However, what's actually present (due to diffraction) is the Fourier transform of the image! If you put an iris around the point, the image on the screen becomes blurry because you just filtered the high frequencies! And what's even more impressive, if you remove the center of the point (e.g. by inserting a glass slide with a small black circle in the middle), you'll get only the high frequencies, and the image on the screen will be the edges of the original image.
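
You can play with a numerical analogue of this at home. The sketch below uses numpy's FFT (a simulation, not the optics): masking the centre of the shifted spectrum plays the role of the iris around, or the blocking dot on, the bright "point" between the lenses.

    # Numerical analogue of 4F spatial filtering using numpy's FFT.
    import numpy as np

    # Synthetic "slide": a bright square on a dark background.
    img = np.zeros((128, 128))
    img[40:88, 40:88] = 1.0

    spectrum = np.fft.fftshift(np.fft.fft2(img))  # centre of array = zero frequency

    yy, xx = np.mgrid[-64:64, -64:64]
    radius = np.hypot(xx, yy)

    low_pass = np.fft.ifft2(np.fft.ifftshift(spectrum * (radius < 8))).real    # iris around the point
    high_pass = np.fft.ifft2(np.fft.ifftshift(spectrum * (radius >= 8))).real  # block the centre dot

    # The low-pass image is a blurred square; the high-pass image is mostly
    # zero except near the square's edges, just like the optical demo.
    edge_band = np.abs(high_pass[39:42, 40:88]).mean()   # around the top edge
    interior = np.abs(high_pass[60:68, 60:68]).mean()    # middle of the square
    print(f"high-pass mean magnitude near edge: {edge_band:.3f}, in interior: {interior:.3f}")

The printed numbers should show much more high-pass energy along the square's edge than in its interior, mirroring the edges-only image in the optical version.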


Wow! Are you maybe able to find a demo video of this? I'd love to see it!

(I'm the author of the post btw)


I think this one showed some of it (I saw it elsewhere, this is just what I could dig up) https://www.youtube.com/watch?v=wcRB3TWIAXE


Yes they do. That’s what an echo is, sound waves bouncing off an obstacle between two points. That obstacle doesn’t need to be within the direct line of sight, just within the dispersion area of the outgoing sound wave.

At the far end you’ll hear (although in reality, your brain will almost certainly cover this up for you) distortion caused by the sound wave diffracting off the obstacle and interfering with the primary wavefront. Hence the reason why people put so much effort into designing concert halls, and into adding sound-dampening treatments to recording studios. Obstacles will distort sound, but energy-absorbing obstacles will distort less.


> At the far end you’ll hear (although in reality, your brain will almost certainly cover this up for you) distortion caused by the sound wave diffracting off the obstacle and interfering with the primary wavefront.

See: https://en.wikipedia.org/wiki/Human_echolocation


> Side note, is this story old?

FTA:

> At the time, I was still a college student — this was over 10 years ago.


I wonder how much polarization affects things; I was once told that terrestrial FM Radio is transmitted with vertical polarization to reduce interference from tall objects between you and the transmitter.

Terrestrial TV (some of which used bands that overlap FM radio) uses horizontal polarization.


I actually experience the same thing, but for different reasons.

I've lived in the same place for 25 years, so I've seen the invention of wifi, and checking every few months with a wifi analyzer for other users, I've watched it grow and grow.

Well, in those 25 years they've built so many surrounding apartment complexes that the 2.4 GHz saturation is absolutely insane. I cannot believe how many networks show up on the analyzer in 2024; it has to be well over 100.

But when it rains, it cuts off dozens of those other apartments, and I get better signal inside my own apartment.


I remember hearing about a few common failure modes for early internetworking of adjacent buildings. The first was running a bare twisted-pair cable between buildings. It worked fine until the next lightning storm, when a nearby strike fried the equipment on both ends. You have to use grounded conduit to run strands between buildings, my dudes.

But the other one was setting up WiFi between buildings, and it tended to be more of a problem in academia because the yearly cycles make it a bit more likely. If you set it up in the fall, everything works all winter until spring comes, when the water in the deciduous tree leaves attenuates the signal. The nasty part of this one is not the failure mode but the timing. Everyone has been happily using and depending on their sweet sweet bandwidth for six months and poof, it’s just gone one fine April morning.


This is a bit ridiculous. How can you know that you have a line-of-sight element in your network and not check that as the very first thing when you hear about rain affecting the wifi?


I had that feeling, too, especially when the author talks about climbing up to the endpoint to check the equipment. He knew it was a line of sight microwave link, so the first thing to check is obstruction or reflection by things that can move a little. I was thinking that it might only work when someone had a big flat-sided truck or RV in the right place to reflect the signal around something.

I've had trouble with tree growth in other contexts. A tree once slowly grew tall enough to break the neutral wire on the drop from the power pole to the house. This put overvoltages on some 110V circuits. Computers were fine. Washing machine emitted a burning smell. More recently, tree growth broke a fiber line coming into my house. AT&T lineman came out and restrung fiber for three poles (I'm a ways back from the main road). He saw me running a desktop computer, slowly, tethered to a phone, and once fiber was reconnected, said "Now you're back in 2023".

(Now to get rid of the dead cable. I have dead DirectTV coax, dead cable TV coax from whoever was before Comcast, and dead Pacific Bell copper, all abandoned in place and some of it sagging.)


Have you never spent hours (or even days!) debugging an issue that seemed obvious in retrospect, once all incorrect assumptions have been eliminated?


The retelling lets us know very early that there is a line of sight component, but when you are sitting there in the middle of a world of no internet, you might just think of it as a "link" to the office.


It sounds fake, especially as the solution would be to get the original equipment higher, instead of buying new equipment?

Also, why not lay a cable?


> It sounds fake

I don’t think so. I definitely know tech people who get a particular idea in their head and will debug it to hell and back before taking a step back and realizing the obvious thing they missed. I’ve definitely done it before myself.

> Also why not lay a cable.

It sounds like they were trying to run a network between two properties that weren’t adjacent. They may not have had permission from the neighbor in the middle to lay cable on their property, or it might’ve required laying a cable across a street.


(Author here) Across several city blocks, in fact, and longer than the max range of Ethernet on normal (Cat 5/5e/6) cables.

Past ~300ft/100m, you need a repeater even for Ethernet. We would have needed at least one repeater somewhere along the line, which adds even more cost and complexity on top of needing to get permits from the city and approvals from all the neighbors in between. Anyone that says "just go get a permit from the city" has never tried doing it.


As for cable, you'd use fiber optic. There is really no need to go with copper in such a case.

Other than that, I'd agree about your solution being optimal back then, and now. Btw, how did you check the power brick, peak-to-peak voltage measurements? Bad capacitors are likely the single most common failure.


What happens when the tree grows taller? Would the new wifi still go through leaves and branches?

Edit: it probably did as the story is 10 years old.


You omitted to answer the question of why the equipment wasn't simply put higher.


You can't just lay a cable across a public street

Also, if it was roughly 10 years ago then upgrading to N wireless was a good solution anyway. Not only did it solve the problem but it would've given them quicker speeds.


Raising the original equipment might not have been possible, and likely would've only been a temporary solution as the tree could keep growing taller.


> Also why not lay a cable.

According to the story, the two bridge endpoints were in different buildings a few blocks apart. You can't just lay a cable in the middle of a public street.


It's just from some guy's office to his house, they aren't going to lay a cable across a few city blocks.


(Author here) Worse: to the balcony of our apartment building. Imagine asking your HOA how they'd feel about you mounting your antenna "higher" AKA on someone else's balcony or on the roof above someone else's apartment.

"Just" move it higher vs replace ~10yr old (at the time) equipment with newer, faster equipment that doesn't have the problem? Easy answer if you ask me, and I'd make the same choice again with ~10yrs of retrospect -- the same 802.11n antennas are still there today!


Most likely a fake story. The internet is littered with blogs that make up stories like this to get engagement. After all, this one made it to the top of Hacker News.

Chances of getting proof that this happened are zero


Never let reality ruin a nice story


Similar but opposite story:

20-ish years ago I hung out in an IRC channel in which, during the autumn/winter months, one person would frequently get disconnected, and when he came back he complained about foggy weather.

He had a laser line-of-sight connection. It could handle rain (with some degradation), but thick fog killed it.


Funny. A few years ago I was on #chicken, and there was a person frequently connecting and disconnecting. Turns out they were on a boat and the motion of the boat would be enough to disrupt their wifi directional antenna.

They were rigging a servomechanism to automatically aim the antenna and wanted to write the control software in Chicken Scheme (for whatever reason, never questioned because Chicken is fun).


Yeah. If you're outside during a calm day with snow falling, it's unusually quiet because large fluffy snowflakes absorb sound. Fog does something very similar to optical or radio systems. Rain has much bigger droplets and far fewer of them. :)


> The fix was easy: upgrade our hardware.

This made me smile. My brain autocompleted the fix to something like "help the neighbors trim their tree", but of course the fix is new hardware.


Well, that would have meant interacting with other people...


Can't really be that entitled and ask them to cut their tree (or even do it yourself).


(I'm the author.) I really like trees! So I wouldn't have wanted to cut it down or even prune it.

Also, I hate operational (ongoing) solutions to problems. Pruning it would have been exactly that kind of solution -- we'd have had to prune it regularly or else it would have kept being a problem every so often.

The hardware fix was easier: our equipment was already a bit old and slow, the upgrade fixed the rain problem while also making it faster, and it's not something we've had to tweak since. I've long since moved out, but my parents still use that same 802.11n bridge today!


When I started reading this it seemed like “internet” and WiFi were being conflated, for example on our neighbourhood WhatsApp group there are often people asking “Is anyone else’s WiFi down?”, when what they should ideally be asking is if anyone else’s (fibre) internet is down. In such cases I internally frown a little, but leave it there.

Anyway, for the situation in this link, they actually have a WiFi bridge from their house to their office which has the connection to an ISP, so it is absolutely accurate to say the WiFi was down in this case.


(Author here) Yeah, my dad's company does this stuff for a living, so I learned to distinguish all the terminology very early on :)


I've had a similar weather experience where my internet connection dropped when it was cold. Turns out some water had seeped into the optical fiber connector, when it froze it broke the connection, and it would recover when it thawed. This one was a nightmare to troubleshoot.


While we're on the subject: I still can't solve this and thought you'd either laugh or you are the only people who know what I'm going through :)

I have some fancy Asus mesh wifi routers at home. I sit next to the cable modem and one mesh endpoint. My wife sits upstairs. There's an upstairs mesh endpoint, but I think neither of us is usually connected to it (it mostly serves to extend our connection out to the yard). But when my wife gets up from her desk and walks through our hallway (closer to the not-often-used mesh endpoint), our internet drops for a bit. My only guess is that the endpoints get mad at meat being in between their backhaul? Anyone deal with this and figure out the solution?


Actually, if you take a peek into the wifi logs on the Asus mesh node, you might see that it freaks out and restarts the wifi service. There's a tail mode that is pretty nice.

Restore to the default settings, make sure you have updated the firmware, and cross your fingers.


Strange as it may seem, try turning the power down on each endpoint. You may be getting signal from too many APs in the same place, making the mesh selection logic freak out.


EMI maybe? Certain chairs cause monitors to go blank for a few seconds.

https://mastodon.social/@haeckerfelix/110272427676278609


I think my chair does this, but only when I'm not sitting in it. Maybe my body absorbs the ESD? If I'm doing anything nearby and bump the chair there's a good chance my monitor will lose signal for a second. It happens with both HDMI and Displayport with a number of different GPUs and different computers. The USB-C connection has never had a problem.

I'm in an older home with questionable wiring which I'm sure is also a factor.

I'd replace the chair but it's so dang comfy.


Definitely people can absorb enough RF to block WiFi.

I hit this in a hotel, back when I was doing steampunk conventions. Antique Teletype machines put into brass and glass cases, getting text messages over the Internet. (Early versions of this used Google Voice to read SMS; later versions used Twilio.) The hotel lobby had WiFi, but the function room we were in did not. I'd tested in advance, and was able to get a good WiFi connection with the room empty. But once it filled up with people, we couldn't get through. Had to run out to Fry's and buy a WiFi booster.


Just to dispel my paranoia: are you sure there aren't any cables under the floorboards there?


This reminds me of a taxi driver in Dubrovnik, Croatia who told me that his cell service would not work when it rained, because the rain changed transmission conditions where he lived in a way that made his phone connect to a cell tower one valley over, in Bosnia, where he didn't have a data plan.


Not for long! Once the republic of Bosnia and Herzegovina becomes a member state of the European Union (a process which is seemingly progressing smoothly), it will be part of the European mobile phone 'roaming area' that is regulated as per PE/51/2018/REV/1.

End result: your taxi driver's data plan will work whatever the weather :)


Will the EU ever allow them in? With the problem of Hungary vetoing a lot (to help Russia), letting B&H in does not look smart.

B&H comes with its own bag of problems too.


Of the European countries currently looking to join, Bosnia & Herzegovina will probably be the last one to actually make it. Which is pretty surprising considering that Kosovo has an is-this-actually-a-country dispute within the EU (which also blocks Serbia's accession), and Moldova, Georgia, and Ukraine have not-so-minor problems of not being able to control portions of their territory thanks to Russian occupation.

The Russo-Ukrainian war reignited a desire in the EU to actually enlarge towards the Balkans and Eastern Europe. But the countries that haven't joined aren't in great shape (the healthier bits of Yugoslavia joined earlier), and it's not clear to me that any of the current batch countries (especially Bosnia) will be able to join before such desires cool again.


Bosnia and Herzegovina is so tiny and has all the trouble that comes with Republika Srpska and all the historical baggage of the region. From a purely practical POV it honestly doesn't seem worth the trouble at all, especially given how EU decision-making works.


I had a similar experience where a fully loaded shipping pallet fixed server connectivity issues almost two decades ago: https://chir.ag/tech/?49


That must have driven you crazy. The loop bit makes it so counterintuitive since interference from the pallet was actually helpful.


Exactly. It was also my first real job as an IT manager, and I had just set up my first business wifi network for use with a shiny new Windows Mobile scanner. So it definitely made me question if I was cut out for it.

I left the company last year, by which time it had grown to 700+ employees in pharmaceutical manufacturing, a far cry from my one-man IT department for 20 employees making shampoo. And while there were many, many weird issues over the years, none was ever so satisfying to resolve.


Oh wow, what a read! Great story, thanks for sharing :)


I had one recently - old Nintendo switch; worked fine when docked, couldn't get an internet connection on wifi.

Turns out it had been so long that the wifi MAC was picking up a DHCP address that was blocked at the firewall; the dock had its own MAC so it got a good address.


Were you using an Ethernet adapter? The original switch dock has a DisplayPort to HDMI adapter and USB hub. It doesn’t do anything with networking. The OLED dock adds ethernet (and therefore a second MAC).


It was the old Switch in the new dock, which was why I was going insane. Both worked in the dock, but only the OLED worked outside the dock.

Had to sit down and think about it for awhile before I realized it had to be the firewall blocking access somehow.


Haha, your OP said “old”, probably autocorrected. I thought, “OLED is new, not old!”


I had an experience like this once! My laptop would inexplicably and intermittently stop connecting to the internet.

It turned out my bluetooth headset was using the same band as the wifi but I only figured this out after a few months and a replaced wifi card. I wouldn't wish that experience on my worst enemy.


Turns out most consumer electronics operate in the same unlicensed consumer bands, so your bluetooth mouse, headset, wifi, and microwave all tussle for the same stuff.

I had a fun one where every time I got out of my chair my monitors would turn off; turns out the EM fields from the chair compressing/decompressing can actually be enormous in some cases.


My Mac Pro desktop used to wake up whenever I used a MacBook Pro in the same room. Obvious thought was, maybe the laptop was sending wake-on-lan packets for some reason. Turns out, the carpeting in that room tends to create static buildup, and my MBP's charger was not grounded. Touching the laptop would send a mild discharge into the wall line, tripping something in the desktop's PSU to wake it up.


I've experienced something similar, but the chair's discharge was interfering with a PCI riser, tripping just over some threshold that would cause the OS kernel to panic and shutdown. It felt so incredibly unbelievable when we first noticed the correlation that we called tons of people over to watch us demonstrate it just to see if there was something else we were missing.


> I had a fun one where every time I would get out of my chair my monitors would turn off

Wait, can you elaborate? I have the same and I thought I was hallucinating or tripping a cable somewhere.




Hah yep, I figured it out after reading the superuser post which led me to some ancient electrical engineer stuff.

Update: reading the Reg one, it also had no cushions; it was a standard Herman Miller, so it was a mesh bottom and back.


Same, my MacBook had unusable wifi when playing music via Bluetooth headphones. Switched to playing from my phone, and somehow that worked - probably a problem with the BT radio in the laptop, since I didn't change the wifi channel.


Aren't bluetooth and wifi typically on the same module these days?

The worst interference problem I've heard of is how USB 3.0 signaling radiates noise into the 2.4GHz range, and can therefore cause problems with wireless devices connected right next to it.


It causes a big smear of interference but one of the higher regions is inside the 2.4GHz band.


I have a fancy microwave that degrades my fancy bluetooth headset but not others. Did replacing the wifi card work? I'm wondering if I need to switch up my expensive microwave, or expensive headphone, because replacing bluetooth dongle (with another generic one with same chipset) hasn't resolved issue.


Microwave ovens use the 2.4GHz spectrum, but typically not with any real precision, which means that while in use they just tank the whole 2.4GHz band.

As an aside, one of my favorite things I get to do at work when onboarding new Jr. Net Engineers is getting them to take our spectrum analyzer into our office kitchen and watch the spectrum turn bright red while I make a bag of popcorn.

Anyhow, to get to your question, the best answer would be to get some distance between your microwave and the setup you're using with the headset. If that isn't possible, then you'll want some headphones that don't use 2.4GHz. Replacing the microwave will likely not fix the problem, since they all use the 2.4GHz band for cooking, and I've never seen one shielded well enough that it didn't impact other devices while in use.



I couldn't use my apartment complex's laundry machine if I was still connected to my own Wi-Fi (and using it).

It would interfere with the Bluetooth signal.


These are great stories but awful experiences to live through. I am currently going through one right now. My wireless CarPlay connection shuts off whenever I drive past a particular highway section. It never happens anywhere else but this one area. There is nothing of note happening there (it's on a bridge over a lake) but just like clockwork, my entertainment system shuts down and refuses to connect. I have tried everything (reboots, firmware updates, wired connections, etc) to no avail.


I wonder if some kind of jammer is running in that area. Reminds me of a similar story: https://www.theverge.com/2014/5/1/5672762/man-faces-48000-fi...

Weird that even a wired connection does not work. Stay curious!


My first thought as well was there could be something emitting an intense signal in the vicinity that is interfering with the connection. However, there does not seem to be anything nearby (like I said, it is on a bridge going over a lake).


This isn’t your problem, but I had a 1994 Pontiac Bonneville whose trunk release signal was triggered by some kind of radio signal that was part of a reasonably common system (security? Anyway, this was late 90s). If I parked in certain locations (certain stores) my trunk was guaranteed to open. It was sometimes so bad that I would have to start the car, set the parking brake, shift to neutral (any gear other than park blocked the remote trunk release), close the trunk, and only then get back in, release the parking brake, and drive off. Otherwise I didn’t have enough time to start the car and get in gear before the trunk opened again.

As a practical solution, I used a short bungee cord to keep it from opening far enough to be obvious, and never kept anything of value in the trunk if the car would be unattended.


Here's my own 500 mile email story. This happened to me about a year ago.

Just a normal day at the office when suddenly the internet drops out, except for my machine. Everyone else has a network connection, but no internet. Except for me, I can't reach devices on the local network, but I can reach anything outside.

Now, our network is not large or complicated. We have a consumer grade ONT and WiFi router provided by the ISP, and a big unmanaged ethernet switch. There's really nothing to go wrong here.

After some debugging, I notice that I have been assigned an IP address in my ISP's public block. Tracert seemed to show no local network between me and the WAN. It was as if the router had somehow connected my WiFi client directly to the ONT, bypassing the local network. That only barely makes sense, but it was my best guess so I condemned the router.

Next day, new router, same problem. I couldn't explain it. This time though, I didn't have an internet connection, but the local network was reachable. Some sanity restored, at least.

Turns out that our fiber line had been accidentally cut during construction work. Once the ISP fixed that, all was normal.

The question remains, how did I have internet connection through a severed fiber line? It's not likely that the router had a bizarre failure right before the line was cut. I suppose it's possible that Windows had sneakily connected me to some other WiFi network, but then why did I have a weird IP address?

I have no explanations


Does your machine have a cellular modem that gets prioritized only when there's no route to some well-known service via the normal network adapter? And you disabled it (but forgot to mention doing so in this story) around the same time as swapping routers?


Nope! Only WiFi and Ethernet. I had been using my phone hotspot, but that gives me a sane IP in the local reserved block, not one from the public block.

I also was using our neighboring business's guest WiFi, but again that should have given me a sane local IP.

Every diagnostic I could think of told me I was directly connected to the WAN with no intervening networks. Then again I'm not the best at debugging networks so I could be mistaken on this point. I am 100% certain that the IP address my computer was given was not a legal local network address.


Very interesting! Some kind of buggy DMZ type of thing, perhaps, where even the DHCP traffic flowed right on through... who knows.


That's about the only thing that makes any sense. Maybe it somehow faulted to route all traffic straight to the ONT, which could only give one address which I happened to get. And then the fiber got cut the very next day around the same time as we got the new router.

A lot of coincidences and extraordinary edge cases, but it's plausible I guess?


Here is mine.

The admins could connect to their machines, but not to any user machine.

It was winter and we had some heating issues, so I made a script "warmup_the_office.sh" that was meant to launch a "while(true){}" on each core of each PC in the office, but instead launched itself indefinitely on every reachable machine, exhausting all PIDs and preventing remote logins. We had to reboot everything by hand, after some nice warmup.


What a gloriously dumb idea. If I had that sort of access, I'd probably try something similar, and probably get the same results.

Did you admit what happened, or was there a "mysterious widespread network failure"?


The mystery didn't last long, and no problem to admit it, I maybe even contacted them. I guess they put safeguards after that, but I didn't try again to check ;)


We had an office in a very old building downtown with no access to fibre. The best internet we could get at the time was a 3G router in the window. Every afternoon at the same time our connection would drop down below 1 Meg and become unusable. Eventually we realised that down the street there was a large school, and every afternoon when classes were over, hundreds of young people would turn on their phones and saturate our cell.


I had been thinking more of an issue with bad grounding (i.e. grounding rods that dried out and only work properly when the earth is wet), but trees are even more unexpected.


This was my thinking. Or some other poorly seated electrical connection that somehow got better when it was damp.


My guess was that water accumulates somewhere and bends the antenna out of alignment.


My guess was that the directional antennas were off by enough that it didn't work well in clear conditions, but the rain refracted the signal enough to work. The actual answer was better :D


This was my first thought too. Tree makes more sense.


I know of a case in the Caribbean where a line-of-sight connection between two buildings of a bank was being interrupted by a tree belonging to a competitor bank. They asked the competitor whether they would mind cutting the tree, and the answer was sure, for the small fee of 1 million. Third-hand info, but I did hear it from a network guy I worked with.


The real lost opportunity here was figuring out the cost of running fiber instead of wireless and charging $1 less than that for the tree trimming.


At Pinterest, when we were working from one of the founders' apartments, the internet went down. Lots of debugging later, we traced it to a cable that a squirrel had chewed through.


My Wi-Fi used to work better in the rain because our signal was fairly weak to begin with (it was coming from the apartment behind the wall) and the channels were generally crowded, so (I assume) the rain helped at least insulate us from the networks in the buildings across the courtyard.


That's what I thought this article would be about before I read it. I've observed the same thing before.


Reminds me of this story about repairing a large power line:

https://www.jwz.org/blog/2002/11/engineering-pornography/


FYI, JWZ blocks linking from HN.


I feel like this story started off seeming way more mysterious than it actually was because it took so long to get to mentioning the crucial bit (the long distance WiFi bridge).


As soon as they mentioned the directional wifi I knew it was something physically between the antennas, but I was guessing human behavior. The tree was a surprise.


I was fully expecting the answer to be that the rain was tamping down some unknown source of wifi interference... Which is a reasonable hypothesis if the packet loss is also within the home network.

I was not expecting the home Internet all went over a long range WiFi bridge, but knowing that a tree makes far more sense as the problem. Strange how it correlates with rain that way.


> Happy April 1st! This post is part of April Cools Club: an April 1st effort to publish genuine essays on unexpected topics.

Honestly, reading "April 1st!", I was expecting this to be an April Fools' joke, but it turned out to be a true and amusing story.

The author was lucky to solve a technical problem in a non-technical way, unlike me!

A decade ago, I had a weird Internet connection issue. The upload speed suddenly dropped to near zero kb/s while the download was alright. I contacted my ISP, and for weeks they were unable to figure out what was going wrong. I reached out to my neighbor and offered to pay his Internet bill in exchange for sharing his Wi-Fi with me until my ISP solved my problem, and he kindly agreed. After about three months, my ISP's technical staff was still unable to fix the issue. I gave up, and guess what I did to get around it?! I just moved away to another, distant home.


Once I had an Internet cable that stopped working for a while whenever I turned on the microwave (understandable), and also stopped when it started raining. The trickiest part was that it went down just for a while and ONLY at the beginning of the rain.

I called the telecom company many times. They charge per visit if no problem is found. I always had to explain the situation, ask the technician not to charge me, and ask them to come when it started raining, a very hard thing to do because we cannot predict the weather and the network only went down for 10 minutes.

It does not seem like a big problem; after all, I could just wait 10 minutes. But after this happened multiple times a day every rainy week, making me lose meetings, work, server connections, etc., I had to patiently chase the telecom company and even ask for the personal phone numbers of technicians (to ping them when it was going to rain) until they finally found a solution.


Don't leave it at the cliffhanger


It was due to a pipe connected to the rain gutter of the house. At the beginning of the rain, the water streaming through hit a joint in the internet cable just right. So it only happened when the rain started, and stopped once the little stream got stronger and consequently jumped past the joint, no longer hitting it. It was discovered by the technician, who found it somewhere in the 100m of cable coming out from the street pole, not an easy task.


This reminds me of when I used to work for an ISP. Every November the "my internet doesn't work when we turn on the Christmas tree" calls would start. It was usually interference from cheap tree lights, but occasionally it was people unplugging the router to plug in their Christmas tree :D


I recall there was a story about a computer mouse not working when it was sunny? It had to do with the sensor. I can't find it so I'm starting to doubt if that actually happened...

Edit: Found it: https://news.ycombinator.com/item?id=37585548


Wouldn't surprise me. Optical mice don't like transparent or some translucent surfaces.


This reminds me of when I took my PS5 to my family's house for Christmas vacation. We both have the same SSID because I set up both access points but they changed the password when they forgot it because they're a bunch of bozos.

My PS5 controller refused to connect to my PS5 and I couldn't figure out why. I gave up, and after a few days tried again, only to realize that the PS5 controller can't connect to the PS5 wirelessly while the console is trying to join Wi-Fi with an invalid password. I still don't know why it was a problem, or if it still is a problem, but it was a monster to debug. lol

The reason I didn't fix the wifi in the first place was that I didn't have a spare USB-C to USB-A cable to hardwire my controller, and I was playing a single-player game. I think it was The Last of Us.


Weighing down the leaves is one possibility, but is it also possible that water adhering to surfaces was creating reflections and providing alternative paths for the signal? I don't know if such a thing could happen at that frequency, but if you imagine looking out a window at night towards a window in a distant building, with a tree blocking the direct line of sight, and someone in the window sending you a message with a flashlight, you would only see the light via reflections. On a dry night the reflections might be matte and hard to see, but on a rain-soaked night you might be able to see stronger and sharper reflections on metal poles, walls, or puddles that were visible from both windows.


Reminds me of a song by Fleetwood Mac. Wifi only woooorks when it's raaaaining ...


Another fun story is the "My Car does not start when I buy Vanilla ice cream": https://news.ycombinator.com/item?id=21779857


I once had a Time Warner tech blame the moisture content of the air for impacting the copper cabling to explain outages at our apartment. This both makes more sense and, I suppose, is more interesting.


My parents’ house was connected to old copper phone lines from ca. 1940, and by the 1980s rain intrusion from long or heavy storms would cause massive static on the line (made worse when squirrels ran on the line). This was disastrous when I got my first modem ca 1986. Just unusable until it dried out.


Interesting! This was a relatively new apartment building in Hollywood with similarly aged cable — maybe he was right and I'm altogether wrong. They ran the cable up through vertically adjacent units as well which left you at some degree of mercy of your downstairs neighbors.


A wet month in Hollywood is drier than a wet day in the southeast US, but maybe.


Rain can improve signal quality by effectively lowering the noise floor: it attenuates the weaker signals originating from other emitters more than your own, making your own signal stand out.


I own a Ford Focus and to this day I don't understand why sometimes the gear shifts make this cracking noise when decelerating to zero, but only when it is raining.


car clutches and brakes famously act/sound different in the wet, maybe a clutch or friction material somewhere acting differently?


Is it a 2012-2016 model? Those have the PowerShift transmission (known as the PowerShit on forums) that had a class action settlement, though I think it's too late to cash in on that.


yeah, maybe that's it :-/


ABS?


As soon as it mentioned the line-of-sight antennas I knew it had to be a tree. It often surprises me how much branches can sag from the weight of rainwater.


Good read, but I was convinced as soon as I read the title that the rain was shielding the house from a nearby transmitter that was blasting too much power.


I will be adding this one to https://500mile.email ! What a great story.


I dislike calling this "magical thinking"; just because the plausible causal relationship takes a little time to discover doesn't mean it's implausible at the outset.

In fact, the causal relationship between rain & wifi is taken as a given by the author:

> If anything, rain makes wireless signal quality worse

It's not too surprising to discover a causal relationship between two things we already know are causally related.


You don't think it's unusual to find a positive correlation where there is usually a negative correlation?


Unusual yes, magical no


The WiFi only works when it's raining (ps. one critical part of my WiFi setup is outdoors and depends on having a clear line of sight)


The author’s completely over the top reaction to the plausible and not especially weird titular statement gets old, quick.

Especially when they reveal that the network is using wifi antennas over a non-insignificant distance in an urban setting. Of course it’s the local wireless point to point bridge! The first thing you’d do is look down the line of sight for interference.


Reminds me of that time when my desktop computer wouldn't turn on if the printer's USB was plugged in to the wrong kind of port. Took me some time to figure that one out.

Or that other time when my mom's phone started crashing all of a sudden. Until we discovered that it was caused by her new ID card, in the folding phone case, touching the back of the phone.


Back in the dialup days, my dialup would die, and be unable to re-connect at dusk. Other than that, it was fine, for dialup.


My garage door opener works much better when it’s raining and also at night. In our case, our solar inverter (or one or more of the optimizers at the panels) creates enough noise to interfere.

I also believe our microwave is adding noise to the same circuit our WiFi router is on. Despite using 5GHz, WiFi is severely degraded whenever the microwave is on.


You should get a new microwave. It's probably not the mains circuit; it's probably leaking enough radiation to overload the RF front end on the router.

If the router's front end gets enough energy, the "sine wave" of the radio signal starts to flatten out at the top and bottom and becomes a rounded-off square wave, which we call "clipping". This still has the original frequency component, but also a ton of other frequency components that push up into much higher frequency bands like 5GHz.

Fun fact: this is why overdriven electric guitars get that high-pitched scratchy sound; they're hitting the distortion/clipping threshold of their amplifier.
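
To make that concrete, here's a tiny Python/numpy sketch (purely illustrative, scaled down to audio frequencies with made-up numbers, not anything from the comment above): hard-clipping a clean sine puts energy at odd multiples of the original frequency, which is the same mechanism by which an overloaded receiver front end splatters energy far above the original band.

  import numpy as np

  fs = 100_000                                 # sample rate in Hz (arbitrary)
  t = np.arange(0, 0.1, 1 / fs)
  tone = np.sin(2 * np.pi * 1_000 * t)         # clean 1 kHz sine
  clipped = np.clip(3 * tone, -1, 1)           # overdrive it, then hard-clip

  spectrum = np.abs(np.fft.rfft(clipped))
  freqs = np.fft.rfftfreq(len(clipped), 1 / fs)
  strong = freqs[spectrum > 0.01 * spectrum.max()]
  print(sorted(set(np.round(strong / 1_000))))   # ~[1, 3, 5, 7, ...] kHz: odd harmonics show up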


To add another weird tech issue to this collection: The camera shy Raspberry Pi. A model of the RPi would reboot when photographed with a flash: https://www.bbc.com/news/technology-31294745 (2015)


One of the sysadmins at a friendly company had the same problem... only that it was a >2km link in a densely built-up city and it turned out to be a crane that moved in on rails each morning with the beginning of a shift and then moved out in the evening!

Took a guy standing there with binoculars to realize what was going on...


I also had another story: my friends asked me to troubleshoot their DSL connection. It dropped out sometime after the sun set and came back in the morning.

Sure enough, this turned out to be RFI from newly installed solar inverters, creeping down the shield of an unused CB radio coax that ran parallel to the phone wiring.

Grr, I hate DSL...


I remember when I had ADSL2, I would get increased packet loss when it rained. After I failed to find the culprit, technicians from the ISP tried to debug it to no avail. It went away when the connection got upgraded from ADSL2 to VDSL2.


One winter it had been so cold that I had Cartoon Network and other cable stations available on my terrestrial antenna. When the freezing temperatures ended, I had to buy cable because I'd gotten used to it.


I hate how fake this sounds since it's a funny story.

Tree leaves being weighed down by a slight drizzle? What tree does that? None around where I live.

A tree blocking signals that strong as well? Doesn't make sense to me either.


When it's extremely windy, our office Internet or Wifi speed slows down. Haven't worked out why yet. We have fibre to the building, so it's unlikely the uplink. But you never know.


It may not be the Wifi but something more upstream. If the ISP's interlinks use microwave towers, then the physical swaying would have some effect on channel interference. I don't quite understand how (because the general direction remains the same over a large distance), but I have observed this to happen.


Could also have been that a neighbor had a non-compliant WiFi device that sent out deauthentication packets; then it would also work better during rain.

And the same upgrade would often fix it.


Nice story! My guess was either the water cooling an overheated device, or the weight of the water bending the roof the antenna is attached to.


I feel like the first step would be looking at the antenna, since you’re using a weird antenna connection to a different building.


"Thunder only happens when it's raining"

https://youtu.be/Y3ywicffOj4

1:15


This is how you know the WiFi is Garbage - only happy when it rains.

At least you get a clue in this case (and the famous 500-mile email). I was having sporadic disconnections with cable internet for the last 2 months and my ISP couldn’t find anything wrong.

Had to switch back to DSL and pay more for slower speed.


I was expecting a story about frolicking about in the sun. I'll settle for a hardware troubleshooting adventure.


Why does this sound like it was written by AI?


Human writing with short context window can sound like AI too


(Author here) The image is AI generated. I did not use any AI to write the post itself.

That's just my style of writing, you can check my pre-GPT posts and compare if you'd like.


The article briefly mentions that this was unbelievable because rain should make Wi-Fi worse not better.

That parallels my experience, but I didn’t realize it was commonly understood. I noticed that in the hot summer the Wi-Fi reception in my yard (i.e., farther from the access point in the house) is worse. Eventually I decided that summer heat is really a proxy for humidity, and that it wasn’t unreasonable for a high water concentration in the air to be an obstacle to the Wi-Fi signal.


Reminds me of that old tale about a lady whose phone would not ring, and her dog would bark before the phone rang.


Just the basic facts... Climbing a nearby telephone pole and hooking in his test set, he dialed the subscriber's house. The phone didn't ring. He tried again. The dog barked loudly, followed by a ringing telephone. Climbing down from the pole then he found:

a. Dog was tied to the telephone system's ground post via an iron chain and collar.

b. Dog was receiving 90 volts of signalling current.

c. After several jolts, the dog was urinating on ground and barking.

d. Wet ground now conducted and phone rang.

Which goes to prove that some grounding problems can be passed on.


Thank you! I had forgotten the details.


I have a hard time believing this. Wifi can go through multiple walls. And if these were directional P2P links, they can easily go through and even around trees; I've deployed them in the past and they don't need perfect line of sight.

Granted the equipment could have been cheap, but this sounds questionable. He's asserting that a few leaves at the top of a tree were blocking it when it wasn't raining? Idk.


I'm the author. Among other things, it's a distance problem -- WiFi can go through walls when the router is right there. But distance attenuates the signal as distance squared, and in this case we're talking about hundred+ yards/meters instead of just a couple.


> in this case we're talking about hundred+ yards/meters instead of just a couple.

Sure but didn't you say you were using directional antennas?


No antenna is directional enough to overcome n^2 scaling. Especially not the mid-tier consumer-grade hardware I would have had access to at the time.

Rough rule of thumb, a consumer-grade directional antenna (at least at the time, maybe they've improved in the last 10yrs) will give your signal strength a one-digit multiplier (say ~8x), meaning ~7-10dB. But that n^2 means that improvement only takes you 2-3x farther, not 8x.

Here we're talking about ~100x the distance, which would need a 10000x = 40dB improvement in signal strength. AFAIK an antenna like that would cost more than the entire city block where I grew up
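
For anyone who wants to check that arithmetic, here's a one-function sketch (my own illustrative numbers, not the author's): under free-space, inverse-square loss, an extra G dB of gain stretches the usable range by a factor of 10^(G/20).

  def range_factor(extra_gain_db: float) -> float:
      # Received power scales as gain / distance^2, so G extra dB of gain
      # buys a range multiplier of 10 ** (G / 20).
      return 10 ** (extra_gain_db / 20)

  print(round(range_factor(9), 1))   # ~2.8x farther from a ~8x (9 dB) antenna
  print(range_factor(40))            # 100.0x farther needs the full 40 dB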


Won't 20dB at each end do the job? You can get 16-17dB antennas for $35 these days. And a wire parabolic dish promising 23dB for $200.

7-10dB matches the antennas I personally use, but those antennas are smaller than my phone. You can get a lot more out of a three foot spike or a two foot dish.

(I'm assuming 2.4GHz and hoping they're not lying about the gain.)


Right. This is a multiple timelines problem.

This story happened a bit over 10 years ago at this point. At the time of the story, the Wi-Fi bridge was already almost 10 years old -- we'd had the same equipment since I first remember getting internet at home.

That means the antennas at the start of the story were mid-2000s mid-tier consumer level equipment. I don't think a 16-17dB antenna would have been $35 at the time, in either mid-2000s dollars or in today's dollars. I also grew up in a country much poorer than the US, where $35 could feed a family for a week.

At the end of the post, we upgraded to 802.11n antennas (but still tech from 10+ years ago) which solved the problem by being newer, nicer, and having beamforming + MIMO capability which let them be more tolerant of obstacles (effectively, get more dB at the same emitted power level).


I'm not trying to critique the solution, I just think the decibel issue isn't so dire and other factors were more important. Which is supported by the way the problem was fixed.

As far as what was available at the time, I can't think of a super easy way to check that far back, but I will note that the 17dB stick antennas on Amazon have been available at the same price or cheaper for about 10 years.


Free space path loss over 5 meters with non-directional antennas is the same as over 500 meters with 20 dB of gain at each end.
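
A quick sanity check of that claim using the Friis free-space path loss formula (assuming 2.4 GHz, though the frequency cancels out of the comparison):

  import math

  def fspl_db(distance_m: float, freq_hz: float = 2.4e9) -> float:
      # Friis free-space path loss: 20 * log10(4 * pi * d * f / c)
      return 20 * math.log10(4 * math.pi * distance_m * freq_hz / 3e8)

  print(round(fspl_db(5), 1))             # ~54.0 dB at 5 m
  print(round(fspl_db(500), 1))           # ~94.0 dB at 500 m
  print(round(fspl_db(500) - 2 * 20, 1))  # ~54.0 dB: 20 dB at each end cancels the extra 40 dB

Going 100x the distance adds exactly 40 dB of path loss, which the two 20 dB antennas offset.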


A directional antenna increases the gain of the signal, but it doesn't fix the problem of the signal decreasing with the square of the distance, or of physical objects blocking it. If the distance was already near the maximum, it wouldn't take much to block it.


Fair point. I suppose it's very much device dependent. The P2P stuff I've used was good for a few km.


Wifi normally uses adaptive transmit power and data rates. If the signal gets a bit weaker, your link slows down from say 300 Mbps to 260 Mbps. No biggie.

But sometimes for direct links you set the modulation, power and data rate fixed. The end result is that changing channel conditions can turn the link from 'working perfectly' to 'not working at all'


(Author here) For our Wi-Fi bridge, the devices on both ends were set to max power.

I don't recall them being able to change data rates very much (if at all) because the ones at the beginning of the story were 802.11g devices, and 802.11g didn't have channel bonding capability or similar tricks up its sleeve. Newer equipment definitely has more options like this.


I cannot know whether the story is true, but wet leaves definitely would interfere more with the link. The typical water content (and conductivity) of tree leaves is relatively low, so water on the leaves could also be a factor, apart from the aforementioned sagging under the water's weight.

It is also true, though, that rainwater has low mineralisation and therefore low conductivity.


Keep in mind this was a 802.11g network.


I read about random bit flips but haven't read a good anecdote about it, if anyone can be so kind.


I enjoyed the story, but the writing I enjoyed even more. I really liked the tone and wittiness.


Here I thought it would be because the rain was interfering with noise from other networks.


My Wifi doesn't work properly when it's raining, can we combine forces?


(Author here) I'm worried they might combine in a way that leaves them working neither rain nor shine!


I thought it would be because waves travel longer distances on water.


Issue: rain caused the tree branch to sag lower, unobscuring the wifi.


This should have been a microwave link from the start.


>>> Maybe an antenna connector has corroded from spending years outdoors? Nope.

Most people living in large metros will never fathom how wifi can simply stop working in the suburbs. It is easy to forget that Internet cables - normally hidden in cities - are completely exposed to the elements in the suburbs.

Lost wifi while at your parents'? Check the roof!


That would be ironic on your wedding day.


What kind of a tree is it?


(Author here) I'm not a tree expert, so ... a tall deciduous one? Sorry!


Thanks, was wondering about the mechanics. Boughs with leaves would droop but needle trees not so much and would be more transmissive in the first place.


Sounds like it would be a neat excuse to get the children to go out when it's sunny!


"Close the window you're letting all the WiFi out"


Bit disappointed ... this is a very obscure setup, and frankly speaking, under those circumstances something like "rain affects the wifi [either way]" is obviously not even close to magical thinking. Definitely not worth almost 1000 upvotes.


Offtopic.

Not sure how you're styling your links, but in a dark mode view, they are effectively illegible.

https://imgur.com/a/aSbpVF8


TL;DR: A tree grew into the signal path. Rain weighed it down, bending it out of the way.


FYI, this being part of the "April Cools" series heavily implies it's not a real tech issue but a riff on the "We can't send email more than 500 miles" / "Can't print on Tuesdays" kind of articles.


Hi! I helped review this story, and also am one of the organizers of April Cools. Two things:

1. u/obi1kenobi told me it was a real thing that happened to him

2. The point of April Cools is that the things aren't jokes. They're real essays written with care, just outside of the author's usual writing topics. Some of the other ones we got this year are about hydroponics, current events in Sumo wrestling, parenting, and decaf coffee.


> the organizers of April Cools.

The org name April Tools was right there!


Sorry, but why would 'April Tools' be better?


I'm not following, better than what?


Why would the replacement name you suggested be better than the actual name?


Oh, I see. It wasn't a suggestion for a replacement, hence 'org' in there since they were talking about 'organizers'. They could have called themselves the April Tools. Just a dumb joke/unmissable opportunity to call some perfectly nice-seeming internet strangers a bunch of tools.


Like the guy said, you can't be taken seriously. That's the problem with being an April fool. You are just a funny, entertaining guy, not a source of information.


> Happy April 1st! This post is part of April Cools Club: an April 1st effort to publish genuine essays on unexpected topics. Please enjoy this true story, and rest assured that the tech content will be back soon!

The post’s disclaimer (and April Cool’s site itself) both imply that the goal is to touch on novel topics and should be genuine content of the author. That said, this story could clearly be apocryphal.


From the article:

> One such piece of magic new to 802.11n Wi-Fi is called "beamforming"

That's not quite true. 802.11n has MIMO (Multiple-Input Multiple-Output) processing, with "multiple" referring to the number of receiver and transmitter antennas. Beamforming is a special case of MIMO, and MIMO is a generalisation of beamforming.

In a "Line-of-Sight" channel with no reflectors, MIMO converges to a beamforming solution. Capacity is then limited by the ability for the rx/tx array to resolve each antenna in the tx/rz array: the diffraction limit.

In a "rich" channel, with reflectors, MIMO converges to a more complex solution, which takes advantage of the angular separation of the reflectors to resolve the individual rx/tx antennas, even if they are too close to each other to resolve with beamforming. Yes, counterintuitively MIMO capacity goes up as the channel become more complex/rich and less line-of-sight, whereas with just beamforming the capacity would typically go down.

You can sort of think of MIMO as being beamforming where beams are bouncing off widely spaced reflectors, but even that doesn't do it justice. In reality, each "beam" is replaced with complex wavefront ("mode") which is matched to the environment and each mode is orthogonal to the other.
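
Here's a small numpy sketch of that counterintuitive capacity point (illustrative only: 2x2 arrays with arbitrary SNR and antenna phases of my own choosing), comparing the Shannon capacity log2 det(I + (SNR/Nt) H H^H) of a rank-1 line-of-sight channel against a rich-scattering (Rayleigh) channel of the same average power:

  import numpy as np

  def mimo_capacity_bits(H: np.ndarray, snr: float) -> float:
      # Shannon capacity with equal power split across tx antennas:
      # C = log2 det(I + (snr / n_tx) * H * H^H)
      n_rx, n_tx = H.shape
      M = np.eye(n_rx) + (snr / n_tx) * H @ H.conj().T
      _, logdet = np.linalg.slogdet(M)
      return float(logdet / np.log(2))

  rng = np.random.default_rng(0)
  snr = 100.0  # 20 dB

  # Rank-1 line-of-sight channel: a single tx/rx steering-vector pair,
  # so only one usable spatial mode (pure beamforming territory).
  a_rx = np.exp(1j * np.pi * np.arange(2) * 0.3)
  a_tx = np.exp(1j * np.pi * np.arange(2) * 0.7)
  H_los = np.outer(a_rx, a_tx.conj())

  # "Rich" channel: independent Rayleigh entries, typically full rank,
  # with the same average power per element as the line-of-sight case.
  H_rich = (rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))) / np.sqrt(2)

  print(mimo_capacity_bits(H_los, snr))   # one strong eigenmode
  print(mimo_capacity_bits(H_rich, snr))  # usually noticeably higher: two spatial modes

With the rank-1 channel all the received energy lands in a single eigenmode, so capacity grows only logarithmically with SNR, while a full-rank channel can multiplex two streams.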


"They said it does X but really it does X and the superset Y"

In other words, they were correct then?


They are describing a situation in which a line-of-sight channel is replaced with a rich/complex channel: the exact conditions under which MIMO distinguishes itself from beamforming.

I'd say incomplete rather than incorrect, and the complete story is worth knowing as it makes the solution used more interesting.


(Author here.) Yeah, I get where you're coming from. Ultimately, this was an editing decision first and foremost.

Beamforming is cool and magical, and MIMO even more so. The post wasn't intended as a primer on wireless technology, just as a fun read for folks to enjoy. I tried to sprinkle in some nerd-snipe-quality technical detail and offer links for folks who might want to dig in, and MIMO is explicitly discussed in both the 802.11n and in the several links on beamforming I provided.

I barely managed to explain beamforming without that sidenote turning into a paragraph of its own. I don't think I could have done MIMO justice in a sidenote.


I get where you're coming from too.

I felt the need to mention the difference as so many people equate beamforming with MIMO and claim that there is no difference, when in fact that difference, which Foschini discovered in 1996, is responsible for spatial multiplexing and the high WiFi data rates we enjoy. I figured the typical HN reader might be more technical and interested in the difference, or maybe not.


Haha sadly from reading many of the rest of the comments, the typical HN reader seems unconvinced that trees or rain could impede Wi-Fi signals, because their Wi-Fi at home goes through walls just fine. A few even suggested I shouldn't have bothered with Wi-Fi and instead just laid cable across a few city blocks instead :)

But I do appreciate precise language and the desire to help people learn new things, so thanks for helping make the distinction between MIMO and beamforming clear! I hope at least a few more people know about it now thanks to your comments.



