Interesting how the three separate R/G/B images are taken independently a few ~seconds~ milliseconds apart, with a different color filter placed over the image sensor.
I assume this is to maximize resolution, since no Bayer interpolation [0] is needed to demosaic the output of a traditional image sensor that integrates the color filters onto the sensor pixels themselves. As these satellites are not intended to photograph things in motion, the color channel alignment artifacts seen here are a rare, small price to pay for vastly improved resolution and absence of demosaicing artifacts.
These are pushbroom linear sensors stacked up in the focal plane. The spectral channels are physically separated by a larger distance than neighboring pixels in a Bayer grid. There is a time delay between each channel sweeping over the same location, which gets corrected when the final imagery is aligned. The moving plane at altitude violates the assumption of a static scene and exposes the scanning behavior.
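For concreteness, a minimal sketch of that alignment step, assuming the per-band row offsets are already known (the offset values and array shapes here are made up for illustration):

    import numpy as np

    def align_bands(red, green, blue, row_offsets):
        """Stack three single-band captures into an RGB image by shifting
        each band by its known along-track row offset (whole pixels)."""
        def shift_rows(band, dy):
            # Crude integer shift; a real pipeline resamples with sub-pixel accuracy.
            return np.roll(band, -dy, axis=0)
        return np.dstack([shift_rows(red, row_offsets["red"]),
                          shift_rows(green, row_offsets["green"]),
                          shift_rows(blue, row_offsets["blue"])])

    # A static scene lines up after the shift; anything that moved between
    # band acquisitions (like the plane) stays misregistered and shows fringes.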
The B2 bomber photo isn't from a pushbroom sensor, because the plane is moving relative to the camera and would be distorted. That would manifest as a shear along the motion vector. There is no shear in this photo, only offset spectral bands.
I've spent a number of years doing image orthorectification and satellite imagery post-processing, and then worked on Google Earth and Google Maps, so I can tell you precisely what happened. This is an image from a satellite which photographs different spectra at different times. There isn't enough parallax at typical orbital distances for this much displacement to come from multiple cameras. Looking at some parked cars, this looks like 20-30cm resolution per pixel, which is consistent with the newer earth imaging satellites.
Agree it wouldn’t be parallax from multiple cameras on the same satellite but the delay between images seems quite short for physical filter swapping (~20ms per filter).
(Edit: Actually DLP does it in about that amount of time, nvm)
You can get height if you can find the shadow, and find the time that the photo was taken, because at any given time and date, you can compute sun position, and you know the precise lat/lng of the shadow. The shadow would need to be in the same photo, though. (actually, it's a bit to the northwest, just found it).
If you can find a vertical structure, like an antenna mast, you can measure shadow length and direction and if you know the height of the antenna, you have enough information for some trigonometry.
Speed is harder. You'd need to know the difference between exposures.
Thanks! It's conveniently hidden by a grassy area and some trees which throw shadows of their own. Actually, I should be doing something more useful at the moment...
We know that telephone poles stick up about 32 feet from the ground. Measuring the shadow in Google Earth above, it's about 20 feet. Now it's a simple ratio.
Using Google Earth, I measured the distance from the plane to its shadow at about 2,430 feet; that would put the plane at about 3,900 ft, which is surprisingly low, but I don't know the height of that pole for sure. It's the right ballpark, though.
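A quick sanity check of that ratio, using the numbers above and ignoring any off-nadir parallax of the plane in the image:

    pole_height_ft = 32        # assumed telephone pole height
    pole_shadow_ft = 20        # measured pole shadow length
    plane_to_shadow_ft = 2430  # measured plane-to-shadow offset

    # Same sun elevation for both: tan(elevation) = height / shadow length
    altitude_ft = plane_to_shadow_ft * pole_height_ft / pole_shadow_ft
    print(round(altitude_ft))  # ~3888 ft, i.e. the ~3,900 ft figure above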
> This one was probably on final approach during the summer when the winds are usually out of the south and was probably about 2,000 feet above the ground.
The B2 is based out of Whiteman AFB near Warrensburg, MO, roughly 25 miles (40 km) south of where the plane is, so it is probably on final approach to land or has just taken off.
> Basically a 1D sensor that acquires a 2D image because the camera itself is moving.
Or a static camera and a moving subject. This same style of camera is used for "photo finishes" in races. The camera is literally the finish line, and the first pixel to "cross the line" is the winner.
I haven't seen a CMOS-based CCD yet, and I don't _think_ microbolometers are CMOS-based (and wouldn't be surprised if no one has publicly tried).
Happy to see some exotic examples, though.
Those always pique the hacker's curiosity.
Most CMOS imaging sensors in digital cameras do scan, but not like a push-broom sensor. The push-broom sensor only has a single line of pixels that physically moves (or that something moves across), like in a scanner or a photocopier. But as others have said, those sensors might also be fabricated with CMOS technology.
Whereas your normal digital camera has a fixed rectangular array of pixels, and the scanning is just reading them progressively (what's called 'rolling shutter') because it's hard to read them all quickly enough at once. Some more expensive sensors do 'global shutter' which usually has better motion characteristics for video.
Exactly! I'm working on some stuff now to exploit this very effect: the small time difference between the bands can be used to find the velocities of moving objects.
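As a rough sketch of the idea, assuming you know the ground sample distance and the inter-band delay (both values below are made up), the speed estimate is just displacement over time:

    import numpy as np

    def speed_from_band_offset(dx_px, dy_px, gsd_m, dt_s):
        """Ground speed of a moving object from the apparent offset between
        two spectral bands (ignores the satellite's own motion and parallax)."""
        displacement_m = np.hypot(dx_px, dy_px) * gsd_m
        return displacement_m / dt_s  # m/s

    # e.g. a 10-pixel offset at 0.3 m/px with a 0.2 s band delay -> 15 m/s
    print(speed_from_band_offset(10, 0, gsd_m=0.3, dt_s=0.2))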
That's not what I'm doing but it seems possible you could watch refraction patterns and tell that there are submerged objects, but you'd need to know the water depth without that object there, as there'd be no great way to tell if the object is a submarine or a pile of sand. Also, most of these satellites pass over fairly rarely so wouldn't be super useful for tracking things that move. I'm mostly interested in finding the piles of sand :)
It depends on the satellite. Weather satellites in GEO do, because they aren't moving relative to the surface and they have time constraints for scanning the full earth disc. Higher resolution imaging satellites use linear sensors. However, they use TDI imaging, which at a low level makes them superficially resemble an area array with limited vertical resolution, but the light collection is still fundamentally different.
I don’t think this is the case, as we don’t see any rolling shutter distortion. A linear sensor scans just like a shutter curtain and would exhibit the same distortion, but the plane is proportioned correctly. Maybe the scan time is fast enough.
The google maps scale shows the plane as being pretty close to that length as well. I think the satellite is high enough up that altitude scaling for aircraft is going to be minimal.
Depends on the needed resolution. You can cover a lot more earth, far more cheaply with a satellite with several meter wide pixels.
When I used to work in remote sensing I remember that Bing Maps had a very distinct resolution transition from spacecraft to aircraft. Aircraft coverage maps were a small subset of the total earth land area, but most US towns had some coverage at that time (~2012).
As you zoomed in, you could tell when Bing switched capture methods because the aircraft cameras were clearly off-nadir, and you'd see off-angle perspective differences.
They only use aircraft around larger urban areas. Satellite imagery is used for rural areas. You can see the resolution difference by seeing how far you can zoom in.
If you zoom in on a road near the pinned location for this post, and then go to, say, downtown Kansas City, and zoom in, you'll see a pretty significant difference.
Flying a plane around the planet sounds a lot more expensive financially and thermodynamically than just beaming down images from an object constantly falling into the views you want to see.
It's definitely a mix, depending on the area and the resolution. For urban places where they have 3D views and higher resolution, it's definitely planes. For a random place like this at low resolution, it's satellites.
The satellites are probably at 400 km altitude or more, while the plane is probably at 10 km. The plane is in the same rough order of magnitude as really tall mountains, which the satellite is presumably designed to compensate for, so I agree with your general assessment.
The shadow is 750 m away and the sun is about 15 degrees off noon, so the B2 altitude is roughly 3 km. A cousin comment says the satellite altitude is 614 km, so I wouldn't expect parallax effects to be dominant. This is also evidenced by the fact that both the plane and the ground are in focus, and the plane measures 70 feet long in Google Maps, which is what it is specified to be in reality.
You're absolutely correct, but as the aircraft is much closer to the ground than the imaging satellite I disregarded it. Spherical cows and all that :) I'm sure a military analyst would do the calculations!
It actually works out to be pretty close. A satellite flying ~17k mph at 250 miles altitude would largely have the same angular displacement from the ground as a plane moving 400 mph at 31,000 feet. Most satellites fly in an easterly direction, so the question would ultimately be what the inclination of the orbit is relative to the direction of flight.
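The back-of-the-envelope version of that comparison, with the numbers from the comment:

    sat_speed_mph, sat_alt_mi = 17_000, 250
    plane_speed_mph, plane_alt_mi = 400, 31_000 / 5280  # 31,000 ft in miles

    # Angular rate of each as seen from directly below (radians per hour)
    print(sat_speed_mph / sat_alt_mi)      # ~68
    print(plane_speed_mph / plane_alt_mi)  # ~68 -- nearly identical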
I think this plane is flying much lower and slower. The B2s are based at Whiteman AFB, which is about 25 miles south of this photograph. The prevailing winds in this part of the country are from the south, so I suspect that it is about to turn onto the final leg of its approach for a landing to the south. I’d guess that it’s no more than 8-10K’ and traveling at 200-300 knots max.
Definitely. The rule of thumb in a jet is to slow down if you're closer than 3x the altitude. So at 8k feet you want to be at least 24 nautical miles (about 27 statute miles) away, or if closer, you want to be slowing down (or descending). If you don't slow down beforehand, you're going to have to do so during the descent, and typical jets can't really slow down much while descending.
Most imaging satellites are in sun-synchronous orbits, so in this case likely moving north or south and slightly retrograde, i.e. to the west.
I'd bet orbital motion is negligible here because the distortion seems to the eye to be entirely in the direction of the plane's apparent motion vector and I don't see any significant skew to that.
That's super interesting! I love this discussion. Would the satellite be that low though? 250 miles (400km) is still going to experience significant atmospheric drag. Fine for a cheap mass deployment like Starlink, but I'd expect an imaging satellite to be set up for a longer mission duration. Then again maybe the cost of additional mass for station-keeping is worth it for the imaging quality.
According to the image on Google maps it was taken by Maxar. Maxar appears to have a few satellites, looks like 5 in geostationary orbit (~35,700km), 3 at about 400km and 1 at about 500km.
Interesting, I would have made the opposite assumption ("the satellite is moving so much faster than everything else, we can estimate the interval from the satellite's speed alone"). It seems both speeds might have a similar impact, as per the sibling comment.
What an awful picture of the NRO Director. Surely they have plenty of experience in photography in less than ideal conditions. So why have this photo as the first thing I can see on their website?
The juxtaposition of the Ursa Major and Onyx/Vega patches almost gave me whiplash. Judging by the list of patches it seems like NRO is a fun place to work. Thanks for sharing!
Fun is maybe not quite the word I'd use. The juxtaposition of all the wording about "defense" and menacing-looking characters that would definitely be the villain and not the hero in any comic book or movie is... unsettling. Even if it's a self-aware joke about their public image (like the squid giving the whole Earth the hug of death), it's maybe gone a bit too far?
The gorilla holding the US flag is my favourite. And it might just be me, but the Nemesis badge has to be a sphincter, right? It reminds me of the Greendale flag in Community.
I like how they quit looking like patches and just turned into stickers. Even with a black budget, they still had to cut corners and reduce spending on incidentals too?
Click the timestamp for the comment, then click "favorite", then set a reminder in your calendar to check your favorited comments (link in your profile).
What do you mean by “copy/paste cache”? If you mean the clipboard history, there’s no such thing on macOS. If you mean the clipboard (= the last thing you copied), Siri can work with the Shortcuts app, which itself can read the clipboard.
Doing it this way allows not only higher resolution, but you can also use different types of filters for specialist applications: infrared, for example, or some kind of narrow band-pass filters, etc.
Nonsense. Landsat 8 bands 5, 6, and 7 are infrared bands (5 is NIR, and 6 & 7 are shortwave or SWIR), and bands 10 and 11 are lower-resolution thermal infrared (TIRS).
I think what they meant is that prior to the "digital farming revolution", if you called up a satellite/aerial imagery company and requested anything other than visible spectrum, you'd not only get some strange looks - they'd also alert the government.
I grabbed my red and blue glasses because this vaguely looked like a 3D anaglyph to me, and while it's not the clearest such image, it definitely appears as a cohesive object hovering above a flat map. The further I zoom out, the more it looks like it. An inadvertent byproduct of however this was photographed or stitched?
Is it possible this is a pushbroom sensor, where red, green, and blue sensors are moved along the path of the plane or satellite that's taking the image? My understanding is those have a higher effective resolution than frame-based sensors, which would make sense here.
Seems unlikely to me. With a pushbroom sensor, I'd think you'd get characteristic squishing, stretching, or skewing of objects in motion depending on whether the object was moving against, with, or perpendicular to the pushbroom path. In this case, with the imaging satellite probably in polar orbit and the plane flying east, you'd likely see the plane horizontally skewed as each horizontal row of pixels captured the plane further east.
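A toy simulation of that effect (everything here is made up purely to illustrate the geometry): each image row is exposed at a different time, so a target drifting across-track comes out sheared.

    import numpy as np

    def scene_at(t):
        """Hypothetical ground truth: a 10x10 square drifting right 1 px per time unit."""
        scene = np.zeros((100, 100))
        x = 40 + int(t)
        scene[45:55, x:x + 10] = 1.0
        return scene

    def pushbroom_capture(n_rows, n_cols, line_time):
        image = np.zeros((n_rows, n_cols))
        for row in range(n_rows):
            t = row * line_time            # the sensor sweeps one row per line period
            image[row] = scene_at(t)[row]  # only that row is recorded at time t
        return image

    img = pushbroom_capture(100, 100, line_time=1.0)
    # Rows 45..54 catch the square at successively larger x, i.e. a diagonal shear.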
Also came here to discuss this. Super interesting opportunity to reverse what they're doing. Yeah, I'd agree it's color filters, which would avoid duplicating lenses (probably the most expensive component) or dividing the light energy. Filters would give you the most photons per color per shot. I wonder if the filters are mechanical or electronic somehow. As another commenter calculated, it's 10ms between shots, which makes moving physical filters around not that impractical when you consider that modern consumer camera shutters go as fast as about 0.125 milliseconds.
Your guess is exactly right, this type of imaging is pretty common with satellite imaging for the reasons you mentioned. Scott Manley talks about it in this video iirc https://youtu.be/Wah1DbFVFiY
I wonder if there are ways to get superior (compared to Bayer) color images from conventional raw-capable monochrome digital cameras (like Leica Q2 Monochrom) in static scenes using similar workflows of taking three separate shots.
Many cameras, especially medium format like Fuji GFX, can do "pixel shift" capture, where they use the sensor stabilization to shift the sensor by one pixel at a time to ensure red, green, and blue are each captured at all pixels, without needing demosaicing. The camera has to be mounted on a tripod for this to work - like any stacking requires, really.
Initially I thought pixel shift might be the way to go, but stumbling across [0] I got the impression it isn’t necessarily as great as it sounds (and, I guess, neither would be any three-shot RGB-filter-swapping composite).
In fact, the first color photos in the world were taken this way in the early 1900s [1]: take three monochrome exposures onto black and white transparencies with R/G/B filters, and then project the three images together.
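A minimal sketch of that workflow for a modern monochrome digital camera, assuming three exposures shot through R, G and B filters (the filenames are hypothetical):

    import numpy as np
    from PIL import Image

    paths = {"r": "shot_red.tif", "g": "shot_green.tif", "b": "shot_blue.tif"}
    bands = {k: np.asarray(Image.open(p).convert("L"), dtype=np.uint8)
             for k, p in paths.items()}

    rgb = np.dstack([bands["r"], bands["g"], bands["b"]])
    Image.fromarray(rgb).save("composite_rgb.tif")
    # A real workflow would also register the three frames; any drift between
    # exposures shows up as colour fringing, exactly like the plane above.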
Interesting. Sadly, I believe the Photoshop approach would not produce a DNG file that you could normally interpret in a raw processor of your choice. Looks like pixel shift cameras (which apparently still use Bayer sensors, just moving them around, to debayer at capture time so to speak) is the most practical option at the moment.
I’m curious whether it is technically possible for a digital sensor to capture the entire light spectrum of the scene, without the need for RGB separation at any stage, similar to a Lippmann plate [0].
Though they're not as good in most other dimensions as conventional Bayer filter sensors (most Amazon reviews e.g. say that it's impossible to photograph moving human subjects without daylight iirc).
I looked at Foveon before, and got the impression that it still effectively relies on RGB separation/recombination, though in a more elegant way than Bayer sensors. (That said, Sigma’s doing some really cool stuff.)
It's standard when using reflector/catadioptric objectives to peek back at where the satellites are: https://www.astroshop.eu/filter-wheels-filter-sliders/starli...
This one has USB; the same style with manual operation via a toothed wheel is about 169 EUR at that shop.
Or consider rigging up a burnt-out DLP projector's filter wheel, which comes with a motor and encoder that you could use to capture imagery with the filters it provides at much higher frame rates (a few hundred fps in the monochrome domain seems usual these days, AFAIK, for good consumer DLP projectors).
That sounds very interesting. I feel like a firmware/hardware automation combo that swaps RGB filters and takes exposures in sync could provide interesting workflows for owners of those monochrome digital cameras.
I wouldn't be surprised if these military bombing dudes took not only R, G, B, but also near IR, far IR, UV, and various other things. If they did, it would make no sense to try to make a Bayer filter for all of that; rather, you'd just have multiple filtered cameras pointed in the same direction.
Different wavelengths travel at different speeds through glass and air and focus differently, even with the same lens, when color photos are taken in a single frame.
“Rather than taking a photograph, satellite sensors record electromagnetic radiation in the red, blue, green, and high-resolution panchromatic (black-and-white) bands (as well as several not visible to the human eye, as this Mapbox[2] post helpfully explains). When these bands are combined to produce a visible image, fast-moving objects – like planes in flight – don’t quite match up, producing the rainbow effect.”
> Rather than taking a photograph, satellite sensors record electromagnetic radiation in the red, blue, green, and high-resolution panchromatic (black-and-white) bands (as well as several not visible to the human eye, as this Mapbox[2] post helpfully explains). When these bands are combined to produce a visible image
That's the same as taking a (normal, digital) photograph (sans the additional non-RGB layers), just spelled out.
The difference is that typical digital cameras have a Bayer filter in front of the sensor to allow taking a single color picture, vs. here it sounds like multiple captures with individual filters (e.g. 1 blue, 1 red, 1 green, 1 monochrome) that are later combined into a color picture.
There's some subtlety here. Colors are percepts: phenomena that happen in the brain. Wavelengths and radiation describe natural electromagnetic phenomena, not percepts.
Example: Infrared has no "color". It's possible to induce the illusion of colors in the absence of light altogether. I appreciate that the article took care with this language, as it's a deep and fascinating topic.
Reminds me of when Russia tried to blame Ukraine for downing the MH17 by producing a satellite image of the plane being chased by a jet fighter and even capturing the moment of launching the missile. Of course none of the moving objects had any sign of the color shifting.
They take a separate high-res monochromatic picture. (As image compression theory tells us, if you can afford to sacrifice some colour resolution for monochromatic resolution -- do it.)
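That trade is what pan-sharpening exploits. One classic recipe (a Brovey-style sketch, not necessarily what any particular satellite provider uses) rescales the upsampled colour bands so their intensity matches the high-res panchromatic band:

    import numpy as np

    def brovey_pansharpen(rgb_low, pan, eps=1e-6):
        """rgb_low: HxWx3 float array already upsampled to the pan grid,
        pan: HxW float array. Returns a sharpened HxWx3 image in [0, 1]."""
        intensity = rgb_low.mean(axis=2) + eps
        ratio = pan / intensity
        return np.clip(rgb_low * ratio[..., None], 0.0, 1.0)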
It's interesting how some satellites take the alpha channel pic before the RGB channels, and some do it after. The lag also changes. Alpha & RGB have similar lags for some, but alpha is way off on others.
All the B-2s are based at nearby Whiteman Air Force Base. They fly over often -- one even flew over my farm today, about an hour ago.
This one was probably on final approach during the summer when the winds are usually out of the south and was probably about 2,000 feet above the ground.
On Google Earth you can use the timeline feature to actually see when the imagery was captured (or at least uploaded?). The plane sitting near the hangars was from a capture in May 2016. Google Earth currently shows several newer captures, including one from September 2021. The current one from September 2021 is actually far more interesting in my opinion. In it, you can see a B2 sitting in a field about a quarter of the way up the runway (38°43'27.63"N 93°32'54.86"W) with stuff scattered all around it and trucks sitting on the runway. https://i.imgur.com/lh1hkWN.png
Ah very interesting! I saw the skid marks on the runway north of the plane and was wondering if it was an emergency landing situation considering that it was just on the side of the runway. Pretty cool to read the article and get the other half of the story.
"According to sources, the B-2 experienced a hydraulic failure in flight and had its port main landing gear collapse during landing, sending it off the runway with its wing dug into the ground. We cannot confirm that this was the case at this time, but the satellite image above does concur with the gear collapse/wing down aspect of the incident. While it is possible the aircraft was rolled off the runway after the fact, this is unlikely, especially considering its wing-down disposition. The damage to the aircraft also remains unknown. "
Yeah I panned over to the munitions bunkers (I think there is a better name I forget) and there seemed to be so few, but I guess you don't need much space for nukes...
There's really no point. A few high sensitivity assets are lower resolution, but any state actor that wants very high resolution shots of military facilities already has it and on a quicker update cadence than google maps allows.
This isn’t an answer to your inflammatory question, but there are only a handful of nation states with nuclear weapons, and France is one of them.
These aren’t hand-me-down US warheads on the back of a missile truck: the flagship of the French navy is a 40,000 tonne nuclear armed, nuclear powered aircraft carrier floating above a strategic submarine fleet that can destroy the world.
Are we talking about the same France? France, one of Europe's largest military powers, which actually uses its military in various conflicts around the globe?
Maybe their argument is that France has nothing to fear from anyone, so why the secrecy? Of course it's nonsense no matter how you spin it. All militaries love secrecy (except maybe the ones that are just for show, but I expect even the Swiss Guard in the Vatican has its secrets).
I'm guessing their comment is from the view of the French being "cheese eating, surrender monkeys", a view not supported by France's long military history. This view was supported, for at least a few years, by a Google "I feel lucky" search for "French military victories" taking you to a Google search mockup site which suggested "Maybe you mean: French military surrenders".
Apart from the fact that the view of France as "surrender monkeys" is nonsense, it also makes no sense as an argument that their military wouldn't be interested in secrecy.
Someone correct me if I'm wrong, but I don't think the biggest threat to domestic military facilities are foreign state actors but rather domestic terrorists, and those usually don't have access to any military-grade satellites.
Interesting that you mention only domestic military facilities... When I was deployed to Afghanistan ten years ago, with rockets and mortar rounds landing daily inside the perimeter of our Forward Operating Base, I was irritated to see recent and high-resolution satellite imagery on Google Maps. I can see my sleeping tent, office building(s), and parked helicopters in perfect clarity. The imagery hasn't been updated since then, so you can still see everything despite this base being abandoned years ago.
https://www.google.com/maps/@33.9428134,69.0629605,492m/data...
Do they need military-grade imagery? The Chinese or Russian militaries do because they’re trying to estimate war-time performance but I find it hard to believe that there are any domestic terrorists building sensor arrays to support SAM batteries. If they’re trying to blow one up, the resolution on a consumer drone or telescope would likely be more than enough.
For my family's semi-rural home, I'm seeing that I can buy 0.5m (50cm) imagery from the last year for $250. There are no historical images with higher resolution I can buy. To task a new photo at 40cm resolution would cost $1,800.00 from KOMPSAT-3A (and it'd be a 10km x 10km area).
Not all of them. Some of them just don’t offer photos of that area, even though they offer literally every other inch of the earth in at least SOME scan, a lot of these are missing.
The best air show I've ever been to was at Will Rogers World Airport in OKC. I figure some of it must be the proximity to Tinker, but also other bases in the area. Back in 95 (could have been 96, but not any later, as I left the area late that year), they brought a B-2 to the show, flew it around the airport for a while, landed it, waited a while so people could take as many pictures as they wanted, then took off and flew it home. At the same show, they brought an F-117, did the flyby, and landed it. Then taxied it over to a static display area so people could get close. I was able to walk underneath it and touch it. As a young airman, I was absolutely floored. And this was in 95 or 96, so we were still really impressed with how exotic these planes were.
I remember being part of the staff of NATO's only MiG squadron during military service. People considered a MiG within NATO a sensation, while I was simply bored. I saw them every day in droves. ;)
Apparently prior to getting their hands on a MiG-25 from a defector who took one to Japan, many in the western military and intelligence communities envisioned it to have super avionic powers (that persisted for a long time in the civilian sphere). They thought it was agile and fast. What they didn't know is that it could only sustain those speeds for short spurts (else the engines suffered from metal fatigue due to temperature) and the maneuverability they envisioned from the geometry proved to be false (it had a very wide turning radius)...
The Romanian MiG-21s [0], Polish MiG-21s (when in service, before retirement) and MiG-29s along with the Bulgarian MiG-29s were upgraded to NATO standards. The primary upgrades were the radios and IFF transponder. I'm not sure if they all included
It is widely rumored that certain spare parts for NATO MiG-29s are actually made in US, from sources initially supporting the Adversary training missions of the 57th Wing Nellis AFB and US Navy Adversary training missions, including TOPGUN department. This is primarily for the MiG-29 and Su-27 fighters.
Yeah I remember being thrilled seeing them while canoeing on the Gasconade when I was a kid in the 90s. I believe that we saw one refueling behind a KC-10 in the air as well.
I grew up in the south of the state but I remember them flying overhead when I was a kid now and again. Always stood out like a sore thumb, left a particular contrail too as I recall.
The aircraft seems to be a Northrop Grumman B-2 Spirit.
> The Northrop (later Northrop Grumman) B-2 Spirit, also known as the Stealth Bomber, is an American heavy strategic bomber, featuring low observable stealth technology designed for penetrating dense anti-aircraft defenses. Designed during the Cold War, it is a flying wing design with a crew of two.[1][3] The bomber is subsonic and can deploy both conventional and thermonuclear weapons, such as up to eighty 500-pound class (230 kg) Mk 82 JDAM GPS-guided bombs, or sixteen 2,400-pound (1,100 kg) B83 nuclear bombs. The B-2 is the only acknowledged aircraft that can carry large air-to-surface standoff weapons in a stealth configuration.
My father-in-law helped design the pilot seat for that.
Unfortunately he was unable to make the transition to computer aided design. After the defense cutbacks in the 90s, he could only find minimum wage jobs for the rest of his working life.
In case anyone else is reading and thinking of that predicament, it needn't be a showstopper.
You have a couple of options:
You can do it the old way, and pay someone to convert it into the new way (I know a mechanical engineer who either sketches things or does 2D on a computer, and pays someone else to 3D it) which can then be sent to the client or manufacturer or whatever.
Or, if you're doing the whole product to sell, you can do it however you want, no one cares how their product was designed, just that the end product is good. I don't know how the housing on the monitor I'm looking at was designed, and I don't care, I just care that it's decent.
I feel like I could rant about this for a while, but all the "rules" change if you're taking clients or customers and not looking for FTE. It's got its own challenges, but with a few good / high-paying clients, it's even easier than FTE, and you can grow or not if you want.
If not, the defense contractor will subcontract to you and mark it up 4000% :)
Also, the government generally has policies that some % of things have to come from small businesses, sometimes they'll hire you just in order to check a box, I'd guess.
Nevertheless, if you can design an airplane seat, there are other customers than the air force.
Lots of niche machines are made by 3 or 4 people and sold for hundreds of thousands of dollars per unit, to farms, to factories, to anyone in business. Consumers will argue with you for a week over a $10 purchase, but people buying things to make money with won't bat an eye at $10,000.
There's a great big world out there and every hurdle doesn't have to be jumped, you can just walk around it half the time.
Of course he's doing well, he has the best son-in-law ever ;)
Jokes aside, he's doing fine. They got by on my mother-in-law's teacher salary and were lucky enough to get in on the California housing game in the early '80s. They mostly don't have to worry about the bills, and all their kids live within 15 minutes of them - we see them around 1-2x per week. And if they ever did run into issues, one of us could take them in.
I just like to keep it in mind because I've been subject to fairly good fortunes when it comes to career. Same thing with all my relatives. But my wife's family has had more bumps along the way.
My dad went through a similar transition. In his case, the world gradually transitioned away from his skill set. He was able to continue moving to new jobs that needed his legacy skill set, until eventually he couldn't find them easily anymore. By the time he was in this predicament, he couldn't find a path to restart his education. The critical issue was employment; once he found a job, the critical issue was working and keeping that job. He didn't have the luxury of taking the time to build a new skill set.
I don’t think this is true. I think responsibilities pile up and/or the need to learn more is less. The physical ability to learn is probably hardly impacted.
That's a nice idea but definitely not reality. Just try learning a language at different stages of life. I guess you could argue it's still the responsibilities holding you back but the difference is so stark that it's clear it's not just that.
Learning a new language after decades of reinforcing the knowledge of a native language is hard. I’ve attempted multiple times. Personally, I don’t think it has much to do with physical ability as much as baggage from the previous language.
I am right handed. Learning to write neatly with my left hand would be difficult too. I don’t think my brain is less capable of doing it though.
The B-2 Spirit is insane. People usually know the SR-71, but I personally think the B-2 is straight up savage - aesthetically alien, impossibly aerial, and incomprehensibly powerful. Its silhouette is terrifying: https://i.imgur.com/WqMvxXg.jpg
The measured wingspan (using the Google Maps distance measure tool) is around 200 feet whereas the measured wingspan of a B-2 on the ground at nearby Whiteman AFB is the nominal 172 feet, so its altitude AGL is roughly 14% of the altitude of the camera plane that took the photo.
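Spelling out that similar-triangles step with the numbers above:

    measured_span_ft = 200   # wingspan as measured in Google Maps
    true_span_ft = 172       # actual B-2 wingspan

    # An object at height h imaged from height H is magnified by H / (H - h),
    # so h / H = 1 - true / measured.
    print(1 - true_span_ft / measured_span_ft)  # 0.14 -> ~14% of camera altitude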
Good math. I don't know why everyone else is assuming this photo is from a spacecraft. Virtually all of the aerial images of America on Google Earth are taken with aircraft, not spacecraft.
Interesting, but I'm not sure that is dispositive. That company is credited for images even where they are obviously aerial. Google Maps images are an algorithmic mash-up of many sources, I imagine they credit partial or possible sources conservatively, to keep from omitting any potentially relevant rights.
In many tiles they don't credit anybody, so the absences of Google's own image credit doesn't mean anything. The first party does not need to credit themselves.
Would be disappointing for them to remove it for basically no reason, especially considering you can see a B-2 parked on the tarmac (https://www.google.com/maps/place/38%C2%B043'29.9%22N+93%C2%...) at nearby Whiteman AFB. The B-2 was first flown 32 years ago. Its existence is neither covert nor secret, and it's often used to do flybys at freaking football games where thousands of recording devices are present and active. There's virtually no national security risk with these images because any imagery obtainable by Google Maps would've been obtainable by whatever adversary long ago, plus some more.
Correct, but don't misunderestimate the power of knowledge.
Let's assume that some adversary is out there keeping track of these planes, and knows that they can scan Google maps for the RGB artifact by the engines to locate them reliably and programmatically.
Now Google Maps becomes a repository for information that adversary may use to validate other information. Maybe this confirms the schedule for some training exercise. Maybe this particular B2 being in this location validates or invalidates other information about US troop strength abroad.
I was taught to always "assume your adversary is capable of going through your trash" and try to prepare my "trash" accordingly.
The fatal premise is assuming Google Maps updates these areas frequently enough to provide useful info. The military security implications of Google Earth have existed since its launch and Google Maps by relation is no stranger to it. This is not new info and it hardly counts as "knowledge" much less with "power".
Reconnaissance satellite technology isn't US-exclusive (https://www.dw.com/en/modern-spy-satellites-in-an-age-of-spa..., Ctrl+F "Spy satellites in numbers"), and those who have a meaningful need to track B-2 movements most likely have their own tools that are up to date, more accurate, and not bound by laws or regulations a US-domiciled company is subject to. What we see on Google Maps is almost guaranteed to be 100% "trash" from a military intelligence perspective, because actual valuable information has always been obtainable without Google Maps for entities who are capable and in need.
Similar to how the president brings his own toilet when doing foreign visits so enemies can't analyze his poop to see what medicines he's taking or what medical conditions he has.
Spare a thought for the presidential aide responsible for safeguarding POTUS's poop, so that it doesn't fall into the wrong hands. I'd hate to be the person assigned to "doodie duty".
Honestly with the known ties between the US government and Google, if I were a state-level adversary I'd probably be wondering why Google wasn't scrubbing the plane programmatically, rather than assuming that I had caught someone in a 'Gotcha!' moment.
In other words, this Google Maps link has been circulating around the net for the past week and a half at least -- seemingly originating from either a Discord or the chans; if I were some foreign intelligence analyst I'd definitely be considering the premise that these photos have been left so widespread and uncensored simply as an online show-of-force.
(Personal anecdote: as an amateur aviation enthusiast/plane-watcher, the B2, as majestic as she is, isn't particularly rare to catch in the skies -- and nowhere near 'secret'.)
i doubt they'd remove it for security reasons - as you say, there's nothing secret about what this plane is doing here. more that they'd remove it for the same reasons they remove clouds or any other in-air objects that obscure the thing they're actually trying to take photos of.
the plane isn't part of the map, so it shouldn't be on the map. even if it's not obscuring anything, they use satellite photos to generate building outlines, and this isn't a building. given how many planes there are flying all the time, and how infrequently you see them on google maps, they must make an effort to publish satellite imagery that doesn't have planes in it.
I wouldn't expect it to get deleted for any kind of security reason... but it will get replaced, and possibly faster because there's a defect. Moving object artifacts are undesirable and make the image more difficult to use especially for automation (such as Google's registration). They tend to get knocked out automatically over time as the composition algorithms try to keep a neatly consistent scene. They may even be handled as clouds depending on which methods Google uses to avoid cloud cover. This is the same force that has slowly worked most alignment and registration marks out of Google imagery (for a long time aerial imagery usually contained registration marks etched into the camera optics), although you will still find them especially in areas that are more challenging to register by machine vision (deserts, etc).
Aircraft in Keyhole and Google Earth used to be extremely common before the composition methods improved and more imagery sources became available. You could just about make out the traffic pattern at some airports. You can still find them but they're much rarer today.
Ugh, so when Google does a normal map update, unless there just happens to be a B-2 flying in the same spot, the conspiracy nutjobs are going to think there's some Google / US government conspiracy.
The truth is, the fact that these planes exist and that there is a base nearby is public knowledge. Any country that cares already has its own satellite imagery at much higher resolution and in real time.
Bit of an aside but I like that the B-2 Spirit has different "Spirit of..." for each aircraft. There are Spirit of New York, Spirit of Ohio, etc. [1]
Also if you're ever in Ohio and you're an aviation or engineering geek, or have kids, or are looking for something to do the USAF Museum is pretty cool! [2]
I visited the USAF Museum earlier this year. It's breathtaking just how large the B-2 Spirit is. You can know the plane's dimensions going in and it'll still blow you away standing next to one.
Interesting how you can see the chromatic aberration on the bomber but not the ground. I guess that this implies that the optical and/or software correction that they're doing only works within a _very_ narrow focal plane, given the relative proximity of the bomber to the ground.
This is kind of surprising, because if the tolerances for avoiding chromatic aberration are that small, whatever is collecting this data would have to be constantly adjusting its optics or software based on the topography.
okay so i'm not sure how accurate any of this is since it's overlaying sun angles over a photo that is very likely not taken from a 90deg overhead angle to the bomber... and the scale of the image may be distorted a little...
I discovered on Google Earth that the image was taken May 16th.
Then I lowered the height until the sun angle touched the location of the shadow and I came to an altitude of 1100ft / 335.28m (rough height of the "structure")
One of these flew low over my house in Kansas City last year. It was scheduled to do a fly-by of a local hospital so I was out on the porch watching for it. I saw it pass several miles to the south and went back inside thinking the show was over. A few minutes later I am sitting on the toilet when the house literally starts shaking. I run outside (barely getting my pants up...) in time to see it come right over top of the house at low altitude! (Apparently it had just been circling before to line up for the approach.)
I guess this comment would have been really insightful in 1910. It didn't wait for satellites to become the norm for the underdog in a global war; see [1] and [2].
Why would this be blurred? There is nothing in the picture that's not already known in far more detail. I have some B-2 pictures with way more detail I took in Pasadena during the Rose Bowl.
Offtopic, it takes me 6 clicks to access google maps (click to “customize” privacy options, select “off” three times, “continue”, and “use web” when asked about downloading an app). I’m on iOS and use safari in private mode. How are others dealing with this? Do you accept the terms or do you do something different?
My Safari, in private mode, doesn't end up opening the link at all: after customizing the privacy options (all off), Google still tries to bounce me through googleads.g.doubleclick.net, and I have that blocked.
The way I “deal with this” is that I have a work Google account which I've signed into. With that in place, the link just opens. Obviously I have clicked "I accept" on a variety of forms saying Google is allowed to track my activity as they see fit.
I guess you can run a browser without ad blocking and the link will open after accepting the terms. However, I prefer to still block ads even in private mode for various reasons.
I would love to know whether this was found by anything other than randomly poking around, and if so, how it worked specifically. "Out-of-place shapes in the middle of fields" might turn up some other neat stuff too.
I was so confused over how a small GA aircraft would be abandoned in the woods just outside the town where I lived. Until I realized it was probably flying when it was photographed!
People use Google Maps for all kinds of things. For example, checking out land for a real estate purchase, or travelling virtually. With millions of eyes on points around the globe, you'll find something.
This makes me wonder. Obviously everyone in the world knows that the USA operates a few dozen "stealth bombers" and has for the last couple of decades or so. Has any other country developed these kinds of aircraft? Similarly, submarines. The USA operates many stealth submarines as well. Does any other country operate stealth submarines?
I'm aware that the USA spends about 1 tril a year on military stuff, but these just seem bizarre to me.
This technique of using different filters on top of a monochromatic sensor is also used in astrophotography; it helps with image resolution. You want as many pixels per inch as possible, so instead of placing multiple narrow-band receptors in your sensor, you use a single wide-band receptor as your pixel and place a filter in front of the camera. Compared to an RGB camera, for example, you triple the number of sensors per color by doing this.
To many of us these are technological marvels. Defenders of our lifestyle and enabler of our ambitions and dreams.
To many people in the world these are insidious death machines that lurk unimpeded and can rain down death on entire villages. The embodiment of evil.
Without peeling back the veneer of right and wrong, I think it's worth pausing to both appreciate the wonder of the machine and to consider the implications of its existence.
I was googling to see what shape matched that and found this. Looks like it's not the first time this (probably a B-2?) type of plane was caught on candid camera ;0
This makes me wonder if military operations also take into account satellite positions when timing operations. Theoretically, having access to a satellite live feed and good processing software, one could potentially detect stealth bombers visually, probably even at night thanks to widespread light pollution (the plane would appear as a dark spot). Or am I missing something?
Awesome! There was a commercial jet caught in the imagery of Downtown San Mateo for about a year, and I always loved looking at it when I was looking up an address. It's cool how much more pronounced the separate RGB is - It was pretty clear even on that commercial jet, but it's very psychedelic here.
I'd be surprised if the military is bothered by this. I have nothing to do with the military etc., but I assume China/Russia have high-res satellites taking photos/scans across airbases on a regular basis, with much better quality for investigating the technology than a Google satellite image.
If you look at the photo of the B-2 on the ground, I think you can make out the 14 hangars where they would park them overnight (I heard street crime is pretty hot on Whiteman AFB) to the right of the staging area.
Given the distance between the different colors in the photograph and the internal refresh rate of the camera one should be able to calculate the speed that the plane was traveling at.
Military adversaries have plenty of sources of their own imagery, and the commercial imagery industry is large and international. It's just not a tractable problem to try to keep people from obtaining good aerial imagery of military installations, and it's become fairly rare for anyone to try. We know that various military organizations have taken active measures to prevent imaging of sensitive activity (e.g. returning vehicles to the same parking positions before passes of known IMINT satellites), but it's not a very easy thing to do, especially in the US, which is heavily covered by both commercial and foreign satellites in various orbits like sun-synchronous and Molniya.
Some military installations do receive a degree of protection by means of restricted airspace which mostly prevents commercial aerial imaging, meaning that only lower spatial resolution satellite images are available. But even this isn't really that common, and there's no systematic restriction on commercial imagery operators overflying military installations if airspace permits it.
I put forth the question because I know that in some places (e.g. Northern Ireland), police stations and army barracks are obfuscated on Google Maps to prevent terror groups from using it to easily gather information for mortar attacks.
I suppose the chances of that happening in a remote part of the United States are much smaller, but with the resources the U.S. Department of Defense has, I would have thought that they would take every precaution.
Edit: It appears it's no longer done in Northern Ireland.
In China, this kind of thing wouldn't see the light of day. We aren't supposed to see anything like this. Also, that bird is beautiful, if only it weren't designed to kill.
Now that we have an exact match of what a plane like this looks like from the point of view of a satellite, one could scan all of Google Maps for this same image.
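A naive version of that scan could use plain template matching (a sketch only; the filenames are hypothetical, and in practice you'd have to handle the tile's scale, the plane's heading, and the fact that the band-offset pattern varies from capture to capture):

    import cv2
    import numpy as np

    tile = cv2.imread("tile.png", cv2.IMREAD_COLOR)
    template = cv2.imread("b2_template.png", cv2.IMREAD_COLOR)

    result = cv2.matchTemplate(tile, template, cv2.TM_CCOEFF_NORMED)
    ys, xs = np.where(result > 0.8)  # arbitrary similarity threshold
    for x, y in zip(xs, ys):
        print(f"candidate match at ({x}, {y})")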
it looks neat. so much to learn in this thread about satellite imagery and calculations.
it makes sense to use crowdsourcing to watch earth with real-time satellite imagery, and later augment it with machine learning to find potentially interesting objects.
I would be more excited with UFO findings, or a redbull base jumper, or proverbial superman/ironman flying.
I'm sure you wrote this in jest, but in case people aren't aware: the "stealth" in "stealth bomber" refers to hiding the plane from anti-aircraft defenses like radar, infrared, acoustics and some optical visibility, but the anti-reflective paint that is used for hiding it optically is only on the underside, not on the top.
I wonder if imaging from above could actually be a threat for these planes. I mean -- stick cameras on all the SpaceX satellites and you've got an awful lot of eyeballs in LEO, right? Might at least be able to tell a ground based radar where to focus their search...
Pretty sure you need powerful lenses to do useful things though, but who knows, maybe this is workable with some image processing of multiple satellite feeds?
Yeah, I'm sure somebody is already doing good work on whatever the photographic equivalent of Synthetic Aperture Radar is. They should call it Synthetic Lens Aperture Photography, so they could use SLAP as the acronym (the most important property of a research topic is of course the ability to come up with good titles).
For example:
SLAP BASS: a Synthetic Lens Aperture Photography, Bandwidth Augmenting Sensor Suite
Does it make technical sense? I have no idea. But it sounds cool!
During the 1999 Yugoslavia bombings, after the first one was shot down, there was a joking apology on TV, something along the lines of "Sorry, we didn't see it" :)
Not just the “F” (“A” would be expected from role), also the “117” was anomalous (under the standard numbering convention, it should have been something like the 19 if it was in the F series, which is the designation that speculation about it was typically associated with before it was officially announced.)
Sort of. IIRC they take quick back to back photos with different filters that share an image sensor, so the smearing you're seeing is the combined motion vectors of the satellite and the aircraft.
You can see the engine exhaust plumes, and those would be pretty bright in IR. It's very hard to hide burning that much fuel. If you magnify a bit, you can also see that the engine exits are pretty bright even at visible wavelengths (though that isn't visible from below and in front of the plane).
These are the sort of posts I can do without. It's like the stuff that goes to the top of reddit. Reddit has a way to keep frontpage of your home page for a reason.
[0] https://en.m.wikipedia.org/wiki/Bayer_filter