Satellites Spotting Aircraft

233 points, posted 8 days ago
by marklit

67 Comments

m2fkxy

7 days ago

A little nitpick about:

  The number of looks correlates with a higher resolution.
Yes and no. When you task an image, you usually (as is the case with Umbra) specify your desired ground resolution, eg. 25, 50, 100 cm, etc. There are two dimensions in a SAR image: range and azimuth. Range resolution is determined by the SAR system bandwidth. Azimuth resolution is determined by the integration angle (the angle formed between your target and your satellite from start to end of the collection).

Let's assume you want a 50 cm image. Your range resolution will be equal to that and, in a 1-look image, your native azimuth resolution will also be 50 cm. What happens when you request a multi-looked image is that the satellite will collect data over your target for a longer amount of time (and thus over greater angle diversity). Range resolution will not change; however, in the natural ("native") image, you get asymmetrical pixels: taking the same target resolution of 50 cm, a 2-look image will have a 25 cm azimuth resolution. For 3 looks, ~16.7 cm. And so on.

What then happens during the processing of derived products (eg. GEC) is that the pixels are squared: to do that, you have to average out the pixels in the azimuth dimension. This greatly improves what is called the radiometric resolution (ie. how much information a pixel contains), by cancelling out the speckle and averaging the noise. But for all intents and purposes, on a multi-looked image (which is what the GEC products you use are), the spatial resolution remains the same: a square 50 cm pixel.
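The look-count arithmetic above can be sketched as a toy calculation (illustrative only; `native_azimuth_resolution` is a made-up helper, not part of any Umbra tooling):

```python
# Toy model of the multi-look trade-off: range resolution is fixed by the
# system bandwidth, while a longer collection (more looks) sharpens the
# native azimuth resolution proportionally.
def native_azimuth_resolution(requested_res_cm: float, looks: int) -> float:
    """Native (pre-averaging) azimuth resolution for a given look count."""
    return requested_res_cm / looks

requested = 50.0  # cm, the tasked ground resolution
for looks in (1, 2, 3):
    az = native_azimuth_resolution(requested, looks)
    print(f"{looks}-look: range {requested:.0f} cm x azimuth {az:.1f} cm (native)")

# Multi-look processing then averages the azimuth samples back into a
# square 50 x 50 cm pixel, trading the extra resolution for lower speckle.
```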

[SAR nerds here: I am not mentioning the slant-range-to-ground-range process, and I am also ignoring the resolution vs. sampling distinction for simplicity]

OJFord

6 days ago

I know nothing about any of this, but:

> Azimuth resolution is determined by the integration angle (the angle formed between your target and your satellite from start to end of the collection).

For any non-zero value (geostationary), wouldn't it be a quadrilateral rather than an (single) angle? Or is it, measured from Earth, the change in angle from 'straight up' to satellite? But then how would the satellite calculate/observe that?

Or is that what you're saying the resolution derived from that is, the ground distance that that same angle moves over in the time taken?

jofer

7 days ago

Quick geospatial note that's important for accurately geolocating these images:

You need a DEM to use RPCs for geolocation. Running things through gdalwarp as was done here will assume no terrain and 0 elevation. That will lead to significant mislocations anytime it's not flat and at sea level, especially given the off-nadir view angle of the data used here.

In other words:

  gdalwarp \
     -t_srs "EPSG:4326" \
     2024-05-25-15-37-54_UMBRA-06_GEC.tif \
     warped.tif
Should be:

  gdalwarp \
     -to RPC_DEM=some_dem.tif \
     -rpc \
     -t_srs "EPSG:4326" \
     2024-05-25-15-37-54_UMBRA-06_GEC.tif \
     warped.tif
If you don't want to use a DEM for orthographic corrections, then you should at least include a constant elevation in meters of the scene with RPC_HEIGHT. Otherwise things can be shifted kilometers from where the image actually is.

singleshot_

6 days ago

What about the fact these are all pictures of airports? Does that control for "any time it's not flat" or are you talking about really, really flat? Or, given your mention of angles, do you mean flat as in perpendicular to the observer?

jofer

6 days ago

"Orthorectified" (i.e. on a map) means both. It's corrected to a perpendicular to the observer view _and_ also corrected for scale distortion due to elevation (the top of a mountain is closer to the sensor and therefore appears larger than the valley beside it). For off-nadir imagery (i.e. not looking straight down) the view angle plays a larger role than the scale distortion part.

With that said, raw elevation matters too, especially when the imagery is off-nadir.

RPCs describe a view direction + lens distortion / etc. A completely flat surface at elevation will be in a different location than a completely flat surface at sea level. Things will also be a different _size_ as well depending on elevation.

Think of the RPC as a vector looking a specific direction (at a single pixel level, that's kind of what it is, though each pixel gets a different one). You're placing a point on the map where that ray intersects the Earth's surface. It intersects at a different point depending on the elevation of the surface. Projected (aka orthorectified) data needs to correct all of that to be able to place imagery on a map accurately.
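The ray-intersection argument above can be reduced to first-order trigonometry (real RPCs are ratios of polynomials, but the dominant effect of getting the surface elevation wrong behaves like this; the helper name and numbers are illustrative):

```python
import math

# First-order sketch of the elevation-dependent shift described above:
# intersecting an oblique view ray with the wrong surface height moves
# the intersection point horizontally.
def horizontal_shift_m(elevation_error_m: float, off_nadir_deg: float) -> float:
    """Ground displacement from intersecting the view ray at the wrong height."""
    return elevation_error_m * math.tan(math.radians(off_nadir_deg))

# An airport at 1,500 m elevation treated as sea level, imaged 45 degrees
# off-nadir, lands ~1.5 km from its true map position.
print(round(horizontal_shift_m(1500.0, 45.0)))  # -> 1500
```

This is why assuming zero elevation can shift imagery by kilometers, as noted above.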

If the airport is completely flat, then using a constant elevation is fine, but you still need to use that elevation and not assume it's at sea level. Otherwise it will be in the wrong place on a map and also a slightly incorrect size.

marklit

6 days ago

Thanks for clearing that up.

skinwill

7 days ago

A company called ERIM, back in the '70s, pioneered SAR systems. They had an interesting approach to processing and storage of the data, which was all analog back then: they used holography. ERIM and their SAR work are documented on Wikipedia, but their company president's obsession with holography was detailed in their monthly newsletters. I was handed some of these documents by a former (now deceased) worker at ERIM. He told me how they flew the massive radar systems and about the huge analog computers they used to process the data. It must have been absolutely wild to work there in those days.

tomcam

7 days ago

Man there’s everything to love about this. A novel free dataset, complete example code, clear description of how to get it up and running—the ideal blog post IMHO. Delighted to learn about it.

1f60c

7 days ago

> Umbra has an open data programme where they share SAR imagery from 500+ locations around the world

I think they're referring to this: https://umbra.space/open-data/ (warning: most files are absolutely ginormous)

1024core

7 days ago

Is there a mapping from, say, a geo location to a list of Umbra files which cover that location? I'd hate to download 42TB of data and then have to grep through it :-D

marklit

7 days ago

The JSON files in the S3 bucket can be downloaded in isolation from any other format and they're pretty small. I did some geo-enrichment of them in a previous blog post where I found the country and waters of Umbra's ship imagery. https://tech.marksblogg.com/yolo-umbra-sar-satellites-ship-d...

The above relies on network calls to OSM but with Capella, I found a way to get the smallest GeoFabrik partition for each image using a single GeoJSON file. The code could probably be modified to work on Umbra's feed with a bit of work. https://tech.marksblogg.com/capella-open-data-free-satellite...

Lastly, Umbra have ~24 locations they image frequently but there is an 'ad hoc' folder with a lot of subject names in English that give away the location and subject matter. This might be easier to look through for interesting imagery.

  aws s3 --no-sign-request ls 's3://umbra-open-data-catalog/sar-data/tasks/ad hoc/' | grep -i 'tesla\|nvidia\|saudi'

afm-umbra

5 days ago

Hi! Our Archive Catalog STAC API indicates which items are in our Open Data Catalog on AWS. You can use the STAC Search endpoints to filter by this property alongside location. Instructions in our docs:

https://docs.canopy.umbra.space/docs/archive-catalog-searchi...

Unfortunately we don't directly link to the s3 assets for those items at this time :( So you'll still have to pull the list of files and grep by task ID.

gloyoyo

7 days ago

Things like this would instantaneously make the world a better place.

Luc

7 days ago

> Below is Umbra's image of the same location. Though it was taken on a different day and some aircraft might have been moved around, you can see that a lot of the aircraft in the bottom left are barely visible unless you zoom in very closely and pay attention to artefacts that give away a large man-made object is present.

Bad example, because the radar image simply shows a different situation with all but two of the aircraft not present. The two that are present are easy enough to spot.

Here's another image with 5 aircraft present (including the two from the radar image). It's rotated, the aircraft are in the top left: https://x.com/___Harald___/status/1825362047061971309/photo/...

greggsy

7 days ago

I think it successfully got the point across that visual vs SAR have different benefits.

shitlord

7 days ago

Would the F-35 actually show up on SAR or does the radar absorbent material distort the image somehow?

KeplerBoy

7 days ago

It would certainly show up. Maybe not as a recognizable plane, but at the very least you could make out the plane-shaped shadow (or more accurately lack of reflection).

m2fkxy

7 days ago

Yes, radar shadow is how you would find it. But your sensor must be performing well enough for it to be able to distinguish a shadow from the surface backscatter (ie. radar signature) the aircraft is sitting on. This is usually not a problem for rough surfaces (eg. grass or dirt, or some types of pavement), but it can be more problematic for surfaces with more specular scattering.

KeplerBoy

7 days ago

For those less obsessed with SAR images: calm water surfaces are good examples of surfaces with specular scattering, they are basically black/extremely low magnitude areas in SAR images. Example: https://x.com/umbraspace/status/1831111648498810967

I wonder if the tarmac/runways used for stealth planes take this into account and are somehow especially smooth or otherwise special. Also, how would such an image look? I guess one would still see some multi-bounces between aircraft and tarmac making their way back to the SAR antenna.

m2fkxy

7 days ago

  For those less obsessed with SAR images
Touché!

Also good point regarding multibounces and multipath. I would expect eg. landing gear returns to stand out in those cases (cockpit too, although the canopy is probably coated to prevent radar penetrating).

KeplerBoy

6 days ago

I meant no offense, I hope there was no misunderstanding.

I'm also knee-deep in SAR stuff these days.

m2fkxy

6 days ago

not at all, I love it too

walrus01

7 days ago

If some guy on the internet can do this in his home office, imagine the SAR and analysis capabilities the NRO and NSA have built in-house.

sangnoir

7 days ago

"Some guy" who happens to be a consultant with a long list of household-name clients, and seems like a GIS pro

walrus01

7 days ago

Indeed so; I also meant the hardware budget/resources to acquire the data before it's processed. I'd bet it's a solid theory that some of the large things launched in those large payload fairings of NRO satellite launches over the past 20 years are SAR satellites with billion-plus-dollar budgets.

seoulmetro

7 days ago

In the blog there's a video of what looks like cars moving around, but the lighting is constantly changing. How does that work?

It looks like a timelapse but then there are cars doing normal car things.

https://www.youtube.com/watch?v=cwDjJqtx_og

mikeiz404

7 days ago

My guess is they are steering the radar beam from something hovering overhead.

The video links to a paper [1] titled "Airborne Circular W-Band SAR for Multiple Aspect Urban Site Monitoring" which mentions beam steering in the abstract.

1: https://ieeexplore.ieee.org/document/8701523/

seoulmetro

7 days ago

That's what I thought too, but how can there be such significant shadows? Wouldn't the shadows of any wavelength they are sending be near nothing?

Waterluvian

6 days ago

It’s really cool to see the kinds of stuff I learned in school using closed source turnkey remote sensing and GIS software being done in a more open, programmatic way. There’s just this far nicer feel of control.

ck2

7 days ago

Is bouncing radar from above on an active commercial airport an okay idea?

snakeyjake

7 days ago

The vast majority of SAR satellites transmit below 500 W and have a beam "footprint" of about 5 km by 5 km at minimum.

Imagine five 100 W lightbulbs reflected and focused to channel as much light as possible onto a 25 square kilometre field from hundreds of kilometres away.

The planes are fine.
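Taking the numbers quoted above at face value (500 W spread over a 5 km x 5 km footprint; antenna gain and path losses ignored for simplicity), the power density on the ground works out to:

```python
# Rough sanity check on the "the planes are fine" argument: average power
# density if 500 W were spread evenly over a 5 km x 5 km SAR footprint.
tx_power_w = 500.0
footprint_m2 = 5_000 * 5_000            # 25 square kilometres
w_per_m2 = tx_power_w / footprint_m2
print(f"{w_per_m2 * 1e6:.0f} uW per square metre")  # -> 20
```

For comparison, midday sunlight delivers on the order of 1,000 W per square metre, tens of millions of times more.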

m2fkxy

7 days ago

yes, received power is inversely proportional to the distance squared. it's hard to overstate how ridiculously small the amounts of power are at play with spaceborne SAR systems.

Sanzig

7 days ago

Sure, happens all the time. Synthetic aperture radar satellites operate in different bands than air traffic control radar, so there is no risk. Your house is scanned at least once every few days from wide swath radar satellites like Sentinel-1.

Horffupolde

7 days ago

What for?

schoen

7 days ago

https://en.wikipedia.org/wiki/Sentinel-1#Applications

Sounds like some disaster response, some scientific, and some commercial uses.

stavros

7 days ago

And, obviously, lots of military uses.

schoen

7 days ago

I don't know how it's applied in practice, but apparently the treaty establishing ESA specifies that its space activities will be "for exclusively peaceful purposes" and they apparently have no missions that are classified or have any declared military purpose.

I think there are other non-ESA European (and national) agencies that work on military use of space. For example, there's apparently an "EU Satellite Centre" based in Spain that is concerned with military use of space (and is not part of ESA).

https://en.wikipedia.org/wiki/European_Union_Satellite_Centr...

I do imagine that military users can probably buy or access data from ESA sources on the same terms as civilian users and that they probably do so.

stavros

7 days ago

Sorry, yes, I meant non-ESA, just satellites in general.

rurban

5 days ago

They have much, much better resolution, though. They can detect car plates, and probably also people looking up.

stavros

5 days ago

Do they? From the declassified Trump photo, it seems like the maximum resolution is a few tens of cm, very far from detecting car plates.

rurban

4 days ago

Remember that the CIA has four Hubbles; the fifth, a spare one, they donated to point outwards. Of course their capabilities are highly classified, but it leaked.

0cf8612b2e1e

7 days ago

It used to be a joke that satellites could read your license plate from space. Then I see commercial images like this and I am less sure.

ianburrell

7 days ago

This is a SAR satellite, which uses radar rather than a camera to produce an image. I thought the resolution would be bad but it is really good, at up to 16 cm. It isn't going to read any lettering, though, since it isn't a camera.

The other thing is that commercial satellites are lower resolution than military ones. The giant ones are super expensive, and commercial users don't need to read license plates. They would much rather have multiple satellites that visit same spot every day.

KeplerBoy

7 days ago

You don't need a camera to read license plates, since the lettering is embossed. You absolutely could read license plates in SAR images, just not with current spaceborne systems.

Check out this paper to see what's possible with ground- and airborne SAR systems (scihub is your friend): https://ieeexplore.ieee.org/abstract/document/7461591

jjk166

5 days ago

Your wavelength would have to be small compared to the size of the embossing. At that point you're getting into the infrared range.

KeplerBoy

5 days ago

Sure, some call it Infrared, some call it terahertz radar.

The linked paper talks about 300 GHz SAR, which is generally the lower limit of infrared.
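For reference, the wavelengths involved are plain speed-of-light arithmetic (the lower example frequencies are just illustrative radar bands, not taken from the paper):

```python
# Wavelength = c / f, up to the 300 GHz system mentioned above.
# Millimetre-scale wavelengths are what make embossed lettering
# plausible to resolve.
c = 299_792_458.0  # speed of light, m/s
for ghz in (10, 94, 300):
    wavelength_mm = c / (ghz * 1e9) * 1000
    print(f"{ghz} GHz -> {wavelength_mm:.2f} mm")
```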

jp42

7 days ago

"Those satellite photographs -- the Landsat photographs -- are so darn good that when they're fully enhanced by computer, we can actually tell how high the waves are out in the middle of the Pacific; we can tell what the temperature of the ocean is 20 feet below..." - Grace Hopper in a 1982 NSA lecture. YT link: https://www.youtube.com/watch?v=si9iqF5uTFk&t=1612s

I bet current spy satellites are doing a lot of things we can hardly imagine; maybe we'll find out 40-50 years down the line.

Full_Clark

7 days ago

There are a few open-literature papers already on how a submarine's wake might show up in analysis of surface patterns. Doesn't seem hard to imagine that the classified literature is a few steps ahead.

ceejayoz

7 days ago

We got a hint of how much better spy satellites are during Trump's presidency.

https://www.npr.org/2022/11/18/1137474748/trump-tweeted-an-i... / https://www.npr.org/2019/08/30/755994591/president-trump-twe...

Folks figured out the satellite that took it, too; a 2011 KH-11: https://www.npr.org/2019/09/02/756673481/amateurs-identify-u...

1024core

7 days ago

Just look at the image: https://x.com/realDonaldTrump/status/1167493371973255170/pho...

You can see lamp posts and fenceposts. Fenceposts are literally < 2" thick! So the resolution must be on the order of 2cm.

perihelions

7 days ago

- " Fenceposts are literally < 2" thick! So the resolution must be on the order of 2cm."

That's not quite how it works. If you apply a Gaussian blur on the scale of 20 cm to a 2 cm imaged object, it will persist in some form if the contrast ratio is very high. That doesn't mean you have 2 cm resolution. Spatial resolution is rather different: it asks something more like, can you distinguish *two* objects at a 2 cm separation distance? Can you distinguish the case where there are two such separated objects from the case where there is only one?

That's closer to what you need to answer the question "how small text can be read?"

(Some of the visible stars in the sky, by the way, are ridiculously small (angular size) compared to the human eye resolution—there's no contradiction there either! The angular diameter of (for instance) Rigel is smaller than 1/10,000th the resolution of a human eye!)

jofer

7 days ago

That's a point that far too few people in the remote sensing industry understand!!

Put another way, try measuring the width of small objects in the scene. You'll find there's a minimum width things will appear regardless of how small they actually are. Small, high-contrast objects will be visible, but will be wider than they actually are. Measuring the width of small bright objects is one way of estimating the spatial resolution (i.e. FWHM of the PSF) of optical imagery.
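The detection-versus-resolution distinction can be illustrated with a toy 1-D blur (all numbers are made up for illustration, not an actual satellite PSF):

```python
# A 2 cm high-contrast object survives a ~20 cm Gaussian blur (still
# detectable) but appears tens of cm wide; two nearby objects merge
# into a single blob (not resolved).
import numpy as np

def blur(signal: np.ndarray, sigma_cm: float) -> np.ndarray:
    """Convolve a 1 cm/sample signal with a normalized Gaussian PSF."""
    x = np.arange(-50, 51)  # kernel support in cm
    kernel = np.exp(-0.5 * (x / sigma_cm) ** 2)
    return np.convolve(signal, kernel / kernel.sum(), mode="same")

scene = np.zeros(200)        # 1 cm per sample
scene[100:102] = 1.0         # a single 2 cm bright object
single = blur(scene, sigma_cm=20)

pair = scene.copy()
pair[105:107] = 1.0          # a second object a few cm away
double = blur(pair, sigma_cm=20)

print(single.max() > 0.01)                    # detectable: True
print(int((single > single.max() / 2).sum())) # apparent width: tens of cm, not 2
# The half-max region of the blurred pair is one contiguous blob:
mask = (double > double.max() / 2).astype(int)
print(int(np.sum(np.diff(mask) == 1)))        # -> 1 (one blob, not two)
```

Measuring that apparent width of a small bright target is exactly the FWHM-of-the-PSF estimate described above.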

And with that said, those fenceposts are not that thin. They're likely on the order of 10 cm.

It's widely assumed from regulations/etc. that "spy" sats can get below 10 cm spatial resolution. In the US, commercial sats are not allowed to collect imagery better than 10 cm spatial resolution. At the 10 cm resolution point, things like atmospheric lensing due to temperature variations become major issues and need to be corrected for (e.g. the "shimmer" you see above the pavement on a hot day). That type of tech gets tightly regulated very quickly, even in the US's current "let private industry image how they want" regulatory environment (it used to be more restrictive not long ago).

So that imagery is likely somewhere on the order of 5cm to 10cm resolution in its native form. Which is pretty nuts. It's crazy what the NGA can do!

jbuzbee

7 days ago

Just another example of impulsive and very poor judgment from Trump. Even though he had the right to declassify this image, his only reason to do so was to taunt the Iranians - Ha Ha, your rocket blew up. The Iranians likely corrected their issue and moved on whereas now our adversaries have clear evidence of our satellite capabilities and can adjust their behavior accordingly for years to come.

skinwill

7 days ago

My father, back in the '80s, told me about satellites in the '70s that could read your license plate -- using imaging tubes before CCD technology, no less. The technology limitation was not only the resolution, which was rather good btw, but also time. Early spy sats used film that had to be de-orbited and captured for processing. Then they started beaming down what was essentially slow-scan television, which only improved with time. Now you have clusters of CCDs beaming down massive amounts of digital signals from wide swaths of area. None of this is secret, really. Just look up the Landsat program. They even used multispectral cameras to count things like tree growth and duck populations. In the '70s!

actionfromafar

7 days ago

Well the Hubble is 70s tech, right?

mandevil

7 days ago

Because Hubble was designed for in-orbit servicing by the STS, it had its optics, CCD systems, and even its flight computers (not originally designed for in-orbit servicing!) replaced over the various servicing missions. The difference between WF/PC 1 and WFC 3 is the difference between going into space in 1990 with mid-'80s hardware and going into space in 2009.

Since the 2009 Service Mission 4, Hubble tech has held steady, but until then it was regularly being upgraded to state-of-the-art.

skinwill

7 days ago

State-of-the-art for what could survive space. It used 486s for quite a while. You are still very correct; I just think it's funny that the "best" is not always what people think.

dylan604

7 days ago

What is released commercially is not the same thing the NRO uses.
