
One would hope that for taxpayer-funded infrastructure with such massive budget overruns, open-source decoding information would be available.



This is just telemetry data, which doesn't have much general or scientific interest. I guess they could publish the protocol spec (honestly, it's probably acquirable), but for the kind of people who want this data, most of the fun is going to be doing the reverse engineering themselves.

The real imaging data would require a much more significant dish even to receive (I can't immediately find what it's going to use, but I'm guessing something like a 40-meter dish), so there are approximately zero amateurs who could make use of such open-source information.


Most of the public specifications are distributed freely by the CCSDS (Consultative Committee for Space Data Systems): https://public.ccsds.org/Publications/BlueBooks.aspx

The mission-specific parameters ("managed parameters") used by any given mission are usually more tightly controlled, as are the payload specifications for each telemetry channel.

> This is just telemetry data which doesn’t have much general or scientific interest

My understanding is that "telemetry" and "telecommand" stand for the downlink and uplink directions of a space link. I mostly worked upstream of telecommand, but I understood "telemetry" to refer to received data of any kind -- e.g. in CCSDS 130.1-G-3, an informational report on the design of the CCSDS telemetry system. https://public.ccsds.org/Pubs/130x1g3.pdf

By the by, I've been continually impressed with the quality of the CCSDS' documents. The "green books" (informational reports, like the one above) are extremely approachable and well-written.


Gonna chime in here to comment that most NASA missions (and ESA ones too) provide their scientific data for download free of charge, under Public Domain or CC licenses. If it's for scientific purposes, citing the proper dataset is not just good manners but a requirement (which also gives you the bonus of citing a respected source, so it's a win-win). Thing is, many people don't even know where to look!

And it doesn't help that some missions manage their own archives differently, and there's a lot of terminology to learn on your own. The complete opposite of that, and a joy to use, was the New Horizons archive, which at one point you could even download as a torrent! For example, if you wanted to see V3 of the Arrokoth encounter from 2019, you'd go to: https://pdssbn.astro.umd.edu/holdings/nh-a-lorri-3-kem1-v3.0...

Again, New Horizons is a bit of a rare case in which they went for super accessible data for everyone. PDS itself is a great system, but many missions will just upload a bit of data to PDS and then manage the rest some other way (Cassini, for example, has only a couple of instruments on PDS; you have to go to some other URL if you want uncalibrated but automatically processed images in JPEG format[0], and yet another place (to which I've lost the link, and I can't find it on mobile) for the full, science-grade dataset).

A great resource is OPUS[1] too; however, I find its UI a bit difficult, and in the end I prefer to download full datasets and explore them on my own rather than going through those online browsers. For example, if you wanted to check the Voyager images of Neptune, you'd go to this massive URL[2]. Quick tip: once you've configured the filter you want to apply, the Search button is on the top left -- this is the kind of usability issue I mentioned; buttons and links aren't quite where you'd expect them. Oh, and there's a limit to how many things you can select for download at once. And it's all dynamically loaded, and on and on. Which is why, as I said before, I generally prefer to just download the full GB-sized dataset and explore it on my own.

[0] https://solarsystem.nasa.gov/raw-images/raw-image-viewer/?or...

[1] https://opus.pds-rings.seti.org/opus/

[2] https://opus.pds-rings.seti.org/opus/#/instrument=Voyager+IS...


> Gonna chime in here to comment that most NASA missions (and ESA too) provide the scientific data for download free of charge

I thought that was the case, but it's been so long since I've been on a mission proper (Cassini, student co-op) that I didn't want to say so without basis. Thanks!


Yes, I just explore these archives as a hobby and sometimes do a bit of amateur processing on the files. There tends to be an embargo period (a few months to a couple of years) where only the mission team has access to the data, and they decide how, and with whom, to share it. But then again, they are all scientists willing to share knowledge 99.9% of the time -- so a fellow researcher will probably be able to get a copy of the data if they ask for it politely.

As I'm just "playing" with the files, I don't mind waiting a few months or years to get access to full "scientific grade" readings from incredibly complex machines and systems. And if the "raw" data is not easily available, they usually do provide processed images as part of the mission's public outreach campaigns (usually the ones that end up on Wikipedia).


From: https://ntrs.nasa.gov/api/citations/20080030196/downloads/20...

> To keep up with the high downlink, the recorder data gets sent directly to the Ka-band transmitter

Currently AWS Ground Station doesn't support Ka-band, so no luck there. It's apparently going to transmit once a day, so you would also need to time it right with the ground station.


Ground Station as a Service. I had no idea this existed. I am continuously amazed at how many things Amazon churns out "as a service".


It's pretty amazing. I found out about it a couple of months ago from a job posting that wanted a programmer for a cubesat. It seems like that's what they're aiming the service at for the moment.

And really, it's the perfect place for AWS to enter, add glue, and provide a service that lets you focus on your core business. Feels like the future, man.


Nice link, very informative. Here's a funny excerpt: "It [JWST] is currently planned to be launched in 2013 from French Guiana aboard an Ariane 5 launch vehicle".


I couldn't find a publish date in the PDF, but going off the URL I'm going to guess that it was from 2008.

It's a little bit annoying finding information about JWST, as info could potentially be old and out of date. Another document that I read mentioned using XML as the database format because XML was an emerging standard :S


But in a really open project, the design of the whole lot would be on the web, and the data sent back would be sitting on an FTP server somewhere for anyone to download and use.

In many ways, an open project is cheaper to do than a behind-closed-doors project where every new contractor needs to get access to only the bits of the project they need access to, and misunderstandings happen because not everyone has enough of the big picture.

The only bit that needs to be secret is the one private key used to sign the commands sent to the satellite, just so some random Mallory can't 'steal' it.
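
Conceptually, that's just standard public-key signing. A minimal sketch with Ed25519 via Python's cryptography package -- the names and command framing here are hypothetical, not JWST's actual scheme:

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    ground_key = Ed25519PrivateKey.generate()    # stays secret on the ground
    flight_pubkey = ground_key.public_key()      # baked into the flight software

    command = b"SLEW target=NGC-1234"
    signature = ground_key.sign(command)         # 64-byte Ed25519 signature

    # On board: accept the command only if the signature checks out.
    try:
        flight_pubkey.verify(signature, command)
        print("command accepted")
    except InvalidSignature:
        print("command rejected")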


> the data sent back would be sitting on an FTP server somewhere for anyone to download and use

I'm sure they could actually do that without too much fuss. But documenting an arbitrary dataset well enough for others to use it would require significant amounts of scientist time. I'm sure we'll see fully open datasets from JWST appear, but lots of the stuff it collects isn't going to be interesting enough to justify spending scientist time documenting it.


It seems like it should all be automated. Some scientist generates a mission request for the JWST techs. If accepted, the mission is added to the timeline with all of the metadata the original scientists had in their proposal: the area being imaged, the sensors in use, the duration of capture, etc.

Once the data is collected and downlinked, it is added to the catalog with all of that metadata attached. Then it's a matter of opening up that catalog to the public, although I'm guessing the downloads will be quite sizeable, so bandwidth could be an issue.

The trick to making this work is to integrate the publishing into the workflow so it doesn't require any additional effort on anyone's part.
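
As a toy sketch of that idea (every name below is made up, not anything JWST actually uses), the proposal metadata would simply ride along into the published catalog record:

    from dataclasses import dataclass, asdict
    import json

    @dataclass
    class ObservationRequest:
        proposal_id: str    # from the accepted proposal
        target: str         # area being imaged
        instrument: str     # sensor in use
        exposure_s: float   # duration of capture

    @dataclass
    class CatalogEntry:
        request: ObservationRequest
        data_uri: str       # where the downlinked product landed

    def publish(entry: CatalogEntry) -> str:
        # Publishing is just a serialization step in the pipeline,
        # so it costs nobody any extra effort.
        return json.dumps(asdict(entry))

    req = ObservationRequest("GO-1234", "NGC 1234", "NIRCam", 1200.0)
    print(publish(CatalogEntry(req, "s3://archive/GO-1234/obs001.fits")))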


The data from the mission will be made public after proprietary periods [1]. They have an archive [2]. I don't expect that the raw telemetry will be made available, but the raw science data in FITS format apparently will be.

[1] https://jwst-docs.stsci.edu/jwst-opportunities-and-policies/...

[2] https://archive.stsci.edu/missions-and-data/jwst
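
Once the FITS products are public in that archive, poking at them is easy -- a minimal sketch with astropy, assuming you've downloaded a file (the filename is a placeholder, and I'm assuming the science data sits in the first extension):

    from astropy.io import fits

    with fits.open("jw_example_cal.fits") as hdul:
        hdul.info()                  # list the header/data units
        hdr = hdul[0].header         # observation metadata lives in headers
        print(hdr.get("TELESCOP"), hdr.get("INSTRUME"))
        data = hdul[1].data          # science pixels as a numpy array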


I think there needs to be a distinction between something that a project "publishes", and something that is "made available".

Something published has been checked by a few team members, written with care, and represents the opinion of the authors and project.

Something made available has no guarantees of correctness, might not represent the project's opinion, and might just be random MATLAB scripts a JWST scientist made in their lunch break because they thought it was fun.

In the open source world, what is 'published' is probably the project's homepage and code. What is 'made available' is the random chatter on their Discord or IRC channel.

I hope that more government projects 'make available' everything done by all the workers -- every file saved on every PC -- with the understanding that there is no guarantee of correctness.

I guess it's the same idea as being able to see into the kitchen from a restaurant. You might see the chef making mistakes or juggling the saucepans, but you'll also see the work being done as it's done, and being able to watch doesn't delay the chef's work.


That's a pretty bad idea, IMO. Putting people in a panopticon has a strong chilling effect, no matter what disclaimer you put on the recordings. Creative, deep work needs space to make blunders in private, and scientists are no exception. They'll just use their personal laptops, or they'll document every experiment, mistake, and script to death and get a lot less actual research done.

Plus, it will be pretty much useless in practice, since you'd have to be an expert in that niche yourself to know what's correct (you're not getting any extra docs or context), and probably most of it will be incorrect in some way, possibly very subtly. The only people who could profit tremendously are competitors aiming to snipe that particular paper. Science is pretty dirty and ruthless as often as not; I could totally see this happening.


As I recall, with Hubble, and presumably JWST, there were scientific "precedence" rules which, for some period of time, limited access to the observational data to the scientist who requested the observation.

This was to prevent others from also getting the data and publishing before the requesting scientist.

Assuming this continues to be true, they probably wouldn't want to handle the data by publishing it to an FTP server immediately on receipt.


The data will be at https://archive.stsci.edu/ (which holds the data from a whole host of space telescopes, and uses standard VO interfaces).

STScI is a major contributor to astropy etc. (https://github.com/astropy), and has their own space with more tools/software: https://github.com/spacetelescope.

It's unclear what else you'd actually want, unless you want to build a clone of the actual satellite (it wouldn't surprise me if the majority of the software for the satellite were open source, just not put together somewhere publicly for people to download).


> It's unclear what else you'd actually want, unless you want to build a clone of the actual satellite (it wouldn't surprise me if the majority of the software for the satellite were open source, just not put together somewhere publicly for people to download).

Much satellite technology is classified or at least export-controlled (ITAR). The major difference between a space telescope and a spy satellite is which direction you point it.


A funny quip, but the main imaging cameras on the JWST would be useless pointed at Earth.


You're assuming it's secret, but the more likely case is it just isn't anyone's job to make it public.


Aren't they using the Deep Space Network for this?


Yes, all communications are routed through DSN.


I actually don’t know if they want to protect the data. I can imagine they might not want China or other countries listening in and potentially sending commands to the craft.

Does anyone know if there is typically encryption on the downlink? How about uplink commands? I'd guess we want those secured so only an authenticated controller can send commands.


Yes, the CCSDS Magenta and Blue Books mandate AES-256-GCM as a minimum for data encryption, and mandate that encryption and authentication be used, particularly for commands/uplink. There's a sliding scale of requirements based on application, of course -- your cubesat's imaging system is less critical than the flight termination system on a manned mission, for instance.
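
For a concrete picture, here's a minimal sketch of AES-256-GCM authenticated encryption with Python's cryptography package -- illustrative only, not a CCSDS implementation; the frame contents are placeholders:

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=256)  # shared secret, provisioned pre-launch
    aesgcm = AESGCM(key)

    nonce = os.urandom(12)        # 96-bit nonce; must never repeat under one key
    payload = b"telecommand frame payload"
    header = b"frame header"      # authenticated, but sent in the clear

    ciphertext = aesgcm.encrypt(nonce, payload, header)          # adds a 16-byte tag
    assert aesgcm.decrypt(nonce, ciphertext, header) == payload  # raises if tampered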


FWIW, satellite command and control is the one thing you're allowed to use encryption for on the amateur radio bands.


How is that enforceable? You can't determine what the encrypted communication is.


Same as anything in amateur radio. There'll be a lot of hints about what you're doing from the shape of your broadcasts, and there are a lot of amateur radio enthusiasts who'll call the feds on you if they get a hint that you aren't following the rules.


Radio enthusiasts are fans of the feds?


They're generally very into enforcing the FCC rules, and calling the feds is one mechanism for doing that.


Here's an ARRL article from November 2016 (re: the shutdown of 11 FCC field stations) which details the plans and motivations.

https://www.arrl.org/news/fcc-special-counsel-laura-smith-sa...

In a more recent post (which includes good suggestions for experimenters):

> The ham bands aren’t full of whacked-out haters looking to get rid of everybody else, but hams are very accustomed to people using the bands without a license or not following the band plans, causing interference problems.

https://thesilicongraybeard.blogspot.com/2020/09/a-ham-radio...

Usually the 'self-policing' by hams (it's generally very effective) is done gently. They get that the frequencies are free of cost, and they don't want to lose (any more of) them.


I highly suspect that China (and almost every other country) is not the issue. The Chinese and the whole international scientific community will profit immensely from the data coming from James Webb, so I don't think the Chinese have any interest in sabotaging the project. More dangerous would be some average Joe somewhere trying to mess with it for the lulz.


Embarrassing the US is high value.


If somebody like China is caught sabotaging JWST the fallout could be immense. They would be accused of trying to hold back human progress, and I could easily imagine high profile Chinese scientists leaving the country and science institutions boycotting China. It's not worth the risk.


Schemes like that would involve not taking credit for it - just another example of US failure, not a success by China.


I also don’t think they would want to do this, but I think you’re overestimating the risk. Genocide, stealing the entire South China Sea, and threatening the sovereignty of other nations haven’t inspired such boycotts.


They actually understated the risk by incorrectly assessing where it would come from.

The primary risk would be that the US would hit back and attempt to sabotage something important to China (and with China's sprawling global interests now, there are a vast number of soft, highly exposed targets). The US Government can be quite vindictive depending on the context, it will hit you back if it can. The responsible agency wouldn't seek publicity for any successful sabotage, but China would know who did it and what it was for.


Or terrorist organizations with tech skills


Or frame the average Joe for the lulz for political points.

(sorry, I'm just now reading into the drama with the FBI and Gov. Whitmer...)


There would have to be some sort of encryption or passkey or digital signature on commands, for sure. Otherwise some hobbyist in the middle of the ocean on his yacht could be messing with the craft and taking pictures, and there'd be no easy way to shut him down.


It is not really necessary.

There are unlikely to be any non-state actors[1] with the ability to transmit signals to L2. Just to receive signals even now (only 2 days into the 30-day trip to L2), the OP used a 6-meter dish. Most interplanetary mission signals are handled by the DSN.

Any sort of encryption will add both bandwidth and compute requirements. The CPU/network budgets on such missions are very, very limited; every bit and cycle counts.

Finally, standard encryption libraries, algorithms, et al. are likely not suitable. I am no expert, but I have not read of any modern algorithms designed for this kind of use case, with very low network overhead and compute requirements, that are also secure against brute force and other attacks.

Mission risk is also a factor; even handshake failures can jeopardize the mission. It is one thing for a website not to load because of a TLS negotiation failure, and quite another for a $10B mission to overshoot its orbit because of handshake failures in the encryption layer.

[1] Threats from state actors against science missions are a different category of concern, harder to quantify and with not much history of actual attacks. Collateral risks, like those from the Russian ASAT test to the ISS, or risks to dual-use missions, would perhaps not apply here. Usually science teams collaborate well even if there is a lot of tension in the political sphere.


"It is not necessary really."

Authentication of commands to satellites is very, very necessary.


Encryption != authentication. The OP was talking about encryption.

You could do authentication over plain text -- HTTP basic auth being a popular example.

It is not recommended for regular use cases, but it is not out of the realm of possibility for a satellite, given the constraints.
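
To be concrete, plaintext-but-authenticated commanding might look something like this minimal sketch (HMAC-SHA256 from Python's standard library; the key handling and framing here are entirely made up):

    import hashlib
    import hmac

    shared_key = b"provisioned-before-launch"  # known to ground and spacecraft

    def tag(command: bytes, counter: int) -> bytes:
        # The counter stops an eavesdropper replaying an old, valid command.
        msg = counter.to_bytes(4, "big") + command
        return hmac.new(shared_key, msg, hashlib.sha256).digest()

    counter, command = 42, b"SAFE_MODE_ENTER"
    sent = (counter, command, tag(command, counter))

    # On board: recompute the MAC and compare in constant time.
    rx_counter, rx_command, rx_tag = sent
    ok = hmac.compare_digest(tag(rx_command, rx_counter), rx_tag)
    print("accepted" if ok else "rejected")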


The OP was talking about digital signatures and a hobbyist taking command of the spacecraft. In other words, authentication.

You said that was unnecessary and too taxing on constrained hardware. That is incorrect. Authentication is both necessary and not excessively taxing, whether or not you consider the risk to the spacecraft's operation, since, as you said yourself, authentication schemes can be reasonably lightweight.

"Not out of the realm of possibility in satellites" What are you talking about? Of course it isn't! Authentication to satellites is recommended and implemented all the time, for obvious reasons. You think they risk a multi million dollar investment to save some clock cycles? How many commandeered satellites do you read about daily? Do you have sources to back up any of this?


I'd imagine the specs are rated and hardened for radiation first, as on all previous NASA satellite and probe missions, before anyone gets into the weeds of encryption overhead.


"Torpedo in the water!" would probably be sufficient.


The US government being able to discreetly torpedo everyone, everywhere, within what would have to be a few minutes at most -- that would be pretty scary. Imagine the number of drones they'd need to have deployed and armed at all times, and the potential for abuse.


I think you might just be trying to make this threat a much bigger issue than it really is. How many people do you really think will be trying to attack the JWST?


I don't have any inside information, but in my experience lurking in the amateur radio community, the answer is 'it depends', and there is a lot of downlink traffic that is not encrypted. These will get you into the graph:

https://twitter.com/usa_satcom

https://twitter.com/uhf_satcom

https://twitter.com/r2x0t


This would certainly speed up the rate of discovery from the images and get more people interested in astronomy.

Inside the project there must be an established data infrastructure running or ready to run. It probably has a raw layer and a transformed layer, with well-defined tabular schemas for the metadata of each image. It would be fun to see how both layers work and play around with the data.

Would the instrument itself have a fixed set of metadata attributes assigned to all imaging, or is it programmable and changeable during the service life?


There are lots of interesting tools and information on the Space Telescope Science Institute website to browse through. I'm guessing you can get the decoded data as it's received.

https://www.stsci.edu/jwst/science-execution/data-analysis-t...


Science in general still basically operates on the "NASA invented XYZ while going to the moon" model from the 20th century. Things get developed and then trickle into industry via back-channels (or people moving), but the idea of open source is still both culturally alien and legally suspect.

Even in CS papers directly dealing with a piece of software there is no obligation to publish code.


After seeing plenty of code and projects by people who weren't professional software engineers used to working on teams, I'd say part of the problem is that code written for this sort of thing often depends on a ton of dependencies and system-specific configuration bits that are documented poorly or not at all. Getting such a project to a state where a random person could git pull it, make sense of it, and use it is a whole project unto itself, one that the core contributors are usually poorly equipped to take on. How many of them really understand the pain of onboarding into a poorly documented repo, and which tools make it a smooth process?


On top of that, missions are heavily incentivized (in an "our success depends on this" way) to solve only the problems they absolutely need to solve, due to constraints on time, budget, and manpower. It's an incredible feat to achieve what they do, but reuse and non-specialist use are non-goals.


(Argumentatively) I don't care. If I can't replicate your work, I can't test it.

If you make it clear from the start that it must be reproducible, and threaten not to publish if it isn't, then they'll get in line -- the kinds of projects I'm thinking of are usually < 5k lines.


The only way to properly replicate a CS paper is to re-implement the code from scratch. Simply running someone's code again is most likely just going to give you the same buggy output they got (or, just as likely, a bunch of unrelated compile errors). But frequently that isn't even what you want. Many CS papers are of the form "we were able to build software that does XYZ using this design", which isn't really a falsifiable statement in the first place. It just serves to give future researchers and practitioners data points they can use when building their own software.


That's true, but it also allows me to look for selection bias in their data.

If I see a paper claiming remarkably good predictive capabilities for (say) the performance of a basic block, and no discussion of its flaws, you can bet I'm assuming they didn't test it well enough.



