No, the cameras' resolution is limited by diffraction, which is a function of mirror size. JWST achieves that diffraction limit across the 2 to 28.5 micron range, where most of its observations will be performed. As for sensitivity, the installed instruments are state of the art; finding any improvements would necessitate multi-year research programs.
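For a rough sense of scale, the Rayleigh criterion θ ≈ 1.22 λ/D sets that diffraction limit. Below is a minimal Python sketch assuming a ~6.5 m primary mirror; the function name and printed values are purely illustrative:

    import math

    def diffraction_limit_arcsec(wavelength_m, mirror_diameter_m):
        """Rayleigh criterion: smallest resolvable angle, converted to arcseconds."""
        theta_rad = 1.22 * wavelength_m / mirror_diameter_m
        return math.degrees(theta_rad) * 3600

    print(diffraction_limit_arcsec(2e-6, 6.5))     # ~0.08 arcsec at 2 micron
    print(diffraction_limit_arcsec(28.5e-6, 6.5))  # ~1.1 arcsec at 28.5 micron

No sensor upgrade can beat those numbers; only a larger mirror can.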
A more powerful and modern CPU would not enable any more science to be gathered and would only increase the chance of failure. Maybe some power savings could be achieved, but that CPU draws only 5 W of the 2000 W produced by the solar panel anyway.
> As for sensitivity, the installed instruments are state of the art; finding any improvements would necessitate multi-year research programs.
That was exactly my question. Did they use the 8 years from 2013 to 2021 to improve the instruments? Or did they remain at the 2013 state-of-the-art level?
The final optical instruments were integrated in 2014, when the proposed launch date was 2018. There was not enough time to think about any upgrades, and the cost was ballooning anyway.
The internal noise of the electronics is not significant compared to thermal noise at MIRI's operating temperature (7 Kelvin).
Further small improvements might be possible in the future, but no 'orders of magnitude' gains can be expected without going to even colder temperatures, which would require a total redesign.
At the end of the day, lower efficiency can be offset by simply taking longer exposures, so it is nice to have but does not have to be super optimised. Achieving the diffraction limit is much more important, and JWST's cameras do that.
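To make that exposure-time trade-off concrete: in the photon-noise-limited case the signal grows linearly with time while shot noise grows with its square root, so SNR ∝ sqrt(QE · rate · t) and a lower QE can be bought back with proportionally more time. A toy sketch with made-up photon rates:

    import math

    def snr_photon_limited(photon_rate_per_s, quantum_efficiency, exposure_s):
        """Photon-noise-limited SNR: detected counts / sqrt(detected counts)."""
        detected = photon_rate_per_s * quantum_efficiency * exposure_s
        return math.sqrt(detected)

    # A 70% QE detector needs ~1.3x the exposure of a 90% QE one for the same SNR
    print(snr_photon_limited(100, 0.9, 1000))  # ~300
    print(snr_photon_limited(100, 0.7, 1286))  # ~300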
The raw efficiency numbers aren't that helpful without a reference. To put it another way, what would the key differences in component performance look like if you used state-of-the-art components from this year vs whatever is in there?
Quantum efficiency is the ratio of electrons created per photon hitting a detector. For optical instruments the maximum value is 100%, and that is the ultimate reference: every photon gets converted into an electrical signal. This is why order-of-magnitude improvements are not expected; there are physical limits on how much signal can be generated from a given amount of light. Solar cells for generating electricity may go over 100% because a high-energy photon can be converted into two electrons, but this is undesirable for optical imaging, so light passes through narrowband filters before hitting the detector to remove such photons, as they would skew the results.
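A toy illustration of that ratio, with hypothetical numbers:

    def quantum_efficiency(electrons_collected, photons_incident):
        """QE = electrons generated per incident photon; an ideal detector gives 1.0 (100%)."""
        return electrons_collected / photons_incident

    # Hypothetical detector: 10,000 photons arrive, 8,500 produce electrons
    qe = quantum_efficiency(8_500, 10_000)
    print(qe)      # 0.85
    print(1 / qe)  # ~1.18 -> the largest remaining gain possible, nowhere near 10x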
This is a simplified model; real-life detectors have more parameters regarding noise and persistence, but all produced in the past 20 years are close to the theoretical limits.
Did the state of the art change much in 8 years? I am not sure we can use consumer electronics heuristics to determine the rate of progress at the high end. Consumer electronics has only one constraint: cost. Sony could sell you a gigapixel camera and a 200-core CPU to process the data today. Consumers can't afford that, though, so it's not a product. But at the high end -- humanity's only new space telescope -- the cost constraint is not as big of a problem, so the tech is going to be better than what comes in a $100 phone.
Hubble is 30 years old using 1970s and 1980s tech and it's still a useful scientific instrument. Something with 2013 tech is going to be spectacularly useful in 2022.
That would require an increase in the size of the mirror, which would mean a complete redesign of the satellite. It is already at the limit of what can be observed with a mirror that size, so a better sensor won't automatically give you better images.
If you look at professional cameras (the Sony Alpha line, for example), the sensors have improved over the last 10 years from taking shitty pictures at night to being able to take pictures in almost total darkness. All this with little improvement in optics.
So is the telescope's sensor able to count each and every photon hitting it? Was it already at the maximum possible performance?
Yes, it can detect single far-redshifted photons. It's cooled down to 7(!!) Kelvin to achieve that incredible feat, while the telescope itself is at a relatively balmy 36 Kelvin. If you want to get more spatial resolution you will need a (much) larger mirror and likely an even more stable platform. This is already a machine of such ridiculous precision that new technology had to be invented to make it possible; those sensors are works of art on par with anything ever made.
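One way to see why the detector has to sit so much colder than the rest of the observatory: by Wien's displacement law a body's thermal glow peaks at λ ≈ 2898 µm·K / T, so a 36 K structure still emits meaningfully near MIRI's long-wavelength band, while a 7 K detector's own emission is pushed far beyond it (dark current is another big reason for the deep cooling). A back-of-envelope sketch:

    WIEN_CONSTANT_UM_K = 2898  # Wien's displacement constant, micron * Kelvin

    def peak_emission_wavelength_um(temperature_k):
        """Wavelength (microns) at which blackbody emission peaks for a given temperature."""
        return WIEN_CONSTANT_UM_K / temperature_k

    print(peak_emission_wavelength_um(36))  # ~80 um, uncomfortably close to the 28.5 um band
    print(peak_emission_wavelength_um(7))   # ~414 um, far outside anything being observed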
> If you want to get more spatial resolution you will need a (much) larger mirror
Or multiple mirrors spatially separated. Or a single mirror/camera at different positions at different times. Since JWST's orbit around L2 is fairly large, is that going to be used? You mentioned that stability of the platform would be a limit. How well is the position of the JWST known at any given time (I suppose it can be calibrated by viewing well known sources)?
That trick is hard enough when using radio telescopes connected to the mass of planet Earth, with enormous wavelengths in comparison to what JWST is working with. Its position is known fairly accurately, but the telescope itself will undergo various vibrations from, for instance, the equipment on board and the course-correction burns, which will have a pretty significant effect on the satellite body.
The mirrors are dynamically deformed to correct for some errors. I don't think - but I also don't know for sure - that they are going to do long-baseline tricks with the orbit. It would make good sense to do this for parallax shift measurements (which give you a good idea of the distance an object is at), but the orbit of the Earth around the Sun would be far more useful for that because it is so much larger (rough numbers below).
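To put rough numbers on that baseline comparison (the halo-orbit radius below is only an order-of-magnitude guess, since the orbit is large and non-circular):

    import math

    AU_KM = 149.6e6             # Earth-Sun distance
    HALO_ORBIT_RADIUS_KM = 8e5  # rough scale of JWST's halo orbit around L2
    PARSEC_KM = 3.086e13

    def parallax_arcsec(baseline_km, distance_parsec):
        """Small-angle parallax shift for a given baseline, in arcseconds."""
        angle_rad = baseline_km / (distance_parsec * PARSEC_KM)
        return math.degrees(angle_rad) * 3600

    # Star at 100 parsec: Earth's orbit gives a shift ~180x larger than the halo orbit
    print(parallax_arcsec(AU_KM, 100))                 # ~0.01 arcsec
    print(parallax_arcsec(HALO_ORBIT_RADIUS_KM, 100))  # ~5e-5 arcsec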
Anyway, there are people on HN who are far more knowledgeable about this stuff than I am. I take it the designers and operators of the JWST are at the top of their fields, and that anything interested laypeople can come up with was debated and accepted or rejected a decade or more before this conversation; they're far from stupid, as evidenced by the incredible performance so far. Let's hope it stays that way and that the insertion burn goes well. That's the major scary thing that will happen next, and the delta-v is nothing like the launch, so I would assume it will all go well, but at the same time the telescope wasn't as fragile back then as it is now.
The CPU it uses is the RAD750, a pretty standard radiation-hardened CPU for spacecraft. https://en.wikipedia.org/wiki/RAD750