cookingmyserver's comments

Would refreezing break the cable?


If you build the probe so that it carries the spool of cable inside it, then the probe has to be at least as large as the full load of cable. If you make the probe just big enough to do what it needs while pulling the cable from the lander, it can be much smaller. With the smaller probe, though, the cable will need to stay free to move as the probe melts deeper. The larger probe with the full length of cable will require much more energy, as it needs to melt a much larger hole.

Where is all of this energy coming from?


I think our breakdown in understanding here is our concept of cables. When I say cable (as do many others here) I mean fiber optic cable. Even 25km of fiber optic cable is rather small and light; drones, missiles, and torpedoes already carry many miles of cable in a tight space. The part I'm not sure about is how the ice acts on the fiber optic cable, and how well it would hold up as the ice refreezes around it.
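To put rough numbers on "small and light": a quick sketch, where the fiber diameter (~250 micron coated) and mass (~60 g/km jacketed) are my assumed figures, not anything from this thread:

    import math

    length_m = 25_000        # 25 km of fiber
    diameter_m = 250e-6      # ~250 micron coated telecom fiber (assumed)
    mass_per_km_kg = 0.06    # ~60 g/km with a thin jacket (assumed)

    volume_L = math.pi * (diameter_m / 2) ** 2 * length_m * 1000
    mass_kg = mass_per_km_kg * length_m / 1000
    print(f"solid volume: {volume_L:.1f} L, mass: {mass_kg:.1f} kg")
    # -> roughly 1.2 L and 1.5 kg, before winding losses

Even with generous packing losses, that's a spool you could hold in one hand.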


Refreezing isn't the big issue; shifting of the ice (causing physical severing of the line) is. We don't have a great handle yet on how much it moves around.


Yes, I think we definitely have a gigantic misunderstanding of cable here. Mine is based in reality, while yours seems to be very unrealistic. How in the world is a fiber optic cable going to do what needs to be done? Where is the power coming from to heat the probe via a fiber optic cable? Even a fiber optic cable at a length of 25km is a very large spool. If you want the probe to hold the spool and unwind as it goes, it must be at least the size of the spool of cable. If you think this would work with an unsheathed piece of bare fiber cable, then you're just not even trying to be serious.


I see another misunderstanding then. With this method the actual probe would use nuclear material to melt its way through the ice. In addition, the heat of the nuclear probe on one side and the ice (or meltwater) on the other would make for ideal conditions for a Peltier device (or just use a traditional RTG) to power onboard sensors and electronics. The fiber optic cable is only for communication.


> use nuclear material to melt its way through the ice

All 300 watts of it? It's not even going to make an indentation, let alone melt through tens of km of ice.


Simple reactors can be designed to be turned up and down according to need. A 300 W RTG is more than enough to run all the necessary electronics. The ice-melting 30,000 W+ heater can be a second reactor that is spooled up only when ice needs melting.
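For a sense of what 30 kW buys you, a minimal melt-rate sketch (the ~100 K starting ice temperature and 30 cm hole diameter are assumptions, and conductive losses to the surrounding ice are ignored):

    import math

    rho_ice = 920        # kg/m^3
    cp_ice = 2100        # J/(kg K), rough average over the range
    latent = 334_000     # J/kg, heat of fusion
    delta_T = 173        # K, warming ice from ~100 K to 273 K

    energy_per_kg = cp_ice * delta_T + latent   # ~700 kJ/kg
    power_w = 30_000                            # the heater reactor above
    area_m2 = math.pi * 0.15 ** 2               # 30 cm diameter hole

    speed_ms = power_w / (energy_per_kg * rho_ice * area_m2)
    print(f"{speed_ms * 3600:.1f} m/hour")      # ~2.4 m/h, ~1 year per 20 km

So 30 kW is roughly the right order of magnitude for a mission-length descent, which is why a 300 W RTG alone is a non-starter.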


We're attempting to search for life, and the thing you want to do is use radioactive heaters? We deliberately crashed a satellite into the planet to avoid having it potentially contaminate the moons we are curious about, and yet you're thinking they'd just irradiate everything like this? It's really just not logical.


In the outer solar system, under miles of ice, in total darkness and cold... it is nuclear or nothing. Short of antimatter batteries, there is no other source of power that would be even theoretically suitable.


The concern is more spreading Earth life. NASA's Planetary Protection team (which is a delightful job title) is largely concerned with sterilizing stuff we send out so any discovery of microbes on Mars doesn't turn out to be hitchhikers.

Even a fully fledged nuclear reactor isn't gonna do much damage to Europa and potential life. Swimming in a nuclear reactor's fuel pool is quite safe; water's some of the best shielding we have. https://what-if.xkcd.com/29/


> Where is all of this energy coming from?

A nuclear reactor, probably.



Really? To heat and melt sufficient ice around 25km of cabling? I don't know exactly what temperature this ice is at, but the surface of Europa averages around -300F, so the ice near the surface is probably about that cold. I guess a lot is going to depend on whether you're fine with the ice refreezing around the cable: if the ice shifts at all, the cable breaks. Keeping the whole thing heated continuously seems implausible.


That would surely mess with any organics you might want to find.


No, not really. Water is very effective shielding, and you could melt a base station through the ice and do exploration with subs it sends out if you want fairly pristine samples. In the Jupiter system, it's also hardly the biggest radiation source around.


The spool can be a long, thin "pipe" of wound cable, with one end of the "pipe" pointing to the rear (up). You can put an arbitrary amount of cable in a given hole diameter by making the spool taller.

(Google image search suggests that a similar approach has been taken by the TOW missile: it's not a spool that could be reversed by adding a motor to an axle, but more like a tightly packed coil that gets straightened as the wire is pulled out.)
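A rough sketch of how tall such a packed coil would have to be; the 1 mm cable, 10 cm hole, winding radii, and 80% packing fraction are all assumed numbers:

    import math

    cable_d = 0.001            # m, assumed cable diameter
    length = 25_000            # m of cable to store
    r_in, r_out = 0.02, 0.045  # m, coil wound between these radii
    packing = 0.8              # packing fraction of the winding

    cable_volume = math.pi * (cable_d / 2) ** 2 * length    # ~0.02 m^3
    annulus_area = math.pi * (r_out ** 2 - r_in ** 2)
    height = cable_volume / (annulus_area * packing)
    print(f"coil height: {height:.1f} m")                   # ~4.8 m

A ~5 m column of wound cable inside a 10 cm hole: long, but not obviously impractical.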

As for the energy, I assumed GP was thinking of solar panels on the surface. I also assume we share scepticism, given the low sun intensity out at Jupiter's orbit... (and that's before you even start wondering how much further from its melting point that ice will be than any ice in conventional human experience).


>You can put an arbitrary amount of cable in a given hole diameter by making the spool taller.

Wouldn't this be limited by the tensile strength of the material and the weight of the cable? Granted, Europa has much less gravity, but 25km is a lot of cable weight.

Consider something as small as fishing line; one online estimate gives it 0.245 g/m. At 25km, that's only about 6 kg of line hanging down a hole on Earth (well under a kilogram of equivalent weight on Europa), and a real sheathed cable at tens of grams per metre would run into the hundreds of kilograms.
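Checking those numbers (Europa's surface gravity is about 1.315 m/s^2, roughly 13% of Earth's):

    g_earth = 9.81      # m/s^2
    g_europa = 1.315    # m/s^2

    mass_kg = 0.245e-3 * 25_000   # 0.245 g/m over 25 km -> ~6.1 kg
    print(f"mass: {mass_kg:.1f} kg")
    print(f"weight on Earth:  {mass_kg * g_earth:.0f} N")   # ~60 N
    print(f"weight on Europa: {mass_kg * g_europa:.0f} N")  # ~8 N

The tensile question only gets serious once you move from fishing line to a jacketed cable at tens of grams per metre.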


The probe bears on the ice below it and the cable gets held by the ice that's re-frozen above the hole.

What you have to worry about is the ice shifting and severing the cable.


I think there are still mechanics at play that would have to be considered.

>The probe bears on the ice below it

This implies it is bearing the weight of the entire cable above it. So instead of the tensile stress being the limiting factor, it's now the compressive stress. If your intent is to retract the spool, it would still be in tensile stress as it comes up (and you'd need enough torque to do so; but maybe the plan would be to abandon it in place).

>What you have to worry about is the ice shifting and severing the cable.

I agree, that's a pretty big concern.


Could you embed a series of metallic needles as you melted your way down, then communicate via radio waves that travel needle to needle? They would not need to be connected. Just close by.


I think indirectly it does. When your launch vehicle costs hundreds of millions of dollars to use once on a scientific mission, you put as much engineering as you can into the scientific payload to (1) make damn sure it works when you are paying $200 million for a launch and (2) make sure you can do as much science as possible.

With something like Starship I wouldn't be surprised to see SpaceX cheaply provide a Starship approaching end of life to a scientific mission. With cheaper and readily available launch opportunities we could see deep space missions that use larger numbers of more cheaply manufactured probes with much less longevity (dying after a year of data collection) but doing a greater amount of science over their shorter lives. Essentially, using a large launch vehicle like Starship as a mothership until they reach their destination, reducing the need for RTGs.


Correct, hence the need for the probe to unroll the cable as it goes down. If you had the roll on the surface, you would need to heat the whole cable to allow it to slip down.


Or design the cable along the same principles as one of these,

https://www.walmart.com/c/kp/water-wiggler-toy

So it can "slip down" by a continuous unrolling process!


Which would require much more power than a single RTG.


Why? 25km of fibre isn't that large.


Back of the envelope feasibility check: assume the cable is a cylinder with diameter = 1cm and length = 25km. The lateral surface area of the cylinder is A = 2*pi*r*h = 785.4 m^2. The thermal conductivity of water ice is approx. 2.3 W / (m K). So to maintain a temperature difference of 10 K across roughly a metre of surrounding ice, you need 2.3 W/(m K) * 785 m^2 * 10 K / 1 m = approximately 18kW.
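The same estimate in runnable form. Note that the slab approximation (heat leaking through an assumed ~1 m of ice) understates how a thin cylinder sheds heat; the proper cylindrical-shell formula with the same assumptions comes out far higher, so 18 kW is if anything optimistic:

    import math

    k_ice = 2.3          # W/(m K), thermal conductivity of water ice
    r1, r2 = 0.005, 1.0  # m: cable radius, assumed far-field radius in the ice
    length = 25_000      # m
    delta_T = 10         # K above the surrounding ice

    area = 2 * math.pi * r1 * length             # ~785 m^2 of cable surface
    slab_w = k_ice * area * delta_T / (r2 - r1)  # the ~18 kW estimate above
    shell_w = 2 * math.pi * k_ice * length * delta_T / math.log(r2 / r1)
    print(f"slab: {slab_w/1000:.0f} kW, shell: {shell_w/1000:.0f} kW")
    # -> slab: 18 kW, cylindrical shell: ~680 kW

Either way, continuously heating the whole cable looks prohibitive.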


I was assuming something a little thicker than optical fibre - the "probe" could be self-powered using an RTG, with the "waste" heat doing the melting?

Once the ice freezes again behind the probe it would protect the fibre... perhaps?

Fortunately something like that wouldn't be too difficult to test on Earth - probe recovery might be tricky though.


A mini nuclear-reactor-as-a-heat-source might be appropriate for a melt-drill. RTGs are a bit unfortunate, as their output decays exponentially from the time of manufacture, and you'll need to both 1. deliver high enough power at Europa, and 2. radiate away that much power and a bit more while you're flying there.
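A quick sketch of that decay (Pu-238's 87.7-year half-life is the only real number here; the 300 W figure and the mission timings are assumptions, and thermocouple degradation, which reduces electrical output faster still, is ignored):

    from math import exp, log

    half_life_y = 87.7     # Pu-238
    p0_w = 300             # output at fueling (assumed)

    for years in (0, 6, 10, 20):   # ~6 years is a typical Jupiter cruise
        p = p0_w * exp(-log(2) * years / half_life_y)
        print(f"after {years:2d} y: {p:.0f} W")
    # fuel decay alone costs only ~5% over the cruise; the bigger issue is
    # radiating all of that heat away for the entire flight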

A nuclear reactor could produce basically no heat while offline, then be switched on and suddenly provide 100s of kW when it gets to wherever it's going. The hard part in space is radiating away the heat, but if you're on an ice world, that's orders of magnitude easier.

The hardest part I'd see would just be getting into the ice; there's not really any "melting" in vacuum, since exposed water just boils off. The escaping vapor would keep insulating your heater from the ice until the hole seals behind you. Meters 1 to 20,000 are probably pretty easy.


> The Ocean Cleanup themselves have estimated at least 75% of ocean trash is from fishing boats

They estimate that 75% of the ocean trash in the Great Pacific Garbage Patch is from fishing boats.


> When properly cooled, the Intel silicon tends to perform a lot better.

Of course, but the average Joe does not want to wear ear protection when running their laptop. Nor do they want the battery to last 40 minutes, have it be a huge brick, or have to pour liquid nitrogen on it to keep it from thermal throttling.

Apple innovated by making chips that fit the form and function most people need in their personal devices. They don't need to be the absolute fastest, but innovation isn't solely tied to the computing power of a processor. It makes sense that Intel excels in the market segment where people do need to wear ear protection to go near their products. If they need to crank in an extra 30 watts to achieve better compute, then so be it.

We don't know the specifics of the conversations between Apple and Intel. Hopefully for Intel's sake, the issue was that they didn't want to innovate on personal computing processors, not that they couldn't.


It seems like you think I'm trying to dunk on Apple. I am not. Apple Silicon is a great first showing for them. Performance simply isn't better than Ryzen APUs running in the same power envelope. And power usage is what you'd expect of silicon running on the latest node. Further, some of Apple's choices - bringing memory on package, only two display outputs - caused regressions for their users compared to the previous Intel offerings.

I wouldn't call what Apple did innovation - they followed predictable development trajectories - more integration. They licensed ARM's instruction set, Imagination's PowerVR GPU, and most of the major system buses (PCIe, Thunderbolt, USB 3, DisplayPort, etc.); they bonded chiplets together with TSMC's packaging and chip-to-chip communication technologies; and they made extensions (like optional x86 memory ordering for all ARM instructions, which removes a lot of the work of emulation). Incidentally, Apple kicked off its chip design efforts by purchasing PA Semi. Those folks had all the power management chip design expertise already.

But again, it's been a good first showing for Apple. I think they were smart to ship on-package DRAM in a consumer device. Now is about the right time for the CPU to be absorbing DRAM, as can be seen in another form with AMD's 3D V-Cache. And it's cool for Apple folks to have their own cool thing. Yay y'all. But I've run Linux for 20 years, on every computer I could get my hands on in that time, and through that lens Apple Silicon performs like any x86 APU in a midrange laptop or desktop. And as regards noise, I never hear the fans on my 7800X3D / 3090 Ti, and it is very, very noticeably faster than my M1 Mac. Apple Silicon's great, it's just for laptops and midrange desktops right now.


Somehow you are comparing Apple's first gen laptop/iPad chip to a desktop setup requiring 10x the power consumption and 10x the physical size (for the chips and all the cooling required). The power envelope for these chips is very different, and they prioritize different things.


That's my point. You got it. Go you.


> I wouldn't call what Apple did innovation - they followed predictable development trajectories - more integration

By this yardstick, nobody in semiconductors has ever innovated.


Well, to be fair, there is an awful lot of copying and extending taking place.


> to be fair there is an awful lot of copying and extending in place

That's how technology proliferates. The point is if the M1 wasn't innovative, that rules out pretty much everything AMD, Intel and potentially even NVIDIA have done in the last three decades.


Did they do anything in those three decades that hadn't been dreamt of and projected out sometime in the '60s? Architecturally, it sure doesn't seem like it.

I'd say a lot more innovation happens on the process side. Manufacturing.

All the architecture changes look like things mainframes did decades ago.


>Apple Silicon is a great first showing for them. Performance simply isn't better than Ryzen APUs running in the same power envelope. And power usage is what you'd expect of silicon running on the latest node.

Do you have a source for this other than Cinebench R23, which is hand-optimized for x86 AVX instructions through Intel's Embree engine?

From all sources, Apple Silicon has 2-3x more perf/watt than AMD's APUs in multithread and a bigger gap in single thread.


It's always curious to me how Apple's superior products are somehow some other company's fault.


Somewhat misleading headline. They plan to release a paid version of Alexa, AKA Alexa Plus, using generative AI, but testing didn't go well and they are trying to rebuild it from the bottom up using LLM tech.


Thanks to all those scientists from Alexa Entertainment, who have successfully broken all iterations of Alexa. 3 for 3! Great job!


I don't think it should matter whether 'systemctl start' was issued by an operator or an external script; it should try to start no matter what. systemd itself should use a different start command or flag that is subject to the limit when trying to restart after it detects a failure.
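For reference, the knobs that exist today (a hypothetical unit fragment; note that in current systemd the start rate limit does apply to manual 'systemctl start' as well, and 'systemctl reset-failed <unit>' clears the counter):

    # example.service -- hypothetical unit fragment
    [Unit]
    # Refuse further starts after 5 attempts within a 30 s window.
    StartLimitIntervalSec=30
    StartLimitBurst=5

    [Service]
    # Automatic restarts on failure, spaced 2 s apart.
    Restart=on-failure
    RestartSec=2

Keeping StartLimitIntervalSec short is the usual workaround for exactly the operator-lockout problem described above.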


I used to think the same thing until I started riding a motorcycle. I quickly came to the realization that the primary reason I was seeing fewer insects on my windshield was improved aerodynamics.

I am sure a major decline in insect populations is contributing as well, though.


Which is why you would use waves of consecutive detonations consisting of more and more (smaller) nukes. Think of a cone pointed towards the asteroid; the tip would be the initial large nuke. Because of the mass and velocity of the asteroid, it is unlikely that the fragments would spread out all that much. You are right that there would likely be fragments that still have an orbit leading to collision. After the first explosion you would detonate 5 more nukes, spread out evenly, to further perturb and break down the asteroid remnants. You would repeat this many times. Each time the nukes could be smaller, as the mass of the asteroid remnants would be getting smaller and smaller, making the force of the nukes more effective against them. This would probably only be suitable for rubble pile asteroids, but I would imagine those are the hardest to use ablation on, so it may still be appropriate to use more destructive methods.

There are two goals with this: (1) break down or deflect any large chunks to prevent damage from ground impacts, and (2) change the orbits of the asteroid remnants enough that any subsequent collisions with Earth are spread out over time, preventing overheating of the atmosphere from clouds of debris.

The best solution is always to keep the asteroid as intact as possible, but for certain asteroid types and scenarios, breaking it up may just have to be good enough, especially as a backup.


Glad to see this researched more. It has become popular in pop culture science to bring up the "myth" of using nukes to stop/deflect asteroids. Apparently, their incorrect use in a few movies discounts them ever being used. Even "science communicators" have participated in evangelizing the ineffectiveness of nukes, never realizing you don't have to land on the asteroid and drill a nuke into its core to use it effectively.

There have already been papers on deflection via nuclear ablation of an asteroid, so the idea is not new. However, it looks like the knowledge gained from the DART mission will enable better modeling in this kind of research.


What's more, the technique is in one sense very well explored: ablation drive is how fusion bombs work.

Putting deuterium and tritium (well, LiD, which becomes D+T under neutron bombardment) in the middle of a fission bomb doesn't squeeze it hard or long enough to boost the yield of the fission bomb 1000x. Focusing the x-rays from a fission bomb onto a uranium "pusher" and driving it inward with the reaction force of vaporizing uranium does squeeze hard enough to generate fusion yield far in excess of the fission primary.

If this sounds vaguely familiar, you might be thinking of the Hohlraum at the NIF that recently made the press for getting more energy out than was deposited in. That strategy doesn't scale into a power plant but it does a brilliant job of simulating fusion bombs, which is why the DOE / NNSA can pursue it with so much money.

