
Well, it looks like you imposed a qualification on these types of flights, and now by definition they cannot deliver a payload.

The flights could deliver a cheap payload, for example.


You can't deliver a payload to orbit if you're not planning on going to orbit. They have a launch license for a particular mission plan.

The license for Falcon 9 and Falcon Heavy specifically says “Authorization: SpaceX is authorized to conduct flights of launch vehicles: […] (c) Transporting Dragon 2 to low Earth orbit or a payload to orbit;”

The license for Starship/Super Heavy Launch Vehicle does not include such an authorization.

https://www.faa.gov/media/69476

https://www.faa.gov/media/75501


"The payload is data." Elon The data they get from the test and they focus on getting the most and most valuable data. This is an explicit choice. A physical payload at this stage would reduce the overall value that they are able to get from a flight.

How is self-driving a 2D problem when you navigate a 3D world? (Please do visit hilly San Francisco sometime.) Not to mention additional dimensions like depth and velocity vectors, among others.

The visual and sensory input to the self-driving function is of the 3D world, but the car is still constrained to move along a 2D topological surface; it's not moving up and down other than by following the curvature of that surface.

So, based on your argument, they actually operate in 1D, since roads go in one direction and lanes and intersections are constrained to a predetermined curvy line.

The point is clearly that they don't have a vertical axis of control; they can't make the car fly up in the air unless they're driving Crazy Taxi style.

We have not exhausted what HTML can do either. LLMs not getting smarter is orthogonal to their currently unexplored search space.

1. Out of the millions of things that are done right, it only takes one for HN to lose its mind; the power button happened to be it this time. Cook didn't decide that.

2. How often, exactly, do people have to turn a Mac that consumes less power than a Pi off and on for them to be constantly reaching for that power button?

3. Standby and hibernate exist.


It's not like Tim Cook personally decided to put the button there, but saying that over many years he's aligned the company to be one that would leave the button there, rather than bite the cost of putting it somewhere more ergonomic, is something I can buy into. It seems like a way to improve margins generation over generation, which is the kind of thing he's obsessed with.

This is also the same Apple that made the G4 Cube: that felt like this in reverse, with Jobs driving them to make a capacitive touch button because of an obsession with a seamless surface.


Yes, that's it. Jobs's annoyances were always about achieving a better product, a higher level of refinement, or something of the sort. It was mostly about "it can be better this way," and he was very often right, even if not always.

On the other hand, with Cook, it's always about cost cutting, corner cutting, and the like. It feels cheap (especially considering the pricing and brand aspirations), but also primitive and unrefined.

Which is why their price escalation was unjustified: if you want to charge a lot, you need to figure out a no-compromise product, and, in my opinion, they have not been there much recently...


Slightly more accurate: they're raising billions while making pennies.


Why? Just because search has ads, must anything that is a superset of it also have them?


Because ads bring money, and companies love money.


Over the past two decades, ads have proven to be the only way to make money on the internet…


They're making $300 million per month in revenue right now, with no ads.


That is what Google makes in less than half a day by selling ads. It’s a rounding error on their monthly revenue.
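
(For rough scale: Google's advertising revenue in 2023 was about $237 billion, i.e. roughly $650 million per day, so $300 million per month really is less than half a day's worth of Google ad sales.)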


Agreed, it's as if someone completely ignored the meaning of the word and just decided what sounds good for an AI app.


I made one very similar; it's basically a wrapper around DuckDuckGo: https://foxacid.ai


I often choose 144p for low-bandwidth scenarios. It is very similar in size to a good-quality audio-only stream, but you get the added benefit of seeing the speaker, even if that's just a glance every 45 minutes. Same for battery life; you save that too.

Also, 144p is somehow very peaceful and relaxing to watch: you don't get distracted by all the shiny colors and intricate details in the video, and can just listen without your mind fixating on random stuff.


Me too; I don't enjoy observing the pimples on people's noses.


Why do you need WebGPU? It's unfortunate that people use technology billed as "state-of-the-art techniques to run simulations at interactive speeds" without fully understanding what it's for. General compute on the GPU is what WebGPU is for. To simulate basic waves like in this demo you absolutely do not need that; in fact, it's an indication the author implemented the solution in a non-optimal way. WebGL is fully supported by all browsers and by well-maintained libs like three.js, yet here we are with people writing a sine function with basic interference patterns, one of the most elementary 3D primitives, in WebGPU and arguing that's using "state-of-the-art" techniques.


Good question! This is actually a numerical solver for a few coupled partial differential equations - the method in this context (electromagnetism) is called FDTD (finite-difference time-domain). It's implemented as a WebGPU compute shader.
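
For reference, the continuous equations a 2D FDTD scheme like this steps forward in time are the TM-mode Maxwell equations (standard textbook form; how the demo names and arranges these terms is an assumption on my part):

  \partial_t E_z = \tfrac{1}{\varepsilon}\,(\partial_x H_y - \partial_y H_x) - \tfrac{\sigma}{\varepsilon}\,E_z
  \partial_t H_x = -\tfrac{1}{\mu_0}\,\partial_y E_z
  \partial_t H_y = \tfrac{1}{\mu_0}\,\partial_x E_z

with the space and time derivatives replaced by finite differences on a staggered (Yee) grid.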

You absolutely could do this using WebGL2 compute shaders too, but I thought it would be fun to try this newer API.


Annoyingly, WebGL2 doesn't have compute shaders, even though the GLES 3.x it is based on does.


Thank Google for that: they dropped Intel's contribution of WebGL Compute, with the reasoning that WebGPU would be good enough.


I don't understand: what other type of solution is there to render on a GPU, other than a numeric one?

Here is a very basic shader for what you want:

  // wave parameters
  float freq1 = 2.0;
  float freq2 = 3.0;
  float amp = 0.5;

  // displace the vertex vertically with two superimposed travelling waves
  pos.z += sin(pos.x * freq1 + uTime) * amp;
  pos.z += cos(pos.y * freq2 + uTime) * amp;

  // standard model-view-projection transform
  gl_Position = projectionMatrix * modelViewMatrix * vec4(pos, 1.0);
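
For context, here is a minimal sketch of how a displacement shader like that could be wired up with WebGL via three.js. The scene setup, plane size, and colour are illustrative assumptions, not taken from the demo:

  // Minimal sketch (assumed setup, not the demo's code): a three.js plane whose
  // vertices are displaced by the same sin/cos waves as the snippet above.
  import * as THREE from "three";

  const vertexShader = /* glsl */ `
    uniform float uTime;
    void main() {
      vec3 pos = position;
      float freq1 = 2.0;
      float freq2 = 3.0;
      float amp = 0.5;
      pos.z += sin(pos.x * freq1 + uTime) * amp;
      pos.z += cos(pos.y * freq2 + uTime) * amp;
      gl_Position = projectionMatrix * modelViewMatrix * vec4(pos, 1.0);
    }
  `;

  // Flat colour so the displaced surface is visible.
  const fragmentShader = /* glsl */ `
    void main() {
      gl_FragColor = vec4(0.2, 0.5, 0.8, 1.0);
    }
  `;

  const material = new THREE.ShaderMaterial({
    uniforms: { uTime: { value: 0 } },
    vertexShader,
    fragmentShader,
    wireframe: true,
  });

  // A finely tessellated plane gives the waves enough vertices to show up.
  const mesh = new THREE.Mesh(new THREE.PlaneGeometry(10, 10, 128, 128), material);

  const scene = new THREE.Scene();
  scene.add(mesh);

  const camera = new THREE.PerspectiveCamera(60, window.innerWidth / window.innerHeight, 0.1, 100);
  camera.position.set(0, -8, 6);
  camera.lookAt(0, 0, 0);

  const renderer = new THREE.WebGLRenderer({ antialias: true });
  renderer.setSize(window.innerWidth, window.innerHeight);
  document.body.appendChild(renderer.domElement);

  const clock = new THREE.Clock();
  renderer.setAnimationLoop(() => {
    material.uniforms.uTime.value = clock.getElapsedTime(); // drive the animation
    renderer.render(scene, camera);
  });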


That's not a solver; it just displays a sine-wave pattern.


Did ChatGPT write that for you? Because it has missed the mark by 500 metres.


It's a sin + cos function... you need an AI for that?


The point was that the implementation of this tool is not a sin+cos function. It's more like:

  // advance Ez at this cell and at the neighbouring cells used below
  let newEz = Ez0[me] + calcEzDiff(vec2u(id.x, id.y), dx, dy, aspect);

  let newEzAbove = Ez0[above] + calcEzDiff(vec2u(id.x, id.y + 1), dx, dy, aspect);
  let newEzRight = Ez0[right] + calcEzDiff(vec2u(id.x + 1, id.y), dx, dy, aspect);

  // update the H field from finite differences of the new E field (Yee-style update)
  Hx1[me] = Hx0[me] - (uniforms.dt/mu0)*(newEzAbove - newEz) / dy;
  Hy1[me] = Hy0[me] + (uniforms.dt/mu0)*(newEzRight - newEz) / dx;

  Ez1[me] = newEz;

  // apply a per-cell loss factor (delta looks like a conductivity-type term)
  let localDelta = delta[me];
  let fac = 1 + uniforms.omega * uniforms.dt * (delta[me] / eps[me] / 2);
  Ez1[me] = Ez1[me] / fac;
  Hx1[me] = Hx1[me] / fac;
  Hy1[me] = Hy1[me] / fac;

and then a bunch of other GPU code. You can find this with little effort in the bundle, if you care, by base64-decoding the Pt("xxx") parts.

Though I do imagine it could indeed be implemented with WebGL shaders, I also wouldn't start a new compute-based project on it unless I had a particular need to support older systems. And I say this as a Firefox user...

