proee's comments

Where did you buy from? Also, what is your end application?


Would it be possible to stack multiple double-edge blades into a shaver head, slightly offset from each other, to perform like a traditional multi-blade disposable razor?

Or is the thinness of a disposable blade what makes it work well, assuming the blades can flex more to follow the skin profile?

I could see a product where you install 3-4 double-edge blades into the shaver head, and then "flip" the head or blades once the first side runs dull.



Great find. It's a shame their blades aren't double-sided, though; you could get 2x the life out of them.


You could snap a regular blade in half, I guess.


That is brilliant. Looks like the profile is the same!


There are some nasty parasites that can get onto your produce. For example, the rat lungworm parasite can get onto produce from slugs and slug mucus. If you eat this parasite, it can get into your brain and mess you up badly (permanently).

I would rather eat a piece of (cooked) meat than a piece of lettuce grown in an area affected by this parasite.


What a strange name for something in slugs.


I believe it's Angiostrongylus cantonensis. The OP's description is accurate. It's actually a known problem in Hawaii, though it occurs elsewhere too, with inadequate recognition.


They do live in rat lungs.



Interesting idea. I wonder if anyone has created a 3D-style spreadsheet, where you make calculations across different x,y,z planes. I suppose you could do this now with a standard 2D spreadsheet, but it would require a lot of finagling.


One of my favorite ideas in this space is Lotus Improv. It lets the user edit and view arbitrary "tensors," not just 2D or 3D data. The UI/UX is fairly intuitive, and I'm sad no one developed the idea further. Its formula system is also very interesting.

Here is a demonstration video I found about it:

https://youtu.be/lTko_Pt2ZZg?si=9xtATPq7NU9oqooa


Wow, I've never seen Lotus Improv but so far there's a lot to love.

If you're not aware, Excel has something similar to the formula system if you use tables, and in combination with LET and LAMBDA it's also pretty pleasant.
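
For instance (an illustrative formula, not from the video): with a table named Sales, something like =LET(total, SUM(Sales[Amount]), total / COUNT(Sales[Amount])) names an intermediate value once and reuses it, a bit like Improv's named formulas.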

I've got the rest of the video to watch, but thanks for your comment!


Aw, he demos the Windows version - the original NeXTstep version has timeless aesthetics.

https://m.youtube.com/watch?v=rgGmKD87U3M

Another innovative but forgotten piece of “better than spreadsheet” software is Javelin PLUS.

https://archive.org/details/Javelin3_5


Thank you! I am blown away!


I swear I remember there being a 3D spreadsheet on the market back in the early '90s(?), but couldn't find it with a quick web search. So I might be confabulating.

That said, you're right that it can be done with modern spreadsheets. It doesn't even require that much finagling:

https://www.pcworld.com/article/439501/how-to-create-3d-work...
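
(The trick, roughly: Excel's 3D references treat a run of worksheets as stacked z-planes, so an illustrative formula like =SUM(Sheet1:Sheet12!B2) adds up cell B2 through twelve sheets.)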


I think I first saw the concept of a 3D spreadsheet in Quattro Pro.

The following link seems to confirm this:

https://www.kstreetstudio.com/science/sams/qpro/concept.html


I have not seen that, but you'd probably like Microsoft's SandDance[0].

Try it in your browser: https://microsoft.github.io/SandDance/app/

[0] https://pldb.io/concepts/sanddance.html


An SQL database can be an n-dimensional spreadsheet.
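
A minimal sketch of the idea in Python with sqlite3 (the table and values are made up for illustration): each row is one cell keyed by its coordinates, and a "formula" is just a query across them.

  import sqlite3

  conn = sqlite3.connect(":memory:")
  conn.execute(
      "CREATE TABLE cells (x INT, y INT, z INT, value REAL, PRIMARY KEY (x, y, z))"
  )

  # Three z-planes, same (x, y) cell in each.
  conn.executemany(
      "INSERT INTO cells VALUES (?, ?, ?, ?)",
      [(1, 1, z, 10.0 * z) for z in (1, 2, 3)],
  )

  # The "3D formula": sum one cell through every plane.
  print(conn.execute(
      "SELECT SUM(value) FROM cells WHERE x = 1 AND y = 1"
  ).fetchone()[0])  # 60.0

Add more coordinate columns and the same query generalizes to n dimensions.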


The fact that any modern computer chip works reliably is a pure miracle. The process variations are extreme, and you often end up with a lot of B-level engineers/technicians keeping things going. Having some experience in the semiconductor industry, it oftentimes felt like a lot of bubble gum and baling wire was used to get the product out the door. Hats off to all the people keeping these systems alive and functioning.


I've worked on the construction of a large dam and I think that's the case for most modern technology. Obviously once a dam is up and running it's really stable compared to what goes on in a fab, but getting it built? That's a whole other story.

The math and engineering behind dam construction are well understood, but actually getting one built in practice is a years-long story of yak shaving, cat herding, and overcoming every little piece of BS nature throws at the project. Unique and unpredictable geological conditions, unknown underground water sources, unexpected soil composition changes, surprise fault lines. Then there's the logistical nightmare of actually moving all the equipment in and the earth out, the weather and environmental factors that impede every action, humans ignoring safety altogether, and so on and on. All of this implemented by workers on the ground who just barely know what's going on (through no fault of their own).

I'm continually surprised that anything complicated ever gets built at all.


It's similar with software, so the whole IT industry seems like a miracle sometimes. And cars get more and more software installed ...


There must be a name for this phenomenon. Like, the more you know, the less faith you have in it actually working. I'm pretty sure everyone feels this way about their work. I just asked my partner if it's the same with her (non-tech) job and she said yes, she can't believe it works at all. Arthur C. Clarke's Travel by Wire comes to mind too.


I would say it doesn't just work. That's why being a programmer isn't a one-off job. You're still there to glue things back together when they inevitably break unexpectedly.


Is this just Gell-Mann amnesia, but for reliability?


It's more like disregarding all of the advantages and focusing only on the negatives, or on the moments when incidents happen.

People routinely clown on companies for downtime but do not celebrate sending multi-megabyte pictures and videos over cell networks in remote locations from a supercomputer in their pocket.

Even 95% reliability is good for networks spanning the globe, compared to what we've had through most of history.

The average person easily plays into the trope that no one appreciates IT when it works, but readily has opinions when there are problems.


Every time I see someone FaceTime in an elevator and complain about a little choppiness - I think of this ^^


The funny thing is, anecdotally, I have never had a CPU fail on me. Memory, motherboards, PCIe cards, PSUs, hard drives, monitors, keyboards, and mice have failed, but I have yet to lose a CPU or an SSD.


I lost a few AMD Athlon CPUs back in the day. They were not reliable long-term (I think they died after ~6 years, or something like that).


I had a watercooled 5950x up and die on me recently after 3 years. AMD RMA’d it without argument.


I’m at a point where I no longer call these things chips. I don’t know what they are; they have pins, but to me the precision of the machinery used is at the very least approaching atomic scale. It should not be possible for automated machines to fabricate these things. Yet here we are.


Yes, but you don’t read much about AMD, Apple, Samsung, Qualcomm. Intel has been shooting themselves in the foot consistently.

Processors are known for their reliability. Everything else in a computer will break first.


AMD just delayed Ryzen 9000 by one and two weeks because of production issues. Including recalling samples sent to reviewers and already at stores.

> We appreciate the excitement around Ryzen 9000 series processors. During final checks, we found the initial production units that were shipped to our channel partners did not meet our full quality expectations. Out of an abundance of caution and to maintain the highest quality experiences for every Ryzen user, we are working with our channel partners to replace the initial production units with fresh units.

> As a result, there will be a short delay in retail availability. The Ryzen 7 9700X and Ryzen 5 9600X processors will now go on sale on August 8th and the Ryzen 9 9950X and Ryzen 9 9900X processors will go on sale on August 15th. We pride ourselves in providing a high-quality experience for every Ryzen user, and we look forward to our fans having a great experience with the new Ryzen 9000 series.

Jack Huynh, AMD SVP and GM of Computing and Graphics


That, or they're calling Intel's benchmark-before-microcode-update bluff, and in either case they're upfront about the delay instead of clamming up for half a year while unrest mounts.


On the upside for them, that also means reviews will be done comparing against the Intel chips with any mitigations applied.


You can reduce that list of Intel competitors to TSMC and Samsung.

AMD, Apple, and Qualcomm are TSMC customers.


TSMC does not design the chips. Who knows what percentage of the responsibility is due to manufacturing or design?


This news is about a manufacturing issue.


I'd wager it's not that easy to separate these two, especially when voltage is dynamically regulated.

If it's a Zeiss lens imperfection, a broken ASML machine, a water or silicon impurity, etc., what matters to the customer is the final product you're buying.


The oxidation is.

The choice of voltage scaling is not.


The industry has always been kind of monolithic. I'm not sure you can count Qualcomm as a competitor just yet. Competing means being in the same market, and laptop, desktop, and server have all been categories that haven't seen anything other than x86 for a long time.


Qualcomm makes CPUs. That's what is relevant here.


Arm servers, such as Gravitons and Ampere Altras, blow x86 out of the water right now.


Does GlobalFoundries (AMD's old fab) still count, or are they too irrelevant?


Is this Intel's marketing department doing damage control?

Intel's been a disgrace, and that's why these chips suck. Imagine having all of the money and hiring everyone in sight and pulling all sorts of self-serving, dirty tricks industry-wide yet still letting an underdog on the verge of bankruptcy (AMD) beat you to 7 nm and eat your lunch.


Perhaps I'm in the minority here, but I've wasted a ton of time in math classes working through way too many academic exercises that have little real-world application. For example, learning a bunch of tricks to solve a differential equation by hand feels like a circus act. Sure it can be done, but only with a limited set of "textbook" equations. When you get into the real world, you'll need to put those equations into a solver like MATLAB, etc.
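
To make that concrete, here's a minimal sketch in Python with SciPy standing in for MATLAB (the equation and constants are just an example): a damped oscillator y'' + 0.5y' + y = 0 handed to a numerical solver instead of being worked by hand.

  from scipy.integrate import solve_ivp

  # Rewrite y'' + 0.5*y' + y = 0 as a first-order system in [y, v].
  def damped_oscillator(t, state):
      y, v = state
      return [v, -0.5 * v - y]

  sol = solve_ivp(damped_oscillator, t_span=(0.0, 10.0), y0=[1.0, 0.0])
  print(sol.y[0, -1])  # position at t = 10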

It would be nice IMHO to see a more hybrid approach at universities that teaches math and application at the same time. It's strange to send students through YEARS of math classes without strong application. It's like learning music theory without playing an instrument.

Our academic system in general is still modeled after old-school institutions, based on textbook-style learning that all pretty much follows the same recipe. Is it not crazy that, in this day and age, we have classrooms of 300 students sitting at desks listening to a single professor? It's insane.

We are ripe for an educational system that is truly disruptive - especially with the rise of AI systems.


This was my biggest gripe w/ academic math. Whenever I'd ask my teachers how these concepts are applied in the real world, I'd get a non-answer that showed me a) the teachers themselves have no clue, and b) they're hoping you'll just shut up and follow the curriculum.

I agree that we are ripe for an educational system that is truly disruptive. Our current educators are so disconnected from the real world and have no idea how to apply what they teach.


Reminds me of the 3D TVs that were being forced on consumers. Nobody wanted them because you had to wear a set of goofy glasses.

The problem with VR and AR is that you can't sit around the living room and enjoy the same entertainment. You isolate yourself from others in the room in a strange way.

Wearing headphones is somewhat like this in that others can't easily talk to you, but at least they can waive their hands to get your attention if needed.


> The problem with VR and AR is that you can't sit around the living room and enjoy the same entertainment. You isolate yourself from others in the room in a strange way.

Speaking of this, I wish more games integrated the role of the god-mode / PC player the way some do through things like dev consoles.

It creates instant entertainment for the outside players to be able to see and affect the game world. Some games are explicitly designed for PC vs VR while others, like Blade & Sorcery, give you the ability to drop in enemies for the VR player.

Anything that can keep a party lively and entertained while waiting their turn. I think I have more fun watching my friends scream & flail as I drop in more enemies than I do actually playing.


I have a love of electro-mechanical seven-segment displays. For scoreboards and timers, I feel like these are less intrusive than modern electronic display scoreboards.

There is something magical about seeing the score "flip" when a point is scored.


I also like VFDs. There is something magical in all that green glow.


There's definitely a certain style with VFDs that I'm a little surprised hasn't been adopted by people looking for a retro/cyberpunk aesthetic. I was looking into them when it seemed my decades-old alarm clock might need a replacement, and they seem quite rare; I assume the economics aren't there for making something more complex and delicate. Interestingly, there are a few clocks repurposing cooker displays; I'd guess a manufacturer had a large stockpile of displays for models they discontinued early and liquidated them.


What would be a comparable early computer with the same computational power? Amazing to think that a machine the size of a room is now only a penny!


It's an 8-bit processor running at up to 20 MHz. Maybe an Apple II would be a comparison, though the 6502 likely had a more powerful instruction set. The Apple II had a 1 MHz 6502.


This chip: 8 MHz, 8-bit processing, 1K x 14-bit EPROM, 48 bytes of SRAM, and 6 general-purpose I/O pins (GPIO), PB[5:0], with independent direction control.

In comparison to the usual suspects like the Apollo Guidance Computer or the TI-83, it has a higher clock speed but a shorter word length and extremely limited RAM and program memory. That's precisely the cost engineering: this is meant to run simple, limited-size programs.
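
Rough back-of-envelope (assuming roughly one instruction per cycle on this class of part): 8 MHz works out to around 8 MIPS peak, versus a 1 MHz 6502 averaging maybe 3-4 cycles per instruction, so roughly 0.3 MIPS. Tens of times the Apple II's raw instruction rate, then, but with 48 bytes of RAM against the Apple II's 4-48 KB.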


The 8051, first released in 1980, effectively created the microcontroller market. The original 8051 had 4 kB of ROM and 128 B of SRAM; it was always on the small side.

---------

We have to go earlier than 1980, but I'm not quite familiar with all of those computers.

