Would it be possible to stack multiple double-edge blades into a shaver head, slightly offset to each other, to perform like a traditional multi-blade disposable razor?
Or is the thinness of a disposable blade what makes it work well, assuming the blades can flex more to follow the skin profile?
I could see a product where you install 3-4 double edge blades into the shaver head, and then "flip" the head or blades once the first side goes dull.
There are some nasty parasites that can get onto your vegetarian produce. For example, the rat lungworm parasite can get onto produce from slugs and slug mucus. If you eat this parasite it can get into your brain and mess you up badly (permanently).
I would rather eat a piece of meat (cooked) than a piece of lettuce that is grown in an area that is affected by this parasite.
I believe it's Angiostrongylus (the disease is angiostrongyliasis). The OP's description is accurate. It's actually a known problem in Hawaii, though it occurs elsewhere too, with inadequate recognition.
Interesting idea. I wonder if anyone has created a 3D spreadsheet, where you make calculations across different x, y, z planes. I suppose you could do this now with a standard 2D spreadsheet, but it would require a lot of finagling.
One of my favorite ideas in this space is Lotus Improv. It lets the user edit and view arbitrary "tensors", not just 2D or 3D data. The UI/UX is fairly intuitive and I'm sad no one developed the idea further. Its formula system is also very interesting.
Wow, I've never seen Lotus Improv but so far there's a lot to love.
If you're not aware, Excel has something similar to that formula system if you use tables, and in combination with LET and LAMBDA it's also pretty pleasant.
I've got the rest of the video to watch, but thanks for your comment!
I swear I remember there being a 3D spreadsheet on the market back in the early '90s(?), but couldn't find it with a quick web search. So I might be confabulating.
That said, you're right that it can be done with modern spreadsheets. It doesn't even require that much finagling:
The fact that any modern computer chip works reliably is a pure miracle. The process variations are extreme, and you often end up with a lot of B-level engineers/technicians keeping things going. Having some experience in the semiconductor industry, it oftentimes felt like a lot of bubble gum and baling wire was used to get the product out the door. Hats off to all the people keeping these systems alive and functioning.
I've worked on the construction of a large dam and I think that's the case for most modern technology. Obviously once a dam is up and running it's really stable compared to what goes on in a fab, but getting it built? That's a whole other story.
The math and engineering behind dam construction are well understood, but actually getting them built in practice is a years-long story of yak shaving, cat herding, and trying to overcome every little piece of BS nature has to throw at the project. Unique and unpredictable geological conditions, unknown underground water sources, unexpected soil composition changes, surprise fault lines. Then there's the logistical nightmare of actually moving all the equipment in and earth out, the weather and environmental factors that impede every action, humans ignoring safety altogether, and so on and on. All of this implemented by workers on the ground who just barely know what's going on (through no fault of their own).
I'm continually surprised that anything complicated ever gets built at all.
There must be a name for this phenomenon. Like, the more you know, the less faith you have in it actually working. I'm pretty sure everyone feels this way about their work. I just asked my partner if it's the same with her (non-tech) job and she said yes, she can't believe it works at all. Arthur C. Clarke's Travel by Wire comes to mind too.
I would say it doesn't just work. That's why being a programmer isn't a one-off job. You're still there to glue things back together when they inevitably break unexpectedly.
It's more like people disregard all of the advantages and focus only on the negatives, or on the times when incidents happen.
People routinely clown on companies for downtime but don't celebrate sending multi-megabyte pictures and videos over cell networks in remote locations from a supercomputer in their pocket.
Even 95% reliability is good for networks spanning the globe, compared to what we've had through most of history.
The average person easily plays into the trope that no one appreciates IT when it works, but readily has opinions when there are problems.
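For a sense of scale, here's quick back-of-the-envelope arithmetic on what those uptime percentages mean in downtime per year:

```python
# Availability -> downtime per year, assuming a 365-day year.
HOURS_PER_YEAR = 365 * 24  # 8760

for availability in (0.95, 0.99, 0.999):
    downtime_h = HOURS_PER_YEAR * (1 - availability)
    print(f"{availability:.1%} up -> {downtime_h:.1f} hours down per year")
```

95% availability is roughly 438 hours (over 18 days) of downtime a year, which puts "even 95% is relatively good" in perspective.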
The funny thing is, anecdotally, I have never had a CPU fail on me. Memory, motherboards, PCIe cards, PSUs, hard drives, monitors, keyboards, and mice have failed, but I have yet to lose a CPU or an SSD.
I'm at a point where I no longer call these things chips. I don't know what they are; they have pins, but to me the precision of the machinery used is at the very least approaching the atomic scale. It should not be possible for automated machines to fabricate these things. Yet here we are.
AMD just delayed Ryzen 9000 by one and two weeks because of production issues. Including recalling samples sent to reviewers and already at stores.
We appreciate the excitement around Ryzen 9000 series processors. During final checks, we found the initial production units that were shipped to our channel partners did not meet our full quality expectations. Out of an abundance of caution and to maintain the highest quality experiences for every Ryzen user, we are working with our channel partners to replace the initial production units with fresh units.
As a result, there will be a short delay in retail availability. The Ryzen 7 9700X and Ryzen 5 9600X processors will now go on sale on August 8th and the Ryzen 9 9950X and Ryzen 9 9900X processors will go on sale on August 15th. We pride ourselves in providing a high-quality experience for every Ryzen user, and we look forward to our fans having a great experience with the new Ryzen 9000 series.
Jack Huynh, AMD SVP and GM of Computing and Graphics
That, or they're calling Intel's bluff of benchmarking before the microcode update, and in either case they're upfront about the delay instead of clamming up for half a year while unrest mounts.
I'd wager it's not that easy to separate these two, especially when voltage is dynamically regulated.
If it's a Zeiss lens imperfection, a broken ASML machine, water or silicon impurity, etc, what matters to the customer is the final product you're buying.
The industry has always been kind of monolithic. I'm not sure you can count Qualcomm as a competitor just yet; competing means being in the same market. Laptop, desktop, and server have all been categories that haven't seen anything other than x86 for a long time.
Is this Intel's marketing department doing damage control?
Intel's been a disgrace, and that's why these chips suck. Imagine having all of the money and hiring everyone in sight and pulling all sorts of self-serving, dirty tricks industry-wide yet still letting an underdog on the verge of bankruptcy (AMD) beat you to 7 nm and eat your lunch.
Perhaps I'm in the minority here, but I've wasted a ton of time in math classes working through way too many academic exercises that have little real-world application. For example, learning a bunch of tricks to solve a differential equation by hand feels like a circus act. Sure it can be done, but only with a limited set of "textbook" equations. When you get into the real world, you'll need to put those equations into a solver like MATLAB, etc.
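The "hand it to a solver" workflow boils down to numerical integration. A minimal sketch, using a hand-rolled classic fourth-order Runge-Kutta step rather than MATLAB (the `rk4` helper is illustrative, not any particular library's API):

```python
import math

def rk4(f, y0, t0, t1, steps):
    """Classic fourth-order Runge-Kutta for dy/dt = f(t, y)."""
    h = (t1 - t0) / steps
    t, y = t0, y0
    for _ in range(steps):
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h * k1 / 2)
        k3 = f(t + h / 2, y + h * k2 / 2)
        k4 = f(t + h, y + h * k3)
        y += (h / 6) * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
    return y

# dy/dt = -2y with y(0) = 1 has the textbook solution y = exp(-2t),
# so we can check the numerical answer against the hand-derived one.
approx = rk4(lambda t, y: -2 * y, 1.0, 0.0, 1.0, 100)
exact = math.exp(-2.0)
print(abs(approx - exact))  # error well below 1e-6
```

The point being: the solver handles equations with no closed-form solution exactly the same way, which is where the hand tricks stop working.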
It would be nice IMHO to see a more hybrid approach at universities, teaching math and application at the same time. It's strange to send students through YEARS of math classes without strong application. It's like learning music theory without playing an instrument.
Our academic system in general is still modeled after old-school institutions, based on textbook-style learning that all pretty much follow the same recipe. Is it not crazy that we have classrooms in this day and age with 300 students sitting in desks listening to a single professor? It's insane.
We are ripe for an educational system that is truly disruptive - especially with the rise of AI systems.
This was my biggest gripe w/ academic math. Whenever I'd ask my teachers how these concepts are applied in the real world, I'd get a non-answer that showed me a) the teachers themselves have no clue and b) they're hoping you'll just shut up and follow the curriculum.
I agree that we are ripe for an educational system that is truly disruptive. Our current educators are so disconnected from the real world and have no idea how to apply what they teach.
Reminds me of the 3d TVs that were being forced on consumers. Nobody wanted them because you had to wear a set of goofy glasses.
The problem with VR and AR is that you can't sit around the living room and enjoy the same entertainment. You isolate yourself away from others in the room in a strange way.
Wearing headphones is somewhat like this in that others can't easily talk to you, but at least they can wave their hands to get your attention if needed.
> The problem with VR and AR is that you can't sit around the living room and enjoy the same entertainment. You isolate yourself away from others in the room in a strange way.
Speaking on this, I wish more games integrated the role of the God-mode / PC player like some do through things like dev consoles.
It creates instant entertainment for the outside players to be able to see and affect the game world. Some games are explicitly designed for PC vs VR while others, like Blade & Sorcery, give you the ability to drop in enemies for the VR player.
Anything that can keep a party lively and entertained while waiting their turn. I think I have more fun watching my friends scream & flail as I drop in more enemies than I do actually playing.
I have a love of electro-mechanical seven-segment displays. For scoreboards and timers, I feel like these are less intrusive than modern electronic display scoreboards.
There is something magical about seeing the score "flip" when a point is scored.
There's definitely a certain style to VFDs that I'm a little surprised hasn't been adopted by people looking for a retro/cyberpunk aesthetic. I was looking into them when it seemed my decades-old alarm clock might need a replacement, and they seem quite rare; I assume the economies aren't there in making something more complex and delicate. Interestingly, there are a few clocks repurposing cooker displays; I'd guess a manufacturer had a large stockpile of displays for models they discontinued early and liquidated them.
It's an 8-bit processor running at up to 20 MHz. Maybe an Apple II would be a fair comparison, though the 6502 likely had a more powerful instruction set, and the Apple II ran it at 1 MHz.
This chip:
8 MHz, 8-bit processing
1K × 14-bit EPROM
48 bytes SRAM
6 general-purpose I/O pins (GPIO), PB[5:0], with independent direction control
In comparison to the usual suspects like the Apollo Guidance Computer or the TI-83, it has a higher clock speed but a shorter word length and extremely limited RAM and program memory. That's precisely due to the cost engineering: this is meant to run simple, limited-size programs.
The 8051, first released in 1980, effectively created the microcontroller market. The original 8051 had 4 kB of ROM and 128 B of SRAM. It was always on the small side.
---------
We'd have to go earlier than 1980, but I'm not quite familiar with all of those computers.