> "Lose/Lose is a video-game with real life consequences. Each alien in the game is created based on a random file on the players computer. If the player kills the alien, the file it is based on is deleted. If the players ship is destroyed, the application itself is deleted."
That's awesome! Reminds me of a little virtual Tamagotchi I programmed in college called Virus.
Virus represented itself as a single file, called for example Toby.virus, and it would wander around in folders (only 2 levels deep). You could give him ASCII files with a ".food" extension containing strings, and depending on the size of the file the virus would get full.
If too full, it would leave ".puke" files with some of the strings from the food file. And otherwise it would randomly leave ".poop" files around that you'd have to clean up to keep Virus healthy.
You could see its stats in the ".virus" file.
I even implemented mating and light genetic traits that could be passed down.
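The feeding mechanic described above could be sketched roughly like this in Python (the class name, thresholds, and file layout here are all made up for illustration; the original was the commenter's own project):

```python
import os

FULL_THRESHOLD = 1000   # hypothetical: bytes of .food before Virus is stuffed

class VirusPet:
    """Toy sketch of the 'Virus' pet's feeding loop."""

    def __init__(self, name):
        self.name = name
        self.fullness = 0
        self.droppings = []  # .puke/.poop files left behind

    def eat(self, food_path):
        """Consume a .food file; fullness grows with the file's size."""
        size = os.path.getsize(food_path)
        with open(food_path) as f:
            strings = f.read().split()
        os.remove(food_path)
        self.fullness += size
        if self.fullness > FULL_THRESHOLD:
            # Overfed: leave a .puke file echoing some of the food's strings
            puke = self.name + ".puke"
            with open(puke, "w") as f:
                f.write(" ".join(strings[:5]))
            self.droppings.append(puke)
            self.fullness = FULL_THRESHOLD
```

Feeding it an oversized `.food` file consumes the file and leaves a `.puke` dropping behind, which the player then has to clean up.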
I actually have a similar idea, at least the Tamagotchi part of it. Now that I've read your comment I feel more motivated to actually flesh it out and implement it, lol.
> "A significant problem with the current implementation of PSDoom is that monsters are much more likely to attack each other than expected. This causes many windows to mysteriously disappear as the program runs"
I love this. In the future, I imagine complex computer systems will likely be represented in forms humans find engaging, rather than requiring humans to learn the intricacies of computers; this would open up the field to a lot more people (analogous to the no-code movement… and it would suffer similar pitfalls).
I think good interfaces wrap these intricacies in powerful metaphors that build a bridge between the human and the computer, and that it is right and proper that the human meets the computer more than halfway. I think things like desktops and video games as admin tools* are poor user interfaces because they present poor metaphors for how work is done or how concepts are arranged, instead of good metaphors for computation.
This creates brittle interfaces that are difficult to learn and troubleshoot, because the metaphor hides how the work is actually being done.
I think the terminal is an antiquated interface that needs to be modernized, but there's a good reason it's still used. Bash is a pretty reasonable metaphor for computation. When things go wrong with your computer, you're better able to reach into its guts, hear the hum of its engine, and figure out what's wrong. As I said, it's antiquated and hostile to newcomers, so ultimately it isn't that good an interface. But there is no ceiling on how well you can master it or what you can do with it.
Contrast that with a desktop interface, where if things go wrong, your only option is to muck about in a settings application. If your needs have been correctly anticipated, this works like butter, with very little friction. If your needs have been incorrectly anticipated, it works okay, but is probably frustrating. If your needs have not been anticipated at all, you may be able to find a workaround, but only if you have a high degree of mastery in the interface. And you may be out of luck.
Additionally, achieving this high degree of mastery is difficult and not generally worthwhile, because these interfaces are subject to frequent changes. You don't accrue more and more skill over time throughout your career, unless you happen to work in this industry or are a motivated power user. And as you upgrade your OS, things are moved around and tweaked, and your mastery degrades.
I realized this was the case when my hands were injured for 10 months from typing, and I could only compute on my phone. I couldn't make anything; I could only browse premade apps and pray my needs had been anticipated. I couldn't fix anything; my only path to getting useful debugging information involved a computer. I was frustrated and felt like I didn't understand what my phone was doing or why it wasn't working. And I realized that is how most people feel about their computers.
* I do realize these are fun toys/proofs of concept, and I totally appreciate them on that level.
This seems like pretty hand-wavy thinking. Decision making can be thought of as a directed graph, where you have various situations as nodes and actions as edges.
That does indeed mirror how games work, however for all of our systems we do not perfectly know the actual "true" state of the graph. Even the oldest, simplest, most well understood systems are not understood perfectly. The fog of war is ever present.
To present systems management without the "intricacies of computers" you either have to not care where the actions lead you, or have a system that is perfectly understood. And if the latter is the case, then there is literally 0 value in a human pressing the buttons.
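The graph framing above can be made concrete with a toy model: the operator plans over the transition graph they *believe* in, but the fog of war means the true graph can differ, so the same action sequence lands somewhere unexpected. Everything here (state names, transitions) is invented purely for illustration:

```python
# The "true" system: situations as nodes, actions as labeled edges.
true_graph = {
    "healthy":  {"deploy": "degraded", "scale_up": "healthy"},
    "degraded": {"rollback": "healthy", "deploy": "down"},
    "down":     {"rollback": "degraded"},
}

# What the operator believes — incomplete and partly wrong (fog of war).
believed_graph = {
    "healthy":  {"deploy": "healthy", "scale_up": "healthy"},
    "degraded": {"rollback": "healthy"},
}

def walk(graph, start, actions):
    """Follow a sequence of actions through a transition graph;
    unknown actions leave the state unchanged."""
    state = start
    for action in actions:
        state = graph.get(state, {}).get(action, state)
    return state

# The operator's model says two deploys are harmless...
assert walk(believed_graph, "healthy", ["deploy", "deploy"]) == "healthy"
# ...but on the real graph the second deploy takes the system down.
assert walk(true_graph, "healthy", ["deploy", "deploy"]) == "down"
```

The gap between the two graphs is exactly why "just press the buttons" interfaces only work when the system model is trusted.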
> To present systems management without the "intricacies of computers" you either have to not care where the actions lead you, or have a system that is perfectly understood.
This is a false dichotomy—why would there be no middle ground? It's in that middle ground that all useful abstractions reside, and many of them are effectively lossless compressions of some "computational intricacies," which makes it possible to non-misleadingly interact through some simplified interface.
This isn't to say that forming such a simplified interface is impossible to get wrong, but to say the results will necessarily be poor is overly dismissive.
> this would open up the field to a lot more people
Maybe... but I don't think it can be done in the exact format presented in the article, though.
"Hey we've a new version to deploy, can you punch that 'default:service:hello-world' chicken until it ... turns into smoke? You can keep the egg ;)"
A good UI needs to be efficient to operate while being easy enough to form a mental model (abstraction) around. So, basically, not a Minecraft server. Nice try. Now get back to work.
This was always my theory about why Microsoft bought Minecraft in the first place, to train the workforce of the future and own the tools that would become standard. Who doesn't want to do data mining with a pickaxe?
Strong agree. The spreadsheet metaphor makes complex computing tasks approachable to a very wide population of people and we've all benefitted from it.
It's also not all roses. From a maintenance, collaboration, and correctness standpoint there are many footguns that have caused massive pain. I've yet to see other no-code solutions trying to learn from these lessons.
Excel is honestly more like the other MS suite products. Most of the time it is used as such.
Advanced users can make pretty customized stuff happen in Excel with VBA and macros - but then it's no longer no-code. It's similar to how Python has disseminated to a degree into various white-collar professions, but still only for a small percentage of users.
I'm reminded of the scene in The Matrix where the controllers of the gates of Zion are in a simulation managing the processes. [0] Instead of managing things outside the matrix via a terminal/monitor, they're given a virtual environment with their controls. Your comment makes sense to me!
I always thought Zookeeper was a pretty good analogy for sysadmin - you have a bunch of different complex beasts with different needs, and some of them can co-exist in the same enclosure but some can't, and so on.
I love this on so many levels... going back 5-10 years (or at least pre-cloud), when you'd have a bunch of long-lived servers, each named, each with its own quirks, etc., and they would begin to have their own personality.
Then the 'cloud' came along and blew all that away with autoscaling and completely non-persistent machines... this _really_ captures it in my eyes, where we're now managing large farms of disposable infrastructure. It shows the idea of culling large amounts of it and scaling, but still brings back the 'personal' feel that the servers are living things with feelings (。◕‿◕。).
Also, throwJan22's creeper idea for a chaos monkey implementation is fantastic (and I guess not a particularly hard thing to implement at this point :P )
If you could break the fences between the namespaces and just blow open any network policies between them, that would be great :D
Minecraft is an intrinsically multiplayer game (singleplayer mode just spins up a local server and connects to it), so yes, it would.
Regarding the plugin, there might be a need to detect which user kills the chickens for auditing purposes, but the basic idea should work fine, yeah.
It is an original game (twin stick shmup) made in Unity that is able to connect to a kubernetes cluster. You 'enter' into your nodes and the pods running on those nodes appear in that 'world' where you are able to shoot and destroy them. Replica Sets will of course re-create them, so they respawn and float around again.
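The respawn behavior maps directly onto ReplicaSet reconciliation: shooting a pod is just a pod deletion, and the controller recreates pods until the actual count matches the desired count. A toy simulation of that control loop (pure Python, no real cluster; names are invented):

```python
import itertools

class ToyReplicaSet:
    """Minimal model of a ReplicaSet reconciling pods to spec.replicas."""

    def __init__(self, name, replicas):
        self.name = name
        self.replicas = replicas          # desired count (spec.replicas)
        self._suffix = itertools.count()  # generates fresh pod name suffixes
        self.pods = set()
        self.reconcile()

    def reconcile(self):
        """Create or delete pods until the actual count matches desired."""
        while len(self.pods) < self.replicas:
            self.pods.add(f"{self.name}-{next(self._suffix)}")
        while len(self.pods) > self.replicas:
            self.pods.pop()

    def shoot(self, pod):
        """The in-game kill: a plain pod deletion; desired count unchanged."""
        self.pods.discard(pod)

rs = ToyReplicaSet("hello-world", replicas=3)
victim = next(iter(rs.pods))
rs.shoot(victim)    # pod gone for a moment...
rs.reconcile()      # ...the controller brings a replacement back, new name
```

This is why destroyed pods "float around again" in the game: the deletion never touches the desired replica count, so the next reconcile pass restores it.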
Ha, this is great. I just spent the holiday break launching a kubernetes cluster in my home lab, on which I’ve deployed a Minecraft server to play with my friends. I guess with this I could manage my Minecraft kubernetes deployment WITHIN the world itself. One creeper and the whole server comes crumbling down. It would be a very risky game.
20 years ago I pitched something like this, but based on network traffic (shaping with routers, switches & fws) managed in a 3D world.
My presentation skills suck, so it never got off the ground but I still visualise network traffic in my head that way. Would still love to roam around in networks and enjoy the graphics.
It would actually be fun to work from inside the Minecraft client all day: get bored and you can take a quick break to go build something, spend the rest of the day at something like a ComputerCraft terminal, make notes in-game. Sounds really fun in my head, but I'd probably get bored really quickly.
When they killed a pig and it killed the pod, how was that accomplished in Kubernetes? It did not change the number of replicas for hello-world. I'm guessing the server called into the pod and gave it an exit command/signal.