A modern successor to Gopher, with some extra features to make it more useful in the modern day.
Gopher was a competitor to the early web. It drew a hard distinction between content pages and link (menu) pages, which made it less flexible than the HTML-based web. I'm fairly sure this is one of the things Gemini fixes.
Gemini exists because some people are sick of the modern web.
In the (idealised) olden days, the web was a place people posted content. An amateur could make a geocities that showed people their interests, an academic could have a collection of pages that acted as notes for their lectures, or a company could advertise the products that they sold.
In the (dystopianised) modern days, the web is a giant network of interlinked computer programs, none of which can be trusted, but most of which offer at least some attractive distraction. Their primary purpose is to build a small number of competing databases about you, to maximise the amount of money that can be extracted from you while minimising the amount of value returned to you. The providers of these programs take particular steps that should cause rational people to distrust them (e.g. hiding the "Save my choices" button to discourage you from doing what you want, and what they are obliged to permit you to do), but healthy people can only tolerate so much distrust in their day-to-day lives before they become exhausted.
Gemini starts from Gopher and extends it into something a little more like the former. It acts as a kind of safe space. It rests on a similar principle to the black-and-white mode on many modern phones: make the thing less attractive so you are less tempted to overuse it. Although Gemini does support form submission and CGI, the primary form of interaction, as far as I know, is gemlogs.
(I tried using Gemini last year when it was mentioned here. But the content I want, e.g. programming-language API documentation, is not on Gemini, and I think the Hacker News proxy is read-only, so I gradually forgot about it. I think Gemini is perhaps a little too far in the other direction.)
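For the curious, the protocol itself is tiny. A sketch from memory of the Gemini spec (the function names here are mine, not from any library): a request is a single URL plus CRLF sent over TLS, and a response begins with a two-digit status and a meta field.

```python
def gemini_request(url: str) -> bytes:
    """Encode a Gemini request: the URL itself, CRLF-terminated.

    This is the entire request format; there are no headers.
    """
    return (url + "\r\n").encode("utf-8")


def parse_response_header(header: bytes) -> tuple[int, str]:
    """Split a Gemini response header into (status, meta).

    meta is a MIME type for 2x responses, a redirect URL for 3x,
    an input prompt for 1x, an error message for 4x/5x.
    """
    line = header.decode("utf-8").rstrip("\r\n")
    status, _, meta = line.partition(" ")
    return int(status), meta


# A successful page fetch announces its MIME type, typically text/gemini.
status, meta = parse_response_header(b"20 text/gemini\r\n")
```

The whole exchange is one request line and one response; that minimalism is the point.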
> the web is a giant network of interlinked computer programs, none of which can be trusted
Agreed, but note that Gemini doesn't fully address the problem of surveillance either.
All the metadata still leak: IP addresses, DNS queries, the FQDN in the TLS session opening (SNI). Timing attacks also remain possible.
Furthermore, there's nothing that Gemini can do to prevent unofficial extensions, e.g. browsers detecting and loading HTML/CSS/javascript found on a Gemini page.
> All metadata are still leaked: IP addresses, DNS queries, FQDNs in the TLS session opening. Also timing attacks.
That's a problem with the current iteration of the internet in general, not with Gemini specifically. If you run a Gemini-based hidden service, those leaks go away.
> Furthermore, there's nothing that Gemini can do to prevent unofficial extensions, e.g. browsers detecting and loading HTML/CSS/javascript found on a Gemini page.
That's for the client to take care of. Clients that aren't built on web technologies are unlikely to execute web content by accident.
I recently got started with photography (A7R M2) and the Tamron 28-70 f/2.8 is my first zoom. I have a 28mm f/2.8, a 40mm f/2.8 and an 85mm f/2.8, but the Tamron is the one that is almost always mounted (or the Sony 28mm). It's truly amazing.
I run a shared hosting business, mostly WordPress, of course. Most people have their first contact with all this when their site is hacked (plugins not updated for years) and they have to clean up. Then we need to babysit them, but to a degree that is what we are paid for, and for good reason.
The problem is that most WordPress plugins are abandoned. Most sites default to auto-updating plugins, but if the plugin author isn't pushing security patches, that's a big vulnerability.
WordPress plugins are notoriously bad at input sanitization. Even many large, commercially supported plugins get abandoned or simply aren't secure.
> Well, Wordpress can know auto-update the plugins...(sic)
And your site (or some parts of it) could break at any point in time without your knowledge and remain broken for a long time until you notice. There is no platform that doesn't require "babysitting" in the form of maintenance and testing.
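For what it's worth, the update chore can at least be scripted. A sketch using WP-CLI (assumes WP-CLI 2.5+ installed and commands run from the site root; none of this helps with abandoned plugins, which is the point above):

```shell
# List plugins that currently have an update available.
wp plugin list --update=available

# Update all plugins in one go (e.g. from a nightly cron job).
wp plugin update --all

# Or opt every plugin into WordPress's built-in auto-updates.
wp plugin auto-updates enable --all
```

Pairing the update step with an automated smoke test of the site is what actually reduces the babysitting.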
I see this and suddenly it clicks! That is exactly why I couldn't import the SQL dump I had been trying to import for days, which I had filtered with grep. Wow.
And I was wondering the whole time why mysql reported the strange error "SQL error in Binary file" when the .sql file was clearly a text file...
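What likely happened is easy to reproduce. A mysqldump can contain raw binary blob data, and GNU grep replaces its output for files it classifies as binary with a one-line "Binary file ... matches" notice, which then lands inside the filtered dump and trips up mysql. A sketch (the file name is made up; depending on the grep version the notice goes to stdout or stderr):

```shell
# Stand-in for a mysqldump with blob data: a NUL byte makes grep
# classify the whole file as binary.
printf 'INSERT INTO t VALUES (1);\nblob:\000:data\n' > dump.sql

# Plain grep emits only the "binary file matches" notice instead of
# the matching SQL lines -- that notice is what ends up in a
# grep-filtered dump.
grep 'INSERT' dump.sql 2>&1

# -a (short for --binary-files=text) forces text mode, so the
# matching lines come through intact.
grep -a 'INSERT' dump.sql
```

So filtering a dump with `grep -a` (or `grep --binary-files=text`) keeps it importable.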
But... how do you configure the hosts your containers run on? How do you configure your storage (NAS/SAN)? How do you configure your routers and switches? ...
The original question didn't have much context, and I guess my answer assumed someone would be using a cloud provider as opposed to anything on premise.
Are Ansible/Puppet/Chef any good for managing the hardware you mentioned?
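Ansible, at least, does target that kind of hardware with agentless network modules. A hedged sketch (the module and connection names are real, from the `cisco.ios` and `ansible.netcommon` collections on Ansible Galaxy; the host group and VLAN number are made up for illustration):

```yaml
- hosts: access_switches
  gather_facts: false
  # network_cli drives the device over SSH, no agent needed.
  connection: ansible.netcommon.network_cli
  tasks:
    - name: Ensure the management VLAN exists
      cisco.ios.ios_config:
        parents: vlan 42
        lines:
          - name MGMT
```

Puppet and Chef are more host-centric; for switches and routers, Ansible (or vendor tools) is the more common fit.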
Funny thing about Docker container security, a bug that has not been fixed for ages: a custom AppArmor profile is only applied on the first container start, not on any later restart.
Yes, the container runs in the "unconfined" profile after a restart.
That’s disingenuous. In that issue, the maintainers clearly explain that running your container as privileged is supposed to disable all AppArmor confinement. The bug is that the custom AppArmor profile is sometimes applied when it should never be. This is not a security issue in any way, since the container is already privileged.
But in a privileged container you could still take away capabilities and/or permissions with an apparmor profile. Sometimes that happens, sometimes it does not. And when it does not, you have no way of knowing.
> But in a privileged container you could still take away capabilities and/or permissions with an apparmor profile.
Right, what you want is “privileged except for XYZ”, which is not supported by Docker. That’s a missing feature, which is not the same as a bug. Calling it a security bug is even more misleading.
> Sometimes that happens, sometimes it does not. And when it does not, you have no way of knowing.
Right, it should fail every time. That is a bug. But it’s not a security bug, and fixing that bug won’t give you the feature you want; it will just make it clearer that the feature is not supported.
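The mismatch being discussed can be observed directly, since Docker's view and the kernel's view can be compared. A sketch (assumes Docker on an AppArmor-enabled host and a profile named `my-profile` already loaded with `apparmor_parser`; the names are made up):

```shell
# Start a container under a custom AppArmor profile.
docker run -d --name demo \
  --security-opt apparmor=my-profile alpine sleep 1d

# What Docker believes is applied...
docker inspect --format '{{ .AppArmorProfile }}' demo

# ...versus what the kernel actually enforces for the container's PID 1.
docker exec demo cat /proc/1/attr/current

# Restart and ask the kernel again; per the bug report above, the
# answers can disagree after a restart.
docker restart demo
docker exec demo cat /proc/1/attr/current
```

Comparing `/proc/1/attr/current` before and after the restart is the reliable check; `docker inspect` only reports intent.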
I own a web hosting provider. We offer Let's Encrypt with automatic issuance and renewal, securing 184,961 hostnames (SANs) at this moment.
We issue certificates automatically if none exists when a client connects to a website, and we renew certificates in batches 30 days before they expire. When renewing, we merge certificates/hostnames into bigger certificates of 90 hostnames each, so we don't have so many moving parts.
If renewal were to break, however (as it did once or twice before), nothing bad would happen, because a new certificate would be issued on page load.
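The batching described above is simple to sketch. Let's Encrypt caps certificates at 100 SANs, so grouping hostnames 90 at a time leaves headroom; the function below is an illustration, not the provider's actual code:

```python
def batch_hostnames(hostnames: list[str], size: int = 90) -> list[list[str]]:
    """Group hostnames into certificate-sized batches of `size` SANs."""
    return [hostnames[i:i + size] for i in range(0, len(hostnames), size)]


# 200 hostnames end up on three certificates: 90 + 90 + 20 SANs.
batches = batch_hostnames([f"site{i}.example" for i in range(200)])
```

One renewal job per batch then replaces many per-hostname jobs, which is the "fewer moving parts" trade-off.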
Keep in mind, he might just be saying what he thinks the public (and/or his wife!) wants to hear, but I tend to agree with you. I don't even leave the house unless I get reeeeally restless (which is rare now that I'm a computer nerd), or run out of supplies, or have to go to that pesky "job" thing I seem to have gotten myself involved with.