Interesting; it wasn't possible to have HTTPS with a custom domain in the past, so I'm using Cloudflare with the flexible HTTPS and enforce-HTTPS settings.
Though in GitHub, "Enforce HTTPS" is greyed out and says: "Unavailable for your site because your domain is not properly configured to support HTTPS (lab.openbloc.fr)". Does anyone know what isn't actually "properly configured"? I have a CNAME file in the repo containing "lab.openbloc.fr".
The help says "If you're updating an existing custom domain, first remove and then re-add your custom domain to your GitHub account to trigger the process of enabling HTTPS." but the option is still disabled.
In your position I did these things:
- Removed the CNAME record
- Changed my A records to point to GitHub's HTTPS IPs
- Deleted the site from Cloudflare
- Purged my cache for the site in Chrome (Chrome appears to remember which cert your site used for a while)
- Profit
I had to remove the CNAME file, push, add it back, and push again to trigger the check (you can also do this with the domain textbox on your settings page).
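To make the remove-and-re-add dance concrete, here's a sketch of the git steps. It runs against a throwaway local repo with a bare directory standing in for the GitHub remote, so the commands are self-contained; in practice you'd run just the rm/add/commit/push steps in your actual Pages repo (the domain shown is the one from the comment above — substitute your own).

```shell
# Self-contained demo: local clone + bare "remote" instead of GitHub.
set -e
remote=$(mktemp -d) && git init -q --bare "$remote"
work=$(mktemp -d) && git clone -q "$remote" "$work" && cd "$work"
git config user.email "you@example.com" && git config user.name "you"

echo "lab.openbloc.fr" > CNAME                  # the custom-domain file
git add CNAME && git commit -qm "Add custom domain"
git push -q origin HEAD

git rm -q CNAME                                 # 1. remove it and push...
git commit -qm "Remove CNAME"
git push -q origin HEAD

echo "lab.openbloc.fr" > CNAME                  # 2. ...then re-add and push,
git add CNAME && git commit -qm "Re-add CNAME"  #    re-triggering the check
git push -q origin HEAD
```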
I'm hosting my webpage on GitHub Pages, with a custom domain. Back when I set it up, Let's Encrypt would not work with GitHub Pages so instead I simply put my page on Cloudflare. Should I spend the time and set it up again but with Let's Encrypt? I've heard people say bad stuff about Cloudflare, but dismissed it since my page is really just a blog.
It doesn't really matter for a blog without auth, but when you use Cloudflare's HTTPS, the connection between CF and GitHub is not encrypted, so it's not end-to-end encrypted.
I recommend simply migrating even if you keep the DNS itself on cloudflare. It's a good exercise if nothing else.
It can be encrypted if you're using Full SSL on Cloudflare[1], but it's not authenticated, meaning anyone actively MITMing the connection between CF and GH could easily read and change the traffic. That said, it's not any script-kiddie who can MITM a connection between two DCs, so I think it's hardly a grave threat.
I think the only real gain is not allowing CF itself to see who is accessing your blog.
Ah, ok. But how so? You can get an LE cert as long as you can serve a file at the correct URL, or set a certain DNS record. I don't see why proxying would prevent that.
Oh, of course. I was thinking of Let's Encrypt's DNS-based authentication since that's the only thing I use nowadays (though of course Github isn't using that). Ignore me.
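For reference, the file-at-a-URL method mentioned above is the HTTP-01 challenge. Here's a self-contained simulation of it; the token, port, and key-authorization string are made up for the demo, and in a real validation it's Let's Encrypt fetching the file over the public internet rather than curl on localhost:

```shell
# HTTP-01 in miniature: serve a token file at the well-known path, fetch it.
set -e
mkdir -p webroot/.well-known/acme-challenge
echo "demo-key-authorization" > webroot/.well-known/acme-challenge/demo-token

python3 -m http.server 8089 --directory webroot >/dev/null 2>&1 &
srv=$!
sleep 1

# This is the request the CA's validation server would make:
got=$(curl -s http://127.0.0.1:8089/.well-known/acme-challenge/demo-token)
kill "$srv"
echo "$got"

# The DNS-01 alternative checks a TXT record instead:
#   dig +short TXT _acme-challenge.example.com
```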
Open up the cert you're using and see what other sites are listed on it. It probably includes a lot of sites that aren't yours. GitHub + Let's Encrypt gives you a clean cert for just your hostnames.
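You can check this from the command line. The commented one-liner is what you'd run against your live site; the self-signed cert below just makes the demo self-contained, with made-up hostnames standing in for the unrelated sites a shared cert carries (it assumes OpenSSL 1.1.1+ for -addext/-ext):

```shell
# Against a live site, list the hostnames on the served cert:
#   echo | openssl s_client -connect yourdomain.com:443 -servername yourdomain.com 2>/dev/null \
#     | openssl x509 -noout -ext subjectAltName

# Self-contained demo: one cert covering several unrelated names, as a
# shared Cloudflare cert does.
dir=$(mktemp -d)
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -keyout "$dir/key.pem" -out "$dir/cert.pem" \
  -subj "/CN=shared.example" \
  -addext "subjectAltName=DNS:yourblog.example,DNS:totally-unrelated.example"
openssl x509 -in "$dir/cert.pem" -noout -ext subjectAltName
```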
Unless you personally have something against Cloudflare I'd say there's really no reason to bother changing anything.
But yeah, if you point your domain directly at GitHub Pages instead of at Cloudflare then GitHub should just issue you a cert and start using it automatically.
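For the record, pointing the apex directly at GitHub Pages means A records like these — the four IPs GitHub documented for Pages at the time; check their current docs before copying:

```
example.com.  3600  IN  A  185.199.108.153
example.com.  3600  IN  A  185.199.109.153
example.com.  3600  IN  A  185.199.110.153
example.com.  3600  IN  A  185.199.111.153
```

plus a www CNAME record pointing at your username.github.io hostname.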
An interesting alternative is Fly.io: you can set everything up to go through them with your GitHub Pages site as the backend, and they'll spin up LE certs on the fly for it.
The only thing that annoys me about GitHub Pages or Netlify is the lack of traffic analytics (number of visits, location...). There isn't a single metric we can get access to. Cloudflare gives minimal but useful analytics.
First, I would prefer not to use third-party analytics that track user behavior and download/upload things to their servers while users are browsing. Especially to Google's servers.
All the info I need is already in the HTTP server logs. It's a shame we can't get access to analytics on them.
Does anyone have this working for both the www subdomain and the root domain? I have the www subdomain working fine, but I get an invalid cert error when attempting to access the root domain over HTTPS. The cert it's attempting to use is for www.github.com.
I'm having problems with it too. I plugged in the new IP addresses for my root A records yesterday, and then deleted and re-added the CNAME file in the repository. GitHub support said they manually started the certificate provisioning process several hours ago, but I'm still getting certificate errors.
You may not have had a certificate issued for your domain yet. For me, changing the CNAME made GitHub go and request a new certificate.
I was having trouble with that yesterday, as well. On the settings page for my repository, I switched from www.example.com to example.com and then back again to www.example.com. I noticed it is now working for my site, though I don't know for sure that my changes fixed it.
I just recently decided to move my GitHub Pages sites with custom domains to Netlify (https://www.netlify.com/), since it offered free out-of-the-box SSL (Let's Encrypt).
Kinda wish I waited, but it's good that they finally support it.
Are you asking whether GitHub is offering their agent software for others to use? Or just whether, in general, there are agents which can make use of Let's Encrypt?
The answer to the latter is yes, of course: a wide variety of suitable software exists, including shell scripts like https://acme.sh/, the "official" Certbot client written in Python, and software like Caddy, which implements this feature right inside the web server.
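For a sense of what an issuance looks like with two of the clients mentioned, here are rough invocations; the domain and webroot path are placeholders, and both commands assume a web server is already answering plain HTTP for that domain so the challenge file can be served:

```shell
# acme.sh, webroot mode: drops the challenge file into the site's docroot
acme.sh --issue -d example.com -w /var/www/example

# Certbot equivalent
certbot certonly --webroot -w /var/www/example -d example.com
```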
I've tested it, and it works awesomely. But does this mean that anyone who gets temporary access to your DNS can issue certificates and MITM your traffic? It shouldn't be this way.
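One partial mitigation is a CAA record, which tells CAs which issuers are allowed to sign for your domain, e.g.:

```
example.com.  3600  IN  CAA  0 issue "letsencrypt.org"
```

It only narrows the attack, though: someone who fully controls your DNS can change the CAA record too, so watching Certificate Transparency logs is the complement for actually detecting rogue issuance.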