My 40 Mbps connection can pull down all 20 GB of enwiki-latest-pages-articles-multistream.xml.bz2 in about an hour, so it shouldn't take all night. Unless you also want to gently rsync all the images from rsync://ftpmirror.your.org/wikimedia-images/, which will take rather more than a little longer.
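That arithmetic, as a quick Python sanity check, assuming a steady 40 Mbps link and ignoring protocol overhead:

```python
# Rough transfer-time estimate: decimal GB over a fixed-rate link.
def hours_to_download(size_gb: float, link_mbps: float = 40.0) -> float:
    size_megabits = size_gb * 1000 * 8        # GB -> megabits (decimal units)
    seconds = size_megabits / link_mbps       # megabits / (megabits per second)
    return seconds / 3600

print(f"20 GB dump: {hours_to_download(20):.1f} h")   # ~1.1 h
print(f"96 GB ZIM:  {hours_to_download(96):.1f} h")   # ~5.3 h
```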
Or, if you'd rather simplify things with a pre-built archive and reader app from, say, Kiwix, just grab all 96 GB of wikipedia_en_all_maxi_2023-09.zim in ~5 hours.
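If you'd rather script that fetch than babysit a browser tab, here's a minimal resumable-download sketch. The URL is my assumption about how download.kiwix.org lays out its ZIM files, so check it before letting this run for five hours:

```python
# Resumable streaming download of a large file, picking up where a
# previous partial download left off via an HTTP Range request.
import os
import requests

URL = "https://download.kiwix.org/zim/wikipedia/wikipedia_en_all_maxi_2023-09.zim"  # assumed path
DEST = "wikipedia_en_all_maxi_2023-09.zim"

def fetch(url: str, dest: str, chunk: int = 1 << 20) -> None:
    done = os.path.getsize(dest) if os.path.exists(dest) else 0
    headers = {"Range": f"bytes={done}-"} if done else {}
    with requests.get(url, headers=headers, stream=True, timeout=60) as r:
        r.raise_for_status()
        # 206 means the server honored the Range header; otherwise start over.
        mode = "ab" if done and r.status_code == 206 else "wb"
        with open(dest, mode) as f:
            for part in r.iter_content(chunk_size=chunk):
                f.write(part)

fetch(URL, DEST)
```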
Really, what will suffer is your own sleep, as you figure out how to read and organize and serve and auto-update the content...
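For the auto-update part, one low-effort idea is to poll the dump's Last-Modified header and only re-fetch when it changes. A sketch, assuming the usual dumps.wikimedia.org "latest" layout and a made-up local state file:

```python
# Check whether the upstream dump is newer than the one we last recorded.
import json
import os
import requests

DUMP_URL = ("https://dumps.wikimedia.org/enwiki/latest/"
            "enwiki-latest-pages-articles-multistream.xml.bz2")
STATE_FILE = "dump_state.json"   # hypothetical local bookkeeping file

def dump_has_changed() -> bool:
    remote = requests.head(DUMP_URL, allow_redirects=True, timeout=30)
    remote.raise_for_status()
    last_modified = remote.headers.get("Last-Modified", "")

    seen = ""
    if os.path.exists(STATE_FILE):
        with open(STATE_FILE) as f:
            seen = json.load(f).get("last_modified", "")

    if last_modified and last_modified != seen:
        with open(STATE_FILE, "w") as f:
            json.dump({"last_modified": last_modified}, f)
        return True
    return False

if __name__ == "__main__":
    print("new dump available" if dump_has_changed() else "still current")
```

Serving is the easier half if you went the Kiwix route, since kiwix-serve will host the ZIM as-is; wrangling the raw XML dump into something readable is where the sleep really goes.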