You should have added this to your pitch, above. I think your pricing is fantastic, but mentioning that you are willing to go even further for people who really need the help would have probably garnered a lot of goodwill around here.
That's it, I am done. No more serious IT work for me.
I am going to become a gamers' consultant. If there are people who care enough to make things like this, there must be people who would pay decent money for "professionals" to come in and optimize their Factorio game, or their Minecraft builds, or to make them feel good about their solution to a Zachtronics puzzle.
You joke, but Factorio is a sufficiently "crunchy" logistics game that it has prompted me to study a number of serious topics, from learning linear programming to determine optimal production ratios, to dipping my toe into learning about the broader subject of systems dynamics, with stocks, flows, feedbacks, and so forth. I maintain that it's an excellent educational tool for these sorts of subjects.
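To make that concrete, here's a minimal sketch of the linear-programming framing in Python with scipy; the assembler rates and the target are illustrative numbers made up for the example, not exact game data:

    # Minimal sketch: "how many assemblers of each kind do I need" as an LP.
    # Assumed (illustrative) rates per assembler:
    #   wire assembler:    makes 4 cable/s, eats 2 plate/s
    #   circuit assembler: makes 2 circuits/s, eats 6 cable/s
    from scipy.optimize import linprog

    target = 10  # desired circuits per second

    c = [1, 1]  # minimize total assembler count: x0 = wire, x1 = circuit
    A_ub = [
        [-4, 6],   # cable balance: 4*x0 - 6*x1 >= 0    ->  -4*x0 + 6*x1 <= 0
        [0, -2],   # output:        2*x1       >= target ->  -2*x1       <= -target
    ]
    b_ub = [0, -target]

    res = linprog(c, A_ub=A_ub, b_ub=b_ub)  # variables are >= 0 by default
    print(res.x)  # ~[7.5, 5.0]: 7.5 wire assemblers feeding 5 circuit assemblers

The real fun starts when you generate those constraint rows from the whole recipe graph instead of writing them by hand, but the shape of the problem stays the same.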
No doubt, gaming support has been a serious business for a while. Far beyond the gold farming, item trade, and "pay me for voice chat" underground.
I'd bet that almost everybody reading this who cares about cloud providers has come into contact with companies that started out by providing servers and administration services for Counter-Strike or Minecraft.
I’m guessing you’re mostly not serious, though I acknowledge there may be a market for what you describe.
As a lifelong gamer since the Commodore 64 (though since becoming a parent I've scaled it way back to mostly crewing a tank in Hell Let Loose with two friends, or a TTRPG with the same people), looking back it feels like I took refuge in a medium (single-player games) that is ultimately less valuable than stories in other formats, like books, film, and radio plays. What I'm driving at is a feeling that I've enjoyed elevated layers of luxury that may be an indicator of too much idle time for too many people in US culture. Is this a common mindshift with age?
Escape has a place, but a whole industry built around spending so much time in abstraction layers feels like a problem if it’s not helping us relate to each other better.
Video games are just a younger medium than books and movies. Even TV needed more time before it was taken as seriously as film as an art form.
We're getting there, though. The past 15 years have given us many gaming masterpieces that can absolutely be considered art, and they're becoming more common. As that happens, the medium gets taken more seriously.
Many games still suffer from feature creep by way of trying to please users demanding "content". This incidentally makes them feel like their primary goal is to waste time rather than to deliver a high-quality story. In other words, there is still too much "Big Bang Theory" and not enough "Breaking Bad", which is why it still feels less valuable.
If I told you I just binged The Queen's Gambit, you wouldn't look at me like I wasted my time; you'd ask me about the story. If I told you I just binged TBBT, you'd just think I'm nuts. (No offense if you like TBBT, it was a decent sitcom for a couple of seasons, but … hey, you do you).
For example, I just finished Control. A wonderful, high quality game. Great storytelling, great mechanics, and … a crapton of completely useless busywork that is just there because it's expected to be there. A talent tree, repeat random side-missions, a crafting system, etc. None of this is useful to the core game. It's a shame because I believe if it had done less in terms of busywork mechanics and maybe focused a tiny bit more on puzzles, that game could have been a masterpiece of the same caliber as Portal and Portal 2. (Can you imagine if Portal 2 had a talent tree and a crafting system?)
Instead, it's a Season 7 of Game of Thrones. Great production value, but so hit and miss in terms of final quality because of avoidable mistakes. Could have been so good, you know?
Thanks for reminding me about Portal, and for the TV analogy. It might have been a Gamasutra article that made the point about games also being works of art in a young medium (tabletop games, too, have both come a long way and have room to grow). We certainly have a plethora of options for how we spend our time alive.
That's why you don't give out your channel name, but your web site. And from there you can link people to whatever you want them to see without any worries about where it is hosted.
> It would be cool with a completely offline SpeechRecognition...
Nuance still makes Dragon and Winscribe-branded speech recognition software. Dragon was great 20+ years ago. Not sure how it is now, but investigating offline speech recognition is on my agenda because Microsoft nerfed theirs to push people to their online service.
That's insane. What's even more insane is that a bit over 20 years later home computers reached that frequency. And in the next decade they reached over 100 MHz. Pure lunacy.
> That's insane. What's even more insane is that a bit over 20 years later home computers reached that frequency. And in the next decade they reached over 100 MHz.
The CDCs still had a good run. The 6600 was originally released in 1964. It started at 10 MHz, but that was with 60-bit words and dedicated floating-point hardware. IIRC floating-point multiplies took only one clock cycle; if that's correct, a multiply took 100 nanoseconds.
The Apple II came out in 1977: 1 MHz, an 8-bit CPU, and no floating-point hardware. You had to use many cycles to do any floating point, and typically you only used 32-bit floating point (because even that was painful enough). A single 32-bit floating-point multiply took 3-4 milliseconds according to:
https://books.google.com/books?id=xJnfBwAAQBAJ&pg=PA26&lpg=P...
The original IBM PC came out in 1981. Its clock was 4.77 MHz. But again, that was misleading. The 8088 it used was internally a 16-bit CPU, but its memory I/O was only 8 bits wide. It didn't normally come with a floating-point processor. There was one, the 8087, and I think the original IBM PC had a socket for it, but it cost big $$$ and the 8087 wasn't actually available for purchase until ~6 months after the PC's release. That one could go 4-10 MHz. If you bought a coprocessor, you were finally getting to somewhat similar speeds for numerical calculations... but that was 16+ years later.
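Just to put the gap in perspective, taking the two hedged figures above at face value (the "IIRC" 100 ns multiply on the 6600 and the 3-4 ms software multiply on the Apple II):

    # Back-of-the-envelope ratio from the figures quoted above (both are hedged).
    cdc6600_fmul_s = 100e-9            # ~100 ns per floating-point multiply (1964)
    apple2_fmul_s = (3e-3 + 4e-3) / 2  # ~3-4 ms per 32-bit software multiply (1977)

    print(f"{apple2_fmul_s / cdc6600_fmul_s:,.0f}x")  # roughly 35,000x

So on those numbers, the 1977 home machine was doing floating-point multiplies tens of thousands of times slower than the 1964 supercomputer.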
Interestingly, the original "sx" designation for Intel 386 chips (80386sx) meant the same sort of thing... the 386sx was a 32 bit chip with a 16 bit bus. The dx was 32/32.
A product generation later, Intel changed what this meant to indicate whether or not the CPU had an on-chip FPU.
I think this means that if a corresponding development in video coprocessors had been taking place (there wasn't really a recognized need for them back then, as far as I can tell), the CDC-6600 could have been running Wolfenstein 3D decently well in 1964, 28 years before the launch in 1992.
And the CDC-7600 could have been running DOOM decently well in 1967, 26 years before the launch in 1993.
I had thought that computers of the time didn't have enough RAM for a framebuffer, but evidently the CDC6600 did, with what we would call 982 kilobytes today.
I get this realization all of the time when reading about and exploring past innovations (it's one of my primary hobbies). The really smart people back then were just as smart as the really smart people today. It's just they had crappier tech and less knowledge to work with.
"On the shoulder of giants" is an old and very true statement.
I suspect they were considered an inefficient use of memory. The TX-0 and PDP-1 had screens of about a million points. For uses like plotting data, playing video games, or drafting mechanical linkages, this resolution is highly desirable — I've done CAD on a 320×200 CGA screen, and the screen resolution was extremely limiting. (For those who haven't used one, this resolution is such that you can fit 25 lines of 40-column text on it before readability starts to suffer.)
You could imagine putting something like a 320×200 CGA on a CDC 6600, taking what we would now call 16000 bytes of memory, at a cost of around US$40 000 in 1965 (https://jcmit.net/memoryprice.htm says). But it seems like it would be hard to justify that as a good alternative to adding those 16000 bytes to the machine's main memory, supplying an extra 1.6% of its address space with storage and allowing it to tackle problems that were, say, 5% larger.
(There may also have been a question of memory speed. The memory cited above that cost US$2.50 a bit was core, and maybe it had a 300 ns cycle time; a 320×200 display at 50 Hz minimally requires a 312.5 ns dot clock, and probably quite a bit faster than that because of the blanking intervals, so you might have needed two to four memory banks just to satisfy the RAMDAC's hunger for pixels. Of course they wouldn't have called it a "RAMDAC" or "dot clock" at the time.)
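For anyone who wants to check the arithmetic, here's the back-of-the-envelope version. The assumptions (2 bits per pixel, ~25% blanking, one pixel fetched per memory access, 128K 60-bit words of main memory) are mine, spelled out so you can argue with them; the 6600 of course never had any such display hardware:

    import math

    # Hypothetical 320x200, 2-bit-per-pixel framebuffer on a CDC 6600.
    width, height, bpp = 320, 200, 2
    fb_bits = width * height * bpp            # 128,000 bits
    fb_bytes = fb_bits // 8                   # 16,000 bytes
    fb_words = fb_bits / 60                   # ~2,133 60-bit words
    print(fb_bytes, fb_words / (128 * 1024))  # ~1.6% of a 128K-word main memory

    # Dot clock, and how many interleaved core banks the display would need.
    refresh_hz = 50
    dot_period_ns = 1e9 / (width * height * refresh_hz)  # 312.5 ns per pixel
    active_ns = dot_period_ns * (1 - 0.25)               # ~234 ns once blanking is squeezed out
    core_cycle_ns = 300
    print(dot_period_ns, math.ceil(core_cycle_ns / active_ns))  # -> 2 banks, at minimum

With less optimistic assumptions about blanking and access granularity, the count creeps up toward the three or four banks mentioned above.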
The Crays were hard 64-bit machines - the smallest addressable unit was 64 bits. This had a lot of implications for software and computer languages that assumed you could easily access bytes.
ECL logic systems reached effective clock frequencies in excess of 500 MHz in the late 60s or so. It was extremely fast compared to contemporary RTL/TTL logic.
Exactly. It was MOS that lagged, not "transistors" or "computers", really. Transistorized microwave circuits (analog, not digital) in the GHz range were operating as early as the 1960's too.
MOS is just hard. It's hard to fabricate, it's hard to design (i.e. you need computers to make computers!), it's hard to scale. It required new chemistries be invented and new circuit design techniques to manage the (really tiny!) components. So for a long time you could have a big circuit in discrete bipolar components which ran fast or a tiny one in MOS which was slow.
But MOS was tiny and MOS was cheap. So by the early 90's it had caught up in speed with all but the fastest discrete technologies (GaAs was "right around the corner" for like two decades as it kept getting lapped by routine process shrinks -- now no one remembers it) and the market abandoned everything else.
I think the important point is that "MOS scales". None of the bipolar technologies ever had anything like Dennard scaling, which was the backbone of Moore's Law.
Case in point: you could get bipolar ECL RAM in the 80s with access times of around 1-2 ns (which is at least four times faster than the fastest DDR-SDRAM in 2020). Except those things would have a few kilobits at most and burn a watt or two; an entire 16 GB stick of DDR4 doesn't require much more than that. (This is SRAM, of course; you can't build good DRAM on a bipolar process, and MOS SRAM is much faster than DRAM as well. However, MOS SRAM in the 80s would have access times of 20-150 ns; the access time is typically the suffix of the part number, e.g. a 62256-70 is 70 ns.)
These days ECL lives on as CML, which is similar and mostly used for signal transmission. It's very fast so HDMI uses it. Some crypto circuits use CML logic too, because it's less susceptible to side channel attacks.
Almost all modern fast serial interfaces are descended from ECL. Often with the twist that while the receiver is an ECL/CML-style long-tailed pair (which is the obvious implementation of a comparator), the transmit side is a normal CMOS totem-pole output stage coupled with a passive network to produce the right voltage levels (and the right output impedance).
I have one principle, after over 30 years of programming and a decade in "Software Archeology": Consider the poor sod who will have to deal with your work in 30 years.
You can get a volume license by buying a bunch of licenses for stuff that's $5-$10 per user. You'll still end up with a total of ~$300 (including one LTSC license), but it is entirely legal and supported by Microsoft.