The Jackintosh: A Real GEM – Remembering the Atari ST (paleotronic.com)
108 points by empressplay on July 8, 2018 | 58 comments



Shawn Hargreaves first created the Allegro game library on the Atari ST; it was eventually ported to DOS, Windows and everything else. It had a GUI framework inspired by GEM. That was my gateway drug for software development. Really straightforward and easy-to-comprehend stuff. We’ve lost a lot imo.
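For flavour, here's roughly what Allegro's GEM-inspired dialog API looked like in the Allegro 4 era - a minimal sketch from memory, so treat the coordinates, colours and labels as illustrative rather than verbatim:

    #include <allegro.h>

    /* A background and two buttons; dp carries the button label. */
    DIALOG the_dialog[] = {
       /* proc           x    y    w    h   fg  bg  key flags   d1 d2 dp        dp2   dp3 */
       { d_clear_proc,   0,   0,   0,   0,  0,  15, 0,  0,      0, 0, NULL,     NULL, NULL },
       { d_button_proc,  100, 80,  120, 24, 0,  15, 0,  D_EXIT, 0, 0, "OK",     NULL, NULL },
       { d_button_proc,  100, 120, 120, 24, 0,  15, 0,  D_EXIT, 0, 0, "Cancel", NULL, NULL },
       { NULL,           0,   0,   0,   0,  0,  0,  0,  0,      0, 0, NULL,     NULL, NULL }
    };

    int main(void)
    {
       allegro_init();
       install_keyboard();
       install_mouse();
       install_timer();
       set_gfx_mode(GFX_SAFE, 640, 480, 0, 0);
       set_palette(desktop_palette);
       show_mouse(screen);

       /* Blocks until an object flagged D_EXIT is activated,
          then returns that object's index. */
       int hit = do_dialog(the_dialog, -1);
       allegro_message("Closed by object %d\n", hit);
       return 0;
    }
    END_OF_MAIN()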


Maturity comes with a lot of costs. Try making a lightweight GUI framework that supports accessibility, bidirectional text, IME, multiple and different kinds of pointing devices, rendering of all kinds of Unicode scripts, DPI scaling, hardware acceleration, and so much more...

I absolutely miss those days, but only a little :) I just shrug, use Qt, and am happy that, at the very least, after I've fought the framework for a couple of hours, an incredible amount of engineering on such difficult problems has been done for me, without me needing to solve them on my own in each project.


The vast majority of the time you don't need all (or even most) of that stuff though. None of the applications i actually use take any advantage of:

- accessibility (i do not need it)
- bidirectional text (almost all of the text i see and write is in English, and all of it is in left-to-right order)
- IME (i do not write in any language that needs one)
- multiple pointing devices (i only have a single mouse)
- all kinds of Unicode scripts (as i said, most of the text is in English, but i'll have a note on that below)
- DPI scaling (i prefer a 1:1 mapping of units to pixels, none of my monitors are hidpi and i plan on avoiding getting one as long as it is technically possible since i want to have the pixels visible)
- hardware acceleration (i do not think any GUI toolkit that uses traditional buttons/checkboxes/input fields/sliders/etc uses hardware acceleration, at least in the modern sense where it means using 3D hardware as opposed to the older 2D acceleration - some toolkits that try to reinvent Flash probably use 3D acceleration more, but i do not use any programs built on those)
- or any of the "much more".

The note i mentioned above, and a small exception, is Unicode support - but that is externally imposed: sometimes i download files (like an album i bought on Bandcamp) that use Unicode characters, which display weirdly if the program you use cannot handle them. Beyond that there is no real need for it, and this only comes up in a few types of programs.

Of course my needs are not the same as others' needs, but i am "paying" for stuff i do not use nor need, both in terms of complexity and actual resources.


That’s the case for any user though. If you’re Japanese and only ever read documents in Japanese then Shift-JIS is all you need and UTF-8 is a waste of bytes. If you use a screen reader then there’s an immediate extraneous cost to your laptop or mobile phone in the form of a screen and a GPU.
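To put a rough number on the "waste of bytes" point, here is an illustrative sketch in C (the hex values are, to the best of my knowledge, the standard encodings of the three characters of 日本語, "Japanese language"):

    #include <stdio.h>

    int main(void)
    {
        /* The same three characters, hand-encoded both ways. */
        const unsigned char utf8[] = {0xE6,0x97,0xA5, 0xE6,0x9C,0xAC, 0xE8,0xAA,0x9E};
        const unsigned char sjis[] = {0x93,0xFA, 0x96,0x7B, 0x8C,0xEA};

        printf("UTF-8:     %zu bytes\n", sizeof utf8); /* 9: three bytes per character */
        printf("Shift-JIS: %zu bytes\n", sizeof sjis); /* 6: two bytes per character   */
        return 0;
    }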

The benefit though is interoperability and economies of scale. And besides, another way of saying ‘not disabled’ is ‘temporarily abled’. Who knows when you’ll need assistive tech (if ever), but you’ll appreciate the effort that’s been put into the ecosystem when you do.


> That’s the case for any user though. If you’re Japanese and only ever read documents in Japanese then Shift-JIS is all you need and UTF-8 is a waste of bytes.

Yes, though i do not see the issue with it - remember that i talked about the case where i explicitly do not care about interoperability, which is pretty much all the time. I do not see a problem with using specialized software in the rare cases where i'd need such interoperability (and again, i'm talking about myself here, not about everyone, although i suspect a lot of people rarely care about more than a single language - and note that i'm not a native English speaker).

> If you use a screen reader then there’s an immediate extraneous cost to your laptop or mobile phone in the form of a screen and a GPU.

I'm not sure i follow, what sort of cost are you talking about?

> The benefit though is interoperability and economies of scale.

Similarly, i do not follow. I am talking about the complexity and resources needed by a GUI system.

> And besides, another way of saying ‘not disabled’ is ‘temporarily abled’. Who knows when you’ll need assistive tech (if ever)

I may also need better interoperability or any of the other features mentioned above, it isn't like my current needs are written in stone, but that is an issue to be solved when needed, if needed.


I remember Microsoft did a study on Word usage after everyone complained of bloat and the huge number of unwanted features. It turned out that each user only used maybe 5% of the features, but it was a different 5% for each user. It's the same here. Sure, you only use ASCII and don't care about RTL text or screen readers, and that would be great if the vendors were writing software solely for your use. But they're not, because they need to fund development of the software by selling as many copies as possible to many different users...


I fully understand that which is exactly why i wrote "and again, i'm talking about myself here, not about everyone". That doesn't mean i like it though.


> none of my monitors are hidpi and i plan on avoiding getting one [...] since i want to have the pixels visible

Seriously? I can't work out what possible advantage you might think this has. There is no situation where a lower DPI monitor is better than a higher DPI one, as far as I can work out, so I'm guessing you've probably never used a high resolution screen and just don't know what you're missing? Or are there many people out there who think like this?


> Seriously? I can't work out what possible advantage you might think this has. There is no situation where a lower DPI monitor is better than a higher DPI one

My primary reason is that i just like the way it looks. But it has a few practical advantages too. One is that it is very fast for 3D games (which i play a lot); i could lower the resolution on a hidpi monitor instead, but that would create ugly image quality from the upscaling. Also, i do graphics myself, and being able to clearly see the pixels (note that i do not just have a low DPI monitor, i have a 27" 1080p monitor where you can clearly see the pixels) has helped me many times to spot tiny issues i'd miss at a higher DPI. I also use a lot of programs that are not DPI aware (especially older programs, but several new ones too), and on any hidpi monitor they look either blurry (from compositor scaling - i should also add that i dislike compositors and disable them/do not run them when i can, but that is another topic) or everything is tiny.

> so I'm guessing you've probably never used a high resolution screen and just don't know what you're missing

I have used a high resolution screen at my previous work, specifically a 4K monitor that was around 30". It wasn't on my own computer, but i got to use it a bit and wasn't a big fan of it - yes, the image was crisp when it ran at full resolution, but that was about the only positive thing you could say. Every 3D program was very slow and most GUI programs had issues and glitches, so we almost always ran it at half resolution (which made everything blurry).

> Or are there many people out there who think like this?

I don't really know; i do not tend to act based on what everyone else is doing - at least not intentionally :-P. A friend of mine also has a 27" 1080p monitor, and he has told me that his two main reasons are that he likes "looking at the pixels" and that he gets better performance in games with it.

FWIW i also have a 1440p 27" monitor (and supposedly a good one, a Dell U2713HM), but it has sat in the corner collecting dust for a couple of years now (until the end of the month, when i'll send it to a relative who wanted it), so it isn't like i do not have the option of higher resolution - i just do not like it.


I don't know what you suggest we do about that. I want my users to be able to use my programs in their languages, and that requires translatable strings, Unicode, IME. We didn't lose anything: I still could've used Allegro. I wouldn't, because it no longer meets my needs. If you want to develop applications that don't meet modern expectations, you can use whatever framework makes you happy. Hell, some people really do still use primitive UI toolkits.


I do not suggest anything; my post was to explain that you (a "royal you") do not always need all the features you listed. For example, you write that you want your users to use your programs in their language, but most often that does not require Unicode - people did it with per-language code pages before Unicode was a thing. It is just that Unicode makes it easier.
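A sketch of how that pre-Unicode world worked: one byte per character, with the meaning decided by whichever code page the program assumed. 0xE1 is a real example - 'á' under ISO 8859-1 (Latin-1), Greek 'α' under ISO 8859-7:

    #include <stdio.h>

    int main(void)
    {
        unsigned char b = 0xE1;
        /* Interpreted as ISO 8859-1 this byte is 'a-acute'; interpreted
           as ISO 8859-7 the very same byte is Greek alpha. A single-language
           app simply picked one code page and stored one byte per character. */
        printf("raw byte: 0x%02X\n", b);
        return 0;
    }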

My post was an example of someone who doesn't need any of the stuff you mentioned, and from my perspective these features just add bloat since i won't use them. This isn't a recommendation or anything like that.

If you want people to be able to use your programs in their language and consider that a good thing to do, a toolkit that provides the feature will do fine for your case.


I am incredibly happy that Unicode is the de facto standard and that encoding problems are (mostly) a thing of the past. But sometimes it doesn't work out and you still have these problems (Microsoft, I am looking at you and your bad Unicode CSV support in Excel).


I recently read a book about Commodore's downfall, and it's clear that its disappearance was due to a lack of future vision and planning on Jack Tramiel's part. He got lucky with the Commodore 64 and was looking for the next big hit. Rather than develop the company while keeping its customers happy, he focused on short-term profits. He had no idea what to develop for the company's customers, so he hoped a new hit would save the company.

Commodore owned the market for home computers. Commodore's customers would have stayed loyal to the company if they could have gotten a product that moved the Commodore technology forward, but the company could not advance the technology. They saw sales decline and decided to find another hit rather than advance the technology.

When he moved on to Atari he developed a great machine, but he just did not know how to move forward, the same way he had not known at Commodore.


It's easy to blame Tramiel for not having vision, but what you're describing simply wasn't a thing in that era. Computers were silos which were hardware-incompatible with one another. And in the few instances where a next-gen machine was built with support for its predecessors, that generally harmed the successor more than it helped, as fewer developers wrote bespoke software for the newer platform (why would you, when you could target all versions of the product line instead of just the latest machine?). That didn't stop Atari from having a go, though, with the many revisions of the ST and, later, the Falcon.

Plus, the ST and C64 were largely sold as games machines (yes, I'm aware of the popularity of using the ST as a MIDI sequencer etc), and consoles of that era didn't follow an upgrade path either (barring the Atari 7800 being largely hardware-compatible with the 2600, and the 7800 wasn't exactly a success because of that).

Many accounts of history actually look more favourably on Jack Tramiel, arguing that had the American Commodore offices taken his lead (like the European division did) and sold the Amiga in toy stores as well as the usual places - the Atari ST was sold in Toys R Us across Europe - then the Amiga might not have flopped in the US. And there is good reason to believe Tramiel's strategy worked, as the ST was a massive hit and the Amiga had a lot more success in Europe and America after the European division of Commodore started copying Tramiel's business strategy.

Ultimately though, few hardware manufacturers from that era survived. Even Apple nearly went under (several times, in fact; Microsoft had to bail them out on one occasion just to avoid a desktop monopoly). And IBM didn't exactly do well out of upgradeable machines, as the market just got flooded with cheap clones that IBM got no cut from. So it's very easy to blame Tramiel with hindsight, but in reality he was one of the few who had multiple successes.


> Computers were silos which were hardware-incompatible with one another.

Current OEMs seem to be quite willing to return to those days as a solution to their razor-thin margins, now with the rise of mobile computing.


The MSX was manufactured by a lot of companies.


And a market failure outside Japan, as far as I can recall.


It was rather popular in Russia for a while.


And Brazil, IIRC.


But it was too little, too late. I had a pretty expensive MSX-capable computer in the late 80s, the Yamaha CX5, which I used only for music, because nearly all the software around ran on different architectures like the C64, which was inferior compared to it. By the time the MSX started to be widely known, it was about to succumb to the ST or the Amiga.


I worked at a computer/software store in the mall that sold the Atari ST (and other computers of the era) while I was in college.

I can still remember the reaction you would get when you told people whose Ataris had gone haywire that the quick fix, for units whose chips were coming unseated from the motherboard, was to pick the computer up, hold it level, and then drop it from a height of about four inches.


We used Atari STs in a high energy physics lab. The shielding was really poor and we had no end of problems with corrupted memory. However, it was a magnificent piece of hardware for computing at its price point, so it was still worth it. I have a lot of fond memories of that machine.


I used to support a product with a similar fix. We had a bunch of desktop PCs in the field that were only used from January through April of each year, and usually powered down the other eight months.

Apparently, after several months of no use, the hard drive bearings had a tendency to stick, and the temporary solution that often worked was to have the client lift the front of the box a few inches off the desk and let it drop, unseizing the bearing.

A lot of reactions along the lines of "I can't believe that worked!"


Ah, the good old ST-506 and its sticky bearing of fail. My IT teacher in high school showed me his fix -- rapping the back of the drive's casing with the rear end of a screwdriver.


To be fair, the Apple ]|[ (and IIRC, the Lisa) had similar issues. One of those computers had socketed chips mounted upside down, with predictable results... and a 3-inch drop wouldn't fix that issue :-/


I worked at an Apple repair facility that was part of a large county agency serving public school districts. We didn't charge a minimum fee, so we typically did the job while you watched. Pushing all of the chips back into the motherboard was called "the laying on of hands." The other quick fix was taking a pencil eraser to the edge connector of the disc controller board, which was known to use inferior gold plating.

I remember people were horrified when the IBM PC came out with soldered-in chips, but IBM published a solid report showing that the chips were more reliable than the sockets.


Oh, IBM's first MCA PS/2 PCs had a wonderful issue in some models. They used a daughter card to present the expansion slots. Many people had a habit of placing the monitor on top of the case (as was the norm back then), and with the case bearing the monitor's weight, a gentle tap would momentarily wiggle that daughter card and cause the system to lock right up.

The solution - don't put the monitor on top of the PC case - was not a happy one for many, but it was the solution.


In all fairness, this was a completely normal thing for the socketed chips from that era: dropping the entire thing was easier than opening it up and pushing on all of the chips. And even now, if you are wondering why a particular thing is having problems, and it has any socketed chips, you should push on them.


It's kind of ironic that the Atari ST was developed in part by a number of ex-Commodore people, while the Commodore Amiga was likewise developed in part by a number of ex-Atari people.


Source for the latter claim?


From Wikipedia at https://en.wikipedia.org/wiki/Jay_Miner

>In the early 1980s, Jay, along with other Atari staffers, had become fed up with management and decamped. They set up another chipset project under a new company in Santa Clara, called Hi-Toro (later renamed to Amiga Corporation), where they could have creative freedom.


The original Amiga staffers were not pleased with Commodore's leadership after the buyout. Early versions of AmigaOS had an Easter egg where you could get it to display "We made the Amiga...and Commodore fucked it up!"

Later on it was quietly changed to "Commodore: A Tradition Of Excellence" or something along those lines.


This machine was my gateway drug. Dirt cheap and awesome. GFA Basic and MicroC (?) were really fun. Also programmable MIDI ports.

Also, home to one of the first multiplayer networked first person shooters ever:

https://en.wikipedia.org/wiki/MIDI_Maze


GFA Basic was very powerful, for C I would have recommended Mark Williams C.


Nice, I had not heard of GFA Basic. I'm somewhat of a BASIC aficionado; what was most compelling about it, in your opinion (for those who used it)?

https://en.wikipedia.org/wiki/GFA_BASIC


Proper code blocks, record types, typed variables, auto-indenting editor, blazingly fast interpreter, built-in commands for OS and hardware features, functions and subroutines, an actual compiler if you wanted it... It was far beyond what any other BASIC could offer, and made the jump to Turbo Pascal relatively easy.


I miss GFA to this day.


Micro-C from Dave Dunfield/DDS?


I earned a decent living doing DTP on Ataris (a 1040STFM first, a TT030 later) all the way through 1995 or so. Back then there was some DTP/graphics software that was miles ahead of anything available on the Mac, let alone the PC. I haven't really followed that field closely since then, but it seems some of the features Calamus or Didot had back then still aren't available in mainstream packages even now...


Same here. Calamus was over-the-top DTP for those times. I know there was a PC version running through an emulator (https://en.m.wikipedia.org/wiki/Calamus_(DTP))

Did you ever try Arabesque for vector graphics?


There also was a native Windows NT (including MIPS and Alpha builds) port of Calamus; I helped beta-test it. I mostly used Retouche Pro and Didot Line Art, but Arabesque was pretty good, too.


> but it seems some of the features Calamus or Didot had back then still aren't available in mainstream packages even now

What specific features are you talking about?


They actually worked in the way you'd expect them to work.

For the life of me I still can't get two lines of text to line up in almost every other package; it is most frustrating. Present-day DTP/layout software makes me long for either ATEX or typewriters.


Stochastic rasterization, virtual copies, step kerning, real WYSIWYG down to the printer-pixel level, good performance on a 32 MHz 68030...


There's a pretty good oral history of the founding of the company and its early years that was posted this weekend:

https://medium.com/s/story/ataris-hard-partying-origin-story...


I have this memory of the University of York (UK) compsci department getting these and doing truly awesome stuff. But we had, like, Whitechapels, and Suns, and DECstations, and PERQs, and... these tiny odd boxes with slidy-sideways function keys... wut? And then people were coding sprites, and stuff which looked like what became mgr -> plan9, and amazing stuff.

I missed out. I totally did not get it, or the zeitgeist. The road not taken. Damn.


I remember creating a small widget application for the Atari ST that drew rounded corners on the screen to make it look more like a Macintosh. Together with the Geneva font, it was very similar to a Macintosh. It was a great computer.


For more info on the thinking behind the original Mac roundrects design: http://www.folklore.org/StoryView.py?project=Macintosh&story...


For more interesting reads on the Atari ST, I recommend The Digital Antiquarian:

https://www.filfre.net/tag/atari-st/


Ah, GEM. I always got stuck on: why three colours? You're using 2 bits of depth, why not give people the fourth colour!


Having that fourth color (in CGA that would be magenta) makes things look terrible. It was an artistic decision to have black, white, and a single color.


That makes no sense - the fourth colour could have been any colour in the ST palette, chosen appropriately for artistic reasons. Pick either a highlight colour (orange, as on the Amiga and whatever early Unix desktop I once used) or a complementary colour (maybe grey?).

CGA was awful for many reasons, one of which was that all colours were fixed.
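For reference, the arithmetic behind the ST palette point (standard ST figures, sketched in C for concreteness):

    #include <stdio.h>

    int main(void)
    {
        int levels  = 1 << 3;                   /* ST: 3 bits per R/G/B channel      */
        int palette = levels * levels * levels; /* 512 possible colours              */
        int lowres  = 1 << 4;                   /* 16 on screen at once in low res   */
        int medres  = 1 << 2;                   /* 4 at once in medium res - the     */
                                                /* 2bpp mode GEM ran in on colour    */
                                                /* monitors                          */
        printf("palette %d, low-res %d, medium-res %d\n", palette, lowres, medres);
        return 0;
    }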


Since articles nowadays seem to like talking about things instead of showing them, here are some actual screenshots:

http://toastytech.com/guis/tos.html


Question: can the PC/x86 version of GEOS run in QEMU or VirtualBox? Or does it directly address 80286 hardware in such a way that it's not easy to emulate?


Even if it doesn't, you may be able to get it working with PCem or 86Box (a fork of PCem which is a bit more user-friendly).
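If you want to try the QEMU route first, the usual starting point is something like the line below (assuming you have already installed GEOS into a bootable DOS hard-disk image; the image name here is made up):

    qemu-system-i386 -m 16 -hda dos_geos.img -vga cirrus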


OpenGEM is included in FreeDOS: http://www.freedos.org/software/?prog=opengem



Hmm, now to find a web browser...



