Calc.exe is now open source; there’s surprising depth in its ancient code (2019) (arstechnica.com)
245 points by Tomte on March 25, 2021 | 225 comments



This is the "modern" one with the egregiously bloated UI (including a loading screen!) and telemetry... but funny that it's using the same calculation code. Of course they still managed to screw it up somehow: https://news.ycombinator.com/item?id=17670644

In one of the old Win2k source leaks the code can be found for the far more efficient "classic" one.

Also, title needs (2019). There are some other HN articles from around that time discussing it: https://news.ycombinator.com/item?id=19321217


The loading screen technically isn't part of the calculator's UI. Windows automatically shows a loading screen for all new (UWP) apps unless they explicitly opt out (and even then the OS will still show one if the app takes more than a second to load). The purpose is to keep the OS feeling responsive: it lets the shell bring up the window immediately (for UWP apps the OS shell is in charge of creating and managing apps' window frames, unlike Win32, where it's the apps' responsibility), rather than leaving the user staring at a screen where nothing happens for several seconds (and maybe thinking it's the OS's fault for being unresponsive).


for UWP apps the OS shell is in charge of creating and managing apps' window frames, unlike Win32 where it's the apps' responsibility

Sounds like the Linux ecosystem moved to client-side decorations, while Windows moved to a window-manager model.


Regardless of whether it is or isn't part of the calculator's UI, the presence of a loading screen for a calculator is a symptom of a deeply-rooted problem.


It's funny what people tell you about themselves if you just listen.


> The purpose of this is to keep the OS feeling responsive, since it allows it to bring up the window immediately, rather than leaving the user staring at a screen with nothing happening for several seconds

Wait, you only get one of those? As of about a month ago on my computer, launching calc.exe takes 5-6 seconds for anything to happen, and I'm then greeted by 1-2 seconds of a loading screen after the window appears.

I've taken to just leaving an IDLE shell open at this point.


Open sourcing it seems to be working. Several of the open pull requests fix bad math: https://github.com/microsoft/calculator/pulls


Seems there are 4 pull requests open to fix various math issues, all of them opened in 2019. Most PR merges seem to be about minor changes like translations and version bumps. Not sure we can call it a success just yet.

It's very cool it's open source in the first place though, don't get me wrong. And MS doesn't have to merge fixes if they don't want to of course.


A loading screen and telemetry? For a fucking calculator?

What happened to software...


But what about growth and engagement? I wouldn't be surprised if some idiot's performance metrics are tied to how many people "engage" with the calculator.


Next step: contextual ads. “Looks like you are trying to calculate your taxes. Click here for $10 off of Turbo Tax!”


At least it’s not a subscription model for £6.99/month like almost all of the apps you find on the app stores that used to be either free or a one-time £X.


You get the addition and subtraction for free, but you have to upgrade via in app purchase for multiplication or division. It makes sense because those are advanced features the average user wouldn’t need anyway.


Yes, and if the average user really needs it from time to time, they can use + several times.


Besides, you can multiply for free: just add N times. Multiplication is a quality-of-life improvement, not a necessity.


I see that you tried to look for a calculator app for iPad (no, there is still no built-in one).


That's because you're supposed to pull out your iPhone to use as a calculator.



Calcbot does a pretty good job of Not Sucking. I think it's a couple bucks, I don't know, I paid for it ages ago.


Did you find PCalc?


I still cannot get over that. A calculator with ads.


I defs paid for the ti-calc emulator, but that gets me my ti-89 experience, and the emulator didn't cost me 200 bucks


Sqrt() is a subscription service if you want more than 2 decimal points


There are now cars where the heated seats are a subscription service. https://www.autoguide.com/auto-news/2020/07/bmw-wants-to-tur...


I would say, "unbelievable!", but sadly, yeah this garbage is only going to grow like a cancer.

So, no. Not for me.


That gives me the feeling there's good business in modifying BMWs to connect the heated seats wires to a physical switch with a light on it


That's piracy. You wouldn't download a car [seat]!


i don't want to live on this planet anymore.gif


They give the seats for free, they must monetize them somehow, understandable


math obsessed kid bankrupts parents by purchasing digits of pi.


Need more than 5 decimal points? Contact us for pricing.


They stopped charging for it. At least for the upgrade when it launched.

The user became the product.

Windows collects user behavior and sells ads.

You can still buy whatever they call Windows LTSB this week if you hate this.


Buying LTSB is extremely difficult and annoying. It's not just a matter of entering card details and getting back a product key + ISO.


folders in Windows have loading bars now. Folders with nothing but text files in them. It's insane.


The Explorer UI has been capable of showing a loading animation for folder contents since it was first publicly released. It makes sense not only in the context of potentially slow devices, but also because a "Folder" is not a "Directory" but a somewhat arbitrary COM object, so showing the contents of a folder can involve a lot of potentially slow operations.

IIRC somewhere along the line of Win32 Windows versions the loading animation got somewhat broken, and an Explorer window was more likely to simply become unresponsive instead of displaying the loading animation, but that's another story.


Folders in Windows can be network locations or internet locations without any guarantees on their access speed, so it makes sense to show the user that it's still doing something.


A big part of that loading bar, outside of network ops and slow drives (and remember, today you don't have floppy drive noise to tell you whether it's working or hung), is for things like sorting and getting file icons and other attributes right. This can take a non-trivial amount of time.


Someone did repackage the old Win7/Prior version of calc.exe here: https://win7games.com/#calc


They still have the old calc.exe from Windows 7 in the source tree. You get it if you use the LTSC version of Windows. I wish there was a way to get it on regular Windows.

They seem to be going the other direction; notepad.exe is now updated from the App Store in the Dev channel.


You can also get the old calc on other versions of Windows if you use Chocolatey: https://chocolatey.org/packages/oldcalc


This code is MIT licensed, so it should be possible for someone to re-create the old UI while still using this calculator's modern calculation engine (with all the bugfixes)?


I did a casual browse of the repo and am likely overlooking the Windows 7 build.

Hoping you could point out how to build it or where it is hiding.


I don't think they meant "this released source tree" but as in "microsoft's source tree" since it is still actively maintained for the LTSC release channel.

This is for the new calculator only.


Thanks for that link. The forum thread is perhaps the most perfect example of how terrible technology forums tend to be that I've ever seen: if you report a problem, some fanboy with over 10,000(!) posts will immediately show up to tell you it's your fault and not really a problem:

> Do you really think it is going to make much difference to the final answer?

> Even the algorithms used are probably only accurate to so many significant figures.

Beautiful. (Keep in mind the issue referred to here only requires five decimal digits to show up.)


A loading screen?

You're probably better off just using a REPL at that point.


Eh, if you launch python in powershell there is still a loading delay, since powershell for some crazy reason takes like 30s to launch on a default win10 installation.


You must have something very wrong; neither Windows Powershell nor Powershell 7 takes 30s to launch.

But anyway, for a quick calc, is there anything faster than Spotlight (for Mac)/GNOME Shell (for Linux)/PowerToys Run (for Windows)? Instead of the name of the app you're going to launch, just punch in your expression.


Not the person you responded to but I have the same experience with Powershell as them. I just checked again to be sure and it takes 20s to start and become usable on my laptop. CMD.EXE, in comparison, starts instantly.


I'll be the next to confirm this, and on multiple machines.

Maybe 1 in 5 that I try will be a reasonable startup time.


That's interesting, it seems quite widespread.

I'm getting 1-2s on my machines, both just plain powershell and powershell inside windows terminal. No enterprise crap running though, just Defender.


It got bad enough I wrote a (now unused) script in batch instead of Powershell. Also having to explicitly enable scripting for each machine was a pain, though understandable.


All this time I've been launching calculator and I could have just done it in spotlight...

Why don't they tell you that after you've launched calculator for the 20th time in an hour.


Probably a very slow disk (VM?) or Windows Defender? Or too many modules?


I shudder to think what you're running Windows 10 on because it only takes 2 seconds to come up on this Celeron 2957U I'm using to browse the internet.


I'm running Windows 10 on a Ryzen 5 with 16GB of RAM and an NVMe drive. Powershell takes something between 0.5 to 1 second to start. The first ~20-30 seconds after the start it feels sluggish when doing simple tasks like listing directory contents. It makes it very unpleasant to use.


Sluggish behavior during and just after Windows startup is sometimes caused by mapped network drives that no longer resolve. Worth checking if you haven't already.


I'm talking about Powershell startup, not Windows startup.


If it's a corporate system like my (presently only) Windows machine then I wouldn't be surprised. They manage to configure these things so they're as fast as a Pentium 133MHz processor from 25 years ago, even though you've got an 8-core i7 with 32GB of RAM.


I wouldn't either, our devs were given the standard issue Lenovo T480 which would have been mostly fine were it not for the fact that we had Raytheon ForcePoint DLP, LANDesk, Trend Micro Apex, WebSense, Cisco Amp, Cisco Umbrella, and who knows what else watching everything we did.

It took upgrading to i9s to get something that could run Teams and Outlook at the same time without audio stuttering during a voice call.


why would you launch powershell to launch python? you can just launch python directly?


powershell is slow, just use a console window.

Or maybe starting anything on modern windows is slow.


Huh? I use powershell every single day at work and it loads instantly for me.


I've started using the browser console window more and more as a calculator. The only downsides are the longer syntax (e.g. you can't do 3^2) and that it's a tab in the browser. Variables and other such things are very useful.


Why the console window? For basic math things that I sometimes need, I just enter stuff in the search/address bar, and it pops DDG or Google or whatever with the answer; and even your own example of "3^2" works there as-is.


I also often use the browser console for this as well. Just as a quick tip, you can use ** for exponentiation (e.g. 3**2 === 9)


I just have a hotkey that opens a Node CLI. I know some people who do a similar thing with Python, but I think JS is better at composing quick one-liners.


Too bad it's the new one. I still recall the Win3.1 (or possibly 3.11 WFW) calc.exe returning 0 when calculating 2.11 - 2.1. I've always been curious to know what could have triggered that error, since it was too big to remain unnoticed for so long.


I tried it in the "Archive.org Windows 3.11 in a browser" and it returns "0.01".

https://archive.org/details/win3_stock


Ah, now confirmed... it exists in Windows 3.1, fixed in 3.11

You can try Windows 3.1 here: https://www.pcjs.org/software/pcx86/sys/windows/3.10/

And, indeed, 2.11 - 2.1 = 0.00

But, if you then add 1.0, you get 1.01. So the right value is there somewhere.
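For the curious, the raw floating-point behavior is easy to reproduce today. A quick sketch in Python (assuming IEEE 754 doubles; the actual Win3.1 code may have behaved differently):

```python
# Neither 2.11 nor 2.1 is exactly representable in binary,
# so subtracting their rounded values doesn't give exactly 0.01.
diff = 2.11 - 2.1
print(diff)           # a hair under 0.01
print(diff == 0.01)   # False
```

Whether a calculator then displays 0.01, 0.00, or the full error depends entirely on its rounding and display logic.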


There was a dead comment from another HN person that was actually quite helpful here. Not sure why it was dead, so here's a copy/paste:

"It was fixed in this update for Windows 3.1, which switched away from floating point: https://jeffpar.github.io/kbarchive/kb/124/Q124345/"




Here's another test I do with calculator apps (in radians): Pi - 4 * ATAN(1)

These apps really should produce 0 for this and avoid floating point issues. Most of us understand why they might not. To me it's an experience/quality issue.

For what it's worth, the default Android calc app produces 0 (as well as nice RPN apps like Droid48).


My go to calculator is spotlight search on my mac. Pleased to announce that it gives 0 for this.


The macOS calculator, on the other hand, has a totally ridiculous bug: it displays "Not a number" for any negative number that you try to take e^x of. 2^x and 10^x work fine, as does using the x^y button with base e. Typing "e^(-whatever)" on the Spotlight search works, as does e^x of -whatever in the iOS calculator app.


I just use WolframAlpha for everything these days. It uses symbolic computation, so there's no possibility of problems like this, and it's a lot more powerful than any built-in calculator app. Almost anything can be typed in one line with natural English directives like "300mi/14mpg * $2.7/gal to Euros". Especially convenient with a browser keyword.


Check out Qalculate![1] for a program that does that and much more (though obviously not as much as WA).

1: http://qalculate.github.io/


I usually use Google for quick calculations but get annoyed when it only returns large results in scientific notation. WolframAlpha gives results in just about every conceivable way, e.g. "7 trillion" or "7 000 000 000 000" instead of 7e+12. Much nicer for someone like me who isn't used to scientific notation.
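If a REPL is close at hand, most languages will happily format large results without scientific notation; e.g. in Python:

```python
n = 7e12
# Thousands separators instead of scientific notation:
print(f"{n:,.0f}")  # 7,000,000,000,000
# Or as a plain integer, if the value is whole:
print(int(n))       # 7000000000000
```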


But Spotlight is just cmd-space away at all times. My only complaint is that I can't get results for date-based queries like "days since 17 Jan 2021" or "50 days from today" and I have to go to Google for those although I can at least just hit return after entering the query to have the result pop up in Safari.


My biggest complaint after switching to Windows is how poorly the start menu does math. Half the time it tells me "results are unavailable" for simple arithmetic... I don't know if it makes a web query or something to calculate but it's frustrating.


Alfred also computes it to 0


Another comment suggested they switched from floating point to rational number representations after that version. Since 2.11 and 2.1 can't be represented exactly in floating point, it's possible the issue was that the difference came out slightly smaller than .01 and was being rounded off (or fell below the limit for the number of digits to display).
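The rational-representation approach sidesteps this entirely for +, -, ×, ÷. A sketch of the idea using Python's fractions module (an illustration of the technique, not Microsoft's actual engine):

```python
from fractions import Fraction

# Parsing the decimal strings yields exact rationals,
# so the subtraction is exact too.
a = Fraction("2.11")  # 211/100
b = Fraction("2.1")   # 21/10
print(a - b)          # 1/100
print(float(a - b))   # 0.01
```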


Try 2.11 - 2.1 here: http://weitz.de/ieee/ . I don't think they used 32-bit floating point, otherwise there should clearly be a large enough number to display.
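For completeness, the 32-bit version of the check can be sketched in Python by round-tripping through struct (assuming IEEE 754 binary32):

```python
import struct

def f32(x: float) -> float:
    """Round a Python double to the nearest 32-bit float."""
    return struct.unpack("f", struct.pack("f", x))[0]

diff = f32(2.11) - f32(2.1)
print(diff)  # roughly 0.00999999: clearly large enough to display as 0.01
```

So even single precision leaves a visibly nonzero result here.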


In that era, you might or might not have had FPU hardware; where it didn't exist (or was disabled), a variety of software floating-point libraries with their own quirks got used.


How many lines of code is it?


Maybe now someone can finally fix the pixel alignment...

https://www.reddit.com/r/mildlyinfuriating/comments/a5r971/t...


Appears to have been fixed. From my Windows 10 box: https://imgur.com/a/MRW9S8a

Edit: Wider screenshot.


Isn't the issue located near the bottom right corner of sqrt? Your screenshot doesn't include that corner.


Updated with a wider screenshot.


Oh hell, I didn't need to know that. Now I'll be noticing it every time I see the new Windows calc...


It's your punishment for running Windows.


I'm right now running pacman on Arch Linux via WSL while playing 'Call of Duty' in another window. Let's see if your OS can beat that.


It can run Call of Duty, yeah:

https://www.protondb.com/app/393080


> I'm right now running pacman on Arch Linux via WSL while playing 'Call of Duty' in another window. Let's see if your OS can beat that.

Yeah?

I'm running Far Cry $SOME_SEQUEL_NUMBER on Wine, while running Android Builds in a background terminal, using another terminal to hack into the Israeli Defence Force network to deliver my Android exploit that will cause their centrifuges to spin slightly too fast in the incorrect direction, slowing down the spin of the earth ever so slightly ... I'M STOPPING TIME! ...

(Too much? :-)


My OS doesn't let me waste my time in such a way. It's a feature ;-)


Of course it can beat that. Just swap "Call of Duty" with something that only runs on Linux.


I'm playing TuxKart right now. Jealous yet?


My OpenBSD server came with the ancestor of The Greatest Game You Will Ever Play™®ʲᵏ¹ preinstalled. Been using GNU ever since I replaced Windows ME and 3D Pinball for Windows – Space Cadet with Slacko Puppy Linux and Xsoldier, and fully quit Windows when it started shipping with Candy Crush Saga ads and no Minesweeper. Also I typed all but two characters of this message, thanks to the X11 Compose key.

¹https://thegreatestgameyouwilleverplay.com/


Steam's Proton lets most of the popular win games run on linux. I've been playing New Vegas lately.


Well, I do my math with M-x calc on Emacs, but nobody around me seems to want to adopt the superior OS...


I don't see why, it is clearly better.


Probably because everyone you’re trying to convince stopped listening out of fatigue. Nothing revs me up to hear an opinion more than “you’re running the wrong operating system/editor/browser, mine is clearly superior,” when we all have limited lifespans in which to argue about exceedingly pointless trivia such as that — especially discarding any implicit requirements for choosing the tools one has chosen which are almost certainly unknown to the person making such an arrogant observation. People have different needs and goals for using computers and telephones.

I’ve been cutting people out of my life for repeatedly talking like GP (and you, to a lesser extent) lately; that’s how frustrating it is to be on the other end after 20 years of hearing it from elitist peers. I finally snapped when I set my iPhone on the table at a bar and was treated to the entire conversation turning to iOS vs. Android for the next two miserably painful hours, while ignoring that we all went there to drink to forget about shit like that.

You might consider my approach harsh. That’s fine because I know my approach is clearly better. (See?) It amazes me that it’s been decades now and advocates of non-mainstream operating systems and tools (read: FLOSS) haven’t figured out looking down at the people you’re trying to convince is repulsive to anything you’d say - and sometimes you as a person. Persuasion from a negative is almost always a net loss. Ask anyone in sales.

Tell me why it’s better for my specific needs. Not that my choices are inferior. Until then, I’m distantly happy for you that you’ve found a tool that works for you, and I’d prefer that you tell someone else (in general, not you specifically). I’m especially thinking of the randoms who don’t know me and approach at conferences or social events to reflect on my choices. It’s never Windows or macOS or Edge or VS Code advocacy, either, weirdly enough. Emacs in particular seems to catch these attitudes like a magnet.


I think it was a joke, dude.


[flagged]


You don’t need to create a new account for each comment you make, you know.


[flagged]


In comments: don’t be snarky.


Stuff like this gives me a really bad feeling about the quality of software. When it's so close, but not perfect, it seems like someone tried to get it right but just hacked it together and checked by eye instead of solving the underlying problem. I prefer to see software where they haven't even tried.


Isn't this fixed already? I can't recreate it on my machine, but I also can't find the exact same layout used in that post.


at least it has a transparent background....


"surprising depth in ... ancient code" seems extraordinarily condescending to the past, upon which computer science has evolved little.


Perhaps what's surprising to non-programmers is that older code tends to be better/faster/more optimized than new code, which is unintuitive because new hardware is orders of magnitude better.


It's an inverse relationship. As the "just release a patch" method of development takes over, the idea of a 'gold master' disappears. When software shipped on floppies and no one had modems, software in general had to ship working 'well enough' because it was not easy to fix.

Also, older software tended to be documented better, because there was no 'fire off an email' or 'ask Bob on Slack' to find out what some code means. The barriers to quick communication were higher, hence reading the code comments would have been the path of least resistance.


I completely agree. I wonder if there would be a niche for high-performance languages that are maybe a little harder to write than Python but still easy, and closer to the performance of C.


“Better” often meant applying rather absurd hacks for the sake of performance, yes, hacks that very often also compromised the quality of the end result.

In particular, some old video games feature the most wonderful hacks to achieve what they wanted with the available hardware, but those hacks often forced engines into compromises in environment design that many players failed to notice.

Many older game engines were actually incapable of stacking walkable planes on top of each other: though the levels were three-dimensional and one could ascend and descend, there was never any walkable surface directly under another, a restriction that enabled some optimizations.


Came here to say exactly this. There is so much arrogance in our industry, why would code from the past be worse?


Indeed. For anyone who grew up in the industry after the Internet was widely available, it might be hard to imagine how much effort was spent on making sure the code was correct before shipping.

Shipping meant producing that gold master build and sending it off to a floppy replication service, after which it got distributed to store shelves. That 1.0 version was frozen until you shipped a new version a year or more later. There was no second chance for that release; the code had to be as close to perfect as possible, first time out.


Medicine is a similar field that advances enormously based on increasing knowledge hand in hand with incredible leaps in technology.

Is it arrogance to say that medicine is far better than it was in the past?


> Is it arrogance to say that medicine is far better than it was in the past?

Irrelevant. Medicine and software development are not analogous just because we want them to be. That's the point. Software development results seem to have devolved (on quality metrics such as binary size for equivalent reliability, functionality, and notably time to release), despite improvements in almost every aspect of development tooling.

Also interesting to note: these quality metrics follow expected curves (accounting for development tooling) when DUPLICATING (sometimes with small improvements) working software. This tells us a great deal about a large problem with modern software.


People expect many more features from modern software than before, and accept worse reliability because of the faster bug-fix cycle.

Sometimes even playing music doesn't work on my mobile phone, and I have to restart it. I never had that problem with my cassette player 30 years ago. But that doesn't mean I'll go back to using a cassette player instead of my phone for listening to music.


> quality metrics such as binary size for equivalent reliability, functionality, and notably time to release

Binary size is not a quality metric.


Interesting that "avoids causing unnecessary costs" is not a quality metric to you. It's fun to see how it can suddenly become a really, really important metric on platforms with size limitations, even if those platforms don't care about "cost to users", which is the more common effect of binary size.


That cost is frequently swamped by development costs, runtime costs, support costs. Platforms with super restrictive size limitations are a small minority, these days you can plop a 5-10MB binary on your toaster.


Maybe not to you…


To the vast majority of software developers and software projects out there, except for very constrained embedded environments.

Server back ends don't care about it, front ends don't care about it despite the lip service paid, games sure as hell don't, etc

A small minority of web devs seem to care, together with creators of already gigantic mobile apps and as I was saying, embedded devs working in very constrained environments.

The market, as a whole, maybe has this as their 100th priority. Which means it's not.


This is often the excuse we hear for why modern software is bad: "well, normal people don't care about that", a strawman argument that implies anyone who does care is abnormal and therefore worthless.


> anyone who does care is abnormal and therefore worthless.

No, they're not worthless, they're worth less :-)

Don't tell me, tell the business people financing everything we build. By and large, they don't care, it's reflected in incentives and it's reflected in what we build.

So yes, I stand by that strawman.


> Don't tell me, tell the business people financing everything we build. By and large, they don't care, it's reflected in incentives and it's reflected in what we build.

Another common scapegoat for developers who write bad software. I just wish we had a little more integrity as an industry, especially since so many of us insist on calling themselves "engineers".


To use a more extreme example, regular engineers build tanks and jet fighters and drones and machines that make mustard gas, etc. The software world is just a reflection of the outside world, at this point.


I don't think that's quite the point they're making. They're saying that small binary size doesn't always translate to user value. Technical excellence can be viewed as a goal in its own right, but it's not the same thing as user value.


Yeah, for 99% of software produced, smaller binary sizes do not offer increased user value.

Software designed to spec and user requirements does (where those requirements could be fast development time, high performance for whatever the performance criteria are, longevity, easy and cheap extensibility, etc.).

Binary size, as I was saying before, is like 100 on the list of priorities for most categories of software. Mobile apps sometimes have it as a priority and embedded apps frequently have it, too. But for mobile apps those limits are loosening, so soon they'll stop caring, too, and embedded apps are a minuscule percentage of all apps developed out there (most of them are web apps, especially Line of Business - LOB - apps).


I'm not even sure how to explain why I would expect technology from the past to be worse than technology from the present. But yeah, I do expect that.


Code is a way to define a series of logical steps that can be performed using binary logic. This hasn't changed in 70 years.

Expecting code from the past to be "worse" is like expecting math from the past to be worse.


I also expect math from the past to be worse.


Oh, haha, well at least you're consistent.


Computer science has evolved a lot; we have self-driving cars on the road. Of course computing power improved a lot, but deep learning algorithms are also improving faster than Moore's law.

It's just that improving classical computer science has fewer practical benefits than machine learning nowadays, so the focus of research changed.


Is it really? The much more likely assumption to me is that something as basic as a calculator app wouldn’t have particularly complex internals.


If curious, past threads:

Windows 10 Calculator is now running on WebAssembly, natively on iOS and Android - https://news.ycombinator.com/item?id=20275484 - June 2019 (10 comments)

Fixing a Small Calc.exe Bug - https://news.ycombinator.com/item?id=20210963 - June 2019 (117 comments)

Counting Bugs in Windows Calculator - https://news.ycombinator.com/item?id=19367366 - March 2019 (53 comments)

Open-Sourcing Windows Calculator - https://news.ycombinator.com/item?id=19321217 - March 2019 (174 comments)

Windows Calculator invalid input with 10^x function - https://news.ycombinator.com/item?id=17670644 - Aug 2018 (88 comments)

%*$*#&#* Windows 7 Calculator - https://news.ycombinator.com/item?id=10791667 - Dec 2015 (108 comments)

Why is there not a Square Root button in calc.exe? - https://news.ycombinator.com/item?id=1623381 - Aug 2010 (23 comments)


Raymond Chen on the infinitely precise engine of the Windows calc:

https://devblogs.microsoft.com/oldnewthing/20040525-00/?p=39...


>I wouldn’t be surprised if these are the same people who complain, “Why does Microsoft spend all its effort on making Windows ‘look cool’?

A few years later they released Vista. Stones and glass houses.


PSA: Here's another, much more feature-rich calculator called Expression Calculator (or Global Calculator sold in Germany in 1997), written in Delphi - https://github.com/dblock/excalc, full source code and a working executable that still runs on any version of Windows.


(2019)

Some previous discussion: https://news.ycombinator.com/item?id=20696695


I've always found calculator applications to be quite funny. To use a calculator application while sitting at a computer is at least a bit ironic.


The original intention of computers was to perform calculations. The inclusion of facilities that allow for calculations to be performed on a computer is ironic how?

That seems like thinking it ironic that lawn mowers have a place to put blades for cutting grass.


The purpose of computers is to perform calculations. Then we built a massive pile of abstractions on top of that souped-up calculator to make it into a general purpose machine. Then we built a small calculator app on top of those abstractions. A lot of work to get back to square one.

(This obviously misses the point that we made computers usable by mostly anybody along the way, but it’s still funny)


As someone who sat alongside my granddad as he soldered an 8080 clone to the board, entered commands via switches, and saw binary results on an LED display (not a modern LED display, but literally a row of 8 red Light Emitting Diodes, you know), I tell you with great confidence that clacking 1+2x3 on a keyboard and seeing an answer on an SVGA+ screen is much quicker and more convenient than:

  - planning a program
  - implementing missing mul and/or shl instructions in your mind
  - managing register pressure for more complex expressions
  - having to start from scratch on a mistake
  - debugging with a multimeter
Even if you think of an MBR-style calculator, there are at least a dozen KB of BIOS and a dozen KB of VGA BIOS just to start with a blank screen that can do a cursor and digits. There is nothing cheap or straightforward down there, and the first thing people did was abstract bare metal away ASAFP.


One way of thinking of it is that a physical calculator has two parts - the circuitry that performs the calculations and the UI - in the form of physical buttons, displays, and the connections between them and the calculation circuitry - that allows people to effectively make use of it for their calculating needs.

A modern CPU has got the calculation part covered by itself, but it still needs the UI. Of course there are other kinds of UI for people to do calculations - for example Excel, or programming language REPLs - but desk calculators had pretty good UI for some use cases, so it makes sense to have an option based on them.


That only seems to explain why it could seem odd to buy an entire desktop PC to only ever use it as a basic calculator. In that case, it might make more sense to just buy a calculator. But there's certainly nothing odd about buying a general purpose computer and then using it for lots of different specific purposes, one of which is numerical calculation.


I agree with the parent because creating code which runs arbitrary calculations is quite weird. You would assume you could just send instructions to the CPU and print the output.

The idea that there’s a huge wrapper around the CPUs core functionality is indeed weird because you would expect that functionality to be available without any program at all.


> That seems like thinking it ironic that lawn mowers have a place to put blades for cutting grass.

A more apt comparison would be having a 40ft tall Mech that transforms into a jet and flies from neighborhood to neighborhood then crouches down and uses a tiny pair of scissors to cut the grass.


Given computers are nothing but a mind-bogging amount of very simple operations (add, mov, whatever on fixed size values), I'd suggest an equally apt comparison is that a computer is billions of nanobots cutting grass one blade at a time, and the calculator app is the Mech.


I suppose what he meant is it's ironic due to the insane overhead. When the end-user double clicks calc.exe, types in 1+1, and sees the result on the screen, the CPU has executed on the order of tens of millions of instructions (filesystem code, OS code, UI code, display driver code, etc) to show a simple result that could have been done in a single x86-64 ADD instruction.


Consider what happens when you type in "= 1 + 1" in the Google search bar then.


shudders ;)


The irony is more so the unnecessary skeuomorphism: that, inside of a machine that is a strict superset in capabilities of another, we simulate the limited interface of the latter machine to make it "look and feel" as though it were an actual calculator.

Obviously, a prompt that accepts arbitrarily complicated mathematical expressions and returns a value is a far superior interface than clicking on buttons with a mouse, but skeuomorphisms in design are very commonplace, in spite of their lesser efficiency.


> That seems like thinking it ironic that lawn mowers have a place to put blades for cutting grass.

A more apt analogy in this case would be a car having pedals.


This might be the Alanis Morissette version of ironic, which includes anything a bit funny, or weird, or that you can't think of a term for. Like rain on your wedding day.


I think your parent comment meant computers are calculators themselves since forever.


Why is that ironic? The obvious use case for a computer is to, you know, compute things, and the most obvious way to do that is to use a program designed for that very purpose.


Because a computer is obviously more powerful than a desk calculator (unless you happen to have a programmable calculator, which would itself be a computer.)

Transposing the metaphor of a desk calculator verbatim is inefficient and unproductive, a purely graphical gimmick. Unix bc, which predates the Windows calculator by a couple of decades, is Turing-complete and in fact close to a full-featured modern dynamic language.

Today's equivalent would be a {python|ruby|node|perl} REPL.


The UX of a REPL sucks if you're not an expert. A calculator UI gives you a lot more guidance about what's available.

Accessibility is everything. I would guess the skeuomorphic calculator apps have 5 orders of magnitude more users than bc, and 4 more than the REPLs (not that the REPLs were even built for this purpose).

I think they will evolve over time away from skeuomorphism, but not into anything resembling a plain REPL.


Excel has the same accessibility as a calculator UI. I'll have that, in a smaller window :)

Also, the UX of a REPL sucks because it lives in a white-on-black terminal and prints scary version messages about a thing called "clang". Do the same thing with nice colors and friendly messages and I doubt the usability would be any worse.


This is precisely what I'm trying to illustrate. No, Excel does not have the same accessibility as a calculator UI. Excel is somewhere in between a calculator UI and bc in terms of how discoverable and usable it is.

Engineers fall into this trap a lot - projecting our own preferences onto what we think other people should find accessible. Most people require training before they can use Excel. The training required to use a calculator is so minimal, they teach it in elementary school. And a REPL is gibberish to the vast majority of people, even if you dress it up with nice colors and friendly messages (which to be fair help a lot).


To explain the irony for OP: a computer is, or historically has been, basically a fancy calculator, so putting a program called "calculator" on it seems redundant. It would make more sense to call it e.g. algebra and have other programs called geometry and so on.


I think the issue is more that calculator apps are one of the last strongholds of skeuomorphism. Why does almost every calculator app have a keypad, even those for non-touch OSes?

For a refreshingly modern take on calculator apps, take a look at something like Soulver/Numi/Calca which use a notebook-style interface, or SpeedCrunch which uses a REPL-style interface (and which does have a keypad, but you can turn it off).

If the idea of a calculator app just seems ugly to your brain, PowerShell does floating-point math and is scriptable to boot. I've used it as a calculator in the past, though it means having to put up with some wonky syntax.


> Why does almost every calculator app have a keypad, even those for non-touch OSes?

How would a non technical person do a square root or exponent? It's not obvious to type sqrt(2) or 3^4 into a text field, and people shouldn't have to read a manual to use a calculator. The default calculator is meant for the average person, and advanced tools are available for the rest.


It's been done with Mathematica, which includes a "basic" toolbar with square roots, integrals and the like. There's plenty of room for middle ground between a REPL and emulating a physical calculator.


The thing that makes me giggle is how it also happens to be the most primal function of a computer.

I'm sure all tools you suggest are great, but as someone who can program, I can just use the REPL of whatever programming language I'm using at the moment. I have no need for a dedicated tool in the first place.


Agreed. Operating systems turn computers into something more like filing systems. It's a shame because computers are actually really useful too!


Floating point implementations vary among programming languages. On the same machine and OS (Win10), the expression 0.3 - 0.1 yields different results depending on the calculator:

PowerShell says 0.2,

the calc.exe app and LibreOffice Calc agree with 0.2,

bc running in Cygwin also 0.2,

Python 2 and 3 answer 0.19999999999999998,

JS in Vivaldi and Firefox also answer 0.19999999999999998,

but Portacle (Common Lisp) returns 0.20000002.


Not sure how this is related, but:

- if you get 0.2, the code is using rational numbers or a custom implementation

- 0.199999...8 is the double precision IEEE subtraction.

- 0.200000...2 is the single precision IEEE subtraction.

This looks like a typecasting/coercion issue. You can likely get all of the above in C++ or Java by using different explicit casts.
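All three behaviors can be reproduced from one language. Here's a rough Python sketch; the f32 round-trip helper is my stand-in for an explicit single-precision cast (Python floats are always IEEE doubles, so single precision has to be emulated):

```python
import struct
from fractions import Fraction

# Double precision (Python floats are IEEE 754 doubles):
print(repr(0.3 - 0.1))  # 0.19999999999999998

# Single precision, emulated by rounding values through a 32-bit float.
# (Hypothetical helper; C++/Java would use an explicit cast instead.)
def f32(x):
    return struct.unpack('f', struct.pack('f', x))[0]

print(repr(f32(f32(0.3) - f32(0.1))))  # close to 0.2, but not exactly 0.2

# Exact rationals give the "human" answer:
print(Fraction(3, 10) - Fraction(1, 10))  # 1/5
```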


> if you get 0.2, the code is using rational numbers or a custom implementation

IIRC, there are languages that use IEEE double precision but the way they handle default display presents this as 0.2 (basically, they use a display algorithm that displays the shortest decimal expression that has the same double precision representation.)
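Python 3 itself uses shortest round-trip display: it still shows the subtraction as 0.1999...8, because that really is a different double from 0.2, but it hides the fact that 0.1 itself isn't exact. A fixed-precision default display, by contrast, does make the result look like a clean 0.2. A sketch (the .15g formatting here is my stand-in for runtimes with a fixed default precision; which runtime does which is an assumption):

```python
from decimal import Decimal

# Shortest round-trip repr: 0.1 prints as "0.1" even though the stored
# double is not exactly one tenth, as Decimal's exact conversion reveals.
print(repr(0.1))     # 0.1
print(Decimal(0.1))  # 0.1000000000000000055511151231257827021181583404541015625

# A fixed 15-significant-digit display also makes 0.3 - 0.1 look clean:
print(format(0.3 - 0.1, '.15g'))  # 0.2
```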


Using doubles in Java you get 0.199999..8, using BigDecimal you get the correct 0.2.

    jshell> new BigDecimal("0.3").subtract(new BigDecimal("0.1"))
    $8 ==> 0.2


Yeah, and using float literals you get 0.20000...2:

   jshell> .3f-.1f
   $1 ==> 0.20000002
We got ourselves a hat trick :)


0.2 could just be base 10 / decimal floating point rather than the base 2 single/double i think, It’s all ieee754 it’s just the base10 stuff came later


    In [1]: from decimal import Decimal as D

    In [2]: D('0.3') - D('0.1')
    Out[2]: Decimal('0.2')


And then there are PLs that adopt the incredibly radical technique of treating 0.2 as two tenths[1].

[1] https://medium.com/@raiph_mellor/fixed-point-is-still-an-app...


As far as I know, BC doesn't do floating point arithmetic. It does fixed point arithmetic, built on top of arbitrary-precision integer arithmetic. Which is why it won't give you one of those funny floating point answers. It's what a human would do on a piece of paper.

Floating point implementations can vary not only between languages, but also between different CPUs, if the language relies on the hardware implementation (which is the smart thing to do in most cases).

As a side note: Python's implementation of FP isn't standard-compliant. E.g. when you divide 5.0 by 0.0, the standard says you should get +inf. In Python you get an exception.
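A quick sketch of the Python behavior described; note that the float type itself does support IEEE infinities, it's only the division operator that raises:

```python
import math

# Python raises instead of returning +inf for float division by zero:
try:
    result = 5.0 / 0.0
except ZeroDivisionError as e:
    print("raised:", e)

# ...but infinities are still representable and comparable as floats:
print(math.inf > 1e308)  # True
```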


Actually, according to the standard, you should get either +inf or an exception. Both behaviors are valid. Nevertheless, according to the standard, the user should be able to choose between getting +inf and getting an exception. If in Python there is no way to mask the divide by zero exception, then that is not standard-compliant.


Wait, isn't this the calculator that doesn't know order of operations? When I used to teach math, I would tell my students to type in 1+2*3 in their calculator and if it didn't give 7 as the answer, they should throw it away. The Windows calculator was one of the ones that failed, if I remember correctly.


Windows 10 calculator has a simple mode where operations are evaluated immediately which would give 9 for those keystrokes, but does at least show what it's actually calculating: https://i.imgur.com/au7dvfg.png

In scientific mode you can enter the full expression and get 7: https://i.imgur.com/DT6LNKb.png


Rather than not knowing order of operations, it doesn't parse algebraic expressions at all; it just applies each binary operation to the current value as it's entered. It doesn't operate on "1+2*3" as an input: it sees "1", "+", "2" (giving 3), then "*", "3" (giving 9).
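A toy sketch of the two paradigms (illustrative Python only, not the calculator's actual implementation):

```python
import operator

# "Immediate execution" mode: each operator is applied to the running
# accumulator as soon as the next number arrives, so precedence is ignored.
OPS = {'+': operator.add, '-': operator.sub,
       '*': operator.mul, '/': operator.truediv}

def immediate(tokens):
    acc = tokens[0]
    for op, num in zip(tokens[1::2], tokens[2::2]):
        acc = OPS[op](acc, num)
    return acc

print(immediate([1, '+', 2, '*', 3]))  # 9: computed as (1 + 2) * 3
print(1 + 2 * 3)                       # 7: Python parses with precedence
```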


It works like you say in standard mode by design because it's trying to match the behavior of older desk calculators that worked that way. In scientific mode it respects order of operations.

There's a proposal to change the UI to make this clearer: https://github.com/microsoft/calculator-specs/blob/247bbb50d...


It seems absurd to make the default behavior incorrect because older cheap calculators did the wrong thing. This isn't like making an incompatible change to a programming language to fix an incorrect behavior, this is more like if the designer of a new programming language decided to make 1+2*3 give nine because his cheap desktop calculator did the same.


I've found that a surprising number of people (outside of fields like math, engineering, programming) are used to the idea of entering an operation and having it applied to the current value, e.g. pressing the [√] button immediately square roots the current number, pressing [X][2] doubles the current number, etc.

It's a different paradigm to using the buttons to enter a full expression then pressing [=] to evaluate it, but I don't think it's accurate to call it "the wrong thing" - just that the button presses aren't transcribing to the equation you expect (the equation actually displayed by Windows 10 calculator is now accurate).


I'm running calc.exe from Ye Olde Windows days, not this fluent-UI 'refresh', and it gave me 9.

Feeding it 1 + (2*3) produced 7.


it's no wonder windows 10 is ugly

their stack to design native programs is bloated and confusing af, it seems very tedious and incompatible with quick iteration workflows

when you compare to apple with SwiftUI, it's night and day


XAML is relatively horrible to work with in comparison to the old Win32 controls.


Am I the only one that likes XAML?


I prefer Android's XML mechanism for declaring UI over XAML.


How come?

XAML is intuitive and powerful. Android XML seems like an afterthought.


I disagree. XAML is by far the best UI framework I’ve worked with.


Back in the stone age, while using Windows 3.11 (I think it was called Windows for Workgroups or so...), there was a bug in calc: 0.02-0.01 resulted in a flat zero. Perhaps it's time to open an issue.


They addressed that quite a long time ago, with a switch away from floating point math: https://devblogs.microsoft.com/oldnewthing/20040525-00/?p=39...

Somewhat related, HP calculators used floating point BCD internally since the beginning, to avoid this sort of thing. (Custom mostly-4-bit CPU chips with specific hardware for the purpose, etc.)


It must have been fixed ages ago as I can't reproduce it in Windows 95.


Yes, as far as I remember it was really only in WfW 3.11


> the very earliest iterations of Windows Calculator, starting in 1989, didn't use the rational arithmetic library, instead using floating point arithmetic and the much greater loss of precision this implies.

As I recall, there was quite a brouhaha about Windows calculator solving 2 x 2 as 3.9999999945 or something like that. Pretty sure this rational number library was added in response to the bad press.


Notepad next...


Notepad.exe is just a wrapper around the edit control [1] that Windows ships as part of its core system. It doesn't even properly support undo as you can only undo the last action, which becomes the new last action, allowing you to only toggle between two states.

https://docs.microsoft.com/en-us/windows/win32/controls/edit...


It's a window that mostly uses an edit control, but, having seen Notepad's code, it's not just a wrapper around the edit control. Could you write Notepad in a day or two? Perhaps. But it's misleading to say that it's just a wrapper around the edit control.


Unlikely. Notepad is just a wrapper around the Windows API's textbox/textarea/whatever it's called, so open sourcing it isn't much.


Apple open sourced TextEdit for exactly that reason, to show how easy it is to write text editing applications in Cocoa: https://developer.apple.com/library/archive/samplecode/TextE...


MSPaint please


Why? There is Paint.Net and it was open source for a long time.


Because Paint.Net isn’t MSPaint. It’s not exclusive. There’s lots of paint programs but only one MSPaint.

Why calc.exe?


I would very much welcome that on my Linux install.


Obviously not quite the same, but if you haven't seen it yet: https://jspaint.app/


I’ve always wondered how far back the modern Windows 10 codebase goes. Could there be any remnants in the code from before the Windows 3.x era?

Also being open source do they accept pull requests? Can my code end up being shoved out in a Windows update?


Some changes to calc.exe (and a few other open source parts) in Windows 10 over the last couple years have come from pull requests.


At least the oldest icons still exist in modern Win 10, yes, even after MS decided to update some icons to fit the new "Fluent Design": https://imgur.com/a/6T31ziE (from the latest insider build)


I use https://thomasokken.com/free42/ as my calculator on all my devices: android, pc, and offline.


I probably won't have time to do this for the next few years, but I wish they'd release regular Notepad as open source. There's a part of me that sort of wants to do some kind of masochistic feature-by-feature port of Notepad into a functional language, just to pay homage to the first editor I ever used.

I know that I could obviously just create my own text editor, but I think it would be kind of fun to specifically recreate notepad of all things.


Maybe they can fix this one, in the new calc:

https://superuser.com/questions/26193/why-does-windows-calcu...


3.11-3.10= ???


Text says “depth”, I read “spaghetti”


One man's spaghetti is another man's elegant architecture. Though sometimes it is spaghetti.

Having read this code and lots of other MS code, it is in pretty typical MS style, though a bit undercommented in the random few modules I looked at. I am sure there are probably a few bits in there that are hairy. But I think I will skip that code review today.

Do not mistake someone else's "style" for "ugly". It is an easy trap to fall into.


You may not like it, but this is what peak performance looks like.


Really? It was actually a lot better than I was expecting.


Standard mode doesn't follow PEMDAS. This software needs to go.

https://answers.microsoft.com/en-us/windows/forum/apps_windo...


This is the standard way calculators have operated since forever. They don't evaluate a whole expression, they simply have an accumulator and you operate on it one step at a time.


Still, it's surprising that this behavior changes when you switch to "scientific" mode. The UI doesn't communicate it well.


Never had such a calculator, but you're probably right.


You’ve never used a cheap four function calculator?


It's probably a concession to skeuomorphism. Most physical calculators I've seen that look like the standard mode ignore PEMDAS as well, whereas pretty much all scientific calculators will respect PEMDAS.

If you took one of the people who still use old fashioned "standard" calculators and put them on a standard mode that followed PEMDAS, they'd probably be confused, at least initially. I've seen that a lot at cash registers in restaurants, dry cleaners, and other small businesses.


Actually, my hunch is that the chip doesn't need a stack that way, so it's cheaper.


While it's basic math, you would be surprised at how many people have forgotten PEMDAS. I would argue that the additive way comes more naturally to people. For people wanting to do more complex equations, the scientific mode does follow PEMDAS.


Personally I've always found relying on operator precedence uncomfortable and I always add parentheses even when they are not strictly needed; this was the case even back in high school when we were doing calculations with then-fancy graphing calculators. I just find the extra degree of confidence worth the slight extra effort and noise. Maybe I'm just a belt-and-suspenders type of guy.

These days for actual physical calculators, and on phone, I've transitioned to RPN which neatly transcends such petty ambiguity problems.
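RPN sidesteps precedence because operands go on a stack and each operator pops two and pushes the result, so the entry order fully determines the computation. A tiny illustrative evaluator (hypothetical sketch, not any calculator's real code):

```python
import operator

OPS = {'+': operator.add, '-': operator.sub,
       '*': operator.mul, '/': operator.truediv}

def rpn(tokens):
    """Evaluate a list of RPN tokens; numbers push, operators pop two."""
    stack = []
    for t in tokens:
        if t in OPS:
            b, a = stack.pop(), stack.pop()
            stack.append(OPS[t](a, b))
        else:
            stack.append(float(t))
    return stack[0]

print(rpn("1 2 3 * +".split()))  # 7.0 -- "1 + 2*3" with no ambiguity
```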


I think it's just a matter of people adapting to the tools that are easily available. Almost 100% of the time I've seen someone use an actual basic tabletop calculator, they've been tallying prices and slapping on a sales tax. At that point it's just a learned ritual and they're not thinking about order of operations. If they needed parentheses, they'd just memorize that the parentheses go around the prices.

Once the computations get complex enough that you have to write down the formulas, then it makes a lot of sense to treat the computations as algebraic expressions, meaning PEMDAS.



