I have found the same. I was asking it yesterday about calculating rental yields for our rental property. Usually it picks up the context from the whole thread, but I found myself repeating myself a lot when asking new questions about possible scenarios. This is at odds with how well it knows aws-cli, ffmpeg and yt-dlp… and bash. My productivity at the command prompt has skyrocketed… but it really flounders at financial calculations.
I would recommend not using it for any kind of calculation, even simple ones. It's VERY bad at them, and what's dangerous is that it presents the wrong answer in a way that looks plausible.
I've tried to use it to calculate the average of 10 or so numbers, and every time I ask it the exact same question I get back an "average" that looks plausible but is slightly different from the last one. Then I whip out the calculator, work it out myself, and the real answer is one ChatGPT never gave me.
It's *really* dangerous to use it for any kind of important calculation, like financial stuff.
I find it ironic that it can generate a function that calculates the average of an array of numbers in a dozen programming languages, but it can't tell you the average when you ask for it directly.
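For example, this is roughly the kind of function it will happily spit out (a minimal Python sketch; the function name and the sample numbers are just illustrative):

```python
def average(numbers):
    """Return the arithmetic mean of a list of numbers."""
    if not numbers:
        raise ValueError("cannot average an empty list")
    return sum(numbers) / len(numbers)

# Running the code gives the same answer every time,
# unlike asking the model to do the arithmetic itself.
print(average([3, 7, 12, 5, 9, 14, 2, 8, 11, 6]))  # 7.7
```

Ask it to *run* that logic on your ten numbers in its head, though, and you get the slightly-different-every-time behaviour described above.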