I used Sage heavily when working in string theory a long time ago. It's very powerful. But for symbolic math, if possible, I would still use Mathematica.
There is also an excellent open-source clone of the rule-reducing engine called Expreduce[1], and I've been working on Foxtrot[2], which uses Expreduce to create a GUI and data platform on top of it.
Or SymPy [1], which is very accessible and embedded in Python, so it connects directly to that huge ecosystem. I teach university maths and use SymPy for every CAS need.
How good is Axiom compared to Mathematica in general? I am familiar with Mathematica; I'm asking to get a zeroth-order idea of Axiom before trying it out myself. Thanks.
I think Mathematica is HUGE. Axiom/FriCAS does not have everything that Mathematica has (especially statistics).
Overall it is not that polished UI-wise. I think it is pretty good for algebra and symbolic computation.
Let me quote from the FriCAS book/guide:
"
FriCAS provides state-of-the-art algebraic machinery to handle your most advanced symbolic problems.
For example, FriCAS’s integrator gives you the answer when an answer exists. If one does not, it
provides a proof that there is no answer. Integration is just one of a multitude of symbolic operations
that FriCAS provides.
"
You can find out more by reading the book which is available here:
There’s a bit of a learning curve for sure, but in the last year Maxima has become an indispensable tool for me. Symbolically writing out multiple equations, putting them together, and solving for a variable buried deep inside is dramatically more repeatable and less error-prone. Pretty much anything I’m working on now that involves algebra or calculus gets a Maxima file that starts with first principles and derives the final equation.
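Sketching that pattern in Wolfram Language syntax, since that's the thread's focus (Maxima's solve/eliminate support the same workflow; the relations below are made up for illustration):

    (* combine several relations and solve for a variable buried inside,
       eliminating the intermediate quantities F and A *)
    Solve[{P == F/A, F == m a, A == Pi r^2}, P, {F, A}]
    (* -> {{P -> (a m)/(Pi r^2)}} *)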
I'm a physicist and I've used Mathematica sparingly off and on for several years, mostly manipulating symbolic formulas.
I still do not understand what I'm doing half the time; mostly just Googling and Stack Overflow answers get me by. For example, I don't understand why I can't use subscripts as symbols.
Any recommendations on trying to 'get' mathematica?
As someone who started using Mathematica only very recently, I found The Wolfram Language: Fast Introduction for Programmers [1] very helpful. It's the quickest way to get started with the Wolfram Language if you've seen a bit of functional programming before; e.g., Python's map(F, x) is equivalent to Wolfram's Map[F, x], reduce(F, x) to Fold[F, x], etc. There's also An Elementary Introduction to the Wolfram Language by Stephen Wolfram [2,3], which is available freely as Mathematica notebooks, although I've yet to read it fully.
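A quick taste of those equivalences (f is just a placeholder symbol here):

    Map[f, {1, 2, 3}]        (* -> {f[1], f[2], f[3]} *)
    Fold[Plus, {1, 2, 3, 4}] (* -> 10, like reduce(add, [1, 2, 3, 4]) *)
    Fold[f, x0, {a, b}]      (* -> f[f[x0, a], b] *)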
My advice is to find a good book that teaches the fundamentals. I'm currently reading "Power Programming with Mathematica: The Kernel" by David B. Wagner. It was published in 1996 and covers then-new version 3, but boy is it good (and still very relevant!).
I think the reason Mathematica hasn't "caught on" is that it doesn't look like the Fortran/MATLAB-type programming that scientists/engineers are more traditionally exposed to. I find that if I'm trying to write an explicit loop (as opposed to a map or vector operation) in Mathematica, I am most likely doing something wrong.
You'll drive yourself crazy trying to understand subscripts as a beginner; I would advise against getting too fancy with them unless you are fairly advanced. They work reasonably transparently in most cases, but there are a handful of situations where they are really frustrating (e.g. in With/Module/Block constructs). You can actually use them as symbols if you load Symbolize from the Notation package, but then you lose the ability to "do math" on the subscript (e.g. Table[] won't generate the symbolized variables).
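The root of the trouble is that a subscripted variable isn't a Symbol at all; a minimal illustration:

    Head[x]                        (* -> Symbol *)
    Head[Subscript[x, 1]]          (* -> Subscript: a compound expression *)
    Module[{Subscript[x, 1]}, 1]   (* error: Module expects a list of symbols *)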
Read through it once to get the gist, then go back and read it again. The second time through, _play_ with each example. Try them out, see how they behave, attempt to apply other things you've learned to each. In no time at all, you'll have the basics down and then will be able to make heads or tails of the other doc pages much more quickly.
That's what working in CS feels like all the time! /sarcasm
Jokes aside, you're probably experiencing the part of learning a programming language where you know what you want to do but don't understand the language well enough to express it correctly. Usually the solution is more practice in said language, which is where pet projects come in handy.
Maybe you can try implementing something that already exists from scratch? That way you can always look for help online while implementing it, if you need it, but you get to exercise the language a bit.
It takes effort and time this way, for sure, but it'll help you get a good grip.
Mathematica is worth learning and using since it's so far ahead of the open-source alternatives (e.g. Python/Jupyter and associated libraries) in usability/interactivity/rapid development that it comes across like future-tech.
Unfortunately it's also the single tool most hampered by its licensing and silo-like ecosystem.
I've tried it for modeling tasks, but I struggled with the sort of basic data manipulation that can be done within pandas/data.table. I can quite comfortably work with 100-million-line CSVs using data.table on a standard laptop, but Mathematica wasn't even able to ingest the file. I don't disagree that it's technically very impressive, but there's no point in having these amazing features if it stumbles on such basic tasks.
I've used Mathematica since about the year 2000, and I think Wolfram "missed the boat" on AI, big data, and machine learning.
They were in the unique position of having one of the best symbolic differentiation engines and one of the best numeric engines and a Lisp-like REPL that allows one to write terse, elegant code.
What they were always missing was efficient bulk data structures.
In recent versions they've added a handful of "special cases" where some types of data are stored as a plain data array like in C-derived languages, but this is hit-and-miss.
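If I understand the "special cases" right, this is mostly about packed arrays; a quick way to see the behavior (Developer`PackedArrayQ is the usual, if semi-hidden, test):

    arr = Range[1.0, 10.0];
    Developer`PackedArrayQ[arr]    (* -> True: stored as a flat machine-real array *)
    arr2 = Append[arr, "oops"];
    Developer`PackedArrayQ[arr2]   (* -> False: one non-numeric element unpacks it *)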
Similarly, they've dabbled with GPU acceleration and parallelism, but it's half-baked. It feels like a proof of concept, not something you'd ever actually use.
Julia and the like will slowly but surely eat their lunch.
The neural network stuff is very good and relies on GPU acceleration. It follows the "computation graph" paradigm -- you have to define the whole architecture up front -- but it's actually quite pleasant to use.
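As a taste, defining a small architecture up front looks like this (a sketch; the layer sizes are arbitrary):

    net = NetChain[{LinearLayer[32], Ramp, LinearLayer[2], SoftmaxLayer[]}];
    (* NetTrain[net, trainingData, TargetDevice -> "GPU"] then trains the
       whole predefined graph on the GPU *)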
Maxima is also an amazing tool for symbolic maths, and it is free software. I especially recommend the wxMaxima interface, which is close in spirit to a Jupyter notebook.
I'm taking this opportunity to post a video I made some time ago of my current work-in-progress, a new user interface to Maxima. It's more like a regular command line compared to wxMaxima, which may or may not be what users want.
I'm an R and sometimes Julia user, although not Python. Can you offer some examples of how Mathematica surpasses open source alternatives in those areas you've mentioned?
I find R to also be a lot better than Python and closer to Mathematica (especially if you combine it with RStudio and Shiny) but still not quite as good overall on the interactivity/environment end.
None of those can beat it in symbolic computation. You can write entire papers simply by asking yourself: I wonder if this has an analytical integral? If you are lucky, Mathematica spits out a solution in terms of a special function.
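A small example of that workflow (my own, not the parent's):

    Integrate[Sin[x]/x, x]
    (* -> SinIntegral[x], a special function *)
    Integrate[Exp[-x^2] Cos[x], {x, -Infinity, Infinity}]
    (* -> E^(-1/4) Sqrt[Pi] *)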
Because the most important part of the paper is formulating the question it is answering, and stating why asking that question is interesting and useful. That is not something anything short of AGI can do.
There are several functions in base R for differentiation, integration, solving systems of equations, etc., e.g. `solve`, `stats::D`, `stats::deriv`, `stats::integrate`, `stats::numericDeriv`.
The R package Deriv[1] does symbolic differentiation and allows the user to supply custom differentiation rules.
The R package numDeriv[2] calculates numerical approximations to derivatives.
The R packages gmp[3] and Rmpfr[4] provide multiple-precision arithmetic and floating-point operations. They also include some special functions, e.g. Rmpfr::integrateR for numerical integration.
The R package mpc[5], available at R-Forge, provides multiple-precision arithmetic for complex numbers.
The R package rSymPy[6] provides an interface to the SymPy library in Python via rJava.
The R package Ryacas[7] provides an interface to the Yacas computer algebra system. It is easier to install than `rSymPy`.
The R package symengine[8] is an R interface to the SymEngine C++ library for symbolic computation.
I use R much more often than Mathematica and think it is great for many reasons, but there are places where Mathematica is on another level. My mathematical maturity isn't high enough to really get how it's done or describe it well, but Mathematica has a way of being shockingly consistent across concepts, and it has pretty thorough documentation that can even help you learn the topics. R is very inconsistent even in the internal library, and documentation quality runs from the best around to worse than no documentation.
There are also nifty little things: for image processing, you can have a hard-coded image show up right in your code (I like plain text better, but it's cool and future-techy). Distributions (as in normal, binomial, Poisson, etc.) are a type, and PDFs and CDFs can be obtained from them consistently, rather than having to remember the different parameters of dnorm, dbinom, etc.
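Concretely, the distributions-as-values design looks like this:

    PDF[NormalDistribution[mu, sigma], x]
    (* -> E^(-(x - mu)^2/(2 sigma^2))/(Sqrt[2 Pi] sigma) *)
    CDF[BinomialDistribution[n, p], k]   (* same verbs for every distribution *)
    Mean[PoissonDistribution[lambda]]    (* -> lambda *)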
I would love a real Mathematica expert to give us more. That's the real drawback of the closed ecosystem: there is so much less information about it out there, fewer code samples, etc.
When I had the chance to try it, I didn't find it that easy to use due to the interface, which felt clunky: the way the command line works: no feature to repeat the previous command with the up arrow, you have to edit the previous existing one, requiring more mouse usage, weird forms of cursor placement, weird default Enter key behavior.
In ipython, matlab and octave it's much easier to repeat and modify last commands, which is something you seem to need all the time when experimenting with math.
What was I missing, usability/interactivity/rapid development wise?
> the way the command line works: no feature to repeat the previous command with the up arrow, you have to edit the previous existing one, requiring more mouse usage, weird forms of cursor placement, weird default Enter key behavior.
It's not a command line. It's an interactive notebook. It's an entirely different experience. Repeating and modifying last commands can still be done with arrow keys, then Shift-Enter. Also, when you are doing math, you spend way more time thinking than typing and manipulating; the time needed to move your hands to the mouse is minuscule by comparison.
What command line are you talking about? If you want a command-line IPython-style REPL, enter MathKernel, which absolutely does support arrow keys to go backward and forward in history (and works over SSH without X). If you want an IPython (later Jupyter) notebook-style interface, enter Mathematica/Wolfram notebooks (guess where the IPython notebook got its idea from). Sounds like you just didn't bother to learn a bit about it before making up your mind.
Mathematica notebooks have some REPL-like aspects, but they really should be used like a script editor that shows the script output inline. You can jump between script segments and execute them out of order, which is useful in some cases. But notebooks are at their best when you edit with some discipline, so that they stay in a state where you can open them, select "evaluate notebook" from the menu, and everything works on the first try.
For interpreted languages, it doesn't matter semantically whether you use strings or not. It's always a run-time value that needs to get resolved when the interpreter gets there.
Yes, in WL strings are commonly used in a manner similar to enums in other languages.
The thing to keep in mind is that WL is based on symbolic replacement. You normally wouldn't define that kind of function as taking in a parameter (i.e., CreateDataStructure[x_] := ...), rather you'd define it for each case separately (i.e., CreateDataStructure["LinkedList"] := ...).
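A sketch of that style (myCreate is a made-up name, not the built-in CreateDataStructure):

    myCreate["LinkedList"] := <|"type" -> "LinkedList", "data" -> {}|>
    myCreate["Stack"]      := <|"type" -> "Stack", "data" -> {}|>
    myCreate[type_String]  := $Failed   (* catch-all for unrecognized type strings *)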
Does anyone know of a good, strongly-typed computer algebra system similar to Mathematica?
By strongly typed, I mean that Mathematica is weakly typed in the sense that it simply assumes that all expressions are Complex numbers. At most, it can restrict itself to some subset such as the Reals or Integers, but that's it.
It can't, for example, perform general simplifications over non-associative types such as most Matrix algebras, the quaternions, or any geometric algebra. Most built-in functions operate only on Complex numbers, or expressions over the Complex numbers, etc...
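A small demonstration of the default assumption:

    a b - b a         (* -> 0: Times is Orderless, which is only valid for scalars *)
    a ** b - b ** a   (* NonCommutativeMultiply stays unevaluated, but almost
                         no built-in function knows how to simplify it *)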
I'm looking for something I can use to do symbolic expression manipulation for physics equations in terms of geometric algebra, but as far as I know there's nothing out there with the capability.
Mathematica is an amazing tool, but it has one huge downside: reproducibility.
It is a closed monolith, and - especially when you start to use advanced features - you derive results from it that can't be independently verified because of its closed-source nature.
Thanks for the link! The succinct article, "Open Source Mathematical Software", illustrates the issue with a quote from a Mathematica tutorial:
> the internals of Mathematica are quite complicated, and even given a basic description of the algorithm used for a particular purpose, it is usually extremely difficult to reach a reliable conclusion about how the detailed implementation of this algorithm will actually behave in particular circumstances.
I agree. How an algorithm "actually behaves" depends on the implementation details of the language interpreter or compiler; open source or not, there are so many complex transformations leading to running code that it might as well be a black box.
From the article calling for open-source mathematical software (my emphasis):
> ..we need a symbolic standard to make computer manipulations easier to document and verify.
> ..perhaps we should not be dependent on commercial software here. An open source project could..find better answers to the obvious problems such as availability, bugs, backward compatibility, platform independence, standard libraries, etc.
> Increasingly, proprietary software and the algorithms used are an essential part of mathematical proofs.
> ..with this situation two of the most basic rules of conduct in mathematics are violated: information is passed on free of charge and everything is laid open for checking.
If Mathematica were to be open-sourced one day, I suppose that would cover most of this wish list, with improved availability/reproducibility and verifiability. Tough to imagine without significant funding, collaboration, and communal agreement.
If software is used only to create data, that is okay - like editing text files in Microsoft Windows. But computational software such as Wolfram being closed source bothers me a lot. There is no way to verify that the science you do is correct.
Quite often it gives you a result that you can then prove directly, or check with other tools. The benefit of MMA is that it has a lot of tools and a good interface, good documentation, and a large community.
In practice, pretty much no one doing science has the expertise or time to completely verify the science they are doing - they are building on centuries of knowledge across many disciplines, and for the most part the community verifies each part as they build knowledge.
And certainly opensource does not allow the vast majority of people "to verify the science you do is correct." They'd have to check the code, the compiler, the hardware, ensure no cosmic rays flipped bits during computation, and so on.
So I'd not worry too much about the closed source vs open source nature of it. It's a solid tool that enables lots of research.
The “cosmic rays” argument, to me, is inane. It simply doesn’t practically apply and is certainly not an argument against the benefits of open source code. You’re castigating the whole practice of code review, computer-aided proofs, automated theorem proving, etc.
Have you ever looked at the rates, or do you just dismiss it without looking? Note that the current rate is higher than in older references, since feature sizes have shrunk and lower-energy events can change bits on newer hardware.
Scientific computation, especially at the level of most researchers, is affected by cosmic ray bitflips, without question.
Since the OP was complaining about not being able to check everything ad absurdum, this effect is certainly on the table. It's more likely to affect research than the difference between closed and open source is, if a researcher is ignorant of it.
It's also why good researchers, who know this is a real effect, try to run a computation using multiple methods at different times, until they feel the consensus on the calculations is robust enough.
If you've never done it, write a program to watch memory for bit flips, and be amazed.
Here's an intro - do a back of the envelope calculation and see if you still think these events are rare enough that they don't affect common scientific work.
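For instance (every number below is an assumption; the rate is an often-quoted 1990s-era estimate and varies hugely with altitude, hardware, and whether you have ECC):

    (* assumed rate: ~1 upset per 256 MB of RAM per month *)
    upsetsPerMBPerMonth = 1/256.;
    ramMB = 16*1024;                 (* a 16 GB workstation *)
    ramMB*upsetsPerMBPerMonth        (* -> 64. expected upsets per month *)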
>You’re castigating the whole practice of code review, computer-aided proofs, automated theorem proving, etc.
No I'm not. Those are but one avenue of reducing the probability of error during computation. All of those only ensure that the code part is solid - there is an entire other world on the physical part that needs incredible engineering, noise reduction, error correction, defect mitigation, thermal issues, quantum issues, physical data decay, memory leakage, and so on.
I think by focusing only on aspects for code, you miss a large part of ensuring modern computing is accurate.
And how many people doing "science" do code review, computer-aided proofs, or automated theorem proving to verify their code is correct? Very, very, very few.
Two things I miss when working with Mathematica: refactor-rename for variables and usable object-oriented programming support.
Refactor-rename is available via the Eclipse-based IDE; however, math typesetting is not available there (it makes a difference for large equations/expressions).
There are a lot of community-developed approaches to OOP. I tried a number of them and stuck with a particular approach, but it also leaves a lot to be desired. OOP is useful in that it ties together data and the functions which act on that data. Inheritance/composition are useful if you compute properties of similar objects.
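For the curious, one common community flavor bundles state and behavior in an Association of delayed rules (a minimal sketch, not any particular package's API):

    makeCounter[] := Module[{n = 0},
      <|"increment" :> ++n, "value" :> n|>]

    c = makeCounter[];
    c["increment"];   (* n becomes 1 *)
    c["value"]        (* -> 1 *)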
Mathematica is mostly used for relatively small, notebook-style applications. Both features, while probably useful for some people, don't seem to be necessary for the more popular use cases.
They put the biggest improvement up front, lol: things now scale properly if you have a 4K monitor. People (myself included) have been squinting at very tiny fonts for years, and now they don't have to anymore!
> Anyone here uses Mathematica besides Physics or pure mathematics?
My adviser (a physicist) uses Mathematica for all non-physics computations as well. It's a clunky language for general purpose computation and doesn't play well with external programs and libraries. The user experience on Linux can also be quite haphazard with frequent crashes and lack of good HiDPI support (I haven't used version 12.1). It's also hard to run headless programs written in Wolfram language. But if you're okay doing everything inside the Mathematica GUI and don't care about the fact that it's a walled ecosystem, don't have a preferred text editor, and don't particularly care for Unix's one-thing-well philosophy, Mathematica might work for you.
Depends how wide your definition of 'physics' is. We have a couple of environmental scientists at work that use it to model water pollution for example. The big win is that they can basically just type in their differential equations exactly like they would write them on paper and have the answers simply pop out.
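That on-paper feel is easy to show with a toy example (my own, not theirs):

    (* first-order decay with an initial condition *)
    DSolve[{y'[t] == -k y[t], y[0] == y0}, y[t], t]
    (* -> {{y[t] -> y0 E^(-k t)}} *)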
The language began as something of a gem and each release seems to be muddying it up. Another commenter already mentioned the string argument to make a data structure. It also looks like built-in methods on these objects are also strings.
The metadata and annotation facilities seem unclear. Sometimes they change the appearance and behavior of the object, sometimes they don’t.
Some functions now are curried by default. The choice probably makes symbolic programming and pattern matching trickier since it breeds a variety of representations of the same concept. (Pattern matching and rewriting strongly favor canonicalization.)
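The representational split being described looks like this:

    Map[f, {1, 2, 3}]          (* explicit form -> {f[1], f[2], f[3]} *)
    Map[f][{1, 2, 3}]          (* operator form -> same result, different expression *)
    Select[EvenQ][Range[10]]   (* -> {2, 4, 6, 8, 10} *)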
All in all, it's beginning to look like the core Wolfram Language is trembling under its own complexity. I can't imagine coping with buggy code at all in this framework.
Is there any reason not to call Mathematica AI at this point? I swear some of the things it does look more like black magic than what we call AI nowadays...
Many people on this and similar threads rightly question whether results received with the help of a closed source software are reproducible. I think it's a fair question, but these discussions often miss a couple of important points.
First, there is a question of theoretical vs. practical reproducibility: true, in _theory_, an open-source codebase with many millions of lines of code of highly complex transformation can be checked for accuracy by anyone. In _practice_, only very few insiders will have both technical understanding and time to verify the correctness of this or that algorithm, and everyone else would need to rely on their expertise. That situation is no different from using a proprietary system if we trust their authors.
Second, while it might be shocking to some, most science, outside of maybe math, is not _practically_ reproducible. Most articles are behind paywalls and don't come with enough data to replicate their results (even economists, more often than not, won't include their raw data, their algorithms, or both). Also, no one who is trying to build a scientific career would try to replicate someone else's results, especially if it requires costly equipment, reagents, etc.; the rewards in the scientific community are for novel results. The replication crisis[1] describes this pretty well.
Most of the software that runs the LHC at CERN is probably not inspectable by a regular Joe the physicist, yet the physics community somehow trusts its results, etc.
In summary, while open-source software of Mathematica's quality would be awesome in theory, I highly doubt it will be practical for a long time.
Reproducible has a much simpler meaning than the one you describe: it's feasible for any scientist to recreate the result perpetually. While inspection of the methods might help one understand the result (and I think that itself holds a great deal of value, especially to mathematicians), just the simple ability to compute the answers yourself, without binding yourself to a contract or paying thousands of dollars in license fees, is a valuable aspect of science. Especially so when the calculations are at the level of pure algebraic manipulation.
Your (and Wolfram’s!) argument about “not needing to see the insides because really only 10 people in the world understand it anyway” should really be an argument for opening it up. If 10 people could understand it, and we must intersect that group with folks who can access the source, we are left in quite a dismal state of affairs.
While I don't disagree with your definition of reproducibility, I want to point out that very few papers would satisfy that criterion, regardless of their use of closed- vs open-source software.
Here is a recent example: the Imperial College COVID-19 response team published an article[1] where they modeled the effects of different non-pharmaceutical interventions, such as suppression and mitigation, on the number of infected, deaths, etc. This is a very interesting result, but it's impossible to replicate in practice without contacting the authors, as the published methodology is not enough to reproduce it.
Meanwhile, someone else[2] posted their own model that is very well documented and fully reproducible by anyone with a $230/year personal license.
While theoretically [1] is high science and [2] is not, in my own opinion [2] is better than [1]. I would love more science to be done and discussed that way. Ideally, using open-source software, but in practice, using Wolfram Language, in that case, is already good enough in my opinion.
PS. I'm not affiliated with Wolfram Research in any way.
Making an exact copy of an experiment is the lowest level of reproducibility.
In science, reproducibility means reproducing the results with different components (people, tools, methodology), showing robust independence from potential confounders.
Regarding your second point - I don't have a horse in that race, other than tremendously enjoying my personal license that I pay for out of my pocket; but I found their arguments[1] quite reasonable. I think there was an HN discussion about this some time ago.
I find the arguments confused and lacking. Many of those arguments conflate “open source” with “community driven”. Rarely did they make a compelling case as to how availability of the source code would hurt the customers.
And many, many amazing open source teams have demonstrated that even including community driven design in some cases does not hamper the quality of the product. Look at Apple and Swift, or Rust, as examples.
I read their arguments as “we do not know how to engage with a community effort because it’s not in our company DNA” under the facade of supposedly legitimate reasons.
Mozilla funds a few things. Some stuff is funded by other companies. Mozilla pays the most folks to work full time, but the amount is tiny compared to the overall set of teams, let alone contributors.
Thanks Steve for the answer! HN is an amazing place where a random comment asking a question about Rust can get an answer from a core developer. I hope we will always keep it that way.
They have funding; that's great! Exactly what I'm saying is that these projects manage to be community run (to some extent) while still remaining successful.
If Wolfram’s reason to not publish the source code is that they believe they would no longer have a business model, they should list that as the One Reason, not 12 gaslighting reasons.
Well, they clearly think their model works for them. If you think otherwise, it's on you to convince them that they should change their model, or start your own effort if you think it will benefit humanity. I don't think they necessarily need to "defend" their business choices.
You're right, they don't need to defend it. But they shouldn't put out reasons that don't make sense. They ought to just put nothing out at all and operate as normal.
Probably too different to merge the submissions.