Hacker News
Ask HN: Which general programming language is good for learning/exploring math?
43 points by mathpadawan on Oct 25, 2020 | 63 comments
Which general purpose programming language is good for learning and exploring math?

Here is what I am looking for from the programming language:

* Good standard library that helps in exploring math concepts. Python fits this bill. Python's standard library has functions like math.comb(), math.gcd(), math.factorial(), etc. They make it easy to write down many closed-form expressions without reinventing the wheel.

* Speed. If I am exploring a new concept and want to test a conjecture for large numbers, the iterate-and-test loops involved run about 30 times slower in Python than equivalent loops in C++. Here C/C++ fits the bill and Python does not.

* Expressibility. While I am exploring mathematics, it should not feel like I am fighting the syntax of the language. Python and Java fit the bill due to their simplicity. C++ is manageable. Rust feels like too much work for quick and dirty hacks to test conjectures.

* Longevity. The language should be stable and not prone to too many breaking changes. Some code I write now should run without modifications ten years later. C, C++, Go fit the bill. Python does not.

* Open source implementations. The language must have popular free and open source implementations. I don't want to be paying large sums of money for something like MATLAB or Mathematica. Most languages popular here on Hacker News like Python, Go, Rust, etc. fit the bill.
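
A quick illustration of the standard-library point above (my own example, not from the post):

```python
import math

# Verify the hockey-stick identity: sum of C(i, r) for i = r..n equals C(n+1, r+1).
n, r = 10, 3
lhs = sum(math.comb(i, r) for i in range(r, n + 1))
rhs = math.comb(n + 1, r + 1)
assert lhs == rhs == 330

# gcd and factorial compose just as directly.
assert math.gcd(math.factorial(6), 2**10) == 16
```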

Now if there is no language that meets all the requirements above, that is fine. Something that comes close to supporting most of the features above will be okay.
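
For concreteness, the iterate-and-test loops mentioned under the speed requirement look something like this (a minimal sketch using the Collatz conjecture as a stand-in example; this is exactly the kind of tight integer loop where CPython pays its interpreter overhead):

```python
def collatz_steps(n: int) -> int:
    """Number of steps for n to reach 1 under the Collatz map."""
    steps = 0
    while n != 1:
        n = 3 * n + 1 if n % 2 else n // 2
        steps += 1
    return steps

# Test the conjecture (i.e. that every trajectory terminates) for the first 10**5 integers.
assert all(collatz_steps(n) >= 0 for n in range(1, 10**5))
```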




Julia [1]. It has Python-like syntax, macros, and is general purpose while being designed for maths/science. It runs on LLVM so performance is great [2]. It supports easily writable matrices and Unicode maths symbols.

You may be interested in a free online MIT course about 'computational thinking' that's being run right now using Julia. [3]

I'm not affiliated with Julia or anything, just think it's a really good language.

[1] https://julialang.org/

[2] https://benchmarksgame-team.pages.debian.net/benchmarksgame/... (graph at bottom of page)

[3] https://computationalthinking.mit.edu/Fall20/


Honestly? None.

You can definitely explore mathematics experimentally; we all do, computer or not. But to really learn the mathematics, just read the books and do the exercises.

Being able to work things out symbolically on paper is probably one of the biggest skills you can have beyond your usual repertoire of programming skills - is there any worse feeling than guessing your way through a problem you don't understand?


> is there any worse feeling than guessing your way through a problem you don't understand?

Isn't this called science?


A very small fraction of experimental science is about just doing something and seeing what happens without a strong prior. You are expected to do your homework first. As a scientist, your role is typically to read and understand all the relevant prior work in the area, then use this theory to derive a hypothesis for the outcome of your experiment. If you are lucky, the way the experiment unfolds falls outside your understanding; provided your understanding of the prior theory was correct, you have made a new scientific discovery which will be used to create new theory.

The element of uncertainty in this is typically very small, and the process is very far from guessing your way through hoping to find something new (whatever you find using such a method will most likely be either already known or incorrect).


Yes, I agree with you; my comment looks dull next to your explanation. Of course, I also don't advocate doing something without any hypothesis. Still, I think hands-on practice with a subject can help you learn things better. For example, I find it very useful for linear algebra (maybe this is not applicable to other subjects): I can analyze my flawed intuitions.

When I first heard about the Monty Hall problem, I didn't understand it and tried it myself. It was way easier for me to understand the flawed intuition by analyzing each line than from, say, Judea Pearl's explanation (which is also good).

What I wanted to emphasize is that guessing your way through a problem you don't understand is not bad. But yeah, of course, you should have some knowledge.
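
For reference, the Monty Hall experiment described above fits in a few lines; a minimal sketch assuming the standard setup (three doors, host always opens a goat door):

```python
import random

def monty_hall(switch: bool, trials: int = 100_000) -> float:
    """Estimate the win probability of the stay/switch strategies by simulation."""
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)
        pick = random.randrange(3)
        # Host opens a door that is neither the contestant's pick nor the car.
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += pick == car
    return wins / trials

print(monty_hall(switch=True))   # close to 2/3
print(monty_hall(switch=False))  # close to 1/3
```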


I wish I could say I have been doing science already; not yet (hopefully someday). I was referring to (say) programming problems, like one recently where I could see that it boiled down to asking whether there existed a linear combination of an input set of vectors equal to a single second vector. I was able to spot the connection, but I'm not convinced I would have been able to get it without the mathematics under my belt to spot the pattern.

Regardless, I plan to be learning until the day I drop, science or not.


Is mathematics science though?


Mathematics is the exploration of the a priori. Historically some axioms have been deemed more real than others; I think Gauss rejected non-Euclidean geometry, for instance. But this point of view has changed. With abstract algebra, and I believe also computer science, modern mathematics is about exploring connections between structures as they emerge from stipulated axioms and rules of inference. It is science in the sense that ideas and hypotheses can be tested experimentally, but a proof requires more than non-falsification. Then there is the complication of potentially irreducible computation problems, where essentially a kind of mining of the computational space is the only way forward. This is the new kind of science Stephen Wolfram speaks of.


I was inclined to write something similar myself, as I agree in general. Mathematics is much more about the thinking than the doing; programming is more about the doing.

However, I do think that programming can be an excellent complement and aid when understanding certain concepts. Sometimes, writing a numerical simulation of a hard-to-grasp concept can make things fall into place. In particular, in probability theory it can be quite helpful to generate a lot of random samples to validate and clarify some counterintuitive concepts.
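
The birthday paradox is a handy example of that (my illustration): a few lines of sampling confirm the counterintuitive fact that 23 people already give roughly even odds of a shared birthday.

```python
import random

def birthday_collision_prob(people: int, trials: int = 20_000) -> float:
    """Estimate P(at least two people share a birthday) by random sampling."""
    hits = sum(
        len({random.randrange(365) for _ in range(people)}) < people
        for _ in range(trials)
    )
    return hits / trials

print(birthday_collision_prob(23))  # near the exact value of ~0.507
```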


what's the highest level math that one can theoretically get to through "just [reading] the books and [doing] the exercises"?


Fields Medal? Is that high enough?

What do you think mathematicians do?


I meant in terms of mathematical field/topic. Like the progression I did to get an engineering BS was roughly

* basic math > algebra/trig > calculus, linear algebra, differential equations

As a non-mathematician, my impression is that mathematicians do a lot of proof-based work, since (as far as I can tell) the goal is to make new findings, not just to learn concepts and formulas they can apply to their work like an engineer. From my limited experience with proof-based work, it seems like something where you would need guidance and discussion. Whereas when I learned things like derivatives and integrals, it was very procedural: you just needed to learn a process, which is then built on by a more advanced process, and so on.

Can one realistically self-learn up to low-dimensional topology or algebraic geometry just by reading books and doing the exercises?


The likelihood that you will succeed in self-learning any field of mathematics without outside help goes down as the material gets more advanced, but there's nothing fundamentally impossible about the endeavor. You can definitely get to graduate level that way if you're willing to put in the work and study the right things in the right order.

Luckily, there are plenty of resources available to someone self-studying online on various math forums.


what would you class as "graduate level" in this context? Would that be stuff like PDEs? As I said I'm not very familiar with what the 'progression' is past undergrad


I think it really depends what kind of mathematical exploration you want to do.

If you are interested in something similar to Mathematica or Matlab, then probably Python, because it is used as a language in SageMath: https://www.sagemath.org/

If you're interested in numerical computing, then there might be other choices such as Julia.

If you're interested in programming language theory, logic, or category theory, then Haskell or an even more esoteric language (a theorem prover like Coq or Lean) could be interesting for you.



I'm doing the same thing right now to get better at ML, and I chose Python.

I partially disagree with you on longevity since not much math-heavy code would have been hit by the breaking changes in Python 3.

In the ML domain I also disagree with your take on performance, since there are plenty of good-enough GPU-accelerated libraries for Python.

Python also has the advantage of knowledge reuse in gainful employment since it's the preferred language for ML.


Take a look at https://www.idris-lang.org/

While I think it doesn’t check most of your boxes, it’s one of a kind in that it lets you write mathematical proofs for your functions.

It pushes the limits of type driven development. As I understand, the big idea is that tests can only show a program is faulty, not that it’s correct. On the other hand mathematical proofs can actually prove your program is correct.

That said I’ve never used it myself but I know a close-knit community of extremely smart mathematicians/programmers using it. It’s certainly a language with huge potential. I plan to play with it / learn it some day.


I highly recommend you look into SymPy, which offers symbolic math computations on top of standard Python programming capabilities. It checks all the boxes except for speed, since it is Python after all...

What makes SymPy great for learning is that the API it provides closely matches the math verbs: expand, factor, simplify, etc., so it feels very similar to doing math on paper.
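
A small taste of that API (a sketch; assumes `sympy` is installed):

```python
import sympy as sp

x = sp.symbols('x')

expr = (x + 1)**3
expanded = sp.expand(expr)          # x**3 + 3*x**2 + 3*x + 1
assert sp.factor(expanded) == expr  # factor undoes expand

# Symbolic calculus reads like the math it represents.
assert sp.diff(sp.sin(x) * sp.exp(x), x) == sp.exp(x)*sp.sin(x) + sp.exp(x)*sp.cos(x)
assert sp.simplify(sp.sin(x)**2 + sp.cos(x)**2) == 1
```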

There is an online shell[1] you can use to try out some commands, and I've written a tutorial[2] that showcases some of the most useful commands.

[1] https://live.sympy.org/ [2] https://minireference.com/static/tutorials/sympy_tutorial.pd...


SymPy is pure Python. You can run it under the JITed PyPy3 for a speed boost.


I would say Python in combination with Jupyter notebooks. The calculus class I followed used that, and it is great because you can use LaTeX in the notebooks and make plots in Python. Look at this repo for examples: https://github.com/LucaAmbrogioni/CalculusTeachingMaterial


Based on my experience either Julia or Wolfram Mathematica (the command line version with fewer libraries should be free, as in beer)

Or if you want to go all in you can try a proof assistant like Lean which is being developed with a focus toward application in mathematics rather than CS/logic.


GNU Octave is a free MATLAB clone:

https://www.gnu.org/software/octave/index

and take a look at Julia as well.


It looks like Julia is gaining momentum?

I don't have experience with it myself, but it looks like it should have the properties you're mentioning.

https://julialang.org/

https://en.wikipedia.org/wiki/Julia_(programming_language)


Julia, the only bad thing is that last I checked a few months ago, time to first plot in the stable release is terrible, though I hear the master branch has made some major improvements.


This is a hard question to answer because mathematics is a broad subject. The fact that you think libraries will exist or be important but that you don’t mention eg bignum support suggests to me that you want to do stuff with floats and matrices and differential equations. Julia would be an excellent choice for this. Language stability is something they care about because they want the language to be suitable for reproducible calculations in scientific research.

If you’re more interested in pure mathematics—things like (non-analytic) number theory, algebra, combinatorics—I would recommend Haskell. I find the type system helps a lot for algebra and the freedom to express computations in more natural and functional ways helps with things like number theory.

Both languages are fast enough. The thing you should mostly care about is the speed of writing correct programs rather than execution on large datasets anyway. Julia is in many ways like python except you don’t need numpy because the built in arrays and for loops (and auto broadcasting) are fast enough.

If you want a proof assistant then I don't know enough about the available options to make a good suggestion.


https://coq.inria.fr/ would be a good choice for proof assistance, I believe


"math" is a massive area, arguably bigger than all of computing (just because it has a longer history).

If you wanted (for example) to learn about group theory, the best tool is GAP (https://www.gap-system.org). It satisfies your requirements: an open source system, with a language which hasn't broken backwards compatibility in over 20 years.


If what you want is to quickly iterate on short experimental math programs, you want an APL. Otherwise, if you need syntactic macros there's Racket.

I suggest J (jsoftware.com)



Define ‘math’. I guess you want to work with the ‘normal’ integers and reals. If so, you’ll need

- fixed-size integers (signed and unsigned)

- big integers

- rationals with big integer numerators and denominators

You also want all of them to be fast, especially if you’re going to explore things.

If you’re going to do exact polynomials, you’ll want the ability to represent radicals (square root of 2, third root of (square root of 2 plus 1), etc). That gets you to a computer algebra system (https://en.wikipedia.org/wiki/Computer_algebra_system) that will also help in doing exact analysis.

If you want your system to be extensible, you’ll want operator overloading (if you implement addition mod n, you will want to write it using ‘+’). You’ll also need extensibility if you’re going to hunt for counterexamples, as you’ll need it for performance.
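
The addition-mod-n point is easy to sketch in any language with operator overloading; a minimal Python illustration (my own, with a hypothetical `ModN` class):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ModN:
    """Residues mod n, with '+' and '*' provided via __add__/__mul__ overloading."""
    value: int
    n: int

    def __add__(self, other: "ModN") -> "ModN":
        assert self.n == other.n
        return ModN((self.value + other.value) % self.n, self.n)

    def __mul__(self, other: "ModN") -> "ModN":
        assert self.n == other.n
        return ModN((self.value * other.value) % self.n, self.n)

a, b = ModN(5, 7), ModN(4, 7)
print(a + b)  # ModN(value=2, n=7)
print(a * b)  # ModN(value=6, n=7)
```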

On the other hand, if you want to do proofs, you may want an automated theorem prover (https://en.wikipedia.org/wiki/Automated_theorem_proving) or a proof assistant (https://en.wikipedia.org/wiki/Proof_assistant)

I don’t think you can get all of the above in one system. That’s why you have to decide what kind of math you want to do before picking the system.


This depends a little on what kind of learning and explorations you want to do. I typically reach for python with some combination of jax, scipy/numpy, sympy, jupyter, matplotlib, seaborn, networkx, cython, and I'm sure a few others that I'm missing. It's nice for a lot of problems, but in no particular order here are a few things that are lacking and might be better elsewhere:

- Complicated plots take some fiddling to get right, especially when you start adding dimensions or want to do anything nonstandard. I still don't have a volume rendering solution I'm totally happy with. This is something that "just works" for a much broader variety of plots in some environments like Mathematica.

- This doesn't usually matter for me since exploratory code rarely does more than a few billion things, so even in Python my experiments are instantaneous, but to get good performance I do have to call into an external library (numpy, networkx, etc) or drop down and cdef everything so that cython can make it fast. That isn't especially arduous (and I don't do a ton of Monte Carlo or anything where that would matter), but I've heard that's supposed to be a point Julia improves on.

- Theorem proving is only kind of okay. You have simple things like hooks into SMT solvers and whatnot, or some weak first order theorem provers, but last I checked that part of the ecosystem is underdeveloped. I wind up writing something bespoke and tailored to the problem at hand every time something like this crops up. That's not really what you want from a system helping guide your exploration of a problem.

Whichever solution you go with, I strongly recommend picking something with notebooks available. Jupyter and Mathematica really excel in this regard, and it makes interactive explorations painless in a way that even REPLs lack.


Swift, especially if you have a Mac. The numeric features are well thought out and generally work the way you'd expect. The same with syntax. Swift Playgrounds are a REPL with ability to visualize what your program is doing to build on intuition.

If you're keen on using higher level language features such as generics, you can make use of them easily. Otherwise they'll stay out of your way. In terms of progressive disclosure of complexity, it is the best language I've used. And I've used many.

There are also many[1][2][3][4][5] open source projects to peruse. Even if you're not on Mac, there is Linux and Windows support, though I don't have direct experience with them.

The language itself is now fairly stable with a fixed ABI, relatively fast, and is open source so you can modify it if that's something you're interested in down the road. Despite the commentary, it's not an Apple-only language, though when you're outside the Apple ecosystem things are more bottom-up and hence the experience can be inconsistent at times.

I credit Swift with revitalizing my own interest in mathematics. It made re-learning the stuff I'd forgotten from college fun and interactive.

[1] https://github.com/apple/swift-numerics

[2] https://github.com/phlegmaticprogrammer/LANumerics

[3] https://github.com/apple/swift-algorithms

[4] https://github.com/stsievert/swix

[5] https://github.com/AlexanderTar/LASwift


The natural language for numerical and mathematical sciences has been, is, and will remain Fortran, unless a language is invented with the same nice rich vectorized and parallelized math-oriented syntax of Fortran and its speed, in which case, it will be most likely called Fortran again.


Pen and paper, honestly. I didn't even need a calculator for most of my maths degree, never mind a programming language.

Learning and exploring maths will involve theorems, deriving stuff, solving textbook problems, that kind of thing. Not much of this will involve actual numbers or explicitly calculating anything. In any case, the calculation is very often the "easy" bit, without much to be gained knowledge-wise unless you know why you're doing what you're doing.

And then Julia is probably your best bet. I'll also recommend R, just because nobody else has. It's got all the advantages of Julia with the added benefits of being a much slower and quirkier language. It does do a good job of getting out of the way when you just want to explore some math concept though.


Julia is a great one to learn for this and will take you a long way, both educationally and professionally.


Thoughts on Haskell?


I’m learning Haskell right now and had the same question until I found Hindley’s Lambda-Calculus and Combinators: An Introduction. Many here are recommending Julia, but I’ve used Julia enough now to conclude that it isn’t as mature as Haskell and can be frustrating at times. Haskell, when its foundations and history are well understood, not only challenges how you think about programming but also how you think about mathematics.


Yep; see The Haskell Road to Logic, Maths and Programming.


I'd recommend Scheme. It hits all of your points, and it also makes manipulating symbolic expressions very easy, so you can do things like differentiating functions with very little code.
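
The same idea carries over to any language where symbolic data is cheap to represent. A rough Python transliteration of the Scheme approach, with expressions as nested tuples and no simplification (my sketch):

```python
def diff(expr, var):
    """Symbolic derivative of a tiny expression language.

    Expressions are numbers, variable names, or tuples ('+', a, b) / ('*', a, b).
    """
    if isinstance(expr, (int, float)):
        return 0
    if isinstance(expr, str):
        return 1 if expr == var else 0
    op, a, b = expr
    if op == '+':
        return ('+', diff(a, var), diff(b, var))
    if op == '*':  # product rule: (ab)' = a'b + ab'
        return ('+', ('*', diff(a, var), b), ('*', a, diff(b, var)))
    raise ValueError(f"unknown operator {op!r}")

# d/dx (x*x + 3), unsimplified:
print(diff(('+', ('*', 'x', 'x'), 3), 'x'))
# → ('+', ('+', ('*', 1, 'x'), ('*', 'x', 1)), 0)
```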


No matter what others might think of universities, if there is a place to learn math, for most people it will be there.

If you actually want to learn math, then I would recommend taking courses and doing the homework. You will notice how rarely you need the computer for it. I cannot remember using a computer even once in the first few semesters of studying math (I am a physicist, but we shared courses with actual math students; we just had fewer math courses than they did).


Python, hands down. Your concerns about speed are not actually relevant unless you are in the high performance computing world, in which case you can choose C or Fortran. For most any practical application, performance is not that important and library support dominates the calculus. Scikit-learn, scipy, et al. are the nicest set of math/stat/ML libraries in existence today; you could spend years learning about the things they do.


Depending on what you want to do, Python with numpy could be an option.

In terms of longevity, I don't think Python is going to repeat the 2/3 shenanigans any time soon, so with Python 3 you should be OK. In terms of speed, numpy is used in many HPC applications: if applied correctly, and depending on the use case, writing C/C++ code that performs on par with it will be quite hard.



A bit surprised no one mentioned Clojure yet


Fortran is good for math and physics.


Just go with C++; see books like Numerical Recipes and Matters Computational.


Sage, Julia, or R


http://linear.ups.edu

A very detailed open source interactive linear algebra textbook written in Sage, by Rob Beezer.


There are also lots of types of math that would benefit from learning Prolog.


pari-gp[1] is fast and powerful once you get the hang of the rather strange syntax (look for examples).

[1] https://pari.math.u-bordeaux.fr/


Right now I would start with TypeScript with Deno. You learn to code in the syntax of the web (JavaScript) without much of the headache of the building and tooling of TS/JS/Node.


Python. Many perf bottlenecks go away if you use the right libraries that vectorize your operations (e.g. pandas DataFrames, or TensorFlow for more involved crunching).
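
For illustration, the pattern looks like this with numpy as a stand-in (my example; same math, one interpreted loop versus one C-level pass):

```python
import math
import numpy as np

xs = np.arange(1, 10**6, dtype=np.float64)

loop_total = sum(math.sqrt(x) for x in xs)  # interpreted, element by element
vec_total = float(np.sqrt(xs).sum())        # vectorized, runs in C

# Same result up to floating-point rounding; the vectorized form is far faster.
assert abs(loop_total - vec_total) / vec_total < 1e-6
```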


Python of course.

Why would you use anything else?

That said, math is by far better explored with a pen and a book. Computers are not really anything but an annoyance when learning math properly.


comb, gcd, factorial are very trivial. They are like 5 lines each or so. I think it should hardly matter whether or not the standard library has them.
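
They are indeed short. Minimal sketches (the stdlib versions are C-accelerated and do input validation, which these skip):

```python
def gcd(a, b):
    while b:
        a, b = b, a % b
    return a

def factorial(n):
    result = 1
    for k in range(2, n + 1):
        result *= k
    return result

def comb(n, r):
    # C(n, r) built up one factor at a time; each division is exact.
    result = 1
    for k in range(1, r + 1):
        result = result * (n - r + k) // k
    return result

assert gcd(48, 36) == 12
assert factorial(5) == 120
assert comb(10, 3) == 120
```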


Not Python. There's a reason all the Python math libraries are actually written in C or Fortran.

Julia sounds intriguing though


Fortran


Interesting suggestion. I have not programmed a lot in Fortran, but with my limited exposure to it I like the first-class support for multidimensional arrays. It has a simple and clean syntax too. It definitely wins on speed and longevity.

Can someone with more experience of Fortran explain if this language is still relevant today and if it makes sense to get started with it in this day and age? If so, which Fortran standard to use? Fortran 90? Which compiler implementation to use?


I have experience with modern Fortran. The features that make Fortran relevant today are the array operations, simpler memory management than C/C++ (the allocate statement), and the ability to change floating point and integer precision easily. If you are doing math on very large arrays, Fortran is one of the best ways to go. If you are just starting to learn it, try to follow at least the Fortran 2003 standard and up, because it introduces many more modern conveniences. Many people still use Fortran 77 but honestly I see no reason to look at it at all. Fortran 90 really was only the start of the modernization of Fortran. Most compilers (GNU/gfortran and Intel/ifort) already support the Fortran 2003 standard. For a free software solution, stick to GNU (unless you want to pay for the Intel compiler).


Modern Fortran is bliss: very high level, comparable to MATLAB and Python, yet lightning fast; only a careful, professional C programmer could get to its speed. It is also the only high-level programming language that provides a very simple native built-in syntax for one of the most difficult types of distributed parallel computing. If you begin to learn Fortran, start from Fortran 2008 and beyond. Most major compilers support all of Fortran 2008 and most of Fortran 2018, including Intel Fortran, NAG, GNU, Fujitsu, Cray, PGI, ...


> only a careful, professional C programmer could get to its speed.

And a careful, professional C programmer would not get to Fortran's wealth of numerically stable libraries for a very long time.


Java


octave or R



