Wolfram Language (wolfram.com)
217 points by dserban on Feb 25, 2014 | 128 comments



It's an amazing language - basically lisp with the biggest standard library you've ever imagined, and nice (but slow) pattern matching. I used to use it all the time when I was doing one-off projects at school and for clients. But I don't see it taking off unless it goes FOSS - while I have a license and I'd love to keep using it I can't justify it to employers or clients who want to actually be able to run my code after I leave. The alternatives (Python, R, maybe Julia) are good enough now to do most things you'd want to use Mathematica for, and they don't come with per core usage restrictions. To me Mathematica is well worth its price tag, but I only came to that conclusion because I was able to experiment with it for free during college.
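
For a taste of the pattern matching, a small sketch of my own (nothing from any particular project):

    (* rewrite every squared subexpression with a rule *)
    a^2 + b^2 + c /. x_^2 -> f[x]   (* => c + f[a] + f[b] *)

    (* lisp-style definition by cases *)
    fib[0] = 0; fib[1] = 1;
    fib[n_Integer] := fib[n - 1] + fib[n - 2]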


I found that it mostly works great, but sometimes it gives really wrong results - without any warnings. My professor complained about various errors too (mostly in dynamical systems, some control theory, but the problems seemed to be of a general sort), so it wasn't just a 40 cm error. To me this suggests that the language has plenty of leaky abstractions. My feeling is that in languages like Python or Julia, if you do something wrong you are more likely to hear about - and understand - it.


basically lisp with the biggest standard library you've ever imagined,

That's what I was thinking too. And I also think it could mostly be re-implemented as open source, with, say, Python.


http://www.youtube.com/watch?v=_P9HqHVPeik "Stephen™ Wolfram™'s Introduction to the Wolfram™ Language"

That Wolfram™ Language and the Wolfram™ Cloud is going to change the world forever, the language is so great because it's Wolfram™ Symbolic and since it all just runs in the Wolfram™ Cloud it's really fast. You can put all your financial data in the Wolfram™ and do things. You know they worked 30 years on this, and it must be good because it runs in the Wolfram™ Cloud. Cloud Cloud Cloud Bla Bla Bla Bla Buy Buy Buy Wolfram™ Wolfram™ Wolfram™.


That this is a top rated comment is fucking pathetic.

His name is on the door. Unlike the late beloved Steve Jobs, he does not afford himself the opportunity to blame others for product problems. It's a higher standard - one without "Hold it another way" copouts and "buy our products to show your good taste" values. He built his startup into a company that makes amazing products that can be used to really change the world, because they help solve really hard problems in ways that scale. He's putting knowledge into the world, not peddling caveat-emptor planned obsolescence and digital whoopee cushions.

Now he's taken his life's work and is shipping it on the Raspberry Pi instead of taking a 30% cut of the creative output of developers and musicians.

That's the difference between what is rightfully called "great" and what is called so only insanely.


No need to shit on Steve Jobs in order to praise Stephen Wolfram. Both are good businessmen who built great products and successful companies. And they had great respect for one another.

And it's not like Wolfram is giving stuff away for free. Have you seen how much a Mathematica license costs? I loved using Mathematica at my university, but now I just can't afford a license anymore.


>And it's not like Wolfram is giving stuff away for free.

I think he is...

http://www.wolfram.com/raspberry-pi/


So Wolfram for Raspberry Pi is available for free with NOOBS? eye rolls And everywhere else Wolfram costs money?


Steve Jobs built a business around charging students full retail and encouraging educational institutions to promote Apple computers to them. Freshmen are encouraged to take out loans to buy Apple products with a support life cycle that ends before they graduate. Any admiration for Mr. Jobs is based on something other than human community.


If you can't acknowledge Steve Jobs's massive contributions to technological progress, you're extremely confused about the history.


Wait, what? Jobs didn't create technology, he sold it to people (and did that very well). Before you reply, consider that I was there (http://en.wikipedia.org/wiki/Apple_Writer) and I knew Jobs personally. I watched how he operated. He was very good at what he did, but he was a promoter, not a creator, of technology -- technology created by others.


>>Jobs didn't create technology, he sold it to people

I don't understand what you are trying to say. Selling technology is a great way to facilitate its progress. Or do you believe that technology just sells itself?

I know that this is a forum primarily for developers (err, sorry, where are my manners... I meant creators), but the fact is that most developers would not have jobs if salespeople did not sell the products they created. I know that's a difficult thing to grasp for some, but it's true.


>> Jobs didn't create technology, he sold it to people

> I don't understand what you are trying to say.

Okay, let me spell it out for you -- here's the remark I was replying to:

>>> If you can't acknowledge Steve Jobs's massive contributions to technological progress

Really, "massive"? Invention of the transistor massive? Design of the first computer massive? Design of basic computer algorithms and/or languages massive? Or, to be more specific, design anything massive? Well, in point of fact, no -- none of the above.

> ... but the fact is that most developers would not have jobs if salespeople did not sell the products they created.

Yes, and people who actually develop technology wouldn't be comfortable without people to sweep the floors and empty the wastebaskets. The question you need to ask yourself is which role is irreplaceable -- the creator of technology or the person who sells it?


>>Really, "massive"? Invention of the transistor massive? Design of the first computer massive? Design of basic computer algorithms and/or languages massive? Or, to be more specific, design anything massive? Well, in point of fact, no -- none of the above.

Massive in the sense of "co-founded the world's largest and most profitable technology company and later turned it around from the brink of death."

Ask yourself how many technologies we have today because of Apple, and how many innovations those technologies and products drive. Heck, you don't even have to compile a list, because I can prove my case with a single example: if it was not for Steve Jobs, Wozniak would have remained at HP for the rest of his career.


Creating a collaborative organization to design, engineer, and massively produce this stuff is absolutely nothing to scoff at. Process is essential to modern human enterprise--ask any of the hundreds of endlessly delayed Kickstarter projects that routinely fail to turn a prototype into something that can be shipped. If this were simple, there would be many Apples run by many "promoters." Instead there is only one.


> Creating a collaborative organization to design, engineer, and massively produce this stuff is absolutely nothing to scoff at.

First, no one is scoffing. Second, which role is essential and which is replaceable? Steve Wozniak created computers and Steve Jobs sold them. There are ten salesmen for each person able to create something worth selling.

> If this were simple, there would be many Apples run by many "promoters." Instead there is only one.

Yes -- only one Apple, and only one Intel, and only one AMD, and only one Blackberry, and only one Tesla, and only one IBM, and only one Android. Shall I go on listing all the things there is only one of?


What I mean by that is that there is no company in the computer industry that owns as much of the stack as Apple does. And shepherding such an integrated company through one of the biggest tech booms in history is nothing to relegate to mere salesmanship.


> And shepherding such an integrated company through one of the biggest tech booms in history is nothing to relegate to mere salesmanship.

In fact, that's a perfect place for a salesman. Steve Jobs proves it -- he didn't design the products that Apple sold, he sold them to the public.

Henry Ford didn't invent or design cars, he sold them. Thomas J. Watson didn't invent the computer, he sold computers. James McNerney Jr. (Boeing chairman) isn't an aerodynamic engineer, he sells airplanes to eager customers.

The reason for this reverence for salesmen is a peculiarity of American history and some deplorable defects in our educational system. It's because Americans think Willy Loman was the hero in "Death of a Salesman".


With all respect, you're way, way oversimplifying things in an intellectually dishonest way. I work with folks who worked with Jobs and he was a major, active, creative contributor. His role wasn't, as you say, to be handed products defined and designed by other people and figure out how to sell them. Yes, he was a brilliant marketer. But I don't see why people work backwards from that and say he was only a brilliant marketer. What a strange kind of illogic!


> I work with folks who worked with Jobs and he was a major, active, creative contributor.

He contributed style and color choices to the designs of others. To quote you, "You're way, way oversimplifying things in an intellectually dishonest way."

> His role wasn't, as you say, to be handed products defined and designed by other people and figure out how to sell them.

No, he decided what color and shape the final product would have. But he could not create the technology, he could only change its external appearance.

> But I don't see why people work backwards from that and say he was only a brilliant marketer.

That's easy to answer -- name the product that Jobs designed, that someone else packaged for him.

Don't take my word for this -- read the Jobs biography (http://en.wikipedia.org/wiki/Steve_Jobs_(book)), which ultimately agrees with my view: it describes Jobs as a clinical narcissist, perpetually insecure about what he didn't know, and always on the creative sidelines trying to exaggerate his role in the final product.


By that definition nobody creates technology since all they're doing is iterating on work done by other people.

He may not have written code or laid out circuits, but the work he did is extremely important in the process of creating technology. He wasn't just a promoter, he was also a shaper, he would steer the engineering team towards the types of solutions he was looking for and push them to achieve beyond what they would ordinarily do.

Having a vision of some kind of technology and then driving a team towards that goal is, undeniably, "creating technology".


> By that definition nobody creates technology since all they're doing is iterating on work done by other people.

So learn about science. Einstein wasn't "iterating" when he created the special and general theories of relativity -- these theories were very far afield from the physics of the day, and entirely original.

Charles Darwin wasn't "iterating" when he began to think about the fact that Galapagos finches from different islands had different beak sizes and shapes, and what that might mean.

Galileo wasn't "iterating" when he saw those four little stars that seemed to follow Jupiter around in the sky, and what they might teach us about Jupiter, and about earth.

Michelson and Morley tested the ether theory in their eponymous experiment, and their test failed. At the time, no one understood why. Decades later, Einstein explained why and replaced the prior theory without any help.

> Having a vision of some kind of technology and then driving a team towards that goal is, undeniably, "creating technology".

You just tried to equate a bus driver with a scientist.

Steve Wozniak created technology, Steve Jobs sold it to people. Please learn the difference between creators and promoters -- Wozniak didn't need Jobs to create a computer, but Jobs needed Wozniak (and many other similar people over the years, many of whom I knew personally) in order to have something to promote.


Einstein was iterating, that's the thing. Without the massive foundational work done by others, their efforts large and small, he would never have been able to prove anything. Where would he have been without Newton? Without the people who created the math he employed? Without those who made important contributions to his theories, without which he might've floundered endlessly?

His insight was powerful, but like Steve Jobs, having an idea is one thing, proving it mathematically and experimentally takes considerably more work than one individual can possibly do in a lifetime.

While the achievement of individuals of that sort is significant, it's very easy to ignore the less visible people that were hugely significant in the formulation of these ideas.

Your own argument is working against you here. Was Steve Jobs "iterating" when he created NeXT? The iMac? When he defined what the iPod was? Or the iPhone?

I think you seriously underestimate how difficult it is to have a vision for a type of technology and work relentlessly towards that goal over a span of decades. That's not a bus driver following a pre-defined route; that's someone charting their own course, one that, in the case of Steve Jobs, broke tons of rules and challenged convention every step of the way.

Just as relativity is "obvious" now, to be taken for granted, so is the iPhone, yet both of those things completely transformed their respective worlds.


>Wait, what? Jobs didn't create technology, he sold it to people

Technology has many levels. A CPU is a technology, as is a touch-sensitive panel -- but a specific arrangement of CPU, touch-sensitive panel, etc., is also technology. The one who makes decisions about how it should be made also creates technology.

Apple also wrote their own software, and even designed their own chips and parts a lot of the time. And of course they also found lots of ingenious solutions to lots of engineering problems in creating smaller/more efficient/more battery power/etc devices.

So, what does "Jobs didn't create technology, he sold it to people" mean? Isn't iWork technology, for example? Or do you mean he wasn't a coder or engineer, he was the head of the company making it? Well, that's true, but duh!


> Technology has many levels. A CPU is a technology, as is a touch-sensitive panel -- but a specific arrangement of CPU, touch-sensitive panel, etc., is also technology.

We're not discussing the existence of technology, we're discussing the creation of technology. If this distinction were not important, anyone who acquired a computer could claim to be its creator. If this distinction were not important, someone who put a computer in a box would be equal to someone who invented binary arithmetic.

> So, what does "Jobs didn't create technology, he sold it to people" mean?

It means someone else had the education and creativity to invent something for Steve Jobs to sell, to call "fantastic" to crowds of eager consumers, to import into his reality distortion field. The same reality distortion field that makes you think selling equals creating.

> Or do you mean he wasn't a coder or engineer, he was the head of the company making it? Well, that's true, but duh!

So that's your argument? Now I understand why you have the views you do.


>We're not discussing the existence of technology, we're discussing the creation of technology.

Which is exactly what I discussed. Again, you seem to think that only lower component parts are technology (e.g the CPU itself). And you think that building a product that utilises disparate component parts (and needs tons of other engineering decisions besides) is not "creation of technology". I believe this is wrong.

>If this distinction were not important, anyone who acquired a computer could claim to be its creator. If this distinction were not important, someone who put a computer in a box would be equal to someone who invented binary arithmetic.

Apple is not a company that "acquires computers". It designs and builds them, solving tons of engineering problems in the process. That it doesn't (usually) build the low-level component parts is beside the point. Technology is modular.

I also pointed out, which you conveniently sidestepped, that Apple also creates its very own products, from designing logic boards and dedicated processing units to software such as Logic, FCPX, iWork, etc. They don't just buy some Intel CPUs, some Samsung disks and some X-brand memory, slap it all together and call it a day.

>It means someone else had the education and creativity to invent something for Steve Jobs to sell, to call "fantastic" to crowds of eager consumers, to import into his reality distortion field. The same reality distortion field that makes you think selling equals creating.

You continue to use the word selling, as if the products appeared by magic or trucks and Jobs just had to sell them. You forgot the part where Apple, the company he ran, isn't Best Buy (a seller) but the actual creator of the products it sells.

Jobs didn't just sell stuff. That would be Jeff Bezos, or the Walmart guy. He was the CEO, and quite the micromanaging product manager, in the company (actually, companies) that also designed and built those products.

He was the CEO of a tech company. Not the CEO of a reseller chain. And, yes, he wasn't an engineer himself. Technology, as available to the people, is not created by engineers alone, especially in the form of final products.

And engineers don't just work in isolation and deliver their stuff ready for manufacture to some sales guy. They take directions, advice, suggested changes to how stuff works, etc, from a product manager guy -- and Jobs was very much hands on with such things. For the mere selling of stuff, Apple had Cook and top level retail managers on board.

After all, you surely know that Steve Jobs received the "Vollum Award for Distinguished Accomplishment in Science and Technology, Oregon's most prestigious academic honor"? How come they didn't just give him the "sales" award? Perhaps they see the bigger picture and understand that accomplishing things in technology is not just about the guy in the lab?

I know you're painting the "those non-engineering higher-ups take the praise for our engineering work" angle, but I don't consider it accurate. It belittles the contribution of higher-level execs to see them as mere "sales" guys. Of course, it's an all too common point of view among lab guys.


Apple offers educational discounts.


Don't worry about it. Every thread needs a kids' table, and in some threads they're just more prominent.


Nicely put.


If all you know is buzzwords, buzzwords are all you hear.

Personally, the idea and explanation of knowledge-based computing is where my attention went. Does your favorite language have knowledge about the world built into it?
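
To make that concrete, here's roughly what I mean (a small sketch; the exact return values depend on the curated data version):

    CountryData["France", "Population"]      (* curated country data, with units *)
    ChemicalData["Caffeine", "MolarMass"]    (* curated chemistry data *)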


Almost any language can throw a query to google, so yes.


It is bad form to TM every mention of a word; you should only mark the first one.

Anyway, invariably someone bashes Wolfram's ego whenever anything about Wolfram comes up. This should be an internet law already.


Obviously, it should be called Wolfram's law.


I think he trademarked the word "Symbolic" as well.


I had the same impression... and what's with the body language of the guy? Waving his hands like a populist dictator... signs of "oh no, we really need marketing because the product sucks".

The product may actually be good... but this kind of narcissistic marketing is better suited to weight loss products, self-help, pick-up-technique e-books, etc. (anything that makes money off your insecurity).


How I love irony...


Where's the source code? Is this something like CDF that can only be used from Wolfram's binary blob plugins/exes?

If they don't release the sources for the compiler/interpreter it will be as popular as their Computable Document Format.


Good question. If they don't release the source, then this, as awesome as it is, is just a product and not a language.


Languages don't have source code, they have documentation. I haven't looked through it all, but this appears to be documentation specifying (to some degree) the Wolfram Language. Therefore the language exists, regardless of whether any implementations of interpreters or compilers thereof exist and are open-source.


A formal spec would be good enough too. But since he's specifying concrete output and graphics as part of the language, it's rather hard to specify binary-compatibility without a reference implementation.

This is why every interpreted language without a reference implementation has failed miserably.


SQL, Prolog, Scheme, and VHDL seem to be doing just fine.

And if you leave off the "interpreted" qualifier, which is (a) a property of an implementation, not a language, and (b) rather meaningless anyway (can you actually define "interpreted"?), well, then C ought to count too.


None of these are established as a single compatible language but rather as a family of languages.

Wolfram obviously isn't aiming for that.

Go to Wolfram's page. That is no spec, it's a rough description of a particular implementation of a (closed source) system.


Uh, all ISO Prolog programs run on all ISO Prolog implementations, regardless of what extensions a particular implementation has. Same for RnRS Scheme.

And VHDL and C99 definitely aren't "families" of languages. GCC, Clang, ICC, and maybe even MSVC all interpret the same code semantically equivalently. Unless you can either show C99 to be unsuccessful, find a reference implementation of C99, or define "interpreted" in a meaningful way, I think your assertion doesn't hold.


It's not as if they started as "families" of languages; each one of those was just one discrete language at one point.


I'm a big fan of Prolog and Scheme, but seriously, the lack of an established solid spec has seriously hurt them.

There's no way Wolfram can succeed with its much more complex feature set without a spec and a reference implementation that's not closed source. Flash pulled that off because it caught that opportunity window in the browser, but neither CDF as a format nor Wolfram (the Mathematica language) stands a chance as a generic language. Even Matlab stays niche.


You think ISO Prolog and R6RS aren't "established" and "solid"?


Definitely. I use both, and nobody uses ISO Prolog, much less pure R6RS Scheme. Here's hoping R7RS will be better, although for starters it will be divided in two (small and big). I've been following the process quite closely.

R6RS completely failed to establish itself; to this day R5RS is more popular.

You'd be hard pressed to find a more divided language than Scheme. As for ISO Prolog, it's not even the most popular standard. It might be in America, but in Europe it's definitely Edinburgh Prolog, and that's where it's used the most.

You guys are giving me great examples of how interpreted languages without a clear spec + reference implementation completely fail to establish a single standard. It can be argued that in the case of Scheme this is a feature, but for something like Wolfram with an existing closed runtime you definitely don't want this.


I'm trying to figure out how VHDL fits in that list. It's essentially a spec in itself, only for hardware rather than software, is it not?


It's a language, it's formally specified (by the IEEE), and it has no preeminent reference implementation. Its domain is irrelevant to the assertion I countered.


Whereas Wolfram isn't. I'm trying to understand the analogy, or how this contradicts my assertion, if that was the point at all.


APL, too


> This is why every interpreted language without a reference implementation has failed miserably.

Matlab? I'm not exactly sure what you mean by "reference implementation", though.


Yeah, Matlab is Matlab. You are locked in to them, or you try one of the attempts at a compatible runtime.

Far from desirable and this is probably Wolfram's wet dream.

I doubt it will happen though.

This can be considered a commercial success, although it's pretty much a failure for the user, the student, academia, ... everybody else.


VBScript and AppleScript come to mind.

Oh, how I wish they had, some days. But they didn't.


Both qualify as failures to me, but they did have at least a spec.


The language exists because it has distinctive features, like its evaluation model and symbolic nature. The rest is a matter of implementation. http://mathics.net is an attempt to create a free one.
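
A tiny illustration of the evaluation model and symbolic nature (my own example):

    D[Sin[x]^2, x]    (* => 2 Cos[x] Sin[x], derived symbolically *)
    Hold[1 + 1]       (* evaluation can be suspended and inspected *)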


According to Wikipedia: "It also has a large amount of documentation, but it is not standardized. A partial standardization is planned, and an incomplete pre-release already exists."


Someone just pulled that out of thin air. Wikipedia is as good as guessing at this point.


Serious question: How will memory management work for this?

Not-so-serious corollary: How much RAM would a Wolfram wolf if a Wolfram would wolf RAM?


> How will memory management work for this?

There's a garbage collector, and the symbolic structure of the language makes it quite effective. You can also manually control the memory manager; one nice feature of Mathematica is how much of it is exposed.

http://reference.wolfram.com/mathematica/tutorial/MemoryMana...
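
A few of the exposed knobs, as a quick sketch (numbers vary by session):

    MemoryInUse[]          (* bytes currently held by the kernel *)
    MaxMemoryUsed[]        (* high-water mark for this session *)
    $HistoryLength = 10;   (* keep fewer In/Out values alive *)
    Share[]                (* share identical subexpressions to reclaim memory *)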

> How much RAM would a Wolfram wolf if a Wolfram would wolf RAM?

Easily the funniest thing I've read on HN. You're my hero.


All of it.


So... pretty much just like Java then?


Hands down... Best comment ever!


When I was a student at university in 2003-2008, I loved, loved, loved the Mathematica language, Mathematica's uncluttered cell-based editor and its rich interactive documentation. You could execute and edit examples right in the docs! I did so many things: nicely formatted pages (much easier for typing in complex formulas than LaTeX), computing simulations, crazy charts, etc. I never could understand how people can use Matlab or Maple when there is Mathematica.


Yeah, I liked it as well; the notebook execution especially. Makes me sad to see how primitive IPython/IHaskell are now compared to the Mathematica editor of almost 20 years ago. I hoped it would be about the same experience or better, but it's far from it. Especially the web version for IHaskell (I assume the IPython web version is the same?) is horrible, absolutely horrible. But the idea is great, and it is possible to make it work well, as Wolfram has shown.


Well, this is how it is now, I'm afraid. There are "data scientists" using IPython convinced that they themselves and the tools they use are on the cutting edge. Like there are database people who think MongoDB is advanced. Or web guys using their favourite "framework" kidding themselves that they're doing anything more complicated than Perl CGIs did in 1994.

It's like an Amiga owner being shown a "multimedia" PC in 1995 and thinking, I was doing all this 10 years ago.


The Amiga example works; it definitely took me over 10 years to stop saying that after I was forced to move to PCs. And then I had it again with Unix (SparcStation 5s, which never crashed and never had to be rebooted, versus our uni's hundreds of PCs, which had to be rebooted every night or they'd be unusable during the day -- and they still were, actually); luckily everyone listened and we're all using that now. Well, no, but at least you CAN now without being hopelessly crippled.

Thing is, these notebook interfaces COULD be really brilliant, but they suck and I don't know why. Maybe just not enough people working on them?


How much do the people working on them get paid to work on them?

What many users of open source don't seem to understand is that actually, honest-to-goodness open source development is a major undertaking and imposition on a person's time. And quality software takes actual teams of actual people with experience and commitment and a basic peace-of-mind about paying their mortgage and putting food on the table.


No, it's more subtle than that. If you have paying customers and hence money, you can pay for experts in domain X to sit down with your programmers and finesse the product. If you don't, then what you have is a wishlist of features but no real, deep understanding of what users use them for. That's why the GIMP is the GIMP: every feature under the sun in a horrible mishmash interface, and for most of its life unusable for professional work, because the one feature it really needed, 16-bit colour, wasn't prioritized -- none of the devs realized it was needed.


You are right, and I do understand that, actually. So instead of whining I'll go help improve it. Not the web version for now, though; the everything-on-the-web thing is not working for me, as even my i7/16GB/SSD system is far too slow for productive work in that kind of context.


For the same reason that GIMP is no Photoshop.


I wish Mathematica had a vi editing mode. As it is now, editing the command line is stupidly slow for someone who has lived in vim for 20 years.

This is why I love IPython on the command line. The language is nicer too, and editing is a breeze. Of course, Python and the available libraries are not much of a symbolic computing environment, but for the kinds of problems I deal with it's faster to get things done than in Mathematica.


As the author of IHaskell, I'd love to know what you find lacking! (I agree that there's still a lot of room for improvement.) The front-end is entirely IPython, so there's probably not much I can do other than contribute to IPython, but I would love to hear concrete suggestions (feel free to use GitHub issues).


Sorry if I came on a bit strong; it was serious disappointment on my side, as a Haskell hacker and a user of Mathematica for my Master's almost 20 years ago. Of course your work is valued by me and many others, and as I was writing the above I thought: why not try to work on the IPython interface myself as well.

I would definitely say that the smoothness of the Mathematica interface, the way it works and interacts currently, would be a great thing for programming languages like Haskell. It's just clumsy the way it is now; that's not something you can help, but it needs fixing. Let's discuss further somewhere!


Hey there! Sorry I missed this comment earlier, but if you see this, please feel free to ping me via email (my email is in my profile!)


Except that the Mathematica editor still lacks a decent undo function: http://undo-for-mathematica.alaifari.com/undo-how-useless/


The v10 beta I have has it. So you won't have to wait much longer.


I'm on v10 beta as well, and it still remembers only one undo. Is it v10.0.0.0, or is there a newer release?


And no redo either, I suppose? Nothing good comes from the current undo function in Mathematica; if you accidentally press it you're bound to lose work.


I shouldn't have said beta. It's more of an internal nightly. I don't think the version number is different though.


Introductory video by Stephen Wolfram himself:

http://www.youtube.com/watch?v=_P9HqHVPeik


A real introductory video would be interesting; this is much like an infomercial, though a good infomercial.


To C&P myself from the previous thread, this looks awesome but I have a few questions:

1. Does the language work without an internet connection?

2. Is it free to use?

3. What happens if/when Wolfram the company goes out of business, or Wolfram himself retires?

4. Are all the disparate APIs and sources that the language uses determined by fiat, or can they be customized? What happens if/when the APIs/sources stop working?

5. Given that the language can do things like query your Facebook graph, it seems like there's a lot of API upkeep that the language requires. Is this done by Wolfram? How does this work when I'm running locally (if this is even possible)?


I had similar thoughts about humanity losing access to Wolfram's Mathematica after reading of his trip to the Leibniz archive last April. After all, corporate officers have a fiduciary duty to maximize shareholder value, and computing is littered with inaccessible code from foundering companies.

I have become less worried. Wolfram invites comparisons to people like Leibniz for his own ends. I think it is less to stoke his vanity than to set high standards for himself. Choosing Newton as a role model may be vain but it also places massive responsibility upon him. Newton sought to solve important problems not disrupt pet grooming social networks.

He has extended Mathematica and added a search engine and is shipping it with the most affordable computer hardware he can find. Wolfram Language is an attempt to make maximum human knowledge available and Wolfram has chosen to distribute it inside a project to provide universal access to computing hardware.

Will it all work out? There are no guarantees. Millennia are long and media are fragile. But I don't think there are many people more committed to the challenge than Wolfram.

http://blog.stephenwolfram.com/2013/05/dropping-in-on-gottfr...


This looks almost identical to Mathematica. What's the difference?


It is. This is just a description of the language used in Mathematica. Since they are also releasing other products that use the same core language, it looks like they are presenting it in a product-neutral form, but it is just the language of Mathematica.


I'm going to continue programming "in Mathematica" then. Definitely not going to rub his ego by programming "in Wolfram"!

His company offers great products... but his personality irritates me.


I am developing a programming language called Egont. It means "the negation of ego".


I'd be interested to know quite how they're solving the "canonical names for everything" problem [0]. They're presumably making their own ontology, rather than using someone else's.

Also, best attribute name ever [1]: "BoundStateWaveFunctionMomentumRepresentationAbsoluteValueSquared"

[0]: http://reference.wolfram.com/language/ref/CanonicalName.html [1]: http://reference.wolfram.com/language/ref/PhysicalSystemData...


Has the language gotten much faster in recent years? I coded some microarray data analysis in Mathematica a few years ago and was very disappointed by the slow execution; it made it unusable.

It may have been a "naive" implementation that could be much improved, but a similarly naive Java implementation was much faster and not significantly more difficult to write.


Depends on what you mean by recent. In the last 10 years: yes, much faster for number crunching (like SVD, Cholesky, determinant computations, etc).
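
Easy enough to check on your own machine, e.g.:

    m = RandomReal[1, {1000, 1000}];
    AbsoluteTiming[SingularValueDecomposition[m];]   (* {seconds, Null} *)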


Thanks, good to know. I'm most interested, though, in the parts between the canned procedures (which I assume are written in C or similar), i.e. custom algorithms. For example, loops which manipulate arrays, and pattern matching (probably not the right term, maybe "rule selection")?

I seem to remember the pattern matching was especially slow, though very powerful. Had the latter been fast enough to use, it would have been a big advantage over Java. I'd be happy with single-pass linear matching like ML/Haskell if it were just much faster than what I experienced with Mathematica's; I wonder if there's a way of doing that? Or type hinting, which most dynamic languages seem to eventually need to eke out more performance?

Hard to imagine that Mathematica wouldn't need something like that for good performance when you stray from the canned routines, but I'd like to be wrong!


Did you use Compile[] in your functions?


I was able to compile some small expressions, but I couldn't compile some larger functions which had parts that were not compilable (I don't remember why).

However, looking at the compile support now, it looks like it's more fully featured. For example, I see type hinting is supported now (maybe it was before and I didn't know it).

It looks like it's worth trying again.
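
For anyone else revisiting it, the shape of a type-hinted Compile is roughly this (a minimal sketch of my own):

    f = Compile[{{x, _Real}, {n, _Integer}},
      Module[{s = 0.}, Do[s += x^i, {i, n}]; s]];
    f[1.5, 10]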



That was for the demo video, not for the language documentation page itself.


From the demo video, it seems to be a query language for a database of knowledge. How this knowledge database is maintained and kept up to date is more interesting than the query language itself.


This is not intended as a stand-alone language like Python; rather, it's a DSL dialect for integrating with Wolfram Alpha. You're basically using this dialect to get very specific results from the Wolfram Alpha service.


Not quite. That's only one facet of it; you can still do whatever you were doing with Mathematica, without Alpha.


Not really. Wolfram Alpha is a pretty recent addition to the Wolfram product lineup, and Mathematica/"Wolfram Language" was very popular in academia before that. There wasn't even a direct way to access Alpha from Mathematica until Mathematica 9 (or maybe 8, I forget) and even now it's not a very commonly used feature. The Alpha API call limits (1000 per day for a professional license) pretty much prevent you from doing any serious work with Alpha.
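
For reference, the in-language access looks like this (each call counts against that quota):

    WolframAlpha["distance from Earth to Mars", "Result"]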


"Commonly used" depends on who you ask. I use it all the time to help me look up units. It may not be 1000 calls/day, but it's still useful. Also the #/calls can be expanded with enterprise Mathematica.

It was added in Mathematica 8 (see bottom of http://reference.wolfram.com/mathematica/ref/WolframAlpha.ht..., "New in 8").

Disclaimer: I work for Wolfram Research


Any open source version? Also, to people who have implemented their own language interpreters before: is the description enough to attempt an open source version of it? Might be a fun exercise to try to implement it in jison.


There's a start of an open source implementation of Mathematica called Mathics, which is written in Python.

JavaScript would be an interesting choice for a toy implementation, especially because there are orders of magnitude more JavaScript installations than Python, and much more effort put towards making JavaScript fast. And who knows, maybe the toy would turn into something more serious.


Brian Beckman presented a session at YOW!2012 where he briefly discussed the possibility of implementing the Mathematica language in Javascript.

http://yow.eventer.com/yow-2012-1012/term-rewriting-in-javas...


People who were at Caltech at the time have gone on record that Wolfram knew very little about compilers when he wrote Mathematica:

http://groups.google.com/group/comp.lang.lisp/msg/f3b93140c2...

which explains why the Mathematica language contains so many amateurish mistakes, like its bizarre implementation of scope and its wacky pseudo-macro system. Don't get me wrong, Mathematica is a great product, in particular the standard library is fantastic, but the language itself is a mess.


Smug lisp weenie complains commercially successful system not implemented in lisp, news at 11.

The story isn't about Mathematica; it's about SMP, which predated Mathematica by about 8 years. Wolfram was 20 at the time.

Back at Caltech, Rob Pike convinced Wolfram that C was the future and that was a good thing.


Who is this product aimed at? If it's academics, students, etc., then it's doomed; the wave today is free open source software.


Wolfram has gobs of money, at this point. He probably just works on whatever he thinks is cool.


He has no beard. The language won't last.



The demo gives me the impression that it's more a tool for solving something once, producing a result and discarding it, than a tool for creating something maintainable that runs repeatedly. I'd love to get my hands on it, though, just to play around with it.


How difficult would it be to create an open source version that's much better than Wolfram Alpha? This kind of software is too powerful a system to allow one person or one organization to have full control over it...


It will be interesting to see if this language takes off. I see its use limited mostly to academics. Long-term, I don't expect too much growth, since most non-academics would rather use more general-purpose languages.


> I see its use limited mostly to academics

I hope not. If results depend on a specific implementation of a proprietary tool, there is no way to precisely replicate them from scratch.


In which version of Mathematica will the complete language be available? I can't seem to use the Classify function in my Mathematica 9.0.1 Student Edition.


Classify will be released with Mathematica 10, which is officially coming soon...

http://www.wolfram.com/mathematica-10-coming-soon.html

You can tell from looking at the bottom of the documentation page:

http://reference.wolfram.com/language/ref/Classify.html?q=Cl...

It has links to

    Summary of New Features in Mathematica 10
    New in 10.0: Data Manipulation
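
Going by that documentation page, usage should be along these lines (an untested sketch, since it needs version 10):

    c = Classify[{1.1 -> "low", 1.3 -> "low", 7.8 -> "high", 8.2 -> "high"}];
    c[7.5]   (* should give "high" *)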


I'm glad to see this language has first-class hashtables (Association). Does it have any other data structures that Mathematica was lacking (e.g. trees)?
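
For reference, the Association syntax from the docs:

    assoc = <|"a" -> 1, "b" -> 2|>;
    assoc["a"]      (* => 1 *)
    Keys[assoc]     (* => {"a", "b"} *)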


I have had the hardest time looking up economic data on Wolfram Alpha. I hope this reference will let me figure out the proper syntax.


FRED doesn't have the data you need?


I was looking up things like historical US spending and revenue levels. And not for work, just on my own.


The fact that it is so adapted to all those current needs is also its weakest point.


Can anyone recommend a good dead-tree book on Mathematica?


"Power Programming with Mathematica". It's out of print, but can be found free of charge on the Mathematica StackExchange: http://mathematica.stackexchange.com/questions/16485/are-you...

This is a great collection of resources: http://mathematica.stackexchange.com/a/606/241 (The most edited post on all of StackExchange)


Excellent. Thank you.


Wolfram's The Mathematica Book is a great reference and has some good examples for learning Mathematica (plus, used editions are really cheap: http://www.amazon.com/s/?field-keywords=mathematica+book). You can look at the contents online to see if it clicks with you: http://reference.wolfram.com/legacy/v5/

Mathematica Cookbook is also a nice reference. The book is much less systematic, but provides code that solves more realistic problems than the Mathematica documentation.


Thanks. Is the language pretty stable, with the libraries being what mostly changes between Mathematica releases? Or is there much obsolete stuff in version 5 that's not present in version 9 or 10?


There's very little that's obsolete. The newer versions almost exclusively bring additions.


Good to know. It makes sense that for a proprietary language implementation you need to bump major versions on a more regular basis. Perl 1.0 has some significant differences from Perl 5.18.0.



