Starting to Demo the Wolfram Language [video] (stephenwolfram.com)
190 points by samolang on Feb 24, 2014 | 71 comments



I worked on Mathematica for 2 years, and really the biggest downside is that it's an incredibly powerful tool that fails in many practical applications. It failed to power the backend of a search engine (Wolfram Alpha), and based on the latest version of Mathematica I've used, the dynamic computing features, visualization, and JIT compilation features still have a long way to go. It's an incredible and fascinating tool when you're given these toy models to put into it, but I don't really see a Wolfram Language revolution happening anytime soon.

One area where I think Mathematica will excel is code generation. Using symbolic constructs in a very high-level language to generate constructs of a low-level language is much easier to do in Mathematica than in basically any other language.
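
For example (just a minimal sketch, nothing like production code): start from a symbolic expression, manipulate it symbolically, then compile the result down to C:

    (* symbolic expression -> symbolic derivative -> compiled C kernel *)
    expr = Sin[x]^2 Exp[-x];
    dExpr = D[expr, x];                          (* symbolic manipulation *)
    cf = Compile[{{x, _Real}}, Evaluate[dExpr],
           CompilationTarget -> "C"];            (* requires a C compiler to be installed *)
    cf[1.5]

If I remember right, the CCodeGenerator` package can then emit standalone C source from a compiled function like that.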

On a side note - Linus Torvalds thought naming his operating system "Linux" was too arrogant. I think it's pretty bad marketing to name your programming language after yourself, but then again, the same social conventions that apply to us regular human beings certainly don't apply to Stephen Wolfram.


The Wolfram Language has much more functionality than what existed in Mathematica, so it remains to be seen whether it will have applicability outside of toy problems; it isn't fair to judge it solely on the merits of Mathematica.

Also, social conventions are exactly that - conventions implicitly or explicitly chosen by a group of people. That group of people does not encompass the entire population of the planet. Hence, there is no reason to assume that either your or Linus Torvalds' social conventions apply to anyone outside of you or him. I see no reason why he shouldn't name his work after himself. Is it bad marketing? I don't think so - Wolfram's name is a strong brand.


You make an excellent point about social convention, and I completely agree. However, one important thing to note is that SW is not involved with core Mathematica development; he does research and remotely manages his company. He suggests high-level features that are then implemented by the kernel team (which I was on). One incredibly annoying aspect of this bureaucratic development process is that you never know when your kernel project will get killed by someone higher up the food chain. Obviously, this happens at many companies, but when it happens as often as it did when I was there, you see key employees leaving a lot more often than you do elsewhere.

It's worth pointing out that the architecture of Mathematica was designed by Stephen, and in my experience with other large software projects, it is by far the most elegant and well-designed. Reading the original code base gives every developer a great appreciation for the genius of Stephen Wolfram, who is a completely self-taught computer scientist. So he definitely deserves credit for what the language is today, because if he hadn't planned that far ahead and realized the need for a general-purpose computing platform, the pace at which the Mathematica team adds new features would not be nearly as fast as it is.


What happens if you run it on itself - e.g. you write some sort of basic program with a little bit of structure and generate code from it, and then analyze that code, e.g. to make a graph of it (if that makes sense). Or to put it another way, what are the possibilities of making better tools for Mathematica (or Wolfram Language or whatever it is called now) using Mathematica?


It would be slightly easier than coding Mathematica from scratch (you have better computers and an example to follow) but much harder to fund, since you would be re-treading old ground and wouldn't be competitive for a long time. The types of problems simplified by Mathematica don't include coding Mathematica. To be sure, it pioneered a few nifty techniques that you would want to take advantage of, especially regarding symbolic manipulation, but you wouldn't be able to implement them faster if you used Mathematica (vs just drawing inspiration from it).

Mathematica's huge selling points (to me) are that it's a fantastic CAS and it shatters the barriers between analytic and numeric tools. Many engineering and statistical codes involve 2/3 math and 1/3 computer science (or the other way around). Mathematica is in a completely different league for those problems because it handles the 2/3 about 50x better than your typical programming language and it handles the interface between the 2/3 and the 1/3 about 100x better than your typical programming language.

Mathematicians, engineers, and scientists regularly use a number of abstractions that your typical programmer doesn't care about: integration, differential equations, linear algebra (including infinite-dimensional and tensors), coordinate systems, statistical distributions, etc, etc. These abstractions are almost never considered during language design so you spend hours writing glue code to achieve the simplest of tasks. Unless you have a Mathematica license.
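
To make that concrete, each of these is a single built-in call rather than an afternoon of glue code (mu and sigma are just symbols here):

    Integrate[Exp[-x^2], {x, -Infinity, Infinity}]   (* symbolic integral: Sqrt[Pi] *)
    DSolve[y''[t] + y[t] == 0, y[t], t]              (* general solution of an ODE *)
    Eigenvalues[{{2, 1}, {1, 2}}]                    (* linear algebra: {3, 1} *)
    Mean[NormalDistribution[mu, sigma]]              (* distributions are first-class objects *)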

Example: you're writing a finite element code. The "meat and potatoes" of the problem involve putting a differential equation into weak form and evaluating a bunch of integrals to compute matrix elements. Mathematica can represent and display those elements using proper mathematical notation, it can find analytic AND numeric solutions, it can hold off on substituting variables until the last second (or elegantly substitute different sets of values for different problems), it can visualize all of this in a dozen different ways without worrying about indices and sampling rates, and it can generate code if you're feeling trusting that day.

Now you want to evaluate convergence for a bunch of different meshing strategies. Not a problem: Mathematica can interpolate your meshed solutions to whatever order you want in whatever dimension using a single command, and it can take differences between those interpolations and integrate them against one another. You never have to notice that the mesh points don't line up. You never have to spend a second thinking about multi-dimensional quadrature on tetrahedral meshes.

Now you want to compare statistics about the mesh graph to an analytic model you found using Bayes Theorem in 3D parameter space and solve for the maximum likelihood values? Again, you don't have to worry about meshing or gradient descent or how to put least squares into matrix form. Just write the high-level math and let Mathematica worry about the rest.

Now you want to use a simplified PDE model with an analytic solution and compare that to the numerical solutions? Mathematica can solve the PDE and take care of all the interpolation and quadrature needed to integrate the square difference with the numerical solutions. Then it can plot all these errors and update the plot automatically as you go back and change things. The value-addition vs a general purpose language is off the charts.
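
To give a flavor of just the analytic-vs-numeric comparison step (a toy ODE rather than a real FEM problem, so treat it as a sketch):

    (* solve the same problem analytically and numerically, then compare *)
    sol  = y[x] /. First@DSolve[{y''[x] + y[x] == 0, y[0] == 0, y'[0] == 1}, y[x], x];
    nsol = y /. First@NDSolve[{y''[x] + y[x] == 0, y[0] == 0, y'[0] == 1}, y, {x, 0, 10}];
    NIntegrate[(sol - nsol[x])^2, {x, 0, 10}]   (* L2 error; the quadrature is handled for you *)
    Plot[{sol, nsol[x]}, {x, 0, 10}]            (* overlay the exact and interpolated solutions *)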

None of that translates to the process of writing another Mathematica, since the algorithms Mathematica lets you abstract away are precisely the things you would need to worry about to implement another Mathematica.

Mathematica = A + B = (1000s of algorithms that mathematicians, scientists, and engineers care about, all coded with a common interface) + (syntax that elegantly glues them together)

B is comparatively easy and interesting if you like languages -- Mathematica's language is LISP on steroids. A is the schlep, and it's a doozy.
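
One tiny taste of what "LISP on steroids" means in practice: everything is a symbolic expression, so code can be inspected and rewritten with the same rules you use on data.

    FullForm[a + b*c]                              (* Plus[a, Times[b, c]] -- code is an expression tree *)
    Hold[Plot[Sin[x], {x, 0, 1}]] /. Sin -> Cos    (* rewrite unevaluated code with a rule *)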


Thanks for the comprehensive answer! I phrased the question poorly, in that I didn't intend to suggest running Mathematica against its own codebase but rather against shorter end-user programs, with a view to developing IDE-type addons.

I agree with your general comments, as I hate writing glue code and my interest is primarily for DSP and simulation applications, for which I already admire it.


Whoops, sounds like you already know exactly what I spent 3/4 of my post belaboring :)

I'm still not sure I understand your proposal. How do short IDE-type addons solve the problem of integrating siloed codebases with each other? The way I see it, the only way forward is

1. Convince a language person that first-class support for algebraic metaprogramming is important

2. Convince someone from each of the fields {numerical methods, linear programming, finite element methods, graph theory, etc} to export an interface to the language in #1, possibly re-structuring their entire codebase to do so

3. Documentation, convenience schleps

I say "convince" because the problem of making a Mathematica replacement is far enough down my todo list that I have no choice but to be realistic about the fact that I probably won't be that person.


I'm very into flow-based programming because I use a fair bit of it for audio DSP and that's how I think. So these days when I look at code-as-text I often find myself thinking about how I'd like to see the functions, branching etc. abstracted out into a flow-chart diagram automatically. I was just struck during the demo (and from some previous explorations) by how good M. is at developing graphs from structural information, and I think to myself 'why not from code?'.

On a tangential note, this latest demo looks very very close to the sort of thing Bret Victor has been talking about over the last year or two (http://worrydream.com). If you haven't checked it out I think it might be interesting to you.


> why not from code?

We build networks of lots of different kinds against our own codebases and databases. And personally I've done a bunch of experiments around visualizing code in the past.

Unfortunately, I don't think I, or anyone else at the company, has convincingly cracked the problem yet. Generally, the visualizations look pretty, but they aren't "tactile" or practical enough to be useful.

But you've convinced me that it would be worth digging up some of my old experiments and writing a blog post around them, so that at least any progress I've made can be carried further by other people.


> But you've convinced me that it would be worth digging up some of my old experiments and writing a blog post around them...

I would certainly be interested in seeing that. I have a lot of that sort of work looming in my immediate future.


"it shatters the barriers between analytic and numeric tools."

"and it handles the interface between the 2/3 and the 1/3 about 100x better than your typical programming language."

Thank you for that great word picture :D


Yes! The prospect of doing that deeply and seriously is probably the single thing I'm most excited about tackling next.

There's a whole tier of previously impractical metaprogramming that would come with being able to analyze and manipulate programs (and commit histories) as easily as data.

Syntactic macros are just the teeniest tip of the iceberg!


Could you give some more detail on how it failed to power Wolfram Alpha? Did they switch to an alternate language?


Mathematica handles data aggregation, recursive descent parsing, and other tasks well suited to it, while the rest is an enormous codebase of compiled and JVM languages stitched together with Mathematica. Such projects are referred to as the "mud bowl": when something breaks, you just throw more mud at it.


> recursive descent parsing

The parser is Java, actually. And it's not recursive descent.

> tasks well suited for it

Fancy that! Using a language for things it is well suited to! :)

> compiled and JVM languages stitched together with Mathematica

A lot of stuff is still WL though. Data curation, evaluation, scanning, typesetting, formatting, and, you know, the actual computation in doing something like http://www.wolframalpha.com/input/?i=regression+of+literacy%...

But it seems like you're complaining that the backend isn't written in Wolfram Language? The JVM is great at that stuff! I don't call that a ball of mud at all, just using the right tool for the job.


Sorry for being inaccurate Tali, you're right about the parsing and the other areas where WL is being used. It's been a while since I poked around WA.


So, I'm curious, is that similar to what Wolfram Cloud will provide? To me, it sounds like a service that will run your Wolfram Language code in the cloud, just like Wolfram Alpha runs it in the cloud as part of its backend. Is that accurate?


The demo looked really cool. I doubt it's better than Lisp for doing symbolic computation though.


It looks friendlier to most people. Like it or not, parentheses scare people away.


From using both Lisp and the Wolfram Language (in the good old days when it didn't have a name and was called "the language Mathematica uses"), I have to say I find lowercase and dashes, parentheses, and spaces more readable than CamelCase, square brackets, and commas.


Avoiding commas is simple. Just convert Times to Sequence when needed and always use * for multiplication.
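
E.g. (if I'm remembering the idiom right), juxtaposition parses as Times, which you can then splice apart:

    f[a b c] /. Times -> Sequence    (* gives f[a, b, c] *)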


This has been a very long time in coming. Here's Wolfram in a 1993 interview:

"One of the things that I consider an exciting direction is to what extent we can expand the use of the language itself, independent of the application side of Mathematica. We've considered making a thing that will probably be called M, that is essentially Mathematica without the mathematics."

http://www.stephenwolfram.com/media/stephen-wolfram-multipar...

What's funny is that it's only the "without the mathematics" part that hasn't necessarily come true, except possibly in terms of marketing. This little Raspberry Pi below my desk has a "Wolfram Language" distribution on it, which as far as I can tell is exactly the same as Mathematica.


You're right, I think. The language that's been at the centre of Mathematica for 25 years hasn't had a name of its own (it's just been "the language used in Mathematica"), but now it has its own name - the Wolfram Language. The Pi has an early pre-release version of version 10, whereas Mathematica is currently on version 9, and there are a number of additional features which will presumably be included in the release of Mathematica version 10.

Leaving aside the re-branding issues, it makes some sense for the language to have a separate name, because Wolfram intends for it to be used in web browsers and other places where there is no surrounding development environment (the Mathematica notebook and 'front end').


I'm working on something akin to this, although my version is somewhat more ambitious[1].

So far I have a "Knowledge Engine" that can answer (some) questions in natural language by using a process similar to how IBM Watson Deep QA works (an extremely simplified version atm, though).

I'm currently working on integrating SymPy for symbolic calculation[2]. SymPy already has a web interface[3] which is broadly similar to what WolframAlpha can do. That relies on Google AppEngine to provide security sandboxing.

My version is designed to be completely private, open source and self-hostable, including all the data[1], so I can't rely on AppEngine. Instead, I have SymPy running inside ZeroVm + LXC (Docker) as a sandbox. I'm currently working out how to get that to communicate with the front end (which is non-trivial, because ZeroVm is pretty much undocumented).

Obviously SymPy isn't as "Natural Language Like" as the WolframAlpha language, but arguably Python is actually more powerful.

After that I'll do OpenStreetMap integration (i.e., self-hosted OSM), which will give it better geographic understanding than it has now.

[1] Every time I mention this I need to include a standard disclaimer that I do realize this is completely and utterly crazy.

[2] http://sympy.org/en/index.html

[3] http://live.sympy.org/


How do you represent queries? That's really, for me, the most interesting question.


I run multiple parsing strategies in parallel. One is based on Quepy[1] (which generates a parse graph, then runs RegEx on the parsed objects), and a simple thing I wrote using a combination of NLTK and a Java processing thing.

For "question answering" (the bit I have working) both end up being converted into SPARQL queries and then run against a local RDF store.

For SymPy the idea is that my frontline parallel pipeline thing will throw the query at SymPy as well as the other parsers, but SymPy will be the only thing that knows what to do with it.

[1] http://quepy.readthedocs.org/en/latest/


This seems like they have essentially codified wolfram alpha's many APIs and created the biggest, baddest standard library any language has ever seen, and slapped on some functions to access everything with. It seems like the Wolfram Language essentially sits at abstraction level max-1, which is pretty amazing.

I wonder how things break down when you don't want to do something that can be perfectly described in existing functions (e.g. you need to perform some novel computation on an intermediate result of some function). Though I'm sure people much smarter than myself have made sure the code is modular enough to allow you to get your hands dirty (or they've written the code such that if you have to get your hands dirty, you're doing it wrong).


> This seems like they have essentially codified wolfram alpha's many APIs and created the biggest, baddest standard library any language has ever seen, and slapped on some functions to access everything with.

Yes! I couldn't have said it better!


From my understanding, a big part of its intended usage is for interfacing from applications in other languages. So you can offload all of the difficult calculations from your C/Java/PHP program and get Wolfram to do the work.


Ah, so now people will have a database process, an application process, and maybe a wolfram process to handle tough computation...


First, as random internet commentator guy, I want to thank Stephen for his contribution here. This and NKS are incredible.

Second, I keep seeing videos and articles about Wolfram Alpha and am always left scratching my head kinda going "WTF?"

So it's a knowledge-based purely symbolic programming language? Can it play Flappy Bird? Power the next cool startup? Give me an answer to a question I might have today? I've been to alpha a couple of times. Never seemed to be able to get the questions right to get the answers I wanted.

I don't mean that to be negative. My point is that any audience is going to have a thousand different needs and viewpoints, most of which will probably seem trivial to Stephen Wolfram. Most new technologies get past this hurdle by open-sourcing. That way there's a thousand experiments, and over time a few of them are bound to give all of us little nerds a gleam in our eye one way or the other.

The other way of doing this, where you have a single point of creation describing to the world how cool it is? It is limited to things that have a single terrific property that everybody can agree on.

I'm not sure that is the case here. Still -- awesome stuff. Can't wait to watch this evolve.


Q:"What is the standard deviation in height for humans?"

A: inches and people are not compatible

whereas:

Q: what is the average height of humans? A: 5.3, with a nice graph showing the +- sigma

So it had the info, but just couldn't understand the question.

Whereas Q: what is the average height of male humans in Canada?

gives me an answer and chart "weighted for USA demographics".

If I change 'canada' to USA, I get a graph labelled the same, but with a different numerical value.

I don't think I trust this. At least with Google I can inspect the source, read links, and so on.

One more: "What is a basis vector"

Wolfram gives me the stock quote for BASI, google starts things off with the linear algebra definition followed by a lot of relevant links.


To be fair, normally you say the standard deviation "of" a distribution, not the standard deviation "in" a distribution. When you use the word "of," the query works fine.

http://www.wolframalpha.com/input/?i=what+is+the+standard+de...


> Can it play Flappy Bird? Power the next cool startup? Give me an answer to a question I might have today?

Although it is possible to make games in Mathematica, it is a painful process.

At the Wolfram Technology Conference two years ago, I met some startup founders who were using Mathematica for some big data projects. It works great for people as a tool to use in the background for visualization and data analysis, but it's definitely not at the stage where a startup thinks "I'm going to deploy Mathematica to the cloud" or "let's build our interface with Mathematica" the same way they think about Ruby on Rails or JavaScript.


Yeah I really like where I think it is, but it still has that "mad genius" feel to it, where one guy explains how awesome it is and you're left kind of empty-handed.

Want to see a lot more eyeballs on this and people playing with it. Hope Wolfram doesn't let "licensing fever" get in the way of the community understanding it fully.


Here are some cool open source projects that use Mathematica.

HadoopLink: https://github.com/shadanan/HadoopLink

ArduinoLink: https://github.com/keshavsaharia/ArduinoLink


I was playing around with Wolfram Alpha yesterday and found this page of examples:

http://www.wolframalpha.com/examples/


I would say it is a matter of available libraries, deployment options, and how fast the generated code can execute.

This type of system is what I dream of when I get my hands on old demos of the early Smalltalk and Lisp environments and try to imagine what computing would look like if those systems had gone mainstream.

Being super productive at high abstraction level, but exposing enough knobs to go down to the metal if one really needs to.

Oh well, dreaming is always possible.


As someone who wasn't a math major, I regularly get excited about Mathematica but then have trouble figuring out why I would actually want to use it. If someone were to give me an assignment that they knew could be accomplished with Mathematica, then that would be easier to start on, but I feel like it takes a while to make the translation between having a vague desire to figure something out and realizing that Mathematica could do it for me quickly. Like, I wouldn't have thought to plot out last year's trip to Europe using it. So for new users, there's something of a "well sure it's easy AFTER you know how to do it" gap.

It kind of reminds me of when search engines first came out - I regularly witnessed a gap between people having fun typing in some example queries and realizing that they could type in anything search-engine-ish and get back information they actually didn't know ahead of time; information they had conditioned themselves not to ask for, out of not wanting to schedule an afternoon trip to the library.


I'm not a Mathematica user, but I've always enjoyed browsing the Mathematica Stack Exchange, which might be what you're looking for. For example, here someone asks how to generate random snowflakes, and many interesting solutions are provided.

http://mathematica.stackexchange.com/questions/39361/how-to-...

Edit: here's a more real-life example, peeling labels from jars http://mathematica.stackexchange.com/questions/5676/how-to-p...


I use Mathematica for most of the programming I do. It makes ordinary programming (file handling, database handling, etc.) much easier (at least for me) than it would otherwise be. I even run nightly backups with a Mathematica script, taking advantage of Mathematica's logical operations.
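
Not my actual script, but a minimal sketch of the kind of thing I mean (the paths and the one-day window are made up):

    (* copy files modified in the last day into a dated backup folder *)
    src  = "/home/me/projects";
    dest = FileNameJoin[{"/backup", DateString[{"Year", "-", "Month", "-", "Day"}]}];
    If[!DirectoryQ[dest], CreateDirectory[dest]];
    recent = Select[FileNames["*", src, Infinity],
       !DirectoryQ[#] &&
         AbsoluteTime[] - AbsoluteTime[FileDate[#, "Modification"]] < 86400 &];
    CopyFile[#, FileNameJoin[{dest, FileNameTake[#]}]] & /@ recent;
    (* note: this flattens the directory structure and errors if a target file already exists *)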


Wow, nice demo. It is interesting to see Wolfram explain his vision for an "ultimate fusion language" -- a powerful multi-paradigm language, a huge standard library, data sources, and deployment all rolled into one. This monolithic approach seems to be the opposite of the Unix philosophy, and I wonder how that will play out.

There are clearly benefits to be gained from smooth integration, and the demo is impressive considering it only uses stuff built into the language. But there's also something to be said for letting different teams of people focus on building smaller and more focused tools which can be combined in a variety of ways. I worry that this Wolfram language will be limited in its usefulness for real applications because it simply tries to bite off too much.


This was the philosophy of the Lisp and Smalltalk development environments, which offered a better developer experience than UNIX ever did.

The UNIX idea of little things working together was instead achieved with functions and objects composed via the REPL, with the added capability to dynamically change the environment.


>This monolithic approach seems to be the opposite of the Unix philosophy, and I wonder how that will play out.

Well, it's not like the Unix philosophy is the only game in town, or has been proven best, or anything.

Emacs, for one, is anything but Unix philosophy, but it's still a very good tool.

And such things are "monolithic" only if you consider an executable as the only possible unit of isolation. You can have other ways to modularize stuff.

E.g. a full-featured huge program can have modules, plugins, etc. that "do a thing and do it well". Just because it can do what 100 unix utilities can do combined doesn't mean it's monolithic in how it's internally structured.


The best parts are the natural language inputs and inherent connectedness with data. These are going to eventually become expectations that you'll have interacting with computer systems in general.

Having a computer interface that lets you, say, find out who in my office is sending the most email in a one-liner is the next logical step. Including and understanding your company / personal data in the way Wolfram Alpha understands its curated data is the missing link.
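
The non-natural-language version of that query is already a one-liner; assuming a hypothetical imported table "mail" of {sender, recipient, date} rows:

    Commonest[mail[[All, 1]]]    (* the most frequent sender(s) in the hypothetical mail table *)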

I'd be really interested in integration with enterprise resource planning software and customer relationship management software for big corps. For web firms: an interface to salesforce.com, an interface to your web analytics data, your support ticket dashboard, your web marketing data. I imagine being able to simply ask questions like: how does the weather in my customers' locations impact my conversion rate? Or other poignant questions mixing internal and external data seamlessly.


This is really fantastic of course, but the video doesn't answer the following question: it says this is the most powerful language, one can build so much, etc. What's one of those things that they've built that no one else could build before? What great application now exists that didn't before? I will still use Google for search, Twitter for opinion (that's where the people are at..), Photoshop for photo manipulation, etc. Besides doing mathematics, which Mathematica is great at, what concrete evidence is there that this language is better for solving any particular problem one would actually care to solve? I don't care about dominant flag colors.


This seems super exciting. I'm not sure if you could build large-scale applications with it, but I cannot wait to play around with it.

Side note: The use of "I" and "My" really irks me. I imagine many, many people made this happen and contributed to these design goals. I guess this shouldn't take away from how cool this technology is, but it definitely leaves a bad taste in my mouth.


Wolfram started a startup, put his name on the door, and it has been incredibly successful. No story about Wolfram appears on HN without someone complaining about his ego. Perhaps he is the Muhammad Ali of programming.



This is the other thing that's almost guaranteed to appear for some reason. I think Taliesin did a good rebuttal to it a while back: http://news.ycombinator.com/item?id=3974503

As for Stephen's ego, I'm surprised it keeps getting mentioned. He seems like a driven person, but then so are many people whose egos aren't discussed. He comes off as quite a decent person in this Reddit thread http://www.reddit.com/r/IAmA/comments/tmutz/stephen_wolfram_... and in this interview with Scoble: http://www.youtube.com/watch?v=mNf8Se_USlE

All of this has only a tenuous connection to the language itself, which is much more interesting to discuss. Sure, it may be the only self-named language (though it arguably owes its name as much to the brand Wolfram Research as to Stephen, and I'd hoped they'd come up with a more creative name), but if one manages to shepherd a system like Mathematica through 25 years, one earns the right to self-name a language.


one use case could be a business analytics tool. 'business intelligence'.

i.e., import 30 days of log data and instantly plot the geo-located hits on a world map, as a time series of dynamically rendered graphical points, with built-in information (e.g. size, color, icon, shape) and also statistical analysis to go with it.

this scenario and many others are certainly possible today but would require quite a bit of code, libraries, and external api calls, as well as a wrapper such as a web app if you wanted a large audience to view it.
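
as a rough sketch (assuming the version 10 geo functions behave the way they did in the demo, and a made-up "hits.csv" of {lat, lon, timestamp} rows), the wolfram side might be as short as:

    hits = Import["hits.csv"];    (* hypothetical log export *)
    GeoListPlot[GeoPosition /@ hits[[All, {1, 2}]], GeoRange -> "World"]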

i think in the future, managers/executives who will separate themselves from the MBA herd are the ones who will be able to effectively use tools such as this.


This demo is pretty amazing and will take me a while to digest.

A few questions that I'm really interested in hearing the answer to:

1. Will the language work without an internet connection?

2. Will it be free to use?

3. What happens if/when Wolfram the company goes out of business, or Wolfram himself retires?

4. Are all the disparate APIs and sources that the language uses determined by fiat, or can they be customized? What happens if/when the APIs/sources stop working?

5. Given that the language can do things like query your Facebook graph, it seems like there's a lot of API upkeep that the language requires. Is this done by Wolfram? How does this work when I'm running locally (if this is even possible)?


It was once said that Stephen Wolfram builds such powerful tools that even the great man himself doesn't know how to use or explain what these tools are :P

On a more serious note, think of this language as something highly-sophisticated tech and other-industry companies might use.

You won't be building a farting-noise-type app with this language, unless it is a bowel-detecting-smell-sniffing-doctor-replacing-treatment-providing analysis app that uses sensors to compute the above scenario.

If Wolfram wants the language to be used in knowledge and computation fields, then maybe a couple of demos on how to use it for something like stock-trading, or tweet-timing-reach or weather-analysis might get this into the hands of people that use computation in some form with rather 'primitive' tools when compared to the Wolfram language (which basically is the equivalent of a huge-ass library and works like a huge-ass framework of sorts).

I personally would probably like to try or see the language being used in optimizing an industry like logistics or travel or anything to do with maximizing time-efficiency.


So while I can't see myself using this to write applications I can certainly see it replacing / augmenting my current iPython notebooks. I can do a lot with pandas and plotly for data exploration and visualization, but I can't do many of the things he demo'ed nearly as easily.


Ego Force One has taken off again :(


I once had a physics professor (who I admittedly was not too close to) who loved Mathematica and was quite the evangelist for the tool at my old school. However, he once met Stephen Wolfram and didn't feel that Wolfram had "a high opinion" of him.

Regardless of his ego, he has somewhat earned the bragging rights, given that he is responsible for Mathematica and Wolfram Alpha.


Ugh and he just had to say "it's a new kind of thing". No, it's not new. It's a very sophisticated implementation of some very old ideas. Surely that's impressive enough. Why wouldn't you cite the prior art??


This might be a good place to demo the language, assuming it is meant to be general purpose:

http://rosettacode.org/wiki/Category:Programming_Languages


Any ideas on how the Wolfram Language can be reused in other commercial applications? It seems like a pretty powerful capability; in theory you could build a ton of services on the back of this, much like Google, only with some new interesting hooks.


IIRC the stealth biotech startup "Emerald Therapeutics" is using Mathematica to automate everything.

I actually am personally biased against this startup, but it is a fascinating use case.

EDIT: https://www.wolfram.com/mathematica/customer-stories/researc...


Really interesting use case. I'm super impressed with the concept of automated data generation and gathering, and of having a common platform for the company. Perhaps the most impressive idea is that the data that is generated appears to be programmatically accessible. I haven't worked in a large enterprise that has advanced past unsearchable and poorly indexed Excel files that are manually manipulated/updated while floating around on shared drives.

This Wolfram implementation seems great, but I wonder about a lot of the annoying 'hard stuff' that would come up in a larger environment: handling of permissions, different user roles, etc.


For these guys, the 'hard stuff' that really impresses me is interfacing with hardware. Scientific instrument controllers are notoriously proprietary and annoying to deal with. Though truth be told, in a small scientific environment not having user permissions is a 'feature, not a bug', especially if it's not siloed against other departments in, say, a bigger facility.

Aside from personally knowing the CEO from grad school, a former co-worker of mine got hired into Emerald and basically transformed from a bioinformatician to a programming lackey... He may have hated every minute of it and quit.


Wow! That is a helluva pitch. My fear is that it _is_ the mind of Stephen Wolfram and unless you have that the whole thing will be one big mystery.


In the hands of an expert, genius, or wizard, Mathematica is very capable. But lesser mortals will struggle to make it do what they want, and it's never as easy as it looks here...


I would sure like to see the code for its repl. If it's all he says, that loop should be uniquely tiny.


The normal working environment (Mathematica) is a repl environment.


This I understand. I would like to see the code that implements the loop.


Well all my programming tools feel obsolete.


How do you run it? Is there any interpreter?


This is really, really cool.


Lisp Machines are back! :)



