Hacker News
Bringing ChakraCore to Linux and OS X (windows.com)
264 points by bevacqua on July 27, 2016 | 120 comments



To everyone thinking "we already have V8": competition is necessary for the ecosystem not to go stale. For it to thrive, we need competition.

Just look at JavaScript. Dozens of libraries that do the same thing, a vibrant ecosystem as a result.


For example, in V8 a try/catch statement triggers the whole function to be deoptimized. [1]

Chakra and SpiderMonkey fully optimize these functions even with a try/catch statement (according to 2 devs last time I talked about this [2]).

Without alternate engines, people start to think that implementation details in one engine apply to javascript as a whole.

Just look around at most javascript "performance tips". Most of them say you should avoid try/catch in performant code, even though only one engine fails to optimize it, and that engine's developers have said they want to fix it but just haven't [3].

And that's just one example, there are hundreds of examples of bugs, performance tricks, and even differences in spec interpretation that can cause issues and can inadvertently become a "standard" without multiple competing engines.

[1] More accurately, the function will never be optimized in the first place if it contains a try/catch at all since Crankshaft doesn't support try/catch.

[2] https://news.ycombinator.com/item?id=10896729

[3] IIRC they haven't devoted time to optimizing it because the architecture of V8 doesn't work well with something like try/catch, but also because it's fairly simple to work around, and most people already do. So you kind of get a catch 22 where V8 won't spend time to fix it because it's not worth it, and it won't be worth it because V8 doesn't support it.
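For reference, the common workaround mentioned above usually looks something like the following sketch (the function names here are illustrative, not from any real codebase): keep the hot code in a function that contains no try/catch at all, and confine the try/catch to a thin wrapper.

```javascript
// Sketch of the usual workaround: the hot function contains no try/catch,
// so Crankshaft can optimize it; the try/catch lives only in a thin
// wrapper, which is the only function left unoptimized.
function sumArray(xs) {
  // optimizable: no try/catch anywhere in this function
  var total = 0
  for (var i = 0; i < xs.length; i++) total += xs[i]
  return total
}

function tryCall(fn, arg, fallback) {
  // only this small wrapper pays the try/catch penalty
  try {
    return fn(arg)
  } catch (e) {
    return fallback
  }
}

console.log(tryCall(sumArray, [1, 2, 3], 0)) // 6
console.log(tryCall(sumArray, null, 0))      // 0 (xs.length throws, caught)
```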


Try/catch/finally is handled by TurboFan (one of V8's optimizing compilers) - see https://groups.google.com/forum/#!topic/v8-users/maH60gh_a8s for more detail. There is a certain set of features which go through TurboFan instead of CrankShaft (the other optimizing compiler).

Disclaimer: I work on the V8 team.


Still, using `try` seems much slower (in Chrome 52):

    var times = [[],[]]
    var sums = [0, 0]
    var funcs = [
        function(){     sums[0]+=1            },
        function(){try{ sums[1]+=1 }catch(e){}}
    ]

    function bench(n){
        var t0 = performance.now()
        for(var i = 0; i< 30000; i++) funcs[n]()
        times[n].push(performance.now()-t0)
    }

    for(var j = 0; j< 10000; j++) bench(j%2)

    function avg(ary){return ary.reduce(function(acc, v){return acc+v}, 0)/ary.length}

    console.log(times.map(avg))

    // results: 
    //         [bare             , wrapped in try{}  ]
    // VM89:18 [0.347546000033617, 0.7464809999451041]
Let's do some more work inside the function / try block:

    var times = [[],[]]
    var sums = [0, 0]
    var funcs = [
        function(){     for (var i = 0; i < 1000; i++) sums[0]+=1            },
        function(){try{ for (var i = 0; i < 1000; i++) sums[1]+=1 }catch(e){}}
    ]

    function bench(n){
        var t0 = performance.now()
        for(var i = 0; i< 30; i++) funcs[n]()
        times[n].push(performance.now()-t0)
    }

    for(var j = 0; j< 10000; j++) bench(j%2)

    function avg(ary){return ary.reduce(function(acc, v){return acc+v}, 0)/ary.length}

    console.log(times.map(avg))

    // results: 
    //         [bare               , wrapped in try{}  ]
    // VM90:18 [0.09457900004982948, 0.5069960000008344]
I'm sure I'm doing many things wrong since I'm not a JS perf expert, but still, the difference here goes from 2 to 50 times slower with the `try` block...

Edit: for the sake of completeness, on the same machine:

Firefox gives [10.2, 10.3] and [5.30, 5.45]

Safari gives [5.56, 5.60] and [4.91, 4.94]


> So you kind of get a catch 22 where V8 won't spend time to fix it because it's not worth it, and it won't be worth it because V8 doesn't support it.

That's an easy fix though: introduce one or more performance tests that stress try/catch (i.e. don't employ the workaround). Obviously Google won't be doing it, but there's nothing to stop Microsoft or Mozilla from lobbying for it.


Is it an easy fix though? It seems necessary, but not sufficient, to build an open source, cross platform JavaScript engine that integrates with node.js, just to be able to advocate for the benchmark change to be made.

That seems like a non-trivial task. Though luckily (for the JS community) they've been working on it.

Happily, they've thought about this stuff already:

"Continuing to invest in understanding and improving real world performance is a priority for the team. Going forward, we also want to work with the benchmarking workgroup[0] and the community to identify real world performance scenarios for Node.js." [1]

[0] - https://github.com/nodejs/benchmarking

[1] - https://blogs.windows.com/msedgedev/2016/01/19/nodejs-chakra...


    > Most of them talk about how you should avoid try/catch in
    > performant code, even though it's only one engine that doesn't
    > optimize it
That still sounds like great advice even though it's "only" slow on one engine. Very few performant JS applications have the luxury of not worrying about how Chrome performs, as opposed to only IE, Firefox & Safari.


The problem is that perf tips have a strong tendency to become "cargo cult knowledge" that keep getting passed around even though no one knows the original reason for their existence, or that they have long since stopped applying. This has been a problem in the C world for ages, and now it looks like it's coming to JavaScript as well. Even if Google were to fix this issue tomorrow, we'd probably still see programmers in 2025 writing convoluted code to avoid try/catch because "it's bad for performance, I heard that somewhere".

If Microsoft had released Chakra around the same time, it never would've been an issue, because people would've known "Oh, this is just Google's bug", gotten on their case to fix it, and it would've been gone a long time ago.


> If Microsoft had released Chakra around the same time, it never would've been an issue, because people would've known "Oh, this is just Google's bug", gotten on their case to fix it, and it would've been gone a long time ago.

But this is the exact situation that the browser world is in (V8 is only one vendor among many) and that hasn't come to pass.


Oh, it is great advice right now: since Chrome has such a large market share and V8 is the only "real" engine on the back end at this time, you'd be dumb to ignore it.

My point was that you want multiple competing engines to avoid this kind of thing going from a "V8 thing" to a "Javascript thing".

Right now V8 can still fix this "issue" and going forward that will no longer be a problem, but if history was a little different, we could have ended up with no engines that support an optimized try/catch since nobody uses it, and nobody would use it because no engines supported optimizing it.


True, but one way to push them to fix their engine and make it better is for them to feel pressure to not be left behind. People are going to code in the way that makes sense for them, and if one mainstream-enough implementation works way better than another, eventually people are going to stop bothering with it.

Safari has become the new IE in a lot of ways, and larger and larger websites/apps are starting to only target Firefox and Chrome, leaving Apple with the options to either give up or rejoin the mainstream.


Huh, I'm a little surprised that the try/catch de-optimization is still a thing. I watched a v8 optimizing compiler talk maybe ~4 years ago when this was discussed.

Wasn't the workaround to do nothing in the try/catch body aside from calling a named function defined outside the function scope where the try/catch was defined?


> Huh, I'm a little surprised that the try/catch de-optimization is still a thing.

It's fixed in Turbofan, but that's not shipping yet.


TurboFan is shipping for functions with try/catch or try/finally.


Yeah, you can basically turn try/catch into something that looks like the promise constructor and pass in a "try" function and a "catch" function, and then everything is optimized well (for the most part, this can still prevent some fancy optimizations).
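Concretely, that promise-constructor-like shape could look something like this (an illustrative sketch; the `attempt` helper name is made up):

```javascript
// Like the Promise constructor, the caller passes a "try" function and a
// "catch" function; only this small helper ever contains a literal
// try/catch, so the calling code stays free of it and remains optimizable.
function attempt(tryFn, catchFn) {
  try {
    return tryFn()
  } catch (e) {
    return catchFn(e)
  }
}

// Usage: parse untrusted JSON without a try/catch at the call site.
var parsed = attempt(
  function () { return JSON.parse('{"ok": true}') },
  function (e) { return { ok: false } }
)
console.log(parsed.ok) // true
```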


>Just look around at most javascript "performance tips". Most of them talk about how you should avoid try/catch in performant code, even though it's only one engine that doesn't optimize it, and they have talked about how they want to fix it, but just haven't

Well, if that engine is dominant or even just a 20%+ player (like V8 is), the advice still holds, whether there are 2 or 5 other engines.


That's true client-side. The push here seems to be Chakra for Node, and if Chakra is more performant for certain Node workloads, then you could see Chakra on Node being adopted on that basis alone.


Additionally, ChakraCore exposes a C API instead of a C++ API, which is interesting because of the relative ease of embedding C APIs in other languages (in my interest, Rust). V8's C++ API is underdocumented and relatively painful.


JavaScript itself also needs competition. I really hope the browser vendors can sit down together and start working on a better language for future browsers.


Otoh, dozens of libraries doing the same thing may be just like dozens of models of fork for eating soup.


Having alternatives also helps keep a standards-based platform conformant with the standard, so it doesn't slowly drift as various extensions and idiosyncrasies get excessively leveraged in an ecosystem.


We also have SpiderMonkey and JavaScriptCore; nothing has changed.


Having different engines is good, even "non-obvious" ones.

If you look at the sources of TypeScript you can see it supports: Node.js, Windows Script Host and ChakraHost (see [1])

I made a little experiment to support another engine: the ActionScript Virtual Machine (AVM2), which supports not only ActionScript 3.0 but also JavaScript (see [2])

Personally, it's not that I don't like Node.js, but I do like the idea of taking the JS sources of some command-line tool and bundling them into one independent executable, then distributing it as a deb package or Mac OS X pkg etc. without having to first install Node.js/npm

[1]: https://github.com/Microsoft/TypeScript/blob/master/src/comp...

[2]: https://discuss.as3lang.org/t/the-case-when-you-dont-want-no...


It's all about resource efficiency. If you do that (bundle), you're not resource efficient (memory), but the resulting solution is easy to use (no dealing with dependency conflicts where A depends on X-v1 and B on X-v2, and you need A and B running in the same system). How important is resource efficiency, and what it even means today vs. yesterday varies wildly.


I experienced different results.

One of the main problems with JS engines is that they load external code on demand. Even with something like JSDB (see [1], based on the SpiderMonkey engine), where you can bundle all of your JS inside a zip that you then bundle/merge with the runtime, the JS files still have to be extracted from the zip and loaded "as files" from memory.

In the case of the AVM2, whether the files are AS3 or JS they are first compiled to bytecode (like Java), and that's what you bundle with the runtime. The bytecode still has to be loaded, but it is "one big file", so whether your sources span hundreds of JS files does not matter.

That matters a lot for resource efficiency, and it heavily influences the memory footprint (~10MB for AVM2, ~50MB for Node.js, see [2]); eg. the more files you load, the more memory you're going to use.

For dependency conflicts, yep, it makes things easier too, not only for the versioning of libraries as you describe but also for the versioning of the runtime itself; eg. runtime v2 could be installed system-wide but your self-contained exe could use an older runtime v1.

[1]: http://www.jsdb.org

[2]: https://github.com/eclipse/kura/wiki/Footprint


Wait, I can run typescript directly via cscript/wscript?


zwetan is talking about the hosts for the compiler, i.e. the things that run tsc.js


I think you are both right :)

even if I was talking about the host for tsc.js, the way the code is written it should be able to run from wscript.exe under Windows, although I never tried it.


This is good news for a healthy Node.js ecosystem, but I can't see what’s in it for Microsoft. V8-based Node.js works fine on Windows and Azure, so they don’t need this to attract Node.js developers and the npm ecosystem to their platform.

Are there any theories about how this (presumably substantial) investment makes business sense for MS?


It's a hedge against Google.

Google, through V8, Chromium, Chrome, Chrome OS, Android, and Dart, has a large say in the future of the web platform, browsers, apps, the future of Javascript, etc.

Time and time again, Google has used their substantial influence and market share to make unilateral decisions affecting the web platform [1][2][3][4][5][6]; the individual merits of each decision notwithstanding, Google can do just about anything and it will affect enough of the userbase that others must compensate or follow suit.

By offering an alternative to V8 monoculture, they can protect against Google using its influence to drive the Javascript platform in its own direction.

[1] SHA1 sunset: https://security.googleblog.com/2014/09/gradually-sunsetting...

[2] NPAPI discontinuation: https://blog.chromium.org/2014/11/the-final-countdown-for-np...

[3] Block Flash by default: https://groups.google.com/a/chromium.org/forum/#!searchin/ch...

[4] Chrome ignoring autocomplete=off: https://bugs.chromium.org/p/chromium/issues/detail?id=468153...

[5] VP9 available in WebRTC: https://developers.google.com/web/updates/2016/01/vp9-webrtc...

[6] H.264 in WebRTC still gated behind a flag: https://bugs.chromium.org/p/chromium/issues/detail?id=500605


This. As much as I personally don't like JavaScript, it's not up to me whether it exists and is used, and having an advertising company de facto in charge of how people perceive its implementation simply isn't a Good Thing for the web in general.


> Google, through V8, Chromium, Chrome, Chrome OS, Android, and Dart, has a large say in the future of the web platform, browsers, apps, the future of Javascript, etc.

> Time and time again, Google has used their substantial influence and market share to make unilateral decisions affecting the web platform [1][2][3][4][5][6]; the individual merits of each decision notwithstanding, Google can do just about anything and it will affect enough of the userbase that others must compensate or follow suit.

Funny that 15 years ago you could say the exact same thing about Microsoft. Google's increasingly like the old Microsoft.


And before Microsoft maybe IBM or DEC.

What is strange is that many still don't get these cycles. :)


I think all 6 of those examples are good decisions on Google's part. Google has a heavy stake in making sure the web runs great. Chrome holds a large share of the browser market, but not a controlling share.

I'm all for MS releasing a competing open source JS engine. Personally I have been burned so many times by MS in the past when they had "substantial influence" I have no interest in using it.


> Google has a heavy stake in making sure the web runs great.

Google has a heavy stake in making sure people keep using Google.com and its associated services.


Yes they also have that, however Google.com is a search engine for the web. If users stop using the web, they stop using their search engine.


Not if they use Android or ChromeOS, they will just use the G App instead.


Then this isn't really about V8 and the web anymore, and as far as that's concerned MS is already extremely dominant.


Depends on what one understands for web.

For me, the network protocols are what actually matters.

And in the mobile space, for the time being, Apple and Google rule.


The whole "do no evil" mantra was only a thing while they were a small startup looking for geek cred across SV Starbucks.

After GMail, Web Search, Chrome and Android, they are just like any other corporation out there.


#5 and #6 are even worse, because it basically means Google can dictate whatever terms it likes to you if you develop a competing browser.

As VP9 is heavily patented, and the license only allows you to use it if you don't ever sue Google, you're out of luck if Google ever fucks you over somewhere else.


The alternative (H.264) is worse.


H.264 is worse, but at least it's available with pretty straightforward license terms (just pay).

VP9's license might actually violate antitrust laws.


> H.264 is worse, but at least it's available with pretty straightforward license terms (just pay).

That's pretty much discriminating against a wide range of software - namely Free and Open Source. VP9, on the other hand, is compatible with such a development model.


Sure, yet VP9 discriminates against just about every startup with innovative patented algorithms that also wants to support VP9.

Take the example of Pied Piper from Silicon Valley: if they used VP9 for video encoding before compressing and sending it, Google/Hooli would now have the right to use their (potentially) patented algorithm for whatever they want, without ever getting sued.

Now that's great, eh?


The VPx cross-license does not work that way.

You are granted patents to use VPx (you cannot use these patents outside the VPx scope) and you are granting patents necessary to use VPx [1]. If you have some other patents that are not necessary for VPx, they are not subject to the cross-license.

[1] - "that must necessarily be infringed by the act of Exploiting Licensed Products in the Licensed Field of Use"


VP9 is an open and royalty-free video coding format. You are safe as long as you don't sue Google for VP9 patent infringement. It seems to target MPEG, so that if they sue Google their license is revoked and Google can claim patent infringement as well. That being said, I'd take VP9 anytime over H.264. It's not only about the payment to MPEG; your license can also be revoked or not granted in the first place. How f is that compared to VP9?


Yes, but imagine you're a startup which has developed a bunch of awesome new algorithms, and patented them.

And now you want to use them in combination with VP9.

As soon as you do, Google can use your algorithms and patents without ever paying you, and you can't sue them for it, or your product can't use VP9 anymore.

How fucked up is that? That's a huge cost for startups that's always treated as "free", while in reality it can mean selling out your entire product for a video codec.


Something tells me that you are wrong. Just because Edge ships with VP9 support doesn't mean MS grants all the Windows patents to Google, right? Of course, if you want to develop a VP9++ and patent it, that could be an issue, but I don't think that's necessarily a bad thing for the codec industry. The last thing I want is another proprietary codec to lock content onto specific players.


Surely it's only so long as you don't sue Google over VP9 codec issues. It can't be a blanket "never be sued" statement.

These patents are all about a corporate cold-war-esque standoff, and a licence term like that sounds to me like a dead hand.


It is a blanket "Never be sued over any patents" statement.

That makes it quite problematic, because if you, as a company, want to implement WebRTC compatible with Chrome, Google can now use all your patents for free, and you can't do anything about it.

It heavily suppresses innovation in the sector - existing big players can easily make a patent deal, but a new startup?


I think Microsoft knows how bad of a rap they got back in the days of IE6, and they know that people don't really change their opinions very easily. So they're hoping to change the minds of devs (which is why they've open sourced so much - they're trying to look better and nicer, and now they're taking it further by putting in a lot of work into languages/ecosystems that aren't their own).

Through their charitable work, they look like the "good guys". The great thing about getting devs to like MS is that those devs might praise MS when talking to their non-developer friends, and their non-developer friends might ascribe a lot of value to that praise simply because they know that devs probably know more about tech companies.

In other words, by getting devs to like/trust MS more, non-devs will also like/trust MS more.


These recent moves aren't making me trust Microsoft any more than before.

It's fantastic that they're open sourcing some stuff these days, but the things they're open sourcing and external projects they're contributing to are all either cost centers (nobody makes money on browser engines these days), or areas where they've lost the battle for cloud hosting, and are hoping to get a leg in on Linux-based hosting.

Now if they were to open source something like MSSQL or Exchange Server, truly open up the Office format, or pretty much anything where they haven't utterly been defeated by existing open source products, I'd be impressed and might think their company culture is truly changing.

As it stands, I don't see any reason to think that this isn't just all part of an experimental marketing strategy to bring already lost markets back under the Microsoft umbrella, and once they can achieve lock-in again: goodbye, open source.


With .NET Core, all .NET languages, and ASP.NET Core being open source, I'd say they've made that jump. Literally millions of developers use .NET and pay for Visual Studio (not to mention, an incredible amount of Microsoft's products are built with it), which last I checked is a billion-dollar business for MSFT. The open source stuff is where they're taking that business strategically, or at least it certainly seems that way.


All the things you've listed fall squarely in the category of "we've lost the hosting war, and nobody's going to ship proprietary .NET binaries on Linux distros".

The further you go from that sort of thing where they've already lost the market and either have nothing to lose or the writing is on the wall (how long can paid-for languages truly compete with free ones?), the less they have open sourced.

E.g. even Visual Studio which would be the next logical step is still for sale starting at $500 on their website[1], maybe that'll be fully open one day. After all how long can the paid-for editor wars persist?

What I was pointing out is that there's no sign of them opening up anything that would truly make them an open source company. I.e. things they make real money on, or would threaten their established lock-in.

1. https://www.visualstudio.com/products/how-to-buy-vs


Note that, while VS is still proprietary, VSCode is open source.

Keep in mind that there's a considerable overhead in open sourcing large products with very old codebases that evolved over many years, sometimes decades (this describes Windows, Office and VS, among many others). For example, they may incorporate code and components from other sources, and the license that permitted their use and redistribution may not permit opening the source. Worse yet, in many cases, you won't even be sure if they do or do not before you do an expensive code audit.

Open sourcing something mostly brand new, like ASP.NET Core or Roslyn, is much easier in comparison, because you architect for that from the get-go.


> while VS is still proprietary, VSCode is open source

And the two are completely different products


Yes, of course. But they're products in a similar area, with the main difference being that one is a brand-new, from-scratch codebase, while another one is ~20 years old, and in places, might be based on code going even further back. Which reinforces my point that it's much easier to open source new stuff.


Both Visual Studio and the Jetbrains IDEs are commercial and they're considered some of the best tools available on the market. Quality tools can still command a decent price.

To put things into perspective, IntelliJ has been battling both Eclipse and Netbeans for ages and it is (slowly) winning.


Yes it makes perfect sense that they're selling it. I'm not saying it doesn't.

What I'm saying is that the comments here to the effect of "Microsoft has changed their stance" don't seem to be in touch with reality.

Microsoft has just realized, relatively late, that there are certain things they can't make money on yet which are still strategically important, such as core .NET technology or JS engines.

That doesn't mean that they'll keep open sourcing stuff, or that they're not likely to pull back the second they think it serves their commercial aims.


> "Microsoft has changed their stance" don't seem to be in touch with reality

I would say that porting MSSQL to Linux, open-sourcing .NET and ChakraCore (amongst other things) is definitely a change from the "Linux / OpenSource is a cancer" days, no?

Microsoft is in no way going to become the next Red Hat, running a service-only model on top of Open Source products. It doesn't make sense for them to do that. But the fact that they have accepted Open Source as a possibility for some of their products, and put that into action, is a step in the right direction.


The fact that they are happy to ship proprietary products that are known to have backdoors doesn't make me believe they've changed their views on user freedom. It's a relevance ploy, nothing more.


You're probably right. The thing is, Microsoft is right now in existential danger, at least if it wants to grow, and I think it has to grow, because it's a public company.

Their core market (desktop/enterprise desktop) is basically saturated. Almost all their secondary efforts have failed from a financial standpoint or are only second in their niche (Bing, Edge, Azure, Xbox, etc.).

Unless they pull off some sort of a miraculous switch:

* they'll continue to be perennial underdogs

* they'll open things up in order to gain allies, through genuine community building

From my point of view, in a sort of twisted way, that basically means that they're safe from an open source point of view.

They'll have to develop communities in almost every area they want to compete in (which means that there will be stewards available if they back down) and they'll invest quite heavily in their open source contribution.

It might not be done for the purest reasons, but I think that their new contributions will be here with us for the long term.


The open sourcing of .NET core is incredibly significant. As much as people here like to pretend no one uses C#, it's a massive and profitable ecosystem.

The license you're buying for VS is a support seat with some special integrations. It's not like IntelliJ doesn't do that too. The community edition is entirely serviceable and free.


> how long can paid-for languages truly compete with free ones?

This is why we cannot have nice things, and the FOSS community keeps rediscovering features that were already common in the 90's in commercial tools.

Common Lisp IDEs are just one example out of many.


> they're trying to look better and nicer,

I think that's a misleading picture of where things are going. Seeing these releases as charitable work misses the point. They haven't open sourced any of their core money makers. Nothing of the Office suite got open sourced, and you still can't code for your game console without special permission.

They do as many others do: they open source what's already lost in an effort to stay relevant. It has been .net stuff (which was supposed to out-compete Java, but captured exactly zero mindshare outside the Windows ecosystem, which is what's needed to grow it at this stage), other developer stuff (not Visual Studio, mind you, but things like git and ssh, where they tried to compete but lost), cloud (they need Linux on Azure to stay relevant) and web development. Much like Apple when they started out with OS X and released networking, printing and X11 stuff to get back developers. That worked out well for them, but we haven't seen any more releases since then.

Don't be tempted to view this in terms of good and evil, but in terms of where these products are in their life cycle. Time will tell if it's successful. I would guess they lost web development long ago and there won't be any mass exodus from PHP or Rails to .net, but they still have a fighting chance against Java, especially if Oracle continues to bodge it.


It's less about an explicit positive for MS than a necessary rearguard action against a negative trend in the form of the encroachment of Linux. If they don't do (things like) this, they'll be shut out of the server-side ecosystem altogether. MS needs to hedge its bets and ensure that if Linux achieves total domination, they still have a very good technical offering on that platform that does not rely on their arch enemy. So put as much stuff onto Linux as possible, then sell services/support etc IBM-style. And of course they also hope to bring in dev resources for free!


MS is using Electron for Visual Studio Code, so I can see them using Node-ChakraCore instead down the road. They might push it as the way to develop cross-platform apps; with WebAssembly coming, making it the preferred platform is probably what MS wants. They are planning for Web 4.0.


Oracle has also given JavaScript first-class support on the JVM since Java 8.

JavaScript has just reached the same status as many other languages in having the freedom of multiple implementations.


Browsers send queries to search engines. Search engines have ads. Ads shit gold.

It makes sense for Microsoft not to lose too much market share to V8 or Chrome. What if Node running on Chakra on Azure VMs performs amazingly well?


Easy: embrace, extend, and extinguish.

If Microsoft behaves as they have done in the past, this is what will happen:

1. Make Node run on top of Chakra [x]

2. Give Chakra non-standard features and sell communities on them while pointing out to dissenters that the extensions are open source []

3. Gradually allow open source work on the product to die and commercialize it []

Microsoft has lost a lot of people's trust. My grandfather used to tell me that trust is hard to earn and easy to lose. Once trust is lost, it's even tougher to earn back. It's going to take some convincing on their end if they want developers to trust them. It doesn't __matter__ if they appear to act more ethically than their competitors, that's not how trust works, and it won't convince a lot of people that have watched Microsoft for a while that they've suddenly changed their ways.

EDIT: funny this is being downvoted considering eee was an actual strategy Microsoft has historically employed according to the US Justice Department. This isn't even conspiracy theory stuff -- this is real life. Those who don't study history are doomed to repeat it.


This makes no sense at all when ChakraCore is open source.

If MS creates a closed source branch, then the most likely result will be for someone to fork the ChakraCore code and continue NodeJS dev based on the open source fork.


Being "open source" is meaningless if there isn't a community actively building the software and no one (or very few) outside the organization understands the code.

Open source is luring developers into a false sense of security -- it's possible to implement features that could not be easily ported into competitors' products and that have no formal specification provided anywhere. Even if they have a specification, there is no guarantee that the specification will remain up-to-date as time progresses.

Also, I even said:

    > while pointing out to dissenters that the extensions 
    > are open source
Open source doesn't mean anything.


I simply cannot foresee a scenario where the Node community is active enough to put in the effort to make Node work with ChakraCore, and Node continues to remain popular and widely used, but after adopting ChakraCore the community dies down to the point where it is incapable of replacing ChakraCore if MS turns hostile.

In addition, if ChakraCore is adopted, it's almost certain there will be a major fork that continues running on V8. The most likely scenario will be that Node will continue to run both in parallel, and soon enough the JS engine bindings will be abstracted enough that it may even become a bring your own JS Engine type software.


I'm not concerned about Node on Chakra necessarily -- its developers are probably smart enough to avoid this issue; I'm concerned about applications running on Node on Chakra. Unless you avoid using __any__ non-standard behavior, Microsoft owns you.

Using Chakra is risky. There is a very real possibility that Microsoft will make developers regret trusting them. The only possible payoff is that maybe you need one less app instance. Maybe.

It's foolish to put you, your codebase, and your organization at risk for this.


Why can't you say the same for v8?


Because Google is a company of unceasing virtue and "Don't be evil?"

Or perhaps because many people have been conditioned to dislike anything Microsoft does without actually analyzing its impact or relevance. Certainly it seems like there aren't many rational facts in this person's argument.

I agree that "non-standard behavior" could exist in an open source product, but it could also be directly replicated -- one of the major benefits of open source licensing is to hedge against exactly that behavior.


So your counter-argument is this:

    > Or perhaps because many people have been conditioned to 
    > dislike anything Microsoft does without actually 
    > analyzing its impact or relevance.


    > Certainly it seems like there aren't many rational facts 
    > to this person's argument.
I could say the same thing about this comment.


So tell me, why is it bad for the industry? Why wasn't it bad when V8 came out and enabled node?

Why is there a different standard you apply to Microsoft's open source efforts? Please, tell me.


    > why is it bad for the industry?
I already explained how Microsoft has a history of violating people's trust, and how an investment in Microsoft's technologies is therefore risky.

    >  Why wasn't it bad when V8 came out and enabled node?
This is off-topic: you don't know my opinion of V8 or Google. My opinion on V8, good or bad, also does not change the validity of my criticisms.

    > Why is there a different standard you apply to 
    > Microsoft's open source efforts?
You don't know me personally, so it's presumptive of you to assume I'm applying a standard unevenly.

You have done nothing to address the criticisms of Microsoft; instead, you have resorted to cheap red herrings -- a tell-tale sign of a lost argument. Until you decide to actually provide an argument of substance against what I've said, rather than arrogant name-calling and presumptive assumptions about my standards for open source, this conversation cannot serve any constructive purpose.


Corporations are not people. Holding grudges against them as if they are people and not a social construct that changes over time makes 0 sense.

They have new management that is doing a lot of great things. The number of bad things they do is going down. Many of their products are genuinely good now. They're going out of their way to improve their support for the broader developer ecosystem.

At this point, I "like" them a lot more than I like Apple or any of the hapless desktop Linux Distribution corporate governance bodies.


That's not related to the conversation at hand. Who said I like using v8?


Who asked what you like? That's irrelevant.

You talk about "organizational risk" as if we don't live in a post-Sun world where Oracle genuinely believes APIs are patentable and the number of successful major open source projects without corporate sponsorship is vanishingly low. It's a world where a single asinine Node developer with a trivial package on npm shut down the entire damn industry for days.

A relationship with a fairly stable and increasingly developer friendly company like Microsoft seems very reasonable by comparison.


    > Who asked what you like? That's irrelevant.
Parent did, when they asked:

    > Why can't you say the same for v8?
My point was they did not know my opinion on the subject and to make an argument about my personal consistency in analysis (although this is off-topic) is impossible.

Oracle's actions do nothing to change the validity of the criticisms against Microsoft. That conversation isn't directly related to this thread and is a weak distraction from Microsoft's history, a history that many reasonable people could call shameful. If that's the best defense that can be mounted for Microsoft, then it seems it isn't trustworthy.


That's an odd reading. It looked to me like they were asking you why Chakra doesn't represent a risk similar to V8.

And it does. But it's not a very high risk.


If you alone hold the copyright to a piece of software you can change the license at any time.


But that doesn't apply to versions that were already released under an open license. You can't retroactively revoke a license; you can only change it for new versions.


That's not true, and it's happened before:

http://www.cnet.com/news/nessus-security-tool-closes-its-sou...

A project like Linux couldn't do this, since it has thousands of authors, and each author would have to agree to the change. But if the author is a single person, or a corporation, then only one person has to make the decision.


You're misreading the article, I think. They didn't release a new version under the GPL. They didn't retroactively undo licensing for older versions.

It is not at all clear that you can snap your fingers and renegotiate an existing license arbitrarily like that. You need to do it in advance of the agreement.


Microsoft probably wants to take over the Node.js ecosystem. They are lobbying to replace the V8 engine with their Chakra engine for Win32 builds. Read up on the history of the company; it could be very bad if the official Node.js project switches to Chakra for one or all platforms. So it's probably bad news that Microsoft is targeting Node.js even more. And their TypeScript adventure, and the way they steer/lobby ES6/7 development in their own interest, isn't great for JS and WebAssembly. They have a vast interest in porting their class-based old C++/C# codebase of Office tools and the Monaco editor to the web. They could stick with their new, delayed, lackluster, rather unfinished .NET Core 1.0 ecosystem. Let's hope Google and others come to the rescue to help Node.js. On the other side, I really wonder why Google switched to TypeScript for one of their projects. Maybe they had to choose between Facebook and Microsoft and favoured the latter because the former is an even bigger competitor in web frameworks (React), or because they were late to the game?


I've not tried recent versions of node but have had to port things like Brendan Gregg's flamegraph tools from node to other languages because node would die with even pedestrian memory use (well under 4GB).

Does anyone know if ChakraCore has better memory management? Could be a big win for longer running applications or those that simply need a larger working set.


Does this indicate a desire to bring Edge to macOS/Linux? Finally some more browser competition!


[disclosure - I work for MSFT] It's only ChakraCore that is going cross-platform at the moment, and the existing platform support matrix for Microsoft Edge remains unchanged for now. That said, we'd welcome input from developers and users who want to see Edge on the platform of their choice.

Limin

edit - typo


One reason for Edge on other platforms: not having to spin up a Windows virtual machine or boot up a PC for cross-browser testing.


It's possible for there to be platform-specific bugs / quirks even in the same engine.


True, but if you could do day to day testing without a VM, even running automated tests against it, that would be a huge timesaver. Then you fire up the VM once a week (or some reasonable interval) to confirm it's 100%. There are platform specific bugs, but most issues I see now with Chrome or Firefox are specific to the browser, not the platform.


If the hypothetical Linux/OS X Edge lives up to the Windows one's battery life promises, it would be great as an actual primary browser.


> the Windows one's battery life promises

As in promises that Microsoft has made or as in what Windows Edge actually delivers?

Because so far, from what I've heard, there seems to be a pretty big discrepancy between those two, but I haven't seen any actual benchmarks, so it could also just be nonsense...


Aside from comments in a HN thread, is there a formal mechanism for requesting - and then tracking the progress of - truly cross platform support?


Thanks for asking! The best way to request Edge features is through the issue tracker https://developer.microsoft.com/en-us/microsoft-edge/platfor....


I would love to have a Mac version of Edge, both for compatibility testing and for evaluating it as a main browser due to its ahem spartan UI/featureset.


OS X user here. I would at least have it installed for testing.


What does ChakraCore bring on to the table V8 currently doesn't? Can someone elaborate?


All JS engines make different tradeoffs, so they are better at some things compared to other engines.

For example, Chakra has had full asm.js optimization support for a long time now, while v8 still doesn't (but it's coming).

We now have 4 high-quality JS engines that are cross-platform and open source (v8, SpiderMonkey, JavaScriptCore, and now Chakra). This is great!
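To make the asm.js point concrete, here is a minimal, hypothetical asm.js module (the names are made up for illustration). Engines with dedicated asm.js support, like Chakra, can validate and ahead-of-time compile this statically typed subset, while it still runs as plain JavaScript everywhere else:

```javascript
// Minimal asm.js module sketch. The "use asm" pragma marks the function
// body as the asm.js subset; the |0 annotations coerce values to int32
// so a validating engine can compile the module ahead of time. In
// engines without asm.js support it simply runs as ordinary JavaScript.
function AsmAdder(stdlib, foreign, heap) {
  "use asm";
  function add(a, b) {
    a = a | 0;  // parameter type annotation: int
    b = b | 0;
    return (a + b) | 0;  // return type annotation: int
  }
  return { add: add };
}

var adder = AsmAdder();
```

Either way the semantics are identical; the difference is only whether the engine takes the fast, statically compiled path.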


The Node.js project has a problem in that it has never had a specification - only a de facto implementation based on an ever-changing V8 engine, with whatever ECMAScript compliance V8 had at the time. Recently Google made big changes to the V8 API that broke Node's Buffer in a rather large way, and the Node devs had to do handsprings to put together a less-than-optimal workaround to get it working with V8's ArrayBuffer. Node.js is at the mercy of Google's V8 engine, which is designed first and foremost for Chrome.

ChakraCore did the impossible: it mimics the V8 C++ API so that it appears to Node as V8. But this trick can only hold up for so long, since Google continues to change its engine.

It would be better for Node long term to have a JS engine independent C ABI. But the core developers of the Node project are resisting this idea as they believe Node is V8 only. I don't think that Node will ever officially merge Chakra. They just want it to go away.


When MS announced a node.js version backed by Chakra instead of V8, some of the comments (including one from me) addressed some of the reasons that having multiple JS engines is a good thing.

https://news.ycombinator.com/item?id=10932182


Competition.


It's an engine -- it shouldn't bring much that is different.

Having hundreds of frameworks, platforms, etc isn't great, but having at least one alternate is awesome. Minimally it allows for a larger collection of minds to deal with hard performance problems, to each take different approaches, and to potentially standardize on a solution after sufficient time to prove out solutions.


Off the top of my head: better performance on some workloads, better standards support in some cases, not being controlled by Google.

If you happen to care about those workloads or those cases, or have a company policy to not use or minimize use of Google software (which some places do, especially ones that think they might end up competing with Google), these are pretty useful features.


Sometimes faster, better ES6 support (to hopefully spur V8 to complete theirs), etc. Nothing big, but a little competition is nice.


This is interesting enough, I suppose, but I think they should have included some kind of interesting benchmark where Chakra outperformed V8 or something for Node.


Thanks for the suggestion, great idea! However, it's really early days for us still - as the blog post mentions, we are still working on implementing the JIT and concurrent GC, which are quite important to great performance. When we complete that work, we'll definitely keep a close eye on current industry benchmarks.


Heh, I didn't actually think an actual MS developer would respond to my comment.

As it stands, I can appreciate what MS is doing; you guys aren't probably going to compete with or replace V8 or SpiderMonkey on Linux overnight, hence why getting a working Node port is a good first step of presumably many.

Regardless, this is definitely an interesting step; it would be really trippy if I end up using an entire MS stack on my Linux desktop, but it appears that might be a legit possibility.


To clarify, this is why no graph has been published for Node.js on Linux and macOS, or ChakraCore on Linux and macOS. But as a sibling post to the parent mentions, they have published performance statistics for Node.js on ChakraCore on Windows, where the JIT is implemented.


They already did that back in January when they announced a node.js integration with ChakraCore [0].

https://blogs.windows.com/msedgedev/2016/01/19/nodejs-chakra...


Direct link to the graph:

https://winblogs.azureedge.net/win/2016/01/nodejs-benchmark-...

It looks like they perform better on 50/675 tests and worse on 500/675 tests. That is not particularly compelling. Since that was a while ago now, I'd really prefer to see where improvements have been made.


Why was this down voted?

Edit: Seriously, with all the down voting of normal comments like this one. If you have an issue with something someone wrote enough to down vote it at least be enough of an adult to own up to it and explain why instead of being sneaky.


Super


What's the point of this exactly? Why would someone chose this over V8?


Why would someone buy a Benz when there is BMW? Every engine is better in some aspects and suffers in others. There is already a benchmark link provided, which proves at least one reason to choose it over V8.



