RyuJIT: The next-generation JIT compiler for .NET (msdn.com)
172 points by darrenkopp on Sept 30, 2013 | 80 comments



> It’s not supported for production code right now but we definitely want to hear about any issues, bugs or behavioral differences that you encounter. Send feedback and questions to ryujit@microsoft.com. Even if you’ve figured out the issue or a workaround yourself we want to hear from you.

What a nice call to action. They even give you a specific email address to contact instead of something generic like support@microsoft.com


> They even give you a specific email address to contact instead of something generic like support@microsoft.com

This isn't that uncommon on MSDN Blogs.


I really hope the StackOverflow team writes up a detailed blog post on the performance benefits of RyuJIT.

The StackExchange network must be one of the largest .NET web applications.

EDIT: typo


I'd be surprised if StackExchange was in the top 100 largest .NET web applications (I guess it depends how you define size, but I've seen apps with much larger code bases, bigger infrastructure, and bigger data sets, and I live in a relatively small country).


I think it's likely it's one of the top 100 by traffic, but (not to belittle the achievements of the Stack Exchange guys) I'm sure there's at least 100 bigger sites out there by code/infrastructure.

Flight booking systems come to mind. I don't know how many web applications Microsoft would have individually (I imagine they use .NET...) and whether you want to count those (microsoft.com, bing.com, accounts.live.com, Azure...). Not to mention in-house web applications.


And Dell, Xbox, Bing, Nasdaq, Chase, and on and on. SO is a great site but it's neither the largest nor the most complex.


They do. Windows Update is an ASP.Net application, for example.


According to the fairly suspect Alexa, they are in the top 100 worldwide (regardless of technology). So by traffic, I guess yes, they are in the top 100.


Quantcast puts us at #54 in the US (haven’t managed to find the global rankings on their site). https://www.quantcast.com/p-c1rF4kxgLUzNc


One can't forget the London Stock Exchange clusterfuck, but then again that was three years ago.


Clearly you can't forget it. Some teams can succeed in a wide variety of languages, some organisations can fail in any language.

Given the number of teams that have succeeded with .Net, I'd say that organisational failings are to blame for this one. Have you already forgotten the reputation of the consultants hired to implement it?


Actually I used to think the same, until I discovered who did the job.

Currently, I think it failed due to the usual quality of outsourced projects with off-shored developers, rather than the tooling.

Of course it is always easier to blame the tools.


I think you can put TradElect down to hiring monkeys rather than professionals. It would have failed in C++ and on the JVM with those guys.


Meh. Web applications are very rarely CPU bound.

Hopefully I'll get around to benchmarking the results on Roslyn, although the results are a little bit biased as we have some incredible perf people working on the product.


With all due respect, given that you work at Microsoft ... But this article specifically talks about web applications being a primary use case for the new JIT (RyuJIT).

> "But “server code” today includes web apps that have to start fast. The 64-bit JIT currently in .NET isn’t always fast to compile your code, meaning you have to rely on other technologies such as NGen or background JIT to achieve fast program startup."


Don't read too much into the specific example of "web apps". The point here is that the classic server app of 10 years ago was often a long-running service that processed large batches of work.

Think, for example, about gene sequencing: JIT compile the app once, chew on data for hours. No one cares too much about the startup time because the app will run long enough to amortize the cost of a thoroughly optimizing compilation.

The classic server app of today is often a web app that doesn't do a ton of work relative to its startup time. Web apps often start up, do a bit of work, then shut down. Waiting a long time for the JIT compiler degrades each launch significantly.

As others point out, there are solutions to making web apps faster. But rest assured: RyuJIT helps all kinds of apps: server, client, web, computational.

--Andrew Pardoe [MSFT]


Isn't there an option in IIS to have your web apps pre-compiled?


I'm sure it will help with web apps for which JIT time is a significant performance drag. But even then, my knee-jerk response was to think that faster JIT compile times shouldn't really impact the life of anyone who knows about ngen.exe.

Reading the article, it sounded to me like Microsoft's primary motivation was cutting back on their development costs by shrinking the codebase. Faster compilation was just a nice side effect that also happened to make for a sexier selling point.


NGen doesn't help too many folks with asp.net, since it just doesn't work. And for a class of x64 applications, NGen is a terrible solution because it takes so long to precompile the entire application. I'm going to put together a more detailed article for the Codegen blog regarding motivation, history, all that fun stuff...


Quite true. I was involved in a project (ASP.NET/64-bit) where the JIT compile times on startup made it seem like I was compiling C++ code instead. :(


I am all for fast web applications.

But I think there is confusion—or at least I am confused—about what precisely RyuJIT provides. I know it provides quicker compilation times, which would affect the start-up time for web-apps. But does it also provide quicker-executing code? That is, does it do a better job optimizing web-apps' code once compiled?

Yes, a quick-starting web-app is awesome for when you need to add or replace nodes in your cluster. But I can usually suffer some warm-up time, even in that scenario.

My opinion is that web-apps are disturbingly often CPU bound and what I want most of all is faster web applications. Not in start-up time (that's icing on the cake, really) but in bottom-line request-processing throughput and latency.


The post's author commented: "Currently, we generate code comparable to JIT64. Sometimes we're a little better, sometimes we're a little worse. (I'll post some samples on the codegen blog as I get time) But we're just getting started, honestly. I expect that we'll be generating better quality code in almost every situation before we release a full, supported JIT (and I don't think we're that far away today)." (That said, CPU perf of the JITted code isn't everything.)


Interesting -- I had no idea ASP.NET was that dependent on fast startup time, or that it wasn't NGen compatible. My mistake!


My next big perf task is to get the Roslyn self-build as a test for RyuJIT. It's a pretty interesting combination of JIT compiler throughput and generated code quality.


While you may be right when talking about the entire lifetime of the website, the JIT cost at startup on large ASP.NET sites can be non-trivial.


True, however don't most large web-based applications have some sort of "warm-up" process/procedure to compile and populate the most-used caches and pages?

A shorter ramp-up time is always welcomed but JIT is not the only cost. Depending on your application, JIT might not even be the highest/longest portion of your startup time.


> don't most large web-based applications have some sort of "warm-up" process/procedure

Yep.

The simplest way to get this is add a line to the end of the deploy script: "curl http://www.mysite.com/"
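
A slightly fuller sketch of the same idea in C# (the URLs and the WarmUp class are hypothetical), for deploys that want to warm several hot pages:

  using System;
  using System.Net;

  class WarmUp
  {
      static void Main()
      {
          // Hypothetical list of hot pages: hitting them spins up the app pool,
          // runs the JIT, and populates caches before real traffic arrives.
          var urls = new[] { "http://www.mysite.com/", "http://www.mysite.com/questions" };
          using (var client = new WebClient())
          {
              foreach (var url in urls)
              {
                  try { client.DownloadString(url); }
                  catch (WebException e) { Console.Error.WriteLine("{0}: {1}", url, e.Message); }
              }
          }
      }
  }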


StackOverflow is tiny. We have about 6 million LoC of C# across our entire product portfolio...


Tiny? Let's agree with that. But StackExchange is one of the most open "platforms" of .NET, and that's why it's popular (at least for me, and I'm a .NET developer).


Off-topic:

I'm actually glad that an MSDN (i.e. Microsoft) project gets onto the top pages of Hacker News.

I've been a fan of Microsoft on some things they do (but not all). .NET is one of them (Visual Studio). And I've almost never seen something like this rise up in the popular topics section.

At least not with constructive comments like the ones here in this topic.

So thanks, HackerNews community, you guys have made my day :)


So many on HN treat those who write C# as a sort of lower class of programmers, and I have never understood why. I have been writing HMIs for machinery for over two years now in C#, and the .NET Framework with IntelliSense is a great tool for getting the job done. I guess there are some programmers who blunder through their careers using only the most mainstream languages, but I make sure to work on learning new languages all the time, so maybe I'm just a special case.


The C# ecosystem doesn't really interoperate with the HN world. There's a bunch of friction around using it - I don't want to run windows (it's not configurable enough and I'd miss lots of X features), so I'd have to use the relatively weak MonoDevelop, and my dev OS/VM would be different from the production one which would be a recipe for awkward-to-diagnose bugs. There's probably a way to get the software for free but I'd have to start at least thinking about licensing (and that means I can't just fire up a local VM in 30 seconds to test something). Maybe my cloud provider supports windows (though it's unlikely to be as well-tested as their linux infrastructure), maybe not; certainly windows is a second-class citizen for puppet. And what's the library ecosystem like? I get the impression that open-source libraries are a lot less common for .net; is there even an equivalent of cpan/pypi/maven central/etc?

I've got no objection to microsoft/MSDN; I'm a very happy typescript user, because it slots straight into my existing workflow and there's a decent eclipse plugin for it. But for a lot of these things you live in one world or the other, and never the twain shall meet - and rightly or wrongly, my impression is that more interesting software gets written in the "HN stack" than in the "MS stack", which seems a lot more enterprise-oriented.


An equivalent of Maven Central etc. is NuGet. Windows is well tested as infrastructure, Azure is great, and I think you underestimate its potential. I've seen PHP developers using Azure because it's more advanced than anything else on the market (their words, not mine). Windows Server has an optional GUI, PowerShell is Microsoft's answer for an advanced terminal, ...
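
For the curious, pulling a library from NuGet is a one-liner in the Visual Studio Package Manager Console (the package name here is just an example):

  PM> Install-Package Newtonsoft.Json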

They support open-source libraries, but it's not as popular as e.g. gems. Some are definitely worth mentioning: Glimpse, ELMAH, the Stack Exchange open-source projects for profiling queries, ... Some of them are on CodePlex, but I see more and more moving to the GitHub community (p.s. Git is integrated in Visual Studio 2012 next to TFS).

MonoDevelop is not weak, it's just a version behind (if C# 5 is out, MonoDevelop is at C# 4, which is not "that" important for developing. Want the latest gimmicks, well yeah, then it is).

Never used Puppet, so is that important? To test something, you can just publish your project to your server (or Azure if you like); third-party hosting is also possible. You can publish it on Amazon too if you want.

Your comment on "enterprise-oriented" is correct, but mostly because there are practically no bugs in the stack... It's fast (compiled code running on the CLR), stable, and a proven concept.

  SQLite => local database
  Gems => NuGet
  ActiveRecord => EF
  Functional programming => F#, lambdas, LINQ

But this is a good comment, though. .NET (the latest versions) shouldn't be used on Linux at the moment. It could be different if it had more support from the community. I think Microsoft tried it first, saw there is some kind of barrier, and now they are (perhaps) letting it go, piece by piece (I don't know for sure).


>Azure is great, and I think you underestimate its potential. I've seen PHP developers using Azure because it's more advanced than anything else on the market (their words, not mine). Windows Server has an optional GUI, PowerShell is Microsoft's answer for an advanced terminal, ...

I'm not saying these things don't exist, I'm saying they don't interoperate. To get from where I am now to running on Azure/Windows would involve a lot of changes that would put me in a worse position if C# didn't work out. It's not something you can just dip in and out of.


>But for a lot of these things you live in one world or the other, and never the twain shall meet - and rightly or wrongly, my impression is that more interesting software gets written in the "HN stack" than in the "MS stack", which seems a lot more enterprise-oriented.

Oh I agree! Writing HMIs for machinery can be very boring work, but someone has to do it, and since we pick the PCs for the machine we can simply choose Windows and then have no issues being bound to the Microsoft stack. I figure once I get more experience I can go find one of those dream jobs where I can use lots of different languages on a regular basis.


I don't think it's about treating those who write C# one way or the other; I think it's just the Microsoft Bubble at work. There are just fewer .NET developers here than from the other ecosystems.

I personally shy away from .NET shops because they seem to be such a mono-culture.

--- This is totally off-topic so feel free to downvote into oblivion...


I don't think it's because of C# as a programming language, but mostly because C# is backed by Microsoft.

C# is great, and everything that is great should get a chance. At least, that's how I see it.


It's good to have competition. Modern open source keeps Microsoft awake. At the same time Microsoft has a lot of structure to offer to enterprise and millions of mainstream developers.


Excellent. Genuine innovation on a platform many had feared was abandoned.

My only regret is the confusing x86/x64 message. Many will interpret the perf improvements as being due to the 64-bitness of the new compiler.


Can you expand on what you mean by "confusing x86/x64 message"? I'd love to help clear it up. Are you saying that people will believe that the reason the JIT is so much faster is because it's 64 bits? That's the exact opposite of reality: 64 bit programs tend to be slower, because they have to manipulate more data (all pointers take twice as much space, the Win64 ABI requires a minimum of 48 bytes of stack per non-leaf function, etc...)
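
A quick way to see the pointer-size cost from C# (a trivial sketch, not from the article):

  using System;

  class PointerSize
  {
      static void Main()
      {
          // Prints 4 in a 32-bit process and 8 in a 64-bit one: every object
          // reference, and hence much of a pointer-heavy heap, doubles in size.
          Console.WriteLine(IntPtr.Size);
      }
  }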


Do all 64-bit programs tend to be slower all the time, though? I'd think it would be faster because you're processing more data per clock cycle. Or is it that the x64 JIT hasn't been optimized to take advantage of the latest gen of 64-bit processors (I heard stuff about not making use of the latest Math.Pow a while ago)?


Programs that are pointer-heavy tend to be a little slower. There's really not that much 64-bit arithmetic going on in the average application, so the more data per clock cycle only helps in a particular class of apps. Cryptography tends to do significantly better, for that exact reason. You do seem to be conflating code quality with compiler throughput, though. I'll try to clarify in much more detail in a CLR Codegen blog post soon.


> I'd think it would be faster because you're processing more data per clock cycle.

This is only true if you happen to be dealing with integers larger than 32 bits (rarely in most of today's code). The real performance benefit of x64 has more to do with a larger number of general-purpose registers. More registers allow (but don't guarantee) programs to spend less time accessing main memory, thus gaining some speed.


Yes, I knew about the registers. I guess I overestimated the number of applications that involve large number computations.


Or perhaps I underestimated them...


Why can't the CLR come up with its own "X32" target? IIRC, they made the default for projects in VS 32-bit explicitly because of better codegen and lower overhead. If an app is OK with 4GB of RAM, why not let the process run in 64-bit mode, but use 32-bit pointers?

Also, why should the Win64 ABI constrain everything? Certainly it'd only be needed around the edges, but for .NET code calling other .NET code, you're free to do interesting things. (Like pass a GUID or tuple in a single register if it'd help.)


Keep in mind that Windows emphasizes a unified ABI on AMD64 largely because of what happened on x86. The x86 ABI sort of evolved into this "wild west" kind of situation, with various calling conventions for different languages/runtimes and usage scenarios (i.e. stdcall, cdecl, fastcall, pascal, etc.), which ends up making things unnecessarily complicated for the OS. (Kevin has an old blog post where he discusses this in more depth: http://blogs.msdn.com/b/freik/archive/2006/03/06/x64-calling...)

Being a good OS citizen and sticking to the Win64 ABI also makes some things significantly simpler for the rest of the runtime. An obvious example of where this pays off is native interop (e.g. P/Invoke, COM interop, C++/CLI), but one less obvious example of where this comes into play is actually managed exception handling (which is built on top of the underlying Structured Exception Handling mechanism that Windows provides). Not only does adhering to the unified ABI allow managed exceptions to interop with native exceptions, but it also makes things simpler for debuggers, anything that needs to walk the stack, etc.

Remember, the CLR is really more of an execution engine, and not so much a "virtual machine". We try not to disrupt the architectural conventions of the underlying platform, since we're not trying to replace the OS environment itself.

--Henry Baba-Weiss [MSFT]


> Genuine innovation on a platform many had feared was abandoned.

Are you equating .NET in general with Silverlight? Honest question.


No, but the way MS came out with WinRT and excluded existing .NET code from executing on it certainly didn't reassure anyone.


There is a lot to criticize about WinRT, but basing it on COM was the right call. In fact I would say absolutely essential for high-performance apps. Make all the costs of .NET (GC, memory usage, JIT) optional. CLR can still call into COM objects without problems, so all the C# fans can still go nuts.

It's hard not to see this as informed by how badly Longhorn failed. Microsoft tried to make .NET the basis of their OS platform during Longhorn and failed miserably, in part because it was simply not built for that. CLR belongs as a layer on top.

[Disclaimer: I used to work at MS. Did not work on the Windows Runtime. These are my personal opinions.]


That's not what I was questioning - see my other answer for why WinRT was not "reassuring". Now it makes sense, and I can safely not care about WinRT.

As far as Longhorn failing, certainly that was a disaster of management and not just technology. After all, MS Corp felt generics were an impractical academic and theoretical idea that couldn't be properly implemented in a language like C# or the CLR.


If that's not what you're questioning then IMO don't phrase it as a .NET problem; it is broader than that. There is a pocket of the .NET community that talks as if they shafted .NET in favor of native code, but WinRT is probably more disruptive to the existing Win32/C workflow than it is for people writing C#.


> After all, MS Corp felt generics were an impractical academic and theoretical idea that couldn't be properly implemented in a language like C# or the CLR.

What?!

Generics were only added in .NET 2.0 because the work wasn't going to be done in time for the .NET 1.0/1.1 releases.

There are papers from the .NET Beta days already describing the way generics could be implemented, but additional work was still needed at the time.


Generics were only added to .NET 2.0 because MSR got them implemented. Redmond would never have done it alone, and called it an academic, theoretical feature. They might have ended up with a lame-ass implementation of generics, à la Java.

Here's a history by Don Syme, who was one of the main people on this project:

http://blogs.msdn.com/b/dsyme/archive/2011/03/15/net-c-gener...

Quotes:

  "But I do want to say one thing straight: Generics for .NET and C# in their current form almost didn't happen: it was a very close call, and the feature almost didn't make the cut for Whidbey (Visual Studio 2005)"
  
  "being told by product team members that "generics is for academics only""
  
  "It was only through the total dedication of Microsoft Research, Cambridge during 1998-2004, to doing a complete, high quality implementation in both the CLR [...] and the C# compiler, that the project proceeded."

Microsoft's C# track record shows MS Corp's commitment to higher-level programming pretty well, IMO. LINQ got added just to hit the "LINQ" target (and Erik Meijer might have been a big force there). But even then: the C# 3.0 features were only implemented to hit LINQ, not added as general language features. One big example: declaration type inference is half-assed and only exists to facilitate anonymous types. They've had years since to clean up the design and haven't shown any indication of doing so.

(I still think C#'s the best out of the "mainstream" languages, but they could be doing a whole ton better.)


I know those papers, like anyone who cares about compiler design should.

As for MSR vs MS Corp, as you put it.

It is called Microsoft Research; it is still a Microsoft unit, with researchers on Microsoft's payroll.

So it is plain and simple, Microsoft.


C# was first to introduce most mainstream functional concepts into C-like languages: generics, lambdas, LINQ etc.


I never understood where people got this idea from.

You can target WinRT with the usual set of .NET languages, the only difference being the classes one uses.

Does C stop being C, if you don't use libc?


There are multiple things at play. Until the F# stdlib was ported, F# couldn't target WinRT. Then there's the ton of fanfare JS/HTML were given, and the fact that C#'s development has been moving at a snail's pace (and the "face" of C# is now working on making JS suck less) -- eh, it makes people nervous. Plus, MS's messaging wasn't all that clear. Now I see WinRT as being for cutesy tablet apps and I'm not so concerned. But before Win8, there was a lot of confusion going on.


> But before Win8, there was a lot of confusion going on.

Which was cleared up for anyone who cared to access the information available after the BUILD conference.

But hey, it is easier to form opinions based on Twitter-spread rumours, or something like that.


WinRT was definitely a screw-up, but at this point I don't see it really affecting developers that much. WinRT has pretty much failed as a platform. At this point, is it even worth developer time to deliver an app to that platform?

If you're talking about apps in the Windows app store (so-called metro-style apps or whatever the current name is), then you're right that it meant throwing out a lot of code, but it's not true that all existing .NET code was excluded. It took some rejiggering. But once again, that's only if you want to deliver via the app store.


I think you might be confusing WinRT with the Surface RT tablets. They have relatively little to do with each other. For example, the "metro-style apps" you mention are WinRT just as well.


Uh... WinRT excluded all existing third-party code from executing on it.

EDIT: asveikau has rightly pointed out I have confused two matters.


Surely you would not confuse WinRT, the API, with Windows RT, the ARM tablet product. Why, these names are totally unambiguous!

(Microsoft is not great at naming products.)


I agree :(


The other one is that it's available only on 8.1 - I wish Windows 7 were supported too.


The compiler is not 64 bit. :)


Which compiler are you talking about? The JIT compiler is 64 bit, the C++ x64 targeting compiler is available as both 32 & 64 bit. Roslyn is 32 bit because, well, the x64 JIT is dog slow :-)


I was referring to C#, but I may have misunderstood the parent. The C# compiler isn't 64-bit, but unfortunately that's not because of the JIT.


> It’s literally off the chart! -- I hope they are better at writing JITs than making visualisations.


Mea culpa :) --Andrew Pardoe [MSFT]


How does this relate to the normal .NET compiler? I thought .NET programs were compiled at installation time, not JIT?

The reported performance improvements here are significant, but in absolute terms still seem pretty bad. 200 MB to compile a big regex, or 1 second of JIT time during launch, is a substantial burden.
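
(Presumably the regex figure refers to the RegexOptions.Compiled path, which generates IL that the JIT then has to compile. A minimal sketch with a made-up pattern:)

  using System;
  using System.Text.RegularExpressions;

  class RegexJitDemo
  {
      static void Main()
      {
          // RegexOptions.Compiled emits IL for the pattern at construction time;
          // the JIT must then compile that generated IL, which is presumably the
          // kind of workload behind the "big regex" numbers above.
          var re = new Regex(@"(a|b)+\d{2,10}", RegexOptions.Compiled);
          Console.WriteLine(re.IsMatch("abab1234"));
      }
  }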


The compiler you're thinking of compiles to MSIL - bytecode. The runtime still needs to compile the bytecode into machine code, and this is what the JIT is responsible for.
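
You can observe that first-call compilation cost directly; a minimal sketch (the Work method is made up):

  using System;
  using System.Diagnostics;
  using System.Runtime.CompilerServices;

  class JitDemo
  {
      [MethodImpl(MethodImplOptions.NoInlining)]
      static long Work(int n)
      {
          long sum = 0;
          for (int i = 0; i < n; i++) sum += i;
          return sum;
      }

      static void Main()
      {
          var sw = Stopwatch.StartNew();
          Work(1);                          // first call pays the JIT cost
          Console.WriteLine("first:  {0}", sw.Elapsed);

          sw.Restart();
          Work(1);                          // already machine code: no JIT cost
          Console.WriteLine("second: {0}", sw.Elapsed);
      }
  }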


I guess I was thinking of the NGen compiler. My (very vague) memory is that NGen is run at installation time. Is this run after NGen, or instead of it?


In .NET, bytecode is always compiled to native code before execution; there aren't any interpretation steps like in other managed environments.

By default, MSIL gets compiled to native code on load via a JIT compiler. Developers who care about startup performance can choose to use NGen at installation time, thus taking the JIT out of the equation.
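
For reference, that install-time step is a single command (run from an elevated developer prompt; MyApp.exe is a placeholder):

  ngen install MyApp.exe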

The only issue with ngen is that it isn't able to perform all optimizations that the JIT is capable of.

So this new JIT improves on the current JIT, and most likely the optimizer will also be used by NGen.


> My (very vague) memory is that NGen is run at installation time

As far as I know, ngen is only used if you decide to use it. There's nothing automatic about installed code being ngen'ed.


Native image generation differs based on the platform. There is automatic native image generation in Windows: see http://msdn.microsoft.com/en-us/library/hh691758.aspx for details. There's also Triton on Windows Phone: http://channel9.msdn.com/Events/Build/2012/3-005.

In the classic Windows Desktop case, however, you're right: you need to NGen your code yourself or call NGen as a custom action from your installer. --Andrew Pardoe [MSFT]


NGen invokes the JIT compiler, so your install time (and .NET servicing time) should be quite a bit lower, once we release RyuJIT as part of the full .NET Runtime. Probably less important, but still important enough for a lot of customers to complain :-)


While that's nice, what about actual innovation in the CLR, like a more expressive type system? Or access to performance basics, like SSE?


I don't understand this comment. The .NET team isn't a one-man shop; there are various teams working on various issues. Should the JIT team stop working until the CLR gets a "more expressive type system"?

Not to mention that .NET CLR features are mostly driven by CLR languages. While there are a couple of things the CLR can do that C# can't, you can bet the CLR will get new features when they're introduced in a new version of C#.

And yes, SSE access would be nice.


You're correct, and my comment isn't constructive. It's just that they added features for CLR 2, and it has stayed there since. C# has been even more stagnant; they haven't even gotten around to polishing the edges from the LINQ push.



