It seems the comments here are full of questions and short on answers. Let me try to clear some things up.
Let's start with how it currently works...
When you create an application or website in C#, F#, VB.NET, or any other .NET language, you are not really 'compiling' it to native code per se; you are compiling it to an intermediate language called IL, very much like Java bytecode. When your application or website is run for the first time, the .NET runtime converts this code to native code on the local machine and caches the compiled native version of your code. The result is that the first time you run your app, or a section of its code, there is a small performance hit while the runtime converts your code to native. Let me be perfectly clear: this is a one-time thing, a first-time thing. Once you have gotten past that, every subsequent run is going to be as fast as other native code. (Why is it not as fast as something written in C or C++, you may ask? Because the .NET runtime is providing other things for you, like garbage collection, but that is a whole other talk.)
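The pipeline above can be sketched with a toy model (plain Python standing in for the toolchain; nothing here is a real .NET API): source is lowered once to an intermediate form at build time, and the first run pays a one-time conversion cost that later runs skip.

```python
# Toy model of the .NET pipeline (illustrative only, no real .NET APIs):
# build time lowers source to an intermediate form; the first run "JITs"
# it into a cache that every later run reuses.

native_cache = {}  # stands in for the machine-local cache of native code

def build(source, name):
    """Build step: lower source to an intermediate form (Python bytecode
    here, standing in for IL)."""
    return compile(source, name, "exec")

def run(il, name):
    """Run step: the first call pays the conversion cost; later calls
    find the result already cached."""
    if name not in native_cache:
        native_cache[name] = il  # pretend we lowered IL to native here
    env = {}
    exec(native_cache[name], env)
    return env

il = build("x = 2 + 3", "app")
first = run(il, "app")   # first run: convert and cache
second = run(il, "app")  # subsequent run: cache hit, no conversion
```

The point of the sketch is only the shape of the cost: `build` happens once on the developer's machine, while the cache fill inside `run` happens once per user machine.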
So how is this different?
This preview allows you to skip the JIT and image-caching process altogether. It is targeted first at Windows Store apps: when you ran one for the first time on your machine, first launch was slow because all this extra work was being done. Microsoft decided they could use the server-side resources they had to run this JIT and imaging process ahead of time, so that by the time someone downloads and installs your app, the work is already done and first launch is faster, as much as 60% faster. There are also lots of changes under the hood to make this all work better. Andrew Pardoe is further down in this comment thread and mentions things like a refactored runtime, static-optimized libraries, and static marshalling all combining for performance wins.
Does this mean I can build apps for Unix with C#?
No, Mono is still the best way to do that.
Can I build simple console apps with this and deploy to machines without the .NET runtime?
No, right now this is only available for Windows Store apps. Besides, unless you have some Windows 98 machines sitting around, nearly every PC already has the .NET runtime.
Will this make my app run faster?
Starting up, yes. Normal operation, probably not. The .NET runtime and jitter have been around a long time, and they are impressively efficient and fast. This release includes some tweaks to parts of the runtime, but don't expect your text-file-processing app to go from 10 seconds to 2 or anything like that.
When I read the OP, I was tempted to scream about bad writing due to lack of context, undefined terminology, and general lack of explanation; not nearly the first time I have been tempted to scream at Microsoft's technical writing abilities. Your post provided essentially everything that was missing.
Windows Store apps are what were once called 'Metro' apps and are now termed 'Modern' apps. They are essentially Microsoft's version of apps from the iOS App Store or Google Play, but focused on Windows 8 tablets and desktops (and as of today, the same apps you develop for these environments can also run on Windows Phone and Xbox One).
In the past, if you were writing a client-side or 'desktop' app for Windows, you used C++ and MFC, or a .NET language with Windows Forms or WPF (or lots of other options), to build that experience. Going forward, Microsoft would prefer you build using this new model they are calling Windows Store apps. Along with that comes a set of rules and explicit declarations about what your app will do (much like we have on Android or iOS). In theory this should limit the amount of crapware, though that remains to be seen.
Jitter is a term we often use for a program that does the JIT work; JIT is an acronym for Just-In-Time, which in context means Just-In-Time compiler: it runs just before the code is needed. Say you have a program with 10K lines of code, but the command you just sent the program only invokes 500 of them. A jitter, or Just-In-Time compiler, will only convert the code that is actually needed to perform the operation you requested (and is thus faster).
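As a language-agnostic sketch of that laziness (Python here, with made-up names; this is not real runtime machinery), only the methods a call path actually touches ever get "compiled":

```python
# Toy jitter (illustrative only): methods are "compiled" the first time
# they are invoked, and never before.

compiled = set()  # names of methods the jitter has compiled so far

def jit(func):
    """Wrap a method so the compile cost is paid on first call only."""
    def wrapper(*args, **kwargs):
        if func.__name__ not in compiled:
            compiled.add(func.__name__)  # the one-time compile step
        return func(*args, **kwargs)
    return wrapper

@jit
def used_method():
    return "result"

@jit
def unused_method():
    return "never called, so never compiled"

used_method()
# Only the invoked method was compiled; everything else stays as IL.
```

After running this, `compiled` contains only `used_method`, which is the whole trick: the 9,500 lines your command never touched are never paid for.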
Technically, a JIT may run alongside execution, profile the code, and later replace it, i.e., start in an interpreter, run a bit, then compile. JITting is far more complex than "running before the code is needed"; technically it can even deoptimize some code and then optimize it better.
How good/reliable is Mono on Linux now? If I knew for sure that people are actually using Mono on Linux robustly and with good performance, I would ditch the JVM; as a language/runtime, .NET is so much better...
In the OpenSimulator open-source cross-platform project, we've been using C# on Mono for over 6 years for a server-side application with extremely high concurrency. From this perspective, recent releases of Mono have definitely increased in reliability. Mono 3.2 onwards is particularly good, and it's very rare now, if at all, that problems we have on the project can be traced back to issues with the Mono VM or the associated SDK.
We haven't done many systematic performance comparisons. However, it seems to be the case that whilst Mono still performs worse than .NET in a few areas (e.g. loading new AppDomains), in general there isn't a significant performance difference between running OpenSimulator on Mono and on Windows.
.NET is not a language; presumably you're referring to C# vs. Java? C# may have the upper hand over Java currently, but Java 8 is a very good sign that Oracle is finally willing to push the language forward, which down the road may mean C# > Java no longer holds.
As for .NET being so much better than the JVM, that's not a factual statement ;-) Certainly the JVM, with Clojure, Scala, Kotlin, Ceylon, Groovy, JRuby, etc. running on it, has proven to be a pretty awesome environment for emerging languages, while the CLR has proven, not surprisingly, to be an awesome environment for Microsoft-backed projects.
If you're tied to Linux and Mono isn't a viable solution, why not try one of the above Java alternatives? The JVM is a kickass dev environment these days if you're able to break away from Java...
FWIW, last time I tried Mono (a couple of years ago), it was quite impressive, particularly the IDE, which was incredibly responsive. The biggest drawback was the feature lag behind the latest and greatest on the .NET side of the fence.
How good is Mono? Unity3D is based on it, and I have not heard anyone complain that the thousands of games built with it (including some of the most popular ever on iOS and Android) have performance issues because of the underlying architecture.
> Can I build simple console apps with this and deploy to machines without the .NET runtime?
> No
Darn. I was hopeful that this would help us run .NET applications in Wine by eliminating the need for the full VM, but it appears not.
> Besides, unless you have some Windows 98 machines sitting around nearly every PC has the .NET runtime.
Well, or a Unix machine trying to run Windows apps in Wine. We are (well, one single dev is, really) still trying to get Mono to work for us, but it's a very long road to get the missing APIs and VM features in place.
So, uh, can I compile my C# app to a bare x64 .exe file, statically linked with no runtime dependency on the .NET Framework, that will run on Windows 7 and 8? Or not?
"However, apps will get deployed on end-user devices as fully self-contained natively compiled code (when .NET Native enters production), and will not have a dependency on the .NET Framework on the target device/machine."
From the original link they say "Today's preview supports Windows Store applications. We will continue to evolve and improve native compilation for the range of .NET applications." which suggests to me that it might be on the radar. However the scope of Windows desktop or server applications is obviously much broader, so it might be a harder problem to solve there.
I'm kind of scratching my head here, can someone explain what problem this solves? There's already a CLR and already a way to build native apps for Win 8, etc. This is a new CLR that can run new 'native' apps?
It gives faster startup time (due to AOT compilation rather than JIT), and doesn't require that the .NET framework be installed. Possibly better runtime performance, but I wouldn't expect huge gains there. P/Invoke should have lower overhead, which may be highly relevant for some things.
I suspect there are things that require this which they haven't announced yet, since on its own it seems like something kinda nice but not beneficial enough to justify creating.
Yeah, I don't understand the need to get rid of the .NET Framework dependency, though. This only supports building Windows Store apps, which only run on devices that have the .NET Framework right now...
One admittedly crazy theory is that this is laying the groundwork for building iOS apps and therefore would compete with Xamarin. Any kind of JIT would not be allowed on the iOS app store.
Startup times could be a big deal for competing with iOS, especially under similar app lifecycle models where apps can be stopped and restarted implicitly by the OS.
"It [...] doesn't require that the .NET framework be installed."
Are you sure? You still need the standard libraries, a garbage collector, its security checks when loading other code, and may want to use its compiler from your code. Together, that's a lot, maybe all, of what's in the framework.
The standard library base is huge... they already split things into a desktop and full profile for the .NET installer. This will only need to include the portions of the library your app actually uses, which is generally a pretty small part (plus the GC). Not to mention that future versions of the .NET runtime won't need to be installed on the target machine.
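The "only what your app uses" idea is essentially call-graph reachability, which can be sketched in a few lines (all method names below are invented for illustration; this is not .NET Native's actual algorithm):

```python
# Reachability sketch: starting from the app's entry point, walk the call
# graph; anything unreachable is left out of the self-contained package.
# (All names are made up for illustration.)

call_graph = {
    "Main": ["Console.WriteLine", "List.Add"],
    "Console.WriteLine": ["Buffer.Flush"],
    "List.Add": [],
    "Buffer.Flush": [],
    "Xml.Serialize": ["Reflection.Emit"],  # in the library, never called
    "Reflection.Emit": [],
}

def reachable(graph, entry):
    """Depth-first walk from the entry point."""
    seen, stack = set(), [entry]
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.add(node)
            stack.extend(graph.get(node, []))
    return seen

shipped = reachable(call_graph, "Main")
# Xml.Serialize and Reflection.Emit never appear in `shipped`, so they
# would not be packaged.
```

The hard part in practice is that reflection can reach code the static call graph can't see, which is one reason a Store profile that forbids things like Reflection.Emit makes this tractable.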
If I remember correctly, NGen output is very architecture/platform specific; that is why it has to be generated on the target machine. This sounds like it is done ahead of time, with support for a variety of architectures.
It seems like this is basically the same thing that Mono has done for a long time with AOT. I'm not really sure how it differs from the NGen tool, though.
.NET Native addresses many of the same issues as Mono AOT but it has a radically different basis. We had the advantage of hindsight :)
The biggest difference from NGen is the fact that .NET Native doesn't ever "fall back" to JIT-compiled code. There are other differences (refactored runtime, static-optimized libraries, static marshalling, etc.) which can be seen in the perf wins over NGen. We're seeing a 40% startup performance gain over NGen apps for top Windows Store apps.
@apardoe, one question: where will the compilation to machine instructions happen? From some comments in this thread, it seems it will happen on the app store servers and not on the developer's box. If that is the case, why would I need to change any settings in VS for building projects?
@rajeevk, we could compile from MSIL and not tell the developer. But you probably appreciate the chance to test out the app on your own the way your user will run it. After all, you'll want to profile it or maybe find that last on-device, optimized-only bug...
Reflection.Emit isn't allowed in the Windows Store profile so we haven't handled it in this go-around. Expression compile is supported and handled properly.
NGEN caches native code on the machine. You're still running from the CIL executable, and you still need to have the framework installed; it just loads the machine code from the cache instead of JITting it. This appears to generate a truly native executable.
No MDIL. NGEN can still JIT for things like cross-domain generics, and because it loads non-native code, it still needs to optimize at run time.
Project N compiles to fully native code, so there can be no JIT, and significantly smarter optimizations can happen since they don't impact performance at runtime.
@Pantaloons: Right on all counts. One clarification: .NET Native and MDIL are orthogonal. MDIL was used in Triton to split NGen into conceptually two pieces: the optimized part (in the Windows Phone Store) and the part bound on the customer's device to the installed Framework.
.NET Native produces completely native binaries. MDIL/Triton produces NGen binaries that still need a runtime and, occasionally, a JIT.
Sorry. Triton is the Windows Phone 8 "Compiler in the Cloud". It basically split NGen into two pieces, allowing the optimization to happen in the Store while the .exe generation happened on the device. The interface between these two halves is MDIL.
Was someone from marketing writing this: "the performance of C++ with the productivity of C#"? ... on startup. And not even on every startup, but only the first. And not even fully on the first, since, as you know, there are already a lot of (unarguably productive) features (such as GC) making such apps slower than C++ apps.
Also, is it just me, or does this sound like a glorified NGen run in the usual build process (compiling for production goes straight to native), instead of NGen as the background optimization service?
I am all for getting new tools and switching to new, more productive ways of development (in fact, I programmed in C# for about 3 years); I simply hate when false information is spread just because it sounds better coming from a salesperson.
Probably. I don't care too much about the Windows Store; I'm more excited about being able to write C# code that doesn't require a VM. Not that there's anything /wrong/ with the CLR, but one thing I miss about C++ when I'm not writing it is self-contained programs: just a small exe without a bunch of huge dependencies.
The redistributable is huge but not required. You can statically link the C runtime. That option has never been available for C#, and IMHO it's the single biggest reason C# failed to replace C++ for Windows client development.
Exactly. The redist is potentially annoying, but for my own projects I tend to statically link almost everything. It's not really a size thing, it's more of a "I don't want to bug my potential users to have to install a bunch of system altering things" viewpoint.
It's interesting. It's more like a "static JIT" (BIT?) that does not account for the different data characteristics the instructions will process (which is something a JIT would/should do, because that alters the actual program flow), but I find the "fast from the start" idea very compelling. I wonder how long it will take Oracle (or the OpenJDK folks) to react to it.
"Our compiler in the cloud compiles the app using .NET Native in the Store, creating a self-contained app package that’s customized to the device where the app will be installed."
This is great. I hope they make this available for web apps, because we've got a 45-second warmup time and CPU spamming due to the massive amount of work the JIT has to do when we deploy. Total pain.
I imagine this would help them keep costs down on Azure as well.
From my experience, much of an ASP.NET web app's start time is not so much the JITting but what comes before it, when the ASP.NET runtime compiles pages/views (aspx/cshtml, etc.) into *.cs files and then DLLs. Pre-compiling can mitigate this.
Larger EF models can have quite an impact.
Also if you use New Relic, it can seriously affect startup time too.
Yes and no. We ship our views as content, as we have a custom UI front end that isn't Razor or ASP.NET. The startup time for us is compiling NH proxies and the fact that our bin dir is huge:
It's truly a great thing. A single-file deployment would be a tremendous improvement. Also, reducing the ease with which distributed applications can be decompiled helps as well.
Hopefully the marketing people let them spread it to all of their application types.
@BuckRogers, these two are related somehow. We've definitely benefitted from conversations with the M# people and they've likely benefitted from conversations with us. In fact, Joe Duffy once worked for our humble little .NET team :)