C++ CLR Compilation

The C++ compiler can produce both native and managed instructions. The /clr compiler switch is the basic option for turning managed instruction generation on and off, but beginning with Visual C++ 2005, finer-grained control over managed instruction generation is possible. Managed code can be written using C++/CLI in a manner that is verifiably type-safe, which means that it can be statically analyzed to ensure that it only accesses the member variables and methods of a type correctly. Without verifiable type safety, pointer arithmetic can be used to access methods and member variables indirectly, potentially bypassing the inbuilt security checks of the .NET runtime. A binary that is verifiably type-safe can be hosted in a wider range of environments, such as SQL Server, and can be deployed in more scenarios, such as browser-hosted Internet applications.

At times, verifiable type safety is not an important design goal for an application. Applications that will be deployed to the local machine using traditional deployment mechanisms such as MSI files gain no immediate benefit from being verifiably type-safe, and type safety may not be possible if the code needs to access native types that cannot be converted to managed types. Unlike other managed languages, which mandate a particular compilation model, C++/CLI and the C++ compiler follow the spirit of C++ and give the developer the choice of the compilation mode that makes sense for the application.

The default choice for a C++/CLI application is /clr, which can be set on the General tab of the project's properties, as shown in Figure 1. With /clr set, both managed and native classes can be added to an executable image. For managed classes, a normal managed type is defined in Microsoft Intermediate Language (MSIL), and objects of the type can fully interoperate with languages like C# and Visual Basic .NET. For native classes, the compiler generates managed methods for the native types.

Figure 1: CLR compilation modes.

The great advantage of including native types within an image compiled with the /clr switch is the ability to use managed types from within functions on the native type, as shown in the following code sample. When a native type like this is compiled and run within a /clr image, an object of the type is still allocated on the call stack or native heap, but the functions are compiled as MSIL, and are executed on the .NET runtime.

#include <cstdio>       // for ::printf
using namespace System; // for Console

// A native class compiled inside a /clr image: objects are still allocated
// on the stack or native heap, but f() is compiled to MSIL.
class C{
public:
   static void f(){
      Console::WriteLine(L"Console::WriteLine");
      ::printf("printf\n");
   }
};
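
If you are building from the command line rather than the project property pages, the same setting maps directly to the compiler switch. As a minimal sketch, assuming the sample above (together with a main function that calls C::f) is saved as mixed.cpp, the build would look like this:

cl /clr mixed.cpp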

For C++/CLI applications that do not make use of native types and that need to be hosted or deployed in more constrained environments, a more restrictive compiler option is required. There are three sub-flavours of the /clr compiler switch: /clr:pure, /clr:safe, and /clr:oldSyntax. /clr:oldSyntax exists for the sole purpose of compiling legacy code written using Managed Extensions for C++, which shipped with Visual C++ 2002 and 2003.

The /clr:pure switch tells the compiler to generate only MSIL code. When the plain /clr switch is used, the compiler will attempt to compile all code as MSIL, but there are a number of situations where this is not possible, such as functions that contain inline assembly instructions, functions that use varargs, and functions inside #pragma unmanaged blocks, which explicitly tell the compiler to generate native code. The full list of MSIL exclusions is contained in the Managed Extensions for C++ specification. With /clr:pure, no native code is generated, the only way to call native functionality is through the .NET P/Invoke technology, and C++ Interop (also called It Just Works (IJW) interop) is disabled.
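
As an illustration, the #pragma unmanaged directive mentioned above can be used as in the following minimal sketch; the function between the pragmas is compiled to native machine code even in a /clr image, and is therefore one of the constructs that rules out /clr:pure compilation for the file (the function itself is hypothetical):

#pragma unmanaged
// Compiled to native machine code, even when the file is built with /clr.
double ScaleNative(double value, double factor){
   return value * factor;
}
#pragma managed
// MSIL generation resumes from this point onwards.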

In addition to the distribution and hosting benefits of pure managed C++ binaries, a performance improvement can also be achieved. When the call stack transitions between native and managed execution, the .NET runtime must execute a number of bookkeeping instructions to support features like garbage collection, and if the transition happens frequently, it can become a performance problem. C++ Interop is designed to achieve fast transitions, and doesn't incur the security stack walk and other costs that make a P/Invoke transition more expensive, but if used too frequently, the performance cost can still be noticeable. Balanced against the penalty of transitions, some computational tasks can be significantly faster with native execution, and the various compilation modes allow the C++ developer to choose the mode that makes sense for their application.
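
To make the transition cost concrete, the following sketch declares a Win32 function through explicit P/Invoke; each call to MessageBeep crosses from managed to native execution and pays the bookkeeping cost described above (MessageBeep is used here purely as a convenient native API):

using namespace System::Runtime::InteropServices;

// Explicit P/Invoke declaration; the runtime marshals the call into user32.dll.
[DllImport("user32.dll")]
extern "C" int MessageBeep(unsigned int uType);

int main(){
   MessageBeep(0); // a managed-to-native transition occurs on each call
   return 0;
}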

The most restrictive /clr switch is /clr:safe, which generates only verifiably type-safe MSIL code. A C++/CLI assembly compiled with /clr:safe has the same distribution and hosting options as a C# or Visual Basic .NET assembly. The main steps involved in converting a project from /clr:pure to /clr:safe are the conversion of all native types to managed types and the removal of calls to the C Runtime (CRT) libraries. A managed version of the CRT can be used in /clr:pure applications, but it is not type-safe, and hence cannot be called from /clr:safe assemblies. With /clr:safe or /clr:pure selected, native C++ features like COM interop, MFC, ATL, and DLL exports and imports are not available. This is partly compensated for by an increase in the availability of .NET features: /clr:safe and /clr:pure assemblies can be reflected over using .NET reflection, and there is better support for loading and hosting the assemblies in plug-in style applications.
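
As a rough illustration of that conversion, the native class from the earlier sample could be rewritten as a managed type with the CRT call removed, allowing it to compile under /clr:safe:

ref class C{
public:
   static void f(){
      System::Console::WriteLine(L"Console::WriteLine");
      // The ::printf call has been removed; the CRT is not available
      // to verifiably type-safe code.
   }
};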

In addition to the compiler options, there are linker options that affect the CLR properties of a binary image, as shown in Figure 2. The default setting for the linker is to choose the lowest level of verifiability of all the modules that are being linked into a particular binary, but this can be overridden with a custom level. If a higher level is set for the linker than was used for the compiler, a linker error is produced.

Figure 2: CLR Linker Options

Determining whether a DLL or EXE binary contains native or unverifiable managed code can be accomplished with the PEVerify tool that ships with the .NET Framework SDK. For native images and those compiled with the /clr compiler option, PEVerify will fail and report the existence of native code in the image. For /clr:pure images, PEVerify will report any managed code it detects that is not verifiably type-safe, and for /clr:safe images, PEVerify will report that all methods and types within the image are verifiably type-safe.
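
Running the tool is straightforward; for example, checking an assembly from an SDK command prompt (the assembly name here is purely illustrative) looks like this:

peverify MixedImage.dll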

Conclusion

Choosing the correct CLR compilation mode depends on a number of factors, including the deployment and hosting strategies that the executable image will use in a production environment, and the extent to which native functionality like MFC, ATL, the CRT, and custom libraries will be used. Although any C++ project can have the /clr switch turned on, resulting in a managed executable being produced, there are progressively stricter requirements for achieving /clr:pure and /clr:safe compilation. Visual C++ is the only compiler that gives the developer the choice of how the .NET Framework will be utilised by their code, and C++/CLI gives C++ developers the ability to produce managed types that are fully interoperable with other managed languages.

About the Author

Nick Wienholt is an independent Windows and .NET consultant based in Sydney. He is the author of Maximizing .NET Performance and co-author of A Programmer’s Introduction to C# 2.0 from Apress, and specialises in system-level software architecture and development, with a particular focus on performance, security, interoperability, and debugging.

Nick is a keen and active participant in the .NET community. He is the co-founder of the Sydney Deep .NET User Group and writes technical articles for Australian Developer Journal, ZDNet, Pinnacle Publishing, Developer.COM, MSDN Magazine (Australia and New Zealand Edition), and the Microsoft Developer Network. An archive of Nick’s SDNUG presentations, articles, and .NET blog is available at www.dotnetperformance.com. In recognition of his work in the .NET area, he was awarded the Microsoft Most Valuable Professional Award from 2002 through 2008.
