What’s in Visual C++ Whidbey

At the PDC this year, attendees learned about technologies that are headed our way over the next few years. The code names and rough timelines are publicly available at http://msdn.microsoft.com/vstudio/productinfo/roadmap.aspx and have been for some time.

So you already knew (or could have known) that the next release of Visual C++ is code-named Whidbey, and that it will ship alongside SQL Server “Yukon.” Further down the road, we will see Windows “Longhorn” and Visual Studio “Orcas.” But what is in those products? I can’t give you all the gory details (that’s what the PDC is for, after all), but I can give you a quick tour of one corner of Whidbey—the C++ corner, of course—and whet your appetite for what’s to come.

Most of what’s exciting about Whidbey is the next version of the framework, improvements to the Base Class Libraries, support for Yukon, and so on. In this column I’m just going to focus on changes to Visual Studio, the compiler, and related developer tools. Improvements to the C++ language will be in a later column.

Profile-Guided Optimization

If you asked ten C++ programmers what kind of code C++ is best for, I hope at least nine of them would include words like “performance,” “speed,” and “tight” in their answers. Write the same program in several different languages and you will see differences in execution time from one language to the next. (If you compare C#, VB.NET, and Managed C++ versions, you won’t see much of a difference, because they all compile to IL.) Interestingly, if you choose a language for which multiple compilers are available, and compile and run the same code under several of them, you will also see a speed difference: some compilers optimize better than others, and the better they optimize, the faster the code they produce.

The optimizer in Visual C++ is among the best, and it gets better with every release. This time around it has a really interesting twist: profile-guided optimization. You see, some applications are easy to optimize: Pull a line of code out of a loop instead of doing it over and over, that sort of thing. But others are much harder to optimize: A decision the optimizer needs to make might actually depend on the input given to the program. In days of yore, some C or C++ gurus could tweak their code using knowledge of the pattern of input they expected. Knowing that there were usually about ten times as many “employee” objects as “manager” objects, or that items were added to an array far more often than they were searched for, the guru would write hand-optimized code. But not all of us can operate at that level, and even for those who can it’s slow work that produces hard-to-read code.

In Visual Studio Whidbey, when you’re making an unmanaged C++ application, you have the option of using profile-guided optimization. This is a multi-step process:

  • Compile the application with instrumentation
  • Run the instrumented version of the application
  • Relink the application using the information gathered by running it as a guide to the optimizer

I’ve seen profile-guided optimization in action, and it really works. You don’t have to do much to take advantage of it, and your unmanaged code gets even faster than it was before.
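To make those steps concrete, here is a minimal command-line sketch. The program, file names, and input are hypothetical, and the /GL and /LTCG switch spellings are taken from the link-time code generation options in the current Visual C++ toolset, so the exact names in Whidbey may differ.

  // pogo_demo.cpp: a hypothetical program whose hot path depends on its input,
  // which is exactly the kind of decision profile-guided optimization discovers.
  #include <cstdio>
  #include <cstdlib>

  int work(int n)
  {
      int total = 0;
      for (int i = 0; i < n; ++i)
          total += (i % 10 == 0) ? i * 3 : i;   // branch bias depends on the input
      return total;
  }

  int main(int argc, char* argv[])
  {
      int n = (argc > 1) ? std::atoi(argv[1]) : 1000000;
      std::printf("%d\n", work(n));
      return 0;
  }

  // Hypothetical build cycle (switch names from today's link-time code
  // generation options; Whidbey's may differ):
  //   cl /O2 /GL /c pogo_demo.cpp             compile for whole-program optimization
  //   link /LTCG:PGINSTRUMENT pogo_demo.obj   step 1: link an instrumented build
  //   pogo_demo 5000000                       step 2: run it on representative input
  //   link /LTCG:PGOPTIMIZE pogo_demo.obj     step 3: relink, letting the gathered
  //                                           profile guide the optimizer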

MSIL Linking

Whenever you build an application of any reasonable size, you probably use a library of some sort. Often, you use a library of your own; for example, reusable code from earlier projects. Sometimes code is in a separate library because you’ve developed the application in layers, and each layer is in a separate project or even solution. When you work in unmanaged C++, you have a choice of how to use these libraries: static linking or dynamic linking. With static linking, you change your library settings to create a .lib file, and you link that library into your application. This increases the size of the application, but simplifies deployment, because you only need to deploy a single file, the EXE. With dynamic linking, the library settings ensure the creation of a .dll file, and you add some code to the application to find and use the .dll at runtime. Now you need to deploy both the EXE and the companion DLL file. If the DLL is shared by several applications, this is a good thing; but if it’s a DLL just for this application, you might prefer the static linking approach, especially if the files are small.
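As a quick sketch of the unmanaged choices just described (library, file, and function names here are entirely hypothetical): with static linking, the call below would simply be declared in the library's header and resolved from mathlib.lib at link time; with dynamic linking, you can locate mathlib.dll yourself at run time using the Win32 LoadLibrary and GetProcAddress calls.

  // Hypothetical consumer of a library that ships either as mathlib.lib
  // (static linking) or as mathlib.dll (dynamic linking).
  #include <windows.h>
  #include <cstdio>

  typedef int (*AddFn)(int, int);

  int main()
  {
      // Dynamic linking: find and use the DLL at run time.
      HMODULE lib = ::LoadLibraryA("mathlib.dll");
      if (lib == NULL)
      {
          std::printf("mathlib.dll not found\n");
          return 1;
      }
      AddFn add = reinterpret_cast<AddFn>(::GetProcAddress(lib, "Add"));
      if (add != NULL)
          std::printf("2 + 3 = %d\n", add(2, 3));
      ::FreeLibrary(lib);
      return 0;
  }

  // With static linking, none of the LoadLibrary code is needed: you include
  // the library's header, call Add(2, 3) directly, and add mathlib.lib to the
  // linker inputs so the implementation is copied into the EXE.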

Now, shift gears to the managed world. Until Whidbey, there has only been dynamic linking. Your assemblies are either .exe or .dll files, and your libraries are all in .dll files that are brought in at runtime. When you deploy the application, you deploy the .exe and the appropriate .dll files. Starting with Whidbey, you will have static linking available to you. To switch from a traditional reference to MSIL linking, simply:

  • Change the library project so it produces a .netmodule file instead of a .dll assembly
  • Change the application project so it links in the .netmodule file
  • Remove any reference to the .dll file and replace it with a reference to the .netmodule file

Now, the code from the .netmodule file can be linked into the executable assembly and you only deploy one file. What’s more, the .netmodule can be produced in any managed language: I’ve linked MSIL produced by the C# compiler into an assembly produced by the C++ compiler to create a single executable assembly. Now that’s a new level of cross-language interoperability!
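Here is a rough sketch of that cross-language scenario, with everything hypothetical: assume a C# file, mathlib.cs, defining a Mathlib.Adder class with a static Add method, compiled into mathlib.netmodule with csc /target:module. The C++ application pulls in the metadata with #using and hands the .netmodule to the linker. The switch spellings reflect today's command-line tools and could change before Whidbey ships.

  // app.cpp: a managed C++ application that links the MSIL from a hypothetical
  // C#-produced mathlib.netmodule directly into its own executable assembly.
  #using "mathlib.netmodule"      // compile-time metadata for the C# types

  using namespace System;

  int main()
  {
      // Mathlib::Adder::Add is assumed to be defined in the C# module.
      int sum = Mathlib::Adder::Add(2, 3);
      Console::WriteLine(sum);
      return 0;
  }

  // Hypothetical build steps:
  //   csc /target:module mathlib.cs          C# compiler emits mathlib.netmodule
  //   cl /clr /c app.cpp                     managed C++ compile
  //   link /LTCG app.obj mathlib.netmodule   the .netmodule is ordinary linker
  //                                          input, so its MSIL ends up in app.exe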

IDE Changes

Some of the things I like the most about this release aren’t language specific. Take data tips, for instance. Whenever you’re paused in the debugger, if you hover the mouse cursor over a variable, a tip will appear with the value of the variable. That’s great if it’s an integer or some other simple type, but not so great if it’s an instance of an object, because you get some not-very-helpful information, such as the address of the variable. I end up right-clicking on the variable and adding a watch or bringing up a quick watch. That lets me see everything in the object, but it’s not very convenient. The new data tips in the Whidbey IDE are like an instant watch: If the variable you’re hovering over is an object, the data tip is many lines long and shows the values of all the member variables in the object. If any of those are objects themselves (or pointers), they have a dot next to them that you can use to drill your way into the data. You can keep popping up data tips and exploring values without having to use watch windows at all.

Solution Explorer, at least for C++ programmers, lets you have your files the way you like them: grouped into Source/Header/Resource folders, or just blobbed together in the same folder structure as on the hard drive. It’s your choice, and it lets you move more quickly from file to file.

Another little thing that makes a difference: colored highlights in the margins to indicate recent changes in a file—and whether those changes have been saved or not. The IDE is full of these little touches, and luckily I have plenty of time to experiment with them. Traditionally, the bits that come home from the PDC are not followed by betas for quite a long time, and the released product takes even longer. The public roadmap just says it will be next year. Well, that gives us all something to look forward to.

About the Author

Kate Gregory is a founding partner of Gregory Consulting Limited (www.gregcons.com). In January 2002, she was appointed MSDN Regional Director for Toronto, Canada. Her experience with C++ stretches back to before Visual C++ existed. She is a well-known speaker and lecturer at colleges and Microsoft events on subjects such as .NET, Visual Studio, XML, UML, C++, Java, and the Internet. Kate and her colleagues at Gregory Consulting specialize in combining software development with Web site development to create active sites. They build quality custom and off-the-shelf software components for Web pages and other applications. Kate is the author of numerous books for Que, including Special Edition Using Visual C++ .NET.

