Controlling Project and File Properties with C++ Macros

In previous columns, I introduced you to the basics of writing Visual Studio macros in C++—well, to be accurate, writing a library in C++ that provides all the functionality for macros. I showed you how to insert text into a file being edited, and how to work with the code model that represents classes, interfaces, functions, and the like within your project.

In this installment, I'll tackle the project model. The particular task my macro will perform is changing a file (within a managed project) from managed (/clr) to unmanaged. This is something you might do for performance reasons, creating a mixed executable. When you make this change in Solution Explorer, you have to make a companion change: turning off precompiled headers for the file. The macro does both steps. I'll leave it as an exercise for you to write the opposite macro that puts the file back to managed.

Sample Project

I created an unmanaged console application and added a class to it, kept in a separate file. The class is called Person: the header is in Person.h and the implementation is in Person.cpp. I plan to flip Person.cpp back and forth between managed and unmanaged using the macro. Here's Person.h:

class Person
{
private:
   int number;
   char code;
public:
   Person(int n, char c);
   int getnumber();
};

You can guess what the two functions look like, and you might be tempted to write them inline in the .h file. But then, think what will happen when you #include that .h file into a .cpp file that is being compiled /clr: you will get MSIL versions of the functions. That's why I put the implementations into a separate file. That file will then be compiled to MSIL or native code according to the properties you've set for it. And after all, in real life, if you're flipping a file back to native code for performance reasons, it's going to have a great deal of code in it and not these little "demo code" examples. The macros don't care how much code they work on, so I wrote small examples. Here is Person.cpp:

#include "StdAfx.h"
#include ".\person.h"

Person::Person(int n, char c)
{
   number = n;
   code = c;
}

int Person::getnumber()
{
   return number;
}

I then wrote a really simple main():

#include "stdafx.h"

#include <iostream>
using namespace std;

#include "Person.h"

int _tmain(int argc, _TCHAR* argv[])
{
   Person p1(1,'a');
   Person p2(2,'q');
   cout << "total of the numbers: " <<
      p1.getnumber() + p2.getnumber() << '\n';
   return 0;
}

So far, this is all unmanaged code and has no .NET part to it. I built and ran it to make sure nothing weird was going on, and then used Solution Explorer to make the entire project managed (/clr) and built and ran it again to make sure it still worked. This should be familiar (if you've read my head-spinning columns) as the "xcopy port" to the CLR.

Getting to the Project Model

My macro, like the earlier ones, has one line of VB that calls into the C++ DLL:

Sub MakeNative()
   CppMacroClasses.Utilities.MakeNative(DTE)
End Sub

I added a MakeNative method to the Utilities class I showed in earlier columns, and after each build I copied it to C:\Program Files\Microsoft Visual Studio .NET 2003\Common7\IDE\PublicAssemblies to make it available to the macro project system as a reference.
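Rather than copying the DLL by hand after every build, you can let Visual Studio do it for you. A post-build event on the C++ library project (Project Properties, Build Events, Post-Build Event) can run the copy automatically; here's a sketch of the command, assuming the default VS.NET 2003 installation path mentioned above:

```
copy "$(TargetPath)" "C:\Program Files\Microsoft Visual Studio .NET 2003\Common7\IDE\PublicAssemblies"
```

$(TargetPath) is one of the standard build-event macros and expands to the full path of the DLL just built.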

That just leaves the MakeNative function to write. It needs access to the project system, and that means a new namespace to work with: Microsoft::VisualStudio::VCProjectEngine. By getting a ProjectItem, I can get a VCProjectItem, and from there a VCFile. Unlike the file code model I used in the macro that stubs in interfaces, the Microsoft::VisualStudio::VCProjectEngine::VCFile class represents the file of code from the point of view of the project system: the properties and options as set for this particular file. Here's how I got it:

// get the project item for the active window
EnvDTE::ProjectItem *pi = DTE->ActiveWindow->ProjectItem;
VCProjectItem* vcpi = 
   dynamic_cast<VCProjectItem *>(pi->Object);
VCProject* proj = 
   dynamic_cast<VCProject *>(vcpi->project);
IVCCollection* files = 
   static_cast<IVCCollection*>(proj->Files);
VCFile* file = 
   static_cast<VCFile*>(files->Item(pi->Document->FullName));
if (file)
{
   // do the work of the macro
}

(Just as with my other macros, you need to be sure that the right kind of file, in this case a code file that's part of a project, has focus when you run the macro. This piece of code blows up with null pointer problems if you are looking at online help or some other non-project information when you run the macro. For readability, I'm not testing each of these return values but just assuming they all work.)

If you think about changing properties for a file or for a whole project, I hope you think quickly of configurations. While it makes very little sense for your Debug build to be managed and your Release build to be unmanaged, there are of course some property differences between the builds. The project model calls builds "configurations" and gives you access to them once you have the file. Here's what I put inside that if block above:

IVCCollection* fconfigs =
   static_cast<IVCCollection*>(file->FileConfigurations);
for (int i=1; i<fconfigs->Count+1; i++)
{
   VCFileConfiguration* fconfig =
      static_cast<VCFileConfiguration*>(fconfigs->Item(__box(i)));
   // work with the configuration
}

If you want to make your changes only to a single configuration, take a look at the Name property of the fconfig object. In this macro, I'm going to make the same change to each configuration I find, so I don't care about their names.

While this is certainly not production-level code, I have included a bit of error checking. If you invoke this macro while you're editing a header file, you won't be able to change compiler options, because header files don't have compiler options. When I'm working in managed code, I often find myself putting more and more into header files, leaving implementation files that are just shells that include the header files. That's not a problem, as long as I remember that the other purpose of the implementation file is to carry the compiler options.

This code establishes whether the configuration obtained from the file has compiler options or not:

VCCLCompilerTool* tool =
   dynamic_cast<Microsoft::VisualStudio::VCProjectEngine::
                VCCLCompilerTool*>
    (fconfig->Tool);
if (tool)    // .h files (and other non-compiled files) won't have
             // the CL tool associated
{
   // set file level compile options
}

For once, all that casting is making things simpler. If the tool associated with this configuration isn't the CL compiler, the dynamic cast will fail, and tool will be a null pointer.

All that remains is to see whether the file is compiling as managed or not, and then adjust properties accordingly:

if (tool->CompileAsManaged == compileAsManagedOptions::managedNotSet)
{
   // file is already unmanaged in this config, nothing to do
}
else
{
   //file is managed in this config, get to work
   tool->CompileAsManaged = compileAsManagedOptions::managedNotSet;
   // no pre compiled headers
   tool->UsePrecompiledHeader = pchOption::pchNone;
}

There you go! You can flip the /clr flag on and off for simple C++ classes to experiment with the effects of moving the boundary between managed and unmanaged code. Of course, don't assume you can write a managed class (with the __gc keyword) and then turn off /clr successfully. The freedom to compile to native code or IL is for classic C++ with no managed classes. Besides, I'm just using this macro as an example to show you another namespace in the Microsoft::VisualStudio part of the tree. Perhaps it will inspire you to create some macros of your own. As long as you don't mind casting (you'll do a lot of casting), there's nothing you can't do.

About the Author

Kate Gregory is a founding partner of Gregory Consulting Limited (www.gregcons.com). In January 2002, she was appointed MSDN Regional Director for Toronto, Canada. Her experience with C++ stretches back to before Visual C++ existed. She is a well-known speaker and lecturer at colleges and Microsoft events on subjects such as .NET, Visual Studio, XML, UML, C++, Java, and the Internet. Kate and her colleagues at Gregory Consulting specialize in combining software development with Web site development to create active sites. They build quality custom and off-the-shelf software components for Web pages and other applications. Kate is the author of numerous books for Que, including Special Edition Using Visual C++ .NET.



