Microsoft Visual Studio 2010 for Database Professionals

As a developer, you use Visual Studio (VS) to build many flavors of applications for the .NET Framework. Typically, a new release of VS comes with a brand-new version of the .NET Framework, and VS 2010 is no exception—it ships with the .NET Framework 4.0. However, you can use VS 2010 to build applications for any .NET platform, including .NET 3.5 and .NET 2.0.

The following sections describe some of the features that help database developers.

Microsoft Visual Studio owes a large share of its popularity to its integrated development environment (IDE), which is made up of language- and feature-specific code editors, visual designers, IntelliSense, auto-completion, snippets, wizards, controls, and more. In Microsoft Visual Studio 2010, the IDE adds the much-requested multimonitor support. Writing code with the .NET Framework requires you to mix designer windows with code windows while keeping an eye on things such as a database profiler, an HTTP watcher, a specification document, or an entity-relationship model. Doing so requires monitor real estate, and shrinking the editor fonts is not a practical answer. Multiple monitors are a viable option because they are inexpensive and easy to install, and many IT organizations already use dual monitors as a way to increase productivity and save time and resources.

In Microsoft Visual Studio 2010, IntelliSense includes auto-filtering, which gives you the ability to display a context-sensitive list of suggestions. In this version, the list isn’t limited to an alphabetical sequence or to all names starting with the typed sequence. IntelliSense attempts to guess the context in which you’re operating and shows related suggestions, and it even understands abbreviations. For example, if you type WL in an IntelliSense window, it will match member names, such as WriteLine.

Refactoring is an aspect of development that has gained a lot of attention in the past few years. Refactoring is the process of rewriting source code in a better way (e.g., adding testability, separation of concerns, or extensibility) without altering its observable behavior. Originally associated with agile practices, refactoring is now a common, everyday practice for almost every developer. Because refactoring doesn’t add any new behavior to the code, it’s often perceived as a waste of time and is neglected. In the long run, however, a lack of systematic refactoring leads to low-quality software or even project failures. Microsoft Visual Studio 2010’s refactoring tools are an excellent way to speed up the refactoring process, making it affordable for nearly any development team.
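As a minimal sketch of the idea (the class, method names, and discount rule here are invented for illustration), the Extract Method refactoring that Visual Studio automates pulls duplicated inline logic into a named method without changing behavior:

```csharp
using System;

class InvoiceCalculator
{
    // Before refactoring: the discount rule is buried inline in the calculation.
    public decimal TotalBefore(decimal subtotal)
    {
        return subtotal - (subtotal > 100m ? subtotal * 0.05m : 0m);
    }

    // After Extract Method: the same rule lives in one named, testable method.
    public decimal TotalAfter(decimal subtotal)
    {
        return subtotal - Discount(subtotal);
    }

    private decimal Discount(decimal subtotal)
    {
        return subtotal > 100m ? subtotal * 0.05m : 0m;
    }

    static void Main()
    {
        var calc = new InvoiceCalculator();
        // Behavior is unchanged: both versions produce the same result.
        Console.WriteLine(calc.TotalBefore(200m) == calc.TotalAfter(200m)); // prints True
    }
}
```

Because the observable behavior is identical before and after, a refactoring like this can be applied safely at any point in the project, which is exactly what makes tool support for it worthwhile.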

There are many ways to take advantage of Microsoft Visual Studio 2010’s data-comparison capabilities. The most obvious is to update a development server with a copy of the tables in your production environment. Another common scenario is to copy data across two or more tables in the same or different databases. Finally, you might use the Data Comparison wizard to compare the data in a table before and after you run tests, as a way to assert the behavior of a piece of code. Note that both refactoring and testing are project-wide features in VS that aren’t limited to C# or Visual Basic .NET projects; a few refactoring features (e.g., Rename) and testing assertions are available for database-related projects, too.

Some changes to LINQ-to-SQL were made in Microsoft Visual Studio 2010. First, when a foreign key changes in the database schema, simply re-dragging the table-based entity onto the designer refreshes the model. Second, the new version of LINQ-to-SQL produces T-SQL queries that are easier to cache. SQL Server makes extensive use of query plans to optimize the execution of queries, but a query execution plan is reused only if the next query exactly matches the one that was prepared earlier and cached. Before Microsoft Visual Studio 2010 and the .NET Framework 4.0, the LINQ-to-SQL engine produced queries in which the length of variable-length type parameters, such as varchar, nvarchar, and text, wasn’t set; the SQL Server client then set the length of those parameters to the length of the actual content. As a result, very few query plans were actually reused, creating performance concerns for DBAs. In the LINQ-to-SQL that comes with .NET 4 and Microsoft Visual Studio 2010, text parameters are bound to queries as nvarchar(4000), or nvarchar(MAX) if the actual length is greater than 4,000 characters.
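To illustrate the difference (the table, column, and values here are hypothetical, and the SELECT list is elided), compare the shape of the parameterized batches the provider sends to SQL Server for a query that filters on a string column:

```sql
-- .NET 3.5 provider: the parameter length tracks the argument, so filtering on
-- 'London' vs. 'Copenhagen' yields different query texts and separate plans.
exec sp_executesql N'SELECT ... FROM [dbo].[Customers] AS [t0] WHERE [t0].[City] = @p0',
    N'@p0 nvarchar(6)', @p0 = N'London'

-- .NET 4.0 provider: the length is fixed at 4000 (or MAX beyond that), so the
-- query text is identical regardless of the argument and the cached plan is reused.
exec sp_executesql N'SELECT ... FROM [dbo].[Customers] AS [t0] WHERE [t0].[City] = @p0',
    N'@p0 nvarchar(4000)', @p0 = N'London'
```

Because SQL Server keys its plan cache on the exact query text, including the parameter declarations, the fixed-length form lets one cached plan serve every value of the filter.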

The Entity Framework 4.0 introduces a significant improvement in the process that generates C# (or Visual Basic) code for the abstract model. The default generation engine produces a class hierarchy rooted in the Entity Framework library classes. In addition, the Entity Framework includes two more generation engines: the Plain Old CLR Object (POCO) generator and the self-tracking entity generator. The POCO generator emits standalone classes with no dependencies on the Entity Framework library. The self-tracking entity generator emits POCO classes that contain extra code so that each instance of an entity class can track its own changes.
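As a rough sketch of the distinction (the entity and its members are invented for illustration, not the generators’ actual output), a POCO entity is an ordinary class with no Entity Framework base type, while a self-tracking entity keeps the same POCO shape but adds its own change-state bookkeeping:

```csharp
// A POCO entity: plain properties, no EntityObject base class, and no
// attributes tying the class to the Entity Framework runtime.
public class Customer
{
    public int CustomerId { get; set; }
    public string Name { get; set; }
}

// A self-tracking entity follows the same shape but also records its own
// change state, so the change can travel with the object across tiers.
public class TrackedCustomer
{
    public int CustomerId { get; set; }

    private string _name;
    public string Name
    {
        get { return _name; }
        set { _name = value; MarkModified(); }
    }

    // True once any tracked property has been assigned a new value.
    public bool IsModified { get; private set; }

    private void MarkModified() { IsModified = true; }
}
```

The appeal of the POCO style is that the same classes can be unit-tested or serialized without carrying a dependency on the persistence layer.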

Microsoft Visual Studio 2010 also includes an improved set of design-time facilities such as IntelliSense, refactoring, code navigation, new designers for workflows, Entity Framework–based applications, and WPF applications.
