Support of Visual Studio 2019 in PVS-Studio

Jun 04 2019

Support of Visual Studio 2019 in PVS-Studio affected a number of components: the plugin itself, the command-line analyzer, the cores of the C++ and C# analyzers, and a few utilities. In this article, I will briefly explain what problems we encountered when implementing support of the IDE and how we addressed them.

Before we start, I'd like to take a look back at the history of supporting previous versions of Visual Studio in PVS-Studio, so that you can better understand our vision of the task and the solutions we came up with in each situation.

Since the first version of PVS-Studio that shipped with a plugin for Visual Studio (it was Visual Studio 2005 back then), supporting new versions of this IDE has been quite a trivial task for us, which basically came down to updating the plugin's project file and dependencies of Visual Studio's various API extensions. Every now and then we would have to add support for new features of C++, which the Visual C++ compiler was gradually learning to work with, but it generally wasn't a difficult task either and could be easily done right before a new Visual Studio release. Besides, PVS-Studio had only one analyzer back then - for C/C++.

Things changed when Visual Studio 2017 was released. In addition to huge changes to many of the IDE's API extensions, we also encountered a problem with maintaining backward compatibility of the new C# analyzer added shortly before that (as well as of the new analyzer layer for C++ to work with MSBuild projects) with the new versions of MSBuild \ Visual Studio.

Considering all of this, I strongly recommend that you see a related article about support of Visual Studio 2017, "Support of Visual Studio 2017 and Roslyn 2.0 in PVS-Studio: sometimes it's not that easy to use ready-made solutions as it may seem", before reading on. That article discusses the issues that we faced last time and the model of interaction between different components (such as PVS-Studio, MSBuild, and Roslyn). Knowing these details may help you to better understand the current article.

Tackling those problems ultimately led to significant changes to the analyzer, and we were hoping that the new approaches applied then would help us support future versions of Visual Studio \ MSBuild much more easily and quickly. This hope already started to prove realistic as the numerous updates of Visual Studio 2017 were released. Did the new approach help us support Visual Studio 2019? Read on to find out.

PVS-Studio plugin for Visual Studio 2019

The start seemed promising. It didn't take much effort to port the plugin to Visual Studio 2019 and have it launch and run well. But right away we encountered two problems that could bring more trouble later.

The first had to do with the IVsSolutionWorkspaceService interface used to support the Lightweight Solution Load mode (which, by the way, had been disabled in one of the earlier updates, back in Visual Studio 2017). It was decorated with the Deprecated attribute, which currently only triggered a warning at build time but was going to become a big problem in the future. This mode didn't last long indeed... That was easy to fix - we simply stopped using this interface.

The second problem was the following message that we kept getting when loading Visual Studio with the plugin enabled: Visual Studio has detected one or more extensions that are at risk or not functioning in a future VS update.

The logs of Visual Studio launches (the ActivityLog file) helped to clear it up:

Warning: Extension 'PVS-Studio' uses the 'synchronous auto-load' feature of Visual Studio. This feature will no longer be supported in a future Visual Studio 2019 update, at which point this extension will not work. Please contact the extension vendor to get an update.

What it meant for us was that we would have to switch from synchronous to asynchronous load mode. I hope you won't mind if I spare you the details of how we interact with Visual Studio's COM interfaces, and only briefly outline the changes.

There's an article by Microsoft about loading plugins asynchronously: "How to: Use AsyncPackage to load VSPackages in the background". It was, however, already clear that there were more changes to come.

One of the biggest changes was in the load mode, or rather initialization mode. In earlier versions, all the necessary initialization was done using two methods: Initialize of our class inheriting from Package, and OnShellPropertyChange. The latter had to be added because when loading synchronously, Visual Studio itself might still be in the process of loading and initialization, and, therefore, some of the necessary actions were impossible to perform during the plugin's initialization. One way to fix this was to delay the execution of those actions until Visual Studio quits the 'zombie' state. It was this part of the logic that we singled out into the OnShellPropertyChange method with a check for the 'zombie' status.

The Initialize method of the abstract class AsyncPackage, which asynchronously loaded packages inherit from, is sealed, so initialization has to be done in the overridden InitializeAsync method, which is exactly what we did. The 'zombie' check logic had to be changed too, because the status information was no longer available to our plugin. Besides, we still had to perform the actions that needed to run after plugin initialization. We solved that by utilizing the OnPackageLoaded method of the IVsPackageLoadEvents interface, which is where those delayed actions were performed.
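To give you an idea of what this looks like, here's a minimal sketch of an asynchronously loaded package; the class name and the bodies are illustrative, not our actual code:

[PackageRegistration(UseManagedResourcesOnly = true, AllowsBackgroundLoading = true)]
public sealed class PVSStudioPackage : AsyncPackage
{
  protected override async Task InitializeAsync(
    CancellationToken cancellationToken,
    IProgress<ServiceProgressData> progress)
  {
    // The background-thread part of the initialization goes here.
    await base.InitializeAsync(cancellationToken, progress);

    // Switch to the UI thread for the work that requires it
    // (command registration, UI services, and so on).
    await JoinableTaskFactory.SwitchToMainThreadAsync(cancellationToken);
    ....
  }
}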

Another problem resulting from asynchronous load was that the plugin's commands couldn't be used until after the plugin had loaded. Opening an analyzer log by double-clicking it in the file manager (when it needs to be shown in Visual Studio) launches the corresponding version of devenv.exe with a command for opening the log. The launch command looked something like this:

"C:\Program Files (x86)\Microsoft Visual Studio\
2017\Community\Common7\IDE\devenv.exe"
/command "PVSStudio.OpenAnalysisReport 
C:\Users\vasiliev\source\repos\ConsoleApp\ConsoleApp.plog"

The "/command" flag is used here to run the command registered in Visual Studio. This approach didn't work anymore since commands were no longer available until after the plugin had loaded. The workaround that we came up with was to have the devenv.exe launch command parsed after the plugin has loaded and run the log open command if it's found in the launch command. Thus, discarding the idea of using the "appropriate" interface to work with commands allowed us to keep the necessary functionality, with delayed opening of the log after the plugin has completely loaded.

Phew, looks like we made it at last; the plugin loads and opens as expected, without any warnings.

And here's when things go wrong. Paul (Hi Paul!) installs the plugin on his computer and asks why we still haven't switched to asynchronous load.

To say that we were shocked would be an understatement. That couldn't be! But it's real: here's the new version of the plugin, and here's a message saying that the package is loading synchronously. Alexander (Hi Alexander!) and I try the same version on our respective computers - it works fine. How's that possible? Then it occurs to us to check the versions of the PVS-Studio libraries loaded in Visual Studio - and we find that these are the libraries for Visual Studio 2017, whereas the VSIX package contains the new versions, i.e. for Visual Studio 2019.

After tinkering with VSIXInstaller for a while, we managed to find out that the problem had to do with the packages cache. This theory was also supported by the fact that restricting access to the cached package (C:\ProgramData\Microsoft\VisualStudio\Packages) caused VSIXInstaller to output an error message in the log. Curiously enough, when the error didn't occur, the information about installing cached packages didn't appear.

Side note. While studying the behavior of VSIXInstaller and accompanying libraries, I thought how cool it is that Roslyn and MSBuild are open-source, which allows you to conveniently read and debug their code and trace its work logic.

So, this is what happened: when installing the plugin, VSIXInstaller saw that the corresponding package was already cached (it was actually the .vsix package for Visual Studio 2017) and installed that package instead of the new one. Why it ignored the restrictions/requirements defined in the .vsixmanifest file (which, among other things, restricted installation of extensions to a specific version of Visual Studio) is a question yet to be answered. As a result, the plugin designed for Visual Studio 2017 got installed on Visual Studio 2019 - despite the restrictions specified in the .vsixmanifest file.

Worst of all, that installation broke Visual Studio's dependency graph, and although the IDE seemed to be running well, things were actually terrible: you couldn't install or remove extensions, update the IDE, and so on. The "restore" process was painful too, as we had to delete the extension (i.e. the files comprising it) manually and - also manually - edit the configuration files storing the information about the installed package. In other words, it wasn't fun at all.

To fix that, and to make sure we didn't run into situations like that in the future, we decided to give the new package its own GUID, so that the packages for Visual Studio 2017 and Visual Studio 2019 would be securely isolated from each other (the older packages were fine; they had always shared one GUID).

Since we're talking about unpleasant surprises, here's another: after updating to Preview 2, the PVS-Studio menu "moved" to the "Extensions" tab. Not a big deal, but it made accessing the plugin's functionality less convenient. This behavior persisted through the subsequent Visual Studio 2019 versions, including the release. I haven't found any mention of this "feature" in either the documentation or the blog.

Okay, now things looked fine, and we seemed to have finished with Visual Studio 2019 support at last. That belief proved wrong the day after we released PVS-Studio 7.02. It was the asynchronous load mode again. When opening the analysis results window (or starting the analysis), the analyzer window would appear "empty" to the user - no buttons, no grid, nothing at all.

This problem had in fact occurred every now and then during analysis. But it affected only one computer and didn't show up until Visual Studio was updated to one of the first 'Preview' builds, so we suspected that something had broken during installation or the update. The problem, however, disappeared some time later and wouldn't occur even on that particular computer, so we thought it had "fixed itself". But no - we had just been lucky. Or unlucky, for that matter.

As we discovered, the cause was the order in which the IDE window itself (the class derived from ToolWindowPane) and its contents (our control with the grid and the buttons) were initialized. Under certain conditions, the control would be initialized before the pane, and even though everything ran without errors and the FindToolWindowAsync method (which creates the window when it's accessed for the first time) did its job, the control remained invisible. We fixed that by adding lazy initialization for our control to the pane-filling code.
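The idea of the fix, as a simplified sketch (the names are illustrative, and this isn't our exact code): create the control only at the moment the pane actually needs to display it, instead of assuming it already exists.

public class AnalyzerResultsWindowPane : ToolWindowPane
{
  private AnalyzerResultsControl _control;

  // Called from the pane-filling code; creates the control on first access,
  // so the pane is guaranteed to exist before its contents do.
  private AnalyzerResultsControl GetOrCreateControl()
  {
    if (_control == null)
    {
      _control = new AnalyzerResultsControl();
      Content = _control;
    }

    return _control;
  }
}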

Support of C# 8.0

One great advantage of using Roslyn as the basis for the analyzer is that you don't have to add support for new language constructs manually - it's done automatically through the Microsoft.CodeAnalysis libraries, and we just make use of the ready-made solutions. It means new syntax is supported by simply updating the libraries.

As for the analysis itself, we had to tweak things on our own, of course - in particular, handle the new language constructs. Sure, the new syntax tree was generated automatically by simply updating Roslyn, but we still had to teach the analyzer how exactly to interpret and process the new or modified syntax tree nodes.

Nullable reference types are perhaps the most widely discussed new feature of C# 8. I won't be talking about them now, because a topic that big is worth a separate article (which is currently being written). For now, we have settled on ignoring nullable annotations in our dataflow mechanism (that is, we understand, parse, and skip them). The idea is that a variable, even one of a non-nullable reference type, can still be pretty easily (or accidentally) assigned the value null, ending up with an NRE when attempting to dereference it. Our analyzer can spot such errors and report a potential null dereference (if it finds such an assignment in the code, of course) even if the variable is of a non-nullable reference type.
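A tiny illustration of the point (in a nullable-enabled context): the compiler can be talked into putting null into a non-nullable variable, and the dereference still fails at run time.

#nullable enable
string nonNullable = null!;              // '!' suppresses the compiler warning
Console.WriteLine(nonNullable.Length);   // NullReferenceException at run time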

Using nullable reference types and associated syntax enables you to write pretty interesting code. We nicknamed it "emotional syntax". This snippet is perfectly compilable:

obj.Calculate();
obj?.Calculate();
obj.Calculate();
obj!?.Calculate();
obj!!!.Calculate();

By the way, my experiments led me to discover a couple of tricks that you can use to "crash" Visual Studio with the new syntax. They are based on the fact that you are allowed to write as many '!' characters as you like. It means you could write not only code like this:

object temp = null!;

but also like this:

object temp = null!!!;

And, pushing it even further, you could write crazy things like this:

object temp = null!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
                  !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
                  !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
                  !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
                  !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
                  !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
                  !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
                  !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
                  !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
                  !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
                  !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
                  !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
                  !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
                  !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
                  !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
                  !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!;

This code is compilable, but if you try to view the syntax tree in Syntax Visualizer from .NET Compiler Platform SDK, Visual Studio will crash.

The failure report can be pulled out from Event Viewer:

Faulting application name: devenv.exe,
version: 16.0.28803.352, time stamp: 0x5cc37012
Faulting module name: WindowsBase.ni.dll,
version: 4.8.3745.0, time stamp: 0x5c5bab63
Exception code: 0xc00000fd
Fault offset: 0x000c9af4
Faulting process id: 0x3274
Faulting application start time: 0x01d5095e7259362e
Faulting application path: C:\Program Files (x86)\
Microsoft Visual Studio\2019\Community\Common7\IDE\devenv.exe
Faulting module path: C:\WINDOWS\assembly\NativeImages_v4.0.30319_32\
WindowsBase\4480dfedf0d7b4329838f4bbf953027d\WindowsBase.ni.dll
Report Id: 66d41eb2-c658-486d-b417-02961d9c3e4f
Faulting package full name: 
Faulting package-relative application ID:

If you go even crazier and add several times more exclamation marks, Visual Studio will start crashing all by itself, without any help from Syntax Visualizer. The Microsoft.CodeAnalysis libraries and the csc.exe compiler can't cope with such code either. By the way, the exception code 0xc00000fd in the report above stands for a stack overflow.

These examples are contrived, of course, but I found that trick funny.

Toolset

It seemed obvious that updating the toolset would be the most difficult part. At least that's what it looked like in the beginning, but now I tend to think that supporting the plugin was actually the hardest part. For one thing, we already had a toolset and a mechanism for evaluating MSBuild projects, which was good as it was, even though it still had to be extended. Not having to write the algorithms from scratch made things much easier. The strategy of relying on "our" toolset, which we chose when supporting Visual Studio 2017, once again proved right.

Traditionally, the process starts with updating NuGet packages. The tab for managing NuGet packages for the current solution contains the "Update" button... but it doesn't help. Updating all the packages at once caused multiple version conflicts, and trying to solve them all didn't seem like a good idea. A more painful yet presumably safer way was to selectively update the target Microsoft.Build / Microsoft.CodeAnalysis packages.

One difference was spotted right away when testing the diagnostics: the syntax tree's structure changed on an existing node. Not a big deal; we fixed that quickly.

Let me remind you, we test our analyzers (for C#, C++, Java) on open-source projects. This allows us to thoroughly test the diagnostics - for example, check them for false positives or see if we missed any cases (to reduce the number of false negatives). These tests also help us trace possible regression at the initial step of updating the libraries / toolset. This time they caught a number of issues as well.

One was that the behavior of the CodeAnalysis libraries had gotten worse. Specifically, when checking certain projects, we started getting exceptions from the libraries' code on various operations, such as obtaining semantic information, opening projects, and so on.

Those of you who read the article about Visual Studio 2017 support carefully will remember that our distribution includes a dummy: a 0-byte MSBuild.exe file.

Now we had to push this practice even further and include empty dummies for the compilers csc.exe, vbc.exe, and VBCSCompiler.exe. Why? We came up with this solution after analyzing one of the projects from our test base and getting diff reports: the new version of the analyzer wouldn't output some of the expected warnings.

We found that it had to do with conditional compilation symbols, some of which weren't extracted properly when using the new version of the analyzer. In order to get to the root of the problem, we had to dig deeper into the code of Roslyn's libraries.

Conditional compilation symbols are parsed using the GetDefineConstantsSwitch method of the class Csc from the library Microsoft.Build.Tasks.CodeAnalysis. The parsing is done using the String.Split method on a number of separators:

string[] allIdentifiers 
  = originalDefineConstants.Split(new char[] { ',', ';', ' ' });

This parsing mechanism works perfectly; all the conditional compilation symbols are correctly extracted. Okay, let's keep digging.

The next key point was the call to the ComputePathToTool method of the ToolTask class. This method computes the path to the executable file (csc.exe) and checks that it exists; if it does, the method returns the path, otherwise it returns null.

The calling code:

....
string pathToTool = ComputePathToTool();
if (pathToTool == null)
{
    // An appropriate error should have been logged already.
    return false;
}
....

Since there is no csc.exe file (why'd we need it?), pathToTool is assigned the value null at this point, and the current method (ToolTask.Execute) returns false. The results of executing the task, including the extracted conditional compilation symbols, are ignored.

Okay, let's see what happens if we put the csc.exe file where it's expected to be.

Now pathToTool stores the actual path to the now-present file, and ToolTask.Execute keeps executing. The next key point is the call of the ManagedCompiler.ExecuteTool method:

protected override int ExecuteTool(string pathToTool, 
                                   string responseFileCommands, 
                                   string commandLineCommands)
{
  if (ProvideCommandLineArgs)
  {
    CommandLineArgs = GetArguments(commandLineCommands, responseFileCommands)
      .Select(arg => new TaskItem(arg)).ToArray();
  }

  if (SkipCompilerExecution)
  {
    return 0;
  }
  ....
}

The SkipCompilerExecution property is true (logically enough, since we are not compiling for real). The calling method (the already mentioned ToolTask.Execute) checks whether the return value of ExecuteTool is 0 and, if so, returns true. Whether your csc.exe is an actual compiler or "War and Peace" by Leo Tolstoy doesn't matter at all.

So, the problem has to do with the order in which the steps were defined:

  • check for compiler;
  • check if compiler should be launched;

And we'd expect the reverse order. It was to work around this that the compiler dummies were added.
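Schematically, the order we'd expect looks like this (a simplified sketch, not the actual MSBuild code):

// First decide whether the compiler needs to run at all...
if (SkipCompilerExecution)
{
  // ...and if not, keep the task results (command-line arguments,
  // extracted symbols, etc.) without requiring csc.exe to exist.
  return true;
}

// Only then insist that the compiler executable is actually present.
string pathToTool = ComputePathToTool();
if (pathToTool == null)
{
  return false;
}
....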

Okay, but how did we manage to get compilation symbols at all, with the csc.exe file absent (and the task results ignored)?

Well, there is a method for this case too: CSharpCommandLineParser.ParseConditionalCompilationSymbols from the library Microsoft.CodeAnalysis.CSharp. It, too, does parsing by calling the String.Split method on a number of separators:

string[] values 
  = value.Split(new char[] { ';', ',' } /*, 
                StringSplitOptions.RemoveEmptyEntries*/);

See how this set of separators is different from that handled by the Csc.GetDefineConstantsSwitch method? Here, a space isn't a separator. It means that conditional compilation symbols separated by spaces won't be parsed properly by this method.

That's what happened when we were checking the problem projects: they used space-separated conditional compilation symbols and, therefore, were successfully parsed by the GetDefineConstantsSwitch method but not the ParseConditionalCompilationSymbols method.
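Here's a quick illustration of the discrepancy for a space-separated DefineConstants value (the value itself is made up):

string defineConstants = "DEBUG TRACE MY_SYMBOL";

// Csc.GetDefineConstantsSwitch-style split: ',', ';' and ' ' are separators.
string[] fromCsc = defineConstants.Split(new char[] { ',', ';', ' ' });
// -> "DEBUG", "TRACE", "MY_SYMBOL"

// CSharpCommandLineParser.ParseConditionalCompilationSymbols-style split:
// only ';' and ',' are separators, so the whole string remains one "symbol".
string[] fromParser = defineConstants.Split(new char[] { ';', ',' });
// -> "DEBUG TRACE MY_SYMBOL"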

Another problem that showed up after updating the libraries was broken behavior in certain cases - specifically, on projects that didn't build. It affected the Microsoft.CodeAnalysis libraries and manifested itself as exceptions of all kinds: ArgumentNullException (failed initialization of some internal logger), NullReferenceException, and so on.

I'd like to tell you about one particular error that I found pretty interesting.

We encountered it when checking the new version of the Roslyn project: one of the libraries was throwing a NullReferenceException. Thanks to detailed information about its source, we quickly found the problem code and - just out of curiosity - decided to check whether the error would also show up when working in Visual Studio.

We did manage to reproduce it in Visual Studio (version 16.0.3). To do that, you need a class definition like this:

class C1<T1, T2>
{
  void foo()
  {
    T1 val = default;
    if (val is null)
    { }
  }
}

You'll also need Syntax Visualizer (it comes with the .NET Compiler Platform SDK). Look up the TypeSymbol (by clicking the "View TypeSymbol (if any)" menu item) of the syntax tree node of type ConstantPatternSyntax (null). Visual Studio will restart, and the exception info - specifically, the stack trace - will become available in Event Viewer:

Application: devenv.exe
Framework Version: v4.0.30319
Description: The process was terminated due to an unhandled exception.
Exception Info: System.NullReferenceException
   at Microsoft.CodeAnalysis.CSharp.ConversionsBase.
        ClassifyImplicitBuiltInConversionSlow(
          Microsoft.CodeAnalysis.CSharp.Symbols.TypeSymbol, 
          Microsoft.CodeAnalysis.CSharp.Symbols.TypeSymbol, 
          System.Collections.Generic.HashSet'1
            <Microsoft.CodeAnalysis.DiagnosticInfo> ByRef)
   at Microsoft.CodeAnalysis.CSharp.ConversionsBase.ClassifyBuiltInConversion(
        Microsoft.CodeAnalysis.CSharp.Symbols.TypeSymbol, 
        Microsoft.CodeAnalysis.CSharp.Symbols.TypeSymbol, 
        System.Collections.Generic.HashSet'1
          <Microsoft.CodeAnalysis.DiagnosticInfo> ByRef)
   at Microsoft.CodeAnalysis.CSharp.CSharpSemanticModel.GetTypeInfoForNode(
        Microsoft.CodeAnalysis.CSharp.BoundNode,
        Microsoft.CodeAnalysis.CSharp.BoundNode,
        Microsoft.CodeAnalysis.CSharp.BoundNode)
   at Microsoft.CodeAnalysis.CSharp.MemberSemanticModel.GetTypeInfoWorker(
        Microsoft.CodeAnalysis.CSharp.CSharpSyntaxNode,
        System.Threading.CancellationToken)
   at Microsoft.CodeAnalysis.CSharp.SyntaxTreeSemanticModel.GetTypeInfoWorker(
        Microsoft.CodeAnalysis.CSharp.CSharpSyntaxNode,
        System.Threading.CancellationToken)
   at Microsoft.CodeAnalysis.CSharp.CSharpSemanticModel.GetTypeInfo(
        Microsoft.CodeAnalysis.CSharp.Syntax.PatternSyntax, 
        System.Threading.CancellationToken)
   at Microsoft.CodeAnalysis.CSharp.CSharpSemanticModel.GetTypeInfoFromNode(
        Microsoft.CodeAnalysis.SyntaxNode, System.Threading.CancellationToken)
   at Microsoft.CodeAnalysis.CSharp.CSharpSemanticModel.GetTypeInfoCore(
        Microsoft.CodeAnalysis.SyntaxNode, System.Threading.CancellationToken)
....

As you can see, the problem is caused by a null reference dereference.

As I already mentioned, we encountered a similar problem when testing the analyzer. If you build it using debug libraries from Microsoft.CodeAnalysis, you can get right to the problem spot by looking up the TypeSymbol of the corresponding syntax tree node.

It will eventually take us to the ClassifyImplicitBuiltInConversionSlow method mentioned in the stack trace above:

private Conversion ClassifyImplicitBuiltInConversionSlow(
  TypeSymbol source,
  TypeSymbol destination,
  ref HashSet<DiagnosticInfo> useSiteDiagnostics)
{
  Debug.Assert((object)source != null);
  Debug.Assert((object)destination != null);

  if (source.SpecialType == SpecialType.System_Void ||
      destination.SpecialType == SpecialType.System_Void)
  {
    return Conversion.NoConversion;
  }

  Conversion conversion 
    = ClassifyStandardImplicitConversion(source, destination,
                                         ref useSiteDiagnostics);
  if (conversion.Exists)
  {
    return conversion;
  }

  return Conversion.NoConversion;
}

Here, the destination parameter is null, so calling destination.SpecialType throws a NullReferenceException. Yes, the dereference is preceded by a Debug.Assert, but it doesn't help: Debug.Assert calls are compiled only into the debug versions of the libraries, so in the release builds nothing protects against the null value - the assert simply lets you spot the problem when debugging. Or it doesn't.

Changes to the mechanism of evaluating C++ projects

There wasn't much interesting in this part: the existing algorithms didn't require any big modifications worth mentioning, but you may want to know about two minor issues.

The first was that we had to modify the algorithms that relied on the numerical value of ToolsVersion. Without going into details, there are certain cases when you need to compare toolsets and choose, say, the most recent version. The new version, naturally, has a larger value. We expected that ToolsVersion for the new MSBuild / Visual Studio would have the value 16.0. Yeah, sure! The table below shows how the values of different properties changed throughout Visual Studio's development history:

Visual Studio product name | Visual Studio version number | ToolsVersion | PlatformToolset version
Visual Studio 2010         | 10.0                         | 4.0          | 100
Visual Studio 2012         | 11.0                         | 4.0          | 110
Visual Studio 2013         | 12.0                         | 12.0         | 120
Visual Studio 2015         | 14.0                         | 14.0         | 140
Visual Studio 2017         | 15.0                         | 15.0         | 141
Visual Studio 2019         | 16.0                         | Current      | 142

I know that the joke about the messed-up version numbers of Windows and Xbox is an old one, but it shows that you can't make reliable predictions about the values (whether in the name or in the version) of future Microsoft products. :)

We solved that easily by adding toolset prioritization (i.e. singling priority out as a separate entity).
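Roughly, the idea looks like this (the type and the values below are illustrative assumptions, not our actual code):

class ToolsetDescriptor
{
  public string ToolsVersion { get; }  // e.g. "14.0", "15.0", "Current"
  public int Priority { get; }         // higher value = more recent / preferred

  public ToolsetDescriptor(string toolsVersion, int priority)
  {
    ToolsVersion = toolsVersion;
    Priority = priority;
  }
}

// Choosing the most recent toolset no longer depends on parsing
// ToolsVersion as a number, so "Current" can rank above "15.0".
static ToolsetDescriptor ChooseMostRecent(IEnumerable<ToolsetDescriptor> toolsets)
  => toolsets.OrderByDescending(t => t.Priority).First();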

The second issue involved problems when working in Visual Studio 2017 or a related environment (for instance, when the VisualStudioVersion environment variable is set). It occurs because computing the parameters needed to evaluate a C++ project is a much more difficult task than for a .NET project. For .NET, we use our own toolset and the corresponding ToolsVersion value. For C++, we can utilize both our own toolset and those provided by the system. Starting with Build Tools for Visual Studio 2017, toolsets are defined in the MSBuild.exe.config file instead of the registry. That's why we could no longer get them from the global list of toolsets (via Microsoft.Build.Evaluation.ProjectCollection.GlobalProjectCollection.Toolsets, for example), unlike those defined in the registry (i.e. for Visual Studio 2015 and earlier).

All this prevents us from evaluating a project with ToolsVersion 15.0, because the system won't see the required toolset. The most recent toolset, Current, is still available since it's our own, so there's no such problem in Visual Studio 2019. The solution was quite simple and let us fix this without changing the existing evaluation algorithms: we just had to add another toolset, 15.0, to the list of our own toolsets in addition to Current.
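In terms of the MSBuild API, the fix boils down to something like this (the paths are illustrative, and this isn't our exact registration code):

var collection = new ProjectCollection();
string ourToolsPath = @"C:\Program Files (x86)\PVS-Studio\MSBuild\Current\Bin";

// Register our toolset both as "Current" and as "15.0", so that projects
// asking for ToolsVersion 15.0 can still be evaluated with our files.
collection.AddToolset(new Toolset("Current", ourToolsPath, collection, null));
collection.AddToolset(new Toolset("15.0", ourToolsPath, collection, null));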

Changes to the mechanism of evaluating C# .NET Core projects

This task involved two interrelated issues:

  • adding the 'Current' toolset broke analysis of .NET Core projects in Visual Studio 2017;
  • analysis wouldn't work for .NET Core projects on systems without at least one copy of Visual Studio installed.

Both problems came from the same source: some of the base .targets / .props files were looked up at the wrong paths, which prevented us from evaluating a project with our toolset.

If you had no Visual Studio instance installed, you'd get the following error (with the previous toolset version, 15.0):

The imported project
"C:\Windows\Microsoft.NET\Framework64\
15.0\Microsoft.Common.props" was not found.

When evaluating a C# .NET Core project in Visual Studio 2017, you'd get the following error (with the current toolset version, Current):

The imported project 
"C:\Program Files (x86)\Microsoft Visual Studio\
2017\Community\MSBuild\Current\Microsoft.Common.props" was not found. 
....

Since the problems looked similar (and they did turn out to be), we could try to kill two birds with one stone.

In the next paragraphs, I'll explain how we accomplished that, without going into details. These details (about how C# .NET Core projects are evaluated as well as changes to the evaluation mechanism in our toolset) will be the topic of one of our future articles. By the way, if you were reading this article carefully, you probably noticed that this is the second reference to our future articles. :)

Now, how did we solve that problem? We extended our own toolset with the base .targets / .props files from the .NET Core SDK (Sdk.props, Sdk.targets). That gave us more control over the situation and more flexibility in managing imports and in evaluating .NET Core projects in general. Yes, our toolset got a bit larger again, and we also had to add logic for setting up the environment required for evaluating .NET Core projects, but it seems worth it.

Until then, we had evaluated .NET Core projects by simply requesting the evaluation and relying on MSBuild to do the job.

Now that we had more control over the situation, the mechanism changed a bit:

  • set up the environment required for evaluating .NET Core projects;
  • evaluation:
    • start evaluation using .targets / .props files from our toolset;
    • continue evaluation using external files.

This sequence suggests that setting up the environment pursues two main goals:

  • initiate evaluation using .targets / .props files from our toolset;
  • redirect all subsequent operations to external .targets / .props files.

A special library, Microsoft.DotNet.MSBuildSdkResolver, is used to look up the necessary .targets / .props files. To initiate the environment setup with files from our toolset, we utilized a special environment variable used by that library, so that we could point it at the location to import the necessary files from (i.e. our toolset). Since the library is included in our distribution, there's no risk of a sudden logic failure.

Now the Sdk files from our toolset are imported first, and since we can easily change them, we fully control the rest of the evaluation logic: we decide which files to import and from where. The same applies to the Microsoft.Common.props mentioned above. We import this and other base files from our toolset, so we don't have to worry about their existence or contents.

Once all the necessary imports are done and the properties are set, we pass control over the evaluation process to the actual .NET Core SDK, where all the remaining operations are performed.
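To give a rough idea of what this looks like from the MSBuild API side, here's a simplified sketch; the environment variable and the paths are illustrative assumptions rather than our exact setup:

// Point SDK resolution at the Sdk.props / Sdk.targets shipped with our toolset,
// so that evaluation starts from files we control.
Environment.SetEnvironmentVariable(
  "MSBuildSDKsPath",
  @"C:\Program Files (x86)\PVS-Studio\MSBuild\Sdks");

// Then evaluate the project as usual; our Sdk files import the external
// .targets / .props files and hand the rest over to the .NET Core SDK.
var collection = new ProjectCollection();
Project project = collection.LoadProject(@"C:\projects\App\App.csproj");
....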

Conclusion

Supporting Visual Studio 2019 was generally easier than supporting Visual Studio 2017, for a number of reasons. First, Microsoft didn't change as many things as they had when moving from Visual Studio 2015 to Visual Studio 2017. Yes, they did change the base toolset and forced Visual Studio plugins to switch to asynchronous load mode, but this change wasn't that drastic. Second, we already had a ready-made solution involving our own toolset and project evaluation mechanism, so we didn't have to work it all out from scratch - we only had to build on what we already had. The relatively painless process of supporting analysis of .NET Core projects under the new conditions (and on computers with no Visual Studio installed) by extending our project evaluation system also gives us hope that we made the right choice by taking some of the control into our own hands.

But I'd like to repeat the idea from the previous article: sometimes using ready-made solutions isn't as easy as it may seem.
