This article describes some observations about changes in the infrastructure of tools programmers use in their everyday work. These changes are related, first of all, to the release of Visual Studio 2010.
The set of tools developers use is constantly being renewed: brand-new tools appear, some become obsolete, and others stop being developed and are displaced by better alternatives. It is rather interesting to watch this process, and I decided to share some of my latest observations in this area with you.
I would like to note right away that I hold the view that the fewer different tools one uses, the better. I am ready to be criticized for this minimalism; my viewpoint is disputable, but it still has the right to exist.
Some functionality that was earlier available only in third-party tools is gradually moving into development environments, Visual Studio in particular. Yet in the minds of developers, this functionality is still associated only with those third-party tools. Consider, for example, the automated user interface testing system that appeared in Visual Studio 2010 Premium/Ultimate, which in many cases lets you do without tools such as AutomatedQA TestComplete or Borland SilkTest.
Please do not get me wrong. I am by no means urging you to abandon an existing test base and rush to the testing system integrated into Visual Studio 2010. TestComplete is one of the most powerful commercial products for software test automation. But if you use Visual Studio 2010 and have to decide which automated testing system to choose for a new project, I think you should not look very far. Unless you need specific testing features, you will not have to buy and deploy additional systems besides Visual Studio 2010.
We use the Visual Studio user interface testing system to test the PVS-Studio interface. We had concentrated more on testing internal units before, but as the interface component grew, we faced the task of moving from manual to automated testing. Our demands are rather modest, however, and we are quite happy with the Visual Studio testing system. Figures 1 and 2 show some of its windows at work.
Figure 1 - Recording user actions in Visual Studio
Figure 2 - The tree of controls in Visual Studio
The conclusion: it is useful to study the innovations in the toolkit you already use. If your testing demands are fairly standard, you may well find the functionality of Visual Studio sufficient when developing new projects. As a result, the number of entities (tools) you have to deal with will not grow - and that is always good.
The same goes for the coding assistant Visual Assist. I remember how pleasant it was to work with it back when we used Visual Studio 6.0. But many people who call this tool indispensable do so without knowing or noticing the contemporary capabilities of the latest Visual Studio versions. Most of the features I appreciated in Visual Assist have gradually been implemented in Visual Studio itself. Beginning with Visual Studio 2008, I realized I could do quite well without Visual Assist and stopped using it. With the release of Visual Studio 2010, Visual Assist became completely irrelevant for me.
I agree that Visual Assist has some functions that will never be included in Visual Studio, and I am sure somebody might find them very important or simply convenient and useful. There are plenty of people for whom Visual Assist is not becoming less important at all but ever more indispensable. Personally, however, I used very few of its features and did not need more, and those needs are now fully covered by the Visual Studio environment. Let me show you some examples using Visual Studio 2010.
First, there is syntax highlighting. Although it is not as colorful as in Visual Assist, it is still rather pleasant and sufficient for me. Together with the underlining of syntax errors, it is quite good (see Figure 3).
Figure 3 - Highlighting of code in Visual Studio 2010 and underlining of incorrect constructs
The system of function parameter prompting and name prompting by the first characters also works quite well (see Figures 4 and 5):
Figure 4 - Function parameter prompting in Visual Studio 2010
Figure 5 - Function name prompting in Visual Studio 2010
There is also a feature I genuinely missed without Visual Assist: file name prompting. Visual Studio 2010 now has it as well (see Figure 6).
Figure 6 - Integrated file name prompting in Visual Studio 2010
Visual Assist used to help me find my way even in very badly formatted code when I needed to see where brackets open and close. Visual Studio 2010 now provides this too, highlighting matching parentheses as shown in Figure 7.
Figure 7 - Highlighting of matching parentheses in Visual Studio 2010
I find the Visual Studio 2010 code editor quite satisfactory. Perhaps you will also look at it in a new way.
When static analysis of C++ code is mentioned, programmers often react along these lines: "Those are lint-like tools with a command-line interface, obsolete nowadays". Let us figure out where this idea came from. I will not speak about companies with mature development processes, where static analysis was used before, is used now, and will be used in the future. Most developers, however, work within immature processes. There is nothing to be ashamed of - it is a drawback of organizations, not of programmers. For them, a static analyzer is an exotic thing rather than an everyday tool integrated into the development process.
C is a language that demands much accuracy and attention from the programmer, because its own control over the actions performed in the code is very weak. Perhaps only assembler is more dangerous in this respect. That is why static code analysis tools appeared, lint being their best-known representative. This tool and similar ones were used rather widely, since there were no alternative means of detecting errors at the coding stage, and they were relevant for development processes of any maturity level.
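To illustrate, here is a hypothetical fragment (not taken from any real project) with two classic defects of the kind lint-like tools were created to flag: a format string that does not match its argument, and an assignment written instead of a comparison. An old C compiler accepted both silently.

/* A hypothetical fragment; both defects pass through an old C compiler silently. */
#include <stdio.h>

void print_size(long size)
{
  /* "%d" expects an int, but a long is passed - undefined behavior
     on platforms where int and long differ in size. */
  printf("Size: %d\n", size);
}

int is_ready(int status)
{
  /* '=' instead of '==': the condition is always false and 'status' is overwritten. */
  if (status = 0)
    return 1;
  return 0;
}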
The newer language, C++, became much safer due to stricter type control and other innovations. Compilers for C and C++ also started to generate warnings on many potentially dangerous constructs. They effectively took over the functions of the static analyzers of that time, and the latter became less popular. Many developers gave up the additional level of analysis provided by third-party tools at that point.
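For instance, a modern C++ compiler at a high warning level reports constructs like the following on its own, without any external analyzer (a hypothetical fragment; the exact warning numbers differ between compilers and versions):

#include <vector>

int sum_all(const std::vector<int> &values)
{
  int sum;                                  // uninitialized variable warning
  for (int i = 0; i < values.size(); ++i)   // signed/unsigned comparison warning
    sum += values[i];
  return sum;
}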
However, static analyzers have not become obsolete at all. They have learned to detect many kinds of errors related to object-oriented programming, to warn the programmer about incorrect use of libraries (for example, Qt), and even to find errors in parallel programs. The conclusion: static analyzers, as before, let the programmer greatly reduce costs at the testing and maintenance stages. Better still, nowadays these are usually not separate tools but modules that integrate into the development environment.
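Here is a small sketch of the "parallel" kind of defect I mean (a hypothetical OpenMP fragment): the shared variable is modified by several threads without synchronization, a race condition a contemporary analyzer can point out while the code is still being written.

// Compile with OpenMP enabled (e.g. /openmp in Visual C++).
double sum_array(const double *data, int n)
{
  double sum = 0.0;
  #pragma omp parallel for
  for (int i = 0; i < n; ++i)
    sum += data[i];   // data race: should use 'reduction(+ : sum)' or other synchronization
  return sum;
}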
I would like to point out that it is the very opinion that static analyzers are obsolete command-line tools that is out of date. Static analyzers are contemporary tools that greatly supplement the standard capabilities of the compiler and other means of improving software quality.
As an example, let us look at Visual Studio again. Beginning with Visual Studio 2005, the Team System edition includes a general-purpose static analysis subsystem called Code Analysis. Although it is implemented as an extension, it is tightly integrated into the environment, and working with its diagnostic warnings is the same as working with messages generated by the compiler (see Figure 8).
Figure 8 - Settings tab in Code Analysis in Visual Studio 2010
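To give an idea of what such analysis reports, here is a hypothetical fragment with a classic off-by-one error, the sort of out-of-bounds write that compiler-level code analysis is designed to flag:

// A hypothetical fragment with a classic off-by-one error:
// the loop writes one element past the end of the array.
void reset_table()
{
  int table[4];
  for (int i = 0; i <= 4; ++i)   // '<=' should be '<'
    table[i] = 0;                // out-of-bounds write on the last iteration
}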
There is another type of static analyzer: specialized ones. An example is the PVS-Studio analyzer we are developing, which also integrates tightly into Visual Studio (see Figure 9) and detects many errors in 64-bit and OpenMP programs.
Figure 9 - Integration of PVS-Studio into Visual Studio 2010
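As an illustration of the 64-bit class of problems, here is a hypothetical fragment (not taken from a real project) where a 32-bit loop counter is mixed with a memsize type. The code works perfectly in a 32-bit build, but on Win64 an array with more than UINT_MAX elements is processed incorrectly because the counter silently wraps around. Defects of exactly this kind are what a specialized analyzer looks for.

#include <cstddef>

void fill(float *array, size_t size)
{
  // 'i' is 32-bit while 'size' is a memsize type; when size > UINT_MAX
  // the loop never reaches 'size' correctly on a 64-bit platform.
  for (unsigned i = 0; i != size; ++i)
    array[i] = 1.0f;
}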
Contemporary static analyzers are user-friendly programs that can be used by professionals and beginners alike.
When a dynamic analyzer for finding memory errors is mentioned, the first tool everybody remembers is DevPartner BoundsChecker Suite. But I would like to cool the ardor of its supporters and of those who recommend it on forums. It certainly was a great and indispensable tool for a long time. Unfortunately, the project is no longer being developed and is quickly becoming obsolete. For example, BoundsChecker does not support Win64 applications: it can be launched in a 64-bit environment and check 32-bit applications, but it cannot work with 64-bit ones. Here is a quotation from the booklet: "DevPartner Studio supports 32-bit application development on 64-bit Windows (WOW 64)".
A lag like this is unacceptable for a testing tool. Fortunately, BoundsChecker and other dynamic analysis tools have been superseded by a new titan that is worth focusing on: Intel Parallel Inspector, included in Intel Parallel Studio.
Intel Parallel Studio integrates into Visual Studio (see Figure 10) and adds memory and thread checking. Memory checking in Intel Parallel Inspector covers memory leaks, pointers referring to deleted objects, and operations on uninitialized variables. Intel Parallel Inspector also detects incorrect references to memory blocks, monitors the stack, and so on. Thread checking covers race conditions and mutexes, with call stack analysis of adjustable depth.
Figure 10 - Setting the diagnosis level in Intel Parallel Inspector
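The following hypothetical fragment packs the defect classes listed above into a few lines: a memory leak, a read of an uninitialized variable, and a race condition on a shared counter. These are exactly the run-time problems a dynamic checker of this kind is meant to report.

#include <cstdio>

int main()
{
  int *leaked = new int[100];   // memory leak: never deleted
  leaked[0] = 1;

  int uninit;
  int copy = uninit;            // read of an uninitialized variable

  int counter = 0;
  #pragma omp parallel num_threads(2)   // compile with OpenMP enabled (/openmp)
  {
    for (int i = 0; i < 1000; ++i)
      ++counter;                // data race: unsynchronized increments from two threads
  }

  printf("%d %d\n", copy, counter);
  return 0;
}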
Best of all, you can analyze programs built with both Intel C++ and Visual C++, and both Win32 and Win64 applications are supported. Intel Parallel Studio is being steadily developed, it is not very expensive, and you can safely rely on it for the long term.
The infrastructure of programmer tools is constantly changing. You may both discover new, more convenient solutions and drop obsolete ones that are no longer developed. By the way, large companies even have dedicated employees (and sometimes whole departments) whose only job is to track the evolution of the tools used in the development process.