PVS-Studio is a tool for detecting bugs and potential vulnerabilities in the source code of programs written in C, C++, C#, or Java; it also works as a Static Application Security Testing (SAST) tool. It is meant to be used as part of the CI practice and helps detect bugs at the earliest development stages, where they cost almost nothing to fix.
As software projects develop, they grow in size, and their complexity grows faster than linearly. This explains why error density grows along with the codebase. One way to make up for the growing complexity is to use static code analysis tools.
A static analyzer is a software tool that performs preliminary code review and points out code fragments that are very likely to contain errors. This allows developers to fix most bugs at the earliest development stage, where they are cheapest to fix.
Static analysis does not replace – but rather complements – other bug-detecting practices such as code review, unit testing, dynamic analysis, regression testing, manual testing, and so on.
Take code review, for example. A much better scenario is to have a static analyzer find the most trivial bugs for you, so that you can focus on more useful high-level review of the algorithm rather than on scrutinizing comparison functions. All the more so since, as our experience shows, the human eye is bad at noticing many kinds of bugs, and they are very likely to be overlooked during code review.
We recommend choosing PVS-Studio, a static code analyzer developed by our team. It runs on 64-bit Windows, Linux, and macOS systems and can check the source code of programs for 32-bit, 64-bit, and embedded ARM platforms.
As of this writing, the analyzer supports the following languages and compilers:
The analyzer comes with detailed documentation in English and Russian. The descriptions of diagnostic rules include examples of correct and incorrect code, as well as links to code snippets from real open-source programs.
If you plan to use PVS-Studio as a SAST tool, note that its diagnostics are mapped to the Common Weakness Enumeration, the SEI CERT Coding Standards, and the MISRA standards. Here are the mapping tables of PVS-Studio diagnostics to the different standards:
The analyzer can be used both as a standalone tool and as a plugin for Visual Studio and IntelliJ IDEA. Some of our customers have also been using PVS-Studio as part of SonarQube lately. When used as a plugin for SonarQube, the analyzer provides additional diagnostic messages.
We have developed a number of scenarios for using PVS-Studio with CI systems. Covering all of them is outside the scope of this article, so please refer to the documentation. Here are just a few links to give you the general idea:
PVS-Studio effectively detects a wide range of flaws from typos to memory leaks. This is possible thanks to the dataflow analysis, symbolic execution, pattern matching, and method annotation (including automated annotation). To learn more about the working principles behind the analyzer, see the article "Technologies used in the PVS-Studio code analyzer for finding bugs and potential vulnerabilities".
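To give a feel for what dataflow analysis looks for, here is a hypothetical C++ fragment (made up for illustration, not taken from any real diagnostic description): the null check teaches the analyzer that the pointer may be null, which makes the unconditional dereference below it suspicious.

```cpp
#include <cstddef>
#include <cstdio>
#include <cstring>

// Hypothetical defect of the kind dataflow analysis catches: the null
// check tells the analyzer that 'name' may be nullptr, yet the pointer
// is then dereferenced on every path.
size_t badNameLength(const char *name) {
    if (name == nullptr)
        std::puts("warning: no name given");
    return std::strlen(name); // dereferenced even when name == nullptr
}

// Fixed version: the nullptr path returns early.
size_t nameLength(const char *name) {
    if (name == nullptr)
        return 0;
    return std::strlen(name);
}
```

A human reviewer may well skim past the missing `return`; for an analyzer that tracks the possible values of `name` through both branches, the contradiction is mechanical to spot.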
Integrating PVS-Studio into your development process will make many of the bugs cheaper to fix, thus helping to save time, which you will be able to invest into implementing a new feature or carrying out more thorough high-level testing.
If used regularly, the analyzer will eventually help you enhance your code's quality, thus making it easier to maintain. Regular bug fixing and the practice of writing high-quality code will make it less susceptible to Zero-day vulnerabilities. This subject is discussed in more detail in the article "How can PVS-Studio help in the detection of vulnerabilities?".
PVS-Studio is most cost-effective when used by teams of five members and more. The ROI estimate is given in the article "PVS-Studio ROI".
Integrating PVS-Studio into projects developed by a couple of enthusiasts would probably be impractical, but even small projects can benefit from it – all the more so since we provide free licensing options for students, open-source developers, and so on.
New customers typically purchase a one-year license. By the time it expires, they are usually happy with the analyzer's capabilities and user support service and renew the license for two or three years, which is much cheaper than renewing one year at a time. You can request the prices and ask for advice on licensing here.
Become our clients and let PVS-Studio make your development process more mature, bug-fixing cheaper, and your code better.
For our part, we'll provide you with fast and competent support. Your questions are addressed directly by the programmers who develop the modules in question, which guarantees an answer even in the most complicated situations. Here's one such example: "False positives in PVS-Studio: how deep the rabbit hole goes".
Programmers are sometimes negative about the idea of including static code analysis in their development process and criticize the static analysis method in general or PVS-Studio in particular. Once you start digging deeper, it turns out their criticism is unfounded and is simply the product of their reluctance to change anything in the established development process. Let's look at the typical arguments they give for keeping things as they are and see what's wrong with them.
Out of context, the statement "static analysis will be taking up some amount of your working time" is true. It does take time to regularly examine the warnings output by the analyzer for newly written or modified code. But that idea needs to be continued: "but it will be taking up much less time than other bug-detecting methods do."
Why do people believe that examining a static analyzer's report is time-consuming?
Programmers who are not yet familiar with the code analysis method confuse one-time test runs with regular use. When run for the first few times, any analyzer will output a huge list of warnings with a high rate of false positives. This happens because the tool hasn't been customized yet. With the settings tweaked to meet your exact needs, you won't see many false positives if you run the analyzer regularly. In other words, with regular use, most of the analyzer's warnings will point to genuine defects or code smells. The only upfront cost is that tweaking.
The article "Handling objections: static analysis will take up part of working time" elaborates on the subject.
Again, this statement is true only when you haven't properly customized the tool. Once you've tweaked PVS-Studio's settings as needed, you can expect the false-positive rate to drop to 10–20%. That is, out of every five warnings, four will point to genuine bugs or to code that is very likely to become a source of bugs in the future. The article "Characteristics of PVS-Studio analyzer by the example of EFL Core Libraries, 10-15% of false positives" shows an example of analyzer customization.
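One concrete knob in that tweaking is in-place suppression. PVS-Studio's documented trailing comment `//-Vnnn` silences a single diagnostic on a single line; the function below is a made-up example where the duplication that diagnostic V501 (identical sub-expressions around an operator) looks for is intentional.

```cpp
#include <cmath>

// Hypothetical example of suppressing one warning in place. 'x != x'
// has identical operands, the pattern V501 warns about, but here it is
// the classic intentional NaN test (NaN compares unequal to itself),
// so the documented "//-V501" comment silences just this line.
bool isNaN(double x) {
    return x != x; //-V501
}
```

Suppressions like this are visible in code review, so marking a warning as irrelevant stays an explicit, auditable decision rather than a global setting.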
Another source of misconceptions is the temptation to turn on as many diagnostics as possible without knowing their exact purpose. For instance, if you turn on the MISRA rule set, which was designed for embedded systems, to check a classic Windows application, the analyzer will generate hundreds of thousands of warnings, none of which will be of any use to you. Irrelevant diagnostics are especially harmful when you are only getting started with the tool, since you may get the wrong impression of its diagnostic capabilities. The article "How to quickly check out interesting warnings given by the PVS-Studio analyzer for C and C++ code?" will help you avoid disappointment.
This concern is vividly illustrated by the following comment:
"Unfortunately, static analyzers themselves are nothing more than toys. It's a hell of a job trying to make them part of your routine work process, and it requires assigning some of the staff to examine and filter the analysis results. Any attempt to place this burden on ordinary developers is usually to no avail."
It's not that horrible. There are at least three practices to smoothly integrate static analysis even into large old projects.
Practice 1. "Ratcheting", which is nicely explained by Ivan Ponomarev in his article "Introduce static analysis in the process, don't just search for bugs with it".
Practice 2. To help our users get started quickly, we recommend using a "suppression base". In a nutshell, the idea is this: you run the analyzer and get a pile of warnings. Since the project has been in development for many years and is still alive, evolving, and profitable, you are not likely to get many warnings pointing at critical defects. In other words, most of the critical bugs have already been fixed using other, more expensive, means or in response to user feedback. In that case, the bugs found during the first check can be viewed as technical debt, which it would be unreasonable to rush to fix immediately.
You can tell PVS-Studio to treat these warnings as irrelevant (thus putting off paying down the technical debt until later) and not show them again. The analyzer will create a special file storing the information about the currently irrelevant bugs and will output warnings only for freshly written or modified code. The mechanism is pretty smart: for instance, if you add an empty line at the beginning of some .cpp file, the analyzer will understand that this line doesn't make any difference and keep silent. The suppression file can be version-controlled. It's large, but that doesn't matter, because you won't need to update it often.
After that, every programmer on your team will be getting only the warnings triggered by freshly written or modified code. From the next day on, you will be able to use the analyzer as part of your routine work. As for the technical debt, you'll be able to get to it later and gradually fix the bugs and adjust the analyzer's settings as needed.
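On Linux, a sketch of this workflow might look roughly like the following (the command names come from the PVS-Studio documentation; paths, flags, and the output file names are placeholders for your project):

```shell
# First full run: collect all current warnings into a log.
pvs-studio-analyzer analyze -o PVS-Studio.log -j8

# Move every existing warning into the suppression base, treating it
# as technical debt to revisit later.
pvs-studio-analyzer suppress PVS-Studio.log

# From now on, repeated runs report only warnings for new or modified
# code; convert the log into a readable report.
pvs-studio-analyzer analyze -o PVS-Studio.log -j8
plog-converter -a GA:1,2 -t errorfile PVS-Studio.log -o report.txt
```

The suppression file produced by the middle step is the artifact you put under version control, as described above.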
Practice 3. You can delegate the task of setting up and integrating PVS-Studio to our team by drawing up a contract with us. One example of this practice is described in the article "How the PVS-Studio team improved Unreal Engine's code".
This scenario is quite possible, but it still doesn't mean the analyzer won't be of any use. The problem is that the bugs have already been found and fixed using other, more expensive, means. It's like feeding a text that has already been checked by a bunch of proofreaders to Microsoft Word to see if its built-in spell check can find anything. It would find just a few mistakes, if any at all, but that doesn't mean Word's spell check is useless when writing new texts.
This subject is discussed in more detail in the article "Philosophy of static code analysis: we have 100 developers, the analyzer found few bugs, is analyzer useless?".
What this argument really says is that the person doesn't want to change anything: after all, their team has been growing and hiring new programmers and testers for some time, yet that hasn't helped them achieve a more mature development process. That said, let's still address this argument in detail.
First, hiring another person to search for bugs is much more expensive than buying a static analyzer. Just calculate the new employee's annual payroll and add the taxes and the expense of setting up a new workspace. Against those figures, the argument that a static analyzer is too expensive doesn't hold up at all. Besides, a static analyzer, unlike a human, won't take a vacation, go on sick leave, or leave the company altogether. For a large team of, say, 100 people, you would have to hire not one but several new employees to achieve any noticeable result. In that case, buying a static analyzer becomes an even more favorable solution.
Second, the best result is achieved through the synergy between various bug-detecting techniques used in combination. Some bugs are better diagnosed through unit testing, others through manual testing, and so on. Imagine having 10 programmers working on a project, with lots of unit tests but not a single tester. The users aren't satisfied with the project's quality, so it occurs to you to hire a tester, but you don't do so because "we'd better hire an additional programmer, let there be even more unit tests!" That can't be called a wise decision, can it? In this scenario, the QA process is obviously one-legged and would only gain by adding manual testing. The same is true for static analysis.
Some bugs are better diagnosed by static analyzers, others by dynamic analyzers. These types of tools complement each other, so you don't have to choose only one.
For example, dynamic analyzers can't detect unreachable code and many of the bugs caused by typos. Some of the types of bugs dynamic analysis has a difficult time finding are described in the article "Checking the code of Valgrind dynamic analyzer by a static analyzer".
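To make this concrete, here is a hypothetical copy-paste typo of the kind meant here (invented for illustration): the broken comparison is always true, so the program never crashes or corrupts memory, and a dynamic analyzer executing it sees nothing wrong, while a pattern-matching check flags the identical operands immediately.

```cpp
struct Point { int x, y; };

// Hypothetical copy-paste typo: 'a.y == a.y' was meant to be
// 'a.y == b.y'. The sub-expression is always true, so the function
// silently ignores the y coordinate; nothing misbehaves at runtime.
bool samePoint(const Point &a, const Point &b) {
    return a.x == b.x && a.y == a.y; // bug: compares a.y with itself
}

// Fixed version.
bool samePointFixed(const Point &a, const Point &b) {
    return a.x == b.x && a.y == b.y;
}
```

No test input makes the buggy version fail loudly, which is exactly why such defects slip past dynamic analysis and testing.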
If you were to choose between writing unit tests and using static analysis, I'd say the tests are more important and valuable. But you don't have to choose; you should use both unit testing and static analysis. These techniques work very well together.
Here are the arguments for using static analysis along with unit testing:
Sure, compilers are evolving and acquiring new warnings, which can detect bugs. But you can't expect much of compilers in comparison with professional proprietary solutions such as PVS-Studio.
Reasons to go for PVS-Studio:
The first two reasons are already enough to tip the scale toward choosing PVS-Studio, but let's talk about the diagnostics too. We are constantly improving our product to stay ahead of other vendors. For instance, our tool can detect an interesting bug described in the article "February 31".
Being aware that all of the above is not enough to make skeptics change their minds, we check compilers every now and then to show that they, too, have bugs, which PVS-Studio can detect:
If you still doubt whether you should use PVS-Studio, just look at this list of the bugs it has found in various projects.
Date: Apr 27 2023
Author: Andrey Karpov