PVS‑Studio analyzer diagnostic logging
- Terms (used further in the text)
- When to use
- Quick start
- Enabling logging and its options
- How to read a structured log
- Privacy and security requirements
- What to send to technical support
- FAQ
Currently available only for the pvs-studio-analyzer utility.
Diagnostic logging is designed for quick collection of reproducible information about the analyzer run: actual parameters, environment, execution result, and details about which files were analyzed or skipped.
The result is saved as a structured log (JSON), which is convenient for reading, automated processing, and submitting to technical support.
If you encounter problems with the analyzer (crashes, abnormal termination, and other scenarios that prevent normal use of PVS-Studio), attach the obtained logs when contacting technical support. You can contact technical support via the form on the website.
Terms (used further in the text)
Structured log is a text file containing diagnostic information in JSON format. Suitable for machine processing.
Raw log is a text file with intermediate information from which the structured log is formed. Has the suffix *-raw.PVS-Studio.log. The file is not intended for self-troubleshooting.
Analyzer (frontend) is pvs-studio-analyzer (CompileCommandsAnalyzer).
Executor (runs the analysis; started by the analyzer) is pvs-studio.
When to use
It is recommended to enable diagnostic logging if:
- the analysis finished with an error (non-zero return code) or crashed;
- analysis settings were "not applied" or applied differently than expected;
- expected files were not included in the analysis;
- a user needs to confirm configuration details (for audit/certification/incident investigation);
- detailed information is required to contact support.
Quick start
1) Run the analysis with structured log generation enabled:
pvs-studio-analyzer analyze <your_usual_analysis_flags> \
--enable-logging ./pvs-diag.json
After execution, the following files will appear:
- structured log: ./pvs-diag.json;
- analyzer raw log (in case of a crash): ./*-raw.PVS-Studio.log.
If the structured log was not generated (due to any error), start the conversion manually:
pvs-diag-collector convert --input ./pvs-diag-raw.PVS-Studio.log \
--output ./pvs-diag.json
How it works
- Run the analyzer with logging enabled.
- During operation, diagnostic data is written to raw logs.
- When the analysis is completed, a structured JSON log is generated.
- If something goes wrong during JSON generation, raw logs remain on disk. They can be converted manually using the pvs-diag-collector utility or sent to support.
Enabling logging and its options
--enable-logging <file_path> is a mandatory flag that enables logging and saves the structured JSON log to the specified file.
Logging options
Options are set using the --logging-options <option1,option2,...> flag. Values are passed separated by commas, without spaces.
Correct version:
--logging-options skip-sensitive,dump-intermediate-files
Incorrect version:
--logging-options skip-sensitive, dump-intermediate-files
Available options:

| No. | Option | Description |
|---|---|---|
| 1 | `skip-sensitive` | Minimizes and partially masks sensitive data in logs: for example, environment variable values and some artifacts, depending on the analysis scenario. |
| 2 | `dump-intermediate-files` | Saves intermediate files after analysis: for example, preprocessed files (`*.PVS-Studio.i`) and run configuration files (`*.PVS-Studio.cfg`). |
| 3 | `embed-runs` | Adds extended information about executed runs to the structured log, if available in the collected data. |
Examples (bash):
pvs-studio-analyzer analyze <your_usual_flags> \
--enable-logging ./pvs-diag.json \
--logging-options skip-sensitive,dump-intermediate-files,embed-runs
Manual raw log conversion (pvs-diag-collector convert)
pvs-diag-collector is a utility for collecting/processing logs and related artifacts of PVS-Studio analyzer operation.
The convert command of the pvs-diag-collector utility converts the analyzer raw log (*-raw.PVS-Studio.log) into a structured one.
In a normal workflow, these actions are performed by the analyzer automatically.
Usage
pvs-diag-collector.exe convert [FILE] [-i <FILE>] [-o <FILE>]
[--embed-runs] [-j <NUM>]
Utility options:

| No. | Option | Description |
|---|---|---|
| 1 | `-i <FILE>`, `--input <FILE>` | Path to the input raw log. |
| 2 | `-o <FILE>`, `--output <FILE>` | Path to the output structured log. If the option is not specified, the result is printed to the console (stdout). |
| 3 | `--embed-runs` | Adds extended information about executed runs to the structured log, if available in the collected data. |
| 4 | `-j <NUM>` | Number of threads (the default is 1). When used without an argument or with the |
Examples
# Output to a file with detailed information about executor operations
pvs-diag-collector convert --input ./pvs-diag-raw.PVS-Studio.log \
--output ./pvs-diag.json \
--embed-runs
# Output to a file
pvs-diag-collector convert --input ./pvs-diag-raw.PVS-Studio.log \
--output ./pvs-diag.json
# Output to console
pvs-diag-collector convert --input ./pvs-diag-raw.PVS-Studio.log
Return codes
`0` is success; `1` is an error (the error text is also printed).
How to read a structured log
The main fields and hints for typical problems are described below. A complete description of all fields is in the JSON schema (see the section about viewing in the editor).
Main sections
It is usually useful to start with the following fields:
- `utility`: which utility generated the log, its version, and path;
- `environment`: OS, platform, environment variables;
- `input`: actual launch arguments, applied settings and configurations;
- `output`: return code, `stdout`/`stderr`, problem indicators;
- `basicPerformance`: time and memory;
- `runs`: which files were analyzed or skipped.
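For orientation, here is a minimal illustrative sketch of the top-level layout. The top-level field names come from the list above; the subfield names under `utility` and the sample values are assumptions, and real logs contain many more fields:

```json
{
  "$schema": "…",
  "utility": { "name": "pvs-studio-analyzer", "version": "…", "path": "…" },
  "environment": { "…": "…" },
  "input": { "arguments": ["analyze", "--enable-logging", "./pvs-diag.json"] },
  "output": { "returnCode": 0, "stdout": "…", "stderr": "…", "anomalies": [] },
  "basicPerformance": { "elapsedTime": "…", "peakMemoryConsumption": "…" },
  "runs": { "executed": ["…"], "skipped": ["…"] }
}
```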
Anomalies
During conversion of raw analyzer logs, "anomalies" may be detected: hints about potential problems, such as output in stderr, unsuccessful statuses, version mismatches, etc.
They can be found in output.anomalies.
An anomaly is not always an error, but almost always a useful signal for checking and contacting technical support.
Typical use cases
This section lists typical logging use cases and the log fields to pay attention to.
Analysis finished with an error / crashed:
- `output.returnCode`;
- `output.stderr` and `output.stdout`;
- `output.anomalies` (if present).

File was not included in the analysis (skipped):
- `runs.skipped[]`;
- `runs.skipped[].sourceFilePath`;
- `runs.skipped[].skipReason`;
- path/exclusion settings in `input`.

Need to confirm the actual launch configuration:
- `input.arguments`;
- configuration files/settings in `input`;
- (if available) `runs.executed[].stages.analysis.details` sections of the final configuration in executor logs.

Performance issue:
- `basicPerformance.elapsedTime`;
- `basicPerformance.peakMemoryConsumption`.
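These checks are easy to script. The sketch below pulls out the fields listed above from a hypothetical log excerpt; the `summarize` helper and the sample data are ours, not part of the toolchain:

```python
import json

# Hypothetical excerpt of a structured log; real logs contain many more fields.
sample = json.loads("""
{
  "output": {"returnCode": 0, "stderr": "", "anomalies": []},
  "runs": {
    "skipped": [
      {"sourceFilePath": "/src/generated.c", "skipReason": "excluded by path settings"}
    ]
  }
}
""")

def summarize(log):
    """Collect the fields this document suggests checking first."""
    out = log.get("output", {})
    skipped = log.get("runs", {}).get("skipped", [])
    return {
        "returnCode": out.get("returnCode"),
        "anomalies": out.get("anomalies", []),
        "skipped": {s["sourceFilePath"]: s["skipReason"] for s in skipped},
    }

report = summarize(sample)
print(report["returnCode"])   # return code of the analyzer run
print(report["skipped"])      # which files were skipped, and why
```

The same pattern works on a real log: replace the inline sample with `json.load(open(path))`.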
Viewing the log in an editor with field hints
The structured log contains a $schema field. If the editor can get the JSON schema, it will:
- show field descriptions on hover;
- suggest valid values;
- highlight type and structure errors.
Recommended scenario
- Open the JSON log in a code editor (for example, Microsoft Visual Studio Code).
- Ensure the value of `$schema` is accessible in the corporate network/internet, according to your policy. If the file at the specified link is unavailable, change the link to the same file in the analyzer distribution.
- Use the editor hints to navigate fields and check the structure correctness.
- If access to `$schema` is restricted by corporate policy, it usually helps to publish the schemas on an internal artifact server (intranet) or install the schemas locally with a mapping in VS Code settings (the `json.schemas` parameter). You can also change the link to the same file in the analyzer distribution.
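For the local-mapping route, a hypothetical VS Code `settings.json` fragment might look like this; the `fileMatch` pattern and the schema path are placeholders for your setup, not fixed names:

```json
{
  "json.schemas": [
    {
      "fileMatch": ["**/pvs-diag*.json"],
      "url": "./schemas/frontend-diagnostic-log.schema.json"
    }
  ]
}
```

With this mapping in place, the editor applies the local schema even when the link in the `$schema` field is unreachable.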
JSON schemas for structured logs of utilities from the PVS-Studio distribution
Links to these files are written into the $schema field of each structured log. If for some reason you cannot access them locally, use the links below:
Privacy and security requirements
Diagnostic logs may contain confidential information:
- paths to source code and build artifacts;
- compilation/preprocessing parameters;
- environment variables;
- parts of output from third-party tools.
If passing such information is critical, use --logging-options skip-sensitive, which minimizes and partially masks sensitive data in logs. Also follow these steps:
- before sending logs, perform an internal check: search for tokens/secrets/URLs/project names;
- if necessary, coordinate the transfer channel with the organization security policy.
What to send to technical support
Recommended set:
- structured log (`*.json`) specified in `--enable-logging`;
- additionally, for the C/C++ analyzer:
  - preprocessed files (`*.PVS-Studio.i`) related to the problematic file (they appear after specifying the `dump-intermediate-files` logging option);
  - run configuration files (`*.PVS-Studio.cfg`) related to the problematic file (they appear after specifying the `dump-intermediate-files` logging option);
  - stacktrace files (`*.PVS-Studio.stacktrace.txt`);
  - any additional files that clarify the problem (header files, source code files).
FAQ
Where are the logs located?
Analyzer structured log
Saved exactly at the path a user specified in --enable-logging.
Executor structured log
Not created separately; embedded into the analyzer structured log when the embed-runs option is specified.
Analyzer raw log
Saved in the same directory as the file from --enable-logging, but with the suffix *-raw.PVS-Studio.log.
Executor raw log
Executor raw logs are saved next to the analyzed source files. Their common feature is that the filename ends with *-raw.PVS-Studio.log.
How to find all raw logs in a project
Linux/macOS (bash):
find <project_folder> -name '*-raw.PVS-Studio.log'
Windows PowerShell:
Get-ChildItem -Path <project_folder> -Recurse -Filter "*-raw.PVS-Studio.log"
Can I view the contents of log files?
Yes, all files are text files. You can open any of them and view the contents.
I don't see the structured JSON log. What should I do?
Find the analyzer raw log next to the --enable-logging path.
Run the conversion manually:
pvs-diag-collector convert --input <path_to_raw_log> --output ./pvs-diag.json
If the conversion fails, send the raw log to support.
Why do raw logs remain after analysis?
By default, temporary files and raw logs are deleted when the analysis is completed. Raw logs are saved (not deleted) if at least one of the following conditions is met:
- the `dump-intermediate-files` logging option (`--logging-options dump-intermediate-files`) is enabled;
- the analyzer terminated abnormally.
Practical outcome: even in case of a crash, a user still has raw logs that can be converted or sent to support in their original form.
Can the log be used in automation (CI/scripts)?
Yes. The structured log is a valid JSON, suitable for parsing.
If you need to integrate conversion into a pipeline, use pvs-diag-collector convert and check the return code (0/1).
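A minimal sketch of such a pipeline gate, assuming the field layout described earlier (the `gate` helper is ours, not part of the distribution):

```python
import json
import sys

def gate(path, fail_on_anomalies=True):
    """Exit-code style check of a structured log for a CI step.

    Returns 0 when the analysis succeeded (and, optionally, produced
    no anomalies), 1 otherwise. Field names follow the layout
    described in this document.
    """
    with open(path, encoding="utf-8") as f:
        log = json.load(f)
    out = log.get("output", {})
    if out.get("returnCode", 1) != 0:
        return 1
    if fail_on_anomalies and out.get("anomalies"):
        return 1
    return 0

if __name__ == "__main__" and len(sys.argv) > 1:
    sys.exit(gate(sys.argv[1]))
```

Run it after `pvs-diag-collector convert` so that the pipeline step fails both on conversion errors (return code 1) and on problems recorded inside the log itself.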
How to study the log most conveniently?
Open the JSON in VS Code: the `$schema` field enables hints for fields and valid values. This speeds up navigation and reduces the risk of misinterpretation.