Yo, Ho, Ho, And a Bottle of Rum - Or How We Analyzed Storm Engine's Bugs

Jun 25 2021

PVS-Studio is a static analysis tool that helps find errors in software source code. This time PVS-Studio looked for bugs in Storm Engine's source code.

Storm Engine

Storm Engine is a game engine that Akella has been developing since January 2000 for the Sea Dogs game series. The engine went open-source on March 26th, 2021. The source code is written in C++ and is available on GitHub under the GPLv3 license.

In total, PVS-Studio issued 235 high-level warnings and 794 medium-level warnings. Many of these warnings point to bugs that may cause undefined behavior. Other warnings reveal logical errors - the program runs, but the result of its execution may not be what was expected.

Examining each of the 1029 warnings PVS-Studio issued - especially those that involve the project's architecture - would take up an entire book that would be hard both to write and to read. In this article, I'll review the more obvious, surface-level errors that do not require delving deep into the project's source code.

Detected Errors

Redundant Checks

PVS-Studio warns: V547 Expression 'nStringCode >= 0xffffff' is always false. dstring_codec.h 84

#define DHASH_SINGLESYM 255
....
uint32_t Convert(const char *pString, ....)
{
  uint32_t nStringCode;
  ....
  nStringCode = ((((unsigned char)pString[0]) << 8) & 0xffffff00) |
                  (DHASH_SINGLESYM);
  ....
  if (nStringCode >= 0xffffff)
  {
    __debugbreak();
  }
  return nStringCode;
}

Let's evaluate the expression assigned to the nStringCode variable. The unsigned char type takes values in the range [0, 255]. Consequently, (unsigned char)pString[0] is always less than 2^8. After shifting the result left by 8, we get a number that does not exceed 2^16. The '&' operator cannot increase this value. The bitwise OR with DHASH_SINGLESYM then adds at most 255. As a result, the nStringCode variable's value never exceeds 2^16 + 256 and, therefore, is always less than 0xffffff = 2^24 - 1. Thus, the check is always false and serves no purpose. At first glance, it would seem that we can safely remove it:

#define DHASH_SINGLESYM 255
....
uint32_t Convert(const char *pString, ....)
{
  uint32_t nStringCode;
  ....
  nStringCode = ((((unsigned char)pString[0]) << 8) & 0xffffff00) |
                (DHASH_SINGLESYM);
....
  return nStringCode;
}

But let's not rush into anything. Obviously, the check is here for a reason. The developers may have expected the expression or the DHASH_SINGLESYM constant to change in the future. This example demonstrates a case when the analyzer is technically correct, but the code fragment that triggered the warning might not require fixing.
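
If that really is the intent, the assumption can be captured at compile time instead of keeping a dead runtime branch. Below is a minimal sketch - not taken from the Storm Engine sources, with the packing expression reproduced for the largest possible byte value - that makes the assumption explicit:

// A sketch, not engine code: if the runtime check only guards against future
// changes to DHASH_SINGLESYM or the packing expression, a compile-time
// assertion documents that assumption without generating any code.
#define DHASH_SINGLESYM 255

static_assert((((0xFFu << 8) & 0xffffff00u) | DHASH_SINGLESYM) < 0xffffffu,
              "Convert() result must always stay below 0xffffff");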

PVS-Studio warns: V560 A part of conditional expression is always true: 0x00 <= c. utf8.h 187

inline bool IsValidUtf8(....)
{
  int c, i, ix, n, j;
  for (i = 0, ix = str.length(); i < ix; i++)
  {
    c = (unsigned char)str[i];
    if (0x00 <= c && c <= 0x7f)
      n = 0;
    ...
  }
  ...
}

Although c is declared as int, it is assigned a value cast to unsigned char, so it is always non-negative. The 0x00 <= c check can therefore be removed as unnecessary. The fixed code:

inline bool IsValidUtf8(....)
{
  int c, i, ix, n, j;
  for (i = 0, ix = str.length(); i < ix; i++)
  {
    c = (unsigned char)str[i];
    if (c <= 0x7f)
      n = 0;
    ...
  }
  ...
}

Reaching Outside Array Bounds

PVS-Studio warns: V557 Array overrun is possible. The value of 'TempLong2 - TempLong1 + 1' index could reach 520. internal_functions.cpp 1131

DATA *COMPILER::BC_CallIntFunction(....)
{
  if (TempLong2 - TempLong1 >= sizeof(Message_string))
  {
    SetError("internal: buffer too small");
    pV = SStack.Push();
    pV->Set("");
    pVResult = pV;
    return pV;
  }
  memcpy(Message_string, pChar + TempLong1, 
         TempLong2 - TempLong1 + 1);
  Message_string[TempLong2 - TempLong1 + 1] = 0;
  pV = SStack.Push();
}

Here the analyzer helped find an off-by-one error.

The function above first makes sure that the TempLong2 - TempLong1 value is less than the size of Message_string. Then it writes 0 to the Message_string[TempLong2 - TempLong1 + 1] element. Note that if TempLong2 - TempLong1 + 1 == sizeof(Message_string), the check passes and the internal error is not reported. However, the Message_string[TempLong2 - TempLong1 + 1] element is then out of bounds. Assigning a value to it writes past the end of the array, which causes undefined behavior. You can fix the check as follows:

DATA *COMPILER::BC_CallIntFunction(....)
{
  if (TempLong2 - TempLong1 + 1 >= sizeof(Message_string))
  {
    SetError("internal: buffer too small");
    pV = SStack.Push();
    pV->Set("");
    pVResult = pV;
    return pV;
  }
  memcpy(Message_string, pChar + TempLong1, 
         TempLong2 - TempLong1 + 1);
  Message_string[TempLong2 - TempLong1 + 1] = 0;
  pV = SStack.Push();
}
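
To make the boundary case concrete, here is a small self-contained sketch - hypothetical buffer and variable names, not engine code - showing why the inclusive length needs the stricter check:

// The copy needs count bytes plus one byte for the terminator, so any
// count >= sizeof(buf) must be rejected before memcpy.
#include <cstddef>
#include <cstdio>
#include <cstring>

int main()
{
    char buf[4];
    const char *src = "abcdef";
    std::size_t first = 0, last = 3;     // inclusive range: 4 characters
    std::size_t count = last - first + 1;

    if (count >= sizeof(buf))            // the corrected check rejects this
    {
        std::puts("internal: buffer too small");
        return 1;
    }
    std::memcpy(buf, src + first, count);
    buf[count] = '\0';                   // now always inside the array
    std::printf("%s\n", buf);
    return 0;
}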

Assigning a Variable to Itself

PVS-Studio warns: V570 The 'Data_num' variable is assigned to itself. s_stack.cpp 36

uint32_t Data_num;
....
DATA *S_STACK::Push(....)
{
  if (Data_num > 1000)
  {
    Data_num = Data_num;
  }
  ...
}

Someone may have written this code for debugging purposes and then forgotten to remove it. Instead of a new value, the Data_num variable receives its own value. It is difficult to say what the developer intended to assign here. Perhaps Data_num should have received a value from a different variable with a similar name, and the names got mixed up. Alternatively, the developer may have intended to clamp Data_num to the 1000 constant but made a typo. In any case, there's a mistake here that needs to be fixed.

Dereferencing a Null Pointer

PVS-Studio warns: V595 The 'rs' pointer was utilized before it was verified against nullptr. Check lines: 163, 164. Fader.cpp 163

uint64_t Fader::ProcessMessage(....)
{
  ....
  textureID = rs->TextureCreate(_name);
  if (rs)
  {
    rs->SetProgressImage(_name);
    ....
}

In the code above, the rs pointer is dereferenced first and only then checked against nullptr. If the pointer can be nullptr, dereferencing it causes undefined behavior. If this scenario is possible, the check must be placed before the first dereference:

uint64_t Fader::ProcessMessage(....)
{
  ....
  if (rs)
  {
    textureID = rs->TextureCreate(_name);
    rs->SetProgressImage(_name);
    ....
}

If the scenario guarantees that rs != nullptr is always true, then you can remove the unnecessary if (rs) check:

uint64_t Fader::ProcessMessage(....)
{
  ....
  textureID = rs->TextureCreate(_name);
  rs->SetProgressImage(_name);
  ....
}

There's also a third possible scenario. Someone could have intended to check the textureID variable.

Overall, I encountered 14 V595 warnings in the project.

If you are curious, you can download and run PVS-Studio, analyze the project, and review these warnings yourself. Here I'll limit myself to one more example:

PVS-Studio warns: V595 The 'pACh' pointer was utilized before it was verified against nullptr. Check lines: 1214, 1215. sail.cpp 1214

void SAIL::SetAllSails(int groupNum)
{
  ....
  SetSailTextures(groupNum, core.Event("GetSailTextureData", 
                 "l", pACh->GetAttributeAsDword("index",  -1)));
  if (pACh != nullptr){
  ....
}

When calculating the Event method's arguments, the author dereferences the pACh pointer. Then, on the next line, pACh is checked against nullptr. If the pointer can be null, the if statement that checks pACh for nullptr must come before the SetSailTextures call that dereferences the pointer.

void SAIL::SetAllSails(int groupNum)
{
  ....
  if (pACh != nullptr){
    SetSailTextures(groupNum, core.Event("GetSailTextureData", 
                    "l", pACh->GetAttributeAsDword("index",  -1)));
  ....
}

If pACh can never be null, you can remove the check:

void SAIL::SetAllSails(int groupNum)
{
  ....
  SetSailTextures(groupNum, core.Event("GetSailTextureData", 
                  "l", pACh->GetAttributeAsDword("index",  -1)));
  ....
}

new[] – delete Error

PVS-Studio warns: V611 The memory was allocated using 'new T[]' operator but was released using the 'delete' operator. Consider inspecting this code. It's probably better to use 'delete [] pVSea;'. Check lines: 169, 191. SEA.cpp 169

struct CVECTOR
{
  public:
    union {
      struct
      {
        float x, y, z;
      };
      float v[3];
    };
};
....
struct SeaVertex
{
  CVECTOR vPos;
  CVECTOR vNormal;
  float tu, tv;
};
....
#define STORM_DELETE(x) \
  { delete x; x = 0; }

void SEA::SFLB_CreateBuffers()
{
  ....
  pVSea = new SeaVertex[NUM_VERTEXS];
}

SEA::~SEA()
{
  ....
  STORM_DELETE(pVSea);
  ....
}

Using macros requires special care and experience. In this case a macro causes an error: the incorrect delete operator - instead of the correct delete[] operator - releases the memory that the new[] operator allocated. As a result, the code won't call destructors for the pVSea array elements. In some cases, this won't matter - for example, when all destructors of both array elements and their fields are trivial.

However, even if the error does not show up at runtime, that does not mean there isn't one. The key here is how the new[] operator is defined. In some cases, calling the new[] operator allocates memory for the array and also writes the size of the memory block and the number of elements at its beginning. If the developer then uses the delete operator, which is incompatible with new[], delete is likely to misinterpret the information at the beginning of the memory block, and the result of such an operation is undefined. There is another possible scenario: memory for arrays and for single elements is allocated from different memory pools. In that case, attempting to return memory allocated for an array to the pool intended for scalars will result in a crash.

This error is dangerous, because it may not manifest itself for a long time, and then shoot you in the foot when you least expect it. The analyzer found a total of 15 errors of this type. Here are some of them:

  • V611 The memory was allocated using 'new T[]' operator but was released using the 'delete' operator. Consider inspecting this code. It's probably better to use 'delete [] m_pShowPlaces;'. Check lines: 421, 196. ActivePerkShower.cpp 421
  • V611 The memory was allocated using 'new T[]' operator but was released using the 'delete' operator. Consider inspecting this code. It's probably better to use 'delete [] pTable;'. Check lines: 371, 372. AIFlowGraph.h 371
  • V611 The memory was allocated using 'new T[]' operator but was released using the 'delete' operator. Consider inspecting this code. It's probably better to use 'delete [] vrt;'. Check lines: 33, 27. OctTree.cpp 33
  • V611 The memory was allocated using 'new T[]' operator but was released using the 'delete' operator. Consider inspecting this code. It's probably better to use 'delete [] flist;'. Flag.cpp 738
  • V611 The memory was allocated using 'new T[]' operator but was released using the 'delete' operator. Consider inspecting this code. It's probably better to use 'delete [] rlist;'. Rope.cpp 660

Analysis showed that many of the cases above involve the STORM_DELETE macro. However, simply changing delete to delete[] inside it would introduce new errors, because the macro is also used to free memory that the scalar new operator allocated. To fix this code, add a new macro - STORM_DELETE_ARRAY - that uses the correct operator, delete[].

struct CVECTOR
....
struct SeaVertex
{
  CVECTOR vPos;
  CVECTOR vNormal;
  float tu, tv;
};
....
#define STORM_DELETE(x) \
  { delete x; x = 0; }

#define STORM_DELETE_ARRAY(x) \
  { delete[] x; x = 0; }

void SEA::SFLB_CreateBuffers()
{
  ....
  pVSea = new SeaVertex[NUM_VERTEXS];
}

SEA::~SEA()
{
  ....
  STORM_DELETE_ARRAY(pVSea);
  ....
}
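
As a side note, modern C++ can rule out this class of error entirely: an owning smart pointer always releases memory with the matching operator. A minimal sketch with simplified stand-in types, not the actual engine fix:

// std::unique_ptr<T[]> always releases its buffer with delete[], so the
// destructor cannot pick the wrong form of delete.
#include <cstddef>
#include <memory>

struct SeaVertexSketch    // stand-in for SeaVertex
{
    float x, y, z;
};

class SeaSketch
{
    std::unique_ptr<SeaVertexSketch[]> pVSea;

public:
    void CreateBuffers(std::size_t numVerts)
    {
        pVSea = std::make_unique<SeaVertexSketch[]>(numVerts);
    }
    // No hand-written destructor is needed: unique_ptr<T[]> calls delete[].
};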

A Double Assignment

PVS-Studio warns: V519 The 'h' variable is assigned values twice successively. Perhaps this is a mistake. Check lines: 385, 389. Sharks.cpp 389

inline void Sharks::Shark::IslandCollision(....)
{
  if (h < 1.0f)
  {
    h -= 100.0f / 150.0f;
    if (h > 0.0f)
    {
      h *= 150.0f / 50.0f;
    }
    else
      h = 0.0f;
    h = 0.0f;
    vx -= x * (1.0f - h);
    vz -= z * (1.0f - h);
  }
  ....
}

Take a look at the branch under the h < 1.0f condition. The developer first computes the value of h and then unconditionally sets it to 0. As a result, within this branch h always ends up 0 and the preceding computation is wasted, which is an error. To fix the code, remove the second assignment to h:

inline void Sharks::Shark::IslandCollision(....)
{
  if (h < 1.0f)
  {
    h -= 100.0f / 150.0f;
    if (h > 0.0f)
    {
      h *= 150.0f / 50.0f;
    }
    else
      h = 0.0f;
    vx -= x * (1.0f - h);
    vz -= z * (1.0f - h);
  }
  ....
}

Dereferencing a Pointer Returned by realloc or malloc

PVS-Studio warns: V522 There might be dereferencing of a potential null pointer 'pTable'. Check lines: 36, 35. s_postevents.h 36

void Add(....)
{
  ....
  pTable = (S_EVENTMSG **)realloc(
                         pTable, nClassesNum * sizeof(S_EVENTMSG *));
  pTable[n] = pClass;
  ....
};

When memory is insufficient, the realloc function fails to resize the memory block and returns NULL. The pTable[n] expression then dereferences this null pointer, which causes undefined behavior. Moreover, pTable is overwritten with NULL, so the address of the original memory block may be lost, leaking that memory. To fix this error, add a check and use an additional pointer:

void Add(....)
{
  ....
  S_EVENTMSG ** newpTable 
    = (S_EVENTMSG **)realloc(pTable, 
                             nClassesNum * sizeof(S_EVENTMSG *));
  if(newpTable) 
  {
    pTable = newpTable;
    pTable[n] = pClass;
    ....
  }
  else
  {
    // Handle the scenario of realloc failing to reallocate memory
  }
};
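
The same pattern can be factored into a small helper so that call sites cannot lose the old block. This is a hypothetical sketch, not part of the project, and it assumes the stored element type is trivially copyable - as the raw pointers here are:

// Grows a realloc-managed array and reports failure without touching the
// caller's pointer, so the old block is never leaked on failure.
#include <cstddef>
#include <cstdlib>

template <typename T>
bool GrowArray(T *&data, std::size_t newCount)
{
    void *p = std::realloc(data, newCount * sizeof(T));
    if (p == nullptr)
        return false;          // data still points to the old, valid block
    data = static_cast<T *>(p);
    return true;
}

// Usage sketch:
//   if (!GrowArray(pTable, nClassesNum)) { /* handle the failure */ }
//   else pTable[n] = pClass;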

PVS-Studio found similar errors in scenarios that involve the malloc function:

PVS-Studio warns: V522 There might be dereferencing of a potential null pointer 'label'. Check lines: 116, 113. geom_static.cpp 116

GEOM::GEOM(....) : srv(_srv)
{
  ....
  label = static_cast<LABEL *>(srv.malloc(sizeof(LABEL) *
                               rhead.nlabels));
  for (long lb = 0; lb < rhead.nlabels; lb++)
  {
    label[lb].flags = lab[lb].flags;
    label[lb].name = &globname[lab[lb].name];
    label[lb].group_name = &globname[lab[lb].group_name];
    memcpy(&label[lb].m[0][0], &lab[lb].m[0][0], 
           sizeof(lab[lb].m));
    memcpy(&label[lb].bones[0], &lab[lb].bones[0],
           sizeof(lab[lb].bones));
    memcpy(&label[lb].weight[0], &lab[lb].weight[0], 
           sizeof(lab[lb].weight));
  }
}

This code needs an additional check:

GEOM::GEOM(....) : srv(_srv)
{
  ....
  label = static_cast<LABEL *>(srv.malloc(sizeof(LABEL) *
                               rhead.nlabels));
  for (long lb = 0; lb < rhead.nlabels; lb++)
  {
    if(label)
    {
      label[lb].flags = lab[lb].flags;
      label[lb].name = &globname[lab[lb].name];
      label[lb].group_name = &globname[lab[lb].group_name];
      memcpy(&label[lb].m[0][0], &lab[lb].m[0][0],
               sizeof(lab[lb].m));
      memcpy(&label[lb].bones[0], &lab[lb].bones[0],
             sizeof(lab[lb].bones));
      memcpy(&label[lb].weight[0], &lab[lb].weight[0], 
             sizeof(lab[lb].weight));
    }
  ....
  }
}

Overall, the analyzer found 18 errors of this type.

Wondering what these errors can lead to and why you should avoid them? See this article for answers.

Modulo 1 Remainder

PVS-Studio warns: V1063 The modulo by 1 operation is meaningless. The result will always be zero. WdmSea.cpp 205

void WdmSea::Update(float dltTime)
{
  long whiteHorses[1];
  ....
  wh[i].textureIndex = rand() % (sizeof(whiteHorses) / sizeof(long));
}

In the code above, the developer divides the whiteHorses array's size by the size of its element and applies the modulo operation to the result. Since the array holds exactly one element, the result of this modulo operation is always 0, so the operation makes no sense. The author may have made a mistake when declaring the whiteHorses variable - perhaps the array was meant to be larger. It is also possible that there is no mistake at all and the rand() % (sizeof(whiteHorses) / sizeof(long)) expression is written this way so that it keeps producing a valid random index if the array grows in the future. Whether the developer wrote this code on purpose or by accident, it's a good idea to take another look and recheck - and that's exactly what the analyzer calls for.
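
If the expression really is meant to stay valid as the array grows, std::size (C++17) makes that intent a bit more explicit. A small hypothetical sketch:

// The index expression keeps working if whiteHorses is ever enlarged, and
// std::size avoids the manual sizeof division.
#include <cstdlib>
#include <iterator>

long whiteHorses[1];    // still a single element today

int RandomWhiteHorseIndex()
{
    return std::rand() % static_cast<int>(std::size(whiteHorses));
}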

std::vector vs std::deque

Aside from detecting obvious errors and inaccuracies in code, the PVS-Studio analyzer helps optimize code.

PVS-Studio warns: V826 Consider replacing the 'aLightsSort' std::vector with std::deque. Overall efficiency of operations will increase. Lights.cpp 471

void Lights::SetCharacterLights(....)
{
  std::vector<long> aLightsSort;
  for (i = 0; i < numLights; i++)
    aLightsSort.push_back(i);
  for (i = 0; i < aMovingLight.size(); i++)
  {
    const auto it = std::find(aLightsSort.begin(),aLightsSort.end(), 
                              aMovingLight[i].light);
    aLightsSort.insert(aLightsSort.begin(), aMovingLight[i].light);
  }
}

The code above initializes std::vector aLightsSort, and then inserts elements at its beginning.

Why is it a bad idea to insert many elements at the beginning of a std::vector? Each front insertion forces the vector to move every existing element one position to the right, and whenever its capacity is exhausted, it also allocates a new buffer and copies all the values from the old one. Why can't we simply write the new value just before the buffer's zeroth element? Because std::vector does not know how to do this.

However, std::deque does. This container typically stores its elements in a sequence of fixed-size blocks, which lets it add and remove elements at the beginning or at the end without moving the existing ones. So we can do exactly what we want here - add the new value in front of the first element.
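
Here is a tiny standalone sketch - an assumed example, not engine code - contrasting the cost of the two approaches:

// Front insertion: std::vector shifts every existing element each time,
// while std::deque adds the element in constant time.
#include <deque>
#include <vector>

void FrontInsertDemo()
{
    std::vector<long> v;
    std::deque<long> d;

    for (long i = 0; i < 1000; ++i)
    {
        v.insert(v.begin(), i);   // O(size) element moves per insertion
        d.push_front(i);          // amortized O(1), no elements are moved
    }
}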

This is why this code requires replacing std::vector with std::deque:

void Lights::SetCharacterLights(....)
{
  std::deque<long> aLightsSort;
  for (i = 0; i < numLights; i++)
    aLightsSort.push_back(i);
  for (i = 0; i < aMovingLight.size(); i++)
  {
    const auto it = std::find(aLightsSort.begin(),aLightsSort.end(), 
                              aMovingLight[i].light);
    aLightsSort.push_front(aMovingLight[i].light);
  }
}

Conclusion

PVS-Studio found that the Storm Engine source code contains many errors and code fragments that need revision. Many warnings pointed to code the developers had already tagged as needing revision; those spots may have been found earlier by static analysis tools or during code review. Other warnings pointed to errors that had not been tagged at all, which means the developers hadn't suspected anything was wrong there. All the errors examined in this article come from that second group. If Storm Engine and its errors intrigued you, you can retrace this journey yourself. I also invite you to take a look at our other articles about the open-source projects we have checked - there my colleagues discuss the analysis results and errors.
