[Networkit] SegmentationFault in CoarseningBenchmark

Johannes Ernst joh-ernst at t-online.de
Mon Dec 5 19:12:23 CET 2016


Hello everybody,

When I compile the C++ part of NetworKit with "scons --optimize=Dbg 
--target=Tests" and run the tests with "./NetworKit-Tests-Dbg 
--gtest_filter=*CoarseningBenchmark* --loglevel=DEBUG", I am asked to 
enter the number of nodes. When I enter a number smaller than 100, the 
program crashes with a segmentation fault. The reason is the following: 
if the number of nodes is smaller than 100, 
ClusteringGenerator::makeRandomClustering is called with 0 as its second 
parameter in coarsening/test/CoarseningBenchmark.cpp:34. This leads to 
Aux::Random::integer being called with -1 as its parameter, which wraps 
around to a huge upper bound (the parameter is unsigned) and thus yields 
a very large random number. This number is used as a node index and 
added to a cluster, so the cluster contains a non-existing node. When 
Partition::numberOfSubsets is then called, the huge index is used to 
access an array, which causes the segmentation fault.
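To make the wrap-around explicit, here is a minimal standalone sketch 
(plain C++, not the actual NetworKit code; the variable names are mine):

#include <cstdint>
#include <iostream>

int main() {
    uint64_t numClusters = 0;               // what the test passes for n < 100
    uint64_t upperBound = numClusters - 1;  // unsigned wrap-around: 18446744073709551615
    std::cout << "upper bound: " << upperBound << "\n";
    // Any random index drawn from [0, upperBound] and then used to access
    // an array of realistic size is almost certainly out of bounds.
    return 0;
}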

In my opinion, the easiest way to fix this would be to use a fixed 
number of nodes larger than 100 instead of asking the user for one. That 
would also be better for automated testing.
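Concretely, I am thinking of something like the following in 
CoarseningBenchmark.cpp (just a sketch; the variable names are my guess 
at the test code, not the actual source):

    // before: the node count is read from the user
    // count n;
    // std::cin >> n;

    // after: a fixed count well above 100, so the derived cluster count
    // (apparently something like n / 100) can no longer become 0
    count n = 10000;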

A related question concerns the use of the assert function. In many 
places, statements like "assert(n < maxIndex)" are used to ensure that 
an array index is not too large. However, these asserts are only enabled 
when compiling with --optimize=Dbg. For example, if I create a Graph G 
with 5 nodes and then call G.addEdge(4, 6), I get a segmentation fault 
when NetworKit is compiled with --optimize=Opt. So is the range check 
for arrays intentionally disabled in optimized builds (e.g. for 
performance reasons)?
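For reference, here is a small standalone example (not NetworKit code) 
of the pattern I mean; assert() is compiled to nothing once NDEBUG is 
defined, which optimized builds typically do:

#include <cassert>
#include <vector>

void setEntry(std::vector<int>& data, std::size_t i, int value) {
    assert(i < data.size());  // checked in Dbg builds, removed under -DNDEBUG
    data[i] = value;          // in an optimized build this write is unchecked
}

int main() {
    std::vector<int> data(5);
    setEntry(data, 6, 1);     // analogous to G.addEdge(4, 6) on a 5-node graph
    return 0;
}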

Best regards,

Johannes
