Data Validation history table and Java heap space.
By Henk Vandenbergh-Oracle on Jul 07, 2009
Data Validation stores information about which data pattern is written where, in a table that requires one byte per data block. With very large luns this can become a problem. I just saw a run where xfersize=4096 was used against a total of 5.7 terabytes of storage. That requires about 1.5GB of table space. The default Java heap given to Vdbench is 1024m, so that's not good.
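A quick back-of-envelope sketch of that arithmetic: one byte per data block means the table needs roughly total storage divided by xfersize bytes of heap. The class and method names below are hypothetical, written only to illustrate the calculation.

```java
// Hypothetical sketch: estimate Data Validation table memory,
// assuming one byte of table space per data block.
public class DvMemoryEstimate {

    /** Table bytes needed = total storage bytes / bytes per block. */
    public static long dvTableBytes(long totalStorageBytes, long xfersize) {
        return totalStorageBytes / xfersize;
    }

    public static void main(String[] args) {
        long tb = 1024L * 1024 * 1024 * 1024;          // one terabyte
        long storage = (long) (5.7 * tb);              // the 5.7 TB run described above
        long xfersize = 4096;                          // xfersize=4096 from that run

        double gb = dvTableBytes(storage, xfersize) / (1024.0 * 1024 * 1024);
        System.out.printf("DV table needs roughly %.2f GB of heap%n", gb);
    }
}
```

With these inputs the estimate lands in the neighborhood of 1.5GB, which is more than the default 1024m heap.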
Vdbench 5.00 also does not abort nicely when it runs out of memory for Data Validation; instead it gets stuck in a "Waiting for slave synchronization:" hang. Vdbench 5.01 will fix this.
To avoid this problem:
- Change the swat or swat.bat script, replacing -Xmx1024m with a higher value.
- Use a larger xfersize=, so fewer blocks need to be tracked.
- Use fewer devices.
- Use the size= or range= parameter to limit how much of each lun is used.
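The last three workarounds all live in the parameter file. The fragment below is only a sketch of what that might look like; the lun name is made up, and you should check the exact parameter spellings against your Vdbench documentation.

```
* Hypothetical parameter file fragment: shrink the Data Validation table
* by using a larger transfer size and only part of the lun.
sd=sd1,lun=/dev/rdsk/c0t0d0s0,size=100g     * hypothetical lun; cap storage used
wd=wd1,sd=sd1,xfersize=131072,rdpct=50      * larger xfersize = fewer blocks tracked
rd=rd1,wd=wd1,iorate=max,elapsed=60,interval=1
```

At xfersize=131072 the same 5.7 TB of storage would need only about 1/32nd of the table space that xfersize=4096 required.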