Data Validation history table and Java heap space.

Data Validation maintains an in-memory table recording which data pattern was written to each data block, at a cost of one byte per block. With very large luns that can become a problem. I just saw a run where xfersize=4096 was used against a total of 5.7 terabytes of storage. That works out to roughly 1.5 billion blocks, or about 1.5GB worth of table space, while the default Java heap given to Vdbench is only 1024m. That's not good.
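The arithmetic above can be sketched as a quick back-of-the-envelope calculation (class and method names are illustrative, not part of Vdbench):

```java
// Sizing the Data Validation table: one byte per data block,
// so table bytes = total storage / xfersize.
public class DvTableSize {

    // Returns the number of bytes the DV table needs.
    public static long tableBytes(long totalStorageBytes, long xfersize) {
        return totalStorageBytes / xfersize;
    }

    public static void main(String[] args) {
        long totalStorage = (long) (5.7 * (1L << 40)); // ~5.7 TB
        long xfersize     = 4096;                      // xfersize=4096
        long bytes = tableBytes(totalStorage, xfersize);
        // ~1.5 billion blocks -> ~1.5GB of table space,
        // larger than the default 1024m heap.
        System.out.printf("DV table needs ~%.2f GB%n", bytes / 1e9);
    }
}
```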

Vdbench 5.00 also does not abort cleanly when it runs out of memory for Data Validation; instead it hangs with the message "Waiting for slave synchronization:". Vdbench 5.01 will fix this.

To avoid this problem:

  • Change the vdbench or vdbench.bat script, replacing -Xmx1024m with a higher value.
  • Use a larger xfersize=.
  • Use fewer devices.
  • Use either the size= or range= parameter to limit the amount of storage addressed.
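As one illustration of the last option, a minimal parameter-file sketch using size= to cap the addressable space per storage definition (the lun path and all values here are placeholders, not a recommendation):

```
* Cap the addressable space so the Data Validation table stays small:
* 100g at xfersize=4k needs only ~25MB of table space.
sd=sd1,lun=/dev/rdsk/c0t0d0s0,size=100g
wd=wd1,sd=sd1,xfersize=4k,rdpct=50
rd=rd1,wd=wd1,iorate=max,elapsed=60,interval=1
```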
Henk.
About

Blog for Henk Vandenbergh, author of Vdbench, and Sun StorageTek Workload Analysis Tool (Swat). This blog is used to keep you up to date about anything revolving around Swat and Vdbench.
