Subject
Symptoms
Platform/Tools
Solution
Monitoring individual jobs
Zabbix items
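The three per-job items live in the discovery rule as item prototypes of type "Zabbix trapper", since they only receive values pushed by zabbix_sender. Assuming the discovery rule publishes the job name in an LLD macro named {#JOBNAME} (the macro name here is my placeholder), the prototype keys look like this:

    bacula.job.sumFiles[{#JOBNAME}]
    bacula.job.sumBytes[{#JOBNAME}]
    bacula.job.sumReadBytes[{#JOBNAME}]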
Add new items to the wrapper script
...
my $MYSQL_SERVER = '127.0.0.1';
my $MYSQL_PORT = '3306';
my $MYSQL_USER = 'bacula';
my $MYSQL_PASSWORD = 'yRacJaj4Eujw';
my $DATABASE = 'bacula';
...
my $JOB_FILES_KEY = 'bacula.job.sumFiles[%s]';
my $JOB_BYTES_KEY = 'bacula.job.sumBytes[%s]';
my $JOB_READ_BYTES_KEY = 'bacula.job.sumReadBytes[%s]';
...
my $dbh = DBI->connect("DBI:mysql:database=$DATABASE;host=$MYSQL_SERVER;port=$MYSQL_PORT",
                       $MYSQL_USER, $MYSQL_PASSWORD, {'RaiseError' => 1});
### Send statistics for the individual job
my $sth = $dbh->prepare("SELECT JobFiles, JobBytes, ReadBytes, P.Name AS Pool,
    if(isnull(sum(`M`.`VolBytes`)),0,sum(`M`.`VolBytes`)) AS `PoolBytes`, P.NumVols
    FROM Job J, (`Pool` `P` left join `Media` `M` on((`P`.`PoolId` = `M`.`PoolId`)))
    WHERE J.PoolId = P.PoolId AND J.Job = ?;"); #no GROUP BY needed: WHERE narrows this to one job and its pool
$sth->execute($options{'j'}); #Job name in the form BackupCatalog.2015-02-01_23.10.00_03
my ($jobFiles,$jobBytes,$readBytes,$poolName,$poolBytes,$poolVols) = $sth->fetchrow_array(); #exactly one row expected
$sth->finish();
#Send for the job
system(sprintf($zabbix_sender_cmd_line,sprintf($JOB_FILES_KEY,$options{'n'}),$jobFiles) . " >/dev/null");
system(sprintf($zabbix_sender_cmd_line,sprintf($JOB_BYTES_KEY,$options{'n'}),$jobBytes) . " >/dev/null");
system(sprintf($zabbix_sender_cmd_line,sprintf($JOB_READ_BYTES_KEY,$options{'n'}),$readBytes) . " >/dev/null");
...
$dbh->disconnect();
...
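For completeness, the parts elided above include the option parsing and the zabbix_sender command template. Here is a minimal sketch of what they may look like; the binary path, server address, and host name are placeholders, and only the -j/-n options actually used above are parsed:

    use DBI;
    use Getopt::Std;

    my %options;
    getopts('j:n:', \%options); #-j: unique job id, -n: job name

    #Two %s slots: the first takes the item key, the second the value.
    #zabbix_sender flags: -z Zabbix server, -s monitored host, -k key, -o value.
    #The key is quoted because item keys contain square brackets.
    my $zabbix_sender_cmd_line =
        "/usr/bin/zabbix_sender -z 127.0.0.1 -s backup-host -k '%s' -o '%s'";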
Adding a trigger
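For example, a trigger prototype that fires when a job stops reporting, in Zabbix 2.x expression syntax (the template name is a placeholder; 90000 seconds is a day plus a small margin, suitable for jobs that run daily):

    {Template Bacula:bacula.job.sumBytes[{#JOBNAME}].nodata(90000)}=1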
Adding a graph
What about 24-hour statistics?
There is a very nice feature in webacula, a tool I installed long ago: it shows aggregated statistics for the last 24 hours. Although we could do some of this aggregation inside Zabbix itself, I decided to create additional monitoring items and have my wrapper script send the aggregated statistics as well.
More zabbix items
Here I made my first mistake: I added the aggregated items as item prototypes in the Zabbix discovery rule. This didn't work because the aggregated items are not bound to individual jobs, while Zabbix discovery tries to create a new item for every job it discovers. So these items should be created as regular template items.
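The difference is easy to see in the keys themselves (again assuming the {#JOBNAME} macro): the per-job keys are parameterized and multiply with discovery, while the aggregated keys are fixed:

    bacula.job.sumBytes[{#JOBNAME}]  #item prototype: one item created per discovered job
    bacula.24hr.sumBytes             #plain template item: exactly one per host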
Extending script
The same approach as before: we make sure to use the proper keys for the new items, and a different SQL query:
...
my $SUM_FILES_KEY = 'bacula.24hr.sumFiles';
my $SUM_BYTES_KEY = 'bacula.24hr.sumBytes';
my $SUM_READ_BYTES_KEY = 'bacula.24hr.sumReadBytes';
...
### Send statistics for the entire installation for the last 24 hr
$sth = $dbh->prepare("SELECT sum(`JobFiles`),sum(`JobBytes`),sum(`ReadBytes`) from `Job` where (`StartTime` > (now() - interval 1 day));");
$sth->execute(); #don't forget to execute before fetching
my ($jobFiles24,$jobBytes24,$readBytes24) = $sth->fetchrow_array(); #exactly one row expected
$sth->finish();
#The 24-hr keys are not parameterized, so no job name is interpolated into them
system(sprintf($zabbix_sender_cmd_line,$SUM_FILES_KEY,$jobFiles24) . " >/dev/null");
system(sprintf($zabbix_sender_cmd_line,$SUM_BYTES_KEY,$jobBytes24) . " >/dev/null");
system(sprintf($zabbix_sender_cmd_line,$SUM_READ_BYTES_KEY,$readBytes24) . " >/dev/null");
### Done with the 24-hr
...
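How the wrapper gets called is outside the scope of this excerpt; one way to hook it in is a RunScript block in the Job (or JobDefs) resource of the Director. The script path below is a placeholder; %n is Bacula's substitution for the job name, and %j should expand to the unique job id, but check the substitution table for your Bacula version:

    RunScript {
      RunsWhen = After
      RunsOnClient = No
      Command = "/usr/local/bin/bacula2zabbix.pl -j %j -n %n"
    }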
Adding a trigger for 24 hr
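For instance, a trigger that fires when the whole installation wrote nothing over the last day (again in Zabbix 2.x syntax, with a placeholder template name):

    {Template Bacula:bacula.24hr.sumBytes.last(0)}=0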
24 hr graph
Well, the graphs could have been sexier, but I'll leave that for future articles.