[AGENT++] Notify Filter Mask Clarification

Dave White dave.white at efi.com
Thu Jun 17 20:31:59 CEST 2004


I have been experimenting with traps/notifications lately, and it has raised a
question about how masks are supposed to work.  RFC 3413 (page 51) describes
the operation of snmpNotifyFilterMask.  (The operation of the notify filter
mask is exactly the same as that of vacmViewTreeFamilyMask from RFC 3415,
page 24.)  Agent++ implements the masking operation in the Oidx::mask method,
but the way the code is written seems wrong to me.  Please help me understand
where my thinking goes wrong.

 

The object ID masking in the code works from the least significant bit to the
most significant bit of each mask octet, but my reading of the RFCs indicates
that it is supposed to work from the most significant bit to the least
significant bit.  Here is the text from the RFCs:

 

"Each bit of this bit mask corresponds to a sub-identifier of
snmpNotifyFilterSubtree, with the most significant bit of the i-th octet of
this octet string value (extended if necessary, see below) corresponding to
the (8*i - 7)-th sub-identifier, and the least significant bit of the i-th
octet of this octet string corresponding to the (8*i)-th sub-identifier,
where i is in the range 1 through 16."

 

For this discussion, I'll refer to the most significant bit (msb) of an octet
as bit-7 and the least significant bit (lsb) of an octet as bit-0.

 

So, for example, for i = 1 (the first octet of the mask octet string), bit-7
(msb) corresponds to the (8*1 - 7)-th sub-identifier, i.e. the first
sub-identifier.  Likewise, for i = 1, bit-0 corresponds to the (8*1)-th
sub-identifier, i.e. the eighth sub-identifier.
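
To make sure I'm not fooling myself, here is a small standalone snippet (my
own sketch of the quoted RFC text, not Agent++ code) that prints which octet
and which bit I believe should control each sub-identifier:

    #include <cstdio>

    int main()
    {
        // My reading of RFC 3413/3415: the n-th sub-identifier (1-based)
        // is controlled by bit (7 - (n-1)%8) of octet (n-1)/8 of the mask,
        // i.e. the mask is consumed msb-first within each octet.
        for (int n = 1; n <= 10; n++) {
            int octet = (n - 1) / 8;        // 0-based octet index
            int bit   = 7 - ((n - 1) % 8);  // 7 = msb, 0 = lsb
            std::printf("sub-identifier %2d -> octet %d, bit-%d\n",
                        n, octet, bit);
        }
        return 0;
    }

For the first octet this prints bit-7 for sub-identifier 1 down to bit-0 for
sub-identifier 8, which is how I read the quoted text.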

 

The relevant code from Oidx::mask(const OctetStr& mask) follows:

 

    for (unsigned int i=0; (i<len()) && (i<cmask.len()*8); i++) {
        char m = 0x01 << (i%8);
        if (!(cmask[i/8] & m)) {
            (*this)[i] = 0ul;
        }
    }

 

In my thinking, the value of 'm' should be set as follows:

            char m = 0x80 >> (i%8);
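
For clarity, here is the whole loop as I think it would have to read to match
the RFC (just my untested sketch, not a patch):

    for (unsigned int i=0; (i<len()) && (i<cmask.len()*8); i++) {
        char m = 0x80 >> (i%8);    // msb-first, per my reading of the RFCs
        if (!(cmask[i/8] & m)) {
            (*this)[i] = 0ul;      // 0-bit: zero out (wildcard) this sub-identifier
        }
    }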

 

If you have a sub-tree of 1.3.6.1.6.3.1.1.5.0 and want to include all but the
last sub-identifier, the mask octet string for Agent++ would have to be FF.01
(hex), whereas the octet string for the way I read the RFC would be FF.80
(hex).  Can someone please explain how I'm misreading the RFC?  Has anyone
ever used a mask other than the zero-length string, which results in a mask of
all ones?
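
To show where those two values come from, here is a little throwaway sketch
(my own, not library code) that builds the mask both ways for a
10-sub-identifier OID with only the last sub-identifier wildcarded:

    #include <cstdio>

    int main()
    {
        const int subids = 10;                // 1.3.6.1.6.3.1.1.5.0
        unsigned char lsb_first[2] = {0, 0};  // Agent++ ordering: 0x01 << (i%8)
        unsigned char msb_first[2] = {0, 0};  // RFC ordering:     0x80 >> (i%8)

        for (int i = 0; i < subids - 1; i++) {   // include all but the last
            lsb_first[i/8] |= (unsigned char)(0x01 << (i%8));
            msb_first[i/8] |= (unsigned char)(0x80 >> (i%8));
        }
        std::printf("lsb-first (Agent++): %02X.%02X\n",
                    lsb_first[0], lsb_first[1]);
        std::printf("msb-first (RFC):     %02X.%02X\n",
                    msb_first[0], msb_first[1]);
        return 0;
    }

It prints FF.01 for the lsb-first ordering and FF.80 for the msb-first
ordering.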

 

BTW: I have also been using the book "A Practical Guide to SNMPv3 and Network
Management" by David Zeltserman (1999), but its use of snmpNotifyFilterMask
seems to differ from both the RFC and the way the code is implemented in
Agent++.  It suggests that the octet string for the sub-tree above would be
03.FE (hex), which adds to the confusion.

 

 

Thanks,

Dave

 



