Revision 547 – OUISignatureList Works with our Parser

Section 3.1 of the XML specification says that the order of attributes on an XML element is not significant; for example, the following two elements are equivalent:

<A left="one" right="two"/>

<A right="two" left="one"/>

Unfortunately, it seems we wrote our own parser and missed that part. For what it’s worth, we’re probably using some sort of SAX-style parser that leverages the exactness of XML’s markup, which buys a certain portability and disambiguation, but the VW parser gets confused when the attributes of OUISignature are not in the order it expects.
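
To illustrate the point, here is a minimal sketch using Python’s standard library parser (not the VW parser, obviously) showing that a conforming parser treats the two attribute orderings as the same element:

```python
# Demonstration that conforming XML parsers ignore attribute order.
# Uses Python's stdlib ElementTree as a stand-in for "any standard parser".
import xml.etree.ElementTree as ET

a = ET.fromstring('<A left="one" right="two"/>')
b = ET.fromstring('<A right="two" left="one"/>')

# Both parses yield the same tag and the same attribute mapping.
print(a.tag == b.tag and a.attrib == b.attrib)  # True
```

A writer that emits attributes in a fixed order is therefore harmless to standard parsers; it only matters to an order-sensitive one like VW’s.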

Unfortunately, I also had to break the OUISignatureList writer to accommodate that. The risk is not that other XML parsers will fail to understand the output; they will, because they don’t care about ordering. The risk is that in building a basic “just print the text” writer, I might make an error such as failing to URL-escape output. Having jumped onto a leaky boat, I may also sink.

In the meantime, OUISignatures written by these tools can be dropped directly into VW’s config.

Revision 546 – Summarize-Idle Added

A quick Awk script was added to show how to post-process a CSV exported from a trend report. In many cases, reusing a data export would be nice, but exports are somewhat opaque: there’s a report, or a CSV, and CSVs tend to show just one summary, such as a 1-day rollup.
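
As a rough sketch of the kind of post-processing the Awk script performs (the column names “Device” and “Value” here are hypothetical stand-ins for whatever the real export contains):

```python
# Sketch: summarize a trend-report CSV export beyond its built-in rollup.
# Column names are hypothetical; adapt to the actual export's header row.
import csv
import io
from collections import defaultdict

sample = io.StringIO(
    "Device,Value\n"
    "switch1,10\n"
    "switch1,30\n"
    "switch2,5\n"
)

totals = defaultdict(lambda: [0.0, 0])  # device -> [sum, count]
for row in csv.DictReader(sample):
    t = totals[row["Device"]]
    t[0] += float(row["Value"])
    t[1] += 1

averages = {dev: total / count for dev, (total, count) in totals.items()}
print(averages)  # {'switch1': 20.0, 'switch2': 5.0}
```

The same aggregation could be expressed in a few lines of Awk; the point is that the export becomes a data source, not just a static report.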

In order to get more meaningful data, post-processing an export can be useful. This is the sort of thing that can be hooked up behind an email-triggered job that later asynchronously executes a BAT file containing this logic.

I may need to create a writeup of how to leverage this; in the meantime, this example is included.

Revision 544 – Improved DOS BAT Convenience Scripts

Previously, all the BAT files provided for convenience assumed that the JAR files you want to use are in the current directory as opposed to the directory holding the BAT file.

Sorry about that.

In this revision I fixed that, noting: “All .bat convenience scripts updated to be able to run using an absolute path to the BAT file rather than assume contents are in the local directory.”

Revision 542 – VR Can Help You Sync

Added a hack to VR to allow it to sync your repository.

VR, the “VirtualRegent”, was the start of a helpful bot before there was such a thing as RemoteWisdom. As a bot that contacts outward, it can maintain a bidirectional channel for status and payloads: basically for information and for uploads or downloads. It even has a Jabber bot so that the tools we already have can talk to it.

I created a way to self-test commands, such as “checksum XX file”, and a “fetch” topic to test downloads such as software updates.

This test function became the more common usage: for example, “VR --demofetch” will fetch everything you need to set up a demo, except for the license file.

I’ve added the command “syncfae” so that “VR --cmd syncfae” does what’s needed to sync the FAE Toolkit on Windows. This is abbreviated by the “--sync” option to VR.jar, which expands to “--cmd syncfae”, so that fewer keypresses are needed to get the sync started. It’s not that I think people are not intelligent; it’s that the fewer buttons on the remote control, the better the chance that the user will choose the right one.

“VR --sync” will download rsync.exe and cygwin1.dll, and use them to synchronize the contents to a local directory called FAE.

Revision 541 – Added Parsing of DCNM Data for Nicknames

In revision 541, I started the basic capability to ask a DCNM service for the Nickname/WWPN mapping. This has the potential to reuse the information already entered into DCNM rather than re-entering it manually, and to avoid pulling zonefiles from each zone for parsing via a --nickname=file:// or --nickname=http:// method. As well, in situations where aliases are not used, we can collect the port labels or private aliases that DCNM may not share down to the switch.

This method is only an initial definition and requires a bit more work.

The same parser logic is used as was added for OnCommand in OnCommand Query and for BNA in BNA Query. Like the BNA and OnCommand work, the DCNM query simply reformats a query and sends it through the array of parsers to vote upon:

java -jar vict.jar --nickname=dcnmsql://user:pass@server:port/ --nicknameout=\VirtualWisdomData\DeviceNickname\nicknames.csv

The default user/pass should be accurate, but needs additional testing to confirm. Like the BNA parser, this method hits the underlying database directly, so it needs direct access to the server (through any firewalls/filters), and it is vulnerable to schema changes. The schema is based on the 5.2 documentation.

In the meantime, cisco-shows2wwncsv.awk is also provided to build port-label nicknames as import CSVs; this awk file needs the output of two “show” commands from every switch. NOTE: every switch, not just every fabric, and it needs to see the output of “show flogi database” before it sees “show interface description”.
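
The join the awk script performs can be sketched roughly as follows (in Python, for clarity; the interface names, WWPN, and label here are made up, and parsing of the raw “show” output is omitted):

```python
# Rough analogue of cisco-shows2wwncsv.awk's join, with parsed "show"
# output represented as dicts. "show flogi database" must be seen first
# because it supplies the interface -> WWPN mapping that the labels from
# "show interface description" are joined against.
flogi = {"fc1/1": "10:00:00:00:c9:12:34:56"}   # hypothetical flogi entry
descriptions = {"fc1/1": "prod-esx-01_hba0"}    # hypothetical port label

rows = [
    f"{wwpn},{descriptions[iface]}"
    for iface, wwpn in flogi.items()
    if iface in descriptions
]
print(rows)  # ['10:00:00:00:c9:12:34:56,prod-esx-01_hba0']
```

Interfaces are local to a switch, which is why the script needs both commands from every switch, not merely one pair per fabric.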

Revision 537 – Confirm DB Backup is OK

I had a few customers who had no idea their backups were not running. Typically this was because of space exhaustion, but sometimes the backup schedule was not set, or it had become deactivated.

PHC on/after version 0.2-537 now confirms that a backup was completed within the last 14 days. This value was chosen based on the recommendation that backups be done weekly: the longest possible “staleness” of a backup is 7 days, just as the next backup is about to run, so missing two backups is a critical concern. If a backup wasn’t scheduled, there will be no backup logged; if a backup fails due to space, it will still be caught, because no completed backup is shown. Either way, we can now catch when a portal service has no protective backup and is at risk during an upgrade.
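
The freshness rule itself is simple; here is a sketch of the check (the timestamp source is a hypothetical stand-in for whatever PHC reads from the portal’s backup log):

```python
# Sketch of the 14-day backup-freshness rule: flag a portal when no
# completed backup is recorded within two weekly backup cycles.
from datetime import datetime, timedelta

MAX_AGE = timedelta(days=14)  # two missed weekly backups

def backup_is_stale(last_backup, now):
    """Return True when no backup completed within the last 14 days."""
    return last_backup is None or (now - last_backup) > MAX_AGE

now = datetime(2024, 1, 15)
assert not backup_is_stale(datetime(2024, 1, 10), now)  # 5 days old: fine
assert backup_is_stale(datetime(2023, 12, 1), now)      # 45 days old: flag it
assert backup_is_stale(None, now)                       # never logged: flag it
```

Note that “no backup ever logged” and “backup failed for space” both fall out of the same check: neither leaves a completed backup within the window.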

Revision 536 – FixNickNameHistory for Database Back-Edits

In Revision 536, I implemented “--fixnicknamehistory”, which applies any loaded nicknames (for example, those gained using OnCommand Extraction, those parsed from Cisco zones, or those extracted from BNA) to the underlying Portal Server database retroactively.

So long as the Portal Service is shut down, this can be done on a customer’s live server, but it’s more useful in analysis where nicknames arrived late and the analyst cannot wait another week or so for nickname-ful summaries to be collected.

Revision 533 – Semi-Statistical Summary Insertion Delay

In this revision, I incorporated Welford’s algorithm (via Knuth) into the running statistical calculation, providing a running mean/deviation calculation to show where the greater portion of insertion delays lie.

Why?

The “mean” of a value is nearly useless for planning: it’s just an average. Understanding how that value varies gives a much better picture, as does any ability to match the behaviour to a predictable curve. A mean ± 1 deviation gives a basic idea; a mean ± 3 deviations shows roughly where “all the non-unusual ones” land. Put another way, mean ± 3 deviations tells you where all the values land once the aberrant outlying points are removed.

Currently, this means the summary calculation shows the mean delay plus a mean + 1 deviation “ceiling”: a mean ± deviation gives two values, of course, but we only worry about the worst case when there is a risk of taking too long to insert a summary row.
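
For reference, the technique is Welford’s online algorithm as given by Knuth (TAOCP vol. 2): a single pass, constant memory, no stored samples. This is a sketch of the method, not PHC’s actual code:

```python
# Welford's online algorithm: running mean and deviation in one pass,
# suitable for streaming insertion-delay measurements.
import math

class RunningStats:
    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # sum of squared deviations from the running mean

    def add(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    def deviation(self):
        # Sample standard deviation; 0.0 until we have two samples.
        return math.sqrt(self.m2 / (self.n - 1)) if self.n > 1 else 0.0

    def ceiling(self, k=1):
        """mean + k*deviation: the worst-case side we report."""
        return self.mean + k * self.deviation()

stats = RunningStats()
for delay in [10.0, 12.0, 11.0, 13.0, 50.0]:  # delays in arbitrary units
    stats.add(delay)
print(round(stats.mean, 2), round(stats.ceiling(1), 2))
```

Note how the one outlier (50.0) drags the ceiling well above the mean, which is exactly the “worst case” signal we want to surface.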

PHC-0.2-536 showing mean+1deviation

Currently, I’m not sure whether the mean + 1 deviation is a more critical value than the reported mean, nor whether we should shift to mean + 3 deviations as a more appropriate metric.