VirtualWisdom4 enables deeper insight into the metrics behind performance and availability of large-scale data networks, but it has a few challenges in the initial setup. One we see fairly often on the deployment side is that few customers know what JSON is. Sure, a developer will say “JSON, yeah, I got that”, but VirtualWisdom4 users are not all developers. Heck, our field staff don’t interpret JSON, and don’t recognize when a “{” is where a “[” should be.
VirtualWisdom4 collects data as soon as it gets access, but that data is not aggregated into upper-level entities such as Hosts and Storage Arrays until those entities exist. In essence, VW4 collects immediately, but the data is of limited benefit until those entities are created, so creating those entities is critical to return-on-investment. The sooner we get to visualizing data, the sooner we can begin trouble-killing.
Often, it’s easier just to convert from a common format that our customers understand, or can produce from other tools, into JSON.
One example we use is the basic 2-column Comma-Separated Values (CSV) format that our previous VirtualWisdom3 product line understood. Of course, CSV has no schema either, and suffers other fragility: disagreeing assumptions about whether spaces are used, quotation-enclosure, or even end-of-line markers. Even a basic CSV parser needs to be very liberal about what it accepts, which means, to a coder, that assumptions and stream-cleanup are required. We can parse CSV into JSON, but the variances in format still bite us, just further along the pipeline. We still lack schema validation, so we can only validate the JSON, both punctuation and schema, right at the VW4 import. Be prepared for some last-minute quick-fixes to the import data.
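For the curious, here’s roughly what “liberal” means in practice for a 2-column WWPN/nickname line. This is just an illustrative sketch of the idea, not the vw4tools code: trim stray spaces, strip optional quotation-enclosure, and shrug off blank lines and CRLF endings.

import java.util.Optional;

public class LiberalCsvLine {
    /** Returns {wwpn, nickname}, or empty if the line isn't usable. */
    static Optional<String[]> parse(String line) {
        if (line == null) return Optional.empty();
        String trimmed = line.trim();               // also eats a stray CR from CRLF endings
        if (trimmed.isEmpty()) return Optional.empty();
        String[] cols = trimmed.split(",", 2);      // only the first comma separates the columns
        if (cols.length < 2) return Optional.empty();
        return Optional.of(new String[] { unquote(cols[0].trim()), unquote(cols[1].trim()) });
    }

    private static String unquote(String s) {
        boolean quoted = s.length() >= 2 && s.startsWith("\"") && s.endsWith("\"");
        return quoted ? s.substring(1, s.length() - 1) : s;
    }
}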
Let’s consider a basic 2-column file:
50:0A:09:85:98:12:34:56, NetApp-123456
50:0A:09:86:98:12:34:56, NetApp-123456
10:00:00:00:c9:12:34:56, Oracle01
10:00:00:00:c9:12:34:57, Oracle01
We want to convert this to basic entities that can be aggregated to create upper-level entities. In the past, we’ve created the HBA Card as a way to offer WWPNs to an upper-level host, but within version 1 of the JSON format, permission to create HBA Ports and HBA Cards changes. With Version 4.0.2, we gained the ability to define a FibreChannel Port, still within version 1 of the JSON format. FCPorts are either HBA Ports or Storage Ports, and are available to the Entity-Creation Utility wherever either an HBA Port or a Storage Port is presented, so this is the most efficient way to offer a Name/WWPN mapping to VirtualWisdom4.
We can convert this 2-column CSV into JSON by a number of means. Since VI is a Java company, and Java does indeed “run anywhere”, I tend to use Java, both for portability and because it is an early-bound language (symbols are verified during development, not last-minute on the user’s desktop as with scripting languages), but that’s a personal preference.
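To make that concrete, here’s a bare-bones sketch of the hand-rolled approach in Java. To be clear, this isn’t vw4tools, and the “wwpn”/“name” fields below are placeholders rather than the real version-1 import schema; VW4’s own validation at import is the final word on structure.

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;

public class CsvToJsonSketch {
    public static void main(String[] args) throws IOException {
        List<String> entries = new ArrayList<>();
        for (String line : Files.readAllLines(Paths.get(args[0]))) {
            String[] cols = line.split(",", 2);
            if (cols.length < 2) continue;          // skip anything that isn't 2-column
            entries.add(String.format("  { \"wwpn\": \"%s\", \"name\": \"%s\" }",
                    cols[0].trim(), cols[1].trim()));
        }
        System.out.println("[\n" + String.join(",\n", entries) + "\n]");
    }
}

In practice you’d reach for a real JSON library (Jackson or Gson) rather than string concatenation, if only to get the escaping right; the conversion itself is trivial, and the hard part is knowing what structure VW4 wants on the other side.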
Let’s look at this flow:
[flow diagram: common format → conversion → JSON → VW4 import]
On our implementation, we’re going to use CSV as our common format, and replace our conversion step with vw4tools (an open-source pure-Java parser library), so our flow looks like:
[flow diagram: CSV → vw4tools → JSON → VW4 import]
Running Java
The hardest thing about this process is finding Java. If you don’t know how to run “java -version” for your platform, check that link.
Running vw4tools
VW4tools is run simply using the “-jar” option:
java -jar vw4tools.jar ...
If you have to use a full pathname to java, or you need to use a .exe, or both, this might look something like this:
"C:Program FilesCMCNE 12.0.2jre64binjava.exe" -jar vw4tools.jar
Please pay attention to the double-quotes: they’re what keep a path containing spaces together as a single argument.
Running vw4tools with an Input File
VW4Tools works like an inhale/exhale action: it inhales files to process, may do things with that information, then exhales JSON files (or other debug files).
To give vw4tools an input file to process, pass each file with a “--nickname=” or “-N” option and the URL of the file. For example:
java -jar vw4tools.jar -N "http://example.com/some/query?fabric=Neo&pill=red" ...
If the file is a local file, then you don’t need the “file://” prefix that would otherwise be required: the FibreChannel-Parsers library used by vw4tools automatically tries a local file if there’s no protocol (i.e. no “http://” nor “file://”). The formats and protocols supported by FibreChannel-Parsers are listed on Compatible File Formats. Without a protocol, a local file is checked, which we can leverage to be lazy:
java -jar vw4tools.jar -N switch45.csv ...
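If you’re curious how that “no protocol means local file” guess might work, a toy version could look like the following. This is my own illustration, not the actual FibreChannel-Parsers code:

import java.io.FileInputStream;
import java.io.InputStream;
import java.net.URL;

public class LazyOpen {
    static InputStream open(String spec) throws Exception {
        if (spec.contains("://")) {
            return new URL(spec).openStream();      // http://, file://, and friends
        }
        return new FileInputStream(spec);           // no protocol: assume a local file
    }
}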
You’ll see the underlying parser library tell you how successfully it parses the data using a number of different methods:
(vw4tools) parsed 0 zones, 3 aliases via NicknameParser
(vw4tools) parsed 0 zones, 3 aliases via VW4InvalidAddedParser
(vw4tools) parsed 2 zones, 7 aliases via ShowZoneParser
(vw4tools) parsed 4 zones, 6 aliases via AliShowZoneParser
(vw4tools) parsed 4 zones, 6 aliases via ShowZone2Parser
This is normal, and can signal when the parser you expected isn’t the one doing the best job. Of note: you don’t need to tell vw4tools what format the -N file is in. That’s intentional, to make it easier for jet-lagged engineers to get the job done.
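The trick behind “don’t tell me the format” is simply to run every candidate parser over the same stream and keep whichever recovers the most. The Parser interface below is hypothetical (it’s not the vw4tools API), but it shows the shape of the idea:

import java.util.List;
import java.util.Map;

public class BestParser {
    interface Parser {
        /** Returns alias-to-WWPN pairs recovered from the raw text; empty if nothing was found. */
        Map<String, String> parse(String rawText);
    }

    static Map<String, String> bestOf(List<Parser> candidates, String rawText) {
        Map<String, String> best = Map.of();
        for (Parser candidate : candidates) {
            Map<String, String> result = candidate.parse(rawText);
            if (result.size() > best.size()) {
                best = result;                      // most aliases wins
            }
        }
        return best;
    }
}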
To make it faster, we can do this with many files at once:
java -jar vw4tools.jar -N FabricA.csv -N FabricB.zoneshow -N FabricC.alishow -N FabricD.fcalias ...
This example shows files with “extensions” to hint at their content, but vw4tools is “the honey-badger of parsers”: it just don’t care. The file you feed it may be called whatever you like; it’s the stream inside it that matters.
Running vw4tools to an Output File
So that’s all well and good, but we want a usable JSON file for import. The “--nicknameout=” or “-o” option gets us there:
java -jar vw4tools.jar -N switch45.csv -o xyzcheese.json
Running this at the XYZCheeseFactory, that customer can immediately try this file against VW4’s validation before import. I’ll remind you that vw4tools doesn’t know what’s in your VW4 Appliance or OVA, and may give it things it doesn’t want to talk about yet, such as WWPNs it hasn’t yet discovered. This typically means additional steps are required, such as removing some hosts from the “inhaled” content, but we can cover that in a later article.
Conclusion
This should give a basic idea of how to convert a CSV into a usable JSON; that’s most of the way towards having FCPort entities which can be aggregated into Hosts and Storage Arrays.
By using a number of parsers at once, you don’t need to guess what format a file is in: just toss it into vw4tools to get a result.
You do need to make sure you have a “-o” to get the result “exhaled” for import.
As well, around December 2013, versions of vw4tools needed the “-o” right up against the output file name: “-oacme.json”, not “-o acme.json”. This has long since been fixed, but I have a habit of doing it the more cooperative way just to get things done.
Additionally, VW4 seems quite interested in the name of the file, so it must be a .json, not a .JSON. I don’t know how well it works if the file has no extension. This may signify that other import formats will some day be possible — for example, acme.xml. That, with a schema, would make the pre-deployment preparation much more predictable.