This script polls network switch ports to see which MAC addresses appear on them, then polls the ARP table
on one or more routers to match each MAC address to an IP address. It then uses reverse DNS to look up the name of the device.
The results are saved in a YAML file, which (if it exists) is also used to initialize the internal structures on the next run,
so when a device goes off the network, the information that it was there (and the date/time it was last seen) remains available.
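For illustration only (the script issues its own SNMP queries and may use different OIDs), the raw data it works from
can be retrieved with plain snmpwalk commands against numeric OIDs, which is why no MIB files are needed. Addresses and
community string below are placeholders:

  # Bridge forwarding table on a switch: MAC address -> bridge port (dot1dTpFdbPort)
  snmpwalk -v2c -c public 192.0.2.10 1.3.6.1.2.1.17.4.3.1.2

  # ARP table on a router: IP address -> MAC address (ipNetToMediaPhysAddress)
  snmpwalk -v2c -c public 192.0.2.1 1.3.6.1.2.1.4.22.1.2

  # Reverse DNS for the name of the device
  host 192.0.2.55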
Since the data is stored in a YAML file, a second script, mapSwitchesShow.pl, was created to convert it to a tab-delimited text
file for human viewing and/or further script processing.
Files:
mapSwitches.pl - the main script, which connects to each device in the configuration, gathers data, and updates the persistent storage
mapSwitchesShow.pl - converts stored data into a tab-delimited text file
mapswitches.cron - sample cron file to gather information hourly and generate the tab-delimited file once a day
The following files are not required for the system to run:
mapSwitches.config.yaml.sample - sample configuration file
README - This file
makeConfig.pl.sample - generates the configuration file from a Perl hash; easier for Perl programmers (see the sketch after this list)
pingall.pl - utility to ping every address on a /24 network; used to artificially refresh the ARP table
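As a rough sketch of the makeConfig.pl.sample approach (the hash keys here are placeholders, not the script's actual
configuration schema, which is defined by mapSwitches.config.yaml.sample), YAML::Tiny can dump a hash straight to the
configuration file:

  #!/usr/bin/perl
  use strict;
  use warnings;
  use YAML::Tiny;

  # Placeholder configuration hash; edit to match your own network.
  # The real key names come from mapSwitches.config.yaml.sample.
  my %config = (
      community => 'public',
      switches  => [ '192.0.2.10', '192.0.2.11' ],
      routers   => [ '192.0.2.1' ],
  );

  # Write the hash out as the configuration file
  YAML::Tiny->new( \%config )->write('mapSwitches.config.yaml');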
The configuration file MUST be created before running the script. It can be created manually
(rename mapSwitches.config.yaml.sample to mapSwitches.config.yaml and edit it) or generated with makeConfig.pl.sample
(edit the hash, then run the script).
The following file is created on first run:
mapSwitches.yaml
This is the persistent storage. It can be safely deleted at any time to reinitialize the system.
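The exact layout of mapSwitches.yaml is whatever the script writes; purely as a hypothetical illustration of the kind of
information kept per port (the key names and nesting below are invented), an entry might look something like:

  # Hypothetical example only - key names and structure are placeholders
  switch-1:
    Gi0/3:
      mac:      00:11:22:33:44:55
      ip:       192.0.2.55
      name:     printer.example.com
      lastseen: 2015-06-01 14:00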
The script requires snmp to be installed (no MIB files are required) and uses the Perl module YAML::Tiny. On a Debian-based
system, these can be installed with the command:
apt-get install libyaml-tiny-perl snmp
We generally run the gathering script hourly and generate the tab-delimited text file once a day. See the sample cron file
for how we do it.
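The paths and times below are placeholders (mapswitches.cron shows the schedule we actually ship); the idea is simply an
hourly data gather and a daily export:

  # Hypothetical crontab entries - adjust paths to your installation
  # Gather switch/ARP data at the top of every hour
  0 * * * *  cd /opt/mapSwitches && ./mapSwitches.pl
  # Export a tab-delimited report once a day at 06:30
  30 6 * * * cd /opt/mapSwitches && ./mapSwitchesShow.pl > /opt/mapSwitches/switches.txt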