
We use the "tulip" database to generate our sites.xml, which is then used to probe the landmarks. We added PingER nodes from the nodedetails database to the tulip database, subject to certain rules: we only added nodes that have a traceroute server. To implement this we developed three packages.

  •  TULIP/ANALYSIS/NODEDETAILNODES.pm
  •  insert_sites-xml.pl
  •  create_sites-xml.pl

TULIP/ANALYSIS/NODEDETAILNODES.pm

To build this module we used several existing Perl modules and scripts. To read data from the nodedetails database we use require '/afs/slac/package/netmon/pinger/nodes.cf', and we use the standard Text::CSV_XS package to convert the data to comma-separated values. For a node to be a candidate TULIP landmark it must satisfy a few conditions: it must have a traceroute server, it must not be set to NOT-SET, and its project type must not be set to "D", which means deleted in the nodedetails database semantics. Nodes that satisfy these conditions are collected into a separate array, which insert_sites-xml.pl then uses to insert the sites into the tulip database.
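The candidate test described above can be sketched as a simple predicate. This is an illustration only; the field names are assumptions, not the actual nodedetails schema:

```java
class LandmarkFilter {
    // Conditions from TULIP/ANALYSIS/NODEDETAILNODES.pm, expressed as a predicate.
    // The parameter names here are illustrative; the real nodedetails fields may differ.
    public static boolean isCandidate(boolean hasTracerouteServer,
                                      String hostName, String projectType) {
        if (!hasTracerouteServer) return false;       // must have a traceroute server
        if ("NOT-SET".equals(hostName)) return false; // unset entries are skipped
        if ("D".equals(projectType)) return false;    // "D" means deleted in nodedetails
        return true;
    }
}
```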

 insert_sites-xml.pl

This Perl script builds the insert query from the node data above. Again using the Text::CSV_XS package, we split the data into separate fields, which are then fed into the structure that holds the query parameters. The script resolves each host against the host names taken from NODEDETAILNODES.pm, which helps eliminate bad hosts if they exist. Before inserting new nodes into the database it checks whether the node is already present, using ipv4Addr as the unique key: we traverse the tulip database and check whether a node with the same ipv4Addr exists. If it does, we ignore the entry; if not, we insert it into the database.
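The duplicate check can be sketched as follows, with ipv4Addr as the unique key. This is an illustration of the logic only, not the actual database code:

```java
import java.util.LinkedHashMap;
import java.util.Map;

class SiteInserter {
    // ipv4Addr acts as the unique key, mirroring the check insert_sites-xml.pl performs
    private final Map<String, String> sites = new LinkedHashMap<>(); // ipv4Addr -> hostName

    // Returns true if the node was inserted, false if it already existed.
    public boolean insertIfNew(String ipv4Addr, String hostName) {
        if (sites.containsKey(ipv4Addr)) return false; // already in the tulip database: ignore
        sites.put(ipv4Addr, hostName);
        return true;
    }

    public int size() { return sites.size(); }
}
```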

create_sites-xml.pl 

This Perl script creates the sites.xml file, which our TULIP project then uses as its source of node and landmark information. It uses the Template library to generate the required XML: it traverses the tulip database, fetches each node, checks the service type, and writes the file with all the parameters available in the database.

TULIP Transition to sites.xml

Our next step in the process is to transform TULIP so that it reads its data from the generated sites.xml. TULIP version 1 had two different data sources: one fetched data from the nodedetails database, and the other fetched data from a list of PlanetLab sites.

In order to perform this step we used the Java Xerces API, which is often described as a good option in terms of resource usage and efficiency when parsing XML data. We chose the SAX model for the following reasons:

1) The list of landmarks will grow as more PingER monitoring sites are added.

2) We do not need to rewrite the document or add XML to it from TULIP.

The tutorial posted by Sun is helpful in understanding XML parsing. The tutorial can be found here
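A minimal SAX handler for a sites.xml-like document might look like the sketch below. The element and attribute names ("site", "hostName") are assumptions; the real sites.xml schema may differ:

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.SAXParser;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;

class SitesXmlHandler extends DefaultHandler {
    private final List<String> hosts = new ArrayList<>();

    @Override
    public void startElement(String uri, String localName, String qName, Attributes attrs) {
        // Hypothetical schema: collect the hostName attribute of each <site> element.
        if ("site".equals(qName)) {
            hosts.add(attrs.getValue("hostName"));
        }
    }

    public List<String> getHosts() { return hosts; }

    // Parse an XML string with the JDK's SAX parser (Xerces is the reference implementation).
    public static List<String> parse(String xml) throws Exception {
        SAXParser parser = SAXParserFactory.newInstance().newSAXParser();
        SitesXmlHandler handler = new SitesXmlHandler();
        parser.parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)), handler);
        return handler.getHosts();
    }
}
```

SAX fits the two reasons above: it streams the document with constant memory as the landmark list grows, and it is read-only, which is all TULIP needs.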

 TULIP Client

TULIP has a Java-based client that contains the important functions and classes for sending queries over the web to reflector.cgi. In addition to sending queries and getting/parsing responses, the TULIP client runs the algorithm that finds the location of the queried host. This section explains the technical details of the TULIP client, the classes used, and the important functions. The re-implementation of TULIP introduced proper packages and the use of Apache's Ant build tool for code compilation.

Dependencies

The current implementation of the TULIP Java client depends on two files for its execution:

  • sites.xml
  • /afs/slac/package/netmon/tulip/Initial.txt

The former is discussed above in this article. The latter is used to automate tests: it is a list of nodes/sites that TULIP tries to locate when it is run from the command line.

Code Deployment at SLAC

TULIP is deployed at SLAC in /afs/slac/package/netmon/tulip/src

The class hierarchy is organized into folders under the library and TULIP client hierarchy.

  • Library Folder
    • The lib directory contains third-party packages used in the implementation of TULIP. For example, we use the Java Matrix Class, which provides the fundamental operations of numerical linear algebra.
  • TULIP Folder
    • This folder contains the actual implementation of the TULIP client. It has two branches, core and utill. The core package contains the classes directly involved in the implementation of TULIP. The utill package contains the utility classes that support the core classes. For instance, to get the data we need to parse sites.xml: GetData is a core class, whereas the XML parsing class that helps it achieve its goal lives in the utill package.
TULIP Core Classes

TULIP core classes are divided into three sections.

  • Automate.java
    • This class controls the program. It contains the "main" function, which initiates the test and decides the flow of the program.
  • GetPingDataPL.java
    • This class handles everything related to parsing sites.xml, querying the reflector, and parsing the results. It uses Sites.java, a helper class in the utill package. Sites.java holds parameters such as location information, node IP address, and RTT values. The geo-location information is populated by parsing sites.xml; the rest of the parameters are parsed from the reflector's response.
  • Locate.java and PhysicalDistance.java
    • These classes contain the algorithm itself. Once GetPingDataPL has finished its work, Automate.java hands control to these two classes, which use the triangulation algorithm to identify the location of the host.
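As a rough sketch of the idea behind Locate.java and PhysicalDistance.java (the actual TULIP implementation may differ), triangulating from three landmarks with known coordinates and estimated distances can be reduced to solving a small linear system obtained by subtracting the circle equations pairwise:

```java
class Trilateration {
    // Locate a point in the plane from three landmarks p1, p2, p3 and
    // distance estimates r1, r2, r3 (e.g. derived from RTT measurements).
    // Subtracting the circle equation at p1 from those at p2 and p3
    // yields two linear equations in (x, y), solved here by Cramer's rule.
    public static double[] locate(double[] p1, double r1,
                                  double[] p2, double r2,
                                  double[] p3, double r3) {
        double a11 = 2 * (p2[0] - p1[0]), a12 = 2 * (p2[1] - p1[1]);
        double a21 = 2 * (p3[0] - p1[0]), a22 = 2 * (p3[1] - p1[1]);
        double b1 = r1 * r1 - r2 * r2
                  + p2[0] * p2[0] - p1[0] * p1[0] + p2[1] * p2[1] - p1[1] * p1[1];
        double b2 = r1 * r1 - r3 * r3
                  + p3[0] * p3[0] - p1[0] * p1[0] + p3[1] * p3[1] - p1[1] * p1[1];
        double det = a11 * a22 - a12 * a21; // landmarks must not be collinear
        return new double[] { (b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det };
    }
}
```

In practice the distance estimates from RTT are noisy, so a real implementation works with more than three landmarks and a least-squares fit rather than an exact intersection.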

Compiling and Running the Code:

The code has been restructured to work with a well-known build tool, Apache Ant. The main benefit of Ant is that you do not need to compile each class and set the paths by hand after editing; the build tool handles it all. The build.xml file sets the class paths when the code is compiled. The following steps compile and run the client.

  ant -f build.xml

This command compiles the code and creates a build directory in the same folder containing all the .class files. To run the code, run the following command from the build directory:

 java tulip.core.Automate

Selection of PlanetLab Sites

Selecting the PlanetLab sites is a multi-step process. In this section we highlight the scripts used in the process and their functions.

 generatePL.pl

The process starts with the script /afs/slac/package/netmon/tulip/generatePL.pl

This script is run from crontab and generates a list of all the available PlanetLab sites. The list is gathered from the following link:

 http://www.scriptroute.org:3967/

This script generates Sites-yyyy-mm-dd files in /afs/slac/package/netmon/tulip/sitesxml containing the nodes marked up by the scriptroute administrators.

analyzePL.pl

This script opens each file generated by generatePL.pl and counts the number of times each node appears across the files. This gives us an idea of the uptime of every node, and based on the results we choose which nodes to add to our repository (the tulip database) as preliminarily passed nodes.
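The counting step can be sketched as follows, where each inner list stands for the set of nodes found in one daily Sites-* file (this is an illustration of the logic, not the Perl script itself):

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

class UptimeCounter {
    // Count how many daily Sites-* files each node appears in.
    // A higher count suggests better uptime for that node.
    public static Map<String, Integer> countAppearances(List<List<String>> dailyFiles) {
        Map<String, Integer> counts = new HashMap<>();
        for (List<String> nodesInOneFile : dailyFiles) {
            for (String node : nodesInOneFile) {
                counts.merge(node, 1, Integer::sum);
            }
        }
        return counts;
    }
}
```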

After the preliminary evaluation we run TULIP from the command line and test the added nodes for their ping servers. The complete test has the TULIP client trace 107 sites. This generates a log file listing all the successful and failing landmarks.

tulip-log-analyze.pl

The purpose of this script is to parse the log file and give us a bird's-eye view of which landmarks are failing and which are successful. We then disable all those nodes that do not reply to requests from reflector.cgi. The script also reports the total delay/time taken by each landmark to respond. We will use this parameter for the final selection of our tier 0 landmarks.
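The aggregation this script performs can be sketched as below, assuming a hypothetical log line format of "&lt;landmark&gt; &lt;OK|FAIL&gt; &lt;delayMs&gt;" (the real TULIP log layout may differ):

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

class LogAnalyzer {
    // Sum the response delay per landmark, counting only successful probes.
    // Log line format here is hypothetical: "<landmark> <OK|FAIL> <delayMs>".
    public static Map<String, Double> totalDelayOfSuccessful(List<String> lines) {
        Map<String, Double> delay = new HashMap<>();
        for (String line : lines) {
            String[] f = line.trim().split("\\s+");
            if (f.length == 3 && "OK".equals(f[1])) {
                delay.merge(f[0], Double.parseDouble(f[2]), Double::sum);
            }
        }
        return delay; // failing landmarks are absent and can be disabled
    }
}
```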
