
There are two significant types of maintenance issues for TULIP. These are:

  1. Determining whether a landmark is working.
  2. Obtaining accurate landmark locations and detecting when a landmark's location changes.

The problem of whether the landmarks are working is discussed in the Laundering Landmarks section below.

The other significant problem is obtaining and maintaining accurate location information for the landmarks. Initially we obtained the locations of PlanetLab landmarks using geoiptool or geotool; however, we have since improved the methods as described below. We intend to run a nightly trscron job to check whether any host has changed its position according to geoiptool. The maintainPL.pl script is currently deployed at

 /afs/slac/package/pinger/tulip/maintainPL.pl
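In outline, such a nightly check compares the coordinates stored in the TULIP database against what geoiptool currently reports for each host. The following is a minimal sketch only; the database handle, the landmarks table and column names, and the lookup_geoiptool() helper are assumptions and not necessarily what maintainPL.pl actually uses:

 #!/usr/bin/perl
 # Sketch of a nightly geoiptool consistency check (assumed schema and helper names).
 use strict;
 use warnings;
 use DBI;

 # Assumed: a MySQL TULIP database with a 'landmarks' table holding host, latitude, longitude.
 my $dbh = DBI->connect('DBI:mysql:database=tulip;host=localhost', 'user', 'pass',
                        { RaiseError => 1 });

 my $rows = $dbh->selectall_arrayref(
     'SELECT host, latitude, longitude FROM landmarks', { Slice => {} });

 for my $row (@$rows) {
     # lookup_geoiptool() is a placeholder for whatever routine queries geoiptool
     # and returns the (lat, lon) it currently reports for this host.
     my ($lat, $lon) = lookup_geoiptool($row->{host});
     next unless defined $lat && defined $lon;

     # Flag hosts whose reported position has drifted noticeably from the database.
     if (abs($lat - $row->{latitude}) > 0.1 || abs($lon - $row->{longitude}) > 0.1) {
         print "$row->{host}: DB ($row->{latitude},$row->{longitude}) vs geoiptool ($lat,$lon)\n";
     }
 }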

We have also discovered problems with geoiptool for some hosts. For example, one challenging case involved two hosts in Brazil (planetlab1.pop-mg.rnp.br and planetlab2.pop-rs.rnp.br). A ping between the hosts gave a response of 25-30 ms, yet geoiptool placed both hosts at the same location, roughly in the center of Brazil. Inquiring further, we found the case interesting because the traceroutes to the two hosts followed different routes, i.e. traversed different routers to reach the destination. After confirming the actual latitudes and longitudes from their websites, we updated the database manually. To cater for problems like these, the hash named %errltln (Error in lat/long in geoiptool) contains the hosts that are not updated in the database by this script. The relevant code is given below:

 # Hosts whose geoiptool lat/long is known to be wrong; their manually
 # corrected coordinates in the database are not overwritten by this script.
 my %errltln = (
     'planet01.hhi.fraunhofer.de'     => '1',
     'planet02.hhi.fraunhofer.de'     => '1',
     'planet-lab1.ufabc.edu.br'       => '1',
     'cs-planetlab3.cs.surrey.sfu.ca' => '1',
     'planetlab1.pop-mg.rnp.br'       => '1',
     'planetlab2.pop-rs.rnp.br'       => '1',
     'csplanet02.cs-ncl.net'          => '1',
 );
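In the update loop, hosts listed in %errltln would simply be skipped so that their manually corrected coordinates are not overwritten. A rough sketch of how the hash might be consulted (the @hosts list and the update routine are assumptions, not the script's actual names):

 for my $host (@hosts) {
     # Skip hosts whose geoiptool coordinates are known to be wrong;
     # these were fixed by hand and must not be overwritten.
     next if $errltln{$host};
     update_location_from_geoiptool($host);   # assumed update routine
 }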

Laundering Landmarks


Finding the Latitude and Longitude of a Landmark Manually


TULIP Creating the Landmark XML files

There are two landmark XML files:
  1. sites.xml gives a list of the landmarks which are enabled (ability=1)
  2. sites-disabled.xml gives a list of the landmarks which are disabled (ability=0)

They are created from the TULIP database by two trscrontab jobs running create_sites-xml.pl:

create_sites-xml.pl > /afs/slac/www/comp/net/wan-mon/tulip/sites.xml
create_sites-xml.pl --ability 0 > /afs/slac/www/comp/net/wan-mon/tulip/sites-disabled.xml
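In outline, the script selects landmarks from the TULIP database by their ability flag and writes one element per row to standard output. The following is only a sketch; the table name, column names, and XML element layout are assumptions and may differ from the real create_sites-xml.pl:

 #!/usr/bin/perl
 # Rough outline of create_sites-xml.pl: dump landmarks with a given ability flag as XML.
 use strict;
 use warnings;
 use DBI;
 use Getopt::Long;

 my $ability = 1;                       # default: enabled landmarks
 GetOptions('ability=i' => \$ability);

 my $dbh = DBI->connect('DBI:mysql:database=tulip;host=localhost', 'user', 'pass',
                        { RaiseError => 1 });
 my $rows = $dbh->selectall_arrayref(
     'SELECT host, latitude, longitude FROM landmarks WHERE ability = ?',
     { Slice => {} }, $ability);

 print qq{<?xml version="1.0"?>\n<sites>\n};
 for my $row (@$rows) {
     printf qq{  <site host="%s" lat="%s" lon="%s"/>\n},
            $row->{host}, $row->{latitude}, $row->{longitude};
 }
 print "</sites>\n";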

The landmarks are enabled or disabled in the TULIP database by the laundering process.

These XML files are used by reflector.cgi, together with the ability parameter, to select the landmarks to be used.
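For example, reflector.cgi might choose which file to read based on the ability CGI parameter along these lines (a sketch only; the real parameter handling in reflector.cgi may differ):

 use CGI qw(param);

 # Choose the landmark list based on the 'ability' parameter (default: enabled sites).
 my $ability = param('ability') // 1;
 my $file = $ability
     ? '/afs/slac/www/comp/net/wan-mon/tulip/sites.xml'
     : '/afs/slac/www/comp/net/wan-mon/tulip/sites-disabled.xml';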

Using Google Fusion Tables to record landmarks

We analyzed the possibility of using Google Fusion Tables and their API to store landmarks from the TULIP database. Initial results indicate that keeping a Google Fusion Table in sync with a live URL, or having it update itself as changes happen in the database, is not currently possible. These tables work well for static data, but at present the TULIP database is updated daily: at least twice to enable or disable nodes, and at least once to update PlanetLab nodes.
For the moment we have concluded that we should skip this approach and wait for improvements and newer versions of the API.
