
Anjum believes the TULIP Geolocation application can be improved significantly; there are at least a few ideas worth trying. This work would suit either a group of undergraduate students or an active master's student, and could readily form the basis of a master's thesis.

  • See http://www.slac.stanford.edu/comp/net/tulip/. TULIP pings a target from landmarks at known locations, converts the minimum RTTs into distance estimates, and then applies multilateration to those distances to estimate the target's location.
  • To improve TULIP one needs the right selection of landmarks: good (working) landmarks at the right locations (not too far from the target), straddling the target, and with a reasonable estimate of the indirectness (directivity, or alpha) of the path from the landmark to the target, so that the distance can be estimated reasonably accurately. One also needs a reasonable density of landmarks (e.g. number of landmarks per 100,000 sq km).
  • The landmarks come from PingER and perfSONAR sites. Density is reasonable in the US, Pakistan, and Europe; currently Anjum is achieving better than 20 km accuracy for Pakistani targets.
  • As the number of landmarks goes up, so does the accuracy, but so does the time to make the measurements (pings).
  • One needs to find the optimal landmark density.
  • Anjum proposes to speed up the measurements by parallelizing them on a cluster, and to improve the region-based adaptation of alpha. He regards adaptive geolocation and parallelization as MS projects.
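The RTT-to-distance conversion and multilateration steps above can be sketched as follows. This is not TULIP's actual implementation: it is a minimal illustration that assumes a flat-plane coordinate system (real geolocation would use geographic coordinates), a signal speed in fiber of roughly 200 km/ms, and an illustrative alpha factor for path indirectness. The linearized least-squares solve is one standard way to do multilateration with three or more landmarks.

```python
import math

def rtt_to_distance_km(min_rtt_ms, alpha=1.0):
    """Estimate landmark-to-target distance from a minimum RTT.

    Light in fiber travels at roughly 200 km/ms; the RTT covers the
    round trip, so we halve it. alpha (>= 1) is the indirectness
    (directivity) factor: the larger it is, the less direct the path.
    The 200 km/ms figure and alpha handling are simplifying assumptions.
    """
    return (min_rtt_ms / 2.0) * 200.0 / alpha

def multilaterate(landmarks, dists):
    """Estimate (x, y) of a target from >= 3 landmarks and distances.

    Subtracting the first landmark's circle equation from the others
    gives a linear system, solved here via 2x2 normal equations.
    landmarks: list of (x, y) in km; dists: matching distances in km.
    """
    (x0, y0), d0 = landmarks[0], dists[0]
    a11 = a12 = a22 = b1 = b2 = 0.0
    for (xi, yi), di in zip(landmarks[1:], dists[1:]):
        ax, ay = 2.0 * (x0 - xi), 2.0 * (y0 - yi)
        rhs = di * di - d0 * d0 + x0 * x0 - xi * xi + y0 * y0 - yi * yi
        a11 += ax * ax
        a12 += ax * ay
        a22 += ay * ay
        b1 += ax * rhs
        b2 += ay * rhs
    det = a11 * a22 - a12 * a12
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)
```

With exact distances the linearized solve recovers the target exactly; with noisy RTT-derived distances (and mis-estimated alphas) the least-squares answer degrades, which is why landmark placement and alpha estimation matter so much.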
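Since pings are I/O-bound, the parallelization idea can be illustrated even without a cluster: issue the measurements from all landmarks concurrently instead of sequentially. The sketch below is a hypothetical helper, not TULIP code; `measure_rtt` is a placeholder for whatever actually triggers a ping from a landmark (e.g. an ssh-wrapped ping), and the worker count is an arbitrary choice.

```python
from concurrent.futures import ThreadPoolExecutor

def measure_all(landmarks, target, measure_rtt, max_workers=32):
    """Collect min-RTT measurements from many landmarks concurrently.

    landmarks: list of landmark identifiers (hostnames, say).
    measure_rtt(landmark, target) -> minimum RTT in ms; a stand-in
    here for however the landmark is actually asked to ping the target.
    Returns a dict mapping each landmark to its measured RTT.
    """
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        rtts = list(pool.map(lambda lm: measure_rtt(lm, target), landmarks))
    return dict(zip(landmarks, rtts))
```

A thread pool suffices because each measurement spends its time waiting on the network; a cluster, as proposed, would be the natural next step once the landmark count outgrows a single machine.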