Search engines discover websites by one of two means. First, crawler bots collect meaningful information from, for example, title tags, URLs and hyperlink data, to help with recall and search precision. Second, humans may submit URLs to a search engine's database. Every website should optimise for search engines because optimised sites reach a wider audience. But where is search moving, and how does local search know where websites are?
Geographic information technology
Geographical search was created by the evolution of the web: moving from a one-way medium (web 1.0) to a two-way medium (web 2.0) meant that locational data evolved alongside it. Goodchild (2007) identifies the four main geographic technologies:
- Universal Transverse Mercator (UTM)
- A grid-like coordinate system in which numeric coordinates pinpoint specific locations.
- Geotagging
- Inserting a standard piece of code into content to identify its location.
- Global Positioning System (GPS)
- Allows direct locations to be taken from portable devices with approximately 10-metre accuracy.
- Broadband / Internet connection
- All geographic information needs to be uploaded to the web, so an internet connection is required for geo-specific information.
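Geotagging, the second technology above, can be as simple as embedding coordinate metadata in a page's HTML. As a minimal sketch, the `geo.position` and `ICBM` meta-tag names are real, widely used conventions, but the helper function itself is illustrative rather than any standard API:

```python
def geotag_meta(lat: float, lon: float, place: str) -> str:
    """Return HTML meta tags that geotag a page to a lat/lon point."""
    # "geo.position" and "ICBM" are common geotagging conventions;
    # crawlers that understand them can read a page's location directly.
    return (
        f'<meta name="geo.position" content="{lat};{lon}">\n'
        f'<meta name="ICBM" content="{lat}, {lon}">\n'
        f'<meta name="geo.placename" content="{place}">'
    )

print(geotag_meta(53.4808, -2.2426, "Manchester"))
```

Pasting the resulting tags into a page's `<head>` gives crawlers an explicit locational signal rather than leaving them to infer one.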
Hyperlinks and localised search
Ntoulas, Cho and Olston (2004) found that only 24% of hyperlinks remain unchanged after a year, while around 25% of hyperlinks are newly created each week. This not only means search engine algorithms need regular updates, but also that the amount of data search engines must take into account greatly increases as web technology evolves. A URL, for example, will have its server and hosting information examined too. Hyperlink data goes hand-in-hand with geographic searching because URLs are pinpointed to specific hosts, and thus to locations. Geographic, or location-based, searching has not reached its full potential, and these signals, if you like, need to be accurately developed before they can be precisely recalled.
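The idea that a URL carries host-level location signals can be sketched in a few lines. This is an illustration only: the country-code TLD table is a tiny hypothetical subset, and real search engines combine many more signals than a domain suffix.

```python
from urllib.parse import urlparse

# Hypothetical ccTLD hints; a tiny illustrative subset, not a full mapping.
CCTLD_HINTS = {"uk": "United Kingdom", "de": "Germany", "fr": "France"}

def host_location_hint(url: str) -> str:
    """Return a rough country hint from a URL's host, if its TLD gives one."""
    host = urlparse(url).hostname or ""
    tld = host.rsplit(".", 1)[-1]          # e.g. "example.co.uk" -> "uk"
    return CCTLD_HINTS.get(tld, "unknown")

print(host_location_hint("https://example.co.uk/page"))  # United Kingdom
```

The "unknown" fallback mirrors the point made below: many hosts give no reliable geographic signal at all, which is part of why local search remains imprecise.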
Currently, local links boost local rankings in terms of SEO. Hyperlink data is not only important to analyse; link-building campaigns must also focus on the broader picture: defining clear aims and objectives is the most effective way to launch a focused campaign.
Search engine indexes are now three-dimensional, allowing locational data to be stored. Such data, however, is difficult to pinpoint precisely because webmasters may choose servers in foreign countries due to, for example, cheaper hosting plans.
You may find it interesting how large search engines deliver localised results.
Google Venice update
In 2012 Google released an update called "Venice". The Venice update means broad search queries return results ranked for the searcher's specific location, which Google infers from the IP address. For most searchers this is effective; however, it will always be ineffective for those who use virtual private networks or proxy servers. The future of localised search means a range of data will be collected and analysed to let search engines judge where a website is best placed: combining server, host and IP address data alongside URL information will make localised search more effective.
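IP-based localisation of the kind Venice relies on can be sketched as a longest-prefix lookup. Real search engines use large commercial geo-IP databases; the prefix table and function below are hypothetical stand-ins for illustration, and the VPN/proxy failure mode falls out naturally as the "unknown" case.

```python
# Hypothetical IP-prefix-to-city table; real geo-IP databases map
# millions of ranges, often down to city or postcode level.
IP_PREFIX_TO_CITY = {
    "81.2.": "London",
    "92.40.": "Manchester",
}

def locate_searcher(ip: str) -> str:
    """Guess a searcher's city from the longest matching IP prefix."""
    # Check longer (more specific) prefixes first.
    for prefix in sorted(IP_PREFIX_TO_CITY, key=len, reverse=True):
        if ip.startswith(prefix):
            return IP_PREFIX_TO_CITY[prefix]
    # A VPN or proxy exit IP that is not in the table ends up here,
    # which is exactly why Venice-style localisation fails for those users.
    return "unknown"

print(locate_searcher("81.2.69.160"))  # London
```

A searcher routed through a proxy in another country would match that proxy's prefix instead of their own, so the "local" results would be local to the wrong place.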
Do you use local search engines? Tweet Gerald and get involved.
Posted by Gerald Murphy
- Bar-Ilan, J. (1998) On the overlap, the precision and estimated recall of search engines: A case study of the query ERDOS. Scientometrics, 42(2), pp. 207–228.
- Goodchild, M.F. (2007) Citizens as sensors: The world of volunteered geography. GeoJournal, 69(4), pp. 211–221.
- Ntoulas, A., Cho, J. and Olston, C. (2004) What's new on the web? The evolution of the web from a search engine perspective. ACM.
- Ramsey, M. (2012) Understand and rock the Google Venice update. [Online] [Accessed on 22nd October 2013]