

Last Edit: 10/01/17

ALIWEB is a search engine that was launched in 1993, making it one of the earliest web search engines. It has been claimed that ALIWEB was the first search engine, but this claim is contested by a handful of other search engines launched during the same period. ALIWEB was developed by Martijn Koster, a Dutch software engineer who had previously designed a search engine for FTP. Compared to modern search engines, ALIWEB was fairly basic: it could search only the title, meta description, and meta keywords of a webpage. The WebCrawler robot, launched in 1994, could index and search the full text of a webpage, and ALIWEB's search results were soon surpassed. Its place as the first, or at least one of the earliest, web search engines is assured, however. Shown below is the search form that ALIWEB used in 1993, and still appears to use as of 2015:

The search form used by ALIWEB, the first search engine for the web, established in 1993.

ALIWEB is not a defunct website; it is still operational as of 2015. The search form is still in the same configuration as the original, so it is tempting to assume it still functions in the same manner, but this is probably a mistaken assumption, as the website appears to no longer be owned by Martijn Koster. However, as of 2015, Martijn Koster appears to have created a mirror of the ALIWEB database, which can be accessed at the FUNET.FI server.

Martijn Koster's creation of ALIWEB has had an ongoing impact upon search engines. Drawing on his experience with search engines and web crawlers, Koster proposed the 'robots exclusion standard' in 1994: this proposal outlined a standard that would allow websites to bar web crawlers (a type of Internet robot) from accessing their content and indexing it into a search engine database. Robots.txt files are used by most websites, and the major search engines - including Google, Bing, and Yahoo! - honour the directives in robots.txt files that allow or disallow their web crawlers from accessing content.
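As a brief illustration of the standard Koster proposed, a robots.txt file placed at the root of a website pairs a User-agent line (naming a crawler, or * for all crawlers) with Disallow or Allow directives. A minimal sketch, with hypothetical paths, might look like this:

    # Apply to all crawlers
    User-agent: *
    Disallow: /private/

    # Googlebot's crawler, identified by its user-agent name,
    # is permitted to access the whole site
    User-agent: Googlebot
    Allow: /

Compliance is voluntary: the file is a request, not an enforcement mechanism, and it is the crawler's choice to honour it - which the major search engines do.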