Our corporate intranet is a non-framed environment with both Lotus Domino and IIS (.NET and classic ASP) applications and content. We have between 300,000 and 500,000 pages of web content and documents across more than 1,200 “sites” on approximately 30 unique domains. We used to have Inktomi’s UltraSeek Server 3.0 as our intranet search engine, which was beginning to show its age (purchased in 1998). The Inktomi product did not handle attachments well (DOC, PPT, PDF, etc.), would not crawl our secured sites, and was no longer supported by the vendor. We did a cursory review of the search vendors and were immediately attracted to Google’s 30-day trial offer for their Google Search Appliance (GSA). After signing a standard agreement, they shipped us a brand new shiny yellow unit which we could test for 30 days before returning or purchasing.
The GSA is a “black box” 1U standard rack-mountable server. By “black box” I mean that Google gives you a web interface to administer the device but does not want you to access the operating system (a heavily Google-customized version of Linux). In fact, the license agreement stipulates that you will not tamper with the hardware or OS of the appliance in any way. The device has no need for a keyboard, mouse, or video; all you need for normal operation is a network cable and standard power input.
The GSA comes in different flavors to fit different needs, varying by hardware capacity and, correspondingly, license size. (Licensing is based on the number of URLs crawled by the appliance.) There are three hardware configurations: the GB-1001, GB-5005, and GB-8008. Pricing breaks down as follows:
GB-1001: 150K documents for $28K; 300K documents for $50K
GB-5005: 1.5M documents for $230K
GB-8008: 4M documents for $450K
As advertised, the GSA met all of our needs: it indexed the large variety of file types in our environment, accessed our secured content, offered a documented API, and so on. The Google brand power was another big selling factor. When we told our users that they were going to get a Google-based search engine, they knew their days of troubled searching were over. Lastly, the 30-day trial run experience we had with the GSA sealed the deal. The appliance is the easiest enterprise solution I’ve ever had to install, configure, and maintain. We were literally up and running within an hour of opening the shipping box.
The appliance has two network ports on the back panel: one for normal operation and the other used exclusively for network configuration. To configure the network settings, we connected a laptop to the appliance via the included orange Ethernet cable (some pin-outs are non-standard). The installation process was about as easy as one can imagine for a “black box.”
First we plugged in the normal-operation network cable and then the power. The power plug on the appliance IS the power switch: plug it in to turn it on, unplug it to turn it off. After plugging it in, we waited about five minutes for the appliance to play a tune, which is the signal to continue. Next, we hooked up our laptop (already set to DHCP mode) to the appliance’s configuration port and powered the laptop up. After logging in to the laptop and confirming it had the correct IP assigned by the appliance’s built-in DHCP server, we were ready to configure the network settings. Total elapsed time (excluding rack mounting): 10 minutes.
Network configuration, like normal administration, is done entirely through a browser and is a simple five-step process. The first screens ask for basic network information: IP address, subnet mask, default gateway, and DNS. Subsequent screens collect the SMTP server, the “From” address for GSA notification messages, the time zone, NTP (time) servers, and the admin account name/password. The last step is to test a few of the URLs you will be crawling to make sure you’ve done the setup correctly. After a final settings-review screen, configuration is complete; you can then unplug your laptop and get to the good part: crawling. Total elapsed time: 10 minutes.
Crawling the site(s)
All administration of the GSA is done remotely via the URL provided. After logging in with the ID/password we set in the previous step, we were presented with the Administration console. We created a new collection to hold our index, put in the “Start crawling from” URL, copied that same URL into the “Follow and Crawl only URLs with the Following patterns” box, and we were done. We saved our settings and then clicked the “Start crawling” button. We then went over to the “Crawl status” screen and watched the “Crawled URLs” counter increase. Google advertises a crawl rate of roughly 4,000 URLs every 15 minutes. We found crawl time increased significantly when documents (Word, PDF, Excel, etc.) were linked from those URLs.
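For us, that initial configuration amounted to only a couple of lines. A hedged illustration of what those two admin-console boxes might contain (the hostname is made up; a trailing slash makes the entry act as a prefix pattern):

```
# Start crawling from the following URLs:
http://intranet.example.com/

# Follow and crawl only URLs with the following patterns:
http://intranet.example.com/
```

With the follow pattern matching the start URL’s prefix, the crawler stays inside the intranet rather than wandering out to every externally linked site.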
After the crawl is done, the collection is automatically indexed and then checked against the Serving Prerequisites (any criteria you wish to use to determine whether an indexed collection should go to production). The collection is then either moved to Production (and consequently made searchable) or moved to Staging. The Staging area lets you validate new crawls before letting users search against them.
After your first crawl you may find the need to go back and tweak the crawling parameters. Google gives you a good amount of control over how sites are crawled, the frequency, how many threads are used, and so on. For secured sites, the GSA supports Basic Authentication, and an additional security module is available that supports Forms Authentication. The most challenging configuration aspect for us was determining the right combination of URL patterns to exclude from the crawl. If you are a Domino shop looking to use the GSA, you may need to spend some time getting the crawler configuration just right to handle the sometimes convoluted Domino query string parameters.
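To give a flavor of the kind of exclusions involved, here is a hypothetical “Do Not Crawl URLs with the Following Patterns” entry for Domino-style query strings. The Domino parameters shown are illustrative, and the contains:/regexp: lines assume the pattern syntax described in the GSA’s admin documentation:

```
# Skip Domino navigation/forms views that duplicate document content:
contains:?OpenAbout
contains:?OpenHelp
contains:?SearchView
regexp:\?Edit(Document|View)
```

Without exclusions like these, the crawler can index many URLs that all render the same underlying Domino document, burning through the URL-based license for no gain.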
After we got the crawl parameters tuned and the first complete crawl done, we did some testing to see whether the crawler had grabbed all the content. Browsing our site and searching for strings buried deep inside the taxonomy, we consistently found the GSA had crawled them accurately. We also tested with strings inside PDF documents, PowerPoint presentations, and the like. When we did come across something that hadn’t been crawled, careful analysis always showed that we needed to do some more tweaking of the crawl settings.
Other notable features
Google also gives you a KeyMatch tool that allows you to specify which indexed documents should appear at the top of the results page for a given query. These manifest themselves almost identically to the Sponsored Links at the top of the results page of the Google we all use. A Synonym tool allows you to specify alternate words or phrases for search queries. For example, if someone searches for WCM, you can suggest “Web Content Management” at the top of the results page.
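KeyMatches can also be maintained in bulk rather than one at a time. The sketch below assumes a CSV import format of term, match type, URL, and result title (as in the appliance’s admin help); the terms and URLs are made up:

```
benefits,KeywordMatch,http://hr.example.com/benefits/,Employee Benefits Home
expense report,PhraseMatch,http://finance.example.com/expenses/,Expense Reporting
```

This makes it practical to hand KeyMatch curation to the intranet content owners as a spreadsheet rather than an admin-console chore.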
An output format feature lets you control (via an XSLT) the presentation of the search results. You can use this for changing the fonts, colors, logo, header, etc. of the results page. We were able to easily remove the “Cached” feature on the results page with some XSLT modifications.
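If you would rather bypass the XSLT layer entirely, the appliance can return raw XML through its search protocol (an output=xml_no_dtd parameter) so you can render results yourself. Below is a minimal Python sketch; the hostname and collection name are made up, and the sample response is a trimmed illustration of the GSA’s result format:

```python
from urllib.parse import urlencode
from xml.etree import ElementTree


def build_search_url(host, query, collection):
    """Build a GSA query URL that requests raw XML instead of the XSLT page."""
    params = urlencode({
        "q": query,
        "site": collection,        # which collection to search
        "output": "xml_no_dtd",    # raw XML, bypassing the stylesheet
        "client": "default_frontend",
    })
    return "http://%s/search?%s" % (host, params)


# Trimmed, illustrative response: each <R> element is one result,
# with <U> (URL) and <T> (title) children.
SAMPLE_XML = """<GSP VER="3.2">
  <RES SN="1" EN="2">
    <R N="1"><U>http://intranet.example.com/hr/</U><T>HR Home</T></R>
    <R N="2"><U>http://intranet.example.com/it/</U><T>IT Support</T></R>
  </RES>
</GSP>"""


def parse_results(xml_text):
    """Extract (url, title) pairs from a GSA XML response."""
    root = ElementTree.fromstring(xml_text)
    return [(r.findtext("U"), r.findtext("T")) for r in root.iter("R")]


if __name__ == "__main__":
    print(build_search_url("gsa.example.com", "expense report", "intranet"))
    print(parse_results(SAMPLE_XML))
```

Fetching the XML and rendering it in your own application gives you full control over presentation, at the cost of doing the result-page work the built-in XSLT would otherwise handle.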
The Reporting tool lets you run reports on search queries over various time ranges. It will show you the number of searches per day, per hour, the top 100 keywords and top 100 queries for the time period specified.
The GSA is not for organizations looking to index their shared network drives, as the appliance has no facility for crawling file systems. This is really too bad, as many companies struggle with the huge quantities of unstructured content stored on their networks. Of course, there are a plethora of other products out there for exactly this issue.
Direct access to databases (e.g., SQL, Oracle) is another area that is off-limits for the GSA, as is any kind of integration with content or document management systems.
Author: Bryan A. Mjaanes