Set zoom level of a Google Map so that all markers are visible

Hi out there,
A couple of months ago I needed to set the zoom level of a Google Map so that all markers in the map are visible. After hours of reading the API documentation, I did not find any solution on that issue, so I wrote a little Javascript function that does exactly that.

This function has to be called after all markers have been inserted into the map.
Here I set up a quick example: http://www.adick.at/wp-content/uploads/map/

EDIT:
The above example is deprecated. It is based on a function I wrote some months ago; there is now a better and more common way of doing this: http://econym.org.uk/gmap/example_map14.htm
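The linked example leans on the API's own bounds helpers, but the underlying idea can be shown without the Maps API at all. Below is a self-contained sketch of the math: find the bounding box of all markers, then pick the largest zoom level at which that box still fits the viewport. The tile constants (a 256 px world at zoom 0, doubling per level, max zoom 18) are standard Web Mercator assumptions, not values from the original function.

```javascript
// Pick the largest zoom level at which all markers fit the viewport.
// This mimics what the Maps API's getBoundsZoomLevel()/fitBounds()
// do for you; the projection math is standard Web Mercator.
function zoomForMarkers(markers, mapWidthPx, mapHeightPx) {
  var minLat = Infinity, maxLat = -Infinity,
      minLng = Infinity, maxLng = -Infinity;
  markers.forEach(function (m) {
    minLat = Math.min(minLat, m.lat); maxLat = Math.max(maxLat, m.lat);
    minLng = Math.min(minLng, m.lng); maxLng = Math.max(maxLng, m.lng);
  });

  // Project latitude with the Mercator formula so vertical spans are
  // measured in the same normalized "world" units as longitude.
  function mercY(lat) {
    var s = Math.sin(lat * Math.PI / 180);
    return 0.5 - Math.log((1 + s) / (1 - s)) / (4 * Math.PI);
  }

  var worldPx = 256; // world size in pixels at zoom 0
  var lngFrac = (maxLng - minLng) / 360;
  var latFrac = mercY(minLat) - mercY(maxLat); // mercY grows southward

  for (var z = 18; z >= 0; z--) {
    var world = worldPx * Math.pow(2, z);
    if (lngFrac * world <= mapWidthPx && latFrac * world <= mapHeightPx) {
      return z;
    }
  }
  return 0;
}
```

With the real API you would instead extend a bounds object with each marker position and let the map compute the zoom itself; the sketch above just shows what happens under the hood.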

Cheers
Alex

Blocking web crawlers on lighttpd

Note: The information contained in this post may be outdated!

Nutch ignored my robots.txt (for whatever reason, I was unable to figure out why), so I had to find another way to keep the crawler out of those directories.

I finally came up with this neat piece of config for lighty:

It throws an HTTP 403 when both the defined User-Agent and URL patterns match.
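Since the snippet itself is not reproduced above, here is a sketch of what such a lighttpd rule can look like. The User-Agent pattern and the paths are placeholders, not the ones from the original post:

```
# Deny a specific crawler access to specific paths with a 403
$HTTP["useragent"] =~ "Nutch" {
    $HTTP["url"] =~ "^/(private|search)/" {
        url.access-deny = ( "" )
    }
}
```

An empty string in `url.access-deny` matches every file, so everything under the matched paths is denied for that crawler while remaining reachable for everyone else.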

Nutch – meta description in search results

Note: The information contained in this post may be outdated!

Hello out there!

Today I’m gonna show you how to tell Nutch to display your page’s meta description in the search results.

In order to do so, we need to write a plugin that extends two different extension points. Additionally, the OpenSearchServlet needs to be extended so that your description info gets shown. (I perform searches via the OpenSearchServlet; extending the default search.jsp should work similarly, I guess.)

First, the HTMLParser needs to be extended to get the description content out of the meta tags. Then we need to extend the IndexingFilter to add a description field to the index.
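The first step, pulling the description out of the meta tags, boils down to something like the sketch below. This is illustrative plain JavaScript only; the actual Nutch plugin does this in Java against the parsed document, and the regex here assumes the common attribute order:

```javascript
// Illustrative only: extract the content of <meta name="description" ...>.
// A real Nutch parser extension works on the parsed DOM, not on raw HTML,
// and would also handle reversed attribute order.
function extractMetaDescription(html) {
  var re = /<meta\s+name=["']description["']\s+content=["']([^"']*)["']/i;
  var match = re.exec(html);
  return match ? match[1] : null;
}
```

The extracted string is what the indexing step then stores as the `description` field.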

Nutch – prevent sections of a website from being indexed

Note: The information contained in this post may be outdated!

Nutch by default indexes the entire HTML document, which means that essentially every single word of a page ends up in its index entry. When you have common boxes on your website, e.g. a sidebar or a footer (which applies to almost all websites nowadays), Nutch adds the terms from those common boxes to the index of every crawled page.

Searching for a term contained in one of those common boxes then yields loads of results, since the term is associated with every page Nutch has crawled so far.

Now I thought an ideal solution would be telling Nutch to ignore specific sections. A good and common way of doing this kind of thing is to use HTML comments, say <!--nutch_noindex--> … content not to be indexed … <!--/nutch_noindex-->. These comments can then be wrapped around our sidebar or footer, preventing our nutchie from indexing them.
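The stripping step itself is simple. Here is a plain JavaScript sketch of the idea; the real implementation sits inside the Nutch parser (in Java), this version just demonstrates what happens to the page text before it reaches the indexer:

```javascript
// Remove everything between <!--nutch_noindex--> and <!--/nutch_noindex-->
// so that wrapped sections (sidebar, footer, ...) never reach the index.
// The non-greedy match handles several marked sections per page.
function stripNoindexSections(html) {
  return html.replace(/<!--nutch_noindex-->[\s\S]*?<!--\/nutch_noindex-->/g, '');
}
```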

If this is what you are looking for, read on.