When people think about web scraping in Python, they usually think BeautifulSoup.
When I think about web scraping in Python, I think Mechanize. BeautifulSoup is great for parsing already-downloaded HTML files, but it doesn't have the web-navigating features Mechanize does, such as stateful browsing and easy form filling (unless I'm missing something).
I've been thinking about checking out Mechanize, but I've been using BeautifulSoup plus a small library of helper functions for a while and haven't found the time to familiarize myself with it.
BeautifulSoup lacks easy form-filling functions, but they're pretty easy to write, assuming the site doesn't do anything really funky with its parameters. Basically, just grab all the input and select elements from a given form, put them in a dict, update the values you care about, and urlencode; see the sketch below.
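A rough sketch of what that helper might look like, using requests for the stateful part. The URL, field names, and form structure here are placeholders, and real sites may need extra handling (hidden CSRF tokens, JavaScript-set values, multi-value selects, and so on):

```python
from urllib.parse import urlencode, urljoin

import requests
from bs4 import BeautifulSoup


def form_to_dict(form):
    """Collect default name/value pairs from a form's input and select elements."""
    data = {}
    for inp in form.find_all("input"):
        name = inp.get("name")
        if name:
            data[name] = inp.get("value", "")
    for sel in form.find_all("select"):
        name = sel.get("name")
        if not name:
            continue
        # Prefer the pre-selected option, otherwise fall back to the first one.
        option = sel.find("option", selected=True) or sel.find("option")
        data[name] = option.get("value", option.text) if option else ""
    return data


session = requests.Session()
page = session.get("http://example.com/login")            # placeholder URL
form = BeautifulSoup(page.text, "html.parser").find("form")

data = form_to_dict(form)
data.update({"username": "alice", "password": "secret"})  # placeholder field names

# Resolve the form action relative to the page and submit with the right method.
action = urljoin(page.url, form.get("action", ""))
if form.get("method", "get").lower() == "post":
    response = session.post(action, data=data)
else:
    response = session.get(action + "?" + urlencode(data))
```

Using a requests.Session gives you cookie persistence across requests, which covers most of the "stateful browsing" part; the rest is just parsing and resubmitting forms like this.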