Bennett Haselton wrote:
> When you visit a site in Firefox, it sends a query to
> safebrowsing.clients.google.com asking if the site is safe. Then if
> it gets back a response saying that Google has the site on its list
> of malware-infected sites, Firefox displays a warning saying the site
> has been infected with malware, before giving you the option to
> proceed to the page.
> Internet Explorer could protect a lot of users from malware infection
> by using the Google feed as well. Does anyone know of any public
> statements from either Google or Microsoft, as to why they don't do
> this? Does Google let anybody use the
> safebrowsing.clients.google.com feed who wants to, and it's
> Microsoft's choice not to have their browser query it as well? Or
> would Google for some reason not want IE to query their database the
> way Firefox does, unless Microsoft paid them a fee or something?
> In terms of protecting people from malware, this seems like really
> low-hanging fruit to pick, given that the system has already been
> implemented for Firefox.
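For context on the mechanism Bennett describes: the Firefox client
doesn't send every URL to Google in the clear; it keeps a locally
updated list of short SHA-256 prefixes of known-bad URLs and only
contacts the server (for full hashes) when a prefix matches. A rough,
illustrative sketch of that local-check idea, with a made-up blocklist
entry (not the real wire protocol or canonicalization rules):

```python
# Illustrative sketch of a Safe Browsing-style local check.
# NOT the real protocol: real clients canonicalize URLs, try several
# host/path combinations, and confirm any hit with a server round trip.
import hashlib

def url_prefix(canonical_url: str, n: int = 4) -> bytes:
    """First n bytes of the SHA-256 of a canonicalized URL."""
    return hashlib.sha256(canonical_url.encode("utf-8")).digest()[:n]

# Hypothetical local blocklist of 4-byte prefixes (updated periodically).
bad_prefixes = {url_prefix("evil.example/")}

def looks_suspicious(canonical_url: str) -> bool:
    # A prefix hit only means "ask the server for full hashes" in the
    # real design; a miss means the URL is not on the downloaded list.
    return url_prefix(canonical_url) in bad_prefixes
```

Nothing in that scheme is browser-specific, which is part of why the
question of IE consuming the same feed reads as a policy question
rather than a technical one.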
You actually trust Google's classification of what is malware? I tried
visiting a site that was *discussing* some malware and gave some links
to example exploits (which you had to follow instructions to make
actually usable), and Google claimed it was a malware/malicious site.
Turned out it wasn't the obfuscated code but some wording on the site
that triggered Google to mark it "bad". Stupid.
You've never heard of false positives in anti-virus, anti-malware, and
other security-related software? So just how are you going to "fix" a
mis-classification by Google? I doubt Google handles pages that employ
dynamic code obfuscation (http://www.finjan.com/Content.aspx?id=1456),
so Google not saying it is a bad site does NOT make it a clean site.
So you have some false positives. And you have false negatives (which,
I suspect, are so prevalent that Google not alerting on a site says
nothing about the safety of that site).
Because Google uses an interstitial page to display its alert, it may
not be seen by a user of a popup blocker. Some popup blockers eliminate
interstitial pages because they are predominantly used by spammy sites
(although I have seen some login pages go bouncing through interstitial
pages to complete the login).
Unlike lookups done in parallel by a search engine, like when you do a
Google search, where the search results are shown immediately while the
malware lookup runs in parallel and updates the result list when the
lookup data becomes available, sticking an interstitial page in front
of the web site's page will ALWAYS slow your access to that web page.
Plus, you're adding even more statistics to Google regarding where
users are navigating (yes, they help protect you against *possibly*
infected sites, but they're also tracking all this surfing, too). Say
you were to install Google's Toolbar (don't argue about whether you
would or not since the point is not about installing the Toolbar but
how you might configure it). Would you actually enable their PageRank
feature, which reports your navigation habits to them? Well, if you
disabled it there, why are you willing to give all that same
information via another route?
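The parallel-vs-blocking contrast above can be sketched with a toy
async model (hypothetical functions; the short sleep merely stands in
for the network round trip of a safety lookup):

```python
# Toy model of the two designs the post contrasts:
#  - a search page renders immediately and annotates results when the
#    (hypothetical) malware lookup finishes;
#  - an interstitial must block the page load until the check is done.
import asyncio

async def malware_lookup(url: str) -> bool:
    await asyncio.sleep(0.01)        # stand-in for the network round trip
    return url.endswith(".bad")      # made-up "bad site" convention

async def search_then_annotate(urls):
    print("results shown")           # user sees the list right away
    flags = await asyncio.gather(*(malware_lookup(u) for u in urls))
    return dict(zip(urls, flags))    # warnings filled in afterwards

async def interstitial(url: str) -> str:
    flagged = await malware_lookup(url)  # page load blocks on this
    return "warning page" if flagged else "site"
```

Either way the lookup costs the same; the interstitial design just
makes the user pay for it before seeing anything.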
By the way, Google gets their "bad" site list from StopBadware.
According to their web site's home page, they had 333,276 sites
reported as "bad". Does that really sound like a lot of bad sites to
you? As of February 2007, the Netcraft Web Server Survey found
108,810,358 distinct websites, and in June 2009 they reported
238,027,855 sites. So a bit over one-tenth of one percent (0.14%) of
those sites are listed as "bad" by StopBadware. So why would such a
service even exist if it lists so damn few sites as bad? Also remember
that any such ranking by Google is based on whenever it last crawled
across a web site so the ranking is old. It doesn't reflect the state
of the site NOW. A clean site might be infected now but not when Google
checked it. An infected site might now be clean (from webmaster
reports, it appears to take two weeks before a submission to
reclassify a site is even seen at Google, and over a month before a
site gets reclassified from bad to good, or from falsely accused to
good).
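For what it's worth, the arithmetic on those two published figures can
be checked directly:

```python
# StopBadware's reported count vs. Netcraft's June 2009 site count.
bad_sites = 333_276
total_sites = 238_027_855
pct = 100 * bad_sites / total_sites
print(f"{pct:.2f}%")  # prints 0.14%
```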
Web sites are appearing at a rate far faster than these services can
rate them for safety. Plus any such ranking is old: users of McAfee's
SiteAdvisor are already aware of how stale those rankings are, and
Google doesn't seem to be any faster at staying current even on the
sites it has already ranked. Rather than rely on some antiquated list
that is full of errors (false positives/negatives) and covers only a
tiny fraction of the web sites out there, it seems you want something
that interrogates the web page NOW.
Avast and other anti-virus+anti-malware programs have their web
"shields" or "guards" to interrogate the content of web pages as they
are NOW when downloaded into your web browser. Finjan has their toolbar
to scan pages for malicious content but their free version only works
with search engine results which means they are useless when you
actually visit the web page unless you pay for their commercial products
that sit upstream of your host. Online Armor (paid version only)
includes DNS spoof checking to make sure you end up where you think
you're going, to prevent being phished off to a fake site.
The lists can't keep up with the rate at which sites appear. You need
to know about a site as it exists now, not what it was a week, a
month, or longer ago.