Webmaster tools access denied

It's strange that Webmaster Tools shows access denied errors for URLs with www but not for those without it.

The access denied errors in Google Webmaster Tools that you're seeing only for URLs containing www are most likely because Googlebot can't access those URLs. Some common reasons for that are listed here: Google Webmaster Tools: Access denied errors.

Specific to your situation, since this affects only URLs containing www and not those without, there is likely an issue with either your web server's configuration or your DNS settings, in how requests for the www subdomain are being handled.
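On Apache, for instance, this usually means checking that the www host is actually covered by your virtual host (and that DNS has an A or CNAME record for www). A minimal sketch, assuming Apache and the placeholder domain example.com:

```
<VirtualHost *:80>
    ServerName example.com
    # Without this alias, requests for www.example.com may fall
    # through to a default vhost and be denied.
    ServerAlias www.example.com
    DocumentRoot /var/www/example
</VirtualHost>
```

If www resolves but lands in the wrong vhost, Googlebot can get a 403 for www URLs while the bare domain works fine.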

A bot is crawling your site. To confirm this, check your access log and see whether the requests always come from the same IP (or almost the same IP). Alternatively, someone wants to deface your site or troll you: they may run a site with good traffic and have put an iframe in it pointing at your site, so every user who visits their site loads yours in the background. They may also have bought cheap adf. traffic.

To find out, check your referrers and see where the traffic is coming from. Another possibility: your site has been linked somewhere on a famous high-traffic site. Again, check your referrers. I don't know which server you have installed, so I can't tell you how to check IPs and referrers. I have a hunch this is related to Googlebot not being able to crawl URLs containing cdn-cgi. If so, please see this. If not, please see the following information about crawl errors and CloudFlare.
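Whatever server you run, the check is the same: count how often each client IP appears in the access log. A minimal sketch in Python, using hypothetical sample lines in common log format (the first whitespace-separated field is the client IP):

```python
from collections import Counter

# Hypothetical access-log lines (sample data, not real traffic).
log_lines = [
    '203.0.113.5 - - [10/Oct/2013:13:55:36] "GET / HTTP/1.1" 200 2326',
    '203.0.113.5 - - [10/Oct/2013:13:55:37] "GET /page HTTP/1.1" 200 1024',
    '198.51.100.7 - - [10/Oct/2013:13:56:01] "GET / HTTP/1.1" 200 2326',
    '203.0.113.5 - - [10/Oct/2013:13:56:02] "GET /other HTTP/1.1" 200 512',
]

# The client IP is the first field of each line in common log format.
ip_counts = Counter(line.split()[0] for line in log_lines)

# A single IP dominating the counts suggests a bot rather than real visitors.
for ip, hits in ip_counts.most_common(3):
    print(ip, hits)
```

In practice you would read the lines from your real log file (e.g. /var/log/apache2/access.log) instead of the inline list.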

As of today I'm only blocking Baidu, as I find it bothers me every day. Bing should learn from them; what a lazy spider Bing is. This is my robots.txt. Wrote: you can use the code below for robots.txt. I had actually seen this robots.txt. I'm glad it looks like that's it, then.
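For reference, a minimal robots.txt that blocks only Baidu's crawler while leaving everyone else unrestricted could look like this (Baiduspider is Baidu's documented user-agent token; an empty Disallow means "nothing is disallowed"):

```
User-agent: Baiduspider
Disallow: /

User-agent: *
Disallow:
```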

Can I ask, is the use of the directive "Allow" correct? Shouldn't it be "Disallow:"? I'm asking because I have read that Allow should never be used, since not all bots understand it, whereas Disallow is the directive every robot understands. Thanks again, Find. I was just reading more on it, and apparently "Allow:" is useful when you disallow a lot of folders and then want to ensure Google does crawl the rest of the site.

The only problem is that I'm reading some bots, especially the older ones, sometimes take Allow as Disallow. And I'm reading rumors that Bingbot is as stupid as it is lazy, so it may interpret Allow as Disallow. My question is: if I remove the Allow line, will Google still crawl everything except what is mentioned in the Disallow lines above? I'm thinking that yes, that'll be fine, but I want to double check to finish now. Also, I'm getting more and more errors, and even freaking errors.
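Yes: anything not matched by a Disallow rule is allowed by default, so dropping the Allow line changes nothing for the unlisted paths. You can verify this locally with Python's standard urllib.robotparser; the rules below are placeholders standing in for whatever your real Disallow lines list:

```python
from urllib import robotparser

# Hypothetical robots.txt with no "Allow:" line at all --
# substitute your own Disallow entries.
rules = [
    "User-agent: *",
    "Disallow: /wp-admin/",
    "Disallow: /cgi-bin/",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Paths matching a Disallow rule are blocked...
print(rp.can_fetch("Googlebot", "http://example.com/wp-admin/settings"))
# ...and everything else defaults to allowed, even without an Allow line.
print(rp.can_fetch("Googlebot", "http://example.com/blog/my-post"))
```

This prints False for the disallowed path and True for the unlisted one, which is exactly the "everything else is crawlable" behavior you're asking about.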

I have just seen that the plugin folder has a robots.txt of its own. Even so, I would have asked this question just in case, but I hope this issue of mine helps others searching for answers in the future.

Answered by Jobin.



