I’ve been having quite a bit of trouble with my blog since I switched from WordPress.com to WordPress.org. When I switched, I bought hosting with SiteGround, who seemed to be the best option for me at the time, as they would migrate my site for me and I could basically just continue blogging with different hosting. However, before I signed up for their plan, I noticed that the plan I was looking at mentioned that it was ideal for around 10,000 views a month. I thought to myself that it should be heaps, but I decided to double check with their staff what would happen if I went above 10,000.
I was told not to worry about it, as the 10,000 was not a restriction, simply an indication of what plan would suit me best based on how many views I get. I get way less, so I was happy with that and went ahead and signed up and paid the fees.
Now, there was no mention of the fact that apparently ‘bots’ from search engines would be ‘crawling’ my site to index it. I found this out when I got an email from SiteGround saying my site had been restricted and couldn’t be accessed, and I had to contact their support team again. I mean, I totally understand that my site would be indexed, which I’m more than fine with, seeing as search engines are a great way to get traffic to your site. But apparently those bots are maxing out my allowance of 10,000. Yeah, apparently it’s now an ‘allowance’. When I contacted their support team this morning to query this, I was told that if I go over that allowance, my site may be restricted for 24 hours. Oh, great.
I let them know that my decision to go with them might have been different if I had known that this was the case, and that I had been given the wrong information, but they didn’t really seem to care. I was advised to edit the robots.txt file on my domain to prevent search engines from indexing my site, with no further explanation. That to me sounded like my blog simply wouldn’t come up in search engines, which is not a resolution that I’m happy with at all.
So I contacted their support this afternoon to discuss the matter again. The staff member I dealt with this time was by far the most helpful of them all. He explained that they could edit the robots.txt file for me, and in fact create it, as I didn’t even have one. Not only that, they could make it so that official search engines such as Google, Yahoo, MSN etc. can still index my site, but unknown ones, or ‘bad ones’ as he called them, would be blocked. That certainly sounds a lot better!
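I don’t know exactly what SiteGround put in my file, but from what I understand, a robots.txt along these lines does roughly what he described: it tells the crawlers you name (the bot names here are the well-known ones for Google, Bing/MSN and Yahoo) that they’re welcome, and tells everything else to stay away. Worth keeping in mind that robots.txt is only a polite request, so truly bad bots can simply ignore it.

```
# A sketch of the kind of robots.txt the support rep described
# (not SiteGround's actual file)

# Well-known search engine crawlers: allowed everywhere
# (an empty Disallow means "nothing is off limits")
User-agent: Googlebot
Disallow:

User-agent: Bingbot
Disallow:

User-agent: Slurp
Disallow:

# Every other crawler: blocked from the whole site
User-agent: *
Disallow: /
```

The file just sits at the root of your domain (e.g. yourblog.com/robots.txt), and crawlers are supposed to check it before fetching anything else.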
So he’s done that for me now. I guess now I’ll just wait and see if it fixes the problem. If you’ve been trying to access my blog the last couple of days, you probably wouldn’t have been able to, and I haven’t been able to write anything either, so fingers crossed it will be back to normal now.