I am usually pretty calm but I am losing it right now. I have put everything I've got into this site, working on it for 12+ hrs every single day for the last 6 months, and everything works nicely, except...
Today the website finally launched publicly. I removed the whitelist from my .htaccess file and stopped blocking search engines via robots.txt. I am positive that those two files are fine. The problem is that I built the site with a WordPress theme (and therefore, a WordPress install). I had made 2 test blog posts before getting started on development. For some reason, those posts are all you get when you do a Google search.
This is the URL format of pages on the site, in case it matters: http://www.replacedduetocomplaints.com/?sg=[RECORD ID]
What can I do to fix this?! It seems like WordPress is telling Google "don't worry about any pages unless there's a value for ?p or ?cat"... Help if you can, I am really stressing over this!
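One quick way to check whether WordPress really is telling crawlers to skip these pages is to look for a robots "noindex" meta tag in the HTML of a ?sg= page. A minimal sketch of such a check (the helper name and the sample HTML are illustrative, not taken from the actual site):

```python
import re

def has_noindex(html: str) -> bool:
    """Return True if the page carries a robots meta tag containing 'noindex'."""
    pattern = r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex'
    return re.search(pattern, html, re.IGNORECASE) is not None

# Example: a page WordPress has marked as non-indexable.
sample = '<head><meta name="robots" content="noindex,follow"></head>'
print(has_noindex(sample))  # True
```

If the pages come back clean (no noindex), the problem is discovery rather than blocking, which is what the answers below end up addressing.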
EDIT: SOLVED AFTER BEING CLOSED!
"Text file: For basic sitemaps (sitemaps that include only web page URLs, not image, video, or other specialized data), you can provide Google with a simple text file that contains one URL per line."
I just need to make a list of database records and do a little magic in Notepad++ to turn the record IDs into their respective URLs.
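That Notepad++ step can just as easily be scripted. A minimal sketch in Python, where the record IDs and the base URL are placeholders (the real values would come from the site's database and the ?sg= format above):

```python
# Turn a list of record IDs into a plain-text sitemap: one URL per line,
# matching Google's text-file sitemap format quoted above.
record_ids = [101, 102, 103]  # placeholder; in practice, SELECT id FROM records

base = "http://www.example.com/?sg="
urls = [base + str(rid) for rid in record_ids]

with open("sitemap.txt", "w") as f:
    f.write("\n".join(urls) + "\n")
```

The resulting sitemap.txt can then be submitted in Google Webmaster Tools.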
It might take a few days before Google indexes your site. You could start by making a sitemap and adding it to Google Webmaster Tools.
Google won't keep checking whether it can access your site. You blocked it for a while, so it takes a while for Google to come back.
Find Google Webmaster Tools here: http://www.google.com/webmasters/
A sitemap generator for WordPress is here: http://wordpress.org/extend/plugins/google-sitemap-generator/
Then, you should also work on the URL. Make it readable by a user. If a user can use it properly and understands what it means, it should usually be fine.
For instance, create a URL like
Google may be thinking your sg= parameter looks like a session ID. It certainly looks like one to me. A few techniques you can try:
- Add your subpages to a sitemap
- Put the sg parameter directly into the URL path, as in: http://www.forktips.com/sg/SG_23pQp9ju12j8qDp0aRW9ep?task=0 (easy to do with mod_rewrite)
- Use descriptive URLs, such as http://www.forktips.com/ma/worcester/20+front+st/subway (possibly with an ID somewhere inside) (harder to do, but search engines like this kind of thing)
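For the second option, a couple of mod_rewrite lines in the .htaccess file are enough. A sketch, assuming the sg value is alphanumeric with underscores (the /sg/ prefix matches the hypothetical URL above, and index.php is assumed to be the entry point, as in a standard WordPress install):

```apache
# .htaccess — map /sg/SG_xyz to the real query-string URL, invisibly to the visitor
RewriteEngine On
# Only rewrite when the request isn't for a real file or directory
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
# Capture the ID after /sg/ and hand it to index.php as ?sg=...; QSA keeps ?task=0 intact
RewriteRule ^sg/([A-Za-z0-9_]+)$ index.php?sg=$1 [QSA,L]
```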
More importantly, though, you need to actually have some links to this deep content. Google may have started doing AJAX crawling, but they really don't do much of it. They certainly don't ping your site in random places hoping to find some links.
It's also entirely possible that Google simply doesn't think your site is worth crawling in depth yet. Get some quality incoming links and they might change their mind. Or just wait for the crawler to come by again.
Finally, Google really doesn't like sites that simply scrape content from other sites. Your site appears to be dishing up the same information that Google Maps searches would return, which is something that will hurt you if they notice it (for one, they might block you from accessing the Maps API...).
Oh, and you'll probably want to remove those test posts. It would look really strange to come across them at random, and you don't want Google's crawler distracted by them either. Also, advertising that you're a high-school dropout on your upsell page doesn't look very professional...