Jeff Atwood (one of the people who created this site) wrote an excellent article on the need for sitemaps.
I'm a little aggravated that we have to set up this special file for the Googlebot to do its job properly; it seems to me that web crawlers should be able to spider down our simple paging URL scheme without an explicit assist from me.
The good news is that since we set up our sitemaps.xml, every question on Stack Overflow is eminently findable. But when 50% of your traffic comes from a single source, perhaps it's best not to ask these kinds of questions.
So: good for us, or good for Google?
I'd have thought an HTML sitemap should be useful to a human, whereas these two aren't. If you're trying to target a search engine, a sitemap.xml file that conforms to sitemaps.org is the better approach. Although the HTML approach works, it's simpler to generate an XML file and have your robots.txt point at it.
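To illustrate, a minimal sitemap following the sitemaps.org protocol looks like this (the domain and URLs below are made up for the example):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry per page you want crawled -->
  <url>
    <loc>https://example.com/questions/1</loc>
    <!-- optional hints for the crawler -->
    <lastmod>2009-01-15</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

And pointing robots.txt at it is a single line:

```
Sitemap: https://example.com/sitemap.xml
```

Only `<loc>` is required per entry; `lastmod`, `changefreq`, and `priority` are optional hints that crawlers may ignore.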