Is there a tool or library for Ruby that, given a website's URL, will return a list of all of the pages at that domain?
You could use Anemone, a Ruby web spider framework. It has Nokogiri as a dependency, since it needs to parse the (X)HTML.
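A minimal sketch of what that looks like, assuming the `anemone` gem is installed (`gem install anemone`); `http://example.com` is a placeholder for the site you want to enumerate:

```ruby
require 'anemone'

# Crawl the domain and print every page URL Anemone discovers.
# Anemone follows links within the same domain by default.
Anemone.crawl("http://example.com") do |anemone|
  anemone.on_every_page do |page|
    puts page.url
  end
end
```

Note that this hits the live site, so the run time scales with the number of pages and your politeness settings (Anemone accepts options such as a delay between requests).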
Enumeration is a real struggle if your site is anything other than a collection of static HTML pages. Once server-side scripting of any kind is involved, the "page" returned can depend heavily on the state of the session. An obvious example is pages or assets that are only available once you log in. Because of this, many automated enumeration tools (usually part of web application security auditing suites) fail and miss large portions of the site. My point here is that there is often more to enumeration than running a tool.
The good news is that it's quite simple to write your own enumerator that works well, given a little knowledge you can acquire mostly by poking around the site. I wrote something like this using Mechanize, which handily tracks your history as you request pages. That makes it a fairly simple task to have Mechanize set up the server-side state you need (namely, logging in) and then visit every link you find. Simply request the front page, or whatever "list" pages you need, and collect the links. Iterate over that list of links and, if a link is not already in the history, visit it and collect the links on that page as well. Repeat until the list of links is empty.
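The crawl loop above can be sketched in plain Ruby. In the real thing, Mechanize would supply the links (`page.links`) and the visited-page history; here a stubbed link graph (`SITE`) stands in for the live site so the traversal logic itself is visible. The names `SITE` and `crawl` are illustrative, not Mechanize API:

```ruby
# Stubbed site: each path maps to the links found on that page.
SITE = {
  "/"        => ["/about", "/posts"],
  "/about"   => ["/"],
  "/posts"   => ["/posts/1", "/posts/2"],
  "/posts/1" => ["/posts"],
  "/posts/2" => ["/posts", "/posts/1"]
}

def crawl(start, site)
  history = []        # pages already visited (Mechanize tracks this for you)
  queue   = [start]   # links still to visit
  until queue.empty?
    url = queue.shift
    next if history.include?(url)   # skip anything already in the history
    history << url                  # "visit" the page...
    queue.concat(site.fetch(url, []))  # ...and collect its links
  end
  history
end

crawl("/", SITE)
# => ["/", "/about", "/posts", "/posts/1", "/posts/2"]
```

The termination condition is exactly the one described above: the loop ends when the list of pending links is empty, which happens once every reachable page is in the history.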
But as I said, it all depends on what is happening server-side. There may be pages that are not linked to, or that are not accessible to you, which you will not be able to discover this way.