I'm able to use robots.txt to prevent a folder of images/HTML files from being indexed. But what about dynamic pages, e.g. stopping certain WordPress pages from being indexed?

The robots.txt syntax does not care whether a page is dynamic or static: all that matters is the URL path.
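For example, with WordPress's default (non-pretty) permalinks, a single post can be disallowed by its query-string URL, since Googlebot matches Disallow values against the path and query by prefix (the post ID 123 here is made up):

User-agent: *
Disallow: /?p=123

Because this is prefix matching, the rule above would also block /?p=1234, so be as specific as you need to be.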

If you work with a permalink structure like

example.com/blog/year/month/slug

you can exclude single pages like this:

User-agent: *
Disallow: /blog/2011/09/this-is-a-test-entry

You can use Google Webmaster Tools to check whether the rule works as intended.
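If you prefer a quick local check instead, Python's standard urllib.robotparser evaluates the same rules; here is a minimal sketch using the example rule from above:

from urllib import robotparser

# Parse the rule from above directly, without fetching anything over the network
rules = [
    "User-agent: *",
    "Disallow: /blog/2011/09/this-is-a-test-entry",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# The disallowed entry is blocked, a sibling entry is not
print(rp.can_fetch("*", "http://example.com/blog/2011/09/this-is-a-test-entry"))  # False
print(rp.can_fetch("*", "http://example.com/blog/2011/09/another-entry"))         # True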

Keep in mind that WordPress stores static content like images and PDF documents in /wp-content - you cannot block those with this approach unless you want to block all assets in that directory.
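If blocking everything under that directory is acceptable, a directory-wide rule would look like this (note that it also covers theme CSS/JS and uploaded media):

User-agent: *
Disallow: /wp-content/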