Search engines like Google are happy to crawl and index web pages (URLs), but say your pages contain content that expires, and you don't want them to stay indexed after a certain number of days?

Can you put an expiration date in the URL and have mod_rewrite 301-redirect the pages once that date has passed?
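As a rough sketch of the mod_rewrite idea (the `/promo/` path, the cutoff date, and the `/expired.html` target are all made up for illustration): the simplest variant hard-codes the cutoff in the config rather than parsing it out of the URL. `%{TIME}` expands to the current server time as `YYYYMMDDHHMMSS`, and the `>` operator in `RewriteCond` does a plain lexicographic string comparison, which works for zero-padded dates:

```apache
# .htaccess sketch: once the server clock passes the cutoff,
# everything under /promo/ gets a 301 to a landing page.
RewriteEngine On
RewriteCond %{TIME} >20251231235959
RewriteRule ^promo/ /expired.html [R=301,L]
```

Pulling the expiry date out of the URL itself should also be possible on Apache 2.4 with an expr-based `RewriteCond`, but the fixed-cutoff form above is the least fragile sketch.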

Or perhaps a cron job that adds a 301 redirect to all expired pages?
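A sketch of the cron approach (the manifest format, paths, and `/expired.html` target are assumptions, not anything from the question): a nightly job keeps a mapping of URL path to expiry date and regenerates a config fragment of 301 redirects for everything that has lapsed:

```python
#!/usr/bin/env python3
"""Hypothetical nightly cron job: emit `Redirect 301` lines (mod_alias
syntax) for every page whose expiry date is in the past."""
from datetime import date

def generate_redirects(manifest, today):
    """manifest: dict of {url_path: expiry_date}.
    Returns one Redirect line per expired path, sorted by path."""
    lines = []
    for path, expires in sorted(manifest.items()):
        if expires < today:
            lines.append(f"Redirect 301 {path} /expired.html")
    return lines

if __name__ == "__main__":
    manifest = {
        "/offers/spring-sale.html": date(2024, 3, 31),
        "/offers/evergreen.html": date(2999, 1, 1),
    }
    for line in generate_redirects(manifest, date.today()):
        print(line)
```

Wiring the generated fragment into the live server (an `Include`, or writing it into an `.htaccess` file) is left out here; the point is only that cron decides which pages have expired.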

Simply have the 'expired' pages return a 404? I believe that once Google encounters a 404, it will remove the page from its index.

Not 404 or 301, but 410 Gone. This is the appropriate HTTP response:

The requested resource is no longer available at the server and no forwarding address is known. This condition is expected to be considered permanent. Clients with link editing capabilities SHOULD delete references to the Request-URI after user approval. If the server does not know, or has no facility to determine, whether or not the condition is permanent, the status code 404 (Not Found) SHOULD be used instead. This response is cacheable unless indicated otherwise.

The 410 response is primarily intended to assist the task of web maintenance by notifying the recipient that the resource is intentionally unavailable and that the server owners desire that remote links to that resource be removed. Such an event is common for limited-time, promotional services and for resources belonging to individuals no longer working at the server's site. It is not necessary to mark all permanently unavailable resources as "gone" or to keep the mark for any length of time -- that is left to the discretion of the server owner.

How you deliver this response is open to discussion, however. There are many ways.
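For instance, a minimal WSGI sketch (the `/offers/YYYYMMDD/` URL scheme is an assumption for illustration) that starts answering 410 Gone once the date embedded in the path has passed:

```python
"""One of the 'many ways': a WSGI app that checks an expiry date
encoded in the URL path and returns 410 Gone after it has passed."""
import re
from datetime import date

# Hypothetical URL scheme: /offers/20250131/anything
EXPIRING = re.compile(r"^/offers/(\d{4})(\d{2})(\d{2})/")

def app(environ, start_response):
    m = EXPIRING.match(environ.get("PATH_INFO", ""))
    if m and date(*map(int, m.groups())) < date.today():
        start_response("410 Gone", [("Content-Type", "text/plain")])
        return [b"This page has been intentionally removed."]
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"Still live."]
```

Any WSGI container can serve this (`wsgiref.simple_server` is enough to try it out), and swapping `"410 Gone"` for `"404 Not Found"` gives the behaviour suggested in the earlier answer.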