I have always believed that databases should be denormalized for read performance, as it is done for OLAP database design, and not normalized much further than 3NF for OLTP design.

PerformanceDBA, in various posts, e.g. in "Performance of different approaches to time-based data", defends the paradigm that a database should always be well-designed by normalization to 5NF and 6NF (Normal Form).
Have I understood him correctly (and what exactly have I understood)?
Is there something wrong with the traditional denormalization approach/paradigm of OLAP database design (below 3NF), and with the advice that 3NF is enough for most practical cases of OLTP databases?
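To make the terminology concrete, here is a minimal sketch (SQLite via Python; the `customer`/`orders` tables are made up for illustration) contrasting a 3NF design with a denormalized one:

```python
import sqlite3

# Hypothetical customer/orders schema, just to pin down the terms used above.
conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Normalized (3NF): each customer attribute is stored exactly once.
CREATE TABLE customer (
    customer_id INTEGER PRIMARY KEY,
    name        TEXT NOT NULL,
    city        TEXT NOT NULL
);
CREATE TABLE orders (
    order_id    INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customer(customer_id),
    amount      REAL NOT NULL
);
-- Denormalized: customer attributes are copied into every order row,
-- so reads need no join, at the price of redundant storage.
CREATE TABLE orders_denorm (
    order_id      INTEGER PRIMARY KEY,
    customer_name TEXT NOT NULL,
    customer_city TEXT NOT NULL,
    amount        REAL NOT NULL
);
""")
conn.execute("INSERT INTO customer VALUES (1, 'Acme', 'Oslo')")
conn.executemany("INSERT INTO orders VALUES (?, 1, ?)", [(1, 10.0), (2, 20.0)])
conn.executemany("INSERT INTO orders_denorm VALUES (?, 'Acme', 'Oslo', ?)",
                 [(1, 10.0), (2, 20.0)])

# The denormalized read needs no join:
rows = conn.execute("SELECT customer_city, amount FROM orders_denorm").fetchall()
print(rows)  # [('Oslo', 10.0), ('Oslo', 20.0)]
```

The same read against the normalized pair would need a join on `customer_id`; that join is what denormalization is usually trying to avoid.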

Please either downvote or upvote, but don't mix both.

I've seen multiple posts get 3-7 upvotes plus 3-7 downvotes. I would prefer 6-14 downvotes to a mixed total score. For example, this question had 4 votes (-2+2) within a couple of minutes of posting; I would rather have gotten -4.
I don't care about personal reputation, but about the underrepresentation of interest in these topics.

Update 3: for example:

OK, I should confess that I could never grasp the idea that denormalization facilitates read performance. Can anybody give me references with good logical explanations of this, as well as of the contrary belief?
What are the sources I can refer to when trying to convince my stakeholders that OLAP/data-warehousing databases should be normalized?

To improve visibility, I have copied this here from the comments:

"It would be nice if participants would add (disclose) how many real-life (no science projects included) data-warehouse implementations in 6NF they have seen or participated in. Kind of a quick poll. Me = . – Damir Sudarevic"

http://en.wikipedia.org/wiki/Data_Warehouse says:

"The normalized approach [vs. the dimensional one by Ralph Kimball], also called the 3NF model, whose supporters are referred to as 'Inmonites', believe in Bill Inmon's approach, in which it is stated that the data warehouse should be modeled using an E-R model/normalized model."

It looks like the normalized data-warehousing approach (by Bill Inmon) is perceived as not exceeding 3NF (?)
I just want to understand: what is the origin of the myth (or ubiquitous axiomatic belief) that data warehousing/OLAP is a synonym for denormalization?

Damir Sudarevic answered that it is a well-paved road. Let me return to the question: why is denormalization believed to facilitate reading?

Shouldn't a database be denormalized for read performance?

Okay, here goes a total "Your Mileage May Vary", "It Depends", "Use The Proper Tool For Every Job", "One Size Doesn't Fit All" answer, with a bit of "Don't Fix It If It Ain't Broken" thrown in:

Denormalization is one way to improve query performance in certain situations. In other situations it may actually reduce performance (because of the increased disk use). And it certainly makes updates harder.
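A minimal sketch of the "updates get harder" point (SQLite via Python; the tables are hypothetical): changing one customer attribute touches exactly one row in a normalized design, but every copied row in a denormalized one.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customer (customer_id INTEGER PRIMARY KEY, name TEXT, city TEXT);
CREATE TABLE orders_denorm (order_id INTEGER PRIMARY KEY,
                            customer_name TEXT, customer_city TEXT);
""")
conn.execute("INSERT INTO customer VALUES (1, 'Acme', 'Oslo')")
conn.executemany("INSERT INTO orders_denorm VALUES (?, 'Acme', 'Oslo')",
                 [(1,), (2,), (3,)])

# Normalized: the city is stored once, so exactly one row changes.
n_norm = conn.execute(
    "UPDATE customer SET city = 'Bergen' WHERE customer_id = 1").rowcount

# Denormalized: the same fact is copied into every order row,
# so every copy must be found and rewritten (and missing one
# leaves the database inconsistent).
n_denorm = conn.execute(
    "UPDATE orders_denorm SET customer_city = 'Bergen' "
    "WHERE customer_name = 'Acme'").rowcount

print(n_norm, n_denorm)  # 1 3
```

With millions of order rows per customer, that second `UPDATE` is the kind of write amplification the answer is warning about.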

It should only be considered when you actually hit a performance problem (because you are giving up the benefits of normalization and introducing complexity).

The drawbacks of denormalization are much less of a problem with data that is never updated, or only updated in batch jobs, i.e. not OLTP data.
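This is the pattern behind, e.g., nightly-refreshed reporting tables: the denormalized copy is never updated in place, only rebuilt wholesale from the normalized source. A minimal sketch (SQLite via Python; all names are made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customer (customer_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE orders (order_id INTEGER PRIMARY KEY,
                     customer_id INTEGER REFERENCES customer(customer_id),
                     amount REAL);
-- Denormalized, read-only reporting table.
CREATE TABLE order_report (order_id INTEGER, customer_name TEXT, amount REAL);
""")
conn.execute("INSERT INTO customer VALUES (1, 'Acme')")
conn.executemany("INSERT INTO orders VALUES (?, 1, ?)", [(1, 10.0), (2, 20.0)])

def refresh_report(conn):
    """Batch job: throw away and rebuild the denormalized copy atomically."""
    with conn:  # one transaction, so readers never see a half-built table
        conn.execute("DELETE FROM order_report")
        conn.execute("""
            INSERT INTO order_report
            SELECT o.order_id, c.name, o.amount
            FROM orders AS o JOIN customer AS c USING (customer_id)
        """)

refresh_report(conn)
report = conn.execute("SELECT * FROM order_report ORDER BY order_id").fetchall()
print(report)  # [(1, 'Acme', 10.0), (2, 'Acme', 20.0)]
```

Because `order_report` is only ever rewritten by the batch job, the update-anomaly risks of denormalization never arise between refreshes.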

If denormalization solves a performance problem that you need solved, and that less invasive techniques (like indexes or caches or buying a bigger server) don't solve, then yes, you should do it.
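As a sketch of the "try the less invasive technique first" point (SQLite via Python; the table and index names are made up), this shows an index turning a full-table scan into an index search, which is often all the read speedup that was needed:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE orders (order_id INTEGER PRIMARY KEY,
                                     customer_id INTEGER, amount REAL)""")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(i, i % 100, 1.0) for i in range(1000)])

query = "SELECT amount FROM orders WHERE customer_id = 42"

# Without an index, the planner must scan the whole table.
before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

# A covering index is far less invasive than restructuring the schema.
conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id, amount)")
after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

print(before[0][-1])  # e.g. "SCAN orders"
print(after[0][-1])   # e.g. "SEARCH orders USING COVERING INDEX idx_orders_customer ..."
```

Only when options like this are exhausted does it make sense to pay the complexity cost of a denormalized schema.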