A common problem I see is how advanced the concept of "I don't want to take responsibility" has become. When companies decide whether or not to do something, the merits of the actual something are no longer what matters.
For example: at a company I am familiar with, an employee created an extremely advanced database optimization - on the order of fifty times faster than what was currently being done for a particular problem space. Virtually all of the company's customers could make use of this optimization, and it eliminated a number of extraneous elements in the overall product. It did require a single, simple third-party element to replace those other elements, but at roughly the same cost.
The impact of this optimization was huge and would save the company and its customers untold hours of misery, delay and problems.
The solution was finished - the employee had developed it, installed it, and tested it with customer data. So this was not speculation: it actually performed some fifty or more times faster.
Yet the enhancement languished because no one could imagine taking on the responsibility of actually changing what they were doing...
So, in effect, a customer that really loved the database and purchased a large number of the products could effectively put this company out of business if their work involved uses of the product where the optimization was not installed. What I mean is this: if the product were slow but successful, customers would not be able to purchase enough capacity to really make use of it. That would force the customers into other solutions and leave the company in question without a customer.
Effectively this is a scalability issue, er, rather a lack of scalability.
Yet this fact does not override everyone's fear of changing a bad product for the better.
What I see is the creation of a "hive mind" that operates bureaucratically. Everyone thinks the same way, because thinking on your own makes you stand out and seem different from everyone else. And we all know from our university days that this is a bad thing.
A symptom of this "hive mind" is a strange vocal inflection used when the speaker makes a statement they are unsure of. You've probably heard this:
Normally one would ask a question about something that they don't understand:
"How does the boofaroon factor affect the output of framitz module?"
But in bureau-speak this is turned into a statement with a rising inflection on the last word or two (inflect - raise the pitch of your voice - on the italicized words):
"The boofaroon factor affects the output of the *framitz module*."
This inflection tells you the speaker is unsure of what they are saying. But rather than ask a question and admit ignorance, they convert the question into a statement and inflect the last few words.
Based on experience I can only assume that this new speech pattern is the result of education.
Another element of this "responsibility" issue is the "whitewash" technique. This is typically found in a meeting where something very complex is being discussed - for example, the complex elements of a real-time computer-controlled manufacturing line: computers, software, and hardware gizmos all having to work together in unison for long periods of time.
Because of short attention spans and a lack of interest in developing a deep understanding of the problem space, what you find is that the "leader" will home in on one element of the problem - very typically a specific symptom - and worry only about that.
This is accomplished by "whitewashing" over important or critical information that the "leader" doesn't understand or cannot figure out.
A typical example of this would be the following:
"When the engine RPM reaches X the steering starts to get "jumpy" and vibrate and the vehicle becomes unsafe."
The classic engineering model (used from the pyramids on) is to develop an understanding of the underlying cause of the vibrations and to build a solution on that understanding: sort through the data and other factors until there is a proven understanding of the problem, and only then build a solution that addresses that issue.
Instead, the solution model is more like this: it's "too hard" to figure out what this vibration is, but we know it's bad. We could limit the RPM so that the problem never happens. (Whitewashing over everything but the most obvious symptom.)
You might hear this in a meeting on the topic:
"We can limit the RPM to X - Y so that we don't have *the vibration*."
(The italicized words are inflected to indicate the speaker has no idea if this is right or wrong in terms of addressing the problem.)
In this case "too hard" typically means digging into complex issues that require real homework to master, or require reviewing lots of old data to see why the problem didn't show up before, that sort of thing.
Things that, in the busy corporate world, take time away from developing meeting notes, filling out reports, and trying to find open calendar dates among six or more people in order to schedule calls and meetings.
Never mind that the actual problem might in fact be bad engine mounts - something truly dangerous.
The thinking is that if no one knows the actual cause of the steering problem, no one can be blamed for it - and, after all, the obvious symptom was addressed.
These problems stem from today's educational system, which does not support the development of critical thinking (the skill of solving problems on your own). Hence the resulting kiddies turned out into the corporate world are not equipped to solve problems.
So strategies like the ones above take the place of actually solving real problems.
A classic example of these kinds of thought processes is exposed in the movie "Fight Club" - particularly the Ed Norton character and his interactions at work, explaining that things like bad motor mounts are only a problem if they cause more accidents than the "norm" (while in the background a family burns in a car).