Wednesday, May 20, 2009

Rodrik's post on governance has substantial potential to mislead

It's important to be careful in discussing the link between good governance and economic growth, because it's easy to mislead. First, there's the question of how you define "good governance", and second, there's the issue of whether you're talking about direct effects, indirect effects, or total effects.

For example, Harvard economist Dani Rodrik talks about how a good industrial policy can be very helpful. The government decides whether to implement a good industrial policy, so you could say that how good a country's governance is depends in part on this.

Likewise, the policies and ingredients for moving out of poverty that Rodrik describes in his book, "One Economics, Many Recipes", are highly dependent on government actions. Certainly a government which pursues those things aggressively and intelligently (and you could certainly think this is an important part of what's called "good governance") greatly increases the odds of the country quickly moving out of poverty, especially as opposed to a government which does the opposite – very harmful things to development.

Rodrik might say that he defines good governance not this way, but as democratic, transparent, non-corrupt, etc. Still, it looks like you could make a good case that even by this definition, usually, or often, depending on other particulars, the more of these things you have, the better the policies and environment will be for moving out of poverty.

I especially think Rodrik's apparent claim that the U.S. in the 2000s shows a country can have terrible economic results despite good governance, so that good governance need not be a strongly positive factor, is dead wrong (or at least has great potential to mislead about very important things).

The example actually shows the exact opposite. It shows the great impact good governance (as I think most people would interpret the term) can have on economic growth and wealth.

The quality of governance plummeted astoundingly with the takeover by Bush and the Republicans from Clinton and a more balanced mix of power in the rest of government, and the resulting drop in wealth, efficiency, and growth was amazing. They brought us from an overall sound economy and record surpluses to record deficits and the brink of a depression in a mere eight years, and if we had had just a few more years of their governance, we probably would have had a depression.

The Bush years were just the opposite of what Rodrik apparently claimed; they were an amazing example of how much bad governance can hurt economic growth.

And the Clinton and Obama periods are an amazing example of how much good governance can help economic growth, especially if you don't want all of the growth (and more) going to the rich.

And even if you define good governance as just democraticness, lack of corruption, transparency, etc., it was not having enough of those things that allowed Bush and the Republicans to seize power in the 2000 election, and not having enough of those things that greatly aided them in doing so much harm. And I'm not talking about just the corrupt Supreme Court decision; I'm talking about the Electoral College; I'm talking about the fact that there are no run-off elections, so that a Nader can cause the more popular of the two top candidates to lose; I'm talking about the fact that Wyoming has the same voting power in the Senate as California, while the District of Columbia, which has more people than Wyoming, gets zero voting power in the Senate; I'm talking about how corporations can donate enormous sums to help politicians, and more.

But it was having enough democraticness, lack of corruption, transparency, etc. that allowed us to make the amazingly positive change in 2008, in economics and so many other areas, of going from Republican control to Democratic.

Now, that was a roaring place to end, but my primary purpose in writing this blog is to teach, and to discuss things for important understanding. That takes a priority far higher than "good writing style". So I will continue with two statements by Dani that I think it's important to respond to:

"Johnson argues that U.S. economic policies have been captured by a (financial) oligarchy, in much the same way that business elites corrupt policy-making in much poorer countries such as Russia. The U.S., it turns out, is not that different."

Even in the darkest depths of Republican control, the U.S. was still far better than Russia in this regard. Plus, look at how quickly the U.S. democratically pulled out of corrupt and incompetent control by electing the Democrats. It's far harder, and takes far longer, for Russians to identify and expel corrupt and incompetent administrations than for Americans, precisely because Americans have much better democraticness, lack of corruption, transparency, etc.

"After all, no-one can deny that the United States, for all its financial follies, is a rich country. It turns out that it is possible to be corrupt in a fundamental way and still be rich."

Not if you continue that way. Not if the U.S. had stayed that way. If the U.S. had kept Republican control, there would have been a depression lasting many years, and after that a long-term descent towards banana republic status. After a few generations, other first world countries would have left us far behind. But again, that's where good governance, as defined by democraticness, lack of corruption, transparency, etc., can be so valuable. It can allow a nation to kick out a really harmful, corrupt, and incompetent administration relatively quickly and easily.

Thursday, May 14, 2009

Scientific does not mean simple-minded

William Easterly writes today:

Airline passengers recently ejected an innocent Muslim family from an airplane because they were afraid the family were terrorists. Similar reasoning explains why Dani Rodrik favors industrial policy as a key to success...

All of us are making the amazingly common mistake of REVERSING CONDITIONAL PROBABILITIES. The airline passengers perceived from media coverage that the probability that IF you are a terrorist, THEN you are a Muslim is high. Unfortunately for the poor family, the passengers confused this with the relevant probability, which is the chance that IF you are a Muslim, THEN you are a terrorist (which is extremely low even if the first probability really is high, because terrorists are very rare).

So here is Dani Rodrik on success and industrial policy: “the countries that have produced steady, long-term growth during the last six decades are those that relied on a different strategy: promoting diversification into manufactured … goods” (cited in Economist’s View).

So Dani concludes, “What matters [for growth in developing countries] is their output of modern industrial goods” and that developing countries will have to get busy with “real industrial policies.” Finally, “external policy actors (for example, the World Trade Organization) will have to be more tolerant of these policies.”

Unfortunately, Dani is also REVERSING CONDITIONAL PROBABILITIES. Dani’s evidence is based on what he believes is the high probability that IF you have had steady growth for six decades, THEN you had industrial policy. This is interesting, but this is not the right probability in deciding whether to choose industrial policy, which is “IF you have industrial policy, THEN what is your chance of steady growth for six decades?”

This second, correct, probability would seem to be pretty low, since many other countries -- especially African and Latin American -- extensively tried industrial policies over the past six decades with low and erratic growth as a result...

I am really going through a MAJOR Mlodinow slash Kahneman phase about how economists (present company included) misinterpret data.
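Easterly's point about reversed conditional probabilities is just Bayes' rule, and it's worth seeing how extreme the gap between the two directions can be. Here is a minimal sketch, with every number invented solely to illustrate the magnitudes, not drawn from any real data:

```python
# Easterly's reversed-conditional-probabilities mistake, via Bayes' rule.
# All numbers are hypothetical, chosen only to show the orders of magnitude.

def reverse_conditional(p_b_given_a, p_a, p_b):
    """Bayes' rule: P(A|B) = P(B|A) * P(A) / P(B)."""
    return p_b_given_a * p_a / p_b

p_event = 1e-7             # base rate: chance a given passenger is a terrorist
p_trait = 0.01             # share of passengers with the observed trait
p_trait_given_event = 0.9  # the probability media coverage makes salient

p_event_given_trait = reverse_conditional(p_trait_given_event, p_event, p_trait)
print(p_trait_given_event)           # 0.9: high, and irrelevant to the decision
print(f"{p_event_given_trait:.0e}")  # 9e-06: the relevant one, minuscule
```

Even when P(trait | event) is near 1, P(event | trait) stays minuscule, because the base rate of the event is tiny. That asymmetry is the whole of Easterly's complaint.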

Here is what I think is the biggest overall way that economists misinterpret data: they think scientific means simple-minded. On this view, you aren't scientific unless you assume away all of the complexity and all of the non-formal evidence and information (and take those assumptions literally, or substantially too literally, acting as though they are actually true when drawing conclusions for the real world); unless you assume everything relevant fits into an extremely unrealistically simple formula; unless you turn off your high-level, high-dimensional, flexible thinking (which can still be rock-solidly logical, with each link in the logic chain completely solid, even if it's not written in fancy-looking, but really grossly over-simplified, mathematics); and unless you ignore any a priori information that's not formal and published in an academic journal in your field, no matter how important and compelling it is.

Harvard economist Dani Rodrik is clearly an extremely proficient and regular user of high-level intelligence. He is not saying that any industrial policy is a sufficient condition for strong movement out of poverty; he is just saying that the formal, simple statistical data suggest that it can substantially increase the odds. And if you read his books, you see that he constructs very strong logic chains, using less mathematical and formal, but also less simple, evidence, to add a great deal of support to the simple statistical data.

Yes, there are a lot of countries that used at least some form (not necessarily a good one) and some amount (not necessarily relatively large and sustained) of industrial policy and failed. But that doesn't mean that good industrial policy cannot often substantially increase the odds and amount of success, because there is a lot more to success than just having some industrial policy.

Every ATP tour tennis pro plays and practices an average of more than 10 hours per week, but only a fraction of 1% of people who play and practice more than 10 hours per week achieve ATP tour status. So are you going to make the simple-minded argument that it doesn't matter whether you play and practice more or less than 10 hours per week – that how much you play and practice per week doesn't affect your probability of becoming an ATP tour pro? Or that "it's impossible to tell"? (That is one of the lines I find most irritating. It's usually highly inaccurate and misleading. Often all of the evidence at hand, although it won't tell you for sure, will tell you that one possibility is much more likely than another, but people tend to feel so smart and level-headed saying "it's impossible to tell".)

First, clearly, if playing and practicing heavy hours were unrelated to success, it would be very unlikely that, purely by chance, heavy practice would end up present in every single success story yet be relatively rare in the non-success stories. Now, there is data mining, but to see whether this is a random artifact of data mining, you should look at more than only the simple statistical data. You should use your high-level intelligence and look at all of the evidence, even if it doesn't fit into a simple mathematical formula. You look at human biology, physiology, neurology; you look at evidence from sports in general; and so on. From this you can construct extremely compelling, rock-solid logic chains, anchored to just very reasonable and mild assumptions (as opposed to the kind typically used by Chicago-style economists), showing that how much you play and practice matters greatly to your probability of achieving ATP pro status, even if those logic chains don't consist solely of highly simplifying mathematical symbols.
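The tennis argument can be made concrete with a toy population. All of the counts below are hypothetical; the point is that heavy practice can be nearly necessary without being remotely sufficient, and that the necessity is exactly why practice hours still matter:

```python
# Hypothetical population counts for the tennis example. Heavy practice
# is (nearly) necessary for pro status but wildly insufficient.

heavy_practicers = 1_000_000   # play/practice > 10 hours per week
light_practicers = 50_000_000  # play, but less than that
pros_heavy = 1_000             # ATP-level pros among the heavy group
pros_light = 0                 # essentially none among the light group

p_heavy_given_pro = pros_heavy / (pros_heavy + pros_light)
p_pro_given_heavy = pros_heavy / heavy_practicers
p_pro_given_light = pros_light / light_practicers

print(p_heavy_given_pro)  # 1.0:   every pro practices heavily
print(p_pro_given_heavy)  # 0.001: a fraction of 1%, as in the text
print(p_pro_given_light)  # 0.0:   so practice hours still matter enormously
```

The correct observation that P(pro | heavy practice) is tiny does not license the conclusion that practice is irrelevant; what matters for the decision is comparing it with P(pro | light practice), which here is essentially zero.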

And, by the way, I do think advanced mathematics can be very useful, and use it myself, but only if it's used and interpreted intelligently.

You can come to some pretty ridiculous conclusions if you ignore the vast majority of the evidence because it does not fit some snobbish and simple-minded definition of what's scientific. Here's my definition of what's scientific, or at least the most important part of it: It's logical. The conclusions to the real world that these snobbish pseudo-scientific types make often don't meet this definition (see Chicago School).

Wednesday, April 29, 2009

Induction, deduction, and a model is only as good as its interpretation

View, understand, and then predict the behavior of the macro environment, rather than attempting to go from assumptions about micro to predictions about macro. – Robert Haugen, Emeritus Professor of Finance, University of California, Irvine. From his 2004 book, "The New Finance", 3rd Edition, page 123.
...as we travel from left to right, we go from order to complexity and finally into chaos.

At the extreme left, where there is order, mathematical models predict and explain well [as in much of physics].

As we move to the right, induction and statistical estimation dominate deduction and mathematical modeling in their ability to explain and predict... –"The New Finance", page 131.

Financial economists, both rational and behavioral, dazzle themselves with sophisticated mathematics. They gain much comfort in the intellectual rigor of their methodologies.

It makes no difference if their assumptions are completely unrealistic, so long as they parallel those made by their peers.

To them elegance [advanced, complete, and impressive mathematics] is all that matters. They look with disdain on studies of psychologists, sociologists, and anthropologists because their work seems so mushy in comparison to their own. They dismiss as unimportant forces that may actually be crucial but impossible to treat with mathematical rigor. –"The New Finance", page 132.

Reading London School of Economics economist John Kay's recent Financial Times article, "How Economics Lost Sight of the Real World", reminded me of Robert Haugen. Haugen is a maverick (let's not let John McCain and Sarah Palin ruin an important word) who fought very hard, and caustically, for decades against grossly unrealistic literal, or relatively literal, interpretation of models that show tremendous efficiency by making assumptions like, for example:

– Everyone in the world has perfect, or near perfect, rationality.

– Everyone in the world has advanced and specialized expertise in finance, economics, law, government, science, etc., that takes years or even decades of education and training to acquire. Or, they are able to perfectly know who has that expertise and can also be trusted to give it honestly, and at relatively little or no cost.

– Gathering information relevant to financial asset valuation (information, not just data) takes no time and is costless – even massive information gathering.

– Analysis of information relevant to financial asset valuation – even massive amounts of very complicated and difficult to interpret information – takes no time and is costless.

– Unlimited liquidity for all assets, and all buyers and short sellers.

– All assets can be sold short, and this short selling can be done instantly and at zero transactions cost.

– Even a relatively tiny number of savvy investors will have enough wealth, or access to enough wealth, that they can always buy assets up to their efficient price (Note: Even if they actually did have enough wealth to do this if they wanted to, a big problem that I pointed out in a 2006 letter in the Economist's Voice, which I have not seen elsewhere in the literature at least explicitly, is that they would be constrained by how undiversified their portfolio could become. As I wrote:
...suppose IBM is currently selling for $100, but its efficient, or rational informed, price is $110. It must be remembered that the rational informed price is what the stock is worth to the investor when added in the appropriate proportion to his properly diversified portfolio of other assets. Such a savvy investor will purchase more IBM as it only costs $100, but as soon as he purchases more IBM, IBM becomes worth less to him per share, because it becomes increasingly risky to put so much of his money in the IBM basket. By the time this investor has purchased enough IBM that it constitutes 20 percent of his portfolio, the stock may have become so risky that it’s worth less than $100 to him for an additional share. At that point he may have only purchased enough IBM stock to push the price to $100.02, far short of its efficient market price of $110. Thus, if the rational and informed investors do not hold or control enough—a large enough proportion of the wealth invested in the market—they may not be able to come close to pushing prices to the efficient level.)
– The global equilibrium of the model is reached before any of the exogenous factors change, rather than those exogenous factors regularly changing before the economy can get anywhere close to that equilibrium. (Oh, and heaven forbid that a model should ever not have an equilibrium, that some key things should just always move, on average, in one direction. That never happens in reality over any time period of important length – except for trivial things like GDP growth, accumulation of knowledge, and advancement of technology.)

– All asset returns have a normal data generating process (DGP), or some other DGP which is simple enough to write mathematically on a single line, or maybe a few (the real DGP, depending on the level of precision you desire, can take thousands of pages to describe, or more), and which has at least relatively thin, well-behaved tails.

– Quick, simple, local numerical optimization techniques will find the global optimum even in highly complicated, high-dimensional problems (and any cherry-picking of the starting point to get a more publishable result is fine, especially since the academic finance journals, including by and large the top ones, almost never ask you to provide the computer programs and a large class of important assumptions and details you used).

– Accuracy and stability of numerical algorithms are never a problem, so you can use whatever techniques you know – whichever are the easiest, or give you the most publishable results. (And you learn how to do numerical accuracy and stability well, and then spend the time to do it well, at your own peril, because it's given little if any consideration at the academic finance journals, and it takes a lot of time, which will substantially lower your publication production and therefore your advancement.)
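The concentration constraint described in the Economist's Voice letter quoted above can be sketched with a toy model in which the marginal value of another share falls as the stock's portfolio weight grows. All numbers here are hypothetical, and the linear penalty is a deliberate simplification of the underlying mean-variance math:

```python
# A toy sketch of the concentration constraint from the letter quoted
# above. Assume (hypothetically) that the value of one more share to a
# mean-variance investor falls linearly in the stock's portfolio weight:
#   marginal_value(w) = efficient_price - risk_penalty * w

efficient_price = 110.0  # "rational informed" price at near-zero weight
market_price = 100.0     # current market price
risk_penalty = 55.0      # hypothetical dollars of value lost per unit of weight

def marginal_value(weight):
    """Value of an additional share at a given portfolio weight."""
    return efficient_price - risk_penalty * weight

# The investor keeps buying only while marginal value exceeds the price,
# so buying stops where the two are equal:
w_stop = (efficient_price - market_price) / risk_penalty
print(f"buying stops at weight = {w_stop:.1%}")                # 18.2%
print(f"marginal value there = {marginal_value(w_stop):.2f}")  # 100.00
```

With these made-up parameters, the savvy investor stops buying well before pushing the price anywhere near $110, which is exactly the letter's point: limited wealth plus diversification limits can leave prices far from efficient.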

Of course, not all models in economics and finance that conclude great efficiency make all of these assumptions, but they all make assumptions like these, which as a group are extremely unrealistic. Does that mean that you can't still learn some valuable lessons from models like these? No. Often you can. But you have to interpret the model intelligently, using high-level, not mechanical, intelligence. You certainly don't unthinkingly, automatically interpret these models literally, as if the real world behaves exactly, or even qualitatively exactly, like the model. And the same goes for econometric models and techniques utilizing empirical data.

Now let's get back to Haugen. In his 2004 book, "The New Finance", 3rd edition (the 4th is being released May 2nd), he makes many of the same points Kay does, as well as important related ones. I think this book can convey some very important insights that are seldom or never heard in the academic finance literature, but like models, and like most writing today, you should be careful about interpreting it too literally. There's hyperbolic or very hyperbolic writing throughout the book, and many of the statements are literally exaggerated, or falsely absolute.

Does Haugen understand that these statements are exaggerated and falsely absolute? I've read a lot of his work, and he's extremely intelligent, so I think he does understand this, at least in most cases. In part, though, like almost everyone, he succumbs, at least to a substantial extent, to the great pressure to write with what's considered "good style" – smooth and simple (as well as, depending on the venue, profound-sounding, "professional", entertaining, etc.) – even if it results in a false simplicity: saying things that are literally false and/or likely to mislead a substantial percentage of readers in important ways (for more on this, see my very first blog post).

In large part, though, Haugen is just shouting because he's angry, and because it's so hard to get through with the grip the unrealistic efficient market people have (and especially had) on academic finance, with their great control of the journals and departments, and therefore advancement, prestige and money.

That said, let's get to some of the statements in Haugen's book that are similar to, or related to, statements in Kay's article. I think they can add valuable insight; there's a lot of truth to them. But again, I recommend that you be careful not to take Haugen's exaggerated and absolutist statements completely literally:

KAY: Since the 1970s economists have been engaged in a grand project. The project’s objective is that macroeconomics should have microeconomic foundations...

Most economists would claim that the project has been a success. But the criteria are the self-referential criteria of modern academic life. The greatest compliment you can now pay an economic argument is to say it is rigorous. Today’s macroeconomic models are certainly that...

But policymakers and the public at large are, rightly, not interested in whether models are rigorous. They are interested in whether the models are useful and illuminating – and these rigorous models do not score well here...There is not, and never will be, an economic theory of everything. Physics may, or may not, be different. But the knowledge we can hope to have in economics is piecemeal and provisional, and different theories will illuminate different but particular situations. We should observe empirical regularities and – as in other applied subjects such as medicine and engineering – we will often find pragmatic solutions that work even though our understanding of why they work is incomplete.

HAUGEN: Chaos aficionados sometimes use the example of smoke from a cigarette rising from an ashtray. The smoke rises in an orderly and predictable fashion in the first few inches. Then the individual particles, each unique, begin to interact. The interactions become important. Order turns to complexity. Complexity turns to chaotic turbulence...(page 122)

How then to understand and predict the behavior of an interactive system of traders and their agents?

Not by taking a micro approach, where you focus on the behaviors of individual agents, assume uniformity in their behaviors, and mathematically calculate the collective outcome of these behaviors.

Aggregation will take you nowhere.

Instead take a macro approach. Observe the outcomes of the interaction – market-pricing behaviors. Search for tendencies after the dynamics of the interactions play themselves out.

View, understand, and then predict the behavior of the macro environment, rather than attempting to go from assumptions about micro to predictions about macro...(page 123)

...as we travel from left to right [in figure 10-5 above], we go from order to complexity and finally into chaos.

At the extreme left, where there is order, mathematical models predict and explain well [as in much of physics].

As we move to the right, induction and statistical estimation dominate deduction and mathematical modeling in their ability to explain and predict...

Induction dominates deduction in its predictive power... (page 131)

Financial economists, both rational and behavioral, dazzle themselves with sophisticated mathematics. They gain much comfort in the intellectual rigor of their methodologies.

It makes no difference if their assumptions are completely unrealistic, so long as they parallel those made by their peers.

To them elegance [advanced, complete, and impressive looking mathematics] is all that matters. They look with disdain on studies of psychologists, sociologists, and anthropologists because their work seems so mushy in comparison to their own. They dismiss as unimportant forces that may actually be crucial but impossible to treat with mathematical rigor. (page 132)

I largely agree with Haugen, but a key point of disagreement, at least with what he often literally writes, is his claim that deduction is useless, or near useless, in highly complex situations. Deduction can still be extremely valuable.

Although in such situations deduction alone is not very good at very precise forecasts:

a) It can still give you very valuable qualitative understanding and ideas – like, if you follow policy X, you will become much wealthier, at least 50% wealthier. You don't know the amount very precisely, but you do know that it's in a big range, and so the policy or idea is well worth pursuing. For example, the Capital Asset Pricing Model (CAPM) may not be very good (used alone) at precise forecasts of stock prices, but it does make clear that, as a layperson with no special information or analysis, I can get a far lower risk level – not just a little lower, but far lower – for a given mean return if I buy stocks in a large, highly diversified portfolio than if I buy them singly.
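The diversification point can be illustrated with the standard formula for the variance of an equally weighted portfolio of identical stocks, variance = sigma²/N + (1 − 1/N)·cov. A small sketch, with parameter values chosen purely for illustration:

```python
# Variance of an equally weighted portfolio of n stocks, each with
# variance sigma2 and pairwise covariance cov. Illustrative values:
# sigma2 = 0.09 is 30% volatility; cov = 0.018 is correlation 0.2.

def portfolio_variance(n, sigma2=0.09, cov=0.018):
    """var = sigma2/n + (1 - 1/n) * cov; approaches cov as n grows."""
    return sigma2 / n + (1 - 1 / n) * cov

single = portfolio_variance(1)        # 0.09: holding one stock
diversified = portfolio_variance(50)  # ~0.0194: most diversifiable risk gone
print(single, diversified)
```

By N = 50 the idiosyncratic term sigma²/N has almost vanished, leaving only the covariance floor. That is the qualitative lesson the model delivers reliably even when its precise price forecasts do not.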

b) The understanding we get from deduction can help us improve our inductive research and models. It can give us a much better idea of where to look and what to look at in the inductive process of studying end results and situations. But we only get good and valuable understanding from deductive models if we interpret them intelligently, not automatically literally, or automatically qualitatively literally. It's worth repeating: A model is only as good as its interpretation.

c) Deductive understanding can be combined with inductive models and econometrics to substantially – often greatly – improve the forecasts. It can tell us when the inductive model's forecast based on the past will be much too low, or much too high, because of important recent changes from the past.

The problem isn't that deduction, and deductive models, are useless, or near useless, in highly complex situations. It's that freshwater economists especially have been making ridiculously over-literal interpretations and claims from deductive models. This is partly due to a focus on mathematics rather than economic intuition and other high-level thinking. And it's partly due to the fact that many of these economists are extremely Libertarian, and are very willing to intentionally mislead in drawing conclusions from these models to support Libertarian economic policies.

In addition, there is a very strong incentive to make untrue big claims about an academic's models, or the models in an academic's area, so that they are more likely to be published in top journals, which is overwhelmingly what determines employment, promotion, power, prestige, and earnings. Academics feel especially free to make these claims when they use advanced mathematics so little known that not only can the general public not read the papers to see how overblown and downright false they are, but the vast majority of fellow economists can't either; even for them, these papers might as well be written in Greek. They may even speak some Greek, but deciphering Greek this elaborate and advanced would take far more time than they have, especially since the vast majority of economists are not in the area in question and don't get paid to spend time in it.

Eventually we reached a point where academics making grandiose but ridiculous claims from highly mathematical models with efficient equilibria exerted great control at the top journals. And they used their gatekeeper ability to fight very hard to let in papers that agreed with them, and to keep out papers that didn't. Sadly, they have been very successful at this, largely because if their research became a lot less prestigious and publishable, it would mean a huge decrease in personal prestige, positions, prizes, and earnings.

Likewise, their students, future economists, had a huge incentive to push these ridiculous claims and conclusions, because they had spent years learning these models and the associated mathematics, and if the claims of these models were exposed, their ability to publish in top journals, and from that to get high-paying, high-prestige jobs at top universities, would decrease dramatically. Since economics Ph.D. students at schools like Chicago are largely type-A, ultra-ambitious-from-childhood workaholics, the thought of ending up making $80,000 per year as a professor at Penn State, rather than hundreds of thousands, or millions, per year (including royalties, consulting, etc.) at Chicago is a huge incentive to support the party line.

So there are serious problems in economics and finance academia that stem pivotally from enormous asymmetric information: those predominantly paying for the research, the taxpayers, have almost no ability to understand the highly mathematical and technical papers and to discern which are of great societal value, which are of little, and which draw extremely unrealistic and harmful over-literal conclusions from models to the real world.

We talk about market problems in finance justifying greater regulation, but the same may be true of finance academia, and economics academia. We really need to think about having a federal government department to monitor, study, and regulate, at least to some extent, how academia uses its human and other resources – whether they are being spent in proportion to their risk-adjusted expected societal value over the short, medium, long, and extremely long run. The journals and departments right now, and for some time, have awarded publications, grants, jobs, promotions, and prizes grossly out of line with the social NPV of the research.

We really need to seriously study the idea, and the specifics, of a large federal government department, staffed by academics in economics, finance, and other fields, to study whether their academic fields – their departments and journals – are rewarding work, and spending resources, in line with societal NPV (which does, of course, consider all benefits, including very long-term ones, and unlikely but potentially huge ones), rather than largely in line with the enjoyment, prestige, and enrichment of those in control. And if things are way out of line, the department would have strong and wide-ranging powers to do something about it.

Such a government department will, of course, need to develop a culture of loyalty to the public good, not to one's academic field. And there will need to be transparency and watchdogs, and in general the systems, techniques, and procedures which have greatly improved the efficiency and professionalism of the civil service over the last century (especially when we have a party in power that tries to make government succeed, rather than one which tries to make it fail, and uses it extensively to enrich cronies).

Certainly there are problems with this idea; any implementation would require a lot of fine-tuning, checks and balances, and safeguards, with some portion of funds earmarked to be spent completely at academic departments' discretion. But too many economists forget a cornerstone of economics: analyses should be cost-benefit, not cost alone. Yes, there are costs and problems with this idea, but the benefits could be enormous, and the costs of doing nothing could be far larger. If academic economics and finance had been tightly focused on honest research to maximize societal NPV over the last generation, we could easily have generated trillions of dollars more in wealth and total societal utility. A great place to start in increasing economics' societal NPV would be stopping the gatekeepers from ignoring the pink elephant of economics: positional/context/prestige externalities.

Wednesday, February 25, 2009

The frequent trade-off of "politeness" and "civility" for clarity, accuracy, and exposing important truths. How to find the socially optimal balance?

I'm currently in the middle of a discussion with Nick Rowe on his and Stephen Gordon's blog regarding positional/context/prestige externalities and their effect on saving. Nick noted that he thought, but was not sure, that Greg Mankiw had a post stating these externalities did not affect the saving rate. In my reply, I wrote, "...but Mankiw has a shameful record of constantly intentionally misleading for the right.", and, "or Mankiw is describing it in a deliberately misleading way, which is par for the course for him."
In Nick's reply he wrote, "And you really should not make those sort of accusations, please. It's part of the very bad decline in civility, that I associate with recent US politics, and certain bloggers I won't name. "If someone disagrees with me, he must be lying, and a paid shill". That sort of argument is the first step towards totalitarianism."
Ok, difficult, unpleasant issue, but it's very important, so we should discuss it.
First, I should make very clear that I don't always think, "If someone disagrees with me, he must be lying, and a paid shill". I usually think they're just making an honest mistake, or I am, in which case if I think the odds are that they are right, I change my opinion. I only think, or am willing to say, they're lying or a paid shill if there's very good evidence that they are.
For example, suppose a Harvard climatologist says the Earth is flat, or that carbon accumulation in the atmosphere has absolutely zero effect on global warming, and these false assertions please a political party he favors and/or is highly paid by. What am I to think? I know he's saying something that's not true. And I know that, being a Harvard climatologist, unless he just had a serious head injury, he knows it's not true, but he said it anyway. I don't want to use the L-word, because so many people say you're not supposed to, and it can cause a lot of problems, but that is the definition of the L-word: knowingly saying something that's untrue.
And note that in this case, yes, the public would know anyway that "the Earth is flat" is a lie, but there are many things a climatologist, or any expert, can say which sound plausible to the general public, but which a fellow expert would know can only be intentional misleading, or lying. Should that fellow expert remain silent on the intentionality of the first expert's untruth? You can't say that he should just point out why the untruth is untrue, and that it then won't matter whether he tells people that the person who said it has a record of regularly intentionally misleading.

The reason is that the messenger, and her credibility, does matter. It's not just the message. The world is too complicated and advanced for that. The message itself can be very strong and convincing to you, but in such an advanced and complicated world, and in an area where you, as a member of the general public, are not expert, there's still often a substantial probability that there's something you're missing, and the message is, in fact, not correct, or not completely correct. The more credible the messenger, the lower the probability that that's the case, and vice versa. So it is valuable for the public, in important decision making like voting, to know whether messengers are serial intentional misleaders for causes they don't consider good.
So what are you to do?
As economists we're trained to do cost-benefit analyses, and to do something if its benefits outweigh the costs. In finance, a close cousin or sub-field of economics, the foundation is the Net Present Value rule, which is the same thing applied to financial problems. So let's try to do this with the situation where there is very good evidence that a person or group is regularly intentionally misleading for a cause that we think is bad and harmful, or extremely harmful.
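To make the NPV rule concrete, here is a minimal sketch (the function name, project, and numbers are all hypothetical, purely for illustration): discount each cash flow back to today, sum, and accept the project only if the total is positive. Note how the same project can pass or fail depending on the discount rate.

```python
def npv(rate, cashflows):
    """Net present value of a series of cash flows.

    cashflows[0] occurs today; cashflows[t] occurs t periods from now.
    Each flow is discounted by (1 + rate) ** t before summing.
    """
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical project: pay 100 today, receive 60 in each of the next two years.
project = [-100, 60, 60]

# At a 5% discount rate the NPV is positive, so the rule says accept;
# at 25% it is negative, so the rule says reject.
print(npv(0.05, project))  # positive
print(npv(0.25, project))  # negative
```

The same accept/reject logic is what a cost-benefit analysis does outside of finance: the "cash flows" are just benefits minus costs in each period, however those are measured.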
If you write that they have a history of intentionally misleading, or of intentionally grossly misleading, then a cost, or con, is that this can create anger and other bad feelings in the other party and in some members of the public. This can make it harder to communicate and work with them. With today's Republicans, it is unlikely that we will ever get more than a few in the entire Congress voting for any kind of intelligent economic stimulus program. But by being largely polite and friendly nonetheless, Republican and Democratic members of Congress can at least sit in the same room, handle the administrative details that have to be done, and talk about the issues where there is some real chance of meaningful interchange, learning, and cooperation. And people don't start doing things out of spite, even if they think it will hurt the country.
Generally, civility and politeness, or what is often called civility and politeness, can help to keep lines of communication open, can help people work together more productively, and can prevent people from doing things to get back at their rivals even if they know it hurts themselves and the greater good. It can also make work, or any human interaction, a lot more pleasant. Moreover, lack of civility, especially if it's really bad, can intimidate people, and discourage them from communicating. It can stifle discussion and learning.
So there are very important pros to civility and politeness. Now let's talk about the costs, or cons, of civility and politeness, or of what is often called civility and politeness, especially if it's taken too far, or taken to mean being extremely unwilling to say anything that could be construed as an insult, no matter how true it is, and no matter how important it is for the public to know.
If a person or organization is constantly lying, or intentionally misleading while saying things that can maybe, by some definition, be considered literally true (for example, Bill Clinton's famous public statement, "I did not have sexual relations with that woman"), then if we can never say that they are less credible, that you should be skeptical and on guard because they constantly intentionally mislead, it becomes easier for them to mislead the public.
And often this is far from trivial. If more people had been willing to directly and clearly point out the outright lies and intentional misleading of George W. Bush and the Republican machine in 2000, they might easily have lost that election. Vagueness and fuzziness, and unwillingness to clearly and directly state when they were being misleading, or intentionally misleading, in the name of "civility" or "politeness" or "professionalism", made it less clear to the public what was going on.
Directness, clarity, accuracy, and precision in critiquing one's rivals, their ideas, and what they say is often considered uncivil and impolite. But if you're vague, and only sort of allude to things in order to be polite and civil, many people will not understand your critique, or debunking, well. It will just be a lot less clear, and their understanding and decision making will be worse as a result.
Likewise, if you don't say certain very important and relevant truths, like those in the Angry Bear post "Cato Disinformation", because you think it's uncivil or impolite to, then much of the public will not know these very important and relevant truths, and their understanding and decision making will be worse as a result. If the decisions in question are huge, like whether to elect George W. Bush, or whether to elect enough Republicans to the Senate to allow them to stop the measures which would prevent a long wrenching recession or depression, then the costs of being "polite" and "civil" – to that extreme – may be monumental, and the benefits minute by comparison.
Look at what George W. Bush and the Republican machine have done to this country over the last generation, largely aided by the stealth and obscuring they get from what's often called "civility" and "politeness". Do you really still think we should, as an unbending rule, never, ever, under any circumstances, directly and clearly tell people when, on important issues, a person or organization is intentionally misleading or outright lying, and that it is a regular occurrence, so be careful, keep a skeptical eye, and look to see whether more credible sources are backing them up?
I think sometimes we should be willing to say to the public that this person or organization regularly intentionally misleads, even if it will look uncivil to some, and sometimes we shouldn't. It depends on the time, place, and situation. If these are such that the costs outweigh the benefits, we shouldn't. But there are times when the costs of being what many people call "civil" are great, much greater than the benefits.
I'm all for pleases and thank-yous and avoiding swearing and so on, because they have a clear benefit and are basically costless. They don't fuzzy up understanding or hide important truths. But I think some people's definition of civil is too extreme; it's beyond the point where marginal cost equals marginal benefit.

Additional:
With regard to the specific catalyst for this post: I do regret using the word "shameful" in, "...but Mankiw has a shameful record of constantly intentionally misleading for the right." That's a case where the benefit did not outweigh the cost, even in a relatively informal, "dinner conversation"-like venue like the comments section of a blog. Moreover, I'm really not at all sure that Greg Mankiw's reasons are shameful. Like many people, I'm puzzled as to why he says and does the things he does. Maybe he's not intentionally misleading for personal gain; maybe he's just an extreme economic Libertarian who sincerely thinks it's worth having far lower growth in wealth, science, medicine, and total societal utility, far more human suffering and far less human happiness, to avoid giving up even small amounts of personal economic freedom. Maybe he thinks he can effect positive change in the Republican Party better by staying on the inside than by being kicked out of the upper levels, and to stay in he has to say things which please those in control.
I don't know; his reasons are puzzling and unclear. But I'm quite sure that he regularly intentionally misleads for the positions of the Republicans. As a start, see the Economist's View posts "Honest Brokers", "Economists, Ideology, and Stimulus", "Can Economists Be Trusted?", and "Are There Ever Any Wrong Answers in Economics?", and my post "The latest disinformation from Mankiw". The whole thing, though, is a difficult issue. There are times when it's an easy call that the benefits far outweigh the costs of saying, as politely as reasonable, that someone, or some group, regularly intentionally misleads, but there are other times when it's a tough call.
I am putting together a response to Nick's most recent reply on the savings effects of positional/context/prestige externalities. He makes a good point, and brings up a valid first order factor. I just don't think it's the only relevant factor. There's more to the story. I hope to have my response to this up in the comments section of Nick's blog in the next day or two.