Uncategorized Archive


Evil Bastards

For a variety of reasons, I prefer to run my websites, and those of a few friends, plus email and various other services, on my own server. It is sitting here beside me, occasionally winking its disk access light at me.

Doing this is apparently an open invitation to half of the population of China, 30% of Russia and practically the entire populations of Poland, Vietnam and a host of other countries to attempt to break into the machine, either for purely vandalistic reasons, or to try to use the system to launch further attacks or to deliver SPAM.

Last Christmas I treated myself to the bits to build a more compact, but much faster and more capable server. I, of course, loaded the latest and greatest versions of all the software and frequently load all the latest patches. So far, there have been no direct successful break-ins.

However, there is a continuous stream of people “rattling the doorknob”.
I noticed that some of the log files were getting big, fast: hundreds of attempts per minute to remotely log into the system, trying random user names and long lists of passwords.
One or two of these I would just ignore, because they are not going to get in with just a username and password anyway. But the sheer volume was annoying.

I have a firewall between the Internet and my internal network, so I set up a set of rules to allow people three attempts to log in successfully (after all, I want to be able to do it myself), and on the third failure, to block that IP for 24 hours.
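The rule amounts to nothing more than a three-strikes counter with an expiry time. A toy sketch of the logic in Python (purely illustrative — the real thing is a handful of firewall rules, not code like this):

```python
import time

FAIL_LIMIT = 3
BLOCK_SECONDS = 24 * 60 * 60  # 24 hours

failures = {}       # ip -> consecutive failed login attempts
blocked_until = {}  # ip -> time at which the block expires

def is_allowed(ip):
    """True unless the IP is still serving out its 24-hour block."""
    return time.time() >= blocked_until.get(ip, 0.0)

def record_failure(ip):
    """Count a failed login; block the IP on the third strike."""
    failures[ip] = failures.get(ip, 0) + 1
    if failures[ip] >= FAIL_LIMIT:
        blocked_until[ip] = time.time() + BLOCK_SECONDS
        failures[ip] = 0  # the counter starts fresh after the block expires

def record_success(ip):
    """A successful login clears the strike counter."""
    failures.pop(ip, None)
```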

After 24 hours, the list was hundreds of entries long. But there were now many fewer people rattling the doorknob.

Next, I woke up one morning to find several hundred bounced email messages waiting on my cell phone.
I don’t allow people to route email via my system unless they can authenticate, but someone, by continuous trial and error, had guessed my wife’s password and was using it to send SPAM – by the thousand. A quick change of password fixed that, but I wasn’t happy looking at the continuous stream of attempts to guess passwords on the email server.

Back to the firewall, and a new set of rules to do basically the same thing that I did for remote login. As soon as it had blocked a few hundred IP addresses, things quietened down.

Looking at the IP addresses in the block lists, most are from China, followed by Russia, Poland and a variety of smaller countries.

A while later, there was another deluge of bounced email messages. A bit of digging around revealed that these were coming via the web server.

I run a few websites. For simplicity, I have tended to set up WordPress for most of these. It’s relatively easy for people to use (I don’t have to get too involved) and can be made to look not too bad with relatively little effort. The problem with WordPress is that it is a pile of s**t, full of holes for hackers to exploit. I had already somewhat locked down user registration, requiring anyone wanting to register on one of these sites to respond to an email to activate the account. Since 99.99% of the “new users” are bots, they tend not to reply to email. But that doesn’t stop them trying dictionary attacks, guessing usernames and passwords.

So, you say, what is the worst they can do? Post a few spammy messages? Well, no. There is a utility in WordPress intended for things like “trackbacks” (which I consider useless anyway), but which also allows execution of essentially arbitrary commands — once you have authenticated. That had been used to add extra code to the website to go fetch SPAM and a list of addresses, and to blast it out to those addresses.

A quick delete of the entire website and a restore from a backup removed the problem. But I had to stop it happening again. The first thing I did was to change the permissions on that evil little tool (xmlrpc) to completely disallow access.

My logs now began to fill with error messages caused by attempts to access this file. There were also various other things they were trying to access, including the login process.

Unfortunately, since this was web traffic, there was no easy way to detect or block these at the external firewall. There is a tool (fail2ban) that can be used to scan log files for certain patterns, and block IPs locally on the machine when it finds those patterns. I took a look, decided that it was overly complex, and let it drop, choosing instead to just ignore the garbage in the logs.

Then someone found a new bug in WordPress, and within a day every WordPress site I had was hacked and converted into a site from which to download all sorts of undesirable content (mostly stolen/hacked software, documentation etc). A new WordPress release was rushed out to fix this hole, but everything was already hacked. So, one by one: delete the website, restore from backup, upgrade WordPress. Of course, there were still thousands of hits on the old download URLs (which no longer worked), as well as attempts to restore the hack.

I gave in. I installed fail2ban.
I set it up such that anyone just touching the xmlrpc file was immediately blocked.
Anyone attempting to access the old hacked URL was immediately blocked.
Anyone trying to authenticate more than N times within M seconds was blocked.
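In fail2ban terms, each of those rules is just a pattern matched against the log, a hit threshold, and a ban action. A rough Python sketch of the idea (the log format, URLs and thresholds here are made up for illustration — fail2ban drives the real thing from config files and calls out to the firewall):

```python
import re
import time
from collections import defaultdict, deque

BAN_SECONDS = 14 * 24 * 3600  # roughly the two-week lockout

# (pattern, max_hits, window_seconds); a single hit is enough for the first two.
RULES = [
    (re.compile(r"/xmlrpc\.php"), 1, 1),
    (re.compile(r"/old-hacked-download-url"), 1, 1),  # hypothetical path
    (re.compile(r"POST /wp-login\.php"), 5, 60),      # N tries within M seconds
]

hits = defaultdict(deque)  # (rule index, ip) -> timestamps of recent hits
banned = {}                # ip -> time at which the ban expires

def process_line(line):
    """Match one access-log line against each rule; ban the IP on a violation."""
    ip = line.split()[0]   # assumes the client IP leads each log line
    now = time.time()
    for i, (pattern, max_hits, window) in enumerate(RULES):
        if not pattern.search(line):
            continue
        recent = hits[(i, ip)]
        recent.append(now)
        while recent and now - recent[0] > window:
            recent.popleft()            # forget hits outside the window
        if len(recent) >= max_hits:
            banned[ip] = now + BAN_SECONDS  # fail2ban would call iptables here
```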

There was still a lot of traffic trying to authenticate. It was coming from random IPs. These people have vast numbers of hacked machines that they use, so you tend not to see the same IP too regularly.

But this website is a little different. It’s mine. Only I log on to it. So I set this one up so that any authentication attempt not coming from my computer gets blocked immediately. That locks out that IP from all the other websites too.

The lockout is about two weeks.

Looking at the list of IPs blocked just by fail2ban, there are currently 1,936 of them, and the number is growing steadily.

I do wish these people would go find another job. I find doing this stuff annoying.


Understanding the BBC 2006 Seminar Issue

Those encountering for the first time the story about the infamous BBC “policy seminar” of 2006 and its subsequent revelations (which you will not have seen on the BBC or almost any other mainstream media at this point) might not appreciate all its implications.

The story is not simple, and drags out over many years. This is an attempt to give a brief history of the issue and explain why it is important.

The BBC Charter

The BBC is a somewhat peculiar institution in that it is funded by what is essentially a tax, but not controlled by the government. It operates under a charter which makes it independent of the government, and free to operate in whatever way it sees fit, subject to some fundamental constraints. One of the most important of these is impartiality. The BBC is required to be impartial. This has been one of its greatest strengths, and the reason why it came to be regarded as the benchmark against which all other news organizations were judged. Since it has no shareholders, and gets its money independently of what it reports about the government, it is in a unique position in the entire world.

Adherence to the charter, in spirit as well as in letter, is what made the BBC respected/great.

Impartiality is spelled out in the charter. If there are multiple sides to any story, the BBC is required to present all sides of the story.

The Charter Broken

It became apparent some years ago that the BBC was being less than impartial on the subject of Global Warming (I will stick with that name, even though it has gone through several marketing make-overs and name changes). Only one side of the story was appearing in BBC output: that of the impending end of the world if “something” was not done. Now, there are many very respectable scientists who disagree with this, and who have serious questions about the validity of the models, and of their results, upon which these predictions are based.

As time progressed, empirical evidence – actual physical measurements – began to disagree with the model predictions, and research into the data and methods being used started to show some unsettling discrepancies.

Rather than explore these discrepancies, the BBC essentially closed ranks, ignored them and doubled down on the side of predictions of doom (if you don’t all pay up, one way or another).

Questions were raised about the charter’s requirement to explore all sides of any story. When it became obvious that these questions would not go away, the BBC Trust (those responsible for ensuring that the charter is observed) produced a report. One famous paragraph from it is the center of today’s problems for the BBC:

The BBC has held a high-level seminar with some of the best scientific experts, and has come to the view that the weight of evidence no longer justifies equal space being given to the opponents of the consensus. But these dissenters (or even sceptics) will still be heard, as they should, because it is not the BBC’s role to close down this debate. They cannot be simply dismissed as ‘flat-earthers’ or ‘deniers’, who ‘should not be given a platform’ by the BBC. Impartiality always requires a breadth of view: for as long as minority opinions are coherently and honestly expressed, the BBC must give them appropriate space. ‘Bias by elimination’ is even more offensive today than it was in 1926. The BBC has many public purposes of both ambition and merit – but joining campaigns to save the planet is not one of them. The BBC’s best contribution is to increase public awareness of the issues and possible solutions through impartial and accurate programming. Acceptance of a basic scientific consensus only sharpens the need for hawk-eyed scrutiny of the arguments surrounding both causation and solution. It remains important that programme-makers relish the full range of debate that such a central and absorbing subject offers, scientifically, politically and ethically, and avoid being misrepresented as standard-bearers. The wagon wheel remains a model shape. But the trundle of the bandwagon is not a model sound.

On the basis of this, the BBC gave essentially no exposure to those with a viewpoint contrary to “the consensus”, and when it did, it tended to choose the least articulate and least well prepared of them, and to set them against very articulate and well prepared proponents of the consensus.

Before looking at the central issue here, it is interesting to observe that science does not operate by consensus. Even their own example, the “flat-earthers”, were once the scientific consensus. Consensus has no part in real scientific debate. Science is based upon evidence. The output of models, while useful in formulating a hypothesis, is not in itself evidence.

But back to the core subject. The paragraph begins with this assertion:

The BBC has held a high-level seminar with some of the best scientific experts, and has come to the view that the weight of evidence no longer justifies equal space being given to the opponents of the consensus.

There were some who questioned what self-respecting group of scientists would agree with that statement. When pushed to be more specific about this meeting, the BBC said that it was a seminar held in 2006.

Being interested in who the scientists were that would formulate such a statement, Tony Newbery asked for more information – in particular, who attended this meeting. The BBC refused to divulge that information. Tony then issued a request under the Freedom of Information Act (FOIA) asking for the same information.

After a long series of refusals to deliver the information as the law requires, the case was taken to court. The court case was interesting. On one side we have Tony, an old-age pensioner of very limited means; on the other, the BBC had retained no fewer than six barristers to represent it in court. The legal bill for these has been estimated at £40,000 per day.

After some rather strange statements by the judge, expressing a distinct bias towards the BBC’s case, the final verdict was that the information need not be revealed because, for these purposes, the BBC (totally funded by tax) was a private organization.

The verdict is likely to be appealed, simply on the basis that there is not, and should not be, any possibility that the BBC can be considered a private organization.

But at that point, it appeared that there was little possibility of the names of the attendees at this rather important meeting ever being revealed.

The WayBack Machine

At this point, we will take a small diversion to mention a project that few people are really aware of. For many years, a non-profit organization based in San Francisco, California (the Internet Archive) has been taking snapshots of many servers on the Internet. It is essentially a history project, documenting the evolution of the web. It is possible to go back many years and request a page on a given web server as it appeared at a specific date. Not all pages and not all servers are covered, but enough to make this a very useful research tool.

Enter Maurizio Morabito

Maurizio was looking around to see what he could find on this infamous seminar and found a tantalizing reference to it, which ended up pointing nowhere. He had the idea of using the WayBack Machine to locate the document.

A version of the complete document was found there, containing the names of all the participants. It appears that these had been published on the Internet and then, when questions began to be asked, the document had been heavily edited to remove the participants’ names in one case, and deleted in others (for example, the broken link that Maurizio found).

The website on which this was located (International Broadcasting Trust) was interesting in itself, as were some associated documents found there.

It appears that the BBC deliberately obstructed, and spent something of the order of £100,000 of taxpayer money to suppress the release of information previously freely available on the Internet.

One has to wonder why.

It is also clear that this meeting was not organized for the purposes stated. It is much more likely that some justification was needed for the failure to observe the charter, and this meeting was selected to serve as that justification, even though the meeting itself had served some other purpose entirely.

Why This is Important

A quick look reveals that the attendees at this seminar were not “the best scientific experts” at all, but mostly representatives of groups with a vested interest in perpetuating the global warming story.

In addition, there were high-level representatives from almost every branch of BBC programming.

Since this was obviously not what the BBC had stated (a meeting with the best climate scientists), what was its purpose? It is also interesting to note that the attendees and purpose of the meeting were misstated, under oath, by several BBC executives. They appear to have committed deliberate perjury. One can only wonder whether there will be any repercussions from this. Ordinarily, it might have been “overlooked”, but with the current focus on the BBC over its handling of sexual predator issues, there may be a less forgiving attitude.

So what was the meeting about? There was a clue in the climategate emails:

date: Wed Dec  8 08:25:30 2004
from: Phil Jones <p.jones@uea.xx.xx>
subject: RE: something on new online.
to: “Alex Kirby” <alex.kirby@bbc.xxx.xx>

At 17:27 07/12/2004, you wrote:

Yes, glad you stopped this — I was sent it too, and decided to
spike it without more ado as pure stream-of-consciousness rubbish. I can well understand your unhappiness at our running the other piece. But we are constantly being savaged by the loonies for not giving them any coverage at all, especially as you say with the COP in the offing, and being the objective impartial (ho ho) BBC that we are, there is an expectation in some quarters that we will every now and then let them
say something. I hope though that the weight of our coverage makes it clear that we think they are talking through their hats.
—–Original Message—–

Prof. Phil Jones
Climatic Research Unit

The aim is stated fairly clearly in the document that Maurizio unearthed:

The aim
The aim of the seminars is to change minds and hearts. We want to talk about the developing world in a way that is interesting, engaging and provocative, so that the BBC participants and independent producers come away convinced that this is an area which their programmes should no longer ignore. We are not pitching ideas and have no guarantees that specific programmes will be commissioned on these issues. Our goal, therefore, is to bring to life stories and issues from the developing world. We shall not be talking in detail about tv coverage so we do not need participants to have a detailed knowledge of British television.

It appears the aim, which seems to have been achieved, was to insinuate the global warming meme into every thread of every class of programming produced by the BBC, and to ensure minimal to no airtime for anyone with a different viewpoint. Anyone listening to or watching any amount of BBC programming will appreciate how ingrained references to global warming have become; they are made at every appropriate opportunity (and at some totally inappropriate ones too).

This is propaganda that Goebbels would recognize and appreciate: force ideas into everyday life and get them unconsciously assimilated. The BBC has become the propaganda wing of the (lunatic) Green fringe.

Not only is this a most egregious trashing of the BBC charter, but it has undoubtedly cost the British (and probably other) taxpayers untold millions, and diverted government policy into what may well be the biggest waste of money ever. There are also those who have suffered and died through being unable to afford heating, due to the effects of this atrocity.


An Open Letter to Advertising Executives

As an executive responsible for managing a large budget and safeguarding the image of the various brands that you manage, you may be interested in one of the principal reasons why I, and I suspect a lot of other people, are not seeing the advertisements that you are paying a lot of money to have shown on prime-time national TV networks. Even more to the point, I (and many others) are actually paying good money to avoid watching the programs that you are betting upon to attract attention to your advertisements.

The problem isn’t, as you may be thinking, the sheer volume and length of advertising slots during programs. That is an issue, but it’s one that we have learned to live with over the years. No, the real issue is that the TV networks are making the programs themselves — you know, the filler material between the ads, the stuff that attracts the eyeballs in the first place — completely unwatchable.

They do this with those incredibly annoying and disruptive animated things at the bottom of the screen. They were bad enough when they started out at little more than ticker-tape height, but over time they have grown to fill the bottom third of the screen with highly animated and brightly colored distraction — most often, seemingly timed to completely disrupt the atmosphere that the program’s producer has spent his talents building.

Apparently, the networks think that this is just a free slot they can use to pimp their future programming, without having to use the valuable time that they could otherwise sell to you.

Unfortunately, this makes watching any serious programming impossible. This is why I, and presumably many others, will wait, and actually pay money to watch the exact same program via the Internet.

It’s not only about being able to watch on my schedule as opposed to that of the networks. It’s not to avoid your advertising (although that too has its attractions). It is very definitely to avoid the garbage that the networks insist on slapping on the screen during the program I am trying to watch.

I made the mistake of trying to watch House on Fox last night. It was impossible not to be continually distracted and lose track.

It left a very bad taste in my mouth, not only for Fox, but for the morons that spend good money paying for advertising slots that I will ensure I never watch again. Back to Amazon and Netflix for me.

I suggest that you, who really control what goes on here with your advertising budget, apply as much pressure as you can to ensure that the programming around your paid-for time does not include this garbage. Then, one day, I may actually see some of your advertising material once again.


Growth is good. Isn’t it?

With all the wringing of hands and prophecies of doom from politicians, industrialists and TV talking-heads when they begin to discuss the fact that growth of the economy is running at a feeble 2% or so, one can only assume that growth is good, and lack of growth is bad.

However, if we accept that the economy should grow, and that the growth should be sustained at some annual percentage rate, we have to accept that we are then looking at an exponential growth, as illustrated in the graph on the left.

The interesting thing about exponential growth is that it is unsustainable. What exponential growth actually means is that over some period, whatever we are measuring the growth of doubles in size. Over the next period it doubles again, then again, then again.

The actual period for doubling is 69.3147/n years, where n is the annual growth rate expressed as a percentage. 69.3147 is hard to remember, so let’s call it 70, which is much easier to remember because it is the average life-span of humans. So a 3.5% growth rate, which is on the low side of where many economists would like to see it, means that the economy doubles in size every 70/3.5 = 20 years. For the USA, the 2010 GDP (which is one way of measuring the size of the economy) was 14,526,550 million dollars. A 3.5% annual growth rate means that in 2030 the GDP would be 29,053,100 million dollars, 58,106,200 million in 2050, and so on.
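For anyone wondering where the 69.3147 comes from, it drops straight out of the exponential growth law (a standard derivation, nothing specific to this post):

$$X(t) = X_0 e^{rt}, \qquad X(T) = 2X_0 \;\Longrightarrow\; e^{rT} = 2 \;\Longrightarrow\; T = \frac{\ln 2}{r} \approx \frac{0.693147}{r} = \frac{69.3147}{n},$$

where $r$ is the growth rate as a fraction per year and $n = 100r$ is the same rate as a percentage. (Strictly, with annual compounding the doubling time is $\ln 2 / \ln(1+r)$, but for rates of a few percent the difference is negligible.)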

That’s a lot of dollars. That’s a lot of production. How is this going to be achieved? In the past, economic growth has been sustained by essentially two things: an exponential growth in population, and increased productivity of workers.

Until the 1960s the US population was growing exponentially, but it then leveled off, has stayed relatively constant, and would actually be falling without immigration (the 1960s were when the contraceptive pill became widely available).

So it’s clear that population growth can’t be counted on to produce more workers, to produce more widgets to sell, to grow the economy.

Since the beginning of the industrial revolution productivity has been increased by the application of power and technology.

The first stage was mechanization: from the spinning wheel to the spinning jenny, from the horse-drawn plough to the tractor and combine harvester, from hand-built coaches to the production lines of Detroit.

Following mechanization came automation: applying advances in electronics and computers to replace many workers with a smaller number overseeing the automatic production. Faster production of more widgets by fewer people means higher productivity, since productivity is measured essentially in widgets per person per hour.

These advances had two effects: each worker could produce more, and what was produced was cheaper. Production increased, and so did the market, as items became more affordable, not only in America but in foreign markets.

The problem is, we have reached a point where not only are there no more workers to tap, but the technological productivity multipliers have reached their limits. There just isn’t a new technology to produce cars faster and cheaper. There isn’t a new technology to deliver agricultural gains on the scale of moving from horse-drawn plough to tractor, and no way to force crop yields beyond what is currently achievable with fertilizer and genetic modification.

So how is the economy to grow? Short-term solutions, such as making use of the spare (and cheaper) population in other countries, take up some of the slack, especially since many of those countries’ populations are still increasing exponentially. Importing workers helps a little, artificially boosting the population, but in the longer term these measures are destined to fail. Of course, they actually reduce productivity, since the less educated, less skilled (cheaper) workers require more people to do the same work; but by counting only the managers overseeing the “outsourcing” in the head count, and not the much more numerous actual workers, productivity appears to be rising.

Population growth is a problem. There is limited space on the planet, there is limited space to grow food, becoming even more limited when population competes with space to grow food to feed that population.

Nature has ways of taking care of exponential growth. If a single bacterium is placed in a culture medium and allowed to divide, the bacterial population will grow at an exponential rate. Eventually, the limited nutrient available, the limited space and the bacteria’s own waste products cause a catastrophic collapse in the population. In the more general case, there are a variety of things which nature uses to stabilise a population: predators, disease, famine and war. Humans are not immune to nature. Relying upon growth in “offshore” populations is not going to last very long.

As the above chart shows, productivity among countries is not uniform. As worker availability becomes more and more of a resource constraint, how can those countries with lower productivity improve? One way is to compete with the higher-productivity countries for their (offshore/immigrant) workers, and for the energy and raw material resources which also need to keep pace with exponential growth.

One of the buzzwords of politicians when talking about energy is “sustainable”. In the same breath they often talk about sustainable growth. A little bit of thought shows that once a certain stage of development is reached, there is no such thing as sustainable growth. Growth needs to be zero. Over time, the balance of industries will need to change, some will grow, but others have to contract to compensate. Failure to recognize this leads inevitably to the catastrophic crash that wipes out entire populations.

Humans are mostly incapable of dealing with exponential growth. They are much more familiar with linear processes, and almost always underestimate the speed with which the final stages leading to a crash occur. The video series linked below gives some examples of the rapidity with which things go bad. One of the examples is to take a jar and place a single bacterium in it. The bacterium splits into 2, each of those splits giving 4, then 8, then 16, 32, 64, 128 and so on: typical exponential growth.

After exactly one hour, the jar is full. What is amazing to most people is to realize that after 59 minutes the jar is only half full. At 58 minutes it is a quarter full, at 57 minutes one eighth. The end comes very rapidly.
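The arithmetic is easy to check in a couple of lines (a throwaway sketch, nothing more):

```python
# Bacteria double every minute; the jar is full at minute 60.
# Working backwards, k minutes before the end the jar is 1/2**k full.
for minutes_left in range(5, -1, -1):
    fraction = 1 / 2 ** minutes_left
    print(f"minute {60 - minutes_left}: jar is {fraction:.1%} full")
```

Five minutes before the end, the jar still looks 97% empty.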

The following link will take you to a page with a series of links to 10 minute videos of a lecture given by Prof. Bartlett. The whole lecture is approximately one hour long. It will be well worth your time to watch the entire series. You will never look at those innocent looking few percent growth rates in the same way again.

http://www.albartlett.org/presentations/arithmetic_population_energy_video1.html


Apparently, England is still special.

Someone using the name Younger Dryas (not a real name, by the way; if you don’t know what it refers to, look it up on Wikipedia) has noticed something very odd about the temperature in England.

There is a series of temperature measurements that has been kept for central England since 1772. This is the longest continuous temperature record anywhere in the world. It is based upon real thermometer readings, not tree rings or other proxy measurements (although, to be accurate, a thermometer is itself a proxy for temperature, since we are not measuring temperature directly but its effect upon the measuring device). This temperature series is known as the Central England Temperature (CET) series. It covers a roughly triangular area between Lancashire, London and Bristol.

The UK MET Office produces temperature series for the whole of England (as well as Wales, Scotland and Northern Ireland).

What “Younger Dryas” did was to take the temperature readings for the CET and subtract the average temperature over some period. This leaves the differences in temperature, known as “temperature anomalies”. These are what climate scientists prefer to work with, since they show only the changes (from some arbitrary reference temperature) and are not masked by the large value of the actual temperature.
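In code, this step (and the differencing described next) is only a couple of lines. A minimal sketch, assuming two monthly-mean series loaded with pandas — the file names, column names and 1961–1990 baseline here are illustrative assumptions, not the actual data he used:

```python
import pandas as pd

# Hypothetical CSV files of monthly mean temperatures (deg C), date-indexed.
cet = pd.read_csv("cet_monthly.csv", index_col="date", parse_dates=True)["temp"]
england = pd.read_csv("england_monthly.csv", index_col="date", parse_dates=True)["temp"]

def anomaly(series, start="1961", end="1990"):
    """Subtract the mean over a reference period, leaving the anomaly."""
    return series - series.loc[start:end].mean()

# The difference between the two anomaly series should wander around zero
# if the two regions warm and cool together.
diff = anomaly(cet) - anomaly(england)
print(diff.tail(12))
```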

He then took the temperature anomalies generated by the MET Office for the whole of England, and subtracted these from the CET anomalies. Since there will be some difference between temperatures over the whole of England and those over the small part of it covered by the CET, you would expect to see variations up and down around zero as different weather patterns affected different parts of England. And in fact, this is what we do see until fairly recently:

CET compared to whole of England

Around 1992, a large difference opens up between the CET temperature and the MET Office temperature for England, with England apparently getting warmer than the CET region.

Intrigued by this, our friend then decided to compare the CET series against those maintained by the MET Office for Wales, Scotland and Northern Ireland.

He was surprised to discover that the very obvious spike in temperature for England was completely absent when he compared the CET series against these other regional series.

For example, the series for Wales compared against the CET series.

Exactly why parts of England outside the CET triangle, but excluding Wales, Scotland and Northern Ireland should be getting hotter every year is something of a mystery.

My initial thought was that it might be due to the fact that the CET series has been “adjusted” for Urban Heat Island (UHI) effects since 1974. This would lower the actual temperature readings, and if the temperatures for the whole of England were not so adjusted the effect would be to show England getting warmer.

However, it’s clear from the graph that the problem doesn’t begin in 1974; it begins around 1992. Also, such an effect would be equally apparent when comparing against the other regions.

Strangely, this effect appears at just about the time that the “Oh no! We are all going to die! CO2 is burning up the planet!” craze kicked in.

The New Zealand MET Office has been caught adjusting their temperatures to rise every  month. Surely the UK MET Office would NEVER stoop to such underhanded tricks?

CET and Wales anomalies


Original thread is here: http://theweatheroutlook.com/twocommunity/default.aspx?g=posts&t=4732


Life just gets worse

Life is getting hard for the proponents of Anthropogenic Global Warming (AGW), and for the Greens who want to use the AGW theory to drive civilization back into the stone age (no, not the Bronze Age – making bronze requires the use of FIRE).

I mentioned the name of Mark Lynas in a previous article. Mark has always been very much a believer in AGW, and strongly supported the UK government’s push to move to mostly “renewable” energy sources. The politically acceptable renewable sources, that is: solar and wind. Never things like hydro-electric generation (it affects the fish, which is a big no-no, although mincing bats and birds in windmills is seemingly ok … odd, that), and no tidal generation (fish again … Greens seem to have a thing about fish). Oh, and ABSOLUTELY NEVER, EVER atomic power. Mark was a true believer.

Note the use of past-tense there.

Today, Mark published an article in the UK Daily Mail. As usual in the mainstream media, the headline is chosen for its shock value, but in this case appears to be very close to the truth.

Mark’s conversion came not on the road to Damascus (don’t worry if you don’t get the reference) but after doing a bit of his own research into the “facts” presented to him by various “Green” organizations on the topic of power generation, and especially nuclear power. What he discovered was that the real scientific facts were very different from what he had been led to believe. The more he looked, the more he realized that he had not merely been misled; he had been lied to. Hence the rather strong language in his article’s headline.

He discovered that the death toll of the Chernobyl accident was 50, not the 5,000 claimed by Greenpeace, that despite all the wailing in the press about Fukushima, not a single person has died, and no one is likely to.

Unfortunately, Mark still clings to the theory that CO2 is totally responsible for global warming, but we can assume that once he is over the shock of discovering that he has been lied to about nuclear energy, he will turn his attention to the real science behind climate change (or whatever it’s called today).

On that topic, a paper was published today – Kaufmann et al. 2011 – which attempts to address an annoying little fact that until now AGW proponents have been trying to ignore: that between 1998 and 2008 (2008 being the latest data the paper examines) there was no global temperature rise, while CO2 levels continued to rise at the same rate as before 1998.

This is the first paper in which this has actually been acknowledged. The point of the paper is to show that the pause can be explained by tweaking a global temperature model, changing various parameters. Basically, they lay the blame on particulate matter from Chinese coal-fired power stations. They don’t explain exactly why Chinese power stations are different from others, why their net effect is to lower global temperature, or why the answer to global warming is not therefore to fire up a lot more coal-fired power stations in the West.

The paper is generally self-contradictory and based upon a really old, out-of-date climate model, but it is important because it finally puts into print, from AGW-approved scientists, the fact that global warming has stopped, and has not followed the “inescapable” rise due to increased CO2.

It seems that being Green, as Kermit the Frog has always claimed, really is hard. And seemingly due to get even harder.