Sunday, 7 October 2018

"reads like someone outside their area of expertise wrote it"

Reblog of Twitter thread by: Ryan Maue | weathermodels.com | @RyanMaue

Tropical storm expert Ryan Maue said the new IPCC report section on tropical storms "reads like someone outside their area of expertise wrote it".


The Tropical Cyclones section in Chapter 3 of the IPCC 1.5°C Special Report is poorly written & referenced -- reads like someone outside their area of expertise wrote it ... including out of date 15-year old studies? This isn't 2005.

Don't waste printer paper or ink on the @IPCC_CH 1.5°C report for tropical cyclones. Reference this page ---> https://t.co/I908S7hCgb which is continually updated by experts.

1) Numerous studies have not reported a decrease in the number of global tropical cyclones or accumulated cyclone energy.

I'd know because I wrote the last papers cited in IPCC SREX that say otherwise.

The statement is just false -- makes no sense w/cited references. Bizarre.

2) This paragraph is awful -- fails to provide the consensus reached already in the previous IPCC AR5 and SREX and instead goes back to papers from 10-15 years ago -- long outdated and deprecated.

3) Tropical Cyclones in next 10-40 years? Same number globally -- probably a few more intense by percentage of total. A bit wetter near the eye ... and when making landfall.

That's it.

Wednesday, 3 October 2018

Will the Large Hadron Collider (LHC) destroy earth?

No.

Many of the particles which strike the Earth are orders of magnitude more energetic than anything we are ever likely to produce. Some particles, like the infamous “Oh-my-god” particle which struck Earth in 1991 with an energy of 3×10⁸ TeV, hitting us at 99.99999999999999999999951% of the speed of light, defy explanation – we shall likely never find a way to produce particle energies of that magnitude (for comparison, the Large Hadron Collider, Earth’s most powerful particle accelerator, produces particles at around the 4 TeV range).

-- Eric Worrall
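
The quoted speed figure can be checked with a little relativistic arithmetic. A minimal sketch, assuming the Oh-my-god particle was a proton (rest energy about 938.272 MeV, i.e. 9.38272e-4 TeV):

```python
# Hedged sketch of the quoted figures, assuming a proton primary
# (rest energy ~938.272 MeV = 9.38272e-4 TeV).
E_TeV = 3e8                      # Oh-my-god particle energy, 3 x 10^8 TeV
m_TeV = 9.38272e-4               # assumed proton rest energy, in TeV
gamma = E_TeV / m_TeV            # Lorentz factor, from E = gamma * m * c^2

# v/c is so close to 1 that an ordinary float cannot hold it directly;
# for large gamma, 1 - v/c is approximately 1 / (2 * gamma^2).
shortfall = 1 / (2 * gamma ** 2)
print(shortfall)  # ~4.9e-24, consistent with 99.99999999999999999999951% of c
```

The shortfall of about 4.9×10⁻²⁴ matches the 21 nines followed by "951" in the quoted percentage.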

Sunday, 30 September 2018

Greenhouse Effect flat earth physics diagram

Reblog. Authored by Stephen Wells

The go-to paper for academic courses on the greenhouse effect, which forms the basis of man-made global warming alarmism. It is taught in physics undergraduate courses across the world, and has been responsible for the dumbing down of science for the last two decades.

I put the following question to astrophysicist Joseph Postma:

"You’ve mentioned a few times that you were taught the Greenhouse Effect flat earth physics diagram in your undergraduate degree. May I ask why do you think you didn’t pick up on the pseudoscience back then? What conditions prevailed back then that obscured the errors from you?"

His answer:

"Now that’s an excellent question! Great of you to find the dichotomy there and realize that it is a question. If we can solve that question, perhaps we could solve it for others?

"Let’s see. Well, when I was in undergrad, it was simple: I trusted what I was being told. I saw the flat Earth, saw how it was mathematically formulated, and thought that since it was an average, then everything is OK. I simply didn’t question it or think about it. That’s what happens in a class-room: you accept what you are told. You believe it because nothing being taught in class would be wrong anymore, would it? How could they teach wrong things in a classroom, when in a classroom the point is to get the correct answers?

"Wow…that is scary. Doesn’t that show just how corruptible, corrupt, dangerous and useless our education system actually is. It’s all brain washing and conditioning at a subconscious level. How can anything taught in a class room be wrong when the point of being in a class room is to learn true things that you get rewarded for repeating with check marks and higher grades?

"So to answer your question, I think the answer is that the conditions which prevailed were: naivety. And that is a term I’ve used in recent articles criticizing how academics so readily accept bad science from “high” sources – they naively believe that other people are doing right things.

"What broke my naivety, then? I would say first, it was simply in reading “Fit for Life” upon recommendation of a girl I was dating. That started the process. Then it was solidified totally when I read “State of Fear” upon recommendation from another girl I was dating, and decided to look into the criticisms and their answers myself.

"Why would that work for me? I don’t know.

"But the key is that a person has to begin to realize that the world is not true as it has been presented to them. You have to realize just how much we take everything for granted, without truly understanding where these ideas came from. It takes a lot of work and a lot of critical thinking…and this is something that I guess most people aren’t interested in doing, or maybe they would be, but they’re too propagandized and distracted with trivia."

Saturday, 29 September 2018

Statisticians are misled by bad data.

Statistics makes sense provided that the original data is clean. When the original data is dirty, adjusted, and infilled by up to 50%, can one still give credence to statistics, as Andrew Gelman does in this post?

I suggest the chart Andrew leads that post with is junk data: too dirty, adjusted, and infilled to form a record of reality.

  1. The USA's best land surface data, the U.S. Climate Reference Network (USCRN), shows no warming since it began operating 13 years ago:
  2. The satellite record doesn't show dramatic warming either. The UAH series shows about 0.06°C/decade over the last 3 decades, now that man-made CO2 emissions are greatest. Only about 20% of what warmists think it is!
  3. Warmists avoid using data which "hides their warming". Is this the first time most of you reading this have heard of USCRN? Why, given that it is the most accurate, most scientific, faithful data for land surface temperatures in the USA for the past 13½ years?
  4. Andrew Gelman's data is "dirty, adjusted, and infilled by up to 50%"
  5. Can Andrew Gelman explain to us how warmist NASA GISS got their chart to disagree so widely with Hansen's from 1999, when both are based on the same data?

Thursday, 27 September 2018

We Should Not Call Climate Alarmists "Liars".

Because they are far more dangerous and deluded than mere liars.

Calling alarmists liars is over the top, and counterproductive: it undermines criticism. OK, I admit: some of them have lied. But that's not the norm, nor was it responsible for their 15 years of fame. Their norm is a groupthink so biased, so committed to its mission, that it's almost insane. Groupthink and mission.

But groupthink leads to incompetence. Mann was favoured and parachuted into position as IPCC lead author. His Hockey Stick was accepted as “science”. Even today, climate alarmists tell me it's “science”. No group-thinking climate alarmists looked at what Mann did. Not one looked at the nitty-gritty details. Only skeptics did that, and found the flaws and pseudo-science.

Yet that same groupthink still believes it acceptable to drive skeptics out of work (so some of them tell me). Even when they can see the harm they’ve done, they still want to cause more. The “climate consensus” are dangerous, irresponsible people. But, in their minds, they do not tell “lies”. They do irresponsible “science” (modelling, actually), declare it to be “true”, settled and a consensus, all because it most accurately represents their bias and fulfils their "mission", or purpose.

Saturday, 1 September 2018

Wegman Report

The Wegman report was produced in 2006. It is an independent analysis of the statistics used to create the Hockey Stick, which featured prominently in the IPCC Third Assessment Report. It is here.

The executive summary reports the primary problem with Mann's Hockey Stick paper:

“The controversy of Mann’s methods lies in that the proxies are centered on the mean of the period 1902-1995, rather than on the whole time period. This mean is, thus, actually decentered low, which will cause it to exhibit a larger variance, giving it preference for being selected as the first principal component. The net effect of this decentering using the proxy data in MBH98 and MBH99 is to produce a “hockey stick” shape. Centering the mean is a critical factor in using the principal component methodology properly. It is not clear that Mann and associates realized the error in their methodology at the time of publication.”

Finding number 7 discredits claims that the 1990s were the hottest decade in a millennium:

“Our committee believes that the assessments that the decade of the 1990s was the hottest decade in a millennium and that 1998 was the hottest year in a millennium cannot be supported by the MBH98/99 analysis. As mentioned earlier in our background section, tree ring proxies are typically calibrated to remove low frequency variations. The cycle of Medieval Warm Period and Little Ice Age that was widely recognized in 1990 has disappeared from the MBH98/99 analyses, thus making possible the hottest decade/hottest year claim. However, the methodology of MBH98/99 suppresses this low frequency information. The paucity of data in the more remote past makes the hottest-in-a-millennium claims essentially unverifiable.”

McIntyre and McKitrick exposed the problems and showed that stationary trendless red noise would exhibit the same hockey stick shape after being processed using the MBH methodology! This was confirmed by Wegman.
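
The decentering effect is easy to illustrate. Here is a minimal sketch (an illustration of the statistical point only, not McIntyre and McKitrick's actual code; all series lengths and parameters are arbitrary assumptions): centering each series on a short late sub-period, rather than on the full record, inflates its apparent variance, which is what biases principal-component selection.

```python
import random
import statistics

# Illustration of the decentering point only -- not McIntyre & McKitrick's
# actual code. Series lengths and parameters are arbitrary assumptions.
random.seed(42)

def ar1(n, phi=0.9):
    """Generate a stationary, trendless AR(1) 'red noise' series."""
    x, out = 0.0, []
    for _ in range(n):
        x = phi * x + random.gauss(0, 1)
        out.append(x)
    return out

def variance_about(series, centre):
    """Variance of the series measured about a chosen centring value."""
    return sum((v - centre) ** 2 for v in series) / len(series)

full_centred, short_centred = [], []
for _ in range(200):
    s = ar1(100)
    # Centre on the whole record (proper) vs only the last 15 values
    # (analogous to centring proxies on the 1902-1995 calibration period).
    full_centred.append(variance_about(s, statistics.fmean(s)))
    short_centred.append(variance_about(s, statistics.fmean(s[-15:])))

# Variance about any point other than the series' own mean is strictly
# larger, so short-centred series carry inflated variance -- and hence get
# preferential weight when the first principal component is selected.
print(statistics.fmean(short_centred) > statistics.fmean(full_centred))  # True
```

Since variance is minimised about a series' own mean, the short-centred figure always comes out at least as large, so red-noise series whose late segment happens to deviate are preferentially weighted.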

Urban heat island effect

The urban heat island effect refers to the tendency of urban temperatures to be much higher than rural temperatures. The extra warmth is due to:

  • more heat created by people
  • landscape changes. For example roads and concrete surfaces absorb more solar heat
  • less evaporative cooling. There's less vegetation and water evaporation in built-up areas. Water evaporation is one of the major causes of earth surface cooling.

Is the global warming scare due to misread thermometers?

  • Actual measurements of the best rural ground stations in the USA show no warming trend over the past 13 years, since the U.S. Climate Reference Network was built.

    When we look at average surface temperature but exclude urban data we get something like this:

    The chart above shows data from the U.S. Climate Reference Network (USCRN). This is a network of 114 best-practice surface stations, carefully sited to suffer no bias from the urban heat island effect. USCRN uses the latest, highest-quality measuring instruments. These are the most scientific, accurate, precise, direct, ground-level measurements of temperature made anywhere. USCRN shows no warming. The network is only 13 years old, and covers only the USA. Climate alarmists mostly pretend this network does not exist. They don't quote it.

  • The chart below shows: Annual-mean temperature in California, averaged over population centers exceeding 1,000,000 (upper) & of less than 100,000 (lower). Superimposed is the record of Global Mean Temperature (GMT) from the network of surface stations (dotted). The warming trend is clearly far higher in densely populated areas. Source: Goodridge, J, 1996: Bull Am Meteorol Soc, 77, 1588–1589.
  • The 3rd chart is a histogram of observed temperature trend over California, as a function of population. It implies near-zero warming in lightly populated areas. Source: Robinson, A, Baliunas, S, Soon, W, and Z Robinson, 1998: Med Sent, 3, 171–178.
  • The UHI effect got worse over time. Paradoxically, since we began worrying about 'global warming', scientists took lower-quality surface readings. They included a higher proportion of stations corrupted by the UHI effect. In the 1990s thousands of rural surface weather stations were closed, leaving a strong bias towards urban weather stations.
  • For 20 years, climate scientists have been making massive adjustments to surface temperatures. They justify these adjustments by claiming to be fixing the corruption introduced by UHI! The adjustments are massive in both number and size. Here is an example from Iceland. It seems strange to me that climate scientists should be so reliant on temperatures measured at surface stations, because:
    • most are so unreliable: temperature measurements are usually just the maximum and minimum recorded per day, so a true daily 'average' is never measured, and error bounds are usually around ±2°C
    • the excuse for closing so many surface stations in the 1990s was that, in future, scientists would rely on (more accurate and precise) satellite measurements. Satellites give:
      1. far more accurate,
      2. more precise measurements,
      3. made at far more locations (at least ten times more),
      4. a much better picture of isolated areas (which we would otherwise hardly know about).
      5. Satellites eliminate the bias introduced by relying on stations corrupted by the urban heat island effect.
      To summarise: Satellites allow scientists to:
      1. avoid an urban heat island effect biasing temperatures
      2. not interpolate (invent data) for isolated areas where they don't take surface station readings
      3. take more accurate and precise readings
      4. avoid the issue of broken and malfunctioning surface stations
      5. obtain far more readings
      Why do climate scientists continue to use the obsolete ground surface measurements, and make such a big deal of them in the media, when satellite readings have been available for the past 40 years? It seems to me unscientific to claim that adjusted surface readings containing massive amounts of interpolated data are more reliable than satellite readings. But that's what they say. Scientists complain that the satellite record is corrupted by cloud and atmospheric effects. That is true. But what of the massive adjustments they make to the surface record? Today scientists interpolate (make up) nearly half the data they use, and adjust some of the rest.
  • The last 11,000 years show no remarkable recent climate warming.

    One might think that scientists are able to compare the effect of errors introduced by relying on either 1) satellites, or 2) existing ground stations. I never see them quote these errors to give any measure of reliability, or any measure to compare the quality of satellite against ground station measurements. They seem to have no agreed method to calculate the degree of error. So their choice of one over the other (their preference for massively adjusted ground station data) is a subjective judgement.

    Today's climate scientists are not behaving as real scientists.
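
The min/max point above is easy to demonstrate. A minimal sketch, using a hypothetical asymmetric diurnal temperature curve (an assumed shape for illustration, not real station data), shows the min/max midpoint differing from the true 24-hour average:

```python
import math

# Hypothetical, purely illustrative diurnal temperature curve (assumed
# shape, not real station data): a 24-hour cycle with some asymmetry.
def temp(hour):
    theta = math.pi * hour / 12               # 24-hour period
    return 15 + 5 * math.sin(theta) + 2 * math.cos(2 * theta)

hourly = [temp(t) for t in range(24)]

true_mean = sum(hourly) / 24                  # full 24-hour average
minmax_mean = (max(hourly) + min(hourly)) / 2  # what min/max stations yield

print(round(true_mean, 2), round(minmax_mean, 2))  # 15.0 vs ~13.27
```

For any asymmetric daily cycle the midpoint of the extremes is a biased estimate of the daily average, which is one reason min/max-only records cannot recover a true 'average' temperature.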