Building New Skills

There is no better gig than getting paid to learn.

In August of last year, I decided to take on the challenge of teaching an applied statistical techniques course. One small problem: I’m not a statistician. Like most scientists and engineers, I’ve used (and abused) statistics my entire career, so I do have experience with the concepts. My math skills are also not too shabby, as I have incorporated calculus, differential equations, and other advanced topics into my work over the years. (Yes, some people do use algebra.) The good news? I had about four months before the first day of class and enough control over my schedule that I could commit 10+ hours per week to developing the course content. The real challenge was to create the course using R, a language and environment for statistical computing and graphics. To be honest about my own abilities, I was not proficient with the language.

With this in mind, I set out to create the course, with emphasis on the “applied” and the “techniques.” Starting in the fall, I developed a module for students to work through each week. I estimated that my 10–12 hours of work each week would translate into three hours of material for each class, and this worked out surprisingly well.

The course is coming to an end this week, and the best part of the experience was how much I learned (or maybe relearned). Most surprising was how much I learned the SECOND time I worked through the materials, during the weeks I taught the course.

I classify skill sets at three levels:

  • novice: apply basic principles to solve structured problems,
  • hack: gather external resources to solve moderately complex problems,
  • expert: apply advanced principles to solve complex problems with the minimal use of external resources.

I still consider myself a hack when it comes to performing statistical analysis using R, but the opportunity to expand my own skill set and to provide a framework for others to learn something new made this a great gig.

First-year Environmental Chemistry After 30 Years

With both of my daughters now in their first year of college, and with my Ph.D. in Chemistry, there is an assumption that I can be a helpful resource for basic chemistry. (For the record, this is not a safe assumption.)

In my efforts to be helpful, I pulled out my first-year Chemistry text. In the spring of 1987 I was completing my second semester of Chemistry, and as I reviewed the old class syllabus, I noticed that Environmental Chemistry was one of the chapters covered. In thinking about the course, I specifically remember Professor John Hubbard making the analogy that the environment was like a buffer. I don’t recall the particular system, but the analogy applies to both the atmosphere and oceans.

The definition of a buffer solution is pretty simple; it’s a solution that resists change in pH upon addition of either an acid or a base. In a broader sense, we use the term to describe any system that resists change upon addition of a compound that would alter the equilibrium of the system.
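A minimal numerical sketch of this resistance to change, using the standard Henderson–Hasselbalch relation for a generic weak-acid buffer (the pKa of 4.76 is acetate-like and chosen only for illustration; this is not tied to any particular system from the text):

```python
import numpy as np

# Henderson-Hasselbalch for a weak-acid buffer: pH = pKa + log10([A-]/[HA]).
# pKa = 4.76 (acetate-like; an illustrative assumption, not from the text).
pKa = 4.76

# f = fraction of the conjugate base consumed by added acid
# (0 = untouched buffer, 1 = buffer capacity fully exhausted).
f = np.linspace(0.01, 0.99, 99)
pH = pKa + np.log10((1 - f) / f)

# pH barely moves through the middle of the curve...
mid_change = abs(pH[59] - pH[39])   # f = 0.40 -> 0.60
# ...but swings sharply as the buffer nears exhaustion.
end_change = abs(pH[98] - pH[94])   # f = 0.95 -> 0.99
print(f"pH change, 40%->60% consumed: {mid_change:.2f}")
print(f"pH change, 95%->99% consumed: {end_change:.2f}")
```

The flat middle of this curve followed by the sharp swing near exhaustion is exactly the shape of the buffer curve discussed below.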

A single section of the chapter discussed the topics of acid rain, photochemical smog, carbon monoxide, and climate. Within this text, a single paragraph summarized the role of carbon dioxide in maintaining surface temperatures, including the warning: “If the calculated effect of doubling of CO2 level on the surface temperature is correct, this means that the earth’s temperature will be 3 degrees C higher within 70 years.” (Chemistry: The Central Science, T. Brown and H.E. LeMay, Jr., 3rd ed., Prentice-Hall, Englewood Cliffs, 1985, p. 393.) Current CO2 levels are 405 ppm (parts per million), compared to 330 ppm as referenced in the 1985 text (a 23% increase).
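For reference, that percentage is just the relative change in concentration:

$$\frac{405\ \text{ppm} - 330\ \text{ppm}}{330\ \text{ppm}} = \frac{75}{330} \approx 0.23 = 23\%$$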

I was curious… can I see this prediction in data from my home locale of Salt Lake City, Utah?

I pulled a simple data set from NOAA’s website–annual averages from 1948 through 2016. Here are the data and a simple analysis.

Annual Average Temperature (°F) for Salt Lake City, UT, 1948–2016

It’s pretty remarkable. Over the past 69 years, the average annual temperature is increasing at a rate of 0.05 °F/year (0.028 °C/year).

The average (mean) value over this period is 52.4 °F (11.33 °C), with 95% lower and upper confidence limits of 52.0 °F (11.11 °C) and 52.8 °F (11.56 °C), respectively. The top and bottom traces on the graph show the 95% prediction intervals.
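For the curious, this style of analysis can be sketched as follows. The course itself used R; this is a Python version, and it runs on SYNTHETIC data standing in for the NOAA series (the baseline, slope, and noise level below are assumptions for illustration, not the real Salt Lake City measurements):

```python
import numpy as np
from scipy import stats

# Synthetic stand-in for the NOAA annual-average series (NOT the real data):
# an assumed slow linear trend plus year-to-year noise.
rng = np.random.default_rng(42)
years = np.arange(1948, 2017)          # 69 annual values
n = years.size
temps = 52.4 + 0.05 * (years - years.mean()) + rng.normal(0.0, 1.0, n)

# Ordinary least-squares trend (deg F per year).
fit = stats.linregress(years, temps)
trend_line = fit.intercept + fit.slope * years

# 95% confidence interval for the mean annual temperature.
t_mean = stats.t.ppf(0.975, df=n - 1)
se_mean = temps.std(ddof=1) / np.sqrt(n)
ci_lo = temps.mean() - t_mean * se_mean
ci_hi = temps.mean() + t_mean * se_mean

# 95% prediction intervals about the trend line
# (the top and bottom traces on the graph).
resid = temps - trend_line
s = np.sqrt(np.sum(resid**2) / (n - 2))
t_pred = stats.t.ppf(0.975, df=n - 2)
half = t_pred * s * np.sqrt(1.0 + 1.0 / n
                            + (years - years.mean())**2
                            / np.sum((years - years.mean())**2))
pred_lo, pred_hi = trend_line - half, trend_line + half

print(f"slope: {fit.slope:.3f} F/yr   mean: {temps.mean():.1f} F   "
      f"95% CI: ({ci_lo:.1f}, {ci_hi:.1f}) F")
```

Note that the confidence interval bounds the *mean* of the series, while the prediction interval bounds where an individual year's value is expected to fall; the latter is much wider, which is why recent years can sit comfortably inside it.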

So what does this mean? If we look at temperatures from 2012 through 2016, they all fall inside the 95% prediction intervals. So, no problem! (Right?)

But go back to the initial premise that the atmosphere is a buffer: when will we know that we’ve exceeded the “buffer capacity” for CO2?

An example of a buffer curve showing the variable under observation versus percent completion.

And that’s the problem: we don’t know how much of the buffering capacity we’ve consumed, and we probably won’t know where we are on the curve until after we’ve reached a tipping point and temperatures accelerate beyond the slow, apparently linear trend we observe today.