First-year Environmental Chemistry After 30 Years

With both of my daughters now in their first year of college, and me having a Ph.D. in Chemistry, there is an assumption that I can be a helpful resource with basic chemistry. (For the record, this is not a safe assumption.)

In my efforts to be helpful, I pulled out my first-year Chemistry text. In the spring of 1987 I was completing my second semester of Chemistry, and as I reviewed the old class syllabus, I noticed that Environmental Chemistry was one of the chapters covered. In thinking about the course, I specifically remember Professor John Hubbard making the analogy that the environment was like a buffer. I don’t recall the particular system, but the analogy applies to both the atmosphere and oceans.

The definition of a buffer solution is pretty simple; it’s a solution that resists change in pH upon addition of either an acid or a base. In a broader sense, we use the term to describe any system that resists change upon addition of a compound that would alter the equilibrium of that system.
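As a concrete sketch of the idea (my own illustration, not from the textbook): for a weak acid/conjugate base pair, the Henderson-Hasselbalch equation shows why the pH barely moves until the conjugate base is nearly consumed. The acetate buffer and the amounts below are hypothetical.

```python
import math

# Henderson-Hasselbalch: pH = pKa + log10([A-]/[HA])
# Hypothetical acetate buffer: 0.10 mol each of acetic acid (HA)
# and acetate (A-); pKa of acetic acid is about 4.76.
pKa = 4.76
acid, base = 0.10, 0.10

def ph_after_strong_acid(mol_added):
    """pH after adding strong acid, which converts A- to HA mole-for-mole."""
    a = base - mol_added   # conjugate base remaining
    b = acid + mol_added   # weak acid formed
    return pKa + math.log10(a / b)

print(ph_after_strong_acid(0.0))   # 4.76: equal concentrations, pH = pKa
print(ph_after_strong_acid(0.05))  # half the base consumed, pH still near pKa
```

Adding half an equivalent of strong acid drops the pH by less than half a unit; only as `mol_added` approaches 0.10 (the buffer capacity) does the pH fall off a cliff.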

A single section of the chapter discussed the topics of acid rain, photochemical smog, carbon monoxide, and climate. Within this section, a single paragraph summarized the role of carbon dioxide in maintaining surface temperatures, and it carried a warning: “If the calculated effect of doubling of CO2 level on the surface temperature is correct, this means that the earth’s temperature will be 3 degrees C higher within 70 years.” (Chemistry: The Central Science, T. Brown and H.E. LeMay, Jr., 3rd ed., Prentice-Hall, Englewood Cliffs, 1985, p. 393.) Current CO2 levels are 405 ppm (parts per million), compared to the 330 ppm referenced in the 1985 text (https://www.scientificamerican.com/article/atmospheric-carbon-dioxide-hits-record-levels/) — an increase of about 23%.
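The 23% figure is easy to verify:

```python
# CO2 concentrations: 330 ppm (the 1985 text) vs. 405 ppm (current,
# per the Scientific American article above).
old_ppm = 330
new_ppm = 405

pct_increase = (new_ppm - old_ppm) / old_ppm * 100
print(f"{pct_increase:.1f}% increase")  # 22.7%, rounded to 23% above
```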

I was curious… can I see this prediction in data from my home locale of Salt Lake City, Utah?

I pulled a simple data set from NOAA’s website: annual averages from 1948 through 2016. Here are the data and a simple analysis.

Annual Average Temperature (°F) for Salt Lake City, UT (1948–2016)

It’s pretty remarkable. Over the past 69 years, the average annual temperature has been increasing at a rate of 0.05 °F/year (0.028 °C/year).

The average (mean) value over this period is 52.4 °F (11.3 °C), with 95% lower and upper confidence limits of 52.0 °F (11.1 °C) and 52.8 °F (11.6 °C), respectively. The top and bottom traces on the graph show the 95% prediction intervals.
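For anyone who wants to repeat the exercise, the analysis is an ordinary least-squares fit with prediction intervals. Here is a minimal sketch with numpy; I’ve substituted synthetic data with the same shape as the SLC series, since the actual NOAA file would be loaded from your own download.

```python
import numpy as np

# Synthetic stand-in for the NOAA annual averages (1948-2016); a real
# analysis would load the downloaded station data here instead.
rng = np.random.default_rng(0)
years = np.arange(1948, 2017)
temps = 52.4 + 0.05 * (years - years.mean()) + rng.normal(0.0, 1.0, years.size)

# Ordinary least-squares linear trend: temp = b0 + b1 * year
b1, b0 = np.polyfit(years, temps, 1)

# Residual standard error and 95% prediction-interval half-widths
n = years.size
resid = temps - (b0 + b1 * years)
s = np.sqrt(resid @ resid / (n - 2))          # residual standard error
t95 = 1.996                                   # t critical value, 67 d.o.f.
sxx = np.sum((years - years.mean()) ** 2)
half = t95 * s * np.sqrt(1 + 1 / n + (years - years.mean()) ** 2 / sxx)

print(f"trend: {b1:.3f} degF/year")
```

Run on this synthetic series, the fitted slope recovers roughly the 0.05 °F/year built into it; `b0 + b1 * years ± half` gives the top and bottom traces like those on the graph.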

So what does this mean? If we look at temperatures from 2012 through 2016, they all fall inside the 95% prediction intervals. So, no problem! (Right?)

But going back to the initial premise that the atmosphere is a buffer: when will we know that we’ve exceeded the “buffer capacity” for CO2?

An example of a buffer curve showing the variable under observation versus percent completion.

And that’s the problem: we don’t know how much of the buffering capacity we’ve consumed, and we probably won’t know where we are on the curve until after we’ve reached a tipping point and temperatures accelerate beyond the slow, apparently linear trend we observe today.

Why should we fund research? Because it is an investment in individuals that pays off.

What’s the ROI to the nation? A quick, back-of-the-envelope estimation (okay… a quick spreadsheet estimation) shows the internal rate of return to be positive. The short-term cost of supporting graduate students has long-term benefits.

Figure 1. Estimated salary from age 22 to 59 for a hypothetical worker.


Figure 2. The internal rate of return on supporting a graduate student for six years in this example is 3.6%.
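The 3.6% in Figure 2 came out of the spreadsheet, but the same kind of calculation can be sketched in a few lines. The cash flows below are hypothetical placeholders (six years of support costs followed by 32 working years of incremental benefit), not the actual figures behind the chart.

```python
# Hypothetical cash flows (in $1000s): six years of graduate-student
# support as costs, then 32 years of incremental benefit to the nation.
# Illustrative numbers only, not those behind Figure 2.
cash_flows = [-45] * 6 + [15] * 32

def npv(rate, flows):
    """Net present value of a cash-flow series at the given discount rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(flows))

def irr(flows, lo=-0.99, hi=1.0, tol=1e-8):
    """Internal rate of return via bisection on NPV.

    Assumes a conventional series (costs first, then benefits), so NPV
    crosses zero exactly once between lo and hi.
    """
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, flows) > 0:
            lo = mid          # NPV still positive: the root is at a higher rate
        else:
            hi = mid
    return (lo + hi) / 2

print(f"IRR: {irr(cash_flows):.1%}")
```

With these placeholder numbers the IRR lands in the low single digits, the same ballpark as the figure; the point is simply that a modest, multi-decade payback stream makes the up-front support pencil out.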

As a democracy, we have a unique opportunity to decide what is important. What “Great” ideas can we as a nation support that as individuals we could never achieve?

Most people today are familiar with an MRI; however, most people do not know the technology is based on Nuclear Magnetic Resonance (NMR). The fundamental physics of NMR was first worked out in the 1930s. Development of the technique continued after World War II, but it wasn’t until 1977 that the first MRI was performed on a person. (https://www.aps.org/publications/apsnews/200607/history.cfm)

That’s a three to four-decade process from lab to commercialization. For every mainstream technology our society uses today, there is a similar story.

How many graduate students worked on projects during the 30+ years of development of the final product? Buried inside the major headlines are thousands of “little” stories. Stories of our nation’s commitment to supporting researchers and their basic research.

Those who pursue advanced degrees in science and engineering, financially supported by our national government, provide the foundation for companies that take an idea to market. Industries (semiconductor, pharmaceutical, medical device, energy) aren’t building widgets anymore; they’re building vast databases of knowledge that will result in physical products. The jobs needed to create those physical products will rely on highly automated systems, and any task that requires low-skill labor is going to be outsourced or marginalized. As a democracy, we can support the creation of knowledge and develop the skills to improve our quality of life, or fall behind the countries making that investment. As a bonus, it makes financial sense as well.