Saturday, September 23, 2023

Global Warming Fact-Checking and Independent Data Analysis/Verification

Introduction

At some point around 2016, I decided to do a small check of whether Global Warming is real. I wrote a small program and downloaded weather data from an ordinary weather website (I think it was http://www.wunderground.com). I only got data for 10 cities around the world, and after a simple analysis I found that in some cities the temperature goes up and in some cities it goes down. Before this, I was a Global Warming true believer, but after this first data analysis I started having some doubts.

Research Flow

Starting point

All these years from 2016 until 2023 I kept thinking about it, and this is what I came up with. Finally, in 2023 I found all the data I needed and did a proper analysis. As with any research, it is always good to understand what was done by others before me. I asked myself: what should be the most trusted and knowledgeable source of information about Global Warming? The answer is obvious: The Intergovernmental Panel on Climate Change (https://www.ipcc.ch/). It is a United Nations organisation which has all the latest knowledge on this topic. Every few years this organisation produces a report, and the latest one, in 2023, was the Sixth Assessment Report. There are many different versions of this report: short, full, summary for policymakers, etc. I took the full scientific version, which is "The Physical Science Basis", available at this link: https://report.ipcc.ch/ar6/wg1/IPCC_AR6_WGI_FullReport.pdf. This report was published in 2021. The previous report is dated 2013.

Raw Data Sources

The report mentioned above has 2409 pages. I have to admit I did not read all of them; I only found the relevant pieces. Let's start with page 59 (page 76 in the PDF reader) of this report. It is written there that the temperature increased by 1.09 degrees Celsius from 1850 until 2020. In brackets there is a so-called confidence interval [0.95 - 1.20], which means the real temperature increase is somewhere between 0.95 and 1.20 degrees Celsius.


On page 61 (page 78 in the PDF reader) there is a graph of temperature versus year.

In this graph, the data set names are mentioned. I tried to find all of them and found that NOAA is the easiest to get and analyze because it is in CSV format. For example, Berkeley Earth has data only up to 2013, and the others have different data formats or do not provide raw data. By raw data, I mean the temperature at a specific point on a specific date. Temperature is recorded by so-called weather stations. This is what a raw record in the NOAA data set looks like:

AE000041196,20230101,TAVG,207,H,,S,

First comes the weather station id, then the date (1 January 2023), and next is the data description. TAVG means average temperature. 207 is the temperature multiplied by 10, so the real temperature is 20.7 degrees Celsius. All this raw data can be found at this link: http://www.ncei.noaa.gov/pub/data/ghcn/daily/by_year/, and readme.txt has a detailed description of the fields and their meaning. There is also data about each weather station, including its position, here: https://www.ncei.noaa.gov/pub/data/ghcn/daily/ghcnd-stations.txt.
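A record like this can be parsed in a few lines of Python (a minimal sketch; the exact field layout, including the flag columns, is described in readme.txt at the link above):

```python
def parse_ghcn_record(line):
    """Parse one line of a GHCN-Daily by_year CSV record.

    Fields: station id, date (YYYYMMDD), element code, value
    (tenths of a degree Celsius for temperature elements), then flags.
    """
    fields = line.strip().split(",")
    station_id, date, element, value = fields[0], fields[1], fields[2], fields[3]
    return {
        "station": station_id,
        "date": date,
        "element": element,            # e.g. TAVG, TMIN, TMAX
        "value_c": int(value) / 10.0,  # tenths of a degree -> degrees Celsius
    }

record = parse_ghcn_record("AE000041196,20230101,TAVG,207,H,,S,")
print(record["station"], record["date"], record["element"], record["value_c"])
# AE000041196 20230101 TAVG 20.7
```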
I decided to build the same graph as in the previous image myself, based on the raw data from the links above.

Data Preparation and Analysis

First, I downloaded all the data from the link above, from the year 1750 until 2022. Each year's data comes as a separate file, and I merged all of them into a single file. The data contains many different weather parameters, such as precipitation, wind, etc. I had to clean it up (delete all non-temperature values) to keep only the temperature. Then I found that some data is missing: for some specific dates there is no average temperature, but there are minimum and maximum temperatures. I restored the average temperature as the minimum plus the maximum divided by 2.
I calculated the average temperature per year and put this data into the Tableau business intelligence tool so that I could publish nice dashboards here.
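The cleanup steps described above can be sketched roughly like this (a simplified sketch assuming the records are already parsed into tuples; the real data needs more validation and quality-flag filtering):

```python
from collections import defaultdict

# Keep only temperature elements; precipitation, wind, etc. are dropped.
TEMP_ELEMENTS = {"TAVG", "TMIN", "TMAX"}

def restore_tavg(records):
    """Given records as (station, date, element, value_c) tuples, return one
    average temperature per (station, date). If TAVG is missing but TMIN and
    TMAX exist, restore it as (TMIN + TMAX) / 2."""
    by_day = defaultdict(dict)
    for station, date, element, value_c in records:
        if element in TEMP_ELEMENTS:
            by_day[(station, date)][element] = value_c
    result = {}
    for key, elems in by_day.items():
        if "TAVG" in elems:
            result[key] = elems["TAVG"]
        elif "TMIN" in elems and "TMAX" in elems:
            result[key] = (elems["TMIN"] + elems["TMAX"]) / 2
    return result

def yearly_average(daily):
    """Average all daily values that fall in the same year (date is YYYYMMDD)."""
    sums, counts = defaultdict(float), defaultdict(int)
    for (station, date), value in daily.items():
        sums[date[:4]] += value
        counts[date[:4]] += 1
    return {year: sums[year] / counts[year] for year in sums}

daily = restore_tavg([("X1", "20220101", "TMIN", 10.0),
                      ("X1", "20220101", "TMAX", 20.0)])
print(daily[("X1", "20220101")])  # 15.0
```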

Questions about Global Warming

Before I show the results of my analysis, I would like to mention what kind of questions I have about Global Warming. There are really only 2 questions: error and coverage. Let me explain each in more detail.

Source Data Error

Usually, in physical experiments, all original data comes like this: 20.7±0.1, which means the actual value is between 20.6 and 20.8; 0.1 is the uncertainty. So, the first question: in the raw data at http://www.ncei.noaa.gov/pub/data/ghcn/daily/by_year/ there is no uncertainty. As I found here https://journals.ametsoc.org/view/journals/atot/29/7/jtech-d-11-00103_1.xml (an official publication from NOAA), some of the data was provided by volunteers, and nobody knows what kind of tools they had or what the uncertainty of this volunteer data is.
In real scientific experiments, any data has to be paired with an uncertainty (error). At my university, if I came to a professor with experimental data like this:

Temperature °C
20.7
21.1
20.5

The professor would reject my work. It must be like this:

Temperature °C    Uncertainty ±°C
20.7              0.1
21.1              0.1
20.5              0.1

If it is not like this, then the data is not scientific. Without uncertainty, it is a school-level experiment.
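For the table above, the uncertainty of the average can be propagated with the standard textbook formula: for N independent measurements with the same uncertainty σ, the uncertainty of the mean is σ/√N. A small sketch (illustrative only, not derived from the NOAA data):

```python
import math

def mean_with_uncertainty(values, sigma):
    """Mean of N independent measurements, each with uncertainty sigma.
    The standard error of the mean is sigma / sqrt(N)."""
    n = len(values)
    mean = sum(values) / n
    return mean, sigma / math.sqrt(n)

mean, err = mean_with_uncertainty([20.7, 21.1, 20.5], 0.1)
print(f"{mean:.2f} ± {err:.2f} °C")  # 20.77 ± 0.06 °C
```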

Data Coverage

This is a very important question, because if we do not have enough data points (weather stations) on the map, our data can be meaningless. For example, if there are only 2 points for a whole continent, then there is no point in calculating an average temperature for this continent; it is just not enough data. A good question is how many weather stations are needed to provide good coverage. Sometimes I drive 10 km from my home to the seashore and the temperature drops by 2 degrees. I believe good coverage would be one weather station every 10 km. I will show the actual coverage in the detailed analysis.
Someone might say that we have satellite data as well, so why should we rely on weather stations? I would like to link to the official NASA website, where they say that ground weather stations are still more precise than satellites: https://climate.nasa.gov/faq/49/which-measurement-is-more-accurate-taking-earths-surface-temperature-from-the-ground-or-from-space/.

Not only temperature

Related topics, such as melting glaciers and the problems polar bears face because polar ice melts too early in the year, run into the same error and coverage questions. Are there measurements of glacier size every 100 meters for the last 50 years that would let us say its size was reduced? Polar bears live on a shoreline that is 12000 km long. Are measurements of all these 12000 km available, at least every 10 km?

Detailed Analysis

Temperature trend

If I build a graph similar to the one in the report from The Intergovernmental Panel on Climate Change (temperature per year from 1850 until 2022), I get this graph.


The temperature goes up from 9 to 12 degrees (more than 3 degrees), even more than in The Intergovernmental Panel on Climate Change report. But let's see what happens if I change the year range. If I set the year range to 1960-2022, I get this:
So the temperature increased just a little bit, 0.1 degrees or even less. If I set it to 1970-2022, it looks like this:
The temperature goes down by 0.6 degrees! I could not believe it when I first got this result! We have global cooling, not Global Warming. The ice age is coming soon!
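How strongly the trend depends on the chosen year range can be checked with a simple ordinary least-squares slope over the yearly averages. This is a sketch with made-up numbers for illustration only, not the real NOAA yearly averages:

```python
def trend_per_decade(years, temps):
    """Ordinary least-squares slope of temperature vs. year,
    returned in degrees per decade."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(temps) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, temps))
    var = sum((x - mean_x) ** 2 for x in years)
    return 10 * cov / var

# Made-up yearly averages, purely for illustration:
years = [2000, 2001, 2002, 2003, 2004]
temps = [10.0, 10.1, 9.9, 10.2, 10.3]
print(round(trend_per_decade(years, temps), 2))  # 0.7 degrees per decade
```

Re-running the same function on a different slice of the years list is exactly what changing the year range filter in the dashboard does.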
Anyone can play with this data and filter it by year via this link on the Tableau website or below (embedded dashboard from the Tableau website):

Data coverage analysis

As a second stage, I decided to check the coverage: how many weather stations we have and how they are distributed on the Earth. As I mentioned above, I believe there should be one weather station every 10 km for good coverage. This is the weather station coverage for 2022.

Many regions are barely covered, such as Antarctica, Greenland, the north of Canada, Siberia, and Africa. Another example: in the Berkeley Earth data set, there is a global average temperature anomaly here https://berkeley-earth-temperature.s3.us-west-1.amazonaws.com/Global/Land_and_Ocean_summary.txt; for the year 1850 it is -0.454 degrees Celsius with an uncertainty (error) of 0.143 degrees. This looks very precise for the year 1850. Let's see how many weather stations there were in 1850. In the Berkeley Earth data set, only 208 (in NOAA even fewer: 7 stations). So, 208 weather stations mean the average distance between stations is around 840 km. I can believe that in 1850 there were precise tools which could measure temperature with 0.1 degree precision; there were already very complex steam locomotives and other machines. However, it is difficult to believe that, having one weather station every 840 km, we can calculate an average with 0.143 degrees precision.
Even with the maximum number of active weather stations, which was about 20000, the average distance between stations is about 80 km. It is far from the ideal coverage of 10 km between the two closest stations.
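The average-distance figures above come from a back-of-the-envelope estimate: divide the Earth's land area (roughly 149 million km², an assumed round number) by the number of stations and take the square root of the area per station, as if each station covered an equal square:

```python
import math

LAND_AREA_KM2 = 149_000_000  # approximate land surface of the Earth

def avg_station_distance_km(n_stations):
    """Rough distance between neighbouring stations, assuming each
    station covers an equal square of the land area."""
    return math.sqrt(LAND_AREA_KM2 / n_stations)

print(round(avg_station_distance_km(208)))    # ~846 km (Berkeley Earth stations in 1850)
print(round(avg_station_distance_km(20000)))  # ~86 km (maximum station count)
```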
I made an interactive dashboard with a map where anyone can play with the number of weather stations and the temperature trends for a specific weather station or country/region. It is available at this link on the Tableau website or below (embedded dashboard from the Tableau website):
In addition, I made a dashboard with the number of active weather stations per year; it is available at this link or below:

Conclusion and Open Questions

I am not saying that Global Warming does not exist. Maybe this simple average is not good enough. I did not take sea temperature data into account; it was more difficult to find.
What I am saying is that there are reasonable doubts and a lot of questions about Global Warming. In other words, there is not enough evidence to say that Global Warming exists with 95% probability. In some countries, like The Netherlands, people are paying CO2 taxes. Is there enough evidence to force all these people to pay a CO2 tax?

These questions are still open:
  • What is the precision of the original raw data? It is not mentioned in the data source or related works. In other words, what is the precision of the tools at each weather station?
  • Coverage is too low. Even during the last 50 years, when there are 15000-20000 weather stations, the coverage is only about 1.4% (under the assumption that good coverage means one weather station every 10 km).

Tuesday, January 11, 2022

Motion Detection and Object Detection with Machine Learning from a Web Camera

Motion Detector with ML

I could not find a good program that can do motion detection, so I made one myself. This program has several functions:

  • Detects motion and takes a photo when motion is detected and the motion level is above the specified sensitivity.
  • Detects and marks objects in the photo if the object detection probability is above the specified threshold.
  • Takes interval photos for time-lapse making.
  • Turns on the flash if the program is running on an Android phone.
  • Adds photos to a zip file.
  • Works fine on PC, Mac, and Android phones. Limited functionality works on an iPad with the Safari browser, and it does not really work on an iPhone.
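The motion-detection part boils down to frame differencing: compare consecutive frames pixel by pixel and trigger when the fraction of changed pixels exceeds the sensitivity setting. A language-neutral sketch in Python (the actual program runs in the browser, so this only illustrates the idea, not its real code):

```python
def motion_level(prev_frame, frame, threshold=30):
    """Fraction of pixels whose grayscale value changed by more than
    `threshold` between two frames (frames are flat lists of 0-255 values)."""
    changed = sum(1 for a, b in zip(prev_frame, frame) if abs(a - b) > threshold)
    return changed / len(frame)

def should_capture(prev_frame, frame, sensitivity=0.05):
    """Take a photo when the motion level exceeds the sensitivity setting."""
    return motion_level(prev_frame, frame) > sensitivity

# A 4-pixel toy frame: two pixels change a lot, so the motion level is 0.5.
prev = [10, 10, 10, 10]
curr = [10, 10, 200, 200]
print(motion_level(prev, curr))   # 0.5
print(should_capture(prev, curr)) # True
```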

Note 1: In Google Chrome it works only if automatic download is enabled. If Chrome is configured to ask every time for the file path where to download files, it will not work.

Note 2: The program will ask for permission to access the camera immediately after starting. This is required to build the list of all available cameras.

If the program does not work well in this blog please try it here.

If Google Drive is mapped as an independent drive and the browser is set up to automatically download photos to the Google Drive folder, then this can be used as home video surveillance.