Archive for the 'visualization' Category

Information about solar irradiance measurements sought

Thursday, January 12th, 2017


Planetary science at The Shard: How soon is the sun’s radiation going to be destructive? (apart from the effect on the CCD chip)

This blog post is based on a thread in the Azimuth forum.

The current theories about the sun's lifetime indicate that the sun will turn into a red giant in about 5 billion years. How and when this process is going to be destructive to earth is still debated. Apparently, according to more or less current theories, there has been a quasi-linear rise in luminosity; quoting from p. 3 of "Distant future of the Sun and Earth revisited" by K.-P. Schröder and Robert Connon Smith, 2008:

The present Sun is increasing its average luminosity at a rate
of 1% in every 110 million years, or 10% over the next billion years.

Unfortunately I feel a bit doubtful about this, in particular after looking at some irradiance measurements.
But let’s recap a bit.



In the Azimuth forum I asked for information about solar irradiance measurements. Why I was originally interested in how brightly the sun is shining is a longer story, which includes discussions about the global warming potential of methane. For this post I prefer to omit this lengthy historical survey of my original motivations (maybe I will come back to this later) – meanwhile (see above) there is also a newer reason why I am interested in solar irradiance measurements, which is what I want to talk about here.

Strictly speaking I was not only interested in knowing more about how brightly the sun is shining overall, but how brightly each of its "components" is shining, i.e. I wanted to see spectrally resolved solar irradiance measurements, in particular measurements from a range between roughly the frequencies* 650nm and 950nm.

So I found the Sorce mission, a NASA-sponsored satellite mission whose website is located at the University of Colorado. The website very nicely provides an interactive part, with a fairly clear and intuitive app called LISIRD, with which the spectral measurements of the sun can be studied.

As a side remark I should mention that this mission belongs to NASA's Earth Science missions, which are currently threatened to be scrapped.

By using this app I found, in the 650nm to 950nm range, a very strange rise in radiation between 2003 and 2016, which happened mainly in the last 2-3 years. Here you can see this rise:


Spectral line 774.5nm from day 132 to day 5073; day 132 is Jan 24, 2003, day 5073 is the end of 2016.

Now, fluctuations within certain ranges of the solar spectrum are nothing new; here, however, it looked as if a fairly stable range had suddenly started to change rather "dramatically".

I put the word “dramatically” in quotes for a couple of reasons.

Spectral measurements are complicated and prone to measurement errors. The subtle issue of dirty lenses alone suggests that this is no easy feat, so this strange rise might easily be due to a measurement failure. Moreover, as said, this range looked fairly stable over the course of ten years, but maybe the new rise in irradiation is part of the 11-year solar cycle, i.e. a common phenomenon. In addition, although the rise looks big, overall it may still be rather subtle.

But then: how subtle or non-subtle is it?

In order to assess that question I made a quick estimate (see the forum discussion) and found that if all the additional radiation arrived at the ground (which of course it doesn't, due to absorption), then on 1000 square meters you could easily power a lawn mower with that subtle change! That is, my estimate was 1200 W for that lawn patch. WOA!

That was disconcerting enough to make me download the data, linearly interpolate it and calculate the power of that change. I wrote a small calculation program in JavaScript for that. The computation yielded 1000 W, i.e. my estimate was fairly close. WOA again!
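My actual script is in the forum discussion; the following is only a minimal sketch of the kind of calculation involved, with hypothetical function names and made-up sample values instead of the actual LISIRD data. It integrates the difference between two spectral irradiance curves over a wavelength band with the trapezoidal rule and scales the result to a patch of ground:

```javascript
// Minimal sketch (hypothetical names, made-up values, not the actual data):
// integrate the difference of two spectral irradiance curves (in W/m^2/nm,
// sampled at the same wavelengths) over a band using the trapezoidal rule,
// then scale the result to an area in square meters.
function bandPower(wavelengths, irrA, irrB, areaSqm) {
  let integral = 0; // accumulates W/m^2 over the band
  for (let i = 1; i < wavelengths.length; i++) {
    const dLambda = wavelengths[i] - wavelengths[i - 1]; // nm
    const diffLeft = irrB[i - 1] - irrA[i - 1]; // W/m^2/nm
    const diffRight = irrB[i] - irrA[i];
    integral += 0.5 * (diffLeft + diffRight) * dLambda; // trapezoid rule
  }
  return integral * areaSqm; // watts over the given area
}

// Made-up example: a uniform rise of 1/300 W/m^2/nm across 650-950nm
// integrates to about 1 W/m^2, i.e. about 1000 W on 1000 square meters.
const lambdas = [650, 750, 850, 950]; // nm, hypothetical coarse sampling
const dayA = [1.20, 1.10, 0.95, 0.80]; // W/m^2/nm, made-up values
const dayB = dayA.map(v => v + 1 / 300);
console.log(bandPower(lambdas, dayA, dayB, 1000)); // ~1000 W
```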

How does this translate into overall changes in solar irradiance? Some increase had already been noticed. In 2003, NASA wrote on its webpage:

Although the inferred increase of solar irradiance in 24 years, about 0.1 percent, is not enough to cause notable climate change, the trend would be important if maintained for a century or more.

That was 13 years ago.

I now used my program to calculate the irradiance for one day in 2016 between the wavelengths 180.5nm and 1797.62nm, i.e. over quite a big part of the solar spectrum, and got the value \(627 W/m^2\). I then computed the difference to one day in 2003 and got \(0.61 W/m^2\), which is 0.1% in 13 years rather than in 24 years. Of course this is no average, and fluctuations play a big role in some parts of the spectrum, but this may indicate that the overall rate (!) of the rise in solar radiation may have doubled. And concerning the question of the sun's luminosity: to assess luminosity one would need to take the concrete satellite-earth orbit on the day of measurement into account, as the distance to the sun varies, or at least to average over it – but still, at first glance this appears disconcerting.
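For the record, the relation behind this remark is the usual inverse-square law \(L = 4\pi d^2 E\), where \(E\) is the measured irradiance and \(d\) the distance to the sun at the time of measurement. Since the earth-sun distance varies by about ±1.7% over the year, the measured irradiance varies by roughly ±3.3% from this effect alone, i.e. by much more than the changes discussed here.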

Moreover, for the specific range I mentioned above I calculated the value \(192 W/m^2\) for a day in 2016 (day 5073), so for this wavelength range the increase over 13 years was about 0.5%, with most of it in the last 2-3 years.
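As a sanity check, both percentages follow from the numbers above: \(0.61/627 \approx 0.001\), i.e. about 0.1% for the broad range, and since 1000 W spread over 1000 square meters is \(1 W/m^2\), the 650nm to 950nm band gives \(1/192 \approx 0.005\), i.e. about 0.5%.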

Given that this spectral range overlaps e.g. with the absorption bands of water (clouds!), this should at least be discussed.

And indeed one can even see the rise in this range within the solar spectrum without zooming in. See how the spectrum splits into a purple and dark red line in the lower circle?


Difference in spectrum between day 132 and 5073

The upper circle displays another rise, which is discussed in the forum.

Concluding: all this looks as if it needs to be monitored a bit more closely. After all, the theories about the lifetime of the sun are only theories.
In particular it would be important to see whether these rises in irradiance also show up in other measurements, so I asked in the Azimuth Forum, but so far got no answer.

The Russian Wikipedia page about solar irradiance unfortunately contains no links to Russian satellite missions (if I haven't overlooked something), and there is no Chinese or Indian Wikipedia page about solar irradiance. I also couldn't find publicly accessible spectral irradiance measurements on the ESA website (although they have some satellites out there), and in December I wrote an email to Wolfgang Finsterle, the head of the solar radiometry section of the World Radiation Center (WRC), with no answer yet.

In short: if you know about publicly available solar spectral irradiance measurements other than the LISIRD ones, then please let me know.

update Jan 15, 2017: This post also appeared, with minor modifications, as a guest post on John Baez' blog Azimuth; in particular the English was polished by John.

correction Feb 3, 2017: * "frequencies" should read "inverse spatial frequencies" or simply "wavelengths".

Gesture Steered Turing Machines

Friday, July 1st, 2016

A new astlab project, which comes closer to realizing something I have been carrying around in my head for almost ten years now.


Alert, Nunavut etc.

Saturday, November 28th, 2015

With some help from Tim, and on the occasion of the 2015 United Nations Climate Change Conference, I did a visualization which combines local temperatures with methane data. The local temperatures are from the HADCRUT4 file, so they unfortunately stop in 2011. The methane data is from the website of the Earth System Research Lab. Unfortunately there are not as many methane measurements as there should be. In particular, very few temperature stations have also made methane measurements, so I improvised a bit and joined some measurement points which are geographically close. The measurements are from Vestmannaeyjar, Iceland; Alert, Nunavut, Canada; Svalbard, Norway (with temperatures from Lufthavn and CH4 from Ny-Ålesund); Syowa, Antarctica; and the Azores, Portugal, where the temperatures are from Santa Maria Island and the methane data is from Terceira Island (if I interpreted the station names correctly).

I currently don't have much Internet time left, partly because I have a job where I have to sit in front of a computer a lot, and partly because I have been trying to improve things in my local surroundings (partially in vain, as it seems), so no long explanations. I hope you at least see what I see in the images above.

temperature curve: mean of the anomalies (the monthly deviations of the values from the monthly mean over the measured time period), taken as an annual mean
methane curve: annual mean of the values
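The computation of the temperature curve can be sketched roughly as follows; this is only an illustration, not the actual script, and the {year, month, value} record layout is a hypothetical assumption about the input format:

```javascript
// Sketch (hypothetical record layout, not the actual script): compute
// monthly anomalies, i.e. each value minus the mean of its calendar month
// over the whole measured period, then take annual means of the anomalies.
function annualAnomalyMeans(records) {
  // Climatology: mean per calendar month (1..12) over all years.
  const sums = new Array(13).fill(0);
  const counts = new Array(13).fill(0);
  for (const r of records) { sums[r.month] += r.value; counts[r.month] += 1; }
  const monthlyMean = sums.map((s, m) => (counts[m] ? s / counts[m] : NaN));

  // Group anomalies by year, then average each year's anomalies.
  const byYear = new Map();
  for (const r of records) {
    const anomaly = r.value - monthlyMean[r.month];
    if (!byYear.has(r.year)) byYear.set(r.year, []);
    byYear.get(r.year).push(anomaly);
  }
  const result = [];
  for (const [year, anomalies] of byYear) {
    const mean = anomalies.reduce((a, b) => a + b, 0) / anomalies.length;
    result.push({ year, mean });
  }
  return result.sort((a, b) => a.year - b.year);
}
```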

global warming didn’t stop

Saturday, July 11th, 2015

Image from NOAA (public domain if I understood correctly)

Those who follow the randform posts closely know that Tim and I had worked on a visualization of a main collection of global temperature stations. It was used in a post on Azimuth – a blog which is mostly concerned with environmental topics and which is run by the mathematical physicist John Baez. In the post I reviewed the temperature data which was used by the IPCC for its climate Assessment Reports published so far, up to AR4 in 2007. I left the conclusions about the investigated temperature records and their quality to the reader, but in the comment section I became a bit more "direct" and wrote:

Well every reader may judge him/herself by looking at the visualizations. If you want my opinion: I think this is rather catastrophic. In particular I wouldn’t wonder if the “global warming hiatus” is connected to the gaps.

The "global warming hiatus" or "global warming pause" is a finding that the global temperature rise has approximately paused since 1998; hence, by making this comment, I questioned this "warming pause", or at least its shape. Unfortunately my suspicion has now been more or less confirmed. That is, global warming continues.

The article "Possible artifacts of data biases in the recent global surface warming hiatus" by Karl et al., Science (2015), unfortunately has to be rented at the price of $20/day for reading (so I haven't looked at it), but NOAA has a summary, where it is written:

“Adding in the last two years of global surface temperature data and other improvements in the quality of the observed record provide evidence that contradict the notion of a hiatus in recent global warming trends,” said Thomas R. Karl, L.H.D., Director, NOAA’s National Centers for Environmental Information. “Our new analysis suggests that the apparent hiatus may have been largely the result of limitations in past datasets, and that the rate of warming over the first 15 years of this century has, in fact, been as fast or faster than that seen over the last half of the 20th century.”

About the newly included datasets it is written:

New analyses with these data demonstrate that incomplete spatial coverage also led to underestimates of the true global temperature change previously reported in the 2013 IPCC report. The integration of dozens of data sets has improved spatial coverage over many areas, including the Arctic, where temperatures have been rapidly increasing in recent decades. For example, the release of the International Surface Temperature Initiative databank, integrated with NOAA’s Global Historical Climatology Network-Daily dataset and forty additional historical data sources, has more than doubled the number of weather stations available for analysis.

I mentioned the International Surface Temperature Initiative (ISTI) in the Azimuth blogpost together with a citation from their blog:

The ISTI dataset is not quality controlled, so, after re-reading section 3.3 of Lawrimore et al 2011, I implemented an extremely simple quality control scheme, MADQC.

which doesn't sound too great when it comes to quality assessment.

But still: I suspect that the new temperature curves of that article match the real temperatures much better than the ones which were used for the IPCC reports until 2013.
It is unfortunate, though, that these new temperatures are not available, because I still suspect that the role of methane in that warming trend is greatly underestimated, and I still think it IS ULTIMATELY URGENT to investigate that suspicion. The exact shape of the curve would be rather important because, amongst other things, there was also a "hiatus" in the rise of methane, and I think you can see that short pause in the above image.

Methane may, however, eventually also play a role in a way more dramatic environmental context. From my point of view that context should also be investigated URGENTLY, but it seems methane is viewed controversially among climate scientists; at least Gavin Schmidt of NASA's Goddard Institute for Space Studies expressed rather vague anti-alarmist words in an interview with John H. Richardson from Esquire (Esquire link via John Baez), saying that:

“The methane thing is actually something I work on a lot, and most of the headlines are crap. There’s no actual evidence that anything dramatically different is going on in the Arctic, other than the fact that it’s melting pretty much everywhere.”

visibility deterioration of deterioration

Thursday, September 18th, 2014

Unfortunately our temperature visualization from the last post is currently not running anymore. The probable reason: it seems that WebGL Earth has moved two library files. In particular, the WebGL Earth API base script, which we thought was self-contained, unfortunately doesn't seem to be self-contained. We are going to look into this trouble in the near future.

supplement 05.10.2014: The interactive visualization is currently working again. Klokan Technologies responded and promised to look into this problem.

On the deterioration of data

Thursday, August 21st, 2014

Tim and I are currently working on an interactive browser visualization using temperature data from HADCRUT, namely the CRUTEM 4 temperature station data, which we map, with the help of the open source WebGL Earth API (which seems to be to quite some extent the work of the Czech-Swiss company Klokan Technologies), onto a model of the earth (covered with OpenStreetMap maps).
The visualization is still work in progress, but what is already visible is that the temperature data deteriorates quite a bit (please see also the previous randform post on the topic of deterioration of data). It looks as if the deterioration was bigger in the years 2000-2009 than in the years 1980-2000. Below you can see screenshots of various regions of the world for the month of January in the years 1980, 2000 and 2009. The color of a rectangle indicates the (monthly) temperature value for the respective station (the station is represented by a rectangle around its coordinates), encoded with the usual hue encoding (blue is cold, red is hot). Black rectangles are invalid data. The CRUTEM 4 data file contains the data of 4634 stations. Mapping all the station data makes the visualization very slow, especially for scaling; hence the slightly different scalings/views for each region, and the fact that screenshots are on display. The interactive application will probably not show all stations at once.
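To illustrate the hue encoding just described, here is a minimal sketch; it is not our actual visualization code, and the clamping range and the sentinel value for invalid data are assumptions for illustration:

```javascript
// Sketch of the hue encoding (not the actual visualization code): map
// temperature linearly to a hue between blue (240, cold) and red (0, hot);
// invalid data renders black. The -99.9 missing-value flag and the
// -30..40C display range are assumptions.
const INVALID = -99.9;

function temperatureToColor(tempC, minC = -30, maxC = 40) {
  if (tempC === INVALID) return "hsl(0, 0%, 0%)"; // black rectangle: invalid
  const t = Math.min(Math.max((tempC - minC) / (maxC - minC), 0), 1);
  const hue = 240 * (1 - t); // 240 = blue (cold), 0 = red (hot)
  return `hsl(${hue}, 100%, 50%)`;
}

console.log(temperatureToColor(-20));     // bluish
console.log(temperatureToColor(30));      // reddish
console.log(temperatureToColor(INVALID)); // black
```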

North America: Jan 1980, Jan 2000, Jan 2009

Africa: Jan 1980, Jan 2000, Jan 2009

Asia: Jan 1980, Jan 2000, Jan 2009

Eurasia/Northern Africa: Jan 1980, Jan 2000, Jan 2009

North Pole: Jan 1980, Jan 2000, Jan 2009

Employment to population ratio

Wednesday, July 23rd, 2014

I am still collecting data on global employment in order to better investigate the replacement of human work by machines. Unfortunately it turned out that the International Labour Organisation (ILO), which holds most of the original data, restructured their IT sector. This means in particular that some older data can't be reproduced any more. Above you can see that worldwide employment has, on average, gone down since the nineties. I now keep the data locally on our account, as a copy from the ILO, in order to keep the findings reproducible. The data source as well as the source code for extracting it (GPL) are here. As always: if you spot some mistakes, please let me know.

Periodicity

Sunday, June 22nd, 2014

This concerns a discussion on Azimuth. I found that the temperature anomaly curve, which describes the global combined land [CRUTEM4] and marine [sea surface temperature (SST)] temperature anomalies (an anomaly is a deviation from a mean temperature) over time (HADCRUT4-GL), has a two-year periodicity (for more details click here). The dots in the above image are meant to show why I think so. The dark line drawn over the jagged anomaly curve is the mean curve. The grey strips are one year in width. A dot highlights a peak (or at least an upward bump) in the mean curve. More precisely there are:

18 red dots, which mark peaks within a grey 2-year interval
5 yellow dots, which mark peaks outside a grey 2-year interval
(two yellow peaks are rather close together)
1 uncolored dot, which marks no real peak, just a bump
4 blue dots, which mark small peaks within ditches

One sees that the red and yellow dots cover more or less all peaks in the curve (the blue dots take care of the minor peaks, and there is just one bump which is not a full peak). The fact that the majority of the red and yellow dots are red means that there is a peak every 2 years, with a certain imprecision indicated by the width of the interval.
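For readers who want to check such a count on the raw data rather than by eye, here is a minimal sketch of one possible approach; it is not the procedure behind the image above, and the 12-month smoothing window is an assumption:

```javascript
// Sketch (not the procedure behind the image): smooth a monthly series
// with a 12-month moving average, find its local maxima, and report the
// spacings between consecutive peaks in months. A dominant spacing near
// 24 months would hint at a two-year periodicity.
function movingAverage(values, window = 12) {
  return values.map((_, i) => {
    const lo = Math.max(0, i - Math.floor(window / 2));
    const hi = Math.min(values.length, lo + window);
    const slice = values.slice(lo, hi);
    return slice.reduce((a, b) => a + b, 0) / slice.length;
  });
}

function peakSpacings(monthlyValues) {
  const smooth = movingAverage(monthlyValues);
  const peaks = [];
  for (let i = 1; i < smooth.length - 1; i++) {
    if (smooth[i] > smooth[i - 1] && smooth[i] >= smooth[i + 1]) peaks.push(i);
  }
  return peaks.slice(1).map((p, k) => p - peaks[k]); // months between peaks
}
```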

Upon writing this post I saw that I forgot one red dot. Can you spot where?

Especially after doing this visualization, this periodicity meanwhile appears so visible to me that I think it should be a widely known phenomenon; however, at Azimuth nobody has heard about it yet. If it's not a bug, then I could imagine that it could at least partially be due to differences in solar irradiance between the northern and southern hemispheres, but this is so far just a wild guess and would need further investigation, which would cost me a lot of (unpaid) time and brain. So if you know what this phenomenon is called, then please drop a line. If it's not a bug, then this phenomenon appears to me as an important fact which may, amongst other things, enter the explanation for El Niño.

Aimbottleneck

Tuesday, April 8th, 2014


Title: "Kreative Mode beim Bedrockabgrundstein" (roughly: "Creative mode at the bedrock abyss stone"), oil on canvas, artist: Mudda Prahler

There was recently a post on Gamasutra with the title "Titanfall: Why Respawn is punishing cheaters". The computer game Titanfall is a first-person shooter that can be played by a couple of people in one environment. Wikipedia describes it as follows:

Players fight either on foot as free-running “Pilots” or inside agile mech-style walkers called “Titans” to complete team-based objectives[2][3] on a derelict and war-torn planet[4] as either the Interstellar Manufacturing Corporation (IMC) or the Militia.[5]

I don't know Titanfall (in general I have played first-person shooters rather rarely), but what apparently happened was that there were too many people cheating in the game.

The post doesn't really describe what exactly is meant by cheating, but from what I infer from the "punishment" announcement, I think what was happening was that some people used game bots, in particular so-called aimbots, which are software solutions that make shooting easier in such a game. From the Titanfall announcement:

You can play with other banned players in something that will resemble the Wimbledon of aimbot contests. Hopefully the aimbot cheat you paid for really is the best, or these all-cheater matches could be frustrating for you. Good luck.

I was asking myself, though, whether this action is part of some viral marketing campaign. That is, some cheaters could think that it would be way cooler to "win the Wimbledon of aimbot contests" than the usual game. Given, however, that Titanfall had performance problems which, as it seems, were due to overloaded game servers and connections, it doesn't look as if this would improve with aimbot contests.

In this context:

From a citation of a report by a tech and investment advisory firm in the Time article "The Surprisingly Large Energy Footprint of the Digital Economy":

In his report, Mills estimates that the ICT system now uses 1,500 terawatt-hours of power per year. That’s about 10% of the world’s total electricity generation

The New York Times article "Power, Pollution and the Internet" remarks the following about, e.g., US data centers:

Nationwide, data centers used about 76 billion kilowatt-hours in 2010, or roughly 2 percent of all electricity used in the country that year, based on an analysis by Jonathan G. Koomey, a research fellow at Stanford University who has been studying data center energy use for more than a decade. DatacenterDynamics, a London-based firm, derived similar figures.

A summary of the last IPCC report about climate change and global warming.

and:

In Berlin, the International Games Week Berlin is currently taking place.

Generation Z: Renoise

Sunday, February 16th, 2014


For Berliners, and those who can afford to go to Berlin for a quick trip, I would like to mention an absolute must-see exhibition, namely Generation Z: Renoise, about the Russian musical avantgarde in the 1920s and later, curated by L. Pchelkina, A. Smirnov, P. Aidu, K. Dudakov-Kashuro and E. Vorobyeva. The exhibition is unfortunately not promoted as highly as it should be, given how fabulous it is! I hope that this post makes some more people visit it. It is definitely worth it! The exhibition is in the Künstlerhaus Bethanien, Kunstraum (unfortunately not so easy to find), and it runs until Feb. 23, 2014. Entrance is free, and it is open from 12:00 to 19:00.

The exhibition covers the themes Projectionism and Radio-ear, Revasavr, GIMN Institute, Theremin, Graphical Sound, Industrial Noise Machines, Amateur Noise Instruments and Destruction of Utopia. Below is a small excerpt on each of these themes. A lot of details can also be found in Andrei Smirnov's book "Sound in Z".
