
2012 Hottest Spring Ever in US


PhillipS

Recommended Posts

If you have links to Arctic data from HadCRUT4 or NCDC, I would be happy to re-run the numbers using those datasets. My understanding, however, is that those datasets do not provide data at that level of detail. All of them reasonably represent global temperatures, so one can rely on the data presented for global temperature anomalies, including GISS's Arctic region temperatures.

Moreover, if one examines climate data from recent decades, one finds:

1. A divergence of global temperatures from solar irradiance. Nevertheless, some have speculated that amplification of solar forcing might explain most of the observed recent warming. However, the extreme solar minimum in the 2005-10 timeframe offered a lab test of sorts of the solar-plus-amplification hypothesis: despite the deepest and longest solar minimum since at least the early 20th century, the earth maintained a large energy imbalance.

2. Some have insisted more narrowly that the ongoing Arctic warming is the result of the AMO. Yet Arctic temperatures have diverged from the AMO, regardless of whether one uses annual data or 5-year moving averages (a comparison of the kind sketched in the code below).

[Image: GissArctic.jpg]
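For readers who want to try this kind of comparison themselves, here is a minimal sketch of the 5-year centered moving average mentioned above. The two series are illustrative placeholders, not real GISS Arctic or AMO data, and this is not necessarily the original poster's workflow:

```python
# 5-year centered moving average, for comparing annual Arctic temperature
# anomalies against the AMO index. All data below are placeholders.

def moving_average(values, window=5):
    """Centered moving average; None where the window is incomplete."""
    half = window // 2
    return [
        sum(values[i - half:i + half + 1]) / window
        if half <= i < len(values) - half else None
        for i in range(len(values))
    ]

arctic_anom = [0.5, 0.8, 1.1, 0.9, 1.4, 1.6, 1.3, 1.8, 2.0, 1.7]          # placeholder deg C
amo_index = [0.10, 0.15, 0.12, 0.20, 0.18, 0.22, 0.19, 0.21, 0.20, 0.23]  # placeholder index

print(moving_average(arctic_anom))
print(moving_average(amo_index))
```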

3. Measured warming is also confirmed by expected physical responses. Declining Arctic ice, later first freezes, earlier last freezes, lengthening growing seasons, etc. are all consistent with warming.

If one lacked any other information, one would be inclined to believe that a mystery or unknown forcing must explain the relatively recent divergence of global temperatures from the natural forcings. Of course, that would raise the question of why such a mystery forcing had been dormant, so to speak, until recent decades.

However, things are not so bleak for climate science. Other information does exist. In fact, a very substantial body of information has been developed through careful and rigorous scientific research. That body of work, which has been validated by peer review, includes reasonable approximations for a host of forcings, natural and manmade.

[Image: RadiativeForcingsChartIPCC2007.png]

The evidence strongly suggests that the human contribution in the form of greenhouse gas emissions, particularly carbon dioxide, explains most of the recent observed warming. The decoupling of global temperatures from natural forcings highlights the growing role atmospheric concentrations of greenhouse gases are playing. The relative non-response to the longest and deepest solar minimum since at least the early 20th century dispelled the assumption (or hope, among some contrarians) that even though solar forcing is relatively modest, amplification leads to a much greater combined direct and indirect influence.

In their 2011 paper concerning the persistence of the earth's energy imbalance, Hansen et al. concluded:

The strong positive energy imbalance during the solar minimum, and the consistency of the planet's energy imbalance with expectations based on estimated human-made climate forcing, together constitute a smoking gun, a fundamental verification that human-made climate forcing is the dominant forcing driving global climate change.

Those who disagree with the scientific consensus must, if they wish to overturn it, furnish credible evidence of a new natural forcing or series of such forcings that allows them to accurately represent the climate--including the recent warming--using solely natural forcings, and/or demonstrate that CO2's heat-trapping properties are much weaker than currently understood. Attempts to dismiss rigorous and credible datasets that measure the earth's temperature (e.g., Watts's efforts to discredit the instrument record) are not a substitute for meeting that requirement. Instead, they represent an attempt to evade what is likely, and perhaps knowingly, an insurmountable challenge.

In the meantime, natural and anthropogenic forcings will continue to drive the climate, with the latter exerting a relatively greater influence for the foreseeable future. In the near term, the forecast of imminent global cooling being peddled by some bloggers could well be shattered next year as the global temperature approaches or exceeds its 2010 peak (GISS and/or NCDC), courtesy of an El Niño that is emerging in the context of a still-warming global climate.

Great post, Don!


http://www.ncdc.noaa.gov/sotc/national/2012/6/supplemental

In some locations, 2012 temperatures have been so dramatically different that they establish a new "neighborhood" apart from the historical year-to-date temperatures. For a visual presentation, click a station's rank for a "Haywood plot" that shows its cumulative year-to-date temperature for each year it has been reporting. As you read from left to right, a particular year's trace represents the average of all previous days during that year. Because of the relatively few days that go into the year-to-date average during January and February, the plots are often very noisy during these months.
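The "average of all previous days" construction behind each Haywood trace is easy to reproduce. A minimal sketch, not NCDC's actual code, using placeholder daily values:

```python
# Cumulative year-to-date mean, as plotted in a Haywood trace:
# point i is the average of days 1..i of the year, which is why the
# curve is noisy in January-February and settles down later in the year.

def ytd_trace(daily_means):
    trace, running = [], 0.0
    for day, temp in enumerate(daily_means, start=1):
        running += temp
        trace.append(running / day)
    return trace

# Placeholder early-January daily means (deg F), not real station data.
print(ytd_trace([30.0, 34.0, 28.0, 40.0, 45.0]))
```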



Thank you, Bluewave, for sharing that link with us. That is a great site and is chock-full of info. I confess I had never heard of a Haywood plot, but I've bookmarked the site and look forward to studying it further.


NYC Spring decade averages (3/1-5/31):

Decade       Avg Temp (°F)   Precip (in)   Max Temp (°F)
1870-79      48.4            10.50         --
1880-89      48.4             9.31         96
1890-99      49.2             9.86         96
1900-09      50.1            11.27         90
1910-19      50.3            11.20         95
1920-29      50.5            10.66         93
1930-39      51.1            10.82         96
1940-49      51.8            11.48         96
1950-59      51.6            10.93         94
1960-69      51.7             9.78         99
1970-79      52.5            13.10         96
1980-89      53.2            14.90         97
1990-99      52.8            12.45         96
2000-09      52.8            12.23         96
2010-19      55.9            14.41         92 (2010-2012)

1870-2009    51.0            11.32         95
1980-2009    52.9            13.19         96

Since 1970, spring in NYC has averaged almost two degrees above the long-term spring average. The last three years are way above the average, but it's a small sample. The 1980s averaged 53.2; the 2010s are averaging 55.9 after three years. The city will need a few cool springs in the next seven years to bring the average closer to normal.

Precipitation has also gone up substantially since 1971. It's gotten warmer and wetter since the 1960s.
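Decade averages like those in the table above are simple to compute from annual spring means. A quick sketch; the year-to-value mapping below is made up for illustration, not the actual NYC record:

```python
# Group annual spring (Mar-May) mean temperatures into decades and average.
from collections import defaultdict

annual_spring_mean = {1980: 52.7, 1981: 53.5, 1982: 53.0}  # year -> deg F (placeholders)

by_decade = defaultdict(list)
for year, temp in annual_spring_mean.items():
    by_decade[year // 10 * 10].append(temp)

for decade, temps in sorted(by_decade.items()):
    print(f"{decade}-{str(decade + 9)[2:]}: {sum(temps) / len(temps):.1f}")
```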


Thank you, Bluewave, for sharing that link with us. That is a great site and is chock-full of info. I confess I had never heard of a Haywood plot, but I've bookmarked the site and look forward to studying it further.

Sure. There is so much great information packed into the NCDC site, but it's easy to miss some of the specific links.



Wow... look at the trace from Muskegon, MI

http://www1.ncdc.noaa.gov/pub/data/cmb/images/us/2012/jun/haywood/firsthalf.USW00014840.png



St. Louis looks the same. We have been blowing out our record warm years, 1910 and 1921, since March.

We finally get a break from the heat and this is the result so far through July 7th.

St. Louis, Missouri

Period of record: 1874-present

Top Ten Avg. Temperatures (year to date through July 7)

1) 60.0 2012

2) 57.8 1921

3) 57.0 1991

4) 56.3 1990/1880

6) 55.8 2006

7) 55.6 1946/1925/1911

10) 55.5 1878

This is the part that always gets me nowadays.

The forecast after the heat wave was for upper 80s for highs and upper 60s for lows by Monday.

Actual (high/low/mean = departure from normal):

7/8: 98/75/87 = +7F

7/9: 92/74/83 = +3F

7/10: 95/71/83 = +3F
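A note on the notation: each line is high/low/mean, with the departure of the daily mean from normal. A tiny sketch reproducing it, assuming an illustrative normal daily mean of 80°F (not the official STL normal):

```python
def summarize(high, low, normal_mean):
    mean = (high + low + 1) // 2  # daily mean, rounding halves up as in the posts
    return f"{high}/{low}/{mean} = {mean - normal_mean:+d}F"

print(summarize(98, 75, 80))  # -> 98/75/87 = +7F
print(summarize(92, 74, 80))  # -> 92/74/83 = +3F
print(summarize(95, 71, 80))  # -> 95/71/83 = +3F
```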

We hit 95F today with 18C 850 mb temps. Very impressive. The NAM has us at 95-96F tomorrow.

After that, the same low-90s to mid-90s regime continues, and then the heat ridge builds back in place.

Through the 9th, STL was averaging 90.5F for the month, which is 10.5F above normal; if July by some miracle finished that warm, it would be the record here by a few degrees.

But these "cooler" days are still running well above normal.

We have been above normal so much over the last year that it's no joke we are at record heat here over the trailing 18, 12, 6, and 3 months. It's been consistent, with ups and downs.

June was 2.5F above normal, and that counts as a cool month for the last year; many months have been 4-7F above normal.


I took a closer look at the January-June 2012 CONUS temperatures and both the record and margin by which the old record was smashed are quite remarkable.

The 6-month mean temperature of 52.88°F was 2.991 sigma above the 1981-2010 average and 3.744 sigma above the 20th century average. Moreover, the margin by which the January-June 2012 period surpassed the prior record of 51.68°F (2006) was 1.059 standard deviations (1981-2010 base period) and 1.001 standard deviations (20th century base period).
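For anyone who wants to replicate that arithmetic, here is a minimal sketch. The base-period series below is a placeholder rather than the real NCDC January-June record; only the 52.88°F value and the 51.68°F prior record come from the post:

```python
# Standard-deviation (sigma) departure of the Jan-Jun 2012 CONUS mean.
from statistics import mean, stdev

base_period = [50.1, 49.8, 51.0, 50.4, 49.5, 50.9, 51.68]  # placeholder Jan-Jun means, deg F

mu, sigma = mean(base_period), stdev(base_period)
z_2012 = (52.88 - mu) / sigma     # sigmas above the base-period mean
margin = (52.88 - 51.68) / sigma  # margin over the 2006 record, in sigmas
print(f"z = {z_2012:.3f} sigma, record margin = {margin:.3f} sigma")
```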


St. Louis looks the same. We have been blowing out our record warm years, 1910 and 1921, since March.


Your ground temps are probably running well above normal; the incoming air mass isn't going to bring you average temps until you have average soil temps.


Nobody gets into a plane and flies up to the Arctic to study historical temps with a neutral POV; sorry, it just doesn't happen, unfortunately.

Your unsupported opinion is noted. So far all you and Ben have offered are conspiracy theories, ad hominem attacks on the scientists and research organizations involved in arctic research, and innuendo. Not one datum of actual evidence.

If the two of you feel so strongly that all of the Arctic data is tainted, then go to the Arctic and do your own research. There's nothing stopping you. I'm sure Heartland and other 'skeptical' groups would be delighted to fund your expedition for the truth. Heck, I'll even pledge $10 to send you there - provided you agree to stay north of the Arctic Circle for at least two years. I'll bet there are others on this forum who would match my pledge. Imagine how famous you'll be when you come back with the data to blow the lid off the Great AGW Hoax. This is your chance for a Nobel - don't hesitate and lose the opportunity.

Don't just blog from the safety and comfort of your keyboards - go north and make a name for yourselves.


The previous data which Don just displayed is why I say the 1981-2010 so-called norms are irrelevant to today's climate. We should add 2 or 3 degrees to those figures, and that would be the true normal for today.

The climate has continued to warm, but the changes are not anywhere close to 2°-3°. For the CONUS, the mean temperature during the 1981-2010 period was 53.58°F. The 2000-11 mean temperature was 53.97°F, a +0.39°F departure from the 1981-2010 base.
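As a quick arithmetic check (the 1981-2010 CONUS mean is taken here as 53.58°F, the value consistent with the +0.39°F departure above):

```python
# Departure of the 2000-11 CONUS mean from the 1981-2010 base period.
base_1981_2010 = 53.58   # deg F (inferred from the +0.39 departure)
mean_2000_2011 = 53.97   # deg F
print(f"departure: {mean_2000_2011 - base_1981_2010:+.2f} deg F")  # -> +0.39
```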


Your unsupported opinion is noted. So far all you and Ben have offered are conspiracy theories, ad hominem attacks on the scientists and research organizations involved in arctic research, and innuendo. Not one datum of actual evidence.

I never mentioned a conspiracy theory. I just said I don't trust the Arctic GISS manipulation.


St. Louis up to 2011...

[Image: station.gif]

That chart appears to be inaccurate. For example, it shows 2010 to be warmer than 2011 at St. Louis. In fact, as per the NWS, 2010 had a mean temperature of 58.0° and 2011 had a mean temperature of 58.7°.

For 1950-2011, the NCDC's graph for St. Louis's temperatures is below:

[Image: STL19502011.jpg]

If one is looking for the actual annual mean temperatures for St. Louis, one can find them here: http://www.crh.noaa....al_averages.xls

In terms of 2012, St. Louis is running much warmer than any prior year on record:

http://www1.ncdc.noaa.gov/pub/data/cmb/images/us/2012/jun/haywood/firsthalf.USW00013994.png


How could you manipulate Arctic sea ice to melt if the temperature wasn't actually getting warmer?

When you see what we have seen take place in the Arctic, you should expect a massive change in surface temperatures.

Maybe the folks who say GISS is a scam can show us through the buoy data what the real story is. I highly doubt that the many buoys that happen to support GISS and BEST would all have bad data.

Or they could just tell us what they feel temp anomalies should be between 64-90N, when summer snow and ice cover there has plummeted to record lows over the last 12 years.


How could you manipulate Arctic sea ice to melt if the temperature wasn't actually getting warmer?

How can we know that the ice has really melted? I certainly haven't gone up there and measured sea ice personally - I rely on ground and satellite data. Ben has already admitted that he feels the ground data has been nefariously manipulated and is untrustworthy. I wonder if he feels that the satellite images are photoshopped. For that matter, how do we even know that there are satellites up there? Maybe it's ALL a hoax!


How can we know that the ice has really melted? I certainly haven't gone up there and measured sea ice personally - I rely on ground and satellite data. Ben has already admitted that he feels the ground data has been nefariously manipulated and is untrustworthy. I wonder if he feels that the satellite images are photoshopped. For that matter, how do we even know that there are satellites up there? Maybe it's ALL a hoax!

That eliminates buoys and moorings, since humans install them.

But if this is the case, then Ben has to drop the underwater heat papers he posted. Those use those same manned buoys to get their underwater temp data in the Arctic.

The new science: "It's not a lie if you believe it."


When you see what we have seen take place in the Arctic, you should expect a massive change in surface temperatures.

Maybe the folks who say GISS is a scam can show us through the buoy data what the real story is. I highly doubt that the many buoys that happen to support GISS and BEST would all have bad data.

Or they could just tell us what they feel temp anomalies should be between 64-90N, when summer snow and ice cover there has plummeted to record lows over the last 12 years.

There has been a massive change in surface temps? There was also a massive change in surface temps 60 to 70 years ago. There have been massive changes to sea ice? There were massive changes to sea ice 60 to 70 years ago. The GISS manipulation is what they did to temps from the '30s and '40s, not what they are doing to temps now.


That eliminates buoys and moorings, since humans install them.

But if this is the case, then Ben has to drop the underwater heat papers he posted. Those use those same manned buoys to get their underwater temp data in the Arctic.

The new science: "It's not a lie if you believe it."

Buoys don't lie; that is equipment-measured data, not reconstructed and highly biased. Buoys may show temporary signals such as La Niña or El Niño, though.

Sent from my phone, please excuse my grammar!



Post-1979 satellite data is 100% fact; coincidentally, the decline in ice begins nearly the day the satellites were launched. How convenient.

Sent from my phone, please excuse my grammar!



If you think satellite data is 100% fact, then you know very little about remote sensing and sensor limitations. If you want exact data you have to take direct measurements - whether it's temperature, mass, salinity, thickness, or whatever. Satellites, by the fact that they are in orbit, can only take indirect measurements - and often only of a proxy, because the parameter of interest can't be remotely sensed. For example, no satellite ever launched has ever measured the daily min and max temperatures of a location on the Earth's surface - for that matter, they can't measure surface temperatures at all, because the intervening atmosphere is opaque at those wavelengths. UAH and RSS use data from AMSU sensors, which measure channels of microwave radiation emitted by oxygen molecules (the proxy), and process that data through their models to derive temperature values. Extrapolation is used to fill in areas not covered by the satellites' orbits. But as we have seen with the history of the UAH program, those models have been repeatedly modified and revised, resulting in the output values being heavily manipulated.

And since data manipulation seems to be an issue for you, here's an excerpt from the Wikipedia article on remote sensing:

Data processing levels

To facilitate the discussion of data processing in practice, several processing "levels" were first defined in 1986 by NASA as part of its Earth Observing System [6] and steadily adopted since then, both internally at NASA (e.g., [7]) and elsewhere (e.g., [8]); these definitions are:

Level 0 - Reconstructed, unprocessed instrument and payload data at full resolution, with any and all communications artifacts (e.g., synchronization frames, communications headers, duplicate data) removed.

Level 1a - Reconstructed, unprocessed instrument data at full resolution, time-referenced, and annotated with ancillary information, including radiometric and geometric calibration coefficients and georeferencing parameters (e.g., platform ephemeris) computed and appended but not applied to the Level 0 data (or, if applied, in a manner such that Level 0 is fully recoverable from Level 1a data).

Level 1b - Level 1a data that have been processed to sensor units (e.g., radar backscatter cross section, brightness temperature, etc.); not all instruments have Level 1b data; Level 0 data is not recoverable from Level 1b data.

Level 2 - Derived geophysical variables (e.g., ocean wave height, soil moisture, ice concentration) at the same resolution and location as the Level 1 source data.

Level 3 - Variables mapped on uniform spacetime grid scales, usually with some completeness and consistency (e.g., missing points interpolated, complete regions mosaicked together from multiple orbits, etc.).

Level 4 - Model output or results from analyses of lower-level data (i.e., variables that were not measured by the instruments but instead are derived from those measurements).

A Level 1 data record is the most fundamental (i.e., highest reversible level) data record that has significant scientific utility, and is the foundation upon which all subsequent data sets are produced. Level 2 is the first level that is directly usable for most scientific applications; its value is much greater than the lower levels. Level 2 data sets tend to be less voluminous than Level 1 data because they have been reduced temporally, spatially, or spectrally. Level 3 data sets are generally smaller than lower-level data sets and thus can be dealt with without incurring a great deal of data-handling overhead. These data tend to be generally more useful for many applications. The regular spatial and temporal organization of Level 3 datasets makes it feasible to readily combine data from different sources.
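To make the Level 2 → Level 3 distinction concrete, here is a toy sketch of the gridding step: scattered retrievals are binned onto a uniform lat/lon grid and averaged. This is a generic illustration, not any agency's production code, and the sample points are invented:

```python
# Toy Level 2 -> Level 3 step: average point retrievals into uniform grid cells.
from collections import defaultdict

# (lat, lon, value) Level 2 retrievals -- invented sample points.
level2 = [(71.2, -156.8, -2.1), (71.9, -156.1, -1.8), (60.4, 5.3, 3.2)]

def to_level3(retrievals, cell=5.0):
    """Average retrievals within cell x cell degree boxes (a uniform grid)."""
    bins = defaultdict(list)
    for lat, lon, val in retrievals:
        bins[(int(lat // cell), int(lon // cell))].append(val)
    return {key: sum(vals) / len(vals) for key, vals in bins.items()}

print(to_level3(level2))
```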

What we see from UAH, RSS, Cryosat and most other satellites are Level 4 data records - thoroughly processed, massaged, adjusted, interpreted, and manipulated.

Still feel they are 100% fact?



Thank You Phillip. +1000000000000000000000



The entire point flew over your head like a flock of geese, complete with V formation.

Most pre-satellite data is reconstructed from ship reports and other less-than-stellar methods; although it's the best we have, the unfortunate human hand is involved.

Can you show me an overhead shot of 1950 ice?

Sent from my phone, please excuse my grammar!
