
Drilling Reveals Fault Rock Architecture in New Zealand’s Central Alpine Fault

Written By Unknown on Sunday, February 8, 2015 | 6:51 PM

Figure 1: Location map of study by Virginia Toy et al. Image Credit: GSA
Boulder, Colo., USA - Rocks within plate-boundary-scale fault zones become fragmented and altered over the earthquake cycle; they both record and influence the earthquake process. In a new open-access study published in Lithosphere on 4 February, Virginia Toy and colleagues document the fault rocks surrounding New Zealand's active Alpine Fault, which has a very high probability of generating a magnitude 8 or greater earthquake in the near future.

The descriptions already suggest that the complex fault rock sequence results from slip at varying rates during multiple past earthquakes, and sometimes even from aseismic slip. They also characterize this fault before its next rupture; Toy and colleagues anticipate that repeat observations after the next event will provide a previously undescribed link between changes in fault rocks and the ground-shaking response. They write that in the future this sort of data might allow realistic ground-shaking predictions based on observations of other "dormant" faults.

From the study: The first phase of the Deep Fault Drilling Project (DFDP-1) yielded a continuous lithological transect through fault rock surrounding the Alpine fault (South Island, New Zealand). This allowed micrometer- to decimeter-scale variations in fault rock lithology and structure to be delineated on either side of two principal slip zones intersected by DFDP-1A and DFDP-1B. Here, we provide a comprehensive analysis of fault rock lithologies within 70 m of the Alpine fault based on analysis of hand specimens and detailed petrographic and petrologic analysis. The sequence of fault rock lithologies is consistent with that inferred previously from outcrop observations, but the continuous section afforded by DFDP-1 permits new insight into the spatial and genetic relationships between different lithologies and structures. We identify principal slip zone gouge and cataclasite-series rocks formed by multiple increments of shear deformation at up to coseismic slip rates. A 20–30-m-thick package of these rocks (including the principal slip zone) forms the fault core, which has accommodated most of the brittle shear displacement.

This deformation has overprinted ultramylonites deformed mostly by grain-size-insensitive dislocation creep. Outside the fault core, ultramylonites contain low-displacement brittle fractures that are part of the fault damage zone. Fault rocks presently found in the hanging wall of the Alpine fault are inferred to have been derived from protoliths on both sides of the present-day principal slip zone, specifically the hanging-wall Alpine Schist and footwall Greenland Group. This implies that, at seismogenic depths, the Alpine fault is either a single zone of focused brittle shear that moves laterally over time, or it consists of multiple strands. Ultramylonites, cataclasites, and fault gouge represent distinct zones into which deformation has localized, but within the brittle regime, particularly, it is not clear whether this localization accompanies reductions in pressure and temperature during exhumation or whether it occurs throughout the seismogenic regime. These two contrasting possibilities should be a focus of future studies of fault zone architecture.

Source: GSA

Magma pancakes beneath Indonesia's Lake Toba: Subsurface sources of mega-eruptions

Written By Unknown on Sunday, December 21, 2014 | 10:26 PM

Lake Toba, Indonesia
The tremendous amounts of lava emitted during super-eruptions accumulate in the Earth's crust over millions of years before the event. These reservoirs consist of magma that intrudes into the crust in the form of numerous horizontally oriented sheets resting on top of each other like a pile of pancakes.

A team of geoscientists from Novosibirsk, Paris and Potsdam presents these results in the current issue of Science. The scientists investigated where the tremendous amounts of material ejected to form huge calderas during super-eruptions actually originate. These are not large volcanic eruptions on the scale of Pinatubo or Mount St. Helens, but extreme events: the Toba caldera in the Sumatra subduction zone in Indonesia originated from one of the largest volcanic eruptions in recent Earth history, about 74,000 years ago. It emitted an enormous 2,800 cubic kilometers of volcanic material, with a dramatic global impact on climate and environment, and formed the 80-km-long Lake Toba.

The geoscientists wanted to find out how the gigantic amounts of eruptible material required to form such a supervolcano can accumulate in the Earth's crust. Was this a singular event thousands of years ago, or can it happen again?
Researchers from the GFZ German Research Centre for Geosciences installed a seismometer network in the Toba area to investigate these questions and provided the data to all participating scientists via the GEOFON data archive. GFZ scientist Christoph Sens-Schönfelder, a co-author of the study, explains: "With a new seismological method we were able to investigate the internal structure of the magma reservoir beneath the Toba caldera. We found that the middle crust below the Toba supervolcano is horizontally layered." The answer thus lies in the structure of the magma reservoir: below about 7 kilometers depth, the crust consists of many, mostly horizontal, magmatic intrusions that still contain molten material.
New seismological technique

It had already been suspected that the large volume of magma ejected during the supervolcanic eruption had slowly accumulated over the last few million years in the form of successively emplaced intrusions. This has now been confirmed by the field measurements. The GFZ scientists used a novel seismological method for this purpose. Over a six-month period they recorded the ambient seismic noise -- the natural vibrations that are usually regarded as disturbing signals. With a statistical approach they analyzed the data and discovered that the velocity of seismic waves beneath Toba depends on the direction in which the waves shear the Earth's crust. Above 7 kilometers depth, the deposits of the last eruption form a zone of low velocities. Below this depth, the seismic anisotropy is caused by horizontally layered intrusions that structure the reservoir like a pile of pancakes, and this layering is reflected in the seismic data.
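To illustrate the basic principle behind ambient-noise seismology (this is a minimal sketch, not the GFZ team's actual processing chain), the snippet below cross-correlates two synthetic "noise" recordings, one of which is a time-shifted copy of the other; the lag of the correlation peak recovers the inter-station travel time and hence an apparent wave speed. The station spacing, sampling rate, and velocity are made-up illustrative values.

```python
import numpy as np

# --- Synthetic ambient-noise example (illustrative values only) ---
fs = 50.0                  # sampling rate, Hz (assumed)
duration_s = 120.0         # two minutes of synthetic "noise"
n = int(fs * duration_s)
rng = np.random.default_rng(0)

station_distance_km = 10.0   # assumed spacing between two stations
true_velocity_kms = 3.0      # assumed shear-wave speed
true_delay_s = station_distance_km / true_velocity_kms

noise_a = rng.standard_normal(n)
shift = int(round(true_delay_s * fs))
# Station B records the same wavefield delayed by the travel time, plus local noise
noise_b = np.roll(noise_a, shift) + 0.5 * rng.standard_normal(n)

# Cross-correlate; the lag of the peak estimates the inter-station travel time
xcorr = np.correlate(noise_b, noise_a, mode="full")
lags = np.arange(-n + 1, n) / fs
picked_delay_s = lags[np.argmax(xcorr)]
apparent_velocity = station_distance_km / picked_delay_s

print(f"true delay  : {true_delay_s:.2f} s")
print(f"picked delay: {picked_delay_s:.2f} s")
print(f"apparent velocity: {apparent_velocity:.2f} km/s")
```

In the real study, comparing such velocity estimates for waves shearing the crust in different directions is what reveals the anisotropy attributed to the stacked sills.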

Supervolcanoes
Supervolcanoes like this exist not only in Indonesia but also in other parts of the world; they erupt only every couple of hundred thousand years, but then in gigantic eruptions. Because of their size these volcanoes do not build up mountains; instead, they manifest themselves through the huge crater formed during the eruption -- the caldera. Other known supervolcanoes include the Yellowstone area, volcanoes in the Andes, and the Lake Taupo caldera in New Zealand. The present study helps to better understand the processes that lead to such super-eruptions.

The tsunami early warning system for the Indian Ocean: Ten years after

Technical concept of GITEWS.
The day after Christmas this year marks the 10th anniversary of the tsunami disaster in the Indian Ocean. On 26 December 2004, a quarter of a million people lost their lives, five million required immediate aid and 1.8 million were rendered homeless. The natural disaster, which caused extreme devastation over huge areas, along with the accompanying grief and anxiety, especially in Indonesia, Thailand and Sri Lanka, exceeded the imaginable and reached such drastic dimensions mainly because there was no warning facility and no disaster management plan for the entire Indian Ocean region at that time.
Germany and the international community of states reacted with immediate support. Within the framework of the German Flood Victim Aid, the Federal Government commissioned the Helmholtz Association of German Research Centres, under the direction of the GFZ German Research Centre for Geosciences, with the development of an early warning system for the Indian Ocean. From 2005 to 2011, the large-scale project GITEWS (German-Indonesian Tsunami Early Warning System) established the core of an integrated, modern and effective tsunami early warning system in Indonesia. With the follow-up project PROTECTS (Project for Training, Education and Consulting for Tsunami Early Warning Systems, 2011-2014), the personnel of the participating Indonesian institutions were trained to work independently and to take over responsibility for the operation of the early warning system as well as for its diverse technical and organizational components. In this way PROTECTS, which started in June 2011 and comprised a total of 192 training courses, internships and hands-on practice courses covering all aspects of operation and maintenance of the tsunami early warning system, contributed significantly to the sustainability of InaTEWS.
Under the auspices of the Intergovernmental Oceanographic Commission of UNESCO and with the collaboration of international partner institutes from Germany, the USA, China and Japan, GITEWS was integrated into a tsunami early warning system for Indonesia. GITEWS was positively reviewed by a commission of international experts in 2010 and handed over to Indonesia in March 2011. Since then it has been providing its services under the name InaTEWS -- Indonesian Tsunami Early Warning System -- and is operated by the Indonesian Agency for Meteorology, Climatology and Geophysics (BMKG).

On 12 October 2011 the exercise drill "IOWAVE11" was carried out in the Indian Ocean. With this drill, InaTEWS successfully demonstrated that it could also take over the role of a Regional Tsunami Service Provider (RTSP). Since then Indonesia, alongside Australia and India, has performed the double function of a National Tsunami Warning Centre (NTWC) and an RTSP, taking over responsibility for the timely warning of 28 states around the Indian Ocean in the event of a threatening tsunami. With this ongoing, step-by-step development, a comprehensive, all-encompassing InaTEWS has been successfully realized.

Indonesia now has one of the most modern tsunami early warning systems. On the basis of data from approximately 300 measuring stations -- including seismometers, GPS stations and coastal tide gauges -- a warning can be issued within a maximum of five minutes after an earthquake. With the data gained from these sensors, and using modern evaluation systems such as SeisComP3, which was developed by GFZ scientists for the analysis of earthquake data, together with a tsunami simulation system in the warning centre, it is possible to compile a comprehensive picture of the situation. With the aid of a decision support system, appropriately classified warnings can then be issued for the affected coastal areas. A total of 70 people are involved in the operation of the warning centre in Jakarta, with 30 employees working solely in a full shift system. According to the BMKG, a total of 1,700 earthquakes with a magnitude of more than M=5, and 11 quakes with a magnitude of 7 and higher, have been evaluated, and six tsunami warnings have been issued to the public by the Earthquake Monitoring and Tsunami Early Warning Centre since the handover in March 2011.
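To put the five-minute warning target in context, a rough back-of-the-envelope estimate (not part of the GFZ/BMKG system) uses the long-wave speed of a tsunami in the open ocean, c = sqrt(g*h): in deep water a tsunami travels at several hundred kilometers per hour, so a coast a few hundred kilometers from the source has only tens of minutes between the quake and the wave's arrival. The depth and distance below are illustrative assumptions.

```python
import math

def tsunami_travel_time_minutes(distance_km: float, ocean_depth_m: float) -> float:
    """Very rough travel-time estimate using the shallow-water wave speed c = sqrt(g*h).

    Assumes a single representative ocean depth along the whole path, which real
    warning systems replace with full numerical tsunami simulations.
    """
    g = 9.81                                   # gravitational acceleration, m/s^2
    speed_m_s = math.sqrt(g * ocean_depth_m)   # long-wave (tsunami) phase speed
    return (distance_km * 1000.0 / speed_m_s) / 60.0

# Illustrative scenario: source ~250 km offshore, mean ocean depth ~4000 m (assumed values)
eta_min = tsunami_travel_time_minutes(250.0, 4000.0)
warning_latency_min = 5.0                      # InaTEWS target warning time
print(f"estimated arrival: {eta_min:.0f} min after the quake")
print(f"time left after a {warning_latency_min:.0f}-minute warning: "
      f"{eta_min - warning_latency_min:.0f} min for evacuation")
```

For the assumed numbers the wave arrives roughly 20 minutes after the quake, which is why shaving the warning latency down to five minutes matters so much for nearby coasts.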

Schooling, training and disaster precautions (capacity development) for the local community and town and district councils have received special emphasis. This capacity development has been carried out since 2006 in three "typical" regions: Padang (Sumatra), Cilacap (South Java) and Denpasar (Bali, a tourist stronghold). Particular emphasis was placed on understanding both the warnings issued and the planned evacuation measures.

Local disaster management structures are established with local decision-makers and Disaster Risk Reduction Strategies are developed. Specifically, the education of trainers who are, in turn, responsible for the further spreading of the developed concepts plays a significant role.

Another key element is the determination of hazard and risk maps as a basis for the local evacuation planning as well as for future town and land-use planning. In Bali communication with the hotel industry was an additional factor.

No early warning system will ever be able to prevent a strong earthquake and a resulting tsunami, and there will still be loss of life and material damage in the future. However, the existence of an early warning system, combined with organizational measures and comprehensive capacity building, can certainly reduce the adverse effects of such a natural disaster.

Subtle shifts in the Earth could forecast earthquakes, tsunamis

Written By Unknown on Saturday, December 20, 2014 | 6:17 AM

University of South Florida graduate student Denis Voytenko prepares a GPS unit for a high-precision geodetic measurement.
Credit: Jacob Richardson
Earthquakes and tsunamis can be giant disasters no one sees coming, but now an international team of scientists led by a University of South Florida professor has found that subtle shifts in Earth's offshore plates can be a harbinger of the size of the disaster.

In a new paper published in the Proceedings of the National Academy of Sciences, USF geologist Tim Dixon and the team report that a geological phenomenon called "slow slip events" identified just 15 years ago is a useful tool in identifying the precursors to major earthquakes and the resulting tsunamis. The scientists used high precision GPS to measure the slight shifts on a fault line in Costa Rica, and say better monitoring of these small events can lead to better understanding of maximum earthquake size and tsunami risk.

"Giant earthquakes and tsunamis in the last decade -- Sumatra in 2004 and Japan in 2011 -- are a reminder that our ability to forecast these destructive events is painfully weak," Dixon said.
Dixon was involved in the development of high precision GPS for geophysical applications, and has been making GPS measurements in Costa Rica since 1988, in collaboration with scientists at Observatorio Vulcanológico y Sismológico de Costa Rica, the University of California-Santa Cruz, and Georgia Tech. The project is funded by the National Science Foundation.
Slow slip events have some similarities to earthquakes (caused by motion on faults) but release their energy slowly, over weeks or months, and cannot be felt or even recorded by conventional seismographs, Dixon said. Their discovery in 2001 by Canadian scientist Herb Dragert at the Pacific Geoscience Center had to await the development of high precision GPS, which is capable of measuring subtle movements of the Earth.
The scientists studied the Sept. 5, 2012 earthquake on the Costa Rica subduction plate boundary, as well as motions of the Earth in the previous decade. High precision GPS recorded numerous slow slip events in the decade leading up to the 2012 earthquake. The scientists made their measurements from a peninsula overlying the shallow portion of a megathrust fault in northwest Costa Rica.
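As a rough illustration of why high-precision GPS is needed to see these events (a sketch with synthetic numbers, not the team's Costa Rica data), the snippet below builds a daily GPS position series containing a steady plate-motion trend plus a slow transient of a few millimeters released over several weeks, then recovers the transient by extrapolating the trend fitted to the pre-event data. Amplitudes, dates and noise levels are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
days = np.arange(0, 3 * 365)                 # ~3 years of daily GPS positions

secular_rate_mm_per_yr = 30.0                # assumed steady plate motion
secular = secular_rate_mm_per_yr * days / 365.0

# A slow slip event: ~8 mm of motion smoothly released over ~40 days around day 700
sse_amplitude_mm, sse_center, sse_duration = 8.0, 700.0, 40.0
transient = sse_amplitude_mm / (1.0 + np.exp(-(days - sse_center) / (sse_duration / 4.0)))

gps_noise = rng.normal(0.0, 1.5, days.size)  # ~1.5 mm daily position scatter
observed_mm = secular + transient + gps_noise

# Fit the steady trend using only data well before the event, then look at the residual
pre = days < sse_center - 3 * sse_duration
coeffs = np.polyfit(days[pre], observed_mm[pre], 1)
residual = observed_mm - np.polyval(coeffs, days)

post = days > sse_center + 3 * sse_duration
print(f"offset relative to the pre-event trend: {residual[post].mean():.1f} mm "
      f"(true transient amplitude: {sse_amplitude_mm} mm)")
```

A conventional seismograph would record nothing during such an episode, because the few millimeters of motion are spread over weeks rather than seconds.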
The 7.6-magnitude quake was one of the strongest earthquakes ever to hit the Central American nation and unleashed more than 1,600 aftershocks. Marino Protti, one of the authors of the paper and a resident of Costa Rica, has spent more than two decades warning local populations of the likelihood of a major earthquake in their area and recommending enhanced building codes.

A tsunami warning was issued after the quake, but only a small tsunami occurred. The group's findings shed some light on why: slow slip events in the offshore region in the decade leading up to the earthquake may have released much of the stress and strain that would otherwise have accumulated on the offshore fault.

While the group's findings suggest that slow slip events have limited value for knowing exactly when an earthquake and tsunami will strike, they indicate that these events provide critical hazard-assessment information by delineating the likely rupture area and the magnitude and tsunami potential of future earthquakes.

The scientists recommend monitoring slow slip events in order to provide accurate forecasts of earthquake magnitude and tsunami potential.

Source: University of South Florida (USF Health)

Geologists discover ancient buried canyon in South Tibet

This photo shows the Yarlung Tsangpo Valley close to the Tsangpo Gorge, where it is rather narrow and underlain by only about 250 meters of sediments. The mountains in the upper left corner belong to the Namche Barwa massif. Previously, scientists had suspected that the debris deposited by a glacier in the foreground was responsible for the formation of the steep Tsangpo Gorge -- the new discoveries falsify this hypothesis. Credit: Ping Wang
A team of researchers from Caltech and the China Earthquake Administration has discovered an ancient, deep canyon buried along the Yarlung Tsangpo River in south Tibet, north of the eastern end of the Himalayas. The geologists say that the ancient canyon--thousands of feet deep in places--effectively rules out a popular model used to explain how the massive and picturesque gorges of the Himalayas became so steep, so fast.

"I was extremely surprised when my colleagues, Jing Liu-Zeng and Dirk Scherler, showed me the evidence for this canyon in southern Tibet," says Jean-Philippe Avouac, the Earle C. Anthony Professor of Geology at Caltech. "When I first saw the data, I said, 'Wow!' It was amazing to see that the river once cut quite deeply into the Tibetan Plateau because it does not today. That was a big discovery, in my opinion."
Geologists like Avouac and his colleagues, who are interested in tectonics--the study of the earth's surface and the way it changes--can use tools such as GPS and seismology to study crustal deformation that is taking place today. But if they are interested in studying changes that occurred millions of years ago, such tools are not useful because the activity has already happened. In those cases, rivers become a main source of information because they leave behind geomorphic signatures that geologists can interrogate to learn about the way those rivers once interacted with the land--helping them to pin down when the land changed and by how much, for example.
"In tectonics, we are always trying to use rivers to say something about uplift," Avouac says. 

"In this case, we used a paleocanyon that was carved by a river. It's a nice example where by recovering the geometry of the bottom of the canyon, we were able to say how much the range has moved up and when it started moving."

The team reports its findings in the current issue of Science.

Last year, civil engineers from the China Earthquake Administration collected cores by drilling into the valley floor at five locations along the Yarlung Tsangpo River. Shortly after, former Caltech graduate student Jing Liu-Zeng, who now works for that administration, returned to Caltech as a visiting associate and shared the core data with Avouac and Dirk Scherler, then a postdoc in Avouac's group. Scherler had previously worked in the far western Himalayas, where the Indus River has cut deeply into the Tibetan Plateau, and immediately recognized that the new data suggested the presence of a paleocanyon.

Liu-Zeng and Scherler analyzed the core data and found that at several locations there were sedimentary conglomerates -- rounded gravel and larger rocks cemented together, of the kind associated with flowing rivers -- down to a depth of 800 meters or so, at which point the record clearly indicated bedrock. This suggested that the river once carved deeply into the plateau.
To establish when the river switched from incising bedrock to depositing sediments, they measured two isotopes, beryllium-10 and aluminum-26, in the lowest sediment layer. The isotopes are produced when rocks and sediment are exposed to cosmic rays at the surface and decay at different rates once buried, and so allowed the geologists to determine that the paleocanyon started to fill with sediment about 2.5 million years ago.
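The burial-dating logic can be written down compactly. Because aluminum-26 decays faster than beryllium-10, the 26Al/10Be ratio of buried sediment drops predictably with time, and the ratio can be inverted for a burial age. The sketch below uses commonly quoted half-lives and a typical surface production ratio; these are assumptions for illustration, not the values or data used in the Science paper.

```python
import math

# Commonly quoted values (assumed here; the actual study may use refined numbers)
HALF_LIFE_BE10_YR = 1.387e6   # 10Be half-life, years
HALF_LIFE_AL26_YR = 0.705e6   # 26Al half-life, years
SURFACE_RATIO = 6.75          # typical 26Al/10Be production ratio at the surface

LAMBDA_BE10 = math.log(2) / HALF_LIFE_BE10_YR
LAMBDA_AL26 = math.log(2) / HALF_LIFE_AL26_YR

def burial_age_yr(measured_ratio: float, initial_ratio: float = SURFACE_RATIO) -> float:
    """Simple burial age from a measured 26Al/10Be ratio.

    Assumes full shielding after burial and no post-burial production, so
    R(t) = R0 * exp(-(lambda_Al - lambda_Be) * t)  =>  t = ln(R0/R) / (lambda_Al - lambda_Be).
    """
    return math.log(initial_ratio / measured_ratio) / (LAMBDA_AL26 - LAMBDA_BE10)

# Forward check: what ratio would ~2.5 Myr of burial leave behind?
t = 2.5e6
ratio_after = SURFACE_RATIO * math.exp(-(LAMBDA_AL26 - LAMBDA_BE10) * t)
print(f"26Al/10Be after {t/1e6:.1f} Myr of burial: {ratio_after:.2f}")
print(f"age recovered from that ratio: {burial_age_yr(ratio_after)/1e6:.2f} Myr")
```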

The researchers' reconstruction of the former valley floor showed that the slope of the river once increased gradually from the Gangetic Plain to the Tibetan Plateau, with no sudden changes, or knickpoints. Today, the river, like most others in the area, has a steep knickpoint where it meets the Himalayas, at a place known as the Namche Barwa massif. There, the uplift of the mountains is extremely rapid (on the order of 1 centimeter per year, whereas in other areas 5 millimeters per year is more typical) and the river drops by 2 kilometers in elevation as it flows through the famous Tsangpo Gorge, known by some as the Yarlung Tsangpo Grand Canyon because it is so deep and long.

Combining the depth and age of the paleocanyon with the geometry of the valley, the geologists surmised that the river existed in this location prior to about 3 million years ago, but at that time, it was not affected by the Himalayas. However, as the Indian and Eurasian plates continued to collide and the mountain range pushed northward, it began impinging on the river. Suddenly, about 2.5 million years ago, a rapidly uplifting section of the mountain range got in the river's way, damming it, and the canyon subsequently filled with sediment.

"This is the time when the Namche Barwa massif started to rise, and the gorge developed," says Scherler, one of two lead authors on the paper and now at the GFZ German Research Center for Geosciences in Potsdam, Germany.

That picture of the river and the Tibetan Plateau, which involves the river incising deeply into the plateau millions of years ago, differs quite a bit from the typically accepted geologic vision. Typically, geologists believe that when rivers start to incise into a plateau, they eat at the edges, slowly making their way into the plateau over time. However, the rivers flowing across the Himalayas all have strong knickpoints and have not incised much at all into the Tibetan Plateau. Therefore, the thought has been that the rapid uplift of the Himalayas has pushed the rivers back, effectively pinning them, so that they have not been able to make their way into the plateau. But that explanation does not work with the newly discovered paleocanyon.

The team's new hypothesis also rules out a model that has been around for about 15 years, called tectonic aneurysm, which suggests that the rapid uplift seen at the Namche Barwa massif was triggered by intense river incision. In tectonic aneurysm, a river cuts down through the earth's crust so fast that it causes the crust to heat up, making a nearby mountain range weaker and facilitating uplift.

The model is popular among geologists, and indeed Avouac himself published a modeling paper in 1996 that showed the viability of the mechanism. "But now we have discovered that the river was able to cut into the plateau way before the uplift happened," Avouac says, "and this shows that the tectonic aneurysm model was actually not at work here. The rapid uplift is not a response to river incision."

Exploring a large, restless volcanic field in Chile

Laguna del Maule, Chile, is at the center of a volcanic field that has erupted 36 times during the last 25,000 years, and is now experiencing significant uplift due to magma intrusion.
Credit: David Tenenbaum
If Brad Singer knew for sure what was happening three miles under an odd-shaped lake in the Andes, he might be less eager to spend a good part of his career investigating a volcanic field that has erupted 36 times during the last 25,000 years. As he leads a large scientific team exploring a region in the Andes called Laguna del Maule, Singer hopes the area remains quiet.

But the primary reason to expend so much effort on this area boils down to one fact: The rate of uplift is among the highest ever observed by satellite measurement for a volcano that is not actively erupting.

That uplift is almost definitely due to a large intrusion of magma -- molten rock -- beneath the volcanic complex. For seven years, an area larger than the city of Madison has been rising by 10 inches per year.

That rapid rise provides a major scientific opportunity: to explore a mega-volcano before it erupts. That effort, and the hazard posed by the restless magma reservoir beneath Laguna del Maule, are described in a major research article in the December issue of the Geological Society of America's GSA Today.

"We've always been looking at these mega-eruptions in the rear-view mirror," says Singer. 

"We look at the lava, dust and ash, and try to understand what happened before the eruption. Since these huge eruptions are rare, that's usually our only option. But we look at the steady uplift at Laguna del Maule, which has a history of regular eruptions, combined with changes in gravity, electrical conductivity and swarms of earthquakes, and we suspect that conditions necessary to trigger another eruption are gathering force."

Laguna del Maule looks nothing like a classic, cone-shaped volcano, since the high-intensity erosion caused by heavy rain and snow has carried most of the evidence to the nearby Pacific Ocean. But the overpowering reason for the absence of "typical volcano cones" is the nature of the molten rock underground. It's called rhyolite, and it's the most explosive type of magma on the planet.

The eruption of a rhyolite volcano is too quick and violent to build up a cone. Instead, this viscous, water-rich magma often explodes into vast quantities of ash that can form deposits hundreds of yards deep, followed by a slower flow of glassy magma that can be tens of yards tall and measure more than a mile in length.

The next eruption could be in the size range of Mount St. Helens -- or it could be vastly bigger, Singer says. "We know that over the past million years or so, several eruptions at Laguna del Maule or nearby volcanoes have been more than 100 times larger than Mount St. Helens," he says. "Those are rare, but they are possible." Such a mega-eruption could change the weather, disrupt the ecosystem and damage the economy.
Trying to anticipate what Laguna del Maule holds in store, Singer is heading a new $3 million, five-year effort sponsored by the National Science Foundation to document its behavior before an eruption. With colleagues from Chile, Argentina, Canada, Singapore, and Cornell and Georgia Tech universities, he is masterminding an effort to build a scientific model of the underground forces that could lead to eruption. "This model should capture how this system has evolved in the crust at all scales, from the microscopic to basinwide, over the last 100,000 years," Singer says. "It's like a movie from the past to the present and into the future."
Over the next five years, Singer says he and 30 colleagues will "throw everything, including the kitchen sink, at the problem -- geology, geochemistry, geochronology and geophysics -- to help measure, and then model, what's going on."

One key source of information on volcanoes is seismic waves. Ground shaking triggered by the movement of magma can signal an impending eruption. Team member Clifford Thurber, a seismologist and professor of geoscience at UW-Madison, wants to use distant earthquakes to locate the underground magma body.

As many as 50 seismometers will eventually be emplaced above and around the magma at Laguna del Maule, in the effort to create a 3-D image of Earth's crust in the area.

By tracking multiple earthquakes over several years, Thurber and his colleagues want to pinpoint the size and location of the magma body -- roughly estimated as an oval measuring five kilometers (3.1 miles) by 10 kilometers (6.2 miles).

Each seismometer will record the travel times of earthquake waves originating within a few thousand kilometers, Thurber explains. Since seismic waves travel more slowly through soft, partially molten rock than through hard rock, "we expect that waves that pass through the presumed magma body will be delayed," Thurber says. "It's very simple. It's like a CT scan, except instead of density we are looking at seismic wave velocity."
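The delay Thurber describes can be quantified with a one-line calculation: a ray that spends a path length L inside a low-velocity body arrives later by L*(1/v_slow - 1/v_fast) relative to a ray in unperturbed crust. The velocities and path length below are illustrative assumptions, not the project's measured values.

```python
def traveltime_delay_s(path_in_body_km: float, v_background_kms: float, v_body_kms: float) -> float:
    """Extra travel time accumulated along the portion of a ray crossing a slow anomaly."""
    return path_in_body_km * (1.0 / v_body_kms - 1.0 / v_background_kms)

# Illustrative numbers: ~8 km of path through a partially molten zone that is
# roughly 20 percent slower than the surrounding crust (assumed values).
v_crust = 3.5        # background S-wave speed, km/s
v_magma_zone = 2.8   # reduced speed inside the fluid- or melt-rich zone, km/s
delay = traveltime_delay_s(8.0, v_crust, v_magma_zone)
print(f"expected delay for rays crossing the anomaly: {delay:.2f} s")
```

Delays of a fraction of a second, measured on many crossing ray paths, are what the tomographic inversion turns into a 3-D velocity image of the magma body.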

As Singer, who has been visiting Laguna del Maule since 1998, notes, "The rate of uplift -- among the highest ever observed -- has been sustained for seven years, and we have discovered a large, fluid-rich zone in the crust under the lake using electrical resistivity methods. Thus, there are not many possible explanations other than a big, active body of magma at a shallow depth."

The expanding body of magma could freeze in place -- or blow its top, he says. "One thing we know for sure is that the surface cannot continue rising indefinitely."

Source:  University of Wisconsin-Madison

Re-thinking Southern California earthquake scenarios in Coachella Valley, San Andreas Fault

New 3D numerical modeling that captures more geometric complexity of an active fault segment in southern California than any other suggests that the overall earthquake hazard for towns on the west side of the Coachella Valley such as Palm Springs may be slightly lower than previously believed. Credit: Courtesy Google Earth and UMass Amherst
New three-dimensional (3D) numerical modeling that captures far more geometric complexity of an active fault segment in southern California than any other suggests that the overall earthquake hazard for towns on the west side of the Coachella Valley, such as Palm Springs and Palm Desert, may be slightly lower than previously believed.

New simulations of deformation on three alternative fault configurations for the Coachella Valley segment of the San Andreas Fault conducted by geoscientists Michele Cooke and Laura Fattaruso of the University of Massachusetts Amherst, with Rebecca Dorsey of the University of Oregon, appear in the December issue of Geosphere.
The Coachella Valley segment is the southernmost section of the San Andreas Fault in California. It has a high likelihood for a large rupture in the near future, since it has a recurrence interval of about 180 years but has not ruptured in over 300 years, the authors point out.
The researchers acknowledge that their new modeling offers "a pretty controversial interpretation" of the data. Many geoscientists do not accept a dipping active-fault geometry for the San Andreas Fault in the Coachella Valley, they say, and some argue that the data do not confirm the dipping structure. "Our contribution to this debate is that we add an uplift pattern to the data that support a dipping active fault and it rejects the other models," say Cooke and colleagues.

Their new model yields an estimated 10 percent increase in shaking overall for the Coachella segment. But for the towns to the west of the fault where most people live, it yields decreased shaking due to the dipping geometry. It yields a doubling of shaking in mostly unpopulated areas east of the fault. "This isn't a direct outcome of our work but an implication," they add.

Cooke says, "Others have used a dipping San Andreas in their models but they didn't include the degree of complexity that we did. By including the secondary faults within the Mecca Hills we more accurately capture the uplift pattern of the region."

Fattaruso adds, "Others were comparing to different data sets, such as geodesy, and since we were comparing to uplift it is important that we have this complexity." In this case, geodesy is the science of measuring and representing the Earth and its crustal motion, taking into account the competition of geological processes in 3D over time.

Most other models of deformation, stress, rupture and ground shaking have assumed that the southern San Andreas Fault is vertical, say Cooke and colleagues. However, seismic imaging, aerial magnetometric surveys and GPS-based strain observations suggest that the fault dips 60 to 70 degrees toward the northeast, a hypothesis they set out to investigate.
Specifically, they explored three alternative geometric models of the fault's Coachella Valley segment with added complexity such as including smaller faults in the nearby Indio and Mecca Hills. "We use localized uplift patterns in the Mecca Hills to assess the most plausible geometry for the San Andreas Fault in the Coachella Valley and better understand the interplay of fault geometry and deformation," they write.
Cooke and colleagues say the fault structures in their favored model agree with distributions of local seismicity, and are consistent with geodetic observations of recent strain. "Crustal deformation models that neglect the northeast dip of the San Andreas Fault in the Coachella Valley will not replicate the ground shaking in the region and therefore inaccurately estimate seismic hazard," they note.

This work was supported by the National Science Foundation.

Source:  University of Massachusetts at Amherst

California's drought is the worst in 1,200 years, evidence suggests

Written By Unknown on Thursday, December 18, 2014 | 11:41 PM

The 2012-2014 California drought, unusual in the context of the last 1,200 years, greatly diminished water reserves in Lake Nacimiento of the upper Salinas Valley. Credit: Photo by Daniel Griffin
As California finally experiences the arrival of a rain-bearing Pineapple Express this week, two climate scientists from the University of Minnesota and Woods Hole Oceanographic Institution have shown that the drought of 2012-2014 has been the worst in 1,200 years.

Daniel Griffin, an assistant professor in the Department of Geography, Environment and Society at the University of Minnesota, and Kevin Anchukaitis, an assistant scientist at Woods Hole Oceanographic Institution, asked the question, "How unusual is the ongoing California drought?" Watching the severity of the California drought intensify since last autumn, they wondered how it would eventually compare to other extreme droughts throughout the state's history.

To answer those questions, Griffin and Anchukaitis collected new tree-ring samples from blue oak trees in southern and central California. "California's old blue oaks are as close to nature's rain gauges as we get," says Griffin. "They thrive in some of California's driest environments." These trees are particularly sensitive to moisture changes and their tree rings display moisture fluctuations vividly.

As soon as the National Oceanic and Atmospheric Administration (NOAA) released climate data for the summer of 2014, the two scientists sprang into action. Using their blue oak data, they reconstructed rainfall back to the 13th century. They also calculated the severity of the drought by combining NOAA's estimates of the Palmer Drought Severity Index (PDSI), an index of soil moisture variability, with the existing North American Drought Atlas, a spatial tree-ring based reconstruction of drought developed by scientists at Columbia University's Lamont-Doherty Earth Observatory. These resources together provided complementary data on rainfall and soil moisture over the past millennium. Griffin and Anchukaitis found that while the current period of low precipitation is not unusual in California's history, these rainfall deficits combined with sustained record high temperatures created the current multiyear severe water shortages. "While it is precipitation that sets the rhythm of California drought, temperature weighs in on the pitch," says Anchukaitis.
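The reconstruction step follows a standard dendroclimatology recipe: ring-width indices are calibrated against instrumental moisture data (such as PDSI) over the period where both overlap, and the fitted relationship is then applied to the older rings. The sketch below shows that calibrate-then-reconstruct pattern on entirely synthetic numbers; it is not the Griffin and Anchukaitis code or data.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic "truth": 800+ years of soil-moisture variability (a stand-in for PDSI)
years = np.arange(1200, 2015)
true_pdsi = rng.normal(0.0, 2.0, years.size)

# Blue-oak ring widths respond roughly linearly to moisture, plus biological noise
ring_width_index = 1.0 + 0.15 * true_pdsi + rng.normal(0.0, 0.1, years.size)

# Instrumental PDSI exists only for the recent calibration window
calib = years >= 1900
slope, intercept = np.polyfit(ring_width_index[calib], true_pdsi[calib], 1)

# Apply the calibration to every ring to reconstruct pre-instrumental drought
reconstructed_pdsi = intercept + slope * ring_width_index

rmse = np.sqrt(np.mean((reconstructed_pdsi[~calib] - true_pdsi[~calib]) ** 2))
print(f"calibration slope: {slope:.2f}, intercept: {intercept:.2f}")
print(f"reconstruction error before 1900 (RMSE): {rmse:.2f} PDSI units")
print(f"driest reconstructed year: {years[np.argmin(reconstructed_pdsi)]}")
```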

"We were genuinely surprised at the result," says Griffin, a NOAA Climate & Global Change Fellow and former WHOI postdoctoral scholar. "This is California--drought happens. Time and again, the most common result in tree-ring studies is that drought episodes in the past were more extreme than those of more recent eras. This time, however, the result was different." While there is good evidence of past sustained, multi-decadal droughts or so-called "megadroughts"' in California, the authors say those past episodes were probably punctuated by occasional wet years, even if the cumulative effect over decades was one of overall drying. The current short-term drought appears to be worse than any previous span of consecutive years of drought without reprieve.

Tree rings are a valuable data source when tracking historical climate, weather and natural disaster trends. Floods, fires, drought and other elements that can affect growing conditions are reflected in the development of tree rings, and since each ring represents one year the samples collected from centuries-old trees are a virtual timeline that extend beyond the historical record in North America.

So what are the implications? The research indicates that natural climate system variability is compounded by human-caused climate change and that "hot" droughts such as the current one are likely to occur again in the future. California is the world's 8th largest economy and the source of a substantial amount of U.S. produce. Surface water supply shortages there have impacts well beyond the state's borders.

With an exceptionally wet winter, parts of California might emerge from the drought this year. "But there is no doubt," cautions Anchukaitis, "that we are entering a new era where human-wrought changes to the climate system will become important for determining the severity of droughts and their consequences for coupled human and natural systems."

Source:  Woods Hole Oceanographic Institution

Source of volcanoes may be much closer than thought: Geophysicists challenge traditional theory underlying origin of mid-plate volcanoes

Traditional thought holds that hot updrafts from the Earth's core cause volcanoes, but researchers say eruptions may stem from the asthenosphere, a layer closer to the surface.
Credit: Virginia Tech
A long-held assumption about the Earth is discussed in today's edition of Science, as Don L. Anderson, an emeritus professor with the Seismological Laboratory of the California Institute of Technology, and Scott King, a professor of geophysics in the College of Science at Virginia Tech, look at how a layer beneath the Earth's crust may be responsible for volcanic eruptions.

The discovery challenges conventional thought that volcanoes are caused when plates that make up the planet's crust shift and release heat.

Instead of coming from deep within the interior of the planet, the source is closer to the surface, about 80 to 200 kilometers deep -- a layer of the upper mantle known as the asthenosphere.

"For nearly 40 years there has been a debate over a theory that volcanic island chains, such as Hawaii, have been formed by the interaction between plates at the surface and plumes of hot material that rise from the core-mantle boundary nearly 1,800 miles below the Earth's surface," King said. "Our paper shows that a hot layer beneath the plates may explain the origin of mid-plate volcanoes without resorting to deep conduits from halfway to the center of the Earth."

Traditionally, the asthenosphere has been viewed as a passive structure that separates the moving tectonic plates from the mantle.

As tectonic plates move several inches every year, the boundaries between the plates spawn most of the planet's volcanoes and earthquakes.

"As the Earth cools, the tectonic plates sink and displace warmer material deep within the interior of the Earth," explained King. "This material rises as two broad, passive updrafts that seismologists have long recognized in their imaging of the interior of the Earth."
The work of Anderson and King, however, shows that the hot, weak region beneath the plates acts as a lubricating layer, preventing the plates from dragging the material below along with them as they move.

The researchers show this lubricating layer is also the hottest part of the mantle, so there is no need for heat to be carried up to explain mid-plate volcanoes.

"We're taking the position that plate tectonics and mid-plate volcanoes are the natural results of processes in the plates and the layer beneath them," King said.

Source: Virginia Tech

NASA data underscore severity of California drought

Written By Unknown on Tuesday, December 16, 2014 | 9:39 PM

Trends in total water storage in California, Nevada and bordering states from NASA's Gravity Recovery and Climate Experiment (GRACE) satellite mission, September 2011 to September 2014. NASA scientists use these images to better quantify drought and its impact on water availability. Two-thirds of the measured losses were a result of groundwater depletion in California's Central Valley. Credit: NASA JPL/Caltech
It will take about 11 trillion gallons of water (42 cubic kilometers) -- around 1.5 times the maximum volume of the largest U.S. reservoir -- to recover from California's continuing drought, according to a new analysis of NASA satellite data.

The finding was part of a sobering update on the state's drought made possible by space and airborne measurements and presented by NASA scientists Dec. 16 at the American Geophysical Union meeting in San Francisco. Such data are giving scientists an unprecedented ability to identify key features of droughts, and can be used to inform water management decisions.

A team of scientists led by Jay Famiglietti of NASA's Jet Propulsion Laboratory in Pasadena, California, used data from NASA's Gravity Recovery and Climate Experiment (GRACE) satellites to develop the first-ever calculation of this kind -- the volume of water required to end an episode of drought.

Earlier this year, at the peak of California's current three-year drought, the team found that water storage in the state's Sacramento and San Joaquin river basins was 11 trillion gallons below normal seasonal levels. Data collected since the launch of GRACE in 2002 show this deficit has increased steadily.

"Spaceborne and airborne measurements of Earth's changing shape, surface height and gravity field now allow us to measure and analyze key features of droughts better than ever before, including determining precisely when they begin and end and what their magnitude is at any moment in time," Famiglietti said. "That's an incredible advance and something that would be impossible using only ground-based observations."

GRACE data reveal that, since 2011, the Sacramento and San Joaquin river basins decreased in volume by four trillion gallons of water each year (15 cubic kilometers). That's more water than California's 38 million residents use each year for domestic and municipal purposes. About two-thirds of the loss is due to depletion of groundwater beneath California's Central Valley.
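The volumes quoted above are easy to sanity-check with a unit conversion (one US gallon is about 3.785 liters); this quick calculation is only a consistency check on the article's numbers, not part of the GRACE analysis.

```python
GALLON_M3 = 3.785411784e-3   # one US gallon in cubic meters

def trillion_gallons_to_km3(trillion_gallons: float) -> float:
    """Convert trillions of US gallons to cubic kilometers."""
    return trillion_gallons * 1e12 * GALLON_M3 / 1e9   # m^3 -> km^3

print(f"11 trillion gallons ≈ {trillion_gallons_to_km3(11):.0f} km^3")  # drought deficit
print(f" 4 trillion gallons ≈ {trillion_gallons_to_km3(4):.0f} km^3")   # annual basin loss
```

Both results (about 42 and 15 cubic kilometers) match the figures given in the text.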

In related results, early 2014 data from NASA's Airborne Snow Observatory indicate that snowpack in California's Sierra Nevada range was only half of previous estimates. The observatory is providing the first-ever high-resolution observations of the water volume of snow in the Tuolumne River, Merced, Kings and Lakes basins of the Sierra Nevada and the Uncompahgre watershed in the Upper Colorado River Basin.

To develop these calculations, the observatory measures how much water is in the snowpack and how much sunlight the snow absorbs, which influences how fast the snow melts. These data enable accurate estimates of how much water will flow out of a basin when the snow melts, which helps guide decisions about reservoir filling and water allocation.
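A simplified version of that water-volume bookkeeping: snow water equivalent (SWE) is snow depth scaled by the ratio of snow density to water density, and multiplying by basin area gives a runoff volume. The depth, density and area below are placeholder values, not Airborne Snow Observatory measurements, and the real workflow derives these quantities from lidar and spectrometer data rather than point estimates.

```python
def snow_water_equivalent_m(snow_depth_m: float, snow_density_kg_m3: float) -> float:
    """Depth of liquid water the snowpack would yield if fully melted."""
    WATER_DENSITY = 1000.0   # kg/m^3
    return snow_depth_m * snow_density_kg_m3 / WATER_DENSITY

# Placeholder basin: 1.2 m of snow at 350 kg/m^3 over 1,000 km^2 (assumed values)
swe_m = snow_water_equivalent_m(1.2, 350.0)
basin_area_km2 = 1000.0
water_volume_km3 = swe_m / 1000.0 * basin_area_km2   # meters of water over km^2 -> km^3
print(f"SWE: {swe_m:.2f} m of water; basin yield ≈ {water_volume_km3:.2f} km^3")
```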

"The 2014 snowpack was one of the three lowest on record and the worst since 1977, when California's population was half what it is now," said Airborne Snow Observatory Principal Investigator Tom Painter of JPL. "Besides resulting in less snow water, the dramatic reduction in snow extent contributes to warming our climate by allowing the ground to absorb more sunlight. This reduces soil moisture, which makes it harder to get water from the snow into reservoirs once it does start snowing again."

New drought maps show groundwater levels across the U.S. Southwest are in the lowest 2 to 10 percent since 1949. The maps, developed at NASA's Goddard Space Flight Center in Greenbelt, Maryland, combine GRACE data with other satellite observations.

"Integrating GRACE data with other satellite measurements provides a more holistic view of the impact of drought on water availability, including on groundwater resources, which are typically ignored in standard drought indices," said Matt Rodell, chief of the Hydrological Sciences Laboratory at Goddard.

The scientists cautioned that while the recent California storms have been helpful in replenishing water resources, they aren't nearly enough to end the multi-year drought.
"It takes years to get into a drought of this severity, and it will likely take many more big storms, and years, to crawl out of it," said Famiglietti.

NASA monitors Earth's vital signs from land, air and space with a fleet of satellites and ambitious airborne and ground-based observation campaigns. The agency develops new ways to observe and study Earth's interconnected natural systems with long-term data records and computer analysis tools to better see how our planet is changing. The agency shares this unique knowledge with the global community and works with institutions in the United States and around the world that contribute to understanding and protecting our home planet.

For more information on GRACE, visit: http://www.nasa.gov/grace and http://www.csr.utexas.edu/grace
For more on the Airborne Snow Observatory, visit: http://aso.jpl.nasa.gov/
For more information about NASA's Earth science activities, visit: http://www.nasa.gov/earthrightnow

Source: NASA/Jet Propulsion Laboratory

Hydraulic fracturing linked to earthquakes in Ohio

Written By Unknown on Sunday, December 7, 2014 | 10:51 PM

Seismograph (stock image). Hydraulic fracturing triggered a series of small earthquakes in 2013 on a previously unmapped fault in Harrison County, Ohio, according to a study. Credit: © hakandogu / Fotolia
Hydraulic fracturing triggered a series of small earthquakes in 2013 on a previously unmapped fault in Harrison County, Ohio, according to a study published in the journal Seismological Research Letters (SRL).

Nearly 400 small earthquakes occurred between Oct. 1 and Dec. 13, 2013, including 10 "positive" magnitude earthquakes, none of which were reported felt by the public. The 10 positive-magnitude earthquakes, which ranged from magnitude 1.7 to 2.2, occurred between Oct. 2 and 19, coinciding with hydraulic fracturing operations at nearby wells.

This series of earthquakes is the first known instance of seismicity in the area.
Hydraulic fracturing, or fracking, is a method for extracting gas and oil from shale rock by injecting water, sand and chemicals into the rock under high pressure to create cracks that release the gas inside. The cracking of the rock itself produces micro-earthquakes, which are usually very small, with magnitudes in the range of negative 3 (-3) to negative 1 (-1).

"Hydraulic fracturing has the potential to trigger earthquakes, and in this case, small ones that could not be felt, however the earthquakes were three orders of magnitude larger than normally expected," said Paul Friberg, a seismologist with Instrumental Software Technologies, Inc. (ISTI) and a co-author of the study.

The earthquakes revealed an east-west trending fault that lies in the basement formation at approximately two miles deep and directly below the three horizontal gas wells. The EarthScope Transportable Array Network Facility identified the first earthquakes on Oct. 2, 2013, locating them south of Clendening Lake near the town of Uhrichsville, Ohio. A subsequent analysis identified 190 earthquakes during a 39-hour period on Oct. 1 and 2, just hours after hydraulic fracturing began on one of the wells.

The micro-seismicity varied, corresponding with the fracturing activity at the wells. The timing of the earthquakes, along with their tight linear clustering and similar waveform signals, suggest a unique source for the cause of the earthquakes -- the hydraulic fracturing operation. The fracturing likely triggered slip on a pre-existing fault, though one that is located below the formation expected to confine the fracturing, according to the authors.

"As hydraulic fracturing operations explore new regions, more seismic monitoring will be needed since many faults remain unmapped." Friberg co-authored the paper with Ilya Dricker, also with ISTI, and Glenda Besana-Ostman originally with Ohio Department of Natural Resources, and now with the Bureau of Reclamation at the U.S. Department of Interior.

Source: Seismological Society of America

Earthquakes in the ocean: Towards a better understanding of their precursors

Written By Unknown on Monday, November 10, 2014 | 2:04 AM

Published on 14 September in Nature Geoscience, the study conducted by researchers from several institutes, including IFREMER (French Research Institute for Exploitation of the Sea), CNRS and IFSTTAR, offers the first theoretical model that, based on fluid-related processes, explains the seismic precursors of an underwater earthquake. Using quantitative measurements, this innovative model established a link between observed precursors and the mainshock of an earthquake. The results open a promising avenue of research for guiding future investigations on detecting earthquakes before they strike.

A model specific to the submarine environment
The data used to construct the model presented in the article were collected from subsea observatories* deployed in the North-East Pacific fracture zones.

The researchers showed that the properties of the fluids that circulate in submarine fault zones change over time, during what is called the “seismic cycle”. This term describes the cycle during which strain accumulates along a fault until it exceeds the frictional forces that prevent the fault from slipping. An earthquake results at the moment of rupture, due to the sudden release of built-up strain. A new cycle begins with strain accumulating and continues until the next rupture occurs along the fault...
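The seismic cycle described here is often caricatured as a stick-slip loop: stress on the fault ramps up steadily under tectonic loading and drops abruptly when it exceeds the frictional strength. Here is a toy version of that loop; every parameter value is invented purely for illustration and has nothing to do with the faults studied in the paper.

```python
# Toy stick-slip model of the seismic cycle (illustrative parameters only)
loading_rate = 0.05      # stress added per year, arbitrary units
strength = 10.0          # frictional strength at which the fault ruptures
stress_drop = 9.0        # stress released by each earthquake

stress, year, ruptures = 0.0, 0, []
while year < 1000:
    stress += loading_rate          # slow tectonic loading
    year += 1
    if stress >= strength:          # accumulated strain exceeds friction: the fault slips
        stress -= stress_drop       # sudden release = the earthquake
        ruptures.append(year)

intervals = [b - a for a, b in zip(ruptures, ruptures[1:])]
print(f"rupture years: {ruptures}")
print(f"recurrence interval ≈ {intervals[0] if intervals else None} years")
```

Real faults are far messier, which is exactly why the authors focus on the fluid processes that perturb this simple picture just before rupture.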

Due to their proximity to mid-ocean ridges, the fluids that circulate in the faults undergo tremendous pressure and extremely high temperatures. These fluids can reach the supercritical state. The physical properties of supercritical fluids (density, viscosity, diffusivity) are intermediate to those of liquids and gases.

The compressibility of supercritical fluid varies greatly with pressure, and, according to the study’s analysis, this change in compressibility may trigger an earthquake, occurring after a short period of foreshocks.

Seismic precursors
Seismic precursors are the early warning signs before an earthquake strikes. Many different types of earthquake precursors have been studied by the scientific community: ground movements, seismic signals, fluid or gas emissions, electrical signals, thermal signals, animal behaviour, etc.
For an event as large as an earthquake, which releases a considerable amount of energy, there must be a preparatory phase. The problem in predicting earthquakes therefore does not lie in the absence of precursors (hindsight observations are numerous), but in the capacity to detect these forerunners before the mainshock.

The results of the model can help guide future research in the detection of seismic precursors with, ultimately, potential applications for earthquake prediction. Supercritical fluids require very specific conditions; they are also encountered on land in hydrothermal and volcanic areas, such as Iceland.

Details of the model
Under the effect of tectonic forces, two antagonistic effects are usually in play near transform faults. First, increasing shear stress tends to break rocks and weaken resistance along the transform fault. Second, as deformation opens up pore space between rock beds, the pressure of the fluid contained in the fault decreases. This effect acts as a stabilising suction cup, counterbalancing the 'weakening' of the rock and delaying the triggering of an earthquake.

The efficiency of this counterbalancing mechanism depends on fluid compressibility. It is highest in the presence of fluids in the liquid state, whose low compressibility causes a dramatic decrease in fluid pressure in response to small increases in volume. Conversely, for gas-like fluids, which are highly compressible, the suction-cup effect is nearly nonexistent.
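That contrast can be made concrete with the definition of isothermal compressibility, β = -(1/V)(dV/dP): for a small relative pore-volume increase ΔV/V, the fluid pressure drops by roughly (ΔV/V)/β. The sketch below compares a liquid-like and a gas-like pore fluid; the compressibility values and the 0.1 percent dilation are illustrative assumptions, not numbers from the Nature Geoscience model.

```python
def pressure_drop_mpa(relative_dilation: float, compressibility_per_pa: float) -> float:
    """Approximate pore-pressure drop for a small relative increase in pore volume.

    From beta = -(1/V) dV/dP  =>  dP ≈ -(dV/V) / beta; returned as a positive drop in MPa.
    """
    return relative_dilation / compressibility_per_pa / 1e6

dilation = 1e-3           # 0.1 percent increase in pore volume during slip (assumed)
beta_liquid = 4.5e-10     # roughly the compressibility of liquid water, 1/Pa
beta_gas_like = 1e-7      # far more compressible, gas- or supercritical-like fluid, 1/Pa

print(f"liquid-like fluid : pressure drop ≈ {pressure_drop_mpa(dilation, beta_liquid):.1f} MPa")
print(f"gas-like fluid    : pressure drop ≈ {pressure_drop_mpa(dilation, beta_gas_like):.2f} MPa")
```

With these assumed values the liquid produces a pressure drop hundreds of times larger than the gas-like fluid, which is the "suction cup" stabilisation that disappears once the fluid behaves like a gas.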

When a change in the ‘liquid-gas’ state of the fluid occurs during a fault slip, the counterbalancing mechanism fails, allowing a major shock to be triggered. This transition occurs over several days and has numerous signs, including many small foreshocks.

*Subsea observatories are comparable to a laboratory on the seafloor. Equipped with a series of instruments, they record many types of data that can be used to study the geophysical events that occur in the ocean.

Source: Institut français de recherche pour l'exploitation de la mer (Ifremer)

A global surge of great earthquakes from 2004-2014 and implications for Cascadia

The last ten years have been a remarkable time for great earthquakes. Since December 2004 there have been no fewer than 18 quakes of Mw 8.0 or greater -- a rate more than twice that seen from 1900 to mid-2004. Hundreds of thousands of lives have been lost and massive damage has resulted from these great earthquakes. But as devastating as such events can be, these recent great quakes have come with a silver lining: they coincide with unprecedented advances in the technological and scientific capacity for learning from them.
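One way to gauge how unusual an 18-in-10-years cluster is: if great quakes arrived as a Poisson process at the pre-2004 rate (taken here, purely for illustration, as about 0.85 Mw ≥ 8.0 events per year, consistent with "more than twice" the recent rate), how likely is a decade with 18 or more? The sketch below does that calculation; the background rate is an assumption, not a figure from Lay's talk, and the simple model ignores triggering and clustering between events, so it overstates how surprising the decade really is.

```python
from math import exp, factorial

def prob_at_least(k: int, expected: float) -> float:
    """P(N >= k) for a Poisson-distributed count with the given expectation."""
    return 1.0 - sum(expected ** i * exp(-expected) / factorial(i) for i in range(k))

background_rate_per_yr = 0.85      # assumed pre-2004 rate of Mw >= 8.0 events (illustrative)
window_years = 10.0
observed = 18

p = prob_at_least(observed, background_rate_per_yr * window_years)
print(f"chance of >= {observed} great quakes in {window_years:.0f} years "
      f"at the old rate: {p:.4f}")
```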

"We previously had very limited information about how ruptures grow into great earthquakes and interact with regions around them," said seismologist Thorne Lay of the University of California at Santa Cruz. "So we are using the recorded data for these recent events to guide our understanding of future earthquakes. We've gained a new level of appreciation for how one earthquake can influence events in other zones."

High on the list of areas ripe for a great quake is Cascadia, in the Pacific Northwest, where the risk of great quakes had long been underappreciated. Evidence began surfacing about 20 years ago that there had been a great quake in the region in the year 1700. Since then the view of great-quake risk in Cascadia has shifted dramatically.

"We don't know many details about what happened in 1700," said Lay. There were no instruments back then to observe and record it. And so the best way to try and understand the danger and what could happen in Cascadia is to study the recent events elsewhere.

Over the last decade Lay and his colleagues have been able to gather fine details about these giant earthquakes using data from expanded global networks of seismometers, GPS stations and tsunami gauges, and from new satellite observation capabilities such as GRACE, InSAR, and Landsat imagery. Among the broader conclusions they have come to is that great quakes are very complicated and idiosyncratic. Lay will be presenting some of those idiosyncrasies at the meeting of the Geological Society of America in Vancouver on Oct. 21.

"What we've seen is that we can have multiple faults activated," said Lay. "We've seen it off Sumatra and off Japan. Once earthquakes get going they can activate faulting in areas that were thought not physically feasible."

The great Sumatra-Andaman earthquake of Dec. 26, 2004, for instance, unzipped a 1,300-kilometer-long segment of the subduction zone and unleashed one of history's most destructive, deadly tsunamis. Much of the rupture was along a region with very limited plate convergence. In Japan, the Kuril Islands, and the Solomon Islands, great megathrust ruptures have broken portions of the subduction zones that were thought too warm or weak to experience earthquakes.

"These earthquakes ruptured right through areas that had been considered to have low risk," said Lay. "We thought that would not happen. But it did, so we have to adjust our understanding."

Perhaps the best recent analogy to Cascadia is off the coast of Iquique, Chile, said Lay. There had been a great quake in 1877, and a conspicuous gap in quakes ever since. Like the 1700 Cascadia earthquake, there is little data for the 1877 event, which killed more than 2,500 people. In both subduction zones, the converging plates are thought to be accumulating strain which could be released in a very large and violent rupture. On April 1 of this year, some of that strain was released offshore of Iquique. There was a Mw8.1 rupture in the northern portion of the seismic gap. But it involved slip over less than 20 percent of the region that seismologists believe to have accumulated strain since 1877.

"We have no idea why only a portion of the 1877 zone ruptured," said Lay. "But clearly, 80 percent of that zone is still unruptured. We don't have a good basis for assessment of how the rest will fail. It's the same for Cascadia. We don't know if it always goes all at once or sometimes in sequences of smaller events, with alternating pattern. It is prudent to prepare for the worst case of failure of the entire region in a single event, but it may not happen that way every time."

What is certain is that studying these recent big earthquakes has given geophysicists the best information ever about how such events work and points to new ways to begin understanding what could be in Cascadia's future.

Source: Geological Society of America

Hippos-Sussita excavation: Silent evidence of the earthquake of 363 CE

Image Credit: University of Haifa
Silent evidence of the large earthquake of 363 CE was uncovered by archeologists from the University of Haifa during this excavation season at Hippos-Sussita: the skeleton of a woman wearing a dove-shaped pendant, found under the tiles of a collapsed roof. They also found the large marble leg of a muscular figure and artillery ammunition from some 2,000 years ago. "The data is finally beginning to form a clear historical-archaeological picture," said Dr. Michael Eisenberg, head of the international excavation team.

The past fifteen excavation seasons at Hippos-Sussita, run by archeologists from the Zinman Institute of Archaeology at the University of Haifa, have yielded a steady flow of fascinating finds. The team digging at the city site, situated east of the Sea of Galilee in the Sussita National Park, which is managed by the Israel Nature and Parks Authority, has grown over the years, with more teams and excavators from various countries joining. This time, the security situation in the south of Israel "sent" them a Canadian team, led by Dr. Stephen Chambers, as reinforcement.

The city of Hippos-Sussita, which was founded in the second century BCE, experienced two strong and well-documented earthquakes. The first, in the year 363 CE, caused heavy damage, but the city recovered. The great earthquake of 749 CE destroyed the city, which was subsequently abandoned completely. Evidence of the extensive damage caused by the earthquake of 363 had been found in earlier seasons, but none was as violent, thrilling and eerie as the evidence discovered this year.

To the north of the basilica, the largest building in the city, which served as its commercial, economic and judicial center, the dig's senior area supervisor Haim Shkolnik and his team unearthed the remains of several skeletons crushed by the weight of the collapsed roof. Among the bones of one of the women lay a gold dove-shaped pendant.

This year, evidence was found for the first time that the great earthquake of 363 CE had also destroyed the Roman bathhouse, which was uncovered by the team led by Arleta Kowalewska from Poland. Like the basilica, it was not rebuilt. According to Dr. Eisenberg, the evidence found so far shows that the earthquake was so powerful that it completely destroyed the city, which took some twenty years to rebuild. Among the wreckage of the bathhouse, an excellent Roman marble sculpture of the muscular right leg of a man leaning against a tree trunk was found. "It is too early to determine who the man depicted in the sculpture was. It could be the sculpture of a god or an athlete; it was more than two meters tall. We hope to find more parts of the sculpture in the coming seasons to shed some light on his identity," said Dr. Eisenberg.

Excavations were also resumed in the bastion, the main defense post of the Roman-period city, built on the southern edge of the cliff. Work there focused on the fortified position of a projectile machine that launched ballista stones. Judging by the size of the chamber, the catapult was some eight meters long. So far the archeologists have found a number of ballista balls that fit this massive catapult, as well as smaller balls used in smaller ballista machines. These machines were positioned above the bastion's vaults and were used to launch basalt ballista balls, slightly smaller than soccer balls, as far as 350 meters.
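
For a sense of scale, here is a rough, hedged estimate (not from the excavation team) of the launch speed implied by that 350-meter range, using idealized vacuum ballistics with a 45-degree launch angle; a real ballista would need a somewhat higher speed once air resistance is taken into account.

```python
import math

# Rough vacuum-ballistics estimate (not from the excavation report) of the
# minimum launch speed needed to reach the reported 350 m range. Uses
# R = v**2 * sin(2*theta) / g, which is maximized at theta = 45 degrees,
# giving v_min = sqrt(R * g). Air resistance is ignored.
g = 9.81          # gravitational acceleration, m/s^2
range_m = 350.0   # maximum range reported for the ballista balls

v_min = math.sqrt(range_m * g)
print(f"Minimum launch speed: ~{v_min:.0f} m/s (~{v_min * 3.6:.0f} km/h)")
```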

A section of the western part of the city's main colonnaded street (the decumanus maximus), which traversed the city's entire 600-meter length from east to west, was excavated this year with the help of the Canadian team after their planned dig in the south was cancelled. The archeologists uncovered another original piece of the wall that supported the street columns, confirming the theory that it had been a magnificent colonnaded street similar to those of the cities of the Roman East built at the peak of the Pax Romana, the Roman era of peace during the first few centuries CE.

While working on the dig, the team also invested considerable effort in the site's conservation. "I am extremely proud that we were able to organize a sizable conservation team this year as well, from our own internal budgets and with the help of the Western Galilee College in Acre. Twenty-two students from the college's Department of Conservation, together with five experienced conservators under the direction of Julia Burdajewicz from the Academy of Fine Arts in Warsaw, conducted the conservation work. This is one of the major tourist destinations in the northern part of the country, and as such I see this as a national mission, even if the budget comes primarily from our own sources, without government support," concluded Dr. Eisenberg.

Source: University of Haifa

Radiation exposure linked to aggressive thyroid cancers, researchers confirm for the first time

Written By Unknown on Thursday, October 30, 2014 | 4:29 AM

Image: Fukushima disaster
For the first time, researchers have found that exposure to radioactive iodine is associated with more aggressive forms of thyroid cancer, according to a careful study of nearly 12,000 people in Belarus who were exposed when they were children or adolescents to fallout from the 1986 Chernobyl nuclear power plant accident.

Researchers examined thyroid cancers diagnosed up to two decades after the Chernobyl accident and found that higher thyroid radiation doses estimated from measurements taken shortly after the accident were associated with more aggressive tumor features.

"Our group has previously shown that exposures to radioactive iodine significantly increase the risk of thyroid cancer in a dose-dependent manner. The new study shows that radiation exposures are also associated with distinct clinical features that are more aggressive," said the paper's first author, Lydia Zablotska, MD, PhD, associate professor in the Department of Epidemiology and Biostatistics at UC San Francisco (UCSF). The paper will be published online in the journal Cancer.

Zablotska said the findings have implications for those exposed to radioactive iodine fallout from the 2011 nuclear reactor incidents in Fukushima, Japan, after the reactors were damaged by an earthquake-induced tsunami.

"Those exposed as children or adolescents to the fallout are at highest risk and should probably be screened for thyroid cancer regularly, because these cancers are aggressive, and they can spread really fast," Zablotska said. "Clinicians should be aware of the aggressiveness of radiation-associated tumors and closely monitor those at high risk."

Chernobyl studies led by Zablotska have also shown for the first time that exposure to radioactive iodine after the Chernobyl nuclear plant accident is associated with a whole spectrum of thyroid diseases, from benign to malignant. Benign encapsulated tumors of the thyroid gland, called follicular adenomas, are treated in the same way as thyroid cancer: the thyroid gland is removed and patients are then given pills to replace the lost hormones. Lifelong hormone supplementation is both costly and complicated for patients.

Thyroid cancer is ordinarily rare among children, with fewer than one new case per million children diagnosed each year. Among adults, about 13 new cases are diagnosed each year for every 100,000 people, according to the Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute (NCI). In the Belarus cohort, however, the researchers diagnosed 158 thyroid cancers among 11,664 subjects over three rounds of screening. Those who had received higher radiation doses were also more likely to have solid or diffuse variants of thyroid cancer, as well as more aggressive tumor features, such as spread to lymphatic vessels and several simultaneous cancer lesions in the thyroid gland.
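
To put those figures side by side, here is a crude, illustrative comparison (not a calculation from the paper); it ignores person-years of follow-up and the effect of active screening, so it only shows how far the cohort's raw proportion sits above the background rates quoted from SEER.

```python
# Crude, illustrative comparison (not a calculation from the paper): the raw
# proportion of thyroid cancers found in the Belarus screening cohort versus
# the background rates quoted from SEER. It ignores person-years of follow-up
# and the effect of active screening, so it is only a rough contrast.
cases, cohort = 158, 11_664
crude_proportion = cases / cohort
print(f"Cohort: {crude_proportion:.2%} diagnosed over three screening rounds")
print(f"        (~{crude_proportion * 100_000:.0f} per 100,000 people)")

seer_adult_per_100k_per_year = 13     # adults, per the article
seer_child_per_million_per_year = 1   # children: fewer than this, per the article
print(f"SEER adults:   ~{seer_adult_per_100k_per_year} per 100,000 per year")
print(f"SEER children: <{seer_child_per_million_per_year} per 1,000,000 per year")
```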

Source: University of California, San Francisco (UCSF)
 