
Live adaptation of organ models in the OR

Written By Unknown on Thursday, January 8, 2015 | 3:40 AM

The non-deformed liver model (red) adapts to the deformed surface profile (blue). Credit: Graphics: Dr. Stefanie Speidel, KIT, in Medical Physics, 41
During minimally invasive operations, a surgeon has to trust the information displayed on the screen: A virtual 3D model of the respective organ shows where a tumor is located and where sensitive vessels can be found. Soft tissue, such as the tissue of the liver, however, deforms during breathing or when the scalpel is applied. Endoscopic cameras record in real time how the surface deforms, but do not show the deformation of deeper structures such as tumors. Young scientists of the Karlsruhe Institute of Technology (KIT) have now developed a real-time capable computation method to adapt the virtual organ to the deformed surface profile.

The principle appears to be simple: Based on computed tomography image data, the scientists construct a virtual 3D model of the respective organ, including the tumor, prior to the operation. During the operation, cameras scan the surface of the organ and generate a rigid profile mask. The 3D model then has to fit snugly into this virtual mold, like jelly into a given form. The Young Investigator Group of Dr. Stefanie Speidel analyzed this geometrical problem of shape adaptation from a physical perspective. "We model the surface profile as electrically negatively charged and the volume model of the organ as electrically positively charged," Speidel explains. "Now, both attract each other, and the elastic volume model slides into the immovable profile mask." The adapted 3D model then reveals to the surgeon how the tumor has moved with the deformation of the organ.
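The press release contains no code, but the core mechanism can be sketched in a few lines. In the following toy example (not the KIT implementation; the force model and every parameter value are illustrative assumptions), a chain of "positively charged" model points is pulled toward a fixed "negatively charged" surface profile, while springs between neighbouring points supply the elasticity:

    # Illustrative sketch only -- not the KIT implementation.
    import numpy as np

    x = np.linspace(0.0, 1.0, 20)
    surface = np.stack([x, 0.2 * np.sin(3.0 * x)], axis=1)   # fixed, rigid profile mask
    model = np.stack([x, np.zeros_like(x)], axis=1)          # undeformed organ outline

    k_charge, k_spring, step = 0.02, 5.0, 0.01
    rest_len = np.linalg.norm(model[1] - model[0])

    for _ in range(500):
        # Electrostatic term: every model point is pulled toward every surface
        # point with a Coulomb-like 1/r^2 force.
        diff = surface[None, :, :] - model[:, None, :]
        dist = np.linalg.norm(diff, axis=2, keepdims=True) + 1e-6
        f_attract = k_charge * (diff / dist**3).sum(axis=1)

        # Elastic term: springs between neighbouring points resist stretching.
        seg = model[1:] - model[:-1]
        seg_len = np.linalg.norm(seg, axis=1, keepdims=True)
        f_spring = k_spring * (seg_len - rest_len) * seg / seg_len
        f_elastic = np.zeros_like(model)
        f_elastic[:-1] += f_spring
        f_elastic[1:] -= f_spring

        model += step * (f_attract + f_elastic)   # gradient-descent style update

After enough iterations the "charged" model settles onto the surface profile while the springs keep its shape coherent, which is the jelly-into-a-mold behaviour described above.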

Simulations and experiments using a close-to-reality phantom liver have demonstrated that the electrostatic-elastic method works even when only parts of the deformed surface profile are available. This is the usual situation at the hospital: the human liver is surrounded by other organs and is therefore only partly visible to endoscopic cameras. "Only those structures that are clearly identified as parts of the liver by our system are assigned an electric charge," says Dr. Stefan Suwelack, who, as part of Speidel's group, wrote his Ph.D. thesis on this subject. Problems only arise if far less than half of the deformed surface is visible. To stabilize the computation in such cases, the KIT researchers can use clear reference points, such as crossing vessels. Unlike other methods, however, theirs does not rely on such reference points from the outset.

In addition, the KIT researchers' model is more precise than conventional methods because it also takes biomechanical properties of the liver into account, such as the elasticity of the tissue. The phantom liver used by the scientists, for instance, consists of two different silicones: a harder material for the capsule, i.e. the outer shell of the liver, and a softer material for the inner liver tissue.

As a result of their physical approach, the young scientists also succeeded in accelerating the computation process. Because shape adaptation is described by electrostatic and elastic energies, it can be expressed in a single mathematical formula. Using this formula, even conventional computers with a single processing unit work quickly enough for the method to be competitive. Unlike conventional computation methods, however, the new method is also suited to parallel computers. Using such a computer, the Young Investigator Group now plans to model organ deformations stably in real time.
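The article does not give the formula itself; in my own notation, a combined energy of this kind might look like the following, with the adapted model obtained as its minimizer:

    E(u) = \int_\Omega W(\varepsilon(u)) \, dV
           + \int_\Omega \int_\Gamma \frac{\rho^{+}(x + u(x)) \, \rho^{-}(y)}{\lVert x + u(x) - y \rVert} \, dA(y) \, dV(x),
    \qquad u^{*} = \arg\min_{u} E(u)

Here u is the displacement field of the organ model, Omega its volume, Gamma the scanned surface profile, W an elastic energy density, and rho+ and rho- the assigned charge densities. Because the charges have opposite signs, the second term becomes more negative as the model approaches the profile, so minimizing E pulls the elastic model into the rigid mold.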

How does the brain react to virtual reality? Completely different pattern of activity in brain

Illusions (stock image). UCLA neurophysicists have found that space-mapping neurons in the brain react differently to virtual reality than they do to real-world environments. Credit: © agsandrew / Fotolia
UCLA neurophysicists have found that space-mapping neurons in the brain react differently to virtual reality than they do to real-world environments. Their findings could be significant for people who use virtual reality for gaming, military, commercial, scientific or other purposes.

"The pattern of activity in a brain region involved in spatial learning in the virtual world is completely different than when it processes activity in the real world," said Mayank Mehta, a UCLA professor of physics, neurology and neurobiology in the UCLA College and the study's senior author. "Since so many people are using virtual reality, it is important to understand why there are such big differences."

The study was published today in the journal Nature Neuroscience.

The scientists were studying the hippocampus, a region of the brain involved in diseases such as Alzheimer's, stroke, depression, schizophrenia, epilepsy and post-traumatic stress disorder. The hippocampus also plays an important role in forming new memories and creating mental maps of space. For example, when a person explores a room, hippocampal neurons become selectively active, providing a "cognitive map" of the environment.

The mechanisms by which the brain makes those cognitive maps remain a mystery, but neuroscientists have surmised that the hippocampus computes distances between the subject and surrounding landmarks, such as buildings and mountains. But in a real maze, other cues, such as smells and sounds, can also help the brain determine spaces and distances.

To test whether the hippocampus could actually form spatial maps using only visual landmarks, Mehta's team devised a noninvasive virtual reality environment and studied how the hippocampal neurons in the brains of rats reacted in the virtual world without the ability to use smells and sounds as cues.

Researchers placed a small harness around rats and put them on a treadmill surrounded by a "virtual world" on large video screens -- a virtual environment they describe as even more immersive than IMAX -- in an otherwise dark, quiet room. The scientists measured the rats' behavior and the activity of hundreds of neurons in their hippocampi, said UCLA graduate student Lavanya Acharya, a lead author on the research.

The researchers also measured the rats' behavior and neural activity when they walked in a real room designed to look exactly like the virtual reality room.

The scientists were surprised to find that the results from the virtual and real environments were entirely different. In the virtual world, the rats' hippocampal neurons seemed to fire completely randomly, as if the neurons had no idea where the rat was -- even though the rats seemed to behave perfectly normally in the real and virtual worlds.

"The 'map' disappeared completely," said Mehta, director of a W.M. Keck Foundation Neurophysics center and a member of UCLA's Brain Research Institute. "Nobody expected this. The neuron activity was a random function of the rat's position in the virtual world."

"In fact, careful mathematical analysis showed that neurons in the virtual world were calculating the amount of distance the rat had walked, regardless of where he was in the virtual space," explained Zahra Aghajan, a UCLA graduate student and another of the study's lead authors.

They also were shocked to find that although the rats' hippocampal neurons were highly active in the real-world environment, more than half of those neurons shut down in the virtual space.

The virtual world used in the study was very similar to virtual reality environments used by humans, and neurons in a rat's brain would be very hard to distinguish from neurons in the human brain, Mehta said.

His conclusion: "The neural pattern in virtual reality is substantially different from the activity pattern in the real world. We need to fully understand how virtual reality affects the brain."

Neurons Bach would appreciate

In addition to analyzing the activity of individual neurons, Mehta's team studied larger groups of brain cells. Previous research, including studies by his group, has revealed that groups of neurons create a complex pattern using brain rhythms.

"These complex rhythms are crucial for learning and memory, but we can't hear or feel these rhythms in our brain. They are hidden under the hood from us," Mehta said. "The complex pattern they make defies human imagination. The neurons in this memory-making region talk to each other using two entirely different languages at the same time. One of those languages is based on rhythm; the other is based on intensity."

Every neuron in the hippocampus speaks the two languages simultaneously, Mehta said, comparing the phenomenon to the multiple concurrent melodies of a Bach fugue.

Mehta's group reports that in the virtual world, the language based on rhythm has a similar structure to that in the real world, even though it says something entirely different in the two worlds. The language based on intensity, however, is entirely disrupted.

When people walk or try to remember something, the activity in the hippocampus becomes very rhythmic and these complex, rhythmic patterns appear, Mehta said. Those rhythms facilitate the formation of memories and our ability to recall them. Mehta hypothesizes that in some people with learning and memory disorders, these rhythms are impaired.

"Neurons involved in memory interact with other parts of the hippocampus like an orchestra," Mehta said. "It's not enough for every violinist and every trumpet player to play their music flawlessly. They also have to be perfectly synchronized."

Mehta believes that by retuning and synchronizing these rhythms, doctors will be able to repair damaged memory, but said doing so remains a huge challenge.

"The need to repair memories is enormous," noted Mehta, who said neurons and synapses -- the connections between neurons -- are amazingly complex machines.

Previous research by Mehta showed that the hippocampal circuit rapidly evolves with learning and that brain rhythms are crucial for this process. Mehta conducts his research with rats because analyzing complex brain circuits and neural activity with high precision currently is not possible in humans.

Other co-authors of the study were Jason Moore, a UCLA graduate student; Cliff Vuong, a research assistant who conducted the research as a UCLA undergraduate; and UCLA postdoctoral scholar Jesse Cushman. The research was funded by the W.M. Keck Foundation and the National Institutes of Health.

Source: University of California - Los Angeles

Ancient auditory illusions reflected in prehistoric art?

Written By Unknown on Sunday, December 28, 2014 | 8:10 PM

Prehistoric paintings of hoofed animals in a cave with thunderous reverberation at Bhimbetka, India. Credit: S. Waller
During the 168th Meeting of the Acoustical Society of America (ASA), to be held October 27-31, 2014 at the Indianapolis Marriott Downtown Hotel, Steven J. Waller, of Rock Art Acoustics, will describe several ways virtual sound images and absorbers can appear supernatural.

"Ancient mythology explained echoes from the mouths of caves as replies from spirits, so our ancestors may have made cave paintings in response to these echoes and their belief that echo spirits inhabited rocky places such as caves or canyons," explained Waller.

Just as light reflection gives an illusion of seeing yourself duplicated in a mirror, sound waves reflecting off a surface are mathematically identical to sound waves emanating from a virtual sound source behind a reflecting plane such as a large cliff face. "This can result in an auditory illusion of somebody answering you from within the rock," Waller said.
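As a rough illustration (the distances are made up), the geometry behind this illusion can be checked in a few lines: the echo delay from a reflecting cliff is exactly the delay one would compute for a mirror-image source placed behind it.

    # Toy illustration of the virtual-source idea (assumed distances).
    c = 343.0                       # speed of sound in air, m/s
    cliff_x = 50.0                  # vertical cliff face 50 m away
    speaker = listener = 0.0        # person clapping and listening at x = 0

    reflected_path = 2 * (cliff_x - speaker)        # out to the cliff and back
    virtual_source = 2 * cliff_x - speaker          # mirror image behind the cliff
    virtual_path = abs(virtual_source - listener)   # straight line from the "spirit"

    print(reflected_path / c, virtual_path / c)     # both ~0.29 s: indistinguishable

Because the two delays are identical, a listener has no acoustic way to tell a reflection from a real voice answering from inside the rock.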

Echoes of clapping can sound similar to hoof beats, as Waller pointed out, while multiple echoes within a cavern can blur together into a thunderous reverberation that mimics the sound of a herd of stampeding hoofed animals.

"Many ancient cultures attributed thunder in the sky to 'hoofed thunder gods,' so it makes sense that the reverberation within the caves was interpreted as thunder and inspired paintings of those same hoofed thunder gods on cave walls," said Waller. "This theory is supported by acoustic measurements, which show statistically significant correspondence between the rock art sites and locations with the strongest sound reflection."

Other acoustical characteristics may also have been misinterpreted by ancient cultures unaware of sound wave theory. Waller noticed a resemblance between an interference pattern and Stonehenge, so he recreated such a pattern in an open field with just two flutes "droning the same note" to explore what it would sound like.

"The quiet regions of destructive sound wave cancellation, in which the high pressure from one flute cancelled the low pressure from the other flute, gave blindfolded subjects the illusion of a giant ring of rocks or 'pillars' casting acoustic shadows," Waller said.

He traveled to England and demonstrated that Stonehenge does indeed radiate acoustic shadows that recreate the same pattern as interference. "My theory that musical interference patterns served as blueprints for megalithic stone circles -- many of which are called Pipers' Stones -- is supported by ancient legends of two magic pipers who enticed maidens to dance in a circle and turned them all into stones," Waller noted.

There are several important implications of Waller's research. Perhaps most significantly, it demonstrates that acoustical phenomena were culturally significant to early humans -- leading to the immediate conclusion that the natural soundscapes of archaeological sites should be preserved in their natural state for further study and greater appreciation.

"Even today, sensory input can be used to manipulate perception and lead to illusions inconsistent with scientific reality, which could have interesting practical applications for virtual reality and special effects in entertainment media," Waller said. "Objectivity is questionable, because a given set of data can be used to support multiple conclusions."

The history of humanity is full of such misinterpretations, such as the visual illusion that the sun moves around the earth. "Sound, which is invisible and has complex properties, can easily lead to auditory illusions of the supernatural," he added. "This, in turn, leads to the more general question: what other illusions are we living under due to other phenomena that we are currently misinterpreting?"

Presentation #2aAA11, "Virtual sound images and virtual sound absorbers misinterpreted as supernatural objects," by Steven J. Waller will take place on Tuesday, October 28, 2014. The abstract can be found by searching for the presentation number here: https://asa2014fall.abstractcentral.com/planner.jsp

Source: Acoustical Society of America (ASA)
 