
Virtual reality speeds up rehabilitation: Integrating force feedback into therapies

Written By Unknown on Sunday, January 18, 2015 | 4:47 PM

A child is receiving virtual door opening training under the guidance of a therapist. Credit: Copyright The Hong Kong Polytechnic University
The Hong Kong Polytechnic University has successfully developed a novel training programme using haptic technology for impaired hands that cannot function normally. The programme is unique in that it provides force feedback, conveying a true sense of weight to the user through the control device.

Our hands are essential to our lives; we need them in all daily tasks including eating, bathing and getting dressed. However, even the simplest tasks are challenging for people with impaired hands due to various conditions such as cerebral palsy, stroke and ageing. Fortunately, they will soon benefit from a new training technology which may greatly improve their conditions.

In response to these therapeutic needs, a computerized training programme for impaired hands has been developed at the School of Nursing of The Hong Kong Polytechnic University. Patients exercise their hands by playing a series of well-designed computer games that simulate everyday tasks, such as opening a locked door with a key or pouring tea into a cup. While they play, their hand movements are monitored and recorded by a haptic device, which is connected to the control unit held by the patient at one end and to a computer at the other. The haptic device feeds the data into the computer, so the patient's actions are instantly reflected in the animation on screen.
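
The sensing loop described here (device position in, on-screen animation out) can be sketched as follows. This is an illustrative sketch only; the class and function names are hypothetical, not the actual PolyU software.

```python
# Illustrative sketch of the haptic sensing loop: the device reports
# the hand's position each frame, the position is logged for the
# therapist, and the on-screen animation is updated to mirror it.
# All names here are hypothetical, not the actual PolyU system.

class HapticDevice:
    """Stand-in for the real device: reports the stylus position."""
    def __init__(self):
        self._t = 0

    def read_position(self):
        self._t += 1
        return (0.1 * self._t, 0.0, 0.0)  # simulated hand movement

def training_loop(device, frames):
    """Record each hand position and mirror it in the virtual scene."""
    trace = []
    for _ in range(frames):
        pos = device.read_position()   # device -> computer
        trace.append(pos)              # performance log for review
        # render_scene(pos) would update the animation here
    return trace

trace = training_loop(HapticDevice(), 3)
```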

In addition, the haptic technology the programme employs is more true-to-life than similar programmes, as feedback is provided through force applied by the control unit to the player. For example, players can literally feel the weight of a simulated bottle diminish as the water is poured out. Such precision will greatly enhance training effectiveness and improve the patient's coordination.
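
As a hedged sketch (assumed names and numbers, not the actual PolyU code), the diminishing-weight effect can be modelled by rendering a force equal to the weight of the bottle plus the water remaining in it:

```python
# Illustrative sketch of force feedback for the pouring task: the
# force rendered to the hand is the weight of the bottle plus the
# remaining water, so the felt load diminishes as the virtual water
# is poured out. Masses and flow rate are made-up example values.

G = 9.81  # gravitational acceleration, m/s^2

def rendered_force(bottle_mass, water_mass):
    """Downward force (N) the haptic device should render."""
    return (bottle_mass + water_mass) * G

def pour(water_mass, flow_rate, dt):
    """Remaining water mass (kg) after one time step of pouring."""
    return max(0.0, water_mass - flow_rate * dt)

water = 0.5          # kg of water in the bottle
forces = []
for _ in range(5):
    forces.append(rendered_force(0.2, water))
    water = pour(water, 0.1, 1.0)  # pour out 0.1 kg per second
```

Each frame, the device is commanded with the current force, so the hand feels the bottle getting lighter.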

Game-based therapies are highly motivating. Firstly, playing 3D games with colourful animation is more interesting than monotonous physical exercises. Secondly, a reward system incorporated in the programme fuels a sense of competition and accomplishment. "Our games are designed to be engaging. When players make successful attempts, they get bonus points. And as they win, they move on to the next level, where more attractive rewards are waiting," said Dr Kup-sze Choi, the leader of the research team. Working their way up is satisfying and keeps players going with the therapy, thereby improving their hand functions.

Compared to physical training, computer simulated training is a safer option when sharp or breakable objects are involved, making it possible to practise preparing simple meals with a knife. It is also less likely to be interrupted by undesired circumstances. Dr Choi explained, "For instance, the hands of cerebral palsy sufferers are usually stiff, weak and prone to uncontrolled movements. If they practise pouring real tea in repeated sessions, they may make spills all over the place and end up soaking wet, requiring the healthcare workers to clean up the mess. That is not good for either the trainee or the trainer." With computer simulation, there are no such interruptions.

To cater to different degrees of disability, the programme has a built-in difficulty setting that can be adjusted at the touch of a button. Therapists can also monitor their patients' progress easily, as the system keeps track of their movements and performance.

The effectiveness of this training programme has been preliminarily confirmed: a similar tool aimed at improving handwriting was tested on children at the Hong Kong Red Cross Princess Alexandra School. The results showed a marked reduction in the time they needed to complete the task after two weeks of training. More tests and trials are on the way, and the team expects that a longer period of computer-assisted training will yield greater benefits. The training system has already won a Silver Medal at the 42nd International Exhibition of Inventions of Geneva in Switzerland.

According to Dr Choi, computer simulated training using haptic technology will widen access to rehabilitation and help more patients with impaired hands. In the future, the team will work on combining this computer-aided rehabilitation programme with traditional therapy in order to optimize the training system and benefit more patients.

The prototype of the haptic platform customized for self-care training. Copyright: The Hong Kong Polytechnic University
The haptic platform technology developed by Dr Kup-sze Choi and his team has won a Silver Medal at the 42nd International Exhibition of Inventions of Geneva. Copyright: The Hong Kong Polytechnic University

Printing in the hobby room: Paper-thin and touch-sensitive displays on various materials

Written By Unknown on Wednesday, January 14, 2015 | 3:31 AM

Paper-thin and touch-sensitive displays on various materials.
Credit: Image courtesy of Saarland University
Until now, if you wanted to print a greeting card for a loved one, you could use colorful graphics, fancy typefaces or special paper to enhance it. But what if you could integrate paper-thin displays into the cards, which could be printed at home and which would be able to depict self-created symbols or even react to touch? Those are only some of the options computer scientists in Saarbrücken can offer. They have developed an approach that in the future will enable laypeople to print displays in any desired shape on various materials, and that could therefore change everyday life completely.

For example: A postcard depicts an antique car. If you press a button, the back axle and the steering wheel rim light up in the same color. Two segments on a flexible display, which have the same shape as those parts of the car, create this effect. Computer scientists working with Jürgen Steimle printed the postcard using an off-the-shelf inkjet printer. It is electroluminescent: if it is connected to an electric voltage, it emits light. This effect is also used to light car dashboards at night.

Steimle is leader of the research group "Embodied Interaction" at the Cluster of Excellence "Multimodal Computing and Interaction." Simon Olberding is one of his researchers. "Until now, this was not possible," explains Olberding. "Displays were mass-produced, they were inflexible, they always had a rectangular shape." Olberding and Steimle want to change that. The process they developed works as follows: the user designs a digital template for the display they want to create, using programs like Microsoft Word or PowerPoint.

Using the methods the computer scientists from Saarbrücken developed, called "Screen Printing" and "Conductive Inkjet Printing," the user can then print those templates. Both approaches have strengths and weaknesses, but a single person can carry them out in anywhere from a few minutes to two to four hours. The results are relatively high-resolution displays with a thickness of only 0.1 millimeters. Printing on a DIN A4 page costs around €20; the most expensive part is the special ink. Since the method can be used to print on materials like paper, synthetic material, leather, pottery, stone, metal and even wood, both two-dimensional and three-dimensional shapes can be realized. The display can consist of one segment (surface, shape, pattern, raster graphics), several segments, or variously constructed matrices. "We can even print touch-sensitive displays," says Olberding.

The possibilities for the user are various: displays can be integrated into almost every object in daily life -- users can print not only on paper objects, but also on furniture or decorative accessories, bags or wearable items. For example, the strap of a wristwatch could be upgraded so that it lights up if a text message is received. "If we combine our approach with 3D printing, we can print three-dimensional objects that display information and are touch-sensitive," explains Steimle.

Live adaptation of organ models in the OR

Written By Unknown on Thursday, January 8, 2015 | 3:40 AM

The non-deformed liver model (red) adapts to the deformed surface profile (blue). Credit: Graphics: Dr. Stefanie Speidel, KIT, in Medical Physics, 41
During minimally invasive operations, a surgeon has to trust the information displayed on the screen: A virtual 3D model of the respective organ shows where a tumor is located and where sensitive vessels can be found. Soft tissue, such as the tissue of the liver, however, deforms during breathing or when the scalpel is applied. Endoscopic cameras record in real time how the surface deforms, but do not show the deformation of deeper structures such as tumors. Young scientists of the Karlsruhe Institute of Technology (KIT) have now developed a real-time capable computation method to adapt the virtual organ to the deformed surface profile.

The principle appears to be simple: based on computed tomography image data, the scientists construct a virtual 3D model of the respective organ, including the tumor, prior to the operation. During the operation, cameras scan the surface of the organ and generate a stiff profile mask. The 3D model is then to fit snugly into this virtual mold, like jelly into a given form. The Young Investigator Group of Dr. Stefanie Speidel analyzed this geometrical problem of shape adaptation from a physical perspective. "We model the surface profile as negatively charged and the volume model of the organ as positively charged," Speidel explains. "The two then attract each other, and the elastic volume model slides into the immovable profile mask." The adapted 3D model then reveals to the surgeon how the tumor has moved with the deformation of the organ.
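
The attraction-plus-elasticity idea can be illustrated with a toy one-dimensional sketch (illustrative only, not the KIT implementation; correspondence between model nodes and surface points is assumed known here, and the constants are made up):

```python
# Toy 1-D sketch of the electrostatic-elastic idea: each model node
# carries positive charge and is attracted to its negatively charged
# surface point, while springs between neighbouring nodes resist
# distortion of the model. Repeated steps relax the combined energy.

def relax_step(nodes, surface, k_attract=0.3, k_elastic=0.2, rest=1.0):
    """One gradient-descent step on the combined energy: an
    'electrostatic' pull toward the matching surface point plus
    elastic springs (rest length `rest`) between chain neighbours."""
    new = []
    for i, x in enumerate(nodes):
        force = k_attract * (surface[i] - x)       # attraction term
        for j in (i - 1, i + 1):                   # elastic term
            if 0 <= j < len(nodes):
                force += k_elastic * ((nodes[j] - x) - (j - i) * rest)
        new.append(x + force)
    return new

nodes = [0.0, 1.0, 2.0, 3.0]     # undeformed 1-D "organ" model
surface = [0.6, 1.6, 2.6, 3.6]   # deformed surface profile
for _ in range(60):
    nodes = relax_step(nodes, surface)
# nodes now lie on the deformed profile while keeping their spacing
```

The real method works on a full 3D volume mesh and does not assume known correspondences; this sketch only shows how attraction and elasticity trade off.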

Simulations and experiments using a close-to-reality phantom liver have demonstrated that the electrostatic-elastic method works even when only parts of the deformed surface profile are available. This is the usual situation at the hospital: the human liver is surrounded by other organs and, hence, is only partly visible to endoscopic cameras. "Only those structures that are clearly identified as parts of the liver by our system are assigned an electric charge," says Dr. Stefan Suwelack, who, as part of Speidel's group, wrote his Ph.D. thesis on this subject. Problems only arise if far less than half of the deformed surface is visible. To stabilize the computation in such cases, the KIT researchers can use clear reference points, such as crossing vessels. In contrast to other methods, however, theirs does not rely on such references from the outset.

In addition, the model of the KIT researchers is more precise than conventional methods, because it also considers biomechanical properties of the liver, such as the elasticity of the tissue. For instance, the phantom liver used by the scientists consists of two different silicones: a harder material for the capsule, i.e. the outer shell of the liver, and a softer material for the inner liver tissue.

As a result of their physical approach, the young scientists also succeeded in accelerating the computation. Because shape adaptation is described by electrostatic and elastic energies, they could capture it in a single mathematical formula. Using this formula, even conventional computers with a single processing unit run quickly enough for the method to be competitive. Unlike conventional computation methods, however, the new method is also suited to parallel computers. Using such a computer, the Young Investigator Group now plans to model organ deformations stably in real time.
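
In generic notation (our own, not necessarily the exact form the KIT team uses), such a combined functional amounts to a single minimization over the displacement field $u$ of the organ model:

```latex
\min_{u}\; E(u) \;=\; E_{\mathrm{elastic}}(u) \;+\; E_{\mathrm{electrostatic}}(u)
```

The elastic term penalizes deformation of the volume model, while the electrostatic term rewards contact between the charged model surface and the oppositely charged profile mask; because both terms can be evaluated per node, the minimization lends itself to parallel hardware.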

How does the brain react to virtual reality? Completely different pattern of activity in brain

Illusions (stock image). UCLA neurophysicists have found that space-mapping neurons in the brain react differently to virtual reality than they do to real-world environments. Credit: © agsandrew / Fotolia
UCLA neurophysicists have found that space-mapping neurons in the brain react differently to virtual reality than they do to real-world environments. Their findings could be significant for people who use virtual reality for gaming, military, commercial, scientific or other purposes.

"The pattern of activity in a brain region involved in spatial learning in the virtual world is completely different than when it processes activity in the real world," said Mayank Mehta, a UCLA professor of physics, neurology and neurobiology in the UCLA College and the study's senior author. "Since so many people are using virtual reality, it is important to understand why there are such big differences."

The study was published today in the journal Nature Neuroscience.

The scientists were studying the hippocampus, a region of the brain involved in diseases such as Alzheimer's, stroke, depression, schizophrenia, epilepsy and post-traumatic stress disorder. The hippocampus also plays an important role in forming new memories and creating mental maps of space. For example, when a person explores a room, hippocampal neurons become selectively active, providing a "cognitive map" of the environment.

The mechanisms by which the brain makes those cognitive maps remain a mystery, but neuroscientists have surmised that the hippocampus computes distances between the subject and surrounding landmarks, such as buildings and mountains. But in a real maze, other cues, such as smells and sounds, can also help the brain determine spaces and distances.

To test whether the hippocampus could actually form spatial maps using only visual landmarks, Mehta's team devised a noninvasive virtual reality environment and studied how the hippocampal neurons in the brains of rats reacted in the virtual world without the ability to use smells and sounds as cues.

Researchers placed a small harness around rats and put them on a treadmill surrounded by a "virtual world" on large video screens -- a virtual environment they describe as even more immersive than IMAX -- in an otherwise dark, quiet room. The scientists measured the rats' behavior and the activity of hundreds of neurons in their hippocampi, said UCLA graduate student Lavanya Acharya, a lead author on the research.

The researchers also measured the rats' behavior and neural activity when they walked in a real room designed to look exactly like the virtual reality room.

The scientists were surprised to find that the results from the virtual and real environments were entirely different. In the virtual world, the rats' hippocampal neurons seemed to fire completely randomly, as if the neurons had no idea where the rat was -- even though the rats seemed to behave perfectly normally in the real and virtual worlds.

"The 'map' disappeared completely," said Mehta, director of a W.M. Keck Foundation Neurophysics center and a member of UCLA's Brain Research Institute. "Nobody expected this. The neuron activity was a random function of the rat's position in the virtual world."

"In fact, careful mathematical analysis showed that neurons in the virtual world were calculating the amount of distance the rat had walked, regardless of where he was in the virtual space," explained Zahra Aghajan, a UCLA graduate student and another of the study's lead authors.

They also were shocked to find that although the rats' hippocampal neurons were highly active in the real-world environment, more than half of those neurons shut down in the virtual space.

The virtual world used in the study was very similar to virtual reality environments used by humans, and neurons in a rat's brain would be very hard to distinguish from neurons in the human brain, Mehta said.

His conclusion: "The neural pattern in virtual reality is substantially different from the activity pattern in the real world. We need to fully understand how virtual reality affects the brain."

Neurons Bach would appreciate

In addition to analyzing the activity of individual neurons, Mehta's team studied larger groups of the brain cells. Previous research, including studies by his group, has revealed that groups of neurons create a complex pattern using brain rhythms.

"These complex rhythms are crucial for learning and memory, but we can't hear or feel these rhythms in our brain. They are hidden under the hood from us," Mehta said. "The complex pattern they make defies human imagination. The neurons in this memory-making region talk to each other using two entirely different languages at the same time. One of those languages is based on rhythm; the other is based on intensity."

Every neuron in the hippocampus speaks the two languages simultaneously, Mehta said, comparing the phenomenon to the multiple concurrent melodies of a Bach fugue.

Mehta's group reports that in the virtual world, the language based on rhythm has a similar structure to that in the real world, even though it says something entirely different in the two worlds. The language based on intensity, however, is entirely disrupted.

When people walk or try to remember something, the activity in the hippocampus becomes very rhythmic and these complex, rhythmic patterns appear, Mehta said. Those rhythms facilitate the formation of memories and our ability to recall them. Mehta hypothesizes that in some people with learning and memory disorders, these rhythms are impaired.

"Neurons involved in memory interact with other parts of the hippocampus like an orchestra," Mehta said. "It's not enough for every violinist and every trumpet player to play their music flawlessly. They also have to be perfectly synchronized."

Mehta believes that by retuning and synchronizing these rhythms, doctors will be able to repair damaged memory, but said doing so remains a huge challenge.

"The need to repair memories is enormous," noted Mehta, who said neurons and synapses -- the connections between neurons -- are amazingly complex machines.

Previous research by Mehta showed that the hippocampal circuit rapidly evolves with learning and that brain rhythms are crucial for this process. Mehta conducts his research with rats because analyzing complex brain circuits and neural activity with high precision currently is not possible in humans.

Other co-authors of the study were Jason Moore, a UCLA graduate student; Cliff Vuong, a research assistant who conducted the research as a UCLA undergraduate; and UCLA postdoctoral scholar Jesse Cushman. The research was funded by the W.M. Keck Foundation and the National Institutes of Health.

Source: University of California - Los Angeles

New technology allows medical professionals to step into their patients' shoes

Transports is a piece of technology that uses a low cost Raspberry Pi computer system, and allows users to recreate symptoms including dizziness and speech problems, along with wearable technology which creates a 6Hz tremor in the participant's right hand. Credit: Image courtesy of Royal Holloway, University of London
A pioneering piece of technology will allow users to experience the world through the eyes of a person with Young-Onset Parkinson's disease -- which could revolutionise the way carers and medical staff treat people with the degenerative condition.

Analogue, a theatre company set up by alumni of Royal Holloway, University of London, has designed Transports -- a piece of technology which uses a low cost Raspberry Pi computer system, and allows users to recreate symptoms including dizziness and speech problems, along with wearable technology which creates a 6Hz tremor in the participant's right hand.

Young-Onset Parkinson's is a form of the neurological condition which affects people aged under 50. Symptoms of the disease include tremors, slowed movement and falls, and with no cure currently available, how care and treatment are managed can make a significant difference to a patient's quality of life.
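
As a hedged illustration (not Analogue's actual code), the 6Hz tremor could be produced by sampling a 6 Hz sine wave and converting it into a duty-cycle signal that a Raspberry Pi's PWM output feeds to a small vibration motor in the wearable:

```python
# Hypothetical sketch of generating a 6 Hz tremor signal for a
# wearable vibration motor. On a Raspberry Pi these duty cycles
# would drive the motor via PWM; here we only compute the waveform.

import math

TREMOR_HZ = 6.0
SAMPLE_RATE = 120  # samples per second (20 samples per tremor cycle)

def tremor_duty_cycle(t):
    """Motor duty cycle in [0, 1] at time t seconds: 6 Hz oscillation."""
    return 0.5 * (1.0 + math.sin(2 * math.pi * TREMOR_HZ * t))

# One second of samples: six full oscillations of the motor drive.
samples = [tremor_duty_cycle(i / SAMPLE_RATE) for i in range(SAMPLE_RATE)]
```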

The revolutionary project has been designed in collaboration with neuroscience specialist, Professor Narender Ramnani from the Department of Psychology at Royal Holloway, along with carers, researchers and people with the disease, at Parkinson's UK to ensure it is as effective and realistic as possible.

Liam Jarvis, Co-director of Analogue and Drama PhD Student at Royal Holloway, said: "Our principal interest is to work out how we can improve and facilitate communication and empathy by using simple technologies to immerse participants in the remote embodied experiences of others. Using inexpensive Raspberry Pi technology, we hope to expand the project to place participants inside different virtual subjects as an aid to better understand the experience of others."

The technology will now be tested with BSc psychology students at Royal Holloway and carers at Parkinson's UK to see how it can help medical practitioners better understand their patients and in the long term improve the treatment of the disease.

Anna Farrer, User Involvement Advisor at Parkinson's UK, said: "We know that people with Parkinson's feel that better public awareness about the condition would mean that they face less discrimination and have a better quality of life. Projects such as Transports have an important role in educating the public, raising awareness -- and we hope, changing attitudes."

Source: Royal Holloway, University of London
 