
The Robotic Surgeon Will See You Now

Dr. Danyal Fer sat on a stool a few feet from a long-armed robot, wrapping his fingers around two metal handles near his chest.

As he moved the handles – up and down, left and right – the robot mimicked every little movement with its own two arms. Then, when he squeezed his thumb and forefinger together, one of the robot’s tiny claws did the same. Surgeons like Dr. Fer have long used robots of this kind to operate on patients: they can remove a prostate, for example, while sitting at a computer console across the room.

After this brief demonstration, Dr. Fer and his colleagues at the University of California, Berkeley, showed how they hope to advance the state of the art. Dr. Fer let go of the handles, and a new kind of computer software took over. As he and the other researchers watched, the robot began to move on its own.

With one claw, the machine lifted a tiny plastic ring from an equally small peg on the table, passed the ring from one claw to the other, moved it across the table, and carefully hooked it onto a new peg. Then the robot did the same with several more rings, completing the task as quickly as it would have under Dr. Fer’s control.

The training exercise was originally designed for people: by moving the rings from peg to peg, surgeons learn to operate robots like the one in Berkeley. According to a new research paper from the Berkeley team, an automated robot performing the test can match or even exceed a human in skill, precision, and speed.

The project is part of a much broader effort to bring artificial intelligence into the operating room. Using many of the same technologies that support self-driving cars, autonomous drones, and warehouse robots, researchers are also working to automate surgical robots. These methods are still far from everyday use, but progress is accelerating.

“It’s an exciting time,” said Russell Taylor, a professor at Johns Hopkins University and a former IBM researcher known in academia as the father of robotic surgery. “This is where I was hoping we would be 20 years ago.”

The aim is not to remove surgeons from the operating room, but to reduce their burden and possibly even increase the success rate – where there is room for improvement – by automating certain phases of the operation.

Robots can already exceed human accuracy at some surgical tasks, such as inserting a pin into a bone (a particularly risky step in knee and hip replacements). The hope is that automated robots can take on other tasks, such as incisions or suturing, with greater precision, reducing the risks that come with overworked surgeons.

During a recent phone conversation, Greg Hager, a computer scientist at Johns Hopkins, said that surgical automation would advance much like the Autopilot software that was guiding his Tesla down the New Jersey Turnpike as he spoke. The car was driving on its own, he said, but his wife still kept her hands on the wheel in case anything went wrong. And she would take over when it was time to exit the highway.

“We can’t automate the whole process, at least not without human oversight,” he said. “But we can start building automation tools that make a surgeon’s life a little easier.”

Five years ago, researchers at the Children’s National Health System in Washington, D.C., developed a robot that could automatically suture a pig’s intestines during surgery. It was a remarkable step toward the kind of future Dr. Fer envisions. But it came with an asterisk: the researchers had implanted tiny markers in the pig’s intestines that emitted near-infrared light and helped guide the robot’s movements.

The method is far from practical as the markers cannot be easily implanted or removed. In recent years, artificial intelligence researchers have greatly improved the performance of computer vision, allowing robots to perform surgical tasks on their own without such markers.

Change is driven by so-called neural networks, mathematical systems that can learn skills by analyzing large amounts of data. For example, by analyzing thousands of cat photos, a neural network can learn to recognize a cat. Similarly, a neural network can learn from images captured by surgical robots.
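To make the idea concrete, here is a minimal, hypothetical sketch of that training pattern in Python using PyTorch. The tiny network, the random stand-in data, and the “cat / not cat” labels are all illustrative assumptions, not anything from the research described in this article:

```python
import torch
import torch.nn as nn

# Hypothetical toy setup: classify 64x64 grayscale images as "cat" vs "not cat".
# Real systems use far larger networks and datasets; this only shows the pattern:
# the network adjusts its weights to reduce error on labeled examples.
model = nn.Sequential(
    nn.Flatten(),                # 64*64 image -> 4096-element vector
    nn.Linear(64 * 64, 128),     # learned feature layer
    nn.ReLU(),
    nn.Linear(128, 2),           # scores for the two classes
)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Stand-in for thousands of labeled photos: random tensors with random labels.
images = torch.randn(32, 64, 64)      # a batch of 32 fake "images"
labels = torch.randint(0, 2, (32,))   # 0 = not cat, 1 = cat

for step in range(100):               # repeated exposure to examples
    optimizer.zero_grad()
    logits = model(images)
    loss = loss_fn(logits, labels)    # how wrong the network currently is
    loss.backward()                   # compute how to nudge each weight
    optimizer.step()                  # nudge the weights
```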

Surgical robots are equipped with cameras that record three-dimensional video of each operation. The video streams into a viewfinder that surgeons peer into as they guide the operation, watching from the robot’s point of view.

Afterward, these images also serve as a detailed roadmap showing how operations are performed. They can help new surgeons understand how to use the robots, and they can help train robots to perform tasks on their own. By analyzing images of a surgeon guiding the robot, a neural network can learn the same skills.

This is how the Berkeley researchers have been working to automate their robot, which is based on the da Vinci Surgical System, a two-armed machine that helps surgeons perform more than a million procedures a year. Dr. Fer and his colleagues collected images of the robot moving the plastic rings under human control. Then their system learned from these images, pinpointing the best ways to grip the rings, pass them between claws, and move them onto new pegs.
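The pattern described here, learning actions from recorded demonstrations, is often called behavior cloning. Below is a hypothetical, simplified sketch of that idea; the network architecture, image size, and seven-number action encoding are assumptions for illustration, not the Berkeley team’s actual system:

```python
import torch
import torch.nn as nn

# Hypothetical behavior-cloning setup: each training example pairs a camera
# frame (3x96x96 RGB) with the action the human operator took at that moment,
# encoded here as a 7-dim vector (e.g. 3D position, 3D orientation, grip).
policy = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
    nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
    nn.Flatten(),
    nn.Linear(32 * 21 * 21, 128), nn.ReLU(),
    nn.Linear(128, 7),            # predicted action
)

optimizer = torch.optim.Adam(policy.parameters(), lr=1e-4)

# Stand-ins for logged demonstration data.
frames = torch.randn(16, 3, 96, 96)   # what the robot's camera saw
actions = torch.randn(16, 7)          # what the human did in response

for step in range(200):
    optimizer.zero_grad()
    predicted = policy(frames)
    loss = nn.functional.mse_loss(predicted, actions)  # imitate the human
    loss.backward()
    optimizer.step()
```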

This process came with its own asterisk, however. When the system told the robot where to move, the robot often missed the spot by millimeters. Over months and years of use, the many metal cables in the robot’s twin arms had stretched and bent in small ways, so its movements were not as precise as they needed to be.

Human operators could unconsciously compensate for this shift. But the automated system couldn’t. This is often the problem with automated technology: it struggles to deal with change and uncertainty. Autonomous vehicles are still a long way from being widespread as they are not yet nimble enough to cope with the chaos of the everyday world.

The Berkeley team decided to build a new neural network that analyzed the robot’s errors and learned how much precision it was losing each day. “It learns how the robot’s joints evolve over time,” said Brijen Thananjeyan, a doctoral student on the team. Once the automated system could account for this change, the robot could grip and move the plastic rings as well as its human operators could.
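The article does not detail the method, but one natural way to frame such a correction is to learn the residual between where the arm is told to go and where it actually ends up, then pre-compensate future commands. A hypothetical sketch, with made-up calibration data:

```python
import torch
import torch.nn as nn

# Hypothetical drift-compensation sketch: learn the offset between commanded
# and actually-reached gripper positions, then correct commands before sending.
error_model = nn.Sequential(
    nn.Linear(3, 64), nn.ReLU(),   # input: commanded (x, y, z) position
    nn.Linear(64, 3),              # output: predicted position error
)

optimizer = torch.optim.Adam(error_model.parameters(), lr=1e-3)

# Stand-ins for calibration logs: commanded targets vs. measured outcomes,
# with a small systematic drift baked in.
commanded = torch.randn(256, 3)
reached = commanded + 0.002 * torch.randn(256, 3) + 0.001

for step in range(500):
    optimizer.zero_grad()
    predicted_error = error_model(commanded)
    loss = nn.functional.mse_loss(predicted_error, reached - commanded)
    loss.backward()
    optimizer.step()

def compensated_command(target: torch.Tensor) -> torch.Tensor:
    """First-order correction: shift the command so the arm's predicted
    drift lands it closer to the intended target."""
    with torch.no_grad():
        return target - error_model(target)
```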

Other labs are trying different approaches. Axel Krieger, a Johns Hopkins researcher who was part of the 2016 pig-suturing project, is working to automate a new kind of robotic arm, one with fewer moving parts that behaves more consistently than the kind of robot used by the Berkeley team. Researchers at the Worcester Polytechnic Institute are developing ways for machines to carefully guide surgeons’ hands as they perform particular tasks, such as inserting a needle for a cancer biopsy or burning into the brain to remove a tumor.

“It’s like a car where the lane following is autonomous, but you still control the gas and the brakes,” said Greg Fischer, one of the Worcester researchers.

Scientists acknowledge that many obstacles lie ahead. Moving plastic pegs is one thing; cutting, moving, and suturing flesh is another. “What happens when the camera angle changes?” said Ann Majewicz Fey, an associate professor at the University of Texas, Austin. “What happens when smoke gets in the way?”

For the foreseeable future, automation will be something that works with surgeons rather than replacing them. But even that could have profound implications, said Dr. Fer. For example, doctors could perform operations over distances well beyond the width of the operating room – perhaps several kilometers or more – to help wounded soldiers on distant battlefields.

For now, the signal delay is too great to allow this. But if a robot could handle at least some of the tasks on its own, remote surgery could become viable, Dr. Fer said: “You could send a high-level plan and then the robot could carry it out.”

The same technology would be essential for remote operation over even greater distances. “If we humans operate on the moon,” he said, “surgeons will need entirely new tools.”


Want a New Knee or Hip? A Robot Might Help Install It

“When I started practicing 30 years ago, if someone had hip pain we would take an X-ray, and even if they had arthritis and were in their 40s, we told them to change their activity and wait,” said Dr. William Maloney, professor of orthopedic surgery at Stanford University.

No longer. “The technology has fulfilled our patients’ desire to stay active,” he said.

One of the greatest innovations came in the late 1990s and early 2000s – just in time for the marathon runners and tennis players whose joints were showing signs of wear and tear.

“The industry found a way to make the implants better,” said Robert Cohen, president of digital, robotics and enabling technologies at Stryker’s joint replacement division in Mahwah, N.J. The key, he said, was to take the polyethylene used in the implants “and subject it to a post-process of heat and radiation that made it even stronger.”

The implants made of this “highly cross-linked polyethylene” significantly reduced the need for revision surgery. “One of the main reasons for revision was the breakdown of the polyethylene in the replacement joint,” he said.

Thanks to the advent of the stronger and more durable material, he says, “We’ve all but eliminated that.”

The new implants also contributed to faster recovery times.

“When I was a resident, people were hospitalized for 10 days after a total hip or knee,” said Dr. Dorothy Scarpinato of Melville, N.Y. “Now they’re out in a day or two.” As a result, she added, “people are no longer as afraid of this operation as they used to be.”

The factors contributing to shorter hospital stays, according to Dr. Maloney, include less invasive surgery, accelerated rehabilitation protocols, better pain management methods, and the use of regional rather than general anesthesia.


LG to launch robot that disinfects surfaces amid coronavirus pandemic

LG Electronics is working on an autonomous robot that uses ultraviolet light to disinfect what the South Korean tech giant calls “high-touch, high-traffic areas”.

In an announcement this week, LG said it would roll out the technology to retail, education, hospitality, and corporate customers in the United States beginning early next year.

In a statement, Roh Kyu-chan, head of the robot business division at LG’s Business Solutions Company, said: “This autonomous UV robot comes at a time when hygiene is a top priority for hotel guests, students and restaurant customers.”

“Customers in the contactless ecosystem we now face will expect a higher standard of hygiene,” Roh said.

According to LG, its robot will use UV-C light. There are three main types of UV radiation: UV-A, UV-B and UV-C.

The US Food and Drug Administration has described the latter as “a well-known disinfectant for air, water and non-porous surfaces”.

Regarding the current pandemic, the FDA notes that there is currently “limited published data on the wavelength, dose, and duration of UVC radiation required to inactivate the SARS-CoV-2 virus”.

For many people around the world, concerns about cleanliness and hygiene have increased due to the coronavirus pandemic. There is also debate within the scientific community about the risk of inanimate object transmission.

The U.S. Centers for Disease Control and Prevention says on its website: “It may be possible that a person can get COVID-19 by touching a surface or object that has the virus on it and then touching their own mouth, nose, or possibly their eyes.”

However, it adds that touching surfaces “is not thought to be the main way the virus spreads.” The virus most commonly spreads through close contact between people, according to the CDC.

LG Electronics is one of many large organizations and companies developing technologies that focus on UV-C as a disinfectant.

In October, Transport for London announced that more than 200 devices using ultraviolet light to disinfect surfaces would be installed across London’s extensive Underground network.

TfL said the technology will be deployed on the handrails of 110 escalators over a period of several weeks.

According to the transport body, the device uses a “small dynamo” to generate electricity from the movement of the handrail, which in turn powers the UV lamp that is used to disinfect its surface.

Signify – a major player in the lighting industry – is now selling a UV-C “desk lamp” in selected Asian markets that can be used to disinfect rooms in homes.

Look, no hands

While some are turning to UV light to address concerns about cross-contamination and the spread of the virus, others are trying to put in place systems that could change the way people physically interact with public spaces.

Even before the pandemic, movement-activated taps and toilets were introduced in heavily frequented transport hubs such as train stations and airports.

GEZE UK, which specializes in technologies related to doors, windows, and security, addressed the bathroom issue earlier this month, announcing that it had developed what it calls a “hands-free toilet door kit.”

The system, which uses sensors and relies on “contactless activation,” can be fitted to the outer communal door of a public toilet.

This ensures that “those who leave the washroom do not have to touch the door after washing their hands”.