The sixth extinction: Why we need to protect endangered species to save ourselves

We are in the early stages of the Earth’s sixth mass extinction, according to a study from Stanford University. And while previous extinctions have been caused by natural planetary transformations or asteroid strikes, it seems that humans may be responsible for this one.

Biologists have drawn a chilling connection between the decline of animal populations and the knock-on effects it could have on human health – with risks including plague epidemics in densely populated areas.

Up to 33% of all vertebrate species are estimated to be threatened or endangered globally. Now a team of scientists, led by Stanford biology Professor Rodolfo Dirzo, has revealed how, through a complex chain of cascading effects, large numbers of human lives could be at stake if we don’t ensure the survival of these animals.


“We tend to think that extinction is a phenomenon that will affect a particular population,” says Dirzo, who coined the term ‘defaunation’ to describe the decline of animals as a consequence of human impact. But if one animal population is driven to local extinction, the effects on the ecosystem could scale up all the way to a global level, the scientist warns.

Experiments conducted by Dirzo and his colleagues in Kenya have studied how the absence of large animals such as zebras, giraffes and elephants affects the ecosystem. They observed that affected areas are quickly overwhelmed with rodents, as seeds and shelter from grasses and shrubs become more readily available and the risk of predation drops.

Consequently, the number of rodents doubles – as does the number of disease-carrying ectoparasites they harbour. Many of the pathogens the researchers found on the rodents in Kenya pose a threat to human health, including the bacteria that cause plague.

This could cause a disastrous chain of effects, particularly in densely populated areas, says Dirzo. “Where human density is high, you get high rates of defaunation, high incidence of rodents, and thus high levels of pathogens, which increases the risks of disease transmission.”


So what can we do to prevent an apocalyptic scenario of human populations being eradicated by rodents carrying the plague?

Defaunation is driven directly by hunting, poaching and illegal trade of animals and indirectly by changes in land use, which can reduce or isolate natural habitats, preventing native species from maintaining healthy populations. In an earlier study, Dirzo and colleagues estimated that 50% of all mammal species could be at serious risk of extinction in the next 200 years.

Finding a solution is tricky, the scientist admits. Immediately reducing rates of habitat change and overexploitation would help. But these approaches need to be tailored to individual regions, and must also address the rural poverty which often drives hunting, poaching and illegal trade.

While reforestation projects are already working to reverse the catastrophic effects of declining rainforests, Dirzo says we need to create a similar process of ‘refaunation’ – the restoration of endangered animal species and their habitats.

Clearly, such a process will take time and significant changes in human habits and activities. But maybe the awareness that the ongoing mass extinction will not only affect large, charismatic animals but could also wipe out human populations will provide the incentive to spur change.



Digital expressions: New research makes virtual faces more realistic

Facial expressions of virtual characters can more realistically capture those of their human counterparts, thanks to new research from the Universidad Autónoma del Estado de México (UAEM), the Autonomous University of Mexico State.

Over the years, technology has made video games increasingly realistic, and motion capture in particular has been used to better replicate human behaviour in the virtual world.

This has been especially evident in the area of facial expressions with recent developments from companies such as Rockstar Games and Naughty Dog leading the way.

Rockstar Games’ LA Noire uses MotionScan technology to capture actors’ facial expressions, allowing players, aka detectives, to better spot a suspect’s lie. For Naughty Dog’s The Last of Us, special animation was used to capture actors’ facial muscles individually for a more realistic result.

While these developments have certainly brought a new level of reality to video games, the technology has, up to this point, been dependent on actors. But what if virtual characters could be made to be more realistic without the help of actors?


“I would like to see a character that is expressing itself, not precaptured, not generating canned expressions or creating them from semantic rules, but creating expressions by the same things that create our expressions,” says Javier von der Pahlen, director of creative research and development at the Central Studios division of Activision, as quoted by website IEEE Spectrum in May.

Writer Tekla Perry goes on to say that “a truly digital character, one that does its own acting rather than conveying the acting of another actor, will be created only by merging artificial-intelligence technology that passes the Turing test in both verbal and nonverbal responses — physical gestures, facial expressions and so on — with perfectly rendered computer graphics. That will be really hard.”

Could UAEM’s new research provide us with this kind of technology?

The university’s department of computer science teamed up with the Centro de Investigación y de Estudios Avanzados del Instituto Politécnico Nacional (Center for Research and Advanced Studies of the National Polytechnic Institute) and the University of Guadalajara, with students serving as human models.

Together, they are working on a virtual project known as ‘juego serio’ (‘serious game’) which, according to UAEM engineer Marco Antonio Ramos Corchado, is for educational, scientific and civil purposes. But the research can also be used to make the facial expressions of video game characters more realistic.


Currently, virtual characters mimic human behaviour through programmed commands or scripts, an approach which, Corchado says, results in a “robotic” reaction. UAEM’s research, on the other hand, studies the 43 muscles involved in human facial behaviour to generate more realistic expressions and emotions.

Students are fitted with tactile sensors that register tiny electrical pulses as they perform different gestures, which researchers then capture using a 3D camera.

Taking into account the influence of emotions, attitudes and moods on human behaviour, and their variations depending on social context, the information is then translated into numerical data and entered into a kinesic model designed by UAEM.

This is then used to animate virtual characters with expressions and gestures conveying emotions such as happiness, sadness, fear, anger, surprise and disgust.
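As a simple illustration of how numerical muscle data might map to named emotions, the sketch below matches a vector of muscle activations to the nearest of a few hand-picked prototypes. The muscle choices, prototype values and nearest-neighbour matching are illustrative assumptions, not UAEM’s actual kinesic model.

```python
# Illustrative sketch: classify a facial expression from muscle activations.
# Muscle names, prototype values and the nearest-neighbour rule are
# assumptions for illustration only.
import math

# Hypothetical activation prototypes (0.0-1.0) for three facial muscles:
# (zygomaticus, corrugator, frontalis).
PROTOTYPES = {
    "happiness": (0.9, 0.1, 0.2),
    "anger":     (0.1, 0.9, 0.1),
    "surprise":  (0.3, 0.1, 0.9),
}

def classify_expression(activations):
    """Return the emotion whose prototype is nearest (Euclidean) to the input."""
    return min(PROTOTYPES, key=lambda emotion: math.dist(activations, PROTOTYPES[emotion]))

print(classify_expression((0.85, 0.15, 0.25)))  # nearest to "happiness"
```

A real system would work with all 43 muscles and learned, context-dependent prototypes rather than three hand-coded vectors, but the matching idea is the same.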

Corchado and his team are busy simulating natural disasters and other situations as part of their ‘serious game’ to capture the varied range of human expressions. In the meantime, it may be time to get ready for an improved sense of reality in less serious, but equally important, games too.



Electronic nose gives robots the power to smell

A postgraduate student has designed a device that gives robots the ability to smell. This technology could save lives by helping to locate the victims of natural disasters.

To initiate her research, Blanca Lorena Villareal studied the olfactory systems of living organisms.

Animals distinguish the source of an odour by registering the concentration of the scent and the time elapsed as the strength of the odour varies.

Villareal later applied mathematics to begin transforming the ideas into robotic realities.

Using artificial intelligence algorithms, she first developed a system that could recognise the smell of alcohol. Then, she altered the algorithms and developed them further to allow the detection of other scents.


This robotic olfactory system uses chemical sensors to function as nostrils. Data is then transmitted to a computer, where it is evaluated to determine the direction and proximity of the source of the smell.

“Unlike in other olfactory systems, this has the feature that in each cycle of ventilation the air chamber empties, making sensors ready for a new measurement,” explained Villareal, who developed the electronic nose as a postgraduate at the Monterrey Institute of Technology in Mexico.

Since the system can detect changes in the direction of the odour within one cycle, the robot can quickly identify and locate its source.
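As a rough illustration of how two “nostril” readings could yield a direction and proximity estimate, the sketch below compares concentrations from a left and a right sensor. The function, threshold value and proximity measure are illustrative assumptions, not Villareal’s actual algorithm.

```python
# Illustrative sketch of differential odour sensing with two "nostrils".
# The threshold and proximity measure are assumptions for illustration only.

def locate_odour(left_ppm, right_ppm, threshold=0.05):
    """Estimate the direction of an odour source from the concentration
    difference between two chemical sensors, and its proximity from the
    combined signal strength."""
    diff = left_ppm - right_ppm
    if abs(diff) <= threshold:
        direction = "ahead"   # both sensors read roughly the same
    elif diff > 0:
        direction = "left"    # stronger reading on the left sensor
    else:
        direction = "right"   # stronger reading on the right sensor
    proximity = left_ppm + right_ppm  # stronger combined signal = closer source
    return direction, proximity

direction, proximity = locate_odour(0.42, 0.18)
print(direction)  # "left": the left sensor reads a higher concentration
```

Emptying the air chamber between ventilation cycles, as Villareal describes, is what makes each pair of readings a fresh measurement rather than a residue of the last one.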

The device is compatible with various robotic platforms so it is not limited to any single application. As a result, robotic smell-tracking technology could prove useful in a number of fields.


Because the device can recognise odours such as blood, sweat and urine, it could track people trapped in dangerous situations as a result of natural disasters.

It is already being incorporated into a project by the Mexican National Science and Technology Council to test its efficacy in emergency rescue situations.

Maybe one day the device could also be used to track intoxicated drivers and keep our roads safer, as well.

Villareal has been named one of the most innovative young Mexicans by MIT Technology Review for her contributions to robotics, and is continuing her research by developing algorithms that will widen the variety of odours the robot can recognise.

She is also working to integrate smell-sensing into the robot’s decision-making process.

In the future, smell-tracking technology could even be programmed into androids to heighten their sensing capabilities and further humanise robots, even equipping them with a sharper sense of smell than the humans who created them.



Perfecting the biochip: Shape breakthrough paves way for neural implant boom

The use of electrode-based neural implants for medical purposes is a rapidly growing field, with research underway on applications as diverse as brain repair and drug measurement.

The precise design of such implants is still up for debate, with different shapes being used by different researchers around the world.

However, the shape can have a significant impact on the quality of the connection between the implant and the cells, which can affect treatment success.

Until now, that is, because scientists from the German research institution Forschungszentrum Jülich have determined which shape ensures the best connection. The scientists found that cells did a better job of incorporating implants that had a long, thin stalk topped off with a wide cap.

This information is a significant step for the fledgling field, as it will enable a standardised approach to neural implants and boost large-scale implant development.


Understanding how cells coat foreign bodies and incorporate them was central to the research.

“In the development of nano-structured 3D surfaces for bio-electronic interfaces we use this behaviour to improve the connection between the cell membrane and the electronics,” explained Professor Andreas Offenhäusser, bioelectronics director of the Peter Grünberg Institute at Jülich.

Researchers worldwide favour different shaped nano-electrodes, so the team assessed the effectiveness of a variety of shapes using both theoretical and practical models.

Among the other shapes assessed were a mushroom-cap-like electrode and a thin column without a cap; however, the long-stalk/wide-cap combination proved most effective at keeping the gap between the cells and the electrode to a minimum.

“For a variety of applications, it is important that the cell lies very close to the electrode. Already the distance of one ten thousandth of a millimeter is enough and you cannot measure anything more,” said Offenhäusser.


Nano-electrode implants have the potential to be used for a huge range of different applications, both as treatments and to aid drug development.

Scientists could use the chips in conjunction with single cells in lab environments to determine the effectiveness or possible side-effects of drug candidates, and study the development of brain diseases more effectively.

When used as an implant, the electrodes could be used to aid a whole host of conditions, including as retinal implants to improve – and perhaps one day cure – the vision of the visually impaired.

Other uses include a host of applications for mental illnesses, with current research including treatments for depression, and for conditions such as brain damage and Parkinson’s.

In the future there are hopes that such neural implants could even replace organs or provide thought-control for the severely disabled.

Body images courtesy of Forschungszentrum Jülich.



Children train robots in school to prepare for the future workplace

Children as young as ten years old are now learning how to train robots in their schools as a result of Rethink Robotics’ expanding distribution of the Baxter Research Robot.

The Baxter Research Robot is a humanoid robot designed to help in fields such as healthcare, manufacturing and education. The robots can typically be found in laboratories, graduate and undergraduate programs, and are even being used as an educational tool in primary school classrooms.

According to Scott Eckert, the CEO of Rethink Robotics, “By the time [today’s students] enter the workforce, robots will be integrated into nearly every industry, as we see in manufacturing today.”

Indeed, robots are becoming a crucial component for automotive, plastics, electronics and various other industries that string together manual tasks to build their products, as they provide cheap and reliable labour.

The increasing use of robots in many fields has stirred fears of job displacement for human workers. But by educating children about how to operate robots and perhaps even program or design them, we can ensure that people will adapt their future careers to the use of robots.

In addition, students will learn how they can improve the robots of the future to optimize efficiency in their workplaces while maintaining a healthy job market.

“These children will have an important advantage—experience—thanks to the K-12 schools, colleges and universities that are investing in robotics now,” Eckert stated.

Rethink Robotics has recently partnered with three new distributors, Robotshop, Teq, and Gaitech International, giving classrooms around the world the opportunity to use Baxter, not just those in the US, where Rethink’s robots are designed and manufactured.

“Robotics already play a large part in the educational market and corporate [research and development] markets and that will only continue to grow. Providing Baxter Research Robots to the Asian market is a logical and important step in that growth,” said Jenssen Chang, CEO of Gaitech International.


The use of robots in the classroom points to a future where they are a normal and indispensable part of everyday life in our homes, schools and workplaces. Nearly every job could require some level of interaction with robots.

As this vision of the future becomes a reality, it will become more important than ever for all people to have baseline knowledge about robots and how they operate.

Perhaps one day robot training programs such as the one Baxter provides will even become a mandatory part of school curricula.

Images and video courtesy of Rethink Robotics.



Self-healing materials to repair iPhones and consumer culture

Self-healing materials have the potential to change our consumer culture in a significant way.

Many people have experienced the dread of dropping a smartphone only to pick it up and discover a scratched screen.

It is no surprise, then, that clumsy smartphone users everywhere rejoiced when Apple’s patent for self-healing iPhone displays was unveiled in February.

However, Apple is not the only company to develop self-healing technology. Natoco, a paint manufacturer from Japan, has developed a self-healing coating made from a polymer alloy that can be applied to different types of objects.

Natoco describes its coating in terms of two features. One is a “curling effect” that smoothes the surface of the object so that it is slippery, preventing scratches in the first place.

The second is a “trampoline effect”, a restorative feature that softens the impact on the dropped object and bounces back its energy, effectively “healing” the object.


The coating can be used for smartphones and other electronics that are easily scratched, but its applications are not limited to screens. Natoco foresees using it on vehicles, and it could also prove useful to prevent home appliances and kitchenware from being scratched.

Today, obsolescence is commonplace, and people discard objects that are still perfectly functional to replace them with the shiniest and newest models.

Smartphones are one of the most prominent examples of this throwaway mindset. If a phone gets a small scratch, it is often seen as unusable even though it still functions.

The quick cycle of buying and discarding is harmful for a number of reasons, particularly because it places no value on the resources that are used in the manufacturing of these items and ignores the resulting environmental impact.


Perhaps self-healing technology can transform our current attitude towards early obsolescence. Though not a perfect solution, people might be less likely to buy new electronics and appliances at such a speedy rate if coatings such as Natoco’s can maintain their sleek appearances by preventing and healing unsightly scratches.

By using products for their full functional lifespan, we could reduce waste and save money.

As this technology continues to develop, we will find other applications for it as well. Self-healing coatings could strengthen a wide variety of structures and objects, from aircraft to water pipes, improving their safety as well as their appearance.

Maybe one day our homes could even withstand extreme weather damage with the help of these materials.

Second body image courtesy of Matthijs Rouw.



Safety first: Automated technology moves cars closer to an accident-free future

A suite of smart technologies designed to prevent vehicle injuries is set to become the most advanced collection of automated safety features ever to come as standard on a car.

Announced by Volvo for its new XC90 SUV, the automated features include an auto-brake feature to prevent collisions at intersections and automatic safety measures in run-off road scenarios.

The technologies are part of the car company’s ambitious plan to ensure that no one is killed or seriously injured in one of its vehicles from 2020.

The automotive industry is currently making a wider move to automate vehicle features, with the intention of increasing safety and reducing the cost of insurance.


In what Volvo claims is a world first, the XC90 can identify when a vehicle is leaving the roadway and take steps to reduce injury from a run-off road collision.

Upon detecting what is happening, the car automatically tightens the front seatbelts to keep the driver and passenger in position and prevent them being thrust forwards through the windscreen. The belts remain tight while the vehicle is in motion, only loosening once it has come to a complete stop.

This alone, however, would not prevent spinal injuries that can occur when a vehicle hits hard objects or ground, so the vehicle is also equipped with energy-absorbing seats. These cushion vertical forces from impact by up to a third, significantly reducing potential injury.


In a second world-first, the vehicle is equipped with a feature to prevent collisions at intersections, such as city crossings or on highways.

Here, if a driver turns in front of an oncoming car, the automated system will kick in and apply the brakes, either preventing a collision or significantly reducing the damage from one.

The auto-braking feature, dubbed City Safety, will also make its presence known if it detects a potential collision with a cyclist, pedestrian or another vehicle, which it can identify using a highly sensitive camera that can perform in any light.

Other safety features include a strengthened frame; a 360° surround view courtesy of four fish-eye cameras; and inflatable curtains that activate in a rollover situation to prevent head injuries.

The XC90 is unlikely to be alone in offering automated protection, and this kind of technology may in the future be required by law in some regions if it is found to have a significant impact on vehicle-related deaths, in much the same way as the seatbelt was.

It will no doubt prove popular with some drivers, particularly if it results in cheaper insurance premiums, but for some this automation may just be an erosion of freedom.

Images courtesy of Volvo.

