We do not merely recognize objects - our brain is so good at this task that we can automatically supply the concept of a cup when shown a photo of a curved handle or identify a face from just an ear or nose. Neurobiologists, computer scientists, and robotics engineers are all interested in understanding how such recognition works - in both human and computer vision systems. New research by scientists at the Weizmann Institute of Science and the Massachusetts Institute of Technology (MIT) suggests that there is an "atomic" unit of recognition - a minimum amount of information an image must contain for recognition to occur.

As many as 57 different pesticides have been detected in dead European honeybees, according to a new paper in the Journal of Chromatography A.

Should that be a concern? Not really. The great thing about modern technology is that we can detect parts per trillion, orders of magnitude below levels that can be harmful. Yet proponents of low-dose effects, like environmental groups and the researchers enabling them, will want to claim that being able to detect something means it must be harmful.

In the 1990s, diagnoses of ADD (attention-deficit disorder) and then ADHD (attention-deficit hyperactivity disorder) boomed, aided by public school teachers who didn't want to deal with diverse personalities in their classrooms and by sketchy therapists exploiting the worries of parents.

It is obviously a real condition as well, but as with many mental health fads (a decade ago, for example, people declared that everyone they didn't like had Asperger's Syndrome), a lack of clinical rigor means the diagnosis gets applied in many cases where it should not be. Some reports now indicate a prevalence of up to 15 percent - but only in Western countries, where more money than sense is in evidence.

In a survey, older adults who recalled more robots portrayed in films had lower anxiety toward robots than seniors who remembered fewer robot portrayals, said S. Shyam Sundar, Distinguished Professor of Communications and co-director of the Media Effects Research Laboratory, speaking at the Human-Robot Interaction conference. The most frequently recalled robots included those from Bicentennial Man; Forbidden Planet; I, Robot; Lost in Space; Star Wars; The Terminator; Transformers; and WALL-E.

COLUMBUS, Ohio - When people playing violent video games focus on killing and maiming, they don't often remember the corporate brands they see along the way.

That's the conclusion of a new study that is one of the first to look at whether product placements in video games are an effective form of advertising.

Results showed that gamers who played with nonviolent goals recalled 51 percent more brands shown inside the game than did those playing the exact same game with violent goals.

"Killing characters in video games may be fun for players, but it appears to be bad for business," said Brad Bushman, co-author of the study and professor of communication and psychology at The Ohio State University.

Oklahoma City (March 8, 2016) - What do cancer cells and a runny nose have in common? The answer is mucus, and researchers at the Stephenson Cancer Center at the University of Oklahoma have shown it may hold the key to making cancer treatment more effective.

Most of us know about the thick, gooey stuff we blow from our noses when we have a cold. In that instance, mucus protects the normal tissue in the nose from drying out and helps the body recognize and fight off invaders like bacteria and viruses.

Educational neuroscience has little to offer schools or children's education, according to new research from the University of Bristol, UK.

In a controversial research paper published in Psychological Review, Professor Jeffrey Bowers of Bristol's School of Experimental Psychology warns that schools are investing in expensive interventions because those interventions claim a neuroscientific basis. The paper points out, however, that understanding the roles of different brain structures does not actually help teachers improve instruction or assess how children progress in a classroom setting.

HOUSTON - (March 9, 2016) - Rice University bioengineers have introduced a fast computational method to model tissue-specific metabolic pathways. Their algorithm may help researchers find new therapeutic targets for cancer and other diseases.

Metabolic pathways are immense networks of biochemical reactions that keep organisms functioning and are also implicated in many diseases.

They present the kind of challenges "big data" projects are designed to address. In recent years, computer scientists have built many ways to model these networks in humans, particularly since the 2007 introduction of the first genome-scale model of human metabolic pathways.
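
The release does not describe the Rice algorithm itself, but genome-scale metabolic models are commonly analyzed with constraint-based approaches such as flux-balance analysis. The toy sketch below is only an illustration of that general idea under stated assumptions - the two-metabolite network, the flux bounds, and the scipy-based solver are invented for demonstration and are not the authors' method.

```python
# Minimal flux-balance-analysis sketch (illustrative assumptions only):
# a metabolic network is encoded as a stoichiometric matrix S, and the
# steady-state fluxes v satisfying S @ v = 0 are optimized with a linear program.
import numpy as np
from scipy.optimize import linprog

# Toy network with two metabolites (A, B) and three reactions:
#   R1: -> A      (uptake)
#   R2: A -> B    (conversion)
#   R3: B ->      (export; treated as the objective)
# Rows = metabolites, columns = reactions.
S = np.array([
    [1, -1,  0],   # metabolite A: produced by R1, consumed by R2
    [0,  1, -1],   # metabolite B: produced by R2, consumed by R3
])

# Each flux is bounded (hypothetical capacity limits).
bounds = [(0, 10), (0, 10), (0, 10)]

# Maximize flux through R3; linprog minimizes, so negate the objective.
c = np.array([0, 0, -1])

result = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds)
print("optimal fluxes:", result.x)   # expected: [10, 10, 10]
```

In real genome-scale models the matrix spans thousands of metabolites and reactions, and tissue-specific modeling methods typically start from that full network and use expression or other data to decide which reactions should remain active for a given cell type.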

Damaging cyberattacks on a global scale continue to surface every day. Some nations are better prepared than others to deal with online threats from criminals, terrorists and rogue nations.

Data-mining experts from the University of Maryland and Virginia Tech recently co-authored a book that ranked the vulnerability of 44 nations to cyberattacks. Lead author V.S. Subrahmanian discussed this research on Wednesday, March 9 at a panel discussion hosted by the Foundation for Defense of Democracies in Washington, D.C.

The United States ranked 11th safest, while several Scandinavian countries (Denmark, Norway and Finland) ranked the safest. China, India, Russia, Saudi Arabia and South Korea ranked among the most vulnerable.

How much time and effort do you spend chewing?

Although you probably enjoy a few leisurely meals every day, chances are that you spend very little time and muscular effort chewing your food. That kind of easy eating is very unusual. For perspective, our closest relatives, chimpanzees, spend almost half their day chewing, and with much greater force.

When and how did eating become so easy? And what were its consequences?