Betting Seed: How Birds Give in to Gambling

Humans may not be the only animals who go for the jackpot. Illustration by Emily Maute

By Fink Densford
BU News Service

Pigeons are not stars of the animal kingdom. They aren’t featured alongside the patriotic bald eagle, the proud grizzly bear or the noble wolf. Mostly we see them on the streets. We spend money and time trying to keep the wrong ones away, and others we happily serve private banquets to – usually, in public parks.

At heart, however, pigeons are a lot like humans. Like us, they thrive in urban environments and stick together. And beyond the fact that their perky, rolling walk resembles elements of our own locomotion, they may have what was previously thought of as a uniquely human flaw – the urge to gamble.

In fact, new research is showing that pigeons, along with other members of the animal kingdom, may be as susceptible as humans to the demons of rolling the dice – even when it’s to their own detriment.

Research on this devious behavior began in 2009, when Dr. Thomas Zentall, a professor at the University of Kentucky, asked pigeons to make a simple choice. He gave 16 stark white Carneaux pigeons (more often seen in gourmet kitchens than in laboratories) the chance to gamble. He and his colleagues shuffled the birds into metal boxes the size of large ice chests and presented them with a set of lights they could peck in hopes of being rewarded with birdseed.

In this initial experiment, Zentall asked the pigeons to choose between two options in the form of lit-up, round buttons – about the size you’d see in an elevator. One button offered a chance to gamble. If a bird pecked it, there was a 50 percent chance it would be shown a green light, symbolizing a win and rewarding the bird with seed, and a 50 percent chance it would see a red light, which symbolized a loss and ended with no seed.

The other button, if chosen, produced one of two random lights – yellow or blue – both of which symbolized a win and rewarded the birds with seed 100 percent of the time. This was the steady option: it always rewarded the pigeons for their effort, with no chance of missing out on seed.

Zentall noted that the button colors in all experiments shifted during different trials with different birds, to make sure that there wasn’t simply a color preference at the heart of their decision-making.

Paradoxically, the birds chose to peck the gambling light 69 percent of the time. Thirteen of the 16 birds showed a preference for this risky option, which paid off only half the time and so ended up giving them less seed.

What drives a pigeon to gamble? Illustration by Emily Maute

The results were unexpected. Zentall said that while he had considered the possibility that the birds may choose poorly, he hadn’t expected to see such a preference for the gambling-like option. But while the results were interesting, he and his lab would need more data to show that the birds were more interested in the risky reward simply because it was a gamble.

In 2010, Zentall gave the pigeons a more human-like opportunity. Instead of a sure payout or none at all, they would get the chance to shoot for a jackpot.

In a small metal box – this time with a more sophisticated pellet delivery system – the pigeons would be asked to choose between two lit-up buttons. The button on the left was a gamble. When they pecked it, there was a two-in-ten chance they’d get a “green” light, resulting in a payout of ten food pellets. But eight out of ten times, the pigeons would get a “red” light that gave them no food, and symbolized a loss. The button on the right represented a non-gambling option. After pecking that button, the pigeons would be shown another colored button, and after another peck the birds would always end up with three pellets.

Again, the birds, like many people, seemed happy to go for the long shot. In this experiment, 89 percent of the pigeons chose the worse option, betting on a one-in-five chance at 10 pellets over a sure three. The reason, says Zentall, may lie in the association the birds formed with the light that signaled a chance at the jackpot.
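The arithmetic behind “worse” is simple enough to check on the back of an envelope. The sketch below is only an illustration of the expected-value math implied by the odds and payouts described above, not anything from the lab’s own analysis:

```python
# A rough expected-value check of the two buttons, using only the odds and
# payouts described in the article (an illustration, not the lab's code).

p_jackpot = 2 / 10        # chance the gamble button turns up the winning light
jackpot = 10              # pellets paid on a win; a loss pays nothing
sure_thing = 3            # pellets the non-gambling button always delivers

expected_gamble = p_jackpot * jackpot
print(expected_gamble, sure_thing)  # 2.0 vs 3 -- the steady button pays 50% more on average
```

On average, the gamble surrenders a full pellet per choice – and the birds took it anyway.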

At the heart of why pigeons might be so driven to choose the riskier option is a concept known as “contrast.” Kristina Pattison, a graduate researcher in the comparative cognition laboratory at the University of Kentucky, explained that in humans, the chance of no payout in gambling keeps expectations low. When expectations and emotional state are low, rewards feel more rewarding. It’s like a cup of water – a normal emotional state would be half full, a low one empty and a high one filled to the brim. Getting from empty to full takes more water, and feels more ‘rewarding’ to humans and animals. This is often known as the ‘contrast effect.’
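One way to make the contrast idea concrete is a toy model in which the “feel” of a payout is simply the payout minus what was expected going into it. This is an illustration of the concept only, with made-up units, not a model the researchers use:

```python
# Toy illustration of the contrast effect: the subjective punch of an outcome
# is treated as the outcome minus the expectation carried into it.

def felt_reward(outcome, expectation):
    """How rewarding a payout feels relative to what was anticipated."""
    return outcome - expectation

# Gamble button: losses keep expectations low (about 2 pellets on average),
# so the rare 10-pellet jackpot lands with a big positive jolt.
print(felt_reward(10, 2))   # 8

# Steady button: the bird expects 3 pellets and gets 3, so the payout barely registers.
print(felt_reward(3, 3))    # 0
```

In this toy accounting, the rare jackpot delivers a jolt the steady payout never can, even though the steady button earns more seed over time.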

“You don’t see people crowding around soda machines, eager to feed it dollars to get sodas, or crowding around change machines waiting to get four quarters,” said Pattison. “But you do see them sitting for hours at slot machines.” Gamblers are known to often downplay losses, and slot machines do their best to help them forget, with no shows of light and sound to announce the loss. The event isn’t as impactful, and the player is urged to try again.

So why would pigeons be so inclined to make such a sour decision? Is this the kind of thing they encounter in their environment naturally? Dr. Zentall doesn’t think so. “Nothing in nature is anything like the randomness of gambling,” said Zentall. The skewed probabilities of gambling, both in his experiments and in the kind that hooks humans, are not something animals have had to adapt to in the wild.

The mystery of the gambling bird goes deeper than contrast, though. Further investigation has shown that other factors also seem to play a part in pigeons’ susceptibility to gambling-like behavior. In another experiment, published in Psychonomic Bulletin & Review in 2013, Pattison and Zentall found that enriching pigeons by allowing them to socialize in large, communal cages significantly lowered how often the birds took the risky option.

Socialization has been shown to reduce pigeons’ susceptibility to gambling. Illustration by Emily Maute

The experiment mirrored the earlier model, offering the pigeons a chance at a large payout or a smaller but steadier paycheck. The only difference lay in what some of the pigeons did with their free time. Normally, for health and safety reasons, the pigeons are housed independently and are not allowed to interact. But for this experiment, half of the group was cleared to roost together, getting four hours a day of socializing time with other pigeons. The effect was dramatic. Within the first ten trials, socialized birds chose the risky option 50 percent less often than birds that had been left alone.

Pattison said that previous research has shown that enrichment and socialization have a dramatic effect on how other animals perform complex tasks, but this was different. It effectively immunized the animals against the dramatic preference – at least for a time. Even the socialized birds eventually caved to the urge to gamble, their initial aversion slowly fading into a preference for the gambling option as the trials went on.

The results could say a lot about an unconscious basis for gambling behavior in humans, but these experiments can’t be called an exact replica of human gambling, said Zentall. He and his colleagues call it gambling-like behavior, mainly because they can’t recreate earning and spending money with animals, or any of the other cultural constructs that human gambling includes. Instead, they are asking the birds to make a statistical decision. “What we’re doing is, essentially, telling them they can have 3 pellets, much like earning money,” said Zentall. “Or offering them the chance at 10.”

The setup, however, is close enough to the human version to allow at least some theorizing about why people might gamble, says Zentall. “I think for all practical purposes, it’s the same.” He is most interested in whether there is a simpler reason why animals, and people, are so drawn to the jackpot.

Ben Hayden, an assistant professor at the University of Rochester who studies similar gambling-like behavior in rhesus monkeys, thinks the tendency might be genetic. He thinks there may have been, at some point in the past, an evolutionary advantage to going for the long shot. “Our economic biases are probably more reflective of something that shaped both human and animal evolution.”

Zentall isn’t sure of the cause of the effect, but believes that it can tell us something about humans. In nature, finding a large reward, in terms of food, usually means there’s a higher chance you’ll find more of the same around it. If the conditions are right for one plant to produce a lot of berries, then other plants in the same area are likely to do the same. “So I think the implication is that we think of gambling as an absence of appropriate morality,” said Zentall. “But I think there’s a general tendency, in animals and humans, to be attracted to large winnings, large rewards.”

Learning to Feel With Light

Baxter, equipped with a GelSight sensor, shows how it can hold a delicate item, such as an egg, without any worries of breaking it. Photo by Rui Li.

By Fink Densford
BU News Service

It can be a little hard to focus when you’re sharing a room with a robot that’s taller than you. Especially when it has enormous red arms, awkwardly jutting from a thin frame. Each thick arm on Baxter, the robot I’ve been sitting next to for half an hour, is tipped with a two-pronged plastic claw. The claw is an advanced version of the handheld kind used to pick up litter or pinch unsuspecting siblings.

The room I’m sharing with Baxter is a large glass cubicle in the Stata Center at the Massachusetts Institute of Technology. Outside the windowed cage are scattered desks, each piled with paperwork, food wrappers, cables and computer equipment. But the most interesting thing in the room is at the end of Baxter’s hand, which is currently gripping an egg.

The GelSight sensor is sensitive enough to hold a potato chip without cracking it. Photo by Rui Li.

What I had come to see was perched on the tip of the robot’s claw, gripping the egg. Pressed up against one side was a small, taped-up black box about the size of an iPhone power plug. The little black box does something special with the tacky-textured pad on its face – it “feels” with light.

In the room with Baxter and me is Rui Li, a PhD student at MIT and lead developer on the “GelSight” program, which aims to give robots a sense of touch through the use of cameras and lights.

Li slipped the egg from Baxter’s grip. A moment later, he placed a small empty jar – the kind in which you’d normally find maraschino cherries or olives – in the claw of the robot. Then he began to fill it with water.

Nothing noticeable happened, but that was the point. Tactile sensing is one of the most difficult things robots can do; normally, a robot would drop the jar as it gained weight, and most robots are instead programmed to work around the absence of touch – finely tuned to perform tasks with no need to feel. The computer monitors in front of Li, used to control and program the robot and its sensors, lit up with numbers showing that the claw was automatically adjusting its pressure as the jar got heavier, keeping it from slipping out of its grasp.

“Push your finger against it,” offered Li, gesturing to the small pad – the sensor in question. On the screen of his laptop, a green and black grid showed an image of anything pushed against the small sensor in Baxter’s claw. It did so by reflecting the light from four LEDs, positioned on each side of the gel pad, off the pad’s painted metallic surface. When something pressed against the outside of the gel pad, the inside deformed in the same way. A camera behind the pad constantly fed the image to a computer, which built a 3D image of whatever the pad was pressed against.
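The article doesn’t name the reconstruction method, but a standard technique for exactly this arrangement – one camera watching a surface lit from several known directions – is photometric stereo, in which per-pixel brightness under each light is enough to solve for the surface normal at that pixel. The sketch below is only an illustration of that general idea, not the GelSight team’s code; the light directions and the synthetic test image are invented for the example:

```python
import numpy as np

def estimate_normals(images, light_dirs):
    """images: (k, h, w) brightness under k lights; light_dirs: (k, 3) unit vectors."""
    k, h, w = images.shape
    I = images.reshape(k, -1)                            # one brightness column per pixel
    G, *_ = np.linalg.lstsq(light_dirs, I, rcond=None)   # least-squares solve of L @ g = I
    albedo = np.linalg.norm(G, axis=0)                   # |g| is the surface reflectance
    normals = G / np.maximum(albedo, 1e-8)               # g's direction is the surface normal
    return normals.T.reshape(h, w, 3), albedo.reshape(h, w)

# Tiny synthetic test: a flat patch (normal pointing straight at the camera)
# lit from four sides, the way the four LEDs surround the gel pad.
lights = np.array([[1, 0, 1], [-1, 0, 1], [0, 1, 1], [0, -1, 1]], dtype=float)
lights /= np.linalg.norm(lights, axis=1, keepdims=True)
flat = np.ones((4, 8, 8)) * lights[:, 2][:, None, None]  # brightness = n . l with n = (0, 0, 1)

normals, albedo = estimate_normals(flat, lights)
print(normals[0, 0])   # approximately [0, 0, 1]: the recovered normal of a flat surface
```

Integrating those per-pixel normals into a height map is what would turn the camera’s view into the kind of 3D fingerprint that appeared on Li’s screen.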

The GelSight Sensor is used to plug in a USB cord. Image by Melanie Gonick / MIT News Office

By creating a microscopically detailed 3D image of what it is holding, the GelSight sensor doesn’t have to rely on pressure readings or any other complex interpretation. It can literally see the object, watch how it’s moving and apply enough pressure to hold on tight.

In this case, the screen showed my fingerprint, its complex canyons and ridges displayed in bright detail, in 3D. Li also showed me more complex images that the sensor was able to pick up, including two human hairs that crisscrossed, with enough detail to tell which was on the bottom. Li seemed especially proud of the clear image of a “1” from a hundred dollar bill.

The device, Li explained, produces anywhere from a 100-by-100-pixel image to a 600-by-800-pixel image of the surface – at minimum, 10,000 actively monitored points on the sensor, and as many as 480,000. The closest commercially available non-visual tactile sensor, says Li, provides around 19.

Li has previously shown that the sensor, when attached to a robot, can allow it to pick up a free-hanging USB cord and plug it in – a task that comes with about a single millimeter of leeway between success and failure. Baxter, it seems, is about as sensitive as robots come.

And the sensor is able to hold more than just USB cords, eggs and jars. Li said that since the sensor produces high-resolution 3D geometry of the surfaces it touches, it could be used for any object – even a potato chip can be held between the plastic claws without cracking.

Robots have long struggled to feel the way humans do, and the sensor effectively gives them a new sense to take advantage of. The unit costs a mere fraction of what its competitors do, said Li, and is quickly moving toward commercial availability.

So while Baxter may be a menacing-looking piece of machinery, it’s the tip of its finger that holds the most potential. And with Li behind the device, its impressive list of abilities is bound to keep growing.

Autos Entering the Autonomous Era

By Fink Densford
BU News Service

A future of self-driving cars is becoming less of a fantasy and more of an inevitability.

Both Audi and Mercedes Benz brought self-driving cars to center stage at CES. Mercedes presented the F015 Luxury In Motion, an autonomous concept built specifically for CES, with a hopeful target of a 2030 release.

Photo Gallery: The Cars of CES

Audi modified an existing A7 and sent it on a journey from Palo Alto to Las Vegas, arriving in time for the opening of CES without a driver managing the journey.

While the Mercedes F015 concept was more a research project than a commercial product, Audi has stated that they will be releasing a semi-autonomous A8 by 2016.

The rush for driverless cars isn’t new, however – Google revealed its own version last year, and has continued development on a working prototype.

Google’s self-driving car. Image courtesy of Google.

But Google’s vision for a driverless car is vastly different from what Mercedes Benz envisions. The Google self-driving car is cute, its lights and grill giving it a pseudo-face, and the inside is limited to two seats, no controls and a top speed of 25 mph. The F015, by stark contrast, is a 17-foot-long, low-slung luxury sedan with a glossy silver exterior and arrays of lights that give it more of a UFO look than something you’d normally see on the street.

Dieter Zetsche, head of Mercedes Benz, referred to the F015 as a return to the age of the passenger, not the driver, and likened the car to a luxury carriage. And while Mercedes has stated that the autonomous systems will shift the focus away from drivers and onto passengers, the F015 still comes with a wheel and pedals, so manual control remains possible – something the Google self-driving car does not offer.

Mercedes Benz F015 Luxury In Motion concept car, revealed at CES 2015. Photo by Pankaj Khadka/BU News Service

At Audi, all attention was on “Jack,” the autonomous A7 that drove itself on a 550-mile journey from Northern California to Las Vegas. Audi claims the system, which it calls “Piloted Driving,” is production-ready and will appear in its new A8.

The A7 on display, dubbed “Prologue,” was summoned to the stage by an Audi-customized smartwatch. Other than its ability to lock and unlock the car, however, no further information was revealed about the watch.

The Audi “Prologue”, based on an existing A7, presented at CES 2015. Photo by Pankaj Khadka/BU News Service

Previous autonomous experiments at Audi have included an RS7 that drove itself around the Hockenheimring, a motor-racing circuit in Germany. During its self-piloted laps of the track, the car reached speeds of nearly 150 mph without a human behind the wheel.

Other car companies aren’t backing away from the idea either. At a press conference on Tuesday, Ford’s CTO Raj Nair said “there absolutely will be a Ford autonomous vehicle in the future,” but the company did not have any such vehicles on show at CES.

And features of autonomous driving are already making their way into cars. Lane departure control, adaptive cruise control and automatic parallel parking can be found in many models from a wealth of different manufacturers. While these features aren’t fully autonomous, they let the car’s sensors and radar take over when the human behind the wheel can’t, or doesn’t want to, keep up.

So far, only three states – Nevada, California and Florida – along with Washington, D.C., have passed laws allowing autonomous cars on the road. As the technology improves, more states are likely to join the trend, but the reality of taking a nap on the way to work is still at least a few years away.

Auto Makers Take Slight Detour on Future

By Fink Densford
BU News Service

According to both Toyota and Mercedes Benz, the future of the automotive industry is destined to shift dramatically in the coming years. How it will be changing, however, was not a point the two automakers could agree on.

At a press conference Monday, Toyota brought its Mirai – the soon-to-be first commercially available hydrogen fuel cell car – to discuss the future not of the auto, but of its fuel. And to help pave the way, Toyota will be offering more than 5,680 patents related to hydrogen fuel cell development.

The car itself isn’t new – Toyota presented a concept Mirai at CES 2014, and it is slated to be available for purchase in California later this year.

Hydrogen, said Bob Carter, a Toyota US senior vice president, will be the inevitable source of energy for the cars of the next 100 years. He reiterated Toyota’s hope that opening up the patents would accelerate adoption of the new technology among other automakers and speed the development of an infrastructure to support the new fuel.

Many comparisons were drawn between Toyota’s incredibly successful Prius line and the new Mirai. The Prius overcame years of doubts about whether the public was ready for a commercial hybrid after its worldwide debut in 2000, and Toyota seems hopeful that the Mirai will follow suit.

For Mercedes Benz, the future lies, much like their present, in luxury. Presenting an entirely new concept for CES, the F 015 Luxury in Motion, Mercedes Benz focused on how autonomous driving systems will affect not only how we drive, but how cars are designed and conceived.

The F 015 drove onto the stage at the Mercedes Benz conference to make the point, and featured a distinctly futuristic design. Arrays of lights on the front and back of the vehicle were used to communicate with pedestrians and others outside the car, even verbally acknowledging when it was safe to walk past.

The interior design harked back to the days of the carriage, said Mercedes, with a focus on the passenger rather than the driver. The four seats in the car swiveled to face each other, and each door in the F 015 was equipped with multiple interactive gesture displays that could be customized for each individual passenger.

Hydrogen was at the base of the car’s power system, a fact sure to bring a smile to Toyota, but the focus was more on the design and concept built around the idea of a driverless car.

Mercedes claimed that as autonomous driving systems become the norm, driving will be considered a relaxing getaway instead of an activity that demands attention – a shift that would change both the design and the concepts at the base of all autos in the coming years.

Autonomous cars seem to be taking center stage at CES 2015, with Audi sending an autonomous version of its A7 from Northern California to Las Vegas. Both BMW and Hyundai are planning demonstrations of their own autonomous vehicles as well.

Photo galleries of the events below.

Toyota

[tn3 origin="album" ids="1040" transitions="default"]

Mercedes Benz

[tn3 origin="album" ids="1046" transitions="default"]


BU News Service App Started as Class Project

The new BU News Service app is available for iPhone.

The Boston University News Service is ready to make the jump from your computer to your iPhone. Begun as a class project, the BU News Service app was developed by Randall Spence, a recent BU computer science grad. Spence began work on the application in a computer science course last spring, then spent nearly a month of his own time after the class ended finishing it and bringing it to the iTunes App Store.

“It was exhausting,” said Spence of the time it took to build the application and get it approved for the App Store. The design came from John Hughes, a personal friend of Spence’s and a recent graduate of the Rhode Island School of Design, who helped create a mock-up that the two then brought to life on mobile.

The application has gone through a few updates since its inception, said Spence, who continues to maintain and upgrade it as new bugs and issues arise. Spence, who will soon be returning to Boston for work, said the application helped him learn the ins and outs of development. His current projects include mobile games he’s developing with Hughes.

— Fink Densford