February 1, 2013
It’s the time of year when the National Football League gets a little bit smaller.
Sure, the Super Bowl on Sunday is its championship game and more than 100 million people will be watching, but if the outcome isn’t decided in the last two minutes, more people on Monday will be talking about the funniest TV commercials, or how Beyoncé sang–or didn’t–at halftime, or the post-game homage to the Baltimore Ravens’ Ray Lewis as he dances off into the sunset.
It’s been this way for a while now. As the spectacle of everything around it has become bigger, what actually happens on the field during the Super Bowl has gotten smaller. And that’s been okay with the league as long as it’s only happened once a year.
But now, with the rise of giant home video screens and the ability to see every scoring play of every game on the NFL’s RedZone network or watch games from different angles on a computer tablet, people running the league and its teams have realized that they need to pump up the stadium experience. What happens on the field, they fear, soon may no longer be enough to keep the customers satisfied.
Hitting the big, big screen
No question that the Dallas Cowboys ratcheted things up in 2009 when they opened, with much hoopla, the new Cowboys Stadium. Not only did it cost more than $1 billion, but hanging 90 feet above the field is an HDTV screen so large–it stretches from 20-yard line to 20-yard line–that players who are quite massive in real life look like little Lego men moving around below.
Next fall, the Houston Texans will one-up the Cowboys when they unveil their own field-dwarfing video screen, almost 25 percent larger than the one in Dallas. And now even colleges are starting to join the monster screen club. The University of Nevada, Las Vegas, hardly a football powerhouse, just released plans for a new stadium that will include a video screen 100 yards long.
That’s right, it will be as long as the playing field.
Stand up and cheer
Okay, so we can expect the screens to get bigger and bigger. But some think the stadiums may actually get smaller, or at least there will be fewer seats. Instead, more attention will be paid to where people can stand and what they can do while they’re there.
Here’s how Eric Grubman, the NFL’s executive vice president of business operations, described a football stadium of the future in a recent interview with the Los Angeles Times:
“What if a new stadium we built wasn’t 70,000, but it was 40,000 seats with 20,000 standing room? But the standing room was in a bar-type environment with three sides of screens, and one side where you see the field. Completely connected. And in those three sides of screens, you not only got every piece of NFL content, including replays, RedZone and analysis, but you got every other piece of news and sports content that you would like to have if you were at home.
“Now you have the game, the bar and social setting, and you have the content. What’s that ticket worth? What’s that environment feel like to a young person? Where do you want to be? Do you want to be in that seat, or do you want to be in that pavilion?”
Phoning it in
Other stadium innovations are heading in a different direction. Instead of having the game be only part of a multi-screen, sports bar party experience, they would entertain fans by allowing them to immerse themselves more deeply into the game itself. And they would do it all on smartphones and tablets.
Take the case of the New England Patriots. At the beginning of this past season, they became the first NFL team to deploy a free Wi-Fi network for streaming video in their home field, Gillette Stadium. Fans were able to use mobile apps to watch instant replays on their phones and get real time stats.
And next season, they’ll have more options, ones that take them into the games within the game. There will be apps that allow them to tune into cameras following star players around, apps that let them watch what goes on in their team’s locker room at halftime, and apps that let them listen in on players wearing microphones, eavesdropping on conversations between the coaches and the quarterback (with a 15-second delay, of course).
And there will be an app that, by the fourth quarter, could be the most valuable of all. It will tell them where to find the shortest bathroom lines.
Here are other recent advances in football tech:
- A red zone you don’t want to enter: Reebok has developed something it calls a Head Impact Indicator. It’s a thin skullcap lined with sensors that can detect dangerous hits to the head. If a yellow or red light goes on, it’s time for a player to head to the sidelines.
- Now if they could only do something about helmet hair: Meanwhile, engineers at Purdue University say they’ve developed the model for a football helmet that disperses the energy of a smack to the head instead of just protecting a player’s skull. They report that tests with a polymer-lined Army helmet they designed showed it could reduce the G-force a player’s brain absorbed by as much as 50 percent.
- Like we need another reason to boo the refs: You know that imaginary yellow line you see on TV games to show where the first down marker is? After this season, the NFL is going to take a look at technology that would project a laser line across the field so people in the stadium could see what everyone at home has been seeing for years.
- Hardbodies the easy way: When they run out on the field Sunday, four San Francisco 49ers players, including both of the team’s quarterbacks, will be wearing a form of customized body armor under their uniforms. It’s called EvoShield and it’s a gel that hardens to fit a player’s body when exposed to air.
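The Head Impact Indicator above boils down to sensor readings plus thresholds: a reading crosses a cutoff, a light changes color. Here’s a minimal sketch of that logic–the G-force cutoffs are invented for illustration, since Reebok hasn’t published the device’s actual calibration.

```python
def impact_status(g_force, yellow_threshold=60.0, red_threshold=90.0):
    """Classify a single head-impact reading, measured in G's.

    The thresholds here are hypothetical placeholders; the real
    device's calibration has not been made public.
    """
    if g_force >= red_threshold:
        return "red"     # pull the player off the field now
    if g_force >= yellow_threshold:
        return "yellow"  # time to head to the sidelines for a check
    return "green"       # within normal range

# A sequence of hits over a game, classified one by one.
hits = [12.5, 48.0, 72.3, 30.1]
print([impact_status(g) for g in hits])
```

The real engineering problem, of course, is in the sensing and calibration, not this comparison–but the alerting layer is this simple.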
Video bonus: Okay, here’s a sneak peek of two Super Bowl ads already being declared winners, a spot about how getting the keys to the family Audi jacks up the testosterone of a boy headed to his high school prom, and a Volkswagen ad using a Minnesotan-turned-Rastafarian to celebrate the power of German engineering.
More from Smithsonian.com
January 30, 2013
Admittedly, it’s a little hard to imagine feuding smell scientists, but research published earlier this week has those who study the sense of smell taking sides.
It comes down to how our noses detect odors. The long-standing explanation is that our noses have receptors that respond based on the shapes of odor molecules. Different molecules fit together with different receptors, the thinking goes, and when a match is made, the receptor tips off the brain that our nose has picked up a whiff of coffee or perhaps a very different smell emanating from the bottom of our shoe.
But a conflicting and more exotic theory received a boost in the new study by researchers in Greece. It holds that we can also sense smells through quantum physics, in this case the vibration of odor molecules. As Mark Anderson posits at Scientific American, “Does the nose, in other words, read off the chemical makeup of a mystery odorant—say, a waft of perfume or the aroma of wilted lettuce—by ‘ringing’ it like a bell?”
I know what you’re thinking: What difference does this make as long as I can still smell bacon?
Sniffing out trouble
But actually it does matter, because the more we understand the process of smelling, the more effective we can be at recreating it in machines. In fact, just last month IBM, in its annual “5 in 5” forecast–a list of technologies it believes will hit the mainstream in five years–focused exclusively on the development of the five human senses in machines.
To mimic smelling, tiny sensors would be integrated into smartphones or other mobile devices and, much as a breathalyzer determines alcohol levels, they would gather data from the smell of your breath by detecting chemicals that humans wouldn’t perceive, then send it to a computer in your doctor’s office. The thinking is that eventually this would be a core component of home health care–the ability to “smell” diseases remotely, such as liver or kidney ailments, asthma or diabetes.
Or on a more basic level, as IBM’s Hendrik Hamann put it: “Your phone might know you have a cold before you do.”
IBM is also working with health care organizations to equip patient and operating rooms with sensors that can help address one of the biggest problems hospitals face today–how do you keep them hygienic? Hundreds of sensors will basically sniff for cleanliness, identifying the chemical compounds that create odors, some of which are undetectable by humans. The staff can say they cleaned a room; the sensors will know if and when they did.
Every breath you take
The smell tests might even detect cancer. Last fall, in a study in the Journal of Thoracic Oncology, researchers from Israel and Colorado reported that breath analysis could distinguish between benign and malignant lung tumors with 88 percent accuracy. Plus, the breath test could determine the specific type and stage of the lung cancers.
And at the Cleveland Clinic, Dr. Peter Mazzone, director of the lung cancer program, is testing a sensor array that changes color when a patient’s breath passes over it. In a study of 229 patients, the test, using a machine developed by the California firm Metabolomx, was able to distinguish those with lung cancer with more than 80 percent accuracy.
Meanwhile, Mazzone and his team are collecting as many breath samples as possible from patients, both with and without lung cancer. The goal is to match breath patterns with physical conditions. “My vision,” Mazzone told the Wall Street Journal, “is being able to say, ‘This is a 60-year old with emphysema who smoked for 30 years—what’s the chance of there being cancer there?’ But we have to teach the device what it looks like first.”
Or, perhaps more accurately, what it smells like.
Here are other recent discoveries scientists have made about smell:
- Me, my smell and I: Research in Germany concluded that not only can we identify our own body odor, but that we prefer it. For the study, women were asked to select which of their armpit odors they liked more. They showed a clear preference for the one perfumed with a solution that included elements of their own scent.
- Can robots wear Axe?: The U.S. Navy is looking to use scent-sniffing robots to move 1,000-pound bombs on ships. The idea is that a human would control the lead robot and it would dispense the equivalent of a robot pheromone that a swarm of other robots would follow like army ants.
- I love the smell of gridlock in the morning: When people are anxious, their sense of smell becomes more acute, according to a recent study at the University of Wisconsin-Madison.
- Why your dog can sniff out a chicken leg from a block away: And from the University of Chicago comes research finding that animals are able to focus their sense of smell much as humans can focus their eyes. Through their finely-honed sniffing techniques, they apparently can bring scents to receptors in different parts of the nose.
- There’s the rub: And finally, a study in the U.K. has found that thanks to a genetic variation, two percent of the population never has underarm body odor. Yet more than three-quarters of them still use deodorant because, well, that’s what people do.
Video bonus: Stuart Firestein, chairman of the biology department at Columbia University, tells you all you want to know about how our nose does its job.
Video bonus bonus: A Chinese airline that checks out the armpit odors of people interviewing to be pilots.
January 11, 2013
Since the beginning of mankind, we’ve wanted our kids to get smarter. Since the beginning of the 21st century, we’ve wanted our phones to get smarter.
So when are we going to start wanting our TVs to get smarter? Or will we always be content with them being dumb, as long as they’re big and dumb? Okay, maybe not dumb, but most of us don’t yet feel a compelling need to have our TVs think like computers, as long as the picture looks pretty up there on the wall.
Which always makes things interesting at the Great Gadgetpalooza, also known as the Consumer Electronics Show (CES). For the past several years, the big electronics companies that focus on hardware, such as Samsung and Panasonic, and the big tech companies that focus on software, such as Google, have been rolling out nifty products at the annual Las Vegas spectacle with the promise that this is the year that Smart TV goes mainstream.
Boob tube no more
And so it’s been at this year’s version of CES, which ends today. Samsung has done its part to convince us that the time has come for us to love TVs for their brains by unveiling what it calls its S-Recommendation engine.
It’s software that, as Samsung puts it, not only understands what you like, but recommends things it thinks you’ll like. (Sure, Amazon’s been doing this for years, but this is your big, dumb TV we’re talking about.) And it doesn’t just suggest TV shows, but could throw in streaming program options from the Web, or even video you’ve shot on your smartphone.
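Samsung hasn’t published how the S-Recommendation engine actually works, but the core idea–scoring unwatched shows by how well they match what you’ve already watched–can be sketched in a few lines. Everything here (the shows, the genre tags, the scoring) is illustrative, not Samsung’s algorithm.

```python
from collections import Counter

# Hypothetical viewing history and catalog, each show tagged with genres.
history = {
    "Downton Abbey": {"drama", "period"},
    "Sherlock": {"drama", "mystery"},
}
catalog = {
    "The Hour": {"drama", "period"},
    "Top Gear": {"cars", "comedy"},
    "Luther": {"drama", "mystery"},
}

def recommend(history, catalog):
    # Build a simple taste profile: how often each genre shows up
    # in the viewing history.
    profile = Counter(tag for tags in history.values() for tag in tags)
    # Score each unwatched show by summing the weights of its matching tags.
    scores = {
        show: sum(profile[tag] for tag in tags)
        for show, tags in catalog.items()
        if show not in history
    }
    # Highest-scoring shows first.
    return sorted(scores, key=scores.get, reverse=True)

print(recommend(history, catalog))
```

A production system would fold in collaborative signals from other viewers, what’s airing now, and so on–but the “understands what you like” part is, at bottom, a profile being matched against a catalog.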
The goal ultimately is to get you to do all those things you’re now doing on your smartphone or your tablet–say, watch Hulu or Skype with a family member or check out your Facebook page–on your TV instead. To encourage that behavior, Samsung has revamped its Smart Hub so you can flip through all of your entertainment options in five different index screens–one that tells you what’s on regular old TV now or soon, another that lists movies and on-demand TV, a third that pulls in photos or music or video stored on any other devices around the house, a fourth where you can Skype or pull up Facebook and a fifth that provides access to any apps you’ve downloaded.
And neither of the above requires pushing a lot of buttons on a remote. The S-Recommendation engine responds to voice commands and the Smart Hub is designed to be controlled with hand gestures.
For its part, Panasonic has rolled out a feature it calls My Home Screen, which allows each member of your family to create his or her own homepage on the TV, where easy access is provided to their favorite digital content, streaming video and apps. Some of the company’s Viera models actually come with their own cameras that tell the TV who turned it on. And as a smart TV should, it dutifully brings up that person’s home screen.
Plus, Panasonic unveiled “Swipe and Share 2.0”, which lets users move photos from a tablet or phone to a big TV screen, where they can then be edited with a touch pen.
But can you love a TV?
So that seals it, right? This must be the year when TVs take back center stage, especially now that they’re finally learning to care about our needs, right?
Maybe not. We’ve built some pretty strong personal connections to our cell phones and tablets. And a lot of people think it’s going to take a while for us to develop that kind of bond with a TV, no matter how smart it is.
As Greg Stuart, CEO of the Mobile Marketing Association, told Ad Age earlier this week: “People don’t have that kind of interactive relationship with their TV. The TV on the wall is a family device. It’s a multi-user device. If I want to share something, it’s going to be with a personal device, and that’s going to be my tablet or my mobile.”
TV or Not TV?
Here are other recent TV innovations:
- Robert, 6th Earl of Grantham, meet Tony Soprano: One day, thanks to Samsung, two people will be able to watch full-screen versions of Downton Abbey and Sopranos reruns at the same time. By adapting 3D technology, the company has created a TV that can display a different and full resolution image to each viewer depending on whether they’re sitting to the left or the right of the screen. Of course, both people would have to wear special glasses that come with headphones so you can hear only the sound for your show, but is that such a big price to pay for domestic peace?
- Read my lips. No more Gangnam Style: LG, the other South Korean TV giant, has upgraded its “Magic Remote” so that it now responds to natural language. You say the name of a show or even something like “videos with Gangnam-style dancing,” and your choice pops up on the screen.
- I got my MoVo workin’: Also at CES, the Chinese TV manufacturer TCL showed off an HD TV called MoVo that uses facial recognition software to identify who’s watching and then make programming suggestions customized for that person.
- Okay, who blinked?: Meanwhile, Haier, another Chinese company, has developed a technology it calls Eye Control TV where, yes, you can change channels by moving your eyes.
- Ah, to be 65 and only see ads for meds: It was only a matter of time. A company called Gracenote will soon begin trials on a technology that, based on your viewing habits and personal data, will personalize the TV ads you see. Isn’t that special?
Video bonus: You didn’t make it to the big electronics show this year? Not to worry. Here’s the Samsung demo of its S-Recommendation engine. Remember, people tend to gush a lot at CES.
December 14, 2012
There are times when I wonder why so many scientists are spending so much time trying to recreate something as fickle and full of fogginess as the human brain.
But who am I kidding? Those dyspeptic moments inevitably pass, as anyone who’s been following this blog knows. Every few months, it seems, I’m back writing about the latest attempt to build machines that can learn to recognize objects or even develop cognitive skills.
And now there’s Spaun.
Staying on task
Its full name is the Semantic Pointer Architecture Unified Network, but Spaun sounds way more epic. It’s the latest version of a techno brain, the creation of a Canadian research team at the University of Waterloo.
So what makes Spaun different from a mind-bogglingly smart artificial brain like IBM’s Watson? Put simply, Watson is designed to work like a supremely powerful search engine, digging through an enormous amount of data at breakneck speed and using complex algorithms to derive an answer. It doesn’t really care about how the process works; it’s mainly about mastering information retrieval.
But Spaun tries to actually mimic the human brain’s behavior and does so by performing a series of tasks, all different from each other. It’s a computer model that can not only recognize numbers with its virtual eye and remember them, but also manipulate a robotic arm to write them down.
Spaun’s “brain” is divided into two parts, loosely based on our cerebral cortex and basal ganglia, and its simulated 2.5 million neurons–our brains have 100 billion–are designed to mimic how researchers think those two parts of the brain interact.
Say, for instance, that its “eye” sees a series of numbers. The artificial neurons take that visual data and route it into the cortex where Spaun uses it to perform a number of different tasks, such as counting, copying the figures, or solving number puzzles.
Soon it will be forgetting birthdays
But there’s been an interesting twist to Spaun’s behavior. As Francie Diep wrote in Tech News Daily, it became more human than its creators expected.
Ask it a question and it doesn’t answer immediately. No, it pauses slightly, about as long as a human might. And if you give Spaun a long list of numbers to remember, it has an easier time recalling the ones it received first and last, but struggles a bit to remember the ones in the middle–a pattern psychologists call the serial-position effect.
“There are some fairly subtle details of human behavior that the model does capture,” says Chris Eliasmith, Spaun’s chief inventor. “It’s definitely not on the same scale. But it gives a flavor of a lot of different things brains can do.”
The fact that Spaun can move from one task to another brings us one step closer to being able to understand how our brains are able to shift so effortlessly from reading a note to memorizing a phone number to telling our hand to open a door.
And that could help scientists equip robots with the ability to be more flexible thinkers, to adjust on the fly. Also, because Spaun operates more like a human brain, researchers could use it to run health experiments that they couldn’t do on humans.
Recently, for instance, Eliasmith ran a test in which he killed off the neurons in a brain model at the same rate that neurons die in people as they age. He wanted to see how the loss of neurons affected the model’s performance on an intelligence test.
One thing Eliasmith hasn’t been able to do is to get Spaun to recognize if it’s doing a good or a bad job. He’s working on it.
Here are a few other recent developments in brain research and artificial intelligence:
- I can’t get this song out of your head: Scientists in Berlin wired guitarists playing a duet with electrodes and found that when they had to closely coordinate their playing, their brain activity became synchronized. But when they weren’t coordinated, when one was leading and the other following, their brain activity was distinctly different.
- One day the brain may actually understand itself: A team of MIT neuroscientists has developed a way to monitor how brain cells coordinate with each other to control specific behaviors, such as telling the body to move. Not only could this help them map brain circuits to see how tasks are carried out, but it also may provide insight into how psychiatric diseases develop.
- Deep thinking is so yesterday: The top prize in a recent competition sponsored by pharmaceutical giant Merck went to a team of researchers from the University of Toronto who used a form of artificial intelligence known as deep learning to help discover molecules that could become new drugs.
- So robots will learn how to stare at smartphones?: To teach robots how to function in social situations, scientists at Carnegie Mellon University are tracking groups of people with head-mounted cameras to see when and where their eyes converge in social settings.
- Unfortunately, they keep trying to hide nuts: By using the deceptive behavior of birds and squirrels as a model, researchers at Georgia Tech have been able to develop robots that can trick each other.
Video bonus: Check out a demo of Spaun in action.
December 4, 2012
It was a moment that would have brought a smile–a sardonic one, of course–to the face of Bones McCoy.
Last week, the California-based firm Scanadu announced that by the end of next year, it will begin selling a device called Scout. The little gadget, which fits in the palm of your hand, will, in conjunction with your smartphone, be able to tell you your temperature, blood pressure, heart rate, breathing rate and the level of oxygen in your blood–all within 10 to 15 seconds.
In other words, it will be the closest thing we’ll have to that bulky but nifty tricorder that McCoy wielded so deftly as chief medical officer on the Starship Enterprise back in the glory days of Star Trek. Which is the point, because Scanadu is one of the competitors for the $10 million award in Qualcomm’s Tricorder X Prize.
Scanadu is already comparing Scout to the family thermometer of the 19th century, an invention that gave people the opportunity to gather health data at home. They may be right about that.
Most doctors would certainly agree that this is a good thing, in that it will make it ridiculously easy for people to check their vitals every day. In theory it would, like the thermometer, let people know if they have a health problem without attempting to explain what it might be.
But then there’s this tagline on the Scanadu website: “Sending your smartphone to med school.” Sure, it’s meant as a clever, pithy pitch. But it also raises a notion that makes a lot of people in the medical community very uneasy about where this boom in health and medical apps is headed.
When does gathering data slide into making diagnoses or even promising cures? And if it does, who’s going to ensure that any of this is based on real science?
Apparently, a lot of what’s out there now isn’t. Last month, the New England Center for Investigative Reporting released the results of its analysis of 1,500 health mobile apps that cost money. It’s not a pretty picture.
The reporters found that more than 20 percent of the apps they reviewed claim to treat or cure medical problems. Of those 331 therapeutic apps, nearly 43 percent relied on cellphone sound for treatments. Others promised results using a cellphone’s light and a few pitched the power of phone vibrations. Scientists told the journalists that none of the above could possibly treat the conditions in question.
There’s no longer an app for that
The Food and Drug Administration (FDA) is expected to soon announce how it plans to regulate medical apps. It’s not likely to worry about the thousands of health apps that allow people to track their workouts or their daily calorie counts or how they slept. But it will look closely at apps that are promoted as a way to diagnose or treat a disease or condition.
By its latest count, there are now almost 18,000 health and fitness apps and more than 14,500 medical apps. As cautious as the feds have been about getting into the business of regulating software, they haven’t been able to ignore a few of the more egregious examples of mobile app magical thinking.
Last year the Federal Trade Commission banned the sale of two apps that promised to cure acne.
And that’s why they call it a smartphone
Here are other recent examples of mobile tech transforming the field of medicine:
- Is it the blue pill or the red pill?: Microsoft has jumped into the medical apps business by joining with NextGen Healthcare to develop, for Windows 8, an app called NextGen MedicineCabinet. It will allow people to create and store a detailed digital record of their prescription medications and be able to share it with doctors and hospitals when necessary. It also will let health care providers identify potentially harmful drug interactions.
- Will it tell you if you’re watching "Cops" too much? California startup Lark Technologies has launched a product it calls larklife–wristbands with sensors that work with an iPhone to track your daytime activities–calories burned, distance traveled, steps taken, food eaten–and your nighttime activity–how you slept. Then it provides you with tips during the day based on what your data says. For instance, if you don’t sleep as much as usual, it might point out that it’s a good idea to eat breakfast. Or it might congratulate you for a big fitness accomplishment, such as walking 1,000 steps in one day.
- Because it’s so hard to show surgery on stick people: A company called Visible Health has created a product called DrawMD, a series of free iPad apps that allow surgeons to explain surgical processes to their patients. Instead of scratching out a crude pencil sketch on a notepad, doctors can use digital anatomical images in the apps, which they can sketch or type on to illustrate a medical procedure.
- Is there a doctor in the house? HealthTap, with a large searchable doctor directory–complete with ratings, peer-reviews, and the ability to book appointments–plus a popular health Q&A feature, has been a player in the medical apps world for a while. And last week it got even bigger, buying Avvo Health, another medical Q&A service with a network of physicians. That expands HealthTap’s Medical Expert Network to more than 30,000 American doctors and dentists.
- But does it send an alert when he needs a massage? It’s about time. Last week Japanese tech giant Fujitsu announced the launch of Wandant, a device that attaches to a dog’s collar and keeps track of how many steps it takes during a day. It also measures the dog’s temperature and comes with an online diary where owners can record what their furry overlord has eaten, what it weighs and the condition of its stool.
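The drug-interaction feature in the NextGen MedicineCabinet item above is, at its core, a pairwise lookup: check every pair of medications in a person’s list against a table of known interactions. Here’s a bare-bones sketch–the two interactions shown are well-documented ones, but a real app would query a curated clinical database, not a hard-coded dictionary.

```python
from itertools import combinations

# Hypothetical interaction table for illustration. A real medicine-cabinet
# app would rely on a maintained clinical drug-interaction database.
INTERACTIONS = {
    frozenset({"warfarin", "aspirin"}): "increased bleeding risk",
    frozenset({"simvastatin", "clarithromycin"}): "increased risk of muscle toxicity",
}

def check_interactions(medications):
    """Return a (drug, drug, warning) tuple for each interacting pair."""
    warnings = []
    for a, b in combinations(medications, 2):
        risk = INTERACTIONS.get(frozenset({a, b}))
        if risk:
            warnings.append((a, b, risk))
    return warnings

print(check_interactions(["warfarin", "lisinopril", "aspirin"]))
```

Using an unordered `frozenset` as the lookup key means the pair matches no matter which order the drugs appear in the patient’s list.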
Video bonus: Yes, there are a lot of fitness videos out there, but few make running as much fun as Zombies, Run! Hear from the diabolical minds who created it.