July 11, 2013
“There is no doubt that over time, people are going to rely less and less on passwords. People use the same password on different systems, they write them down and they really don’t meet the challenge for anything you want to secure.”
None other than Bill Gates said this. Back in 2004.
People in the business of keeping data secure will tell you that passwords should have gone the way of dial-up Internet by now. Sure, back in the day, when we only needed them for two or three websites and hackers weren’t nearly so diabolical, we could get away with using the same “123456” password for everything, without worrying that someone on the other side of the world was a click away from emptying our bank accounts.
Ah, sweet innocence. Now, we have an average of 24 different online accounts, for which we use at least six different passwords. And we need them for tablets and smartphones, too. If we’ve heeded the security gods—although most of us haven’t—we’ve abandoned the memorably quaint for strange, long combos of numbers, letters—capital and lowercase—and symbols that defy memorization. (Then again, most of us don’t seem to have a knack for this password thing, considering that year after year, the world’s most popular password is still the word “password.”)
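Why do those character-salad passwords matter? The usual yardstick is entropy: every extra character, drawn from a bigger pool of possibilities, multiplies the number of guesses an attacker has to make. Here is a minimal sketch of that back-of-the-envelope math (the sample passwords are invented, and the formula ignores dictionary patterns, so treat it as an upper bound):

```python
import math

def entropy_bits(password: str) -> float:
    """Brute-force entropy estimate: length * log2(character-pool size)."""
    pool = 0
    if any(c.islower() for c in password):
        pool += 26  # lowercase letters
    if any(c.isupper() for c in password):
        pool += 26  # capital letters
    if any(c.isdigit() for c in password):
        pool += 10  # digits
    if any(not c.isalnum() for c in password):
        pool += 32  # rough count of printable symbols
    return len(password) * math.log2(pool) if pool else 0.0

print(round(entropy_bits("123456"), 1))        # 19.9 bits: crackable in a blink
print(round(entropy_bits("kT7#vQ9!mZ2p"), 1))  # 78.7 bits: far beyond casual guessing
```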
Not that conjuring up the perfect password guarantees immunity from code crackers. Just last week the giant game company Ubisoft admitted that its database had been breached and advised those with Ubisoft accounts to change their passwords immediately. Last summer’s big cybersecurity caper was a hack of LinkedIn, in which more than 6 million encrypted passwords were exposed.
It’s time, it would seem, for a better idea.
So, who figures to make the first big splash in the post-password world? Right now, a lot of the betting is on Apple, with speculation that the killer feature of the iPhone 5S coming out later this year will be a fingerprint scanner, perhaps embedded under the home button. Some Apple watchers think the iWatch, also expected on the market by the end of 2013, will likewise come with scanner capabilities that would allow the device to verify the user’s identity. Apple tipped its hand last year when it paid $356 million for AuthenTec, a company that develops fingerprint scanners.
Other big names pushing for the password’s demise are Google and PayPal, two of the key players in an industry group known as the FIDO (Fast IDentity Online) Alliance. FIDO isn’t boosting any particular approach to identity recognition; mainly it plans to set industry standards. But it is promoting what’s known as two-step verification as a move in the right direction.
This is when you’d be identified by a combination of “something you know”—such as a password—with “something you have”—such as a token that plugs into your device’s USB port—or “something you are”—such as your fingerprint. This combo of a password and a device you carry around with you—Google security experts have suggested a log-in finger ring—would be a lot safer than a simple password, and would let you use an easy-to-remember password, since the account can’t be hacked without your ring or your fingerprint.
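One widely deployed version of “something you have” is the time-based one-time password, or TOTP, the six-digit codes that authenticator apps generate. Here is a minimal sketch of that computation (per RFC 6238), assuming a secret has already been enrolled with the server; the secret below is just a placeholder, and FIDO’s standards are not limited to this particular scheme:

```python
import base64, hashlib, hmac, struct, time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Time-based one-time password (RFC 6238): proof of 'something you have'."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // interval            # current 30-second time step
    msg = struct.pack(">Q", counter)                  # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

shared_secret = "JBSWY3DPEHPK3PXP"  # placeholder; real secrets come from enrollment
print(totp(shared_secret))          # e.g. "492039", good for about 30 seconds
```

The server runs the same computation against its own copy of the secret, so a stolen password alone isn’t enough to get in.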
And once fingerprint sensors or face and voice recognition software become more common, it will be that much easier for passwords to simply fade away.
That feels inevitable to Michael Barrett, chief information-security officer of PayPal and president of FIDO. “Consumers want something that’s easy to use and secure,” he says. “Passwords are neither.”
A fingerprint scanner on your phone is only the beginning. There are a number of other inventive, and yes, even bizarre ideas for replacing passwords. Among them:
- Coming soon to a stomach near you: Let’s start strange. At a conference in late May, Regina Dugan, head of advanced research at Motorola, suggested that one day you’ll be able to take a pill every day that would verify your identity to all of your devices. The pill would have a tiny chip inside, and when you swallow it, the acids in your stomach would power it up. That creates a signal in your body, which, in essence, becomes the password. You could touch your phone or your laptop and be “authenticated in.” No, it’s not happening any day now, but the FDA has already approved its precursor—a pill that can send information to your doctor from inside your body. In other words, it’s a lot more plausible than it sounds.
- So, how about a tattoo that spells “password”: But that’s not all Dugan projected for the future. She also showed off an electronic tattoo. Motorola, now owned by Google, is working with a company named MC10, which has developed this “stretchable” tattoo with its own antenna and sensors embedded in it. It’s so thin, it can flex with your skin. And it would serve as your password, communicating with your devices and verifying that you are who you say you are.
- Now, what are all these keys for?: Back to the present. A Canadian company called PasswordBox is now offering a free app that remembers and automatically enters all your passwords across all your platforms. It signs you into websites, logs into apps, and enables you to securely share your digital keys with friends and loved ones—all through an app for your smartphone and a Chrome browser extension for your desktop. Its pitch is one-click login everywhere; a sketch of the vault pattern behind managers like this follows this list.
- Would my heart lie?: Another Canadian company called Bionym is building its business around the fact that heartbeats, like fingerprints, are unique. Its approach is to turn your heartbeat into a biometric pass code that’s embedded in a wristband which, in turn, uses Bluetooth to let your machines know you’re the real deal.
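As promised above, here is the vault pattern in miniature: derive an encryption key from one master password, then keep every site password encrypted under it. This is a sketch of the general approach password managers take, not PasswordBox’s actual design; the Vault class is hypothetical and leans on the open-source cryptography package:

```python
# pip install cryptography
import base64, os
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

def derive_key(master_password: str, salt: bytes) -> bytes:
    """Stretch one memorable master password into a 32-byte encryption key."""
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt,
                     iterations=480_000)
    return base64.urlsafe_b64encode(kdf.derive(master_password.encode()))

class Vault:
    """Hypothetical manager: one master password unlocks every site password."""

    def __init__(self, master_password: str):
        self.salt = os.urandom(16)  # random salt, stored alongside the vault
        self.fernet = Fernet(derive_key(master_password, self.salt))
        self.entries = {}  # site -> encrypted password

    def store(self, site: str, password: str) -> None:
        self.entries[site] = self.fernet.encrypt(password.encode())

    def fetch(self, site: str) -> str:
        return self.fernet.decrypt(self.entries[site]).decode()

vault = Vault("one-strong-master-passphrase")
vault.store("example.com", "kT7#vQ9!mZ2p")
print(vault.fetch("example.com"))  # the only secret you memorize is the master
```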
Video bonus: Let’s go back to the future with John Chuang, a researcher at the UC Berkeley School of Information. He’s working on the idea of allowing people to verify their identities through their brain waves. Okay, at least hear him out.
Video bonus bonus: The Internet Password Minder is a stroke of…something. Even Ellen DeGeneres was impressed, in a funny way.
May 10, 2013
To be honest, I’ve never associated motherhood with science. I assume this has everything to do with the fact that I’m one of eight kids, and while I’m sure we were a study in chaos theory, my mother didn’t have much time to nail the concept and work it into bedtime stories.
That said, moms remain a subject of scientific inquiry because, no matter how constant they may seem to us, they’re always changing to keep up with the times.
Here then are 10 recent studies or surveys that give a bit more insight into the institution of 21st century moms.
1) Have I got a story for you: According to a study published recently in the journal Sex Roles, moms are better than dads at telling stories and reminiscing with their kids, and that helps children develop their emotional skills. The researchers observed that moms tended to include more emotional terms in their stories and were more likely to then explain them to their children.
2) But how many of the answers were “Because I said so”: A survey of 1,000 moms in the United Kingdom found that the typical mother answers up to 300 questions a day from her kids. Four-year-old girls are the most inquisitive, averaging a fresh question about every two minutes. The most questions are asked during meals–an average of 11–followed by shopping trips–10 questions–and bedtime–nine questions.
3) That magic touch: The skin-to-skin touch of a mother can make a big difference in helping preemies or other at-risk babies deal with the pain and stress of injections. Researchers determined that the touch of a father or an unrelated woman can also help lower the stress of an at-risk baby, but neither had quite the soothing effect of physical contact with the child’s mother.
4) Even mom spit is special: A recent article in the journal Pediatrics recommended that mothers clean off their child’s pacifier by putting it in their own mouths. That’s right. What the researchers found is that infants whose mothers sucked on their pacifiers to clean them developed fewer allergies than children whose mothers rinsed or boiled the pacifiers. The children of moms who gave pacifiers a mouth rinse also had lower rates of eczema, fewer signs of asthma and smaller amounts of a type of white blood cell that rises in response to allergies and other disorders. The findings are in line with the growing evidence that some exposure to germs at a young age can be good for kids.
5) Heigh-ho, heigh-ho, it’s off to work I go: About 40 percent of working mothers in the U.S. now say the ideal situation for them would be to work full time. That’s according to the latest research on the matter from the Pew Research Center, and it’s almost double the share who felt that way in 2007, when 21 percent of the women surveyed said that would be their preference. The researchers speculated that this is probably a reflection of tough economic times. But working part time is still the top choice among working women, although the percentage who said that would be the best situation for them dropped from 60 percent in 2007 to 50 percent in the most recent survey.
6) Don’t do what I do: Just as moms generally can do more good for their kids than dads, they also apparently can do more harm. A 34-year study by the British think tank Demos found that the alcohol drinking habits of mothers can have the greatest impact on how their children consume alcohol. While at age 16, a child’s drinking behavior was greatly influenced by peers, the researchers found that that changed as children reached maturity. Then, the scientists more often discovered clear connections between alcohol consumption–particularly binge drinking–and childhood memories of how their mothers would drink.
7) Crouching tiger, failing children: So much for the power of Tiger Moms, the stereotypical demanding Asian mother depicted in the much-debated Battle Hymn of the Tiger Mother in 2011. A University of Texas professor named Su Yeong Kim, who had been following more than 300 Asian-American families for a decade, recently published her findings. What she observed didn’t quite match the stereotype. Children of parents whom Kim classified as “tiger” had lower academic achievement–and more psychological problems–than the kids of parents characterized as “supportive” or “easygoing.”
8) Even in utero we know to take a vowel: According to a joint study of newborns in Washington State and in Stockholm, babies start learning language from their moms even before they leave the womb. The scientists said their research showed that the infants began locking on to the vowel sounds of their mothers before they were born. How did they know that? They studied 40 infants, all about 30 hours old, and they found that the babies–who were played vowel sounds in foreign languages and the language of their mothers–consistently sucked longer on pacifiers when they heard sounds different from the ones they had heard in utero.
9) Sure, but you’d know nothing about Legos without us: Judging by a bit of research done in Finland, boys, at least in times past, could take almost nine months off a mother’s life, compared to girls. The Finnish scientists analyzed the post-childbirth survival rates of 11,166 mothers and 6,360 fathers in pre-industrial Finland, between the 17th and 20th centuries. And they found that a mother who bore six sons would live on average another 32.4 years after the youngest son’s birth, while a mother who gave birth to girls would live approximately 33.1 years after her youngest daughter came along. The shorter life expectancy was the same regardless of the mom’s social or financial status. The researchers surmised that not only was bearing boys more physically demanding for the mothers, but also that daughters were more likely to prolong their mothers’ lives by helping with household responsibilities.
10) Putting it in words: And finally…this probably shouldn’t come as a big surprise, but a study just published in the journal Proceedings of the National Academy of Sciences suggests that cavemen didn’t just grunt, but actually had a decent little vocabulary that included the equivalent of words for ‘thou,’ ‘you,’ ‘we,’ ‘bark,’ ‘fire,’ ‘spit’ and yes, ‘mother.’
April 19, 2013
Question: What’s needed to raise the quality of school teachers in America?
Answer: A bar exam?
So say the head of the country’s most powerful teachers’ union, the governor of New York and the U.S. secretary of education, among others. Their contention is that the only way teachers can truly elevate their profession–and with it the level of public education–is if they follow the lead of doctors, lawyers and engineers and are required to pass a test to prove mastery of their subject matter and how to teach it.
Randi Weingarten, president of the American Federation of Teachers (AFT), first floated the idea last summer at the Aspen Ideas Festival when asked what more could be done in training teachers. Then, late last year, her union put out a report, titled “Raising the Bar,” that pushed the idea further, calling for “a rigorous entry bar for beginning teachers.”
The debate has raged on ever since.
Joining those singing the praises of a tough teacher assessment is Joel Klein, the former chancellor of New York City’s Department of Education. Writing on The Atlantic website, he pointed out that pretty much anyone who graduates from college in America today can become a teacher, and that “job security, not teacher excellence, defines the workforce culture.” He also quoted a sobering statistic from McKinsey: The U.S. gets nearly half of its teachers from the bottom third of its college classes.
And just last weekend, in the New York Times, Jal Mehta, an associate professor at the Harvard Graduate School of Education, wrote that compared to many other fields where quality is maintained by building a body of knowledge and training people in that knowledge, “American education is a failed profession.”
“We let doctors operate, pilots fly and engineers build because their fields have developed effective ways of certifying that they can do these things. Teaching, on the whole, lacks this specialized knowledge base; teachers teach based mostly on what they have picked up from experience and from their colleagues.”
So what exactly do the proponents have in mind? For starters, they think any exam would need to focus both on the prospective teacher’s subject and on teaching more generally, particularly the social and emotional aspects of learning. While states would be able to adapt the guidelines, the intent would be to set national certification standards. And, above all, the process would need to be “rigorous.” They say “rigorous” a lot.
AFT’s proposal also recommends that American universities get much more selective in accepting students into education programs, requiring a minimum 3.0 grade point average plus a score in the top third on college entrance exams. The goal, ultimately, is to make teaching a skill to be mastered, and one that requires serious preparation. Said Weingarten: “It’s time to do away with a common rite of passage into the teaching profession—whereby newly minted teachers are tossed the keys to their classrooms, expected to figure things out, and left to see if they and their students sink or swim.”
Of course, not everyone thinks this is such a good idea. Some critics have suggested that it’s a ploy by the teachers’ union to sound high-minded, while actually aiming to protect its current members–who likely wouldn’t have to take the exam–and to justify a sizable bump in salary. Or that it’s really a swipe at programs like Teach for America, which offers a different route to becoming a teacher.
Still others think that focusing so much on a test score doesn’t make sense for a profession so dependent on interpersonal and motivational skills. Jonathan Kozol, author of numerous books on education, including “Letters to a Young Teacher,” makes the point that no test, no matter how refined, could adequately measure what he thinks is a good teacher’s greatest quality: that he or she loves being with students. The only way you can gauge that, he says, is by watching them teach.
And Jason Richwine and Lindsey Burke, both of the conservative think tank the Heritage Foundation, argued recently in The Atlantic that having knowledge and being able to impart it are two different things. They wrote:
“A teacher with a doctorate degree, every certification and license available, and 15 years of experience is no more likely to be a high performer than a teacher with a B.A., the minimal certification, and five years of experience.”
In the end, this discussion often ends up in Finland. It’s the Magic Kingdom of Education, the place the experts talk about when they imagine what American teachers could be. Roughly 40 years ago, the Finnish government concluded that the key to the country’s economic future was a first-class public education system. And the key to that was a system that gave teachers the prestige of doctors.
To even be accepted into a Finnish teacher education program, candidates must be at the top of their class, complete exams on pedagogy, be observed often in clinical settings, and pass a challenging interview. Only about 1 in 10 Finnish applicants are accepted to study to be teachers. And while the U.S. has more than 1,200 universities that train teachers, Finland has only eight. In short, teachers need to earn the right to feel special.
So, does that elevated status of teachers there result in better students? Yes, you could say that. In science, in math, in reading, Finnish students consistently rank at or near the top of the international PISA assessments.
Here are other recent innovations in education:
- Never start by trying to learn Chinese: One of the hot trends in higher education is predictive analytics, which mines student data to identify students at risk of dropping out, and to flag which course sequences are more likely to keep kids in school and which are more likely to push them toward the exit.
- Even tests can be all about you: A new online portal called Smart Sparrow allows teachers to offer material that’s adapted specifically to a student. For instance, quiz questions can be based on how a student answered the previous question: get it right and the next question is harder; get it wrong and it’s easier. (A bare-bones sketch of that rule follows this list.)
- Do the math: A company called Mango Learning is building a reputation for its mobile apps that teach grade school kids math. They’re interactive games that supposedly can make kids even want to add decimals.
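Smart Sparrow’s engine is proprietary, so the sketch promised above is only an illustration of the harder-if-right, easier-if-wrong rule; the question bank and function names are invented:

```python
import random

# Hypothetical question bank keyed by difficulty level (1 = easiest).
QUESTIONS = {
    1: ["What is 2 + 3?", "What is 10 - 4?"],
    2: ["What is 12 x 11?", "What is 96 / 8?"],
    3: ["What is 17% of 250?", "What is 3.6 x 2.5?"],
}

def next_level(current: int, answered_correctly: bool) -> int:
    """Step difficulty up after a right answer, down after a wrong one."""
    step = 1 if answered_correctly else -1
    return min(max(current + step, min(QUESTIONS)), max(QUESTIONS))

level = 2
for answered_correctly in (True, True, False):  # simulated student responses
    print(f"Level {level}: {random.choice(QUESTIONS[level])}")
    level = next_level(level, answered_correctly)
```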
Video bonus: The Young Turks online news show offers its take on what makes Finnish education so special.
March 29, 2013
Depending on who you’re listening to, Massive Open Online Courses, aka MOOCs, are either the greatest boon to the spread of knowledge since Gutenberg cranked his first press or the biggest threat to learning on campus since the coming of cheap beer.
No question that they are the most disruptive innovation to come out of universities in a very long time, although it’s still too soon to say if that’s “good” disruptive or bad. A quick refresher: Though free online courses, notably through Khan Academy, were already starting to build an audience, the first MOOC by a university professor popped up at Stanford in the fall of 2011 when Sebastian Thrun, also head of the team behind Google’s driverless car, decided that he and his colleague, Peter Norvig, would offer online–and free–their course on artificial intelligence. About 160,000 people around the world signed up.
The following semester Thrun left Stanford–which didn’t particularly like the free part of his grand experiment–and started his own online education service called Udacity. A few months later, two more Stanford computer scientists, Andrew Ng and Daphne Koller, got venture capital backing to create another online company named Coursera, built around the model of signing up professors from top universities to teach classes. And then last fall, MIT and Harvard anted up, jumping in with a MOOC service they called edX.
A lot of professors who taught in the first wave of MOOCs were effusive about the experience, especially about having the opportunity to reach more than 100,000 people all over the world with just one class. But plenty of others wondered what really had been let out of the bottle, and how, once people got used to the idea of free college courses, they would feel about the old model, you know, the one involving payment of tens of thousands of dollars.
Views from the front line
So, more than a year has passed since Thrun went to the free side and MOOCs–and the philosophy they promulgate of valuing competency more and time in the classroom less–are clearly gaining momentum.
Last week the State University of New York’s Board of Trustees approved an ambitious program of online education, including MOOCs designed to help students finish their degrees in less time for less money. The week before that, Darrell Steinberg, a leader of California’s State Senate, introduced legislation that would allow students who are shut out of a course and unable to find a comparable one to get full credit for it by taking a MOOC.
Also, the National Science Foundation has kicked in $200,000 to study a free online course in electronics offered through MIT last year, with the goal of comparing data and feedback from students who took the class online with what was gathered from those who took the same course in a classroom setting.
But a bit of analysis already has been done, in the form of a survey published by The Chronicle of Higher Education earlier this month. More than 100 professors who have taught MOOCs responded to an online questionnaire. Among the highlights of their feedback:
- Almost 80 percent said they think MOOCs are worth all the hype–although the Chronicle did point out that the professors most enthusiastic about the experience were more likely to respond.
- Eighty-six percent said they thought MOOCs would eventually reduce the cost of getting a college degree (45 percent said significantly, 41 percent marginally).
- But 72 percent said they didn’t think free online students should receive full credit from their universities.
The dark side
It is a noble notion, this idea of first-rate professors sharing their wisdom with knowledge-hungry students around the world, playing the role of “sage on the stage,” as the New York Times’ Thomas Friedman put it recently.
In practice, it hasn’t been such an idyllic model. The large majority of people who sign up for free online courses are what Phil Hill, an education consultant who has analyzed some of the MOOC data, refers to as “lurkers.” These are people who perhaps watch a video or two, but then drop out–a lot never get beyond registering. Hill says as many as 60 to 80 percent of MOOC students never make it past the second week of a course.
It’s apparently not unusual for as many as 90 percent of those who sign up for a free online class to drop out before they finish it. In one case, a bioelectronics course offered by Duke University through Coursera, only 3 percent of those who registered made it to the final exam.
Proponents of free online classes acknowledge that a lot of people who sign up for MOOCs are more curious than committed, and with neither a financial investment nor the option to earn credit, they don’t feel compelled to stick it out to the end. More often now, universities are providing certificates to students who finish a course, for a nominal fee, generally under $100.
For professors, a big part of the motivation to teach MOOCs, according to the Chronicle survey, was the sense that mass online education is inevitable and that it would be wise to get ahead of the curve. Many also said they thought the experience made them better teachers.
But some believe the trend doesn’t bode well for many universities, particularly smaller ones and community colleges. Michael Cusumano, a professor at MIT’s Sloan School of Management, sees a troubling parallel with what happened to newspapers. “Free is actually very elitist,” Cusumano wrote recently in the monthly magazine of the Association for Computing Machinery. The result, he warns, could be a “few, large well-off survivors” and far more casualties.
His worst case scenario is “if increasing numbers of universities and colleges joined the free online education movement and set a new threshold price for the industry–zero–which becomes commonly accepted and difficult to undo.”
Adds Cusumano: “Will two-thirds of the education industry disappear? Maybe not, but maybe! It is hard to believe that we will be better off as a society with only a few remaining megawealthy universities.”
Here are other recent developments in open online learning:
- “Like” us if you’d rather not have a mid-term: The first MOOC service based in the U.K., called Futurelearn, launched in December and will be offering classes later this year. Its CEO says that one day people may congregate around online learning courses the way they now do around Facebook.
- Engineering can be fun! No, really: Brown University has begun offering a free, six-week online course designed to encourage more kids to consider careers in engineering.
- All MOOCs, all the time: And in Rwanda, a non-profit called Generation Rwanda is moving ahead with creating a “university” for which all of the courses are taught online by professors elsewhere.
Video bonus: Here’s a bit more on MOOCs in a New York Times video report.
January 7, 2013
Here in Washington we have heard of this thing you call “advance planning,” but we are not yet ready to embrace it. A bit too futuristic.
Still, we can’t help but admire from afar those who attempt to predict what could happen more than a month from now. So I was impressed a few weeks ago when the big thinkers at IBM imagined the world five years hence and identified what they believe will be five areas of innovation that will have the greatest impact on our daily lives.
They’ve been doing this for a few years now, but this time the wonky whizzes followed a theme–the five human senses. Not that they’re saying that by 2018, we’ll all be able to see, hear and smell better, but rather that machines will–that by using quickly evolving sensory and cognitive technologies, computers will accelerate their transformation from data retrieval and processing engines to thinking tools.
See a pattern?
Today, let’s deal with vision. It’s a logical leap to assume that IBM might be referring to Google’s Project Glass. No question that it has redefined the role of glasses, from geeky accessory that helps us see better to a combo smartphone/data-dive device we’ll someday wear on our faces.
But that’s not what the IBMers are talking about. They’re focused on machine vision, specifically pattern recognition, whereby, through repeated exposure to images, computers are able to identify things.
As it turns out, Google happened to be involved in one of last year’s more notable pattern recognition experiments, a project in which a network of 1,000 computers using 16,000 processors was, after examining 10 million images from YouTube videos, able to teach itself what a cat looked like.
What made this particularly impressive is that the computers were able to do so without any human guidance about what to look for. All the learning was done through the machines working together to decide which features of cats merited their attention and which patterns mattered.
And that’s the model for how machines will learn vision. Here’s how John Smith, a senior manager in IBM’s Intelligent Information Management group, explains it:
“Let’s say we wanted to teach a computer what a beach looks like. We would start by showing the computer many examples of beach scenes. The computer would turn those pictures into distinct features, such as color distributions, texture patterns, edge information, or motion information in the case of video. Then, the computer would begin to learn how to discriminate beach scenes from other scenes based on these different features. For instance, it would learn that for a beach scene, certain color distributions are typically found, compared to a downtown cityscape.”
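Smith’s recipe maps almost directly onto code. Here is a minimal sketch using just the first feature he mentions, color distributions, with a simple classifier standing in for the learning step; the filenames are hypothetical, and a real system would add texture, edge and motion features plus far more training data:

```python
# pip install numpy scikit-learn pillow
import numpy as np
from PIL import Image
from sklearn.linear_model import LogisticRegression

def color_histogram(path: str, bins: int = 8) -> np.ndarray:
    """Turn an image into the color-distribution feature Smith describes."""
    pixels = np.asarray(Image.open(path).convert("RGB")).reshape(-1, 3)
    hist, _ = np.histogramdd(pixels, bins=(bins,) * 3, range=((0, 256),) * 3)
    return hist.ravel() / pixels.shape[0]  # normalize away the image size

# Hypothetical labeled examples: 1 = beach scene, 0 = downtown cityscape.
paths = ["beach1.jpg", "beach2.jpg", "city1.jpg", "city2.jpg"]
labels = [1, 1, 0, 0]
features = np.stack([color_histogram(p) for p in paths])

# The "learning to discriminate" step: fit a classifier on those features.
model = LogisticRegression(max_iter=1000).fit(features, labels)
print(model.predict([color_histogram("mystery.jpg")]))  # [1] means "beach"
```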
How smart is smart?
Good for them. But face it, identifying a beach is pretty basic stuff for most of us humans. Could we be getting carried away about how much thinking machines will be able to do for us?
Gary Marcus, a psychology professor at New York University, thinks so. Writing recently on The New Yorker’s website, he concludes that while much progress has been made in what’s become known as “deep learning,” machines still have a long way to go before they should be considered truly intelligent.
“Realistically, deep learning is only part of the larger challenge of building intelligent machines. Such techniques lack ways of representing causal relationships (such as between diseases and their symptoms), and are likely to face challenges in acquiring abstract ideas like ‘sibling’ or ‘identical to.’ They have no obvious ways of performing logical inferences, and they are also still a long way from integrating abstract knowledge, such as information about what objects are, what they are for, and how they are typically used.”
The folks at IBM would no doubt acknowledge as much. Machine learning comes in steps, not leaps.
But they believe that within five years, deep learning will have taken enough forward steps that computers will, for instance, start playing a much bigger role in medical diagnosis, that they could actually become better than doctors when it comes to spotting tumors, blood clots or diseased tissue in MRIs, X-rays or CT scans.
And that could make a big difference in our lives.
Seeing is believing
Here are more ways machine vision is having an impact on our lives:
- Putting your best arm forward: Technology developed at the University of Pittsburgh uses pattern recognition to enable paraplegics to control a robotic arm with their brains.
- Your mouth says yes, but your brain says no: Researchers at Stanford found that using pattern recognition algorithms on MRI scans of brains could help them determine if someone actually had lower back pain or if they were faking it.
- When your moles are ready for their close-ups: Last year a Romanian startup named SkinVision launched an iPhone app that allows people to take a picture of moles on their skin and then have SkinVision’s recognition software identify any irregularities and point out the risk level–without offering an actual diagnosis. The next step is to make it possible for people to send images of their skin directly to their dermatologist.
- Have I got a deal for you: Now under development is a marketing technology called Facedeals. It works like this: Once a camera at a store entrance recognizes you, you’re sent customized in-store deals on your smartphone. And yes, you’d have to opt in first.
- I’d know that seal anywhere: A computerized photo-ID system that uses pattern recognition is helping British scientists track gray seals, which have unique markings on their coats.
Video bonus: While we’re on the subject of artificial intelligence, here’s a robot swarm playing Beethoven, compliments of scientists at Georgia Tech. Bet you didn’t expect to see that today.