Posts tagged Lives
Don’t mourn Twitpic. The popular service will continue to host images and video on Twitter thanks to an acquisition, the company tweeted Thursday.
This is the company’s first tweet since Sept. 4, when it announced it would be shutting down on Sept. 25 following a trademark skirmish with Twitter. The micro-blogging giant decided now was the time to take action against Twitpic, even though the startup has been around since 2008.
“Unfortunately we do not have the resources to fend off a large company like Twitter to maintain our mark which we believe whole heartedly is rightfully ours,” founder Noah Everett wrote. “Therefore, we have decided to shut down Twitpic.”
Now it looks like the acquisition will give the company the resources it needs to stay in business. Twitpic has not revealed any other details, or even the identity of the buyer.
Twitpic’s news comes as an apparent relief; as of publication, the announcement tweet had more than 4,000 retweets.
Lead image by Homard.net
View full post on ReadWrite
Below is what happened in search today, as reported on Search Engine Land and from other places across the web. From Search Engine Land: Google Authorship May Be Dead, But Author Rank Is Not Google ended its three-year experiment with Google Authorship yesterday, but the use of Author Rank to…
Please visit Search Engine Land for the full article.
ReadWriteBuilders is a series of interviews with developers, designers and other architects of the programmable future.
Cynthia Breazeal has been called the mother of personal robotics, and it’s easy to see why. For more than ten years, she has brought forth robotic life that is designed to express, gesture, and relate to humans.
It’s a far cry from most of modern robotics, which is about utilizing robots for production and labor. Her earliest creations, the saucer-eyed Kismet and cuddly Leonardo, aren’t much for manual labor, but they can emote and respond to emotions almost like people.
At the end of the day, Breazeal’s work is about using simulated life to help humans make connections in our own lives. In a TED talk, she recounted a project she’d conducted during her work at the MIT Media Lab in which she tried to help people eat healthy food. Some test subjects received a paper notepad, others a computer, and the last third a responsive robot named Autom. People trusted Autom the most because they related to it as a personal assistant.
Breazeal’s latest and most groundbreaking robot is Jibo, a personal robot designed to live in your home with your family. With an expressive facial screen and a youthful voice, Jibo suggests a friendly helper, evocative of big screen robot fantasies. Since Jibo’s Indiegogo launch on July 16, it has raised more than $1.5 million—about 15 times its original $100,000 funding goal.
I talked to Breazeal about her work in personal robotics and how she thinks robots can help us to better understand humanity.
Our Robots, Ourselves
RW: What inspired your shift from working with space robotics to social robotics?
Cynthia Breazeal: When I was an undergraduate, I was at UC Santa Barbara, which at the time had a Center for Excellence in Robotics. I was fascinated by space and I wanted to be an astronaut.
To be an astronaut I knew I’d have to get a PhD in a relevant field, so I decided it would be robotics. It fortuitously turned out that the year I was applying to graduate schools, Rod Brooks [founder of Rethink Robotics], who was my PhD advisor, had just started a program developing planetary microrovers.
He wrote this famous paper—there was later a film of the same title—Fast, Cheap and Out of Control: A Robot Invasion of the Solar System. It argued that, rather than sending out a few large teleoperated rovers like we did to the moon, if you really wanted to explore Mars you needed to send many small, much more autonomous robots. Because of the signal lag between Earth and Mars, the robots would have to have more autonomy in order to be effective explorers.
That was the project that was available when I applied for graduate school, and that’s how I really got my start in autonomous robots. When I was at UC Santa Barbara it was more about manufacturing robots; this was my first exposure to biologically inspired autonomous robots, robots modeled after creatures rather than machines. And for me it was like, you know, “Oh my God, Star Wars is really happening. I have to be a part of this!” (laughs)
Rod is credited with shifting NASA’s stance toward smaller microrovers, too, and it was a tremendously successful project. When NASA landed Sojourner on Mars I remember, as a graduate student, watching this huge success for all of robotics and thinking, “OK, we have robots we’ll send into the ocean. We’ll send them into volcanos. And now we’ll send them to Mars. But they’re not in our homes. What’s up with that? Where are the robots?”
It occurred to me that the human environment and people are, in so many ways, so much more complex than environments where there’s nothing around you. And then I started thinking, “Well, what would it really take to have a robot be able to live with people? To have anyone be able to interact with them?”
Social robotics is very much the analog of early computing. We once built extremely expensive computers that only experts knew how to use, and today we build extremely expensive robots that only experts know how to use. The mind shift is what happens when they go into every home and onto every desk. That was it. That was social robotics: “How do people naturally want to interact with machines?”
Well, people naturally anthropomorphize things that sense the world on their own, make decisions on their own, and exhibit this autonomy. Given that this is how people naturally make sense of autonomous robots, that’s how we need to support them. People try to understand these robots not as things governed by the laws of physics, but as entities with states of mind: beliefs, intents, desires, emotions.
That was the impetus for social robotics. The final frontier of robotics is actually the human environment and robots need to be social in order to engage with us in a natural way, to be part of our lives.
Social Doesn’t Necessarily Mean Humanoid
RW: What differentiates a “social robot” from other humanoid robots?
CB: A humanoid robot typically refers very much to the morphology of the machine. So to have legs, arms, a head, basically the same physical attributes as a human. A lot of the impetus or motivation behind humanoid robots was that you could have robots in a human environment.
We have engineered so much of this environment for our morphology. If you built a robot that could basically be of human stature and size then it could also use all the tools and navigate the spaces around it as well. That was the original rationale behind humanoid robots.
You can have humanoids that are purely functional. They can carry things and never have to interact with people. Social robots are really about robots that engage people in this interpersonal dimension. Social robots don’t have to be humanoid at all, as it turns out, as long as they interact with us in this social, emotional way.
These are two distinct research areas, but they overlap as well.
RW: How have human expectations of robot connections changed since you first broke out with Kismet more than a decade ago? Back then, we didn’t have jokes about talking to Siri or the iPhone. Now, many of us expect to have a digital companion at all times. How have your audience’s expectations evolved?
CB: Yeah, you know it’s really fascinating. I think one of the challenges of robots, especially personal robots, is that it is a technology people have fantasized about, written books about, and made movies about long before the technology ever really existed. People have always had really profound expectations of what robots could be, from the extremely human-like social to the superhuman to the completely mundane—I mean, all over the map.
Surely all those images are still all around us. You still have the cultural joke about the robot overlords. They’re just part of our culture.
I do think what’s fascinating since the mobile computing revolution is that, because people are used to having little computers in their pockets and with them everywhere, people now have a much greater familiarity with technology in general. Everybody has a smartphone; everybody understands downloading apps.
Because of that, I think there’s probably a greater receptivity in terms of expectation that robots clearly are coming very soon. If I’m talking to my smartphone like Siri, clearly the robots must be coming soon.
It’s funny, I’ve talked to a lot of people about that. It’s almost as if they say, “I grew up with the Jetsons dreaming of robots, and what I got instead was a smartphone. But finally the robots might be actually coming.” I think there’s a real anticipation that personal robots are really happening, they’re going to come into our homes in a much bigger way. I think that’s what’s changed because of the mobile computing revolution.
An Electronic Helping Hand
RW: Your robots Kismet and Leonardo are pre-verbal. How do robots that can understand but not speak English help people?
CB: This comes from an appreciation that how we communicate is extremely rich and extremely multimodal. For a very long time, a lot of AI focused on spoken conversational interaction, and that’s logical, right? People want to be able to talk to each other and to have machines understand them. But very few people are actually looking at the nonverbal dimension, and it’s a profound dimension of human communication.
People have done studies showing that if a person says something but their body communicates a different attitude, people believe the body and not what’s said. How the body expresses this nonverbal dimension is also profoundly meaningful, convincing, and persuasive to people.
When you look at a lot of work in the formation of human social judgements—“Are you an ally or a foe? Can I trust you? Do you have my best interests at heart? Do I like you? Do you like me? Can we relate?”—in all of these things, the nonverbal behavior has a tremendous role in how we make these very quick assessments of one another. And that has a huge consequence on interpersonal interaction and our effectiveness at working with each other as a team. It is a critical dimension of human communication and collaboration.
When we started off with robots, it was really almost more of a stake in the ground saying, “There’s already so much work in spoken language. Let’s focus on embodiment because what do robots have? Bodies!” Bodies are a tremendous asset to robots and they are not only an asset in terms of doing physical work, they’re a tremendous asset in how we interact with them that has not been explored nearly enough.
Even how robots understand the nonverbal behavior of people, in not just gestures but even action patterns—“How can I confirm the intent of a person by watching their activity?”—we do this so naturally. Humans have this profoundly rich theory of mind. The question is how do you design robots that can also have that theory of mind.
We started with these robots that were preverbal. We weren’t trying to get right down to the semantic side, but they were doing a lot of heavy lifting in terms of these social cognitive theory of mind skills: intuiting and understanding intent, beliefs, and desires by watching what people were doing, without words ever having to be exchanged. It was a research stance, a stake in the ground, but one that had not been explored nearly as richly as it should be.
But of course as our work has continued to evolve, now we’re implementing more spoken language into it where you still have the richness of the nonverbal.
The Mechanical Hand That Rocks The Cradle
RW: Why did you choose to focus so much of your research on robots that care for the old, young, and chronically ill?
CB: When you look at the field of robotics, the dominant paradigm is that robots are a technology for doing physical work. If you can navigate an unstructured environment, surely you must be able to carry things, or vacuum. If you can manipulate things, clearly you can manufacture or unload trucks or things like that. It’s all been about the physical labor aspect.
Within social robotics, one of the big aha moments was that there’s profound utility in what robots can bring to people when it’s really about high touch engagement.
If you look at these domains like learning and health care and behavior change and aging independently, these are all domains where in the human professions they understand that high touch, this feeling of being supported by a social other, is really critical to the best outcomes for people. Information alone isn’t enough. Our technologies are great at information but they’re not great at high touch.
We live in this time now where the demand is far exceeding our institutional resources and human professionals. We need technology to step in and fill that gap, but it needs to be high touch technology to be the most effective. Social robots are a really powerful expression of high touch engagement that a technology can perform.
See also: 5 Adorable Robots At CES
We’ve been exploring it in domains like learning companions for young children and health coaches for weight management, showing that when a robot adopts the principles, strategies, and skills that people use, people in fact respond to it in a similar way and do better than with an intervention that doesn’t.
I think there’s a huge societal need for technology in general to be much more humanized. By that I mean treating people like human beings, not like other stuff the way most of our technology does, and being able to support this high touch engagement—not to replace people, but to supplement and fill the gap.
It’s really about augmenting and extending what people provide each other. It’s not about marginalizing or replacing people—we still need people, we just don’t have enough people. There’s a huge opportunity here, which is why I talk about emotions as the next wave of computing.
Peaks And Valleys
RW: Robotics has really taken off in part because the robot makers of today, including yourself, were movie robot lovers as kids. But many still have that Terminator distrust of robots, the fear of being replaced, as well as the evolutionary revulsion we experience with too-human robots. Do kids interact differently with robots than adults, and if so, how?
CB: Yeah, I think it’s interesting. Regarding the difference in adults versus kids… in my work, I think the way robots were first introduced to society was about replacing human labor. There’s a knee-jerk reaction from the past about robots trying to replace people and take away jobs. But in reality that’s not quite what happens.
As with any new technology, robots take over the jobs that people don’t necessarily want to do anyway, and they create new jobs. They empower people to do more interesting work.
There’s a great book, Race Against the Machine. We’re moving from racing against the machine to racing with the machines. I think we need to do a better job communicating this new, more enlightened philosophy: robots are supplementing what people do.
They’re meant to help support us and allow us to do the kind of work that humans in particular find much more interesting and much more fulfilling because humans are creative, humans do things that machines don’t. It’s really about this partnership, this teamwork.
If you set aside this initial reaction of robots trying to replace people, now we’re just talking about visceral reactions to robots. The uncanny valley. There’s a lot of science poking at what the uncanny valley is really about, but the premise is: when things take on more anthropomorphic qualities, like toys or Disney characters and so forth, we like it. It’s appealing. But when you get too close, when you’re really close to being like a human but still falling short, then you drop into the uncanny valley. That’s like, you know, a corpse.
Remember when Pixar did Tin Toy, the precursor to Toy Story? There was this baby in this concrete diaper—that was the joke. That animated short was intended to show how far computer graphics had come at doing people, and it was still way in the valley.
Now we have Shrek and other examples where, we can tell they’re not real humans, but they’re definitely out of the Valley, they’re very appealing. That’s an example of something that went into the valley and came out of the valley.
Robots are also at this point where we can design things that are more anthropomorphic and they’re not in the valley because they’re not trying to push it too hard. And then we have robots, like modern androids I’d say, that are definitely in the valley. But eventually I’m thinking it’s going to go the way of computer graphics and come out of the valley.
Now when you talk about just raw appeal or lack of appeal between adults and children, I think it’s very similar. I think children can be a little more flexible and forgiving, but what children and adults find appealing and unappealing is also similar. If adults can get past the Terminator fear or overlord fear or take-over-my-job fear and just look at the pure appeal of something as a character, it’s actually quite comparable.
RW: Do you think kids growing up interacting with robots will evolve beyond the uncanny valley experience?
CB: I think there are two things going on. Kids growing up are not only interacting with robots more, they’re actually building robots themselves. You definitely get a different sort of appreciation when you’re a creator of the technology as well as someone who interacts with it.
In terms of the uncanny valley, there are certain things that are just viscerally unappealing. I think that’s a physiological response, beyond conscious thinking. But I do think that children growing up with robots, creating them as well as living with them, are going to keep evolving how we think about this kind of autonomous, intelligent technology that isn’t just bits on a screen but is actually in our world and of our world: how it fits into our lives, how it fits into society, and what roles we feel comfortable having these machines play.
How We’ll Learn To Love The Machine
RW: In your studies, what characteristics do robots need to have for people to feel empathy for them? Both in terms of hardware and software?
CB: Even Charles Darwin said human beings are the most social and emotional of all species. It’s true—look at our facial expressions.
In terms of empathy, just based on our science fiction and mythology, it’s a profound human quest to think about creating or discovering things that are like us but not quite like us. It’s the fascination with the not-quite-human Other.
Whether it’s Hephaestus making maidens of gold that walk around and talk to people, or da Vinci’s [mechanical] knight, it seems like whenever we’ve created or advanced technology, very soon after we create machines that are that much more in our image. I think it’s a profound human quest to do this. I think we do it because it fascinates us. I think we do it to try to understand ourselves.
In terms of empathy, I think we’re naturally inclined to want to emotionally connect with all kinds of things. That’s why we have companion animals, so we can emotionally connect and relate to things that aren’t human at all. We can emotionally connect to ideas, to nature.
I think we want to emotionally connect to our artifacts—we already do—with our cars and things like that. It’s not surprising that we want to emotionally connect with intelligent machines, with robots. The challenge is not to get people to want to connect to robots. I think the challenge is to get robots to meet us halfway.
I think we already want to, it’s just human nature. I think we delight in it, it gives richness to our lives. I think what we want is to see the Other connect back with us. We want this reciprocity. It’s an AI challenge.
We can connect with things that aren’t anthropomorphic at all. But if you give them a few more cues, our brains just respond to them in a very subconscious way. The field of animation is an excellent example: an entire discipline inspired by human movement that discovered abstracted principles you can push until they’re not human at all—for instance, just squiggly lines on a screen—and still imbue with emotion.
So you need to understand what it is about the human mind that triggers these ways of thinking and understanding, and then you can work with these very abstract principles. When you put those qualities into a technology, it’s like you’re dancing with people. That’s the way they want to think, that’s how they respond, and that’s the way they want to try to understand.
It can be done digitally, it can be done through software, it can be done through hardware, and the more those signals reinforce one another in a coherent way, the stronger that perception is. The design space is actually ginormous on how you achieve that.
The quality of movement [is important]. Rectilinear movement is characteristic of machines as we know them today; biological movement happens in arcs. There are certain acceleration and deceleration profiles that organic motion adheres to that are typically not found in machines. The more a technology embodies those signals, the easier and more intuitive it is for us to see it as a living thing, a someone rather than a something, and to want to connect to it, because that’s the next obvious thing we want to do. And then it’s about how the robot reciprocates with us.
It is a stance almost. It’s attuned reciprocity. For the machine to engage us in the stance of communication, getting back to nonverbal communication, that really makes it happen.
RW: Do you think our increasingly connected world is causing us to lose empathy for fellow humans?
CB: I know there’s a lot of discussion around this right now in the media. I think it is a very important discussion to have because I think, for children in particular growing up, you’re hearing the stories of people breaking up with their girlfriend by texting and things like that.
The bottom line is interpersonal relationships are really important for people, and they can be wonderful, but they can be really complicated. They can be really hard. And they can be hurtful. But that doesn’t mean we should be avoiding them.
What I’m hearing a lot of concern being expressed about is children growing up not learning how to deal with negative emotions, especially negative emotions coming from other people, because they’re kind of offloading it to the technology. Because it is uncomfortable. It’s uncomfortable to say, “Hey, what you did or what I did, it wasn’t cool. Let’s talk about it.” It is a hard thing to do. But it’s a critical skill for people in society to be able to do to live with each other.
There’s no reason to believe that technologies can’t be designed to help strengthen human relationships. I think we just need to put a stake in the ground and say that’s what technology really needs to do. And when we make design choices and people use these technologies, we need to be cognizant of the good they do and the unintended side effects or consequences they have, learn from that, and try to rectify it.
I think that’s the bottom line. Technology is a powerful enabler of human beings and the human experience, and we definitely want to create technologies that fully support human values about what leads to a quality life. Sometimes we do things with this objective and don’t quite get there, or we learn that people are using a technology in a way that’s not the best possible thing you could want. That’s when the dialogue is important, and that’s when people need to create new technologies that make a conscious shift toward something more humanized, more supportive of the human experience and human relationships.
I’m very pro-technology, obviously, and I’m very pro-human relationships. They’re both critical.
RW: What is your opinion on the Robot Bill of Rights, a proposal to extend legal rights to social robots?
CB: Yeah, so this is very interesting. (Laughs)
I haven’t been following it really closely, but Kate Darling at the [MIT] Media Lab is wonderful. She wrote an article two years ago, and her argument was very much like the case for extending rights to animals: do we connect to robots in an interpersonal or affective way—like we do to animals—enough that we don’t like ourselves, or what it says about human society, if we mistreat those entities?
Even if you don’t ascribe any legal rights to animals, the fact that we connect to them and we care about them, that if we mistreat them it makes us feel really bad about what that says about ourselves and our society. So her premise was, whether robots are intelligent or whatever is almost irrelevant. It’s really about people and what it says about ourselves and our society.
I can see a rationale there at some point: if machines are relatable enough to us that we feel inhuman mistreating them, I think that is interesting. For me, just as a mother, whether it’s a pet or a piece of furniture in my house, I don’t want my kids abusing it. There’s just a certain civil behavior that I would like to see extended to things as well as to living things. It’s a baseline civility stance.
But I do think Kate’s point is a fascinating one, and I think it will keep evolving as these technologies evolve.
RW: You’ve said Jibo is “the world’s first social robot for the home.” Can you elaborate?
CB: Jibo is the world’s first family robot. Jibo’s really about how it interacts with and supports the interactions between human beings in the house. There’s so much stuff around the connected home and the Internet of Things that’s about how your stuff talks to one another. Social robotics is about how this technology engages you.
It’s about bringing this high touch, humanized engagement with technology into the home, drawing on the whole paradigm of social robots. It’s really about saying technology should be able to interact with us and support us more like a helpful partner than just a tool that we use.
That frees us up to be in the moment and to experience our life versus what we have with technology now: “Oh I want to take a picture, let me leave the moment, pick up my camera, take the picture, now I have all these pictures of my kids but I’m not in any of them, put the camera back down, and get into the action.”
By having something like Jibo I can stay in the action. I can be with my kids, I can be in the moment, and have that moment recorded. Because Jibo’s playing the role of the cameraman.
As a high order concept, it’s about bringing this helpfulness with heart to the home. This helpfulness is the idea that Jibo plays roles for you, to free you up to be in the flow of your life. Jibo also brings any kind of content to life with this more interpersonal engagement.
Jibo is a storyteller, not an e-reader, so even a children’s story is brought to life by the fact that Jibo treats you like its audience: looking at you, reading your expressions, telling you a story rather than just displaying a story while playing an audio file of it.
So that level of engagement is very different. We want to enable developers to bring that experience into their content, apps, and services, partly because I think it’s going to make their content more emotionally engaging for people, and partly because I think there are a lot of people in the world looking for technology that treats them in a much more humanized way.
CB: That decision was a conscious one, to court developers. Although there’s a lot of robotics research going on in ROS, there’s not necessarily a lot of social robotics research. In many ways I see Jibo as a bridge between the iPad-class devices of today and the sci-fi robot visions of the future.
We want to court those developers, people who design apps for tablets today, but we’re saying to them, “Hey, you’ve been making apps for flatscreens for a long time. What about something that feels alive?”
Black-and-white robot and shadow image by Flickr user JD Hancock; Braava-in-the-kitchen image courtesy of iRobot; baby and robots image by Adriana Lee for ReadWrite; all other images courtesy of Jibo.
Seo Shin Ae To Appear In g.o.d's New MV Of “The Story Of Our Lives”
Seo Shin Ae will appear in g.o.d's third music video, which is filmed like a drama. The teaser image of Seo Shin Ae has raised the expectations of eagerly awaiting fans. In the photo, Seo Shin Ae wears a high school uniform, looking cute unlike the g …
View full post on SEO – Google News
ClickZ Live New York kicked off with a keynote from Randi Zuckerberg on the top trends in tech and new media that are influencing the minds and behaviors of modern consumers, with insights on how to engage, convert, and delight them.
View full post on Search Engine Watch – Latest
A Town Where You Live's Kouji Seo to Resume Half & Half Manga
Anime News Network
Manga creator Kouji Seo is announcing in this year's 11th issue of Kodansha's Weekly Shōnen Magazine on Wednesday that he will resume his Half & Half series. The series will return in the June issue of Kodansha's Bessatsu Shōnen Magazine on May 9.
Google is constantly innovating. Sometimes those changes make our lives easier, and sometimes they leave us scrambling to make adjustments to our PPC campaigns. I identified 7 areas of improvement within the AdWords interface that could make everybody’s life much easier. Hey, we can dream, right? 1. See Conversions Action Name Details in the Dimension […]
The post 7 Dream AdWords Features That Would Make Our Lives Easier by @Rocco_Zebra_Adv appeared first on Search Engine Journal.
View full post on Search Engine Journal