I tried three apps that claim to make you more likable—and am now addicted to one of them
I have a confession to make: I might have cheated to get this job.
Last spring, when I was applying to Fusion, I heard about a new start-up called Crystal that promised to, among other things, help you write the perfect email. Crystal scrapes the web for public data on a person, from sites like Facebook, Google, Twitter and LinkedIn, and then feeds it into an algorithm to figure out their personality and the best way to talk to them. Think of it like a data-driven cheat sheet for what to say and how to say it.
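Crystal hasn’t said exactly how its algorithm works, but the basic recipe it describes (harvest someone’s public writing, then map word choices onto a personality type) is easy to caricature in code. Here is a minimal, hypothetical sketch in Python; the trait names and word lists are my own stand-ins, not anything Crystal actually uses:

```python
from collections import Counter
import re

# Hypothetical trait cues -- purely illustrative. Crystal's real model,
# categories, and training data are not public.
TRAIT_CUES = {
    "expressive": {"!", "love", "amazing", "excited", "fun"},
    "analytical": {"data", "analysis", "evidence", "precisely"},
    "direct": {"now", "must", "need", "decide", "asap"},
    "supportive": {"team", "together", "thanks", "appreciate", "help"},
}

def profile(text: str) -> dict:
    """Score a writing sample by how often it uses each trait's cue words."""
    tokens = re.findall(r"[a-z']+|!", text.lower())
    counts = Counter(tokens)
    total = max(sum(counts.values()), 1)
    return {trait: sum(counts[w] for w in cues) / total
            for trait, cues in TRAIT_CUES.items()}

scores = profile("Love this idea! Amazing work, so excited to see more.")
print(max(scores, key=scores.get))  # prints "expressive"
```

The real product presumably layers a trained model and far more data on top of something like this, but the output is the same kind of thing: a label, plus writing advice keyed to that label.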
Before emailing Alexis Madrigal, Fusion’s Editor-in-Chief, I navigated over to Crystal and entered his name into the search bar. It spat out a personality profile, complete with pointers on how to write to him.
Per Crystal’s instructions, I included an exclamation point in my email to Alexis so that even my brief note confirming when and where to meet sounded “emotionally expressive.” In our interview, I took Crystal’s cue to mention my own abstract ideas about how to cover tech. I changed subjects a lot. I think I even took a stab at a joke.
My pre-interview online stalking seemed to pay off: the interview went great. I made it to the next round, an interview with Fusion editor Kashmir Hill.
Obviously, I looked her up on Crystal, too.
I took Crystal’s advice and tried to keep the conversation energetic and friendly, hoping Kashmir would come away with a good impression. I’m usually pretty chatty and high-energy anyway, but Crystal served as a reminder to keep it up even under pressure.
Whether at a job interview or on a Tinder date, you’re usually flying blind when you meet someone for the first time. But Crystal’s data-crunching offered me the confidence and power of at least feeling like I knew someone before we’d ever met.
I got the job. I feel like Crystal helped, which raised all kinds of questions for me. Had I used an algorithm to artificially enhance my chances of landing the job? Did using Crystal somehow change who I was, or at least who I presented myself to be?
These days, technology doesn’t just optimize how we get cabs or groceries; it can optimize us. There are technologies out there that offer a chance at a personality makeover—that tweak and twist who you are to enhance your appeal, like an Instagram filter for the soul.
Naturally, I wanted to try more of them out.
So I recently spent a month filtering my communications through Crystal and two other apps that I hoped might make me, ultimately, a more likable person. In doing so, I caught a glimpse of a future when all communication between humans might be buffered by a layer of conversation-optimizing machines.
The next app I tried was Moodies, which decodes people’s emotions based on their voice. Moodies doesn’t care what you say, but how you say it, homing in on vocal qualities like volume, pitch and tone to determine how a person is feeling in real time, as they speak.
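Beyond Verbal’s model is proprietary, but the raw signals Moodies listens for (volume, pitch, tone) are standard acoustic features. As a rough sketch of what “how you say it” means in practice, here is how those low-level features might be extracted from a recording with the open-source librosa library; the filename is made up, and the choice of features is my assumption, not Beyond Verbal’s:

```python
import numpy as np
import librosa  # pip install librosa

def vocal_features(path: str) -> dict:
    """Extract frame-wise loudness and pitch, the kind of low-level
    acoustic signals an emotion detector might feed to a classifier."""
    y, sr = librosa.load(path, sr=16000)
    rms = librosa.feature.rms(y=y)[0]          # loudness per frame
    f0, voiced, _ = librosa.pyin(              # fundamental frequency
        y, fmin=librosa.note_to_hz("C2"),
        fmax=librosa.note_to_hz("C7"), sr=sr)
    f0 = f0[voiced]                            # keep only voiced frames
    return {
        "mean_loudness": float(np.mean(rms)),
        "pitch_mean_hz": float(np.nanmean(f0)),
        "pitch_range_hz": float(np.nanmax(f0) - np.nanmin(f0)),
    }

print(vocal_features("voicemail_rehearsal.wav"))  # hypothetical recording
```

The hard, opaque part is the last mile: some trained classifier has to map numbers like these onto labels like “lonely” or “aggressive,” which is presumably where my perfectly pleasant source got lost in translation.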
As a reporter, I thought that knowing how a source was feeling could help me better navigate an interview—maybe, say, sussing out the perfect moment to drop the tough question. But when I tried Moodies on a source I talk to all the time, the app kept telling me she was “aggressive” and “lonely,” even though our conversation seemed perfectly pleasant. After that, I took Moodies on a first date, and there, the algorithm’s reading seemed even further from reality. Moodies told me that my date’s tone of voice was hostile and angry, but at the end of the evening, he walked me home and kissed me goodnight.
I called up Dan Emodi, marketing head for Beyond Verbal, the company that makes Moodies. He told me that in its current incarnation, the app is meant to be used on yourself, as a way to hear how others might interpret how you sound, not as a form of ESP for reading other people.
“It’s all about being more in touch with our own emotions,” Emodi said.
So I used Moodies before calling an ex-boyfriend. The first time I rehearsed, my voice was slow and quiet, which Moodies read as lonely and unhappy. I tried a few more times, slightly speeding up my pace and injecting a little pep into my cadence. Once Moodies said I sounded friendly, I called and left a voicemail. I’m not sure whether it was Moodies’ “emotions analytics” that did the trick or if it was simply paying attention to how I sounded, but the result was an improvement over my pitiful rehearsal.
The Moodies app, which is free, is really just a party trick from Beyond Verbal; its real business is supplying its technology to call centers, where human workers use it to track customers’ moods and frustration levels. Theoretically, it can guide operators to course-correct mid-call to win over an angry customer, and, more importantly, it can tell managers which operators tend to get on people’s nerves. Emodi told me that call centers using Beyond Verbal’s technology were able to predict the results of customer satisfaction surveys about 80 percent of the time, a good indication of accuracy. Beyond Verbal imagines embedding its software into wearable tech, to monitor health and emotional well-being over time, or to coach someone who’s socially awkward on how to better fit in. Monitoring our voices, Beyond Verbal firmly believes, is a technological shortcut to a better self.
Next, I tried Us+, a Google Hangouts extension that “analyzes speech and facial expression to improve conversation.” During video chats, Us+ tries to improve how well your communication style matches that of the person to whom you’re talking, an idea rooted in what’s called Linguistic Style Matching. When you’re talking to someone using Us+, pop-up windows appear on screen telling you how the other person is feeling or reminding you to do things like pay more attention to them. Sometimes it actively intervenes, muting your mic if you talk too much, leaving you to blather on without sound until you realize you’ve been cut off.
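Linguistic Style Matching is a real idea from psychology, associated with James Pennebaker’s lab: two speakers “match” to the degree that they use function words (pronouns, articles, prepositions, and so on) at similar rates. A toy version of the standard score is short enough to show here; the mini word lists are mine, where the published work uses full LIWC categories:

```python
import re

# Tiny stand-in function-word lists; the published LSM measure uses
# complete LIWC categories instead.
CATEGORIES = {
    "pronouns": {"i", "you", "we", "he", "she", "they", "it"},
    "articles": {"a", "an", "the"},
    "conjunctions": {"and", "but", "or", "so", "because"},
    "prepositions": {"in", "on", "at", "with", "to", "of", "for"},
}

def rates(text: str) -> dict:
    """Fraction of a text's words falling in each function-word category."""
    words = re.findall(r"[a-z']+", text.lower())
    n = max(len(words), 1)
    return {c: sum(w in vocab for w in words) / n
            for c, vocab in CATEGORIES.items()}

def lsm(text_a: str, text_b: str) -> float:
    """Style-matching score: 1.0 means identical usage rates, 0 means none."""
    ra, rb = rates(text_a), rates(text_b)
    return sum(1 - abs(ra[c] - rb[c]) / (ra[c] + rb[c] + 0.0001)
               for c in CATEGORIES) / len(CATEGORIES)

print(lsm("I think we should go to the beach because it is sunny.",
          "We could go, but I want to stay in the shade."))
```

An app like Us+ would compute something like this on live transcripts and nudge whichever speaker is drifting out of sync.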
[Video demo of Us+: https://vimeo.com/81903116]
I used Us+ for a long-distance meeting with my editor one day when we were working from different cities. It wasn’t helpful. As I pitched stories, Us+ kept muting me for talking too much. We were both distracted by what the computer thought about our emotional states, the technology hijacking the meeting and turning it into a meta conversation about the conversation.
A few weeks later, I used Us+ while vacation planning with my friend Hope. Us+ kept telling me she seemed frustrated or sad, which was surprising considering that we were planning an awesome vacation to Greece. Hope later told me she was only frustrated because the app kept muting me every time I read off a hotel description.
Us+ creator Lauren McCarthy, an artist and programmer at New York University, told me that the tension her technology creates was part of her intent.
“We sometimes think the computer is so smart that it will solve all our problems,” she said. “But part of the point is to recognize that there are things the computer can’t do as well as a human.”
McCarthy and her co-creator Kyle McDonald intended Us+ to function both as a useful technology and as a critique of one. Communication, she said, has become so layered with technology that it can be hard to tell man from machine.
In the research labs of companies like the emotion analytics firm Affectiva, scientists are perfecting facial detection tools that can interpret what humans are thinking and feeling. Oovoo, a competitor to Skype, is working to integrate Affectiva’s software into video calls to create a commercially viable answer to Us+’s arty critique. Their first product, Flinch, released in January, was a staring-contest game: the app read facial expressions to determine who was first to “flinch.” Eventually, though, Oovoo wants to use the technology to create a lie detector for job interviews.
But that future still seems a long way off. Moodies and Us+ only served to make me hyperaware of how I sound to a machine, not to another person. The voice and expression translation tools commercially available now aren’t yet sophisticated enough to really enhance how we communicate in everyday life.
Daniel McDuff, a scientist at MIT’s Media Lab, told me that in the near-term, these technologies will mostly enable better communications between humans and machines. He studies how to make computers more receptive to human emotion, so that Siri, for example, could detect the sadness in your voice when you request a restaurant recommendation and, say, point you to a good comfort-food spot. One day McDuff imagines emotionally sensitive robots could perform tasks like monitoring mental health patients around the clock, detecting emotional flare-ups to alert a doctor or therapist.
Of the technology I tried, only Crystal, with its ability to data-crunch tons of information about someone in a way the human brain can’t, made me feel like a part of the future had arrived.
Crystal has just 64 personality types, and humans rarely fit so neatly into any given box. But when I sent a dozen or so friends and family members their Crystal profiles, most of them were creeped out by how accurate they were (and creeped out by me for looking them up).
For now, Crystal is intended for the workplace, but I found myself using it on everyone. Instead of phoning a friend, I turned to Crystal to help fashion an email to an ex asking him to mail back my stuff. One of Crystal’s features goes so far as to suggest specific tweaks to emails based on the personality of the recipient. I used it when sending a difficult text to a friend going through her own breakup, pasting the text into the Crystal website for feedback on how she might interpret my attempt at tough-love comfort.
At times I can be brash and blunt, and sometimes my short electronic missives are misinterpreted as harsh. Crystal often suggested, depending on the person, that I start with “hey” instead of “hi” or make sure my email was short and to the point rather than blathering on. (It also frequently suggested I include emoticons, though I never did. It just didn’t seem like me.)
Still, I didn’t know whether Crystal actually made me better at communicating, or, like Moodies, was just an effective reminder to be more thoughtful.
“Crystal often stops me from writing things I might naturally write that the person doesn’t really want to hear,” said Crystal founder Drew D’Agostino. “That’s really all about empathy.”
Eventually, as Crystal’s data set and algorithm improve, D’Agostino imagines integrating Crystal with other things, like online dating. I told D’Agostino that I was terrified of a future where I can no longer evaluate the worthiness of a Tinder match based on his grammar and choice of words. And if we’re all using machines to communicate, at what point does the algorithm start optimizing for communication with other machines? When does it stop being two humans communicating?
Crystal, D’Agostino reassured me, isn’t replacing human work. It’s merely prompting you to consider how the person on the other end of a send button might read what you say. A future where Crystal writes the perfect witty one-liner to your Tinder match, he said, would be taking it too far.
Crystal gave me some pointers on how to approach an interview, but it didn’t go so far as to give me real-time feedback on what to say. It seems safe to say that I earned the job myself, albeit with a little algorithmic help.
A version of this piece was presented live on stage in New York on September 14, 2015 at our second Real Future Fair: The Real Future of Deception.