Tristan Harris
When we get sucked into our smartphones or distracted, we think it's just an accident and our responsibility. But it's not. It's also because smartphones and apps hijack our innate psychological biases and vulnerabilities.
I learned about our minds' vulnerabilities when I was a magician. Magicians start by looking for blind spots, vulnerabilities and biases of people's minds, so they can influence what people do without them even realizing it. Once you know how to push people's buttons, you can play them like a piano. And this is exactly what technology does to your mind. App designers play your psychological vulnerabilities in the race to grab your attention.
If you're an app, how do you keep people hooked? Turn yourself into a slot machine.
The average person checks their phone 150 times a day. Why do we do this? Are we making 150 conscious choices? One major reason why is the number one psychological ingredient in slot machines: intermittent variable rewards.
If you want to maximize addictiveness, all tech designers need to do is link a user's action (like pulling a lever) with a variable reward. You pull a lever and immediately receive either an enticing reward (a match, a prize!) or nothing. Addictiveness is maximized when the rate of reward is most variable.
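The action-reward link described above can be sketched in a few lines. This is a minimal simulation, assuming a simple Bernoulli model in which each pull pays off with a fixed probability; `pull_lever`, `session`, and the probability value are illustrative, not any real app's logic:

```python
import random

def pull_lever(rng, reward_probability=0.3):
    """One 'pull' of the slot machine: a reward sometimes, nothing otherwise.

    Hypothetical model: each action pays off with a fixed probability, so
    the schedule is unpredictable from the user's side -- the property
    that keeps people pulling again.
    """
    return "reward" if rng.random() < reward_probability else None

def session(n_pulls, reward_probability=0.3, seed=None):
    """Simulate n_pulls phone checks and count the rewarded ones."""
    rng = random.Random(seed)
    pulls = [pull_lever(rng, reward_probability) for _ in range(n_pulls)]
    return sum(p is not None for p in pulls)
```

A variable-ratio schedule would vary `reward_probability` from pull to pull, which models the "most variable" case more closely; the fixed-probability version keeps the sketch short while preserving the key point that no single pull's outcome is predictable.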
Does this effect really work on people? Yes. Slot machines make more money in the United States than baseball, movies, and theme parks combined. Relative to other kinds of gambling, people get “problematically involved” with slot machines three to four times faster, according to New York University professor Natasha Dow Schüll, author of “Addiction by Design.”
A sense of belonging
But here's the unfortunate truth: Several billion people have a slot machine in their pocket.
When we pull our phone out of our pocket, we're playing a slot machine to see what notifications we have received. When we swipe down our finger to scroll the Instagram feed, we're playing a slot machine to see what photo comes next. When we “Pull to Refresh” our email, we're playing a slot machine to see what email we got.
Sometimes this is intentional: Apps and websites sprinkle intermittent variable rewards all over their products because it's good for business. Other times, for example with email or smartphones, it's an accident.
Another way technology hijacks our minds is by playing on the 1 percent chance we could be missing something important. But apps also exploit our need for social approval. When we see the notification “Your friend Marc tagged you in a photo,” we instantly feel our social approval and sense of belonging online. But it's all in the hands of tech companies.
Facebook, Instagram or Snapchat can manipulate how often people get tagged in photos by automatically suggesting all the faces we should tag. So when my friend tags me, he's actually responding to Facebook's suggestion, not making an independent choice. But through design choices like this, Facebook controls the multiplier for how often millions of people experience their social approval.
Everyone innately responds to social approval, but some demographics, in particular teenagers, are more vulnerable to it than others. That's why it's so important to recognize how powerful designers are when they exploit this vulnerability.
The empire
LinkedIn is another offender. LinkedIn wants as many people creating social obligations for each other as possible, because each time they reciprocate (by accepting a connection, responding to a message, or endorsing someone back for a skill) they have to come back to linkedin.com, where LinkedIn can get them to spend more time.
Like Facebook, LinkedIn exploits an asymmetry in perception. When you receive an invitation from someone to connect, you imagine that person making a conscious choice to invite you, when in reality, they likely responded unconsciously to LinkedIn's list of suggested contacts. In other words, LinkedIn turns your unconscious impulses into new social obligations that millions of people feel obligated to repay. All the while, LinkedIn profits from the time people spend doing it.
Welcome to the empire of social media.
Western culture is built around ideals of individual choice and freedom. Millions of us fiercely defend our right to make “free” choices, while we ignore how our choices are manipulated upstream by menus we didn't choose in the first place.
When people are given a menu of choices, they rarely ask: “What's not on the menu?” Or: “Why am I being given these options and not others?” “Do I know the menu provider's goals?” “Is this menu empowering for my original need, or are these choices a distraction?”
Even when we're not hungry
The more choices technology gives us in nearly every domain of our lives (information, events, places to go, friends, dating, jobs), the more we assume that our phone is always the most empowering and useful menu to pick from. But is it?
Companies maximizing “time spent” design apps to keep people consuming things, even when they aren't hungry anymore. How? Easy. Take an experience that was bounded and finite, and turn it into a bottomless flow that keeps going.
Cornell professor Brian Wansink demonstrated this in his study showing you can trick people into eating more soup by giving them a bottomless bowl that automatically refills as they eat. With bottomless bowls, people consume 73 percent more calories than those with normal bowls.
Tech companies exploit the same principle. News feeds are purposely designed to auto-refill with reasons to keep you scrolling, and purposely eliminate any reason for you to pause, reconsider or leave.
It's also why video and social media sites like Netflix, YouTube or Facebook autoplay the next video after a countdown instead of waiting for you to make a conscious choice.
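The bottomless-flow design described above can be sketched as a generator with no stopping condition. `fetch_page` and `fake_fetch` here are hypothetical stand-ins for a real feed backend; only the shape of the loop matters:

```python
import itertools

def bottomless_feed(fetch_page):
    """Turn a paged, finite resource into an endless stream.

    fetch_page(cursor) -> (items, next_cursor). The feed never surfaces
    an end: as soon as one page is consumed the next is fetched, so the
    user is never handed a natural stopping point.
    """
    cursor = None
    while True:  # no termination condition -- that is the design
        items, cursor = fetch_page(cursor)
        yield from items

# Hypothetical page source standing in for a real feed API.
def fake_fetch(cursor):
    start = cursor or 0
    return [f"post-{i}" for i in range(start, start + 3)], start + 3

# The only bound comes from the reader deciding to stop.
first_ten = list(itertools.islice(bottomless_feed(fake_fetch), 10))
```

Note that the caller, not the feed, has to impose the limit (`islice` above); a bounded design would instead be a finite sequence that simply ends, giving the reader a reason to pause.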
Tragedy of the commons
Tech companies often claim that they're just making it easier for users to see the video they want to watch, when they are actually serving their business interests. And you can't blame them, because increasing “time spent” is the currency they compete for.
Companies also know that interruption is good for business. Given the choice, WhatsApp, Snapchat or Facebook Messenger would prefer to design their messaging system to interrupt recipients immediately instead of helping users respect each other's attention, because they are more likely to respond if it's immediate. It's in their interest to heighten the feeling of urgency. For example, Facebook automatically tells the sender when you “saw” their message, instead of letting you avoid disclosing whether you read it. As a consequence, you feel more obligated to respond.
The problem is: Maximizing interruptions in the name of business creates a tragedy of the commons, ruining global attention spans and causing billions of unnecessary interruptions each day.
It's inevitable that billions of people will have phones in their pockets, but those phones can be designed to serve us rather than hijack our minds.
The ultimate freedom is a free mind, and we need technology thats on our team to help us live, feel, think and act freely.