Free Republic

Are algorithms hacking our thoughts?
Techcrunch.com ^ | 5-20-2018 | Adriana Stan, Mihai Botarel

Posted on 05/26/2018 3:14:35 PM PDT by spintreebob

As Facebook shapes our access to information, Twitter dictates public opinion and Tinder influences our dating decisions, the algorithms we’ve developed to help us navigate choice are now actively driving every aspect of our lives.

But as we increasingly rely on them for everything from how we seek out news to how we relate to the people around us, have we automated the way we behave? Is human thinking beginning to mimic algorithmic processes? And is the Cambridge Analytica debacle a warning sign of what’s to come — and of what happens when algorithms hack into our collective thoughts?

It wasn’t supposed to go this way. Overwhelmed by choice — in products, people and the sheer abundance of information coming at us at all times — we’ve programmed a better, faster, easier way to navigate the world around us. Using clear parameters and a set of simple rules, algorithms help us make sense of complex issues. They’re our digital companions, solving real-world problems we encounter at every step, and optimizing the way we make decisions. What’s the best restaurant in my neighborhood? Google knows it. How do I get to my destination? Apple Maps to the rescue. What’s the latest Trump scandal making the headlines? Facebook may or may not tell you.

Wouldn’t it be nice if code and algorithms knew us so well — our likes, our dislikes, our preferences — that they could anticipate our every need and desire? That way, we wouldn’t have to waste any time thinking about it: We could just read the one article that’s best suited to reinforce our opinions, date whoever meets our personalized criteria and revel in the thrill of familiar surprise. Imagine all the time we’d free up, so we could focus on what truly matters: carefully curating our digital personas and projecting our identities on Instagram.

It was Karl Marx who first said our thoughts are determined by our machinery, an idea that Ellen Ullman references in her 1997 book, Close to the Machine, which predicts many of the challenges we’re grappling with today. Ever since the invention of the internet, the algorithms we’ve built to make our lives easier have ended up programming the way we behave.

Here are three algorithmic processes and the ways in which they’ve hacked their way into human thinking, hijacking our behavior.

Product comparison: From online shopping to dating

Amazon’s algorithm allows us to browse and compare products, save them for later and eventually make our purchase. But what started as a tool designed to improve our e-commerce experience now extends much beyond that. We’ve internalized this algorithm and are applying it to other areas of our lives — like relationships.

Dating today is much like online shopping. Enabled by social platforms and apps, we browse endless options, compare their features and select the one that taps into our desires and perfectly fits our exact personal preferences. Or just endlessly save it for later, as we navigate the illusion of choice that permeates both the world of e-commerce and the digital dating universe.
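
To make the parallel concrete, here is a minimal, purely illustrative sketch of that filter-and-compare mindset; the options, prices and preference thresholds are invented, and the same loop could just as easily rank profiles as products.

```python
# Illustrative only: the same filter-and-sort loop applies whether the
# "options" are products in a shopping cart or profiles in a dating app.
# All data and criteria here are invented for the example.

options = [
    {"name": "A", "price": 40, "rating": 4.8, "distance_km": 3},
    {"name": "B", "price": 25, "rating": 4.1, "distance_km": 12},
    {"name": "C", "price": 30, "rating": 4.6, "distance_km": 5},
]

# "Exact personal preferences": keep only the options under budget and
# nearby, then rank whatever survives by rating.
shortlist = [o for o in options if o["price"] <= 35 and o["distance_km"] <= 10]
shortlist.sort(key=lambda o: o["rating"], reverse=True)

print([o["name"] for o in shortlist])  # ['C'], or just save it for later
```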

Online, the world becomes an infinite supply of products, and now, people. “The web opens access to an unprecedented range of goods and services from which you can select the one thing that will please you the most,” Ullman explains in Life in Code. “[There is the idea] that from that choice comes happiness. A sea of empty, illusory, misery-inducing choice.”

We all like to think that our needs are completely unique — and there’s a certain sense of seduction and pleasure that we derive from the promise of finding the one thing that will perfectly match our desires.

Whether it’s shopping or dating, we’ve been programmed to constantly search, evaluate and compare. Driven by algorithms, and in a larger sense, by web design and code, we’re always browsing for more options. In Ullman’s words, the web reinforces the idea that “you are special, your needs are unique, and [the algorithm] will help you find the one thing that perfectly meets your unique need and desire.”

In short, the way we go about our lives mimics the way we engage with the internet. Algorithms are an easy way out, because they allow us to take the messiness of human life, the tangled web of relationships and potential matches, and do one of two things: Apply a clear, algorithmic framework to deal with it, or just let the actual algorithm make the choice for us. We’re forced to adapt to and work around algorithms, rather than use technology on our terms.

Which leads us to another real-life phenomenon that started with a simple digital act: rating products and experiences.

Quantifying people: Ratings & reviews

As with all other well-meaning algorithms, this one is designed with you and only you in mind. Using your feedback, companies can better serve your needs, provide targeted recommendations just for you and serve you more of what you’ve historically shown you like, so you can carry on mindlessly consuming it.

From your Uber ride to your Postmates delivery to your Handy cleaning appointment, nearly every real-life interaction is rated on a scale of 1-5 and reduced to a digital score.
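
As a purely illustrative sketch of that reduction, the snippet below collapses a handful of hypothetical 1-5 star ratings into one displayed score; the simple average is an assumption for the example, not any platform's actual formula.

```python
# Illustrative only: collapse a handful of 1-5 star ratings into the single
# "digital score" the article describes. The plain average is an assumption
# for the example, not any platform's actual formula.

def digital_score(ratings: list[int]) -> float:
    """Average a list of 1-5 star ratings into one displayed score."""
    valid = [r for r in ratings if 1 <= r <= 5]  # ignore out-of-range values
    if not valid:
        return 0.0
    return round(sum(valid) / len(valid), 2)

# A week of real-life interactions, each reduced to a number:
week_of_rides = [5, 5, 4, 5, 3, 5]
print(digital_score(week_of_rides))  # 4.5
```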

As a society we’ve never been more concerned with how we’re perceived, how we perform and how we compare to others’ expectations. We’re suddenly able to quantify something as subjective as our Airbnb host’s design taste or cleanliness. And the sense of urgency with which we do it is incredible — you’re barely out of your Uber car when you neurotically tap all five stars, tipping with wild abandon in a quest to improve your passenger rating. And the rush of being reviewed in return! It just fills you with utmost joy.

Yes, you might be thinking of that dystopian Black Mirror scenario, or that oddly relatable Portlandia sketch, but we’re not too far off from a world where our digital score simultaneously replaces and drives all meaning in our lives.

We’ve automated the way we interact with people, where we’re constantly measuring and optimizing those interactions in an endless cycle of self-improvement. It started with an algorithm, but it’s now second nature.

As Jaron Lanier wrote in his introduction to Close to the Machine, “We create programs using ideas we can feed into them, but then [as] we live through the program ... we accept the ideas embedded in it as facts of nature.”

That’s because technology makes abstract and often elusive, desirable qualities quantifiable. Through algorithms, trust translates into ratings and reviews, popularity equals likes and social status means followers. Algorithms create a sort of Baudrillardian simulation, where each rating has completely replaced the reality it refers to, and where the digital review feels more real, and certainly more meaningful, than the actual, real-life experience.

In facing the complexity and chaos of real life, algorithms help us find ways to simplify it; to take the awkwardness out of social interaction and the insecurity that comes with opinions and real-life feedback, and make it all fit neatly into a ratings box.

But as we adopt programming language, code and algorithms as part of our own thinking, are human nature and artificial intelligence merging into one? We’re used to thinking of AI as an external force, something we have little control over. What if the most immediate threat of AI is less about robots taking over the world, and more about technology becoming more embedded into our consciousness and subjectivity?

In the same way that smartphones became extensions of our senses and our bodies, as Marshall McLuhan might say, algorithms are essentially becoming extensions of our thoughts. But what do we do when they replace the very qualities that make us human?

And, as Lanier asks, “As computers mediate human language more and more over time, will language itself start to change?”

Automating language: Keywords and buzzwords

Google ranks search results based on keywords. SEO is the set of tactics for making websites rise to the top of those results. To achieve this, we work around the algorithm, figure out what makes it tick, and sprinkle websites with keywords that make them more likely to stand out in Google’s eyes.
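
The toy scorer below illustrates the dynamic: a ranking rule that rewards keyword repetition is exactly what keyword “sprinkling” exploits. It is a deliberately naive stand-in, not Google’s actual algorithm, and the sample pages are invented.

```python
# A deliberately naive keyword-frequency scorer, showing why "sprinkling"
# a page with keywords makes it stand out to a simple ranking rule.
# This is a toy stand-in, not Google's actual algorithm; the pages are made up.

import re

def keyword_score(text: str, keyword: str) -> float:
    """Fraction of words in `text` that match `keyword`, case-insensitively."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

pages = {
    "plain": "Our startup builds useful software for small businesses.",
    "stuffed": "AI startup: our AI software uses AI to bring AI to small businesses.",
}

for name, text in pages.items():
    print(name, round(keyword_score(text, "AI"), 3))
# The keyword-stuffed page wins, even though it says less.
```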

But much like Google’s algorithm, our mind prioritizes information based on keywords, repetition and quick cues.

It started as a strategy we built around technology, but it now seeps into everything we do — from the way we write headlines to how we generate “engagement” with our tweets to how we express ourselves in business and everyday life.

Take the buzzword mania that dominates both the media landscape and the startup scene. A quick look at some of the top startups out there will show that the best way to capture people’s attention — and investors’ money — is to add “AI,” “crypto” or “blockchain” into your company manifesto.

Companies are being valued based on what they’re signifying to the world through keywords. The buzzier the keywords in the pitch deck, the higher the chances a distracted investor will throw some money at it. Similarly, a headline that contains buzzwords is far more likely to be clicked on, so the buzzwords start outweighing the actual content — clickbait being one symptom of that.

Where do we go from here?

Technology gives us clear patterns; online shopping offers simple ways to navigate an abundance of choice. Therefore there’s no need to think — we just operate under the assumption that algorithms know best. We don’t exactly understand how they work, because the code is hidden: we can’t see it; the algorithm just magically presents results and solutions. As Ullman warns in Life in Code, “When we allow complexity to be hidden and handled for us, we should at least notice what we are giving up. We risk becoming users of components ... [as we] work with mechanisms that we do not understand in crucial ways. This not-knowing is fine while everything works as expected. But when something breaks or goes wrong or needs fundamental change, what will we do except stand helpless in the face of our own creations?”

Cue fake news, misinformation and social media targeting in the age of Trump.

So how do we encourage critical thinking, how do we spark more interest in programming, how do we bring back good-old-fashioned debate and disagreement? What can we do to foster difference of opinion, let it thrive and allow it to challenge our views?

When we operate within the bubble of distraction that technology creates around us, and when our social media feeds consist of people who think just like us, how can we expect social change? What ends up happening is we operate exactly as the algorithm intended us to. The alternative is questioning the status quo, analyzing the facts and arriving at our own conclusions. But no one has time for that. So we become cogs in the Facebook machine, more susceptible to propaganda, blissfully unaware of the algorithm at work — and of all the ways in which it has inserted itself into our thought processes.

When we are users of algorithms rather than programmers or architects of our own decisions, our own intelligence becomes artificial. It’s “program or be programmed,” as Douglas Rushkoff would say. If we’ve learned anything from Cambridge Analytica and the 2016 U.S. elections, it’s that it is surprisingly easy to reverse-engineer public opinion, to influence outcomes and to create a world where data, targeting and bots lead to a false sense of consensus.

What’s even more disturbing is that the algorithms we trust so much — the ones that are deeply embedded in the fabric of our lives, driving our most personal choices — continue to hack into our thought processes, in increasingly bigger and more significant ways. And they will ultimately prevail in shaping the future of our society, unless we reclaim our role as programmers, rather than users of algorithms.


TOPICS: Business/Economy; Crime/Corruption; Culture/Society; Philosophy
KEYWORDS: choice; control; decision; freedom
In politics, religion, business, education, all aspects of life ... we need to be aware of the assumptions that go into the algorithms that control the information we consume. Information is power. He who controls the algorithms controls the information and controls what we will know in the future.

We accept the Bible as truth. Everything else needs to be verified. Trust but verify.

1 posted on 05/26/2018 3:14:35 PM PDT by spintreebob

To: spintreebob

Someone show me how what Cambridge Analytica did was in any way illegal.


2 posted on 05/26/2018 3:16:06 PM PDT by 2ndDivisionVet (You cannot invade the mainland US. There'd be a rifle behind every blade of grass.)

To: spintreebob

“If we’ve learned anything from Cambridge Analytica and the 2016 U.S. elections, it’s that it is surprisingly easy to reverse-engineer public opinion, to influence outcomes and to create a world where data, targeting and bots lead to a false sense of consensus.”

[snaps fingers] But of course! It was algorithms that cost Hillary the election!


3 posted on 05/26/2018 3:33:01 PM PDT by Ken H (Best election ever!)

To: Ken H

Al Gore rhythms?

Did the Manbearpig learn to dance?


4 posted on 05/26/2018 3:50:53 PM PDT by Fai Mao (There is no rule of law in the US until The PIAPS is executed.)

To: spintreebob

Makes you wonder if the more vulnerable will fall for it.

It just makes me angry that my Internet searches are being tracked and information used to sell me stuff.


5 posted on 05/26/2018 4:06:09 PM PDT by dhs12345

To: spintreebob

bookmark


6 posted on 05/26/2018 4:07:00 PM PDT by GOP Poet

To: spintreebob

Awhile ago I read a good article about “information” - and how even with the internet, we still don’t have complete freedom of information, as so much of it is tailored to our past interests.


7 posted on 05/26/2018 4:12:00 PM PDT by 21twelve

To: dhs12345
I read once that when you call into places (banks, websites, cable companies, wherever) and they say the call may be recorded? The call IS recorded to determine your relationship style. And in the future when you call them, a representative that best fits your style will answer your call. It makes sense (everybody is happier) - but spooky too. They may even share that information with third parties. (Or perhaps it IS a third party that offers that service to all phone representatives?)
8 posted on 05/26/2018 4:16:44 PM PDT by 21twelve

To: 21twelve

Which type of customer service employee does the algorithm choose for the customer who hangs the f up immediately? Asking for a friend.


9 posted on 05/26/2018 4:17:44 PM PDT by Yaelle

To: Yaelle

LOL. Yeah - I wish they would get those robo calls to figure it out!

No - this is for when YOU call a company for some reason. Report a stolen credit card, get your cable TV fixed, etc.


10 posted on 05/26/2018 4:21:21 PM PDT by 21twelve

To: 2ndDivisionVet; Ken H

Cambridge Analytica did exactly the same thing Obama did in 2008 and I did 10 years earlier. The difference is that mine was very manual, only partially computerized. Obama’s was very AI. But AI made big strides from 2008 to 2016.

This is not just about politics. It is about all facets of our lives. Algorithms will assume certain presuppositions in choosing what medicine and medical care a person will receive. They will determine where brick-and-mortar stores are placed, where fast food and formal restaurants will be placed.

Assumptions made by algorithms will determine what regulations the FDA and EPA and all regulatory agencies will make. They will determine what the CBO tells Congress.

Reagan said: Some see government as the solution. Others see government as the problem. Guess which assumption is currently built into every government agency computer system.


11 posted on 05/26/2018 4:24:43 PM PDT by spintreebob

To: Fai Mao
Al Gore rhythms? Did the Manbearpig learn to dance?

Excellent. You should be careful, though. You may have just red-flagged yourself to the internet Police. After a second offense you could be sent to the PUNetentiary!
12 posted on 05/26/2018 4:31:14 PM PDT by heterosupremacist (Resistance to tyrants is obedience to God. - Thomas Jefferson)

To: spintreebob

‘What’s the latest Trump scandal making the headlines?”

These asshats work this stuff into everything.


13 posted on 05/26/2018 4:45:38 PM PDT by Luke21

To: 21twelve

Just makes me mad.


14 posted on 05/26/2018 4:57:32 PM PDT by dhs12345

To: dhs12345

It makes me angry that search engines know what I want to search for before I even type it in.


15 posted on 05/26/2018 4:59:58 PM PDT by bgill (CDC site, "We don't know how people are infected with Ebola.")

To: dhs12345
It just makes me angry that my Internet searches are being tracked and information used to sell me stuff.

Last night, I designed my dream Camaro on the Chevrolet website. Today, I went to various websites, and they were advertising Camaros. Never before have I seen Camaro ads.

Probably after today, I will see Mustang ads, too. That’s because Mr. exDem insisted I look at Mustangs, which he claims are better than Camaros. Whatever.

16 posted on 05/26/2018 5:02:11 PM PDT by exDemMom (Current visual of the hole the US continues to dig itself into: http://www.usdebtclock.org/)

To: 21twelve

If they do that when you phone a company in order to have someone more compatible with you answer the next time you call, then why do I still keep getting foreigners answering the phone?


17 posted on 05/26/2018 5:04:43 PM PDT by exDemMom (Current visual of the hole the US continues to dig itself into: http://www.usdebtclock.org/)

To: bgill

You’re lucky. They rarely understand what I am looking for. I search for technology, health and politics mostly. Tech searches are the worst. I search for a common term and specify 2018. Yet they give me articles from the 90s.

In politics, when I want a word to be taken literally, they take it as a figure of speech for something else. When I want a word to be used as a figure of speech, they take it literally.

In health, I want health policy. But they give me medical articles.

One thing worse than them knowing what you are is when they make you out to be something you are not, like a racist or a pedophile, or a poet with artistic interests, when that is not me.


18 posted on 05/26/2018 6:22:49 PM PDT by spintreebob

To: spintreebob

It used to be called brain washing!


19 posted on 05/26/2018 7:36:56 PM PDT by Retvet (Retvete)

To: 21twelve
-- Awhile ago I read a good article about "information" - and how even with the internet, we still don't have complete freedom of information ... --

even with the internet, we especially don't have complete freedom of information

20 posted on 05/26/2018 7:39:51 PM PDT by Cboldt



