How algorithms do — and do not — impact our daily lives

[Image: code on a computer screen. GabrielPevide/Getty Images/E+]
According to Merriam-Webster, the word algorithm is commonly used for the set of rules a machine, especially a computer, follows to achieve a particular goal.

Most of us come into contact with algorithms every day.

Whether we’re picking a show to watch, a song to listen to, a post to like or a new product to buy — all sorts of companies, websites and platforms use algorithms to put new things in front of us.

But lots of people worry about this, and about whether we’re losing our say in some of the decisions we make. There are also privacy concerns, as well as philosophical ones.

Anna Shechtman delves into all of this in an essay in the Yale Review — which also serves as a review of two books about algorithmic culture and thinking. Shechtman is a writer and Klarman Fellow at Cornell University, and she joined The Show to talk about the essay.

Conversation highlights

Do you think there's any truth to the feeling that algorithm is one of those words that everyone uses but few actually know what it means and what it is? Do you buy into that?

ANNA SHECHTMAN: Yeah, I do. There's a term we have for this. It's an SAT word — it's called a hypostatization ... which is a fancy way of saying it's an abstraction, right? It's a thing that doesn't quite exist, but we talk about it as if it does, as if it's a concrete thing. And so part of my hope is to point to the concrete things that we really mean when we're talking about the algorithm as some sort of menacing specter haunting our world.

It is something that clearly we come into contact with, even if indirectly, on a regular basis — like if we watch Netflix or if we're reading news stories on our phones.

SHECHTMAN: Right. Yeah. And I think what we mean there are these algorithmic recommendation systems. Which is to say that the algorithm itself is not real. It's not something we can point to as a single accessible piece of code that these companies are hiding from us. But there is a feeling that's attached to this notion of the algorithm, which is a feeling of waning agency. Lack of agency. We didn't opt into this. We've just had to adapt to these new norms of, basically, consumption. And that feeling is real. It's just that the algorithm itself is, instead, a sort of new technology that intersects with very old forms of legal, economic and political systems ... that have actually always existed and, in fact, that we do have agency over as both consumers and as voters.

The agency issue is so interesting because — as you point out — algorithms are trained to take their cues from what we do. If you click on something that you like, that particular algorithm is going to learn: OK, Anna likes this movie. Maybe she will also like this movie. So we do have agency, and we are, to some extent, the ones in control.

SHECHTMAN: That's right. Although, again, as I say, it oftentimes doesn't feel that way. And so that's the strange paradox of this particular form of what I actually like to think of as just advertising. It's a way to kind of remind us that we do actually know what this is like.

The difference is that we tend to think of advertising as selling us things that are for someone like us, or for someone we could be if only we bought X, Y or Z product. Algorithmic recommendation systems, because of their capacity to be trained on the data we've provided to them, are instead selling us things that are not just for someone like us, but for someone that we have been.
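The click-and-learn loop described in this exchange can be sketched very roughly in a few lines of Python. This is only an illustration, not any platform's actual system: the titles, the similarity scores and the function names (record_click, recommend, SIMILAR) are all invented for the example, and real recommenders learn their similarities from large amounts of behavioral data rather than a hand-written table.

from collections import defaultdict

# Hypothetical similarity scores between titles (0 to 1), purely illustrative.
SIMILAR = {
    "Movie A": {"Movie B": 0.9, "Movie C": 0.4},
    "Movie B": {"Movie A": 0.9, "Movie D": 0.7},
    "Movie C": {"Movie A": 0.4, "Movie D": 0.5},
}

clicks = defaultdict(int)  # what the user has clicked, and how often

def record_click(title):
    """The user's own action is the training signal."""
    clicks[title] += 1

def recommend(top_n=2):
    """Score unseen titles by their similarity to what was already clicked."""
    scores = defaultdict(float)
    for clicked, count in clicks.items():
        for other, sim in SIMILAR.get(clicked, {}).items():
            if other not in clicks:  # don't re-recommend what was already clicked
                scores[other] += count * sim
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

record_click("Movie A")   # "Anna likes this movie..."
print(recommend())        # ['Movie B', 'Movie C']  "...maybe she will also like these."

The point of the sketch is the one made in the conversation: the only input is the user's own behavior, so the system's "knowledge" is simply an accumulation of what the user has already done.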

Do you think the fact that the algorithm is sending us stuff for who we are, as opposed to who we are aspirationally or in our minds, contributes to some of our negative feelings toward the algorithm?

SHECHTMAN: You know, it might. But, again, I think one of the things that's most menacing about the idea of the algorithm is that it's new, and therefore dystopian in a way. ... It hasn't existed before, and so we don't know how it's going to exist. But there are certain ways in which it is just a new form of advertising. We have a certain way of thinking about advertising. You know, I still get analog mail, catalogs for things that I didn't sign up for. And that's, you know, frustrating in and of itself. And I'm not sure that we necessarily need to think about the advertising that we get online — even though it is more targeted to our recent consumer behavior — as any more or less frustrating.

I think the larger questions we need to be asking are questions that, in fact, we've been asking in our democracy for more than 100 years. Which is: Should advertising be so attached to the information we receive when we're reading the newspaper or in this case, you know, reading X or Twitter — or whatever we want to call it. Should there be a profit motive underlying, you know, the information that we're receiving? Is that the best and healthiest thing for our democracy?

KJZZ's The Show transcripts are created on deadline. This text is edited for length and clarity, and may not be in its final form. The authoritative record of KJZZ's programming is the audio record.

Mark Brodie is a co-host of The Show, KJZZ’s locally produced news magazine. Since starting at KJZZ in 2002, Brodie has been a host, reporter and producer, including several years covering the Arizona Legislature, based at the Capitol.