
When did surveillance become a business model — and what would it take to rein it in?

From romantic partners who sneak peeks at their companions’ texts to the companies that harvest and sell data we don’t even know we’re giving away, internet privacy in the U.S. is ever more the exception than the rule. This three-part series asks: Does it have to be? When did surveillance become a business model — and what would it take to rein it in?

Published: Feb. 20, 2023

Getty Images

Part I: A violation of trust

Listen to Part I


Spam emails and social media scams provide regular reminders of how online behavior can be tracked and used against us with varying degrees of harm.

But some data-privacy invasions are far more up-close and personal — as Ann found out the hard way.

“I don't know that I could say that it changed the way that I view people, but it changed my behaviors certainly,” Ann recalled.

Ideally, everyone would have trusted friends or family to help them through vulnerable processes like surgery and recovery. Ann, still fairly new in town, didn’t. So she placed her trust in a coworker she occasionally dated.

“At the beginning of the year, I had had a major surgery, and he helped a lot with that and was very interested in showing how much I could count on him and how much he wanted to be a part of my life,” she said.

Ann sensed her friend viewed their relationship as more serious than she did and says she told him so. Still, when she needed a follow-up surgery, he insisted on helping out.

“I was grateful for it, but I also was kind of aware that he was viewing this as a relationship step, too, which made me uncomfortable,” said Ann.

Tired, worried about her mother’s health — which also meant her sister was unavailable — she agreed.

Some data-privacy invasions are far more up-close and personal — as Ann found out.

“So, we went to the hospital, and I just gave him my purse to hold — as you do when you're in surgery — and then I was in the hospital for several days,” she said.

But her friend never stopped by.

When she returned home, she learned why.

“He had had my keys to be able to get my mail and things like that. And there was a note on my bed saying, ‘I have to let you go. I can't be with you if you're involved with someone else,’” said Ann.

The man she had trusted had gone through her phone.

“I didn't even have a lock on my phone, because I guess I'm a very trusting person,” she said. “But also, I'm not anyone that lends my phone to anyone or leaves it around. It never occurred to me that that would happen.”

Perhaps it’s just as well it did.

Crossing the line

“Big red flag for person being a good relationship partner,” said Jane Bambauer, a professor of law at the University of Arizona. “It's probably also illegal, at least under civil law, in most states.”

American jurisprudence has long recognized that privacy, as a right, defies simple solutions, bumping up as it does against other rights and interests, from law enforcement to the free press.

But Bambauer, who studies problems posed by well-intended technology policies, says, even if we don’t always know where the line lies, we can usually agree when someone crosses it — though standards might shift over time.

“The human experience has always included some line that is indefinite but that most people can draw consensus around, that people shouldn't cross — even when they're just in normal social, physical space,” she said.

University of Arizona law professor Jane Bambauer studies problems posed by well-intended technology policies.

In this case, Bambauer refers to the pre-internet common law tort known as intrusion upon seclusion. Recognized in Arizona, it lets people sue those who invade areas reasonably expected to be private.

“It’s not really about disclosing information or sharing secrets,” said Bambauer. “It's about somebody taking extraordinary measures to discover and observe — for the first time, for them — some private details about another person.”

Bambauer thinks most people in situations like Ann’s leave the courts out of it.

“We don't see a lot of cases like that, because the more important thing is that the relationship breaks down,” she said.

Factoring in the human

Some might blame Ann or call her naïve. Yet the pandemic shows how stress, illness, fatigue and isolation can batter down anyone’s defenses.

“I think I just assume the best in people — that they're going to do things that, to me, are just basic human kindness,” said Ann.

Human nature often factors into intrusion and influence. From catfishing to quarter slots, technology provides a mere prop or tool; psychology dupes the mark.

“If the tech were different, it would be the same story with different tech; if there was no tech, it would be the same story with no tech,” said Bruce Schneier, who teaches cybersecurity policy at the Harvard Kennedy School. “The story is the friend.”

Technology and security author Bruce Schneier teaches cybersecurity policy at the Harvard Kennedy School and serves on the board of the Electronic Frontier Foundation. David Betts

The story is also social norms.

According to a 2020 Pew Research Center survey, although most Americans condemn thumbing through a partner’s phone, one-third endorse the behavior.

Shifts in community standards can have ramifications for privacy intrusion laws.

“You have to prove that there was an intrusion that most people would consider outrageous,” said Bambauer. “So, it crosses the line for the typical person in the community.”

Doxing with the devil

Bambauer says some as-yet-unlegislated conduct may fall under prevailing laws. Take for example doxing — maliciously publishing someone’s personal data, often to expose them to persecution.

“Even if we didn't have a specific statute for the electronic communications environment, the general-purpose harassment laws should apply fairly comfortably in these contexts,” she said.

Although some ridicule the internet’s outsized share of over-sharers, no one reveals everything. Even the most uninhibited YouTubers curate their content.

“We share some things that a generation or two ago would have been scandalous,” said Bambauer. “But there might be other things that people, especially young people, are more reluctant to share than ever before, like political ideology.”

Much like privacy, security is possible, but not infallible or static, and it entails costs — to convenience, to openness and to faith in others. Somehow, people must balance privacy against the quality of life offered by openness.

In that respect, at least, the problem is as human as it is technological.

“I can give you a bunch of tech things to do, but that's not going to solve the problem,” said Schneier, who is also a board member of the Electronic Frontier Foundation and has written more than a dozen popular press books on technology and security.

“The problem is, we have no choice but to trust somebody.”

But trusting someone often means trusting their judgment, too.

Ann’s friend, who had years earlier discovered his wife cheated on him by going through her phone, viewed his intrusion as warranted.

“That was how he was able to get evidence to be able to have custody of their kid,” said Ann. “And so, in his head, this is a completely reasonable, even honorable, thing to do — to do the right thing and to make sure that justice is done the way that it should be. And that blows my mind.”

Getty Images

Part II: Protecting personal privacy

Listen to Part II

Someone once said, “The road to hell is paved with good intentions.” But maybe it’s more fitting to say, “I trust everyone. I just don’t trust the devil inside them.”

In other words, maybe some precautions are in order.

“Some people, they might know the risks, but it's just that these devices are so convenient, and you just really want to have all this private information on your phone because it makes your life easier,” said Indiana University computer scientist Apu Kapadia.

When it comes to privacy and security, humans often are the weakest links — and the easiest marks. They pick obvious passwords, follow clickbait and fall for phishing attacks.

In the physical world, there are even cyber-muggers.

“People take your phone, and they coerce your password out of you,” said Kapadia, whose research emphasizes usable security and human-centered computing. “Now they actually have access to your financial information and can just drain your bank accounts.”

At times, it’s an official or a police officer who applies the pressure — with potentially dire results for reporters or political activists.

Recent years set new records for journalists imprisoned or killed.

“If you're someone who needs privacy to stay alive, there are things you can do,” said Schneier. “It'll make the internet annoying for you, but it'll keep you alive.”

Indiana University computer scientist Apu Kapadia studies usable security and human-centered computing. Tracey Theriault

Search and destroy

The lawfulness of searching and seizing varies, even in the U.S.; for example, authorities enjoy wide discretion near U.S. borders. The legal umbrella of national security lets them search cellphones without a warrant and, according to The Washington Post, copy data and save it for years in a searchable database.

“It hasn't reached the Supreme Court, but there's been a lot of 9th Circuit litigation in our part of the country that does deal with searches at the border,” said Bambauer.

Courts are still hammering out many legal specifics concerning the so-called “border search exception” to the Fourth Amendment, which forbids unreasonable searches and seizures.

In the meantime, federal regulations say U.S. Customs and Border Protection can legally operate within 100 miles of any U.S. external boundary. That covers about two-thirds of the U.S. population.

With that in mind, Kapadia endorses disabling your phone’s biometrics and checking what’s on it, what’s in the cloud — and what’s secure — especially when traveling.

“Enable features for finding your phone or remote wiping of your phone; enable the option where your phone automatically wipes itself after some number of failed password attempts,” he said.

“Cellphone backups have become much easier, and it's easier to restore your phone, so a phone getting wiped is not really the end-of-the-world it used to be.”

Dual-factor logins add reasonable protection for a reasonable effort. Unfortunately, every service seems to have its own scheme.

To offer a more physically and electronically portable solution, a major tech alliance developed an open-source standard called Universal 2nd Factor, or U2F. A bit like a thumb drive that provides secondary authentication, U2F is secure against keylogging, phishing and man-in-the-middle attacks.

It’s been slow to catch on.

“It’s very secure, but maybe not so usable or effective for everyone because you have to carry a device everywhere,” said ASU computer scientist Rakibul Hasan. “But it's versatile.”

Siri bar the door

But requiring special knowledge, timing, locations or objects to unlock a cellphone is like barring the front door while armies of apps abscond out the back with your valuables.

Some call such footpad data harvesting highway robbery; others call it free enterprise.

“The question is, do you want to live your life to benefit a bunch of tech billionaire monopolists? And, if you do, great, free market, baby,” said Schneier.

Schneier says the U.S. does limit the free markets for the public good.

“You are not allowed to sell yourself into slavery; you are not allowed to sell an organ; you are not allowed to sell your children; you’re not allowed to hire 5-year-olds and put them up chimneys to clean them,” he said.

But, so far — although legal actions against companies like Facebook offer some pushback — research suggests 70% of apps share data with third parties.

“Will we look back in 100 years at Facebook in the same way we look back today at chimney companies?” said Schneier.

The answer probably depends on the size and resources of the chimney lobby.

ASU computer scientist Rakibul Hasan studies the intersection of security, psychology and machine learning. Rakibul Hasan

“There's no doubt about it: We are reluctant to recognize rights that get in the way of the sort of behavioral advertising and data-broker kind of industry that helped the internet develop and undergirds a lot of technology industry today,” said Bambauer.

Users who want to stem this data deluge should familiarize themselves with their devices’ apps and find out what their permissions allow them to access.

“You can look at the privacy settings for each app and disable access,” said Kapadia. “So, for example, a flashlight app does not need access to your contacts or your location.”

Sensors like cameras and microphones pose the most obvious targets for unwanted surveillance, and malware can spy on people with smartphones, tablets and laptops by compromising apps that have sensor access.

But location information is far more widely tracked and shared, and can reveal almost as much, if not more, about a user.

“Some applications, even if you turn off location sharing, still try to get your location data,” said Hasan, who studies the intersection of security, psychology and machine learning. ”If they cannot get it from the device, they try to communicate with Bluetooth or NFC or other channels — through other surrounding devices — and then get the location data from those devices.”

NFC, or near-field communication, is a protocol that lets devices less than about an inch and a half apart communicate wirelessly.

Users can also reduce their exposure by using the web version of a service instead of the app. Websites intrude comparatively less than apps do, since they can’t keep silently running and gathering data after the tab is closed.

“The web says, ‘Once you close the tab, you're done,’” said Hasan. “Which is why you will notice that, in many cases, they will try to kind of promote their application instead.”

But users should still scrub iffy browser histories — and learn what private or incognito browsers actually do.

“This is not really buying you anonymity; it's telling the browser like, ‘Hey, now, don't maintain a history of what we're about to do, or what we're about to access,’” said Kapadia.

Schneier put it more bluntly.

“You want to watch porn and not tell your spouse; incognito mode is great for that,” he said. “Google is still going to learn you’ve watched porn.”

Watch the watchmen

Google users can copy the data the tech giant collects on them from their profile page. It can be eye-opening — as can boning up on the virtues and vices of Virtual Private Networks (VPNs).

“They all have limits: They do certain things; they don't do other things, right?” said Schneier. “A VPN doesn't work as soon as you log into Facebook, because you've logged into Facebook.”

Kapadia says research shows many VPNs leak data, if not for malicious uses, then for analytics or advertising.

“We don't really know what they're doing with our connections,” he said. “Are they snooping in? Are they keeping track of what we're doing? It's very hard to tell.”

Some browsers tout themselves for not rummaging through your trash — or cache — like a creepy stalker: Mozilla’s stated mission entails protecting privacy, and DuckDuckGo says it doesn’t track users.

Services like Tor provide anonymous browsing. It’s not foolproof, but it’s legal under First Amendment protections.

“So, if the government tried to create a law that required everyone to register before participating on the internet, that would have to undergo constitutional scrutiny and would probably fail,” said Bambauer.

Ultimately, individuals and organizations must balance their security needs against the time, money and inconvenience required to maintain them.

Organizations like the Electronic Frontier Foundation dispense good advice, but they can offer no panaceas as long as individuals and companies rely on credit cards, cellphones, social media and emails.

“You have no choice but to use email, have a cellphone; you probably have no choice but to be on social media,” said Schneier. “The problem is that it's perfectly legal for these companies to spy on you, and to use that information against you. If we want to do something about it, we have to change the law.”

Sky Schaudt/KJZZ

Getty Images

Part III: The big business of surveillance

Listen to Part III

Every day, consumers supply companies and governments with dossiers-worth of self-compiled intelligence data — locations, behavior patterns, social circles — in exchange for convenience.

“The joke was, ‘If you're not paying for the product, you are the product’; but the real sad part is, even if you are paying, you are the product,” said Schneier.

Back in the Columbia Record Club days, list brokers sold direct marketers basic contact info with a smattering of demographics. Today, companies acquire data that reveals relationship status, pregnancy, medications and likely voting preferences.

“This spying is legal, and it's now extraordinarily big business,” said Schneier.

In 2018, British political consulting firm Cambridge Analytica parlayed a personality app with 270,000 Facebook users into psych profiles of 87 million people, which it sold to political campaigns from Donald Trump’s presidential bid to Brexit.

Similar efforts continue today.

“The services on the other end are doing their best to build profiles of people and track you across sites, and try to build a profile of you — understanding who you are, and what your preferences are, and what you like to shop for,” said Kapadia.

Bambauer says Cambridge Analytica fed a general anxiety among lawmakers that the tech industry holds too much power, and adds that both Democrats and Republicans desire the political capital they might garner from defanging its more vampiric tendencies.

She says lawmakers have introduced a federal privacy law that has “some momentum.” But such bills are tricky to get right.

“It’s actually hard to design these laws in a way that doesn't undermine something that even some subset of consumers want,” she said.

Yet most experts interviewed agreed that protecting the public from such insidious and adaptable harms will require legislation or regulation of some kind.

“Even if someone cared about their privacy, it’s pretty impossible to protect on their own if there is no incentives, laws and policy changes,” said Hasan.

"Even if someone cared about their privacy, it’s pretty impossible to protect on their own if there is no incentives, laws and policy changes." — Rakibul Hasan, computer scientist

Taming the frontier

In 1986, the Electronic Communications Privacy Act (ECPA) extended wiretap laws to cover computer data transmissions. Bambauer says the legal interpretation of ECPA has since expanded, and courts now view it as covering email and certain forms of direct messaging.

Of course, the government has security interests in some data, and FISA courts, warrants and special enforcement zones offer official channels for obtaining it.

But there's a slippery sort of subcontracting at work, too.

“The Googles, the Facebooks, the Amazons do this kind of surveillance as a business,” said Schneier.

In this ever-more-wired world, we are our data. Guarding it is like shielding our identity, reputation, hireability, and personal and financial security all at once.

Recent changes in the political and legal landscape surrounding abortion provide just one example of how consequential a few redirected bits and bytes can be.

Amid fears of legal repercussions for even the appearance of abortion-related activities, many doctors won’t prescribe the autoimmune treatment methotrexate — despite its other medical uses — because of its abortion applications. Meanwhile, one ProPublica investigation found nine online pharmacies that sell abortion pills also share sensitive data with third parties like Google.

“The data is not controlled by us; it's controlled by large corporations that don't have our interests at heart, and there's not a lot we can do about it,” said Schneier.

It sounds like a job for regulators. But the internet still retains a frontier mentality. What would it take to tame it?

Schneier thinks the answer is obvious.


“Rein in the business models,” he said. “Like, why is surveillance a legal business model?”

Laying down the law

As companies develop new and more effective apps, targeted ads and social media tools to gather data and build shadow profiles on users and their networks, it remains unclear whether authorities can pile regulatory sandbags fast enough to hold back the flood.

“Digital technology — the internet, social platforms — all of these are growing pretty rapidly, and law is not able to catch up,” said Hasan.

Whether all parties involved want to catch up is another matter.

Today, companies scrape billions of images from social media, surveillance doorbells and police drones to compile facial recognition and license plate location data. Police use those databases for various purposes, including tracking and running criminal background checks on people experiencing homelessness.

Schneier says it makes sense that law enforcement personnel with probable cause can obtain warrants to search data, just as they would to search an apartment or a safe deposit box.

“It's the bulk surveillance that we want to avoid,” he said. “The ‘I want to spy on everybody.’ The kind of stuff that the U.S. does against whole populations; China does; other countries do.”

In other words, the kind of data major players like Google and Facebook collect as a business plan and hand over when Uncle Sam comes knocking. Bambauer says the bar for accessing data not specifically protected by Title II of the ECPA, aka the Stored Communications Act, is quite low.

“They do not need probable cause; they don't need a warrant,” she said. “They need relevance to an investigation, so it's more like a subpoena.”

For example, obtaining directory information would not require a Stored Communications Act order.

“In many cases, a warrant is needed; in many cases, a warrant is not needed,” said Schneier. “And yes, there it is: It's too easy.”

Too easy to spy on everyone. And, like a student loan, such data harvesting is nearly impossible to escape.

Fighting a wildfire

Even if you’ve locked down your data, someone else who stores your data in their contacts list probably hasn’t.

“When someone else installs an application that collects all this data stored in their device, that app is getting my phone number, my email address,” said Hasan.

Congress and the Federal Trade Commission are just beginning to show interest in taming big tech.

By comparison, the EU’s General Data Protection Regulation (GDPR) makes data private by default and places the legal burden on data-gatherers.

That law’s roots reach across the pond to a 1970s report by the U.S. Department of Health, Education and Welfare (HEW).

“As the government started using more and more computerized records, they saw that there was going to be more need for privacy, more emphasis placed on it,” said Bambauer.

The HEW report advocated principles like stronger control by the data subject, including the right to know how data was being used and the option to refuse consent for its transfer. Many of its ideas were echoed by the Health Insurance Portability and Accountability Act of 1996, known as HIPAA.

“The European privacy law has sort of adopted most of these principles that were ironically developed in the U.S. but never fully implemented here,” said Bambauer.

Those doctrines include strong rights — most notably, the right to be forgotten, which entitles a person to have their private data removed from internet searches and other directories.

Some GDPR rules spill over into the U.S., mainly because so many internet companies that serve American markets do business in Europe, too. For example, many websites now request more nuanced approvals for using cookies.

But, barring real enforcement or punishment, such measures don’t amount to much.

“Research shows that many sites collect your data, no matter if you consent or not,” said Hasan. “The other thing is that you will get very quickly annoyed by the number of these notifications you get, so that's not very useful.”

In any case, although laws vary from state to state, U.S. policies will likely remain far more conservative than the GDPR — or, for that matter, the HEW report.

“It’s extremely unlikely to ever be implemented in the United States because of our strong First Amendment — not just law, but norms,” said Bambauer.


Chasing the mirage

But principles can be weaponized as readily as they are monetized.

Artificial intelligence and machine learning don’t just crack passwords; some reduce humans to algorithms that predict and deliver the content most likely to empty wallets and retain eyeballs — even fueling fear, frustration and hate.

“There’s the whole issue of the algorithm, and the AI trying to figure out ways to keep us there and what we should be seeing and then, gosh, things get pretty dark,” said Kapadia.

Chatbots and deepfakes are no longer the sole province of large, wealthy corporations; nor are they a figment of some futuristic fantasy. The technology to make fake pornographic videos and bogus nude selfies of celebrities — or of anyone with a social media profile or online photos — is now within reach of individuals.

“Any data someone can collect from a consumer can be used for pretty much any purpose now or in the future,” said Hasan. “Which is why many people have reservations about using biometric authentication systems — fingerprint or facial recognition and so on.”

Today, someone a bit more motivated and less psychologically stable than Ann’s friend could do far more damage than penning a breakup note — or even emptying her accounts or wrecking her credit rating.

In short, practical realities no longer limit the damage we can do to each other.

Hasan fears more subtle effects as well. He wonders how the increasing prevalence of monitoring and self-editing will affect our behaviors and capacity for honest self-expression.

“Over time, there will be, I think, less critical thinking ability and more conformity,” he said. “And overall, I think there will be less freedom of thought.”

The internet has democratized data’s destructiveness.

Only time will tell if authorities can summon the tools, motivation or courage to contain the damage.

“I think we're already pretty late,” said Hasan. “But still, I want more people and more researchers to be concerned and to look into this thing before it gets out of hand.”

Nicholas Gerbis joined KJZZ’s Arizona Science Desk in 2016. A longtime science, health and technology journalist and editor, his extensive background in related nonprofit and science communications informs his reporting on Earth and space sciences, neuroscience and behavioral health, and bioscience/biotechnology.

Apart from travel and three years in Delaware spent earning his master’s degree in physical geography (climatology), Gerbis has spent most of his life in Arizona. He also holds a master’s degree in journalism and mass communication from Arizona State University’s Cronkite School and a bachelor’s degree in geography (climatology/meteorology), also from ASU.

Gerbis briefly “retired in reverse” and moved from Arizona to Wisconsin, where he taught science history and science-fiction film courses at the University of Wisconsin-Eau Claire. He is glad to be back in the Valley and enjoys contributing to KJZZ’s Untold Arizona series.

During the COVID-19 pandemic, Gerbis focused almost solely on coronavirus-related stories and analysis. In addition to reporting on the course of the disease and related research, he delved into deeper questions, such as the impact of shutdowns on science and medicine, the roots of vaccine reluctance and the policies that exacerbated the virus’s impact, particularly on vulnerable populations.