Washington Post technology columnist Geoffrey Fowler says smartphones and apps are harvesting our personal data – and that of our children – on a scale that would shock most users. By the time a child turns 13, he writes, online ad firms have collected an average of 72 million data points about them.
“Companies whose names you would never know, that really have no relation to the app you’re trying to use, might, first of all, track your child’s interests and then try to predict what they might want to buy, or resell their information to others,” Fowler says.
Fowler started out as a tech reviewer, reviewing new gadgets and assessing whether they were worth their price tag. But, he says, as the technology has become more mainstream, he’s begun to see his role as a reviewer differently: “It’s no longer a question of ‘Is it too hard to use?’ The question has now become: ‘Is this wrong? Does this take away our rights and our choices?’”
Fowler’s ongoing series for the Post, “We the Users,” attempts to answer these questions and raise awareness of the extent of the problems. He says the Children’s Online Privacy Protection Act of 1998 states that a company must know that a child is using its app or website for certain privacy protections to take effect. But many companies get around the law simply by claiming they don’t know who their users are. Fowler advocates closing this loophole and creating new laws that allow companies to collect only the data they need, and nothing more.
“I think that’s what most consumers already assume: that if you ask a website to show you a map, it collects your location just at that moment to give you directions,” Fowler says. “The problem is, that’s not what’s happening. These companies take advantage of this to collect your data all the time and do whatever they want with it.”
Fowler says that while each individual instance of data collection may seem insignificant in the moment, the big picture is anything but: “Once it’s collected, it’s out of your control. It could be used in all kinds of ways that we can’t even imagine.”
On how apps are spying on kids on an alarming scale
I worked with researchers from a company called Pixalate, and they’ve looked at this very broadly. They tried to categorize all the existing apps that might be of interest to kids. And then they tracked what happened to the personal data that those apps were collecting: things like ways to identify the phone, the location of the phone. They found that more than two-thirds of [the apps] on iPhones were sending this information to the advertising industry. It was an even higher number – 79% – on Android phones. What shocked me about this is that we have a law in America that is supposed to protect children’s privacy – and yet it happens.
Why age ratings on apps don’t protect children from data collection
Apps in the Apple and Google app stores must all have an age rating, which reflects how violent or “mature” the content is. So you will see these ratings in the store. The problem is that these ratings have nothing to do with whether or not these apps collect data on children. The law we have in the United States, called COPPA, the Children’s Online Privacy Protection Act, makes it pretty clear that if someone is under 13, companies aren’t supposed to collect data about them without the explicit permission of their parents. But the problem is that this giant industry of app developers, along with Apple and Google, who run these app stores and make billions of dollars from them, have found some very big loopholes in this law. So they do it anyway.
On the loophole in the Children’s Online Privacy Protection Act of 1998
A large number of [the app developers] then just pretend, “We don’t know who’s using our app. It could be adults.” Or they’ll say, “We’re not really marketing this coloring app or this math homework app to kids. We’re marketing it to adults.” And Apple and Google, who run these app stores and are sort of the de facto police for them, let them off the hook.
Children use all kinds of things. The app stores that they have available to them on their phones are exactly the same as the app stores that adults have. So they want to play the same games that we want to play. Often it’s things like Angry Birds and Candy Crush. … They want to do a lot of the same things we do. And these kinds of apps all pretend they’re general-audience apps, meaning they’re designed for adults, rather than acknowledging that in fact kids are going to be interested in this stuff as well, and so they have to be treated differently.
On the responsibility of Apple and Google to stop this collection of data from children
Apple and Google are the de facto app cops in our world. They decide what happens in these stores, and right now they go to Washington all the time arguing that, even though they’re kind of like monopolies, only they should be allowed to run these stores, because only they can protect our privacy and security and protect our children. So if they say that, they really should force the apps to tell the truth about kids potentially using them, and if so, treat those kids differently.
Ideas to solve this problem
Another idea that has come up, of all places, from Instagram, which is one of the apps we’ve talked about a lot as causing trouble for kids, is to have the phone itself know whether a child is using it. So the idea is that when a parent sets up an iPhone or an Android phone for their child, they enter the child’s age. And if that age is less than 13, the phone would send a signal to apps that says, “Hey, there’s a kid here. Don’t collect data unless you get parental permission.” I think it would be really useful in many ways.
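The age-signal proposal can be sketched as a toy model. Everything here is hypothetical – the class and method names are made up for illustration, not a real Apple, Google, or Instagram API – but it shows the shape of the idea: the parent’s one-time age entry becomes a device-level flag that every app must check before collecting data.

```python
# Toy sketch of a device-level child signal (all names hypothetical).
class Device:
    """Stands in for a phone set up by a parent."""

    def __init__(self, owner_age=None):
        self.owner_age = owner_age  # age entered once, at setup time

    def child_signal(self):
        # True means: "there's a kid here, don't collect data
        # unless you get parental permission."
        return self.owner_age is not None and self.owner_age < 13


class App:
    """Stands in for any app installed on the device."""

    def __init__(self, device):
        self.device = device

    def collect_data(self, parental_permission=False):
        if self.device.child_signal() and not parental_permission:
            return "collection blocked: parental permission required"
        return "data collected"


kid_phone = Device(owner_age=9)
print(App(kid_phone).collect_data())        # blocked by the signal
print(App(kid_phone).collect_data(True))    # allowed with permission
```

The point of the design is that apps no longer get to claim ignorance: the signal is set once by the parent and is visible to every app on the phone.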
What happens when you tap “Ask App Not to Track” in app settings
When you hit the “Ask App Not to Track” button, … you’re making a request not to be tracked, but you’re not exactly turning off the system that would be used to potentially track you. What you are doing is preventing the app from using a particular type of identifier that exists on your phone, called the IDFA, the Identifier for Advertisers. It was actually made by Apple and built into the iPhone a long time ago. It’s just a code that lets apps know who you are across different apps. This is very useful for advertisers, who may want to take the same advertisement for fancy underwear that you saw in one app and then show it to you on another website or in another app.
So when you hit the “Ask App Not to Track” button, it cuts off that one form of identification, but it really does nothing to stop all the other types of data that can still be used to identify you, which apps may want to capture. … It’s better than nothing. So, yeah, take whatever protection you can get out there, because, you know, it’s a battle.
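The cross-app tracking described above can be sketched as a toy model. The `AdNetwork` class and the `IDFA` string here are illustrative stand-ins, not a real ad-tech API: the point is simply that a single identifier shared across unrelated apps is enough for a third party to join their reports into one profile of a device.

```python
# Toy model of cross-app tracking via a shared device identifier.
from collections import defaultdict


class AdNetwork:
    """Joins events from unrelated apps using a shared identifier."""

    def __init__(self):
        # identifier -> list of (app_name, event) tuples
        self.profiles = defaultdict(list)

    def report(self, idfa, app_name, event):
        self.profiles[idfa].append((app_name, event))

    def profile(self, idfa):
        return self.profiles[idfa]


network = AdNetwork()
IDFA = "ABC123-DEVICE"  # the same code is visible to every app on the phone

# Two apps with no relation to each other report to the same network.
network.report(IDFA, "ColoringApp", "viewed: underwear ad")
network.report(IDFA, "MathHomework", "clicked: underwear ad")

# The network now holds a single cross-app profile for this one device.
print(network.profile(IDFA))
```

Declining tracking, in this sketch, only denies apps that one `IDFA` key; any other stable signal they can read (and report under) would let the join happen anyway, which is why the button is “better than nothing” rather than a real off switch.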
On Fowler reading all the terms and conditions for each app on his phone
It was a million words. And just to give you some context, that’s about twice the length of War and Peace. There’s no way a normally functioning person will have enough time to read it even once, let alone keep re-reading it, because these companies are tweaking the language and updating it all the time. It’s just crazy. But unfortunately, this is the basis of how our privacy is supposed to be protected in the United States. Our problem right now is that we’re just overwhelmed with data collection, and this model that’s built into US law and economics – that we, the users, somehow consent to each of these uses of data – is completely broken. In fact, it’s really mean to us as consumers. It’s not really fair. It puts the accountability on our shoulders: if something happens with our data that we didn’t like, or if something bad happens to our data, it’s our fault for consenting all along. And I just think it’s really, really broken.
On how data collection could be weaponized against women in a post-Roe v. Wade world
Your phone knows a lot about you, and so it would know if you were looking for information on where to get an abortion. It might know if you were in a clinic. It can know your fertility cycle history, since many people use cycle-tracker apps. All of this data could be used against you if you find yourself in a state where seeking an abortion becomes illegal. There is already precedent for this: search histories and other information have been used in attempts to show that women were guilty of not caring for their fetuses or of causing the death of a baby. And what I think people forget is that any time a company collects information about you, the government can access that information, either by issuing a court order or, increasingly, simply by buying it. We’re talking about a giant industrial economy of selling people’s data, so more and more we see the government going and doing this to gather evidence and try to prosecute crimes.
Amy Salit and Thea Chaloner produced and edited the audio for this interview. Bridget Bentz, Molly Seavy-Nesper and Natalie Escobar adapted it for the web.