
Facebook Eyes Final Data Frontier, Unrolls App for Kids

Facebook has unveiled its latest platform, Facebook for Tots. Its official name is actually Messenger Kids, and those familiar with Facebook’s revenue model, which monetizes users’ behavior as data sold to advertisers, will understand that kids represent the final frontier of that advertising data. Kids are not just the final untapped source of behavioral data; they are the most lucrative, and the social network has now presented a means to mine that data under the guise of ‘protecting’ the kids.

There goes Mark Zuckerberg again, saving the human race from its own malfeasance. What would we ever do without him to save us from the minefield of unmonitored internet use? First they requested our nudes, and now they are asking parents for permission to data-mine their children. If they have the nudes and the kids, what is even left for Facebook to mine?

Facebook DNA, coming September 2018…

But seriously, let’s consider the specifics and ramifications of this new Facebook platform, which caters to those below the current Facebook age restriction of 13 years. Keep in mind that the new minimum age for Facebook use would now be six.

‘The company will collect the content of children’s messages, photos they send, what features they use on the app, and information about the device they use. Facebook says it will use this information to improve the app and will share the information “within the family of companies that are part of Facebook,” and outside companies that provide customer support, analysis, and technical infrastructure.’ (Wired)

Those kids who have already been corrupted by the ‘adult’ version of Facebook are not going to go quietly to a platform branded as Messenger Kids. The first time I listened to the explicit version of an Eminem album, no one could have convinced me to listen to the clean version if there was any other way to hear those dirty words so crucial to the rhyme scheme and lyrical effect.

So, it follows that Facebook’s Messenger Kids is not going to serve as an alternative on which parents can monitor their pre-teen and teenage kids’ behavior. This demo has crossed the social media Rubicon, and even if their parents insist, they aren’t coming back. To those well acquainted with Facebook’s primary aim – getting as much information about the lives and thought processes of as many people as possible, including children – this Messenger Kids serves to send one message: it’s perfectly acceptable for children of any age to use Facebook.

It’s a blatant attempt at de-stigmatizing social media use, while giving parents the false comfort and low-hanging rationalization that Mark Zuckerberg has made it safe for kids, so why not? After all, how could we deprive our children of access to the platform that has yielded so many benefits in our own lives, connecting us with high school friends and past coworkers with whom we’d long ago lost touch? Maybe six-year-old Timmy will be able to get back in correspondence with that buddy he had way back in the womb days, right?

I’ll set aside the moral grandstanding about the seemingly unstoppable rise in childhood obesity and the role technology and social media have played in fueling it. The decline of kids spontaneously putting together a neighborhood football game, or the near-extinction of the most basic, joy-inducing game there ever was, Tag (which has actually been banned by many school districts, if you hadn’t heard), is depressing. But the debate over Facebook’s Messenger Kids runs far deeper than one’s body mass index.

For one, the introduction of a social platform populated by children is an advertiser’s dream. Advertisements already sustain Facebook’s revenue model, and providing a platform comprised solely of the most impressionable, easiest-to-market-to demographic on the planet, children, is almost as unjustifiable as creating a Facebook for kids in the first place. For now, Messenger Kids is said to contain no advertisements, but in the relatively unregulated medium of social media, how can that promise be trusted, or kept from changing? Even Plato recognized that we should show children only images reflecting humans’ most just representations, and it is precisely the ‘special nature’ of children, impressionable and vulnerable to falsehood and the agendas of advertisers, that has prompted the FTC to create rules over the years limiting marketing to children. This, it is easy to argue, is Facebook’s primary long-term benefit in creating such a network: a veritable goldmine for advertisers.

It’s a sad state of affairs that it is necessary to state this: the same logic applies for pedophiles, who will surely infiltrate the network with haste, preying on those whom it has never been so convenient to prey upon.

Beyond those who would infiltrate the medium from inside or out to take advantage of the children who use Messenger Kids, it is the medium of Facebook itself, along with its offshoots, which reshapes the personalities and social makeup of its users, even as adults. Zuckerberg and his co-founders knew this when they created Facebook, and have tailored the platform’s algorithms to make people essentially addicted to its never-ending stream of pictures, statuses, videos, and ‘news’.

As co-founder Sean Parker explained this year, “it literally changes your relationship with society, with each other ... It probably interferes with productivity in weird ways. God only knows what it’s doing to our children’s brains.”

God may not know what it’s doing to our children’s brains, but Mark Zuckerberg and his team of technicians certainly do. And now, they will have a legal basis for tapping further into those brains, molding them into the consumerist, tech-addicted, mindless drones that increasingly populate not only elementary schools, but colleges, workforces, and social assistance rosters.

But, parents can approve their contacts and monitor their conversations, so what is there to worry about?