Let's Fix Social Media

Although I use social media (you might have clicked on this post from Instagram, Twitter, or Facebook), I believe with all my being that it’s a horrible, destructive force that has caused far more harm than good. So why do I use it? Sad to say, it’s become so ingrained in day-to-day life that I really have no choice. I would delete it all if I didn’t think it would hurt my career. It’s probably here to stay. That doesn’t mean it needs to continue to be a blight on our society. Here are a few quick fixes that we could pass laws to enforce and start changing social media for the better.

No Minors

This is a big one. Facebook recently announced (and then delayed) a new Instagram platform aimed specifically at kids ages 13 and under. On the heels of a report that kids, especially girls, feel that social media has made them basically hate themselves, this is a disturbing development. Kids, who are not yet finished with their physical and mental development, should not be anywhere near social media. It creates a pseudo-reality where all of the social pressures of real life (and beyond) are present but none of the repercussions. You can bully, use racist, hateful language (and have it reinforced and encouraged by others), body shame, and do a number of other harmful things with little accountability.

How often do we hear a story about a group of teens at a high school participating in racist groups (like this one, where students held mock slave auctions for their Black classmates)? Pretty often, it seems. And those are just the ones we find out about and that get reported in the news. I’m sure that’s just the tip of the racist iceberg. Then there’s also the sex trafficking and abuse that take place. In a case that is unfolding in Minnesota, a Republican donor and operative named Anton Lazzaro was charged with multiple counts of trafficking and other crimes.

One of the people caught up in the case is a 19-year-old college student named Gisela Medina. She worked as a recruiter for Lazzaro, finding girls on Snapchat and funneling them to him to be pimped out. In exchange, he gave her gifts like champagne and paid her tuition. He also hired her to tutor his high-school-age girlfriend in algebra. He’s 30, btw. It’s super gross. And while banning minors from social media will not stop trafficking entirely, it can have a major, major impact. In this case, for example, none of the trafficked minors could have been recruited if not for Snapchat. Sex trafficking and child abuse on social media are a HUGE problem.

And yet, our lawmakers would have us believe there is nothing to be done. Hogwash, I say. If these companies want to sell their products in our country and make money off our citizens, they need to make sure they’re doing everything they can to keep us safe. Part of the disconnect is that we have so many elderly lawmakers who do not understand or use social media at all. Kind of important to know about the thing you’re charged with regulating. I wouldn’t put a penguin in charge of a BBQ.

We regulate loads of other industries to protect kids from harm. They can’t buy booze or cigarettes. They can’t vote or drive until they pass a test. They can’t rent a car or a hotel room. They can’t even buy graphic video games or explicit CDs (does anyone still buy CDs?), or get into an R-rated movie on their own. We limit how much, when, and where they can work. I could go on, but I think you get the point. Why is social media any different?

And listen, this is to protect the kids from THEMSELVES as much as anything. How many times do we see someone post something asinine on social media when they were a teenager only to have it come back and bite them in the ass later in life? If we all collectively agree that kids are pretty much corruptible morons who shouldn’t be trusted to do much of anything, why do we allow them to show their asses to the world? We shouldn’t.

That’s step one. No minors. Maybe we can allow some kind of super-regulated social media platforms specifically AND ONLY for kids but I have yet to see one that wouldn’t be harmful to some degree. It’s like saying there is a safe cigarette.

But Matt, how are these social media companies supposed to check their users’ ages?

Great question and it brings me to my next fix:

Verify and Confirm All Users

Yeah, that’s right. From now on, if you’re going to use social media, you need to prove you are who you say you are. This can be done in a number of ways. The porn industry has figured it out to some degree. You can’t open a savings account or get a credit card or car loan without proving who you are; the banking industry has been doing this for decades. Hell, I have to show some form of ID to get a library card. You could require credit cards, bank info, government IDs, Social Security numbers, or something as simple as a cellphone statement. Remember, there won’t be any minors participating anymore, so an adult in America should be able to produce at least one of these. We put a man on the moon! We can figure out how to ID social media users. And if not, then I guess that’s too bad for ya. You don’t have a right to post pictures of last night’s dinner.

This will do a number of things. For starters, it will virtually eliminate all the bots and spam accounts that have spread so much disinformation. No more Russian troll farms stirring up shit in America. It will also make it a lot easier to hold users accountable for their actions. If we aren’t going to require that social media companies fact-check their users’ posts, then we need a more effective way to pursue legal action against the users who post defamatory content. If a sex trafficker or a person cruising for minors to abuse has to verify that he is who he says he is, I think he’ll be much less inclined to use the platform for that purpose. Plus, if there aren’t any minors on there anyway, there won’t be anyone to trawl for.

To be clear, I’m not saying this information should be public. You can still make parody accounts and clever usernames. But the company needs to know who you really are and be able to share that information with law enforcement or other agents of the justice system. Want to sue an account claiming to be you? A li’l subpoena and presto: Facebook hands over the user’s info to your lawyer. There are some details that need fleshing out, but I have faith we can work it out.

But Matt, I don’t feel comfortable about giving my personal info to Facebook, etc.

Well, I got news for you, Sally. They already have FAR more info than you probably know. Remember, with social media, the platform is not their product. That’s the bait. YOU are their product. Believe me, they already know where you bank, what kind of car you drive, where you work, who you’re friends with, where your kids go to school, what your house looks like, where you live, what you eat, etc. It’s nearly endless. And they do that for good reason. It’s their entire business model. The more they know about you and the more accurate the info is, the more valuable you become for them to sell to other companies. We can easily write a law that requires them to either encrypt your info or delete it altogether once you have been verified. Simple.

But Matt, isn’t this putting an unfair burden upon the companies?

Nice of you to have sympathy for companies that are worth BILLIONS of dollars and do as little self-regulation as possible, even when they know it would help protect children and democracy itself. I do not have such feelings about them. To me, they’re more akin to tobacco companies than anything else. Their product is addictive and dangerous to people’s health. It’s time to regulate it.

Liquor and tobacco stores would make a lot more money if we didn’t require them to verify the age of their customers. If anyone could just waltz in and buy beer or vodka, how many 11-year-old alkies would there be? I’m willing to bet a lot more than now. We don’t even let tobacco companies sell or advertise flavored products in a lot of places because of how it impacts kids. Social media should be no different.

Now, will the companies fight this? HELL YEAH, they will. They’ll cry about how much it’ll cost to create and implement a system like this. They’ll say it won’t even do anything since crimes gonna crime. But the truth of the matter is this: they don’t want to verify users because they know that fake accounts help keep their user numbers rising, and some of their most popular content comes from spammers. Once they delete all the fake accounts, you’ll see their stock value dip a bit. Same as when we kick all the kids out of the pool. That’s fine with me. I’m sure Philip Morris took a hit when it was barred from selling to minors and aggressively restricted from advertising. And yet, they survived, with a current market value of around $150 billion. So yeah, Facebook will be fine.

We cannot allow unfettered capitalism to continue to destroy another generation of children. There are things more important than money. And let’s not forget, Facebook started as a way for nerdy creeps to check out and rate women, so it’s not like it’s a virtuous pursuit in the first place.

Update Section 230

I’ve written about this before. In brief, Section 230 is part of the Communications Decency Act (CDA) of 1996 and provides liability protections to internet companies. Social media falls under its protection. Basically, it shields them from lawsuits over what their users post. So, for example, if a reporter for USA Today writes something false and defamatory, you can sue the paper AND the reporter. With Section 230, you cannot sue Twitter for something one of its users says. As I previously wrote, there are some positives and negatives to this.

But it hasn’t been updated since 1996. Feels like a lot has changed since “Macarena” was sweeping the nation, no? We need to get into the text of the law and require companies who want the privilege of being exempt from lawsuits to EARN IT.

So what are some of the changes we can make? Luckily, the Justice Department actually spent a good chunk of time last year studying this exact thing. Their analysis boiled down to two main changes: more transparency and doing more to stop illegal activity on their platforms. Now, this report was compiled by the Trump Justice Department, which was about as fair and balanced as Fox News. So a lot of its “transparency” changes have to do with inhibiting protected speech. But we aren’t going to allow that to happen. In fact, we need to go in the opposite direction.

Does a person have a right to say hateful things? Yep, unfortunately. Does a person have a right to spread that hate online, using a private company’s platform? NOPE. The first update we’re going to give Section 230 is that social media companies will now be required to create and publish their plans and policies for policing hate speech. There will be minimum standards they have to meet. We can give them a pretty broad definition of what constitutes “hate,” but they’ll be accountable for upholding their own policies. If they don’t, see ya later, exemption!

We’re also going to require them to combat disinformation and fake news. Again, they’ll need to create and publicize their policy and plan and then be accountable for upholding it. Trolls and foreign agents spreading disinformation online with the sole purpose of dividing America and making it sicker cannot be allowed to continue. If they fail at this, then again, they lose the protection of Section 230.

The last change that has to do with transparency is about how the company stores, protects, and sells its users’ data. Since we’re going to be requiring them to prohibit minors and verify all users, the integrity of the system is vital. Users will need the ability to delete all their information from a social media company’s database with a single click. No more hiding “delete my account” eight layers deep into the settings menu.

Users also have a right to know what companies their information is being sold to. There will need to be a section in your account where you can access a list of all the parties that have accessed your user info and what aspects of that info they have purchased. Facebook definitely knows who writes them the checks. It’s just that you (again, their product) don’t know who owns your data. I think YOU should own your data or at least know who has it.

Now, as far as stopping illicit activities on their platforms: by banning minors and requiring everyone to be verified, we’ve already done a lot of the work. But we can do more. One way to stop crappy behavior is to stop relying on temporary bans. No more “15 strikes and you’re out” policies. I think you get two. Mess up once, OK. But do it again and you’re done. And since we’re verifying everyone, they won’t be able to make a new account and continue their bad behavior. They are done-zo. This might hurt Facebook’s bottom line, but again, I don’t care. I’ll save my tears for the folks who get stalked, harassed, bullied, and killed, or end up killing themselves. Facebook is worth nearly half a TRILLION dollars. They can spare a few more bucks on hiring people to make sure their platforms are tidy and as hate-free as possible.

The Blowback

Whooooo boy, there’s gonna be some. Tech companies are not going to like these new regulations. They will bribe (sorry, lobby) politicians to never enact these changes. They will run campaigns online and on their own platforms about how bad these changes will be. They might even have something akin to those messages that TV networks scroll on your screen when DirecTV doesn’t want to pay for ESPN. Consider this, though. Unlike with cable providers, you are not Facebook’s customer. They are not really interested in keeping you happy, just engaged and using their service. The more active you are on their service, the more data they can mine about you and sell off. Don’t listen to them when they try to scaremonger about how these changes will ruin your account. What are they gonna do? Stop making money? LOL.

There will also be arguments about what constitutes “hate speech,” and some of that will be fair. I feel I have a pretty good definition of it, but that might differ among millions of people. If I genuinely believe that Zionist Jews are running all the banks and ruining the world, don’t I have a right to talk about that? Well, not really. As we’ve covered, hate speech is only protected from government intrusion, meaning the government can’t arrest you for saying it. But when it comes to businesses, even ones regulated by the government, it’s a bit more of a gray area. The majority of examples will be obvious; some will be trickier. None of this matters. Doing SOMETHING is better than nothing. Always. One idea would be to create a board made up of diverse viewpoints that sets the standards. That might be tricky, but again, I think we can figure it out. It won’t be perfect, but it HAS to be better than what we have now.

One additional change I would suggest: social media companies need to do a better job of protecting and caring for the employees who moderate the content on their platforms. Being a content moderator is really hard work. They aren’t given much time to judge what can and can’t stay on a platform. They’re also usually required to sign lengthy NDAs (non-disclosure agreements). And facing the worst of our society, like murders, child porn, animal torture, and all manner of awfulness, takes a toll. The job has a huge impact on their mental health. And if Facebook and Twitter want to wash their hands of being responsible for content themselves, then they need to care for the employees who do that job for them.

This means providing them with counseling during AND after their employment to deal with it. Think of it as a workers’ comp program for social media. There also need to be stricter standards for what is considered disturbing content. Lastly, we need to stop companies from requiring NDAs for these employees. There is nothing proprietary about the horrors these moderators are forced to witness. Mental health screening should be regularly offered and paid for by the company.

In Conclusion

Social media can be an incredible tool. So can a hammer. Right now, we’re using social media to bludgeon the brains of children and adults alike. It doesn’t have to be this way. The only reason Facebook and Twitter exist is because we wrote laws that allow them to. We have control here. We can make them accountable for whatever we want. And if they can’t be in business without hurting our society, then they shouldn’t be in business at all.

Matt Barnsley