Unaliving, Self-Deletion, and the Infantilization of Dialog on the Internet
One of the most important things we do as individual members of a civilization is engage in dialog. This dialog covers a wide range of topics, from family events to geopolitics. The latter is, of course, a largely recent phenomenon and (likely) historically rare outside of certain social classes, but it is important nonetheless.
Dialog must be frank, open, and honest to be useful. Censorship, even self-censorship, has numerous negative impacts, including a lack of feedback, illusions of popularity (especially of ideas), a shallow understanding of events, and an inability to peaceably connect and interact with the world. It is no exaggeration to say that any decline in the quality of dialog, including the rise of censorship and self-censorship, would form a trend with potentially disastrous social and political consequences.
I want to begin here with a couple of definitions:
- Infantilization is the reduction of words and actions to extreme childishness, especially among adults. In the context of this article, it occurs primarily through the use of euphemism and through barring participants in a dialog from using certain words and phrases, even when those words serve a useful purpose. Examples may include…
- …a media outlet criticizing a politician for using the “f-bomb” rather than simply quoting what was said, in context, colorful language included.
- …insisting that grown adults use childish language like “self-deletion” or “self-unaliving” when discussing serious social and personal issues like suicide.
- Crybully is a term coined in an article featured on ZeroHedge some years ago. It describes somebody who emotionally manipulates others through contrived and often nebulous claims of victimization in order to intimidate, silence, or otherwise harm other people. Crybullies especially prefer to use emotionally, politically, or socially charged words, phrases, or accusations in an effort to put their victims on the defensive and discredit them before an audience. Examples include, but are hardly limited to: racist, sexist, misogynist, [insert here]phobe, fascist, nazi, microaggression, and more. In a strange twist of irony, the word bully is also a favorite of crybullies, especially after they reach the finding-out stage of their undesirable behavior.
All of that being said, I have noticed a disturbing trend on certain platforms: content creators are choosing to self-censor. Those who fail to do so must remove or alter content to avoid outright removal, are demonetized, have their reach limited, and so on.
This is growing tiresome. It hides the fact that the world is not always a pleasant place and that nobody actually cares how you feel; the process is shrouded in opaque algorithms and nebulous community guidelines; and I fail to see any real benefit from any of it. It’s time to stop.
The World Can Be Brutal
Guess what: the real world can be a brutal place, sometimes. If you choose to live your life rather than burying your head in the sand or living like a hermit, then you’ll very quickly notice that…well, bad things happen. At some point in your life, you will see, hear, or otherwise witness horrifying events.
You can’t live in a bubble forever. (Neither can your kids, but that’s beside the point.) One day, you’re going to step outside your door and see something terrible. If you’re lucky, it’ll just be a mild car accident where someone gets loaded into an ambulance. If you’re a little less lucky, you’ll find yourself driving past a motorcycle accident that the local FD doesn’t have the trucks to cover, and…hey, that’s an ankle sticking out of that shoe. Sucks to be that guy.
Oh, and this is the lightweight material that most people are likely to encounter at some point in their lives: accidents and the resulting injuries. This doesn’t even touch on the horrendous things that people do to each other.
Nobody Cares How You Feel…
Real life happens sometimes, for better or worse. Suck it up, buttercup. Nobody needs to hide behind stupid little euphemisms. Nobody should ever have to watch what they say because someone else with a severe case of arrested development might get “triggered.”
Do I care how that makes someone feel? No. I really don’t. Not one bit.

Do you not like this? Are you “offended”? Too bad. You see that little X on the tab you have open? The one with a circle and arrow in the picture above? Click it. Just click it. That’s all you have to do: click it and shut up about it.
There’s something that needs to be understood about life. This is important. Neither I nor anyone else has the energy to care. We cannot waste what little time and energy we have on caring about what might offend some rando somewhere. Here’s the thing…
- Somebody, somewhere will be offended by what you say, do, or believe, no matter how normal or innocuous it may seem to you.
- Nobody, and I mean nobody, has the energy to constantly worry about how everyone else might feel about what they say or do.
Get used to it. People will discuss rough topics. In some cases, they need to discuss these things. They really neither know nor care how a third party might feel about it. Nor should they. Nothing gives the “offended” third party the right to interfere in that discussion.
Opaque Algorithms, Nebulous Guidelines…
I think the worst part of this whole affair is the combination of opaque, closed-source algorithms and nebulous community guidelines.
I’m going to start with the algorithms. One of my pet peeves is that the algorithms for social media and YouTube are closed-source. There have been (supposed) leaks, of course, but I cannot, say, go to Google’s Git repository and view the algorithm myself. Yes, it’s generally accepted that certain factors (e.g., likes) will promote or demote certain content for various users. However, there is nothing on the content-creator or social-media-user side that tells them how the algorithm actually works.
The biggest problem I have by far, however, is the makeup of the “for you” pages. How do social media sites decide what content is promoted on users’ pages? It certainly doesn’t seem to have anything to do with whom they’re following or subscribed to. Frankly, the FYPs I’m seeing turn platforms into a lottery: will users see the content they want, and will content creators keep connecting with consumers? That’s a problem. It makes the platforms unstable and unpredictable—and miserable from a user perspective.
Either way, on to the problem I have with community guidelines. Ironically, it is less about the guidelines themselves and more about their enforcement. A guideline against harmful content may seem perfectly reasonable on the surface. However, when that guideline is used to selectively silence voices to the right of Karl Marx, we have a problem. And this appears to have been rampant until very recently.
Cui Bono?
It’s a serious question: who actually benefits from any of this? Because it certainly doesn’t seem like anyone does.
Outside of a small group of crybullies, does anyone actually benefit from a total lack of maturity in dialog? Does anyone actually benefit from the arrested development that stems from inundation with silly euphemisms?
Frankly, it seems like nobody actually benefits from any of this in any way. All it has done is hand the ability to speak over to a small group of people.
Indeed, this small group of people is the only beneficiary of the system. They are the political ideologues, the tHouGhT lEAdErS (pronounced a-cuh-dem-icks), multinational corporations, NGOs, activists, and other persons (and organizations) of questionable moral or social value.
Bust the Stupid Bubble, Already
Thing is, I’m talking about grown adults here. These grown adults are having to tolerate language-based censorship in content aimed at adults, on various websites, under the auspices of those websites being “family-oriented.” One site in particular is known for, often arbitrarily, labeling various content as “mature content,” limiting the reach of said content, demonetizing it, and so on—cough cough YouTube cough cough.
Stop. Seriously.
We all know that these websites use algorithms to promote content labeled as “family-friendly,” or even “for kids,” despite that content being obviously unsuitable for children. And, frankly, this family-friendly nonsense is a load of garbage. It always has been. Nothing on the internet is safe for children. It never has been, and anyone with two brain cells in synapse has known this for a very long time.
The often (allegedly) arbitrary nature of rules enforcement aside, it’s not just time to ask whether these rules are useless. It’s time to ask whether they are actively harmful.
I think this language is arresting people’s development. They’re not maturing in the way they use or process language, and it’s insane to expect that they will. People need the complete, unfiltered truth if they’re going to grow up. Otherwise, whatever they have to process later in life will only be made worse by their never having had to grapple with it conceptually.
And I don’t just blame the censors and crybullies. I also blame the people who willingly accede to their demands, who willingly censor their own speech, usually for the sole purpose of staying monetized. They’re just as bad as the crybullies. They give the crybullies something they desperately crave: consensus…something the crybully can point to as proof that it’s “nOt ThAt HaRd” as they browbeat their next victim.
Burst your bubble. Stop limiting the way you speak and use language. You don’t have to become a foul-mouthed bore who confuses profanity with having a vocabulary. No, speak the unfiltered truth and never, ever use their insipid, infantile language.

Stop It…Seriously, It’s Time to Stop
I don’t know about anyone else, but I’m not about to play nice on any of this. No, I’m not going to self-censor.
- They are not unalive…they are DEAD.
- They didn’t self-delete…they COMMITTED SUICIDE.
- They were not unalived…they were MURDERED or otherwise KILLED.
- Nobody saw a pew-pew. They saw a GUN.
Not only do I refuse to self-censor, but I’m not adding any of their precious little tRiGGeR wArNiNgS, either.
I write high fantasy. People will get stabbed, cut open, dismembered, decapitated, disemboweled, and so on and so forth. Occasionally, someone might even get hanged, drawn, and quartered—now there’s some messy imagery.
I don’t care if someone gets triggered.
Nobody has a right to use their “trauma,” whether real or imagined, as a means to silence others, control what they say, or put limits on how they say it. Doing so instantly turns one from “victim” to blatant crybully. That’s right, anyone who ever tries to do that kind of thing is a bully, not a victim. Like all bullies, they should be exposed and shamed for their nasty behavior.
So, pardon me for my lack of sympathy when you hear someone say “dead body” or “corpse” instead of stupid little euphemisms like “uNaLiVe PeRsOn.” I don’t care.
It’s time to stop.
In fact, the time to stop was years ago.
I get why people did it. I understand. Nobody wanted to feel mean. Nobody wanted to be the jerk in the room. People had their goodwill turned against them by malicious actors pretending to have some special status.
It’s time to stop. It’s time to stop self-censoring. It’s time to stop apologizing. It’s time to stop giving ground.
If someone doesn’t like it, then that’s too bad.
Don’t give a crybully the time of day. Don’t debate people like that. Don’t engage with them; don’t give them your time; don’t feed the crybully in your feed; and, for Pete’s sake, never, EVER apologize to one of the little twerps. The only engagement they should ever receive is mockery.
Feel free to ask questions or make comments below…