Welcome to tilde.camp!
The spark that ignited this project, and that culminated in tilde.camp, was the infamous "Facebook Experiment".
The Facebook Experiment was a scientific study in which unsuspecting Facebook users were used as lab rats in a social experiment designed to discover the extent to which people's emotions can be hijacked by Facebook. This was done by filtering certain status updates out of certain people's newsfeeds, to see how it affected their own status updates.
Two groups were selected: one group had their friends' status updates containing positive emotional words removed from their newsfeeds, while the other group had their friends' status updates containing negative emotional words removed. In this test of "emotional contagion", the experiment found that anyone whose newsfeed had positive status updates hidden from them (and who was therefore seeing a disproportionate number of emotionally negative status updates) was more likely to post emotionally negative status updates themselves. The inverse was also true.
The experiment was a "success". Facebook can be used to manipulate emotions en masse, and they know it. (Which means, so does their A.I.)
One of the things that makes this type of manipulation so powerful is that no misinformation was required. There were no fake status updates planted in people's newsfeeds, so there was no trace of misconduct: all you see are your friends' genuine status updates. It's just that certain status updates never showed up, which is incredibly hard to notice when you have so much information flowing through your newsfeed-and-mind at all times. In a world with such a constant, overwhelming stream of information, it actually becomes easier to manipulate people: the data need only be filtered and gently nudged this way or that, and you can lead entire populations to feel more a certain way, or drift toward a certain worldview.
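To make concrete just how little machinery this kind of omission-based manipulation requires, here is a minimal sketch in Python. Everything in it is an illustrative assumption (the post texts, the word list, the function names), not Facebook's actual code; the point is only that hiding posts containing certain emotional words leaves no fabricated content behind, just a skewed remainder.

```python
# Hypothetical sketch of omission-based feed filtering. Posts whose text
# contains a word from the suppression list are silently dropped; nothing
# fake is ever inserted, so the reader has no trace to detect.

POSITIVE_WORDS = {"happy", "great", "love", "wonderful"}

def filter_feed(posts, suppress_words):
    """Return the feed with any post containing a suppressed word removed."""
    visible = []
    for post in posts:
        # Normalize each word: strip trailing punctuation, lowercase.
        words = {w.strip(".,!?").lower() for w in post.split()}
        if words & suppress_words:
            continue  # silently hidden -- the reader never knows
        visible.append(post)
    return visible

feed = [
    "Had a wonderful day at the beach!",
    "Stuck in traffic again.",
    "I love this new cafe.",
]

# Suppressing the positive posts skews the visible feed negative.
print(filter_feed(feed, POSITIVE_WORDS))  # → ['Stuck in traffic again.']
```

Every post that survives the filter is genuine; the manipulation lives entirely in what was omitted, which is exactly why it is so hard to notice.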
To learn more: Facebook's History of Secret Experiments
Combating these types of manipulation has proven much more difficult than we thought, for a number of reasons. For one thing, these algorithms are already in place, filtering our worldview in whatever ways are profitable for the companies implementing them. And if it's not profitable for people to learn in depth about how they are being manipulated, then chances are the algorithms will bury that information as well, or remove it from your newsfeed altogether.
Another obstacle to sounding the alarm on these manipulations is the human tendency to consider ourselves "individuals". As "individuals" we like to think of ourselves as somehow impervious to outside manipulation, and imagine that every choice is our own "free choice", derived from some magical concept of "free will" that cannot be manipulated. This is captured in the rules of marketing that have allowed marketing to take over every aspect of our lives. Rule #1: Marketing works. Rule #2: No one believes that marketing could possibly work on them, although there is overwhelming evidence that it works on everyone else.
The emotionally fragile human brain, with its heavy investment in this illusory concept of the unchanging self, reacts predictably when met with information proving just how easily we are manipulated, and how prevalent these manipulations are: we all generally consider ourselves the "exception" to the rule.
And yet, if we humans could fearlessly admit the extent to which we are being manipulated, we might be able to do something about it quite easily. For instance, by creating media platforms that organize things in chronological order again.
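The "chronological order again" fix really is that simple, mechanically speaking. A minimal sketch, with an assumed post structure chosen purely for illustration: rank the feed by timestamp alone, so no engagement or emotion-weighting algorithm decides what you see.

```python
# Minimal sketch of a strictly chronological feed: sort by timestamp,
# newest first, hide nothing. The dict structure is an assumption made
# for this example, not any real platform's data model.
from datetime import datetime

posts = [
    {"text": "Post B", "time": datetime(2024, 1, 2, 9, 30)},
    {"text": "Post A", "time": datetime(2024, 1, 1, 18, 0)},
    {"text": "Post C", "time": datetime(2024, 1, 3, 8, 15)},
]

def chronological_feed(posts):
    """Newest first, nothing hidden: every post appears exactly once."""
    return sorted(posts, key=lambda p: p["time"], reverse=True)

for p in chronological_feed(posts):
    print(p["time"].isoformat(), p["text"])
```

Because the ordering rule is a pure function of the timestamp, there is no lever left for anyone to gently nudge the feed this way or that.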
However, a war against this illusory concept of the unchanging, manipulation-proof self has been waged for many years now. It's called Anonymous.