anneapocalypse · 7 years
Note
[tone of genuine curiosity, as clarified in an elcor-esque fashion because the internet renders all emotion an uncertain factor] You're welcome to skip this ask if you ain't up for it, but re: the perpetual debate over Problematic Subjects In Media, I've seen you in the past write many a critique on how fandom writes/treats women / BDSM / etc. Does this not fall under the idea that the writer has a responsibility in how they handle / frame certain issues in their writing?
Hi Silt! I’m up for it, but buckle in, because this is gonna get long. :)
Okay so the thing is, this is a broad topic and these days I try to resist treating it as a zero-sum game with “No Critique Allowed” on one side and “Relentlessly Harass People Who Make Bad Content According to Our Arbitrary But Obviously Correct Standard” on the other. 
Let me state clearly for the record: both of those options are terrible. Fortunately, it’s not all or nothing, and those aren’t the only horses in the race.
The way that female characters, characters of color, disabled characters, and other representations of marginalized groups are treated in media remains very much of interest to me. That hasn’t changed. My approach has changed somewhat over the years (as I’d hope it would, if I’m continuing to grow as a person), largely due to understanding that some rhetorical styles are more effective than others when you actually want to reach people or change something.
If I gave the impression that I want to absolve creators of all responsibility, that was never my intent. In fact, I mentioned critique and growth as part of the process in one of my recent posts. I do critique the media I regularly consume, and in fact the more heavily I am immersed in something, the more in-depth my criticism, because we’re best able to examine the things we know best.
What I do feel is that creators need room to grow, and fandom can be a great test bed for exploration, where creators work with elements of established media to explore different ideas and techniques. I’m not saying fandom is only a test bed, or like, a trial run for original work, because I don’t think that; I think fanworks are worthwhile in their own right, written for enjoyment and personal indulgence. But the fact is that many of us do or will create original work, and for many of us, creating fanworks helps us build a skillset we’ll use for original work too. 
That said, the cultural impact of fandom is more limited than that of popular media. I’m not saying it has no impact–indeed, in a time when multiple known works of popular published fiction are retooled fanfics, and TV writers are on twitter regularly interacting with their fanbases, it’s probably safe to say fandom has more influence on popular media than it ever has before–but nevertheless, its impact is still limited. The average piece of fanfiction simply does not reach an audience on the scale of a piece of popular media.
Does that mean we shouldn’t bother looking at patterns in fandom and fanworks? Hell no! Fandom is a microcosm–the patterns we see in fandom do absolutely reflect wider social patterns and in fact for very immersed fans it can make those patterns more apparent. And I think it’s good for us to discuss them, address them, become more aware of how we play into them–especially if we’re creating or planning to create original work.
Because these kinds of discussions, when they are actually discussions, do work. I talk about the season 10 climate in the RvB fandom a lot, but even back then, I saw people change their minds about Carolina, not because they were accused of internalized misogyny or told to feel guilty for not liking her (shockingly, shaming people for their taste doesn’t have a high success rate in changing their minds), but because someone presented them with a compelling case for a more nuanced reading of her character. After my experiences in past years, I was all but checking my watch this season, waiting for fans to turn on RvB’s newest female character, and you know what? It hasn’t happened. Things do change, and I don’t think fandom turnover is the sole reason. I would love to see some shifts in other patterns as well. For example, I would love to see trauma in female characters given as much weight as trauma in male characters. I would love to see more artists willing to draw Tucker with brown eyes. Those will be discussions, and we’ll continue to have them.
What I’ve seen happening in recent years, though, is a turn toward a certain ideal of purity in fanworks. It’s not an ideal of working toward more complex and thoughtful portrayals of characters; rather, it’s an all or nothing attitude that says some characters and ships and topics are Good and worthy to be explored in fanworks, while other characters, ships, and topics are Bad and anyone who touches them or likes them is Bad, and also fair game for targeted harassment.
I keep drawing comparisons between fanworks and original work for a reason–the attitudes that I find most unsupportable in fandom are the same ones I find untenable when it comes to original work, and when you apply them to the latter, their limitations are far more obvious. 
One example: the idea that it’s wrong to find any reasons to sympathize with an antagonist, or to look for an interesting and complex backstory, one that might make sense of (not even to say justify) their actions. That’s all well and good when you’re engaging purely from a fan perspective I guess, but what happens when you want to write a novel? If it’s morally wrong to find complexity and interest in villains, are you morally obligated to make your antagonist as bland and cartoonish as possible, to be sure no one could possibly relate to them? Is that good writing? Is that what we want?
Or take the idea that it’s morally wrong to ship unhealthy ships–and this attitude in fandom goes that shipping certain ships is wrong regardless of how or why, to the point that people will proudly identify themselves as “anti-[ship],” thus building a kind of identity around not shipping a Bad Ship (and giving rise to the umbrella term “antis” to refer to this attitude). Carry this into original work and… you’re not allowed to write unhealthy relationships? You’re not allowed to write any conflict into a relationship between two “good” characters lest it be perceived as “abusive” or “toxic?” 
Then there’s the idea that it’s morally wrong to write fic with dark subject matter, which is what my most recent posts were about. I’m never going to argue these things can’t be done badly but I’m absolutely going to push back against the idea that they can’t be done at all. And I could write paragraphs more about how incredibly reductive I find the whole idea that certain topics are just off-limits for fiction, that art isn’t allowed to be catharsis (especially in a tiny niche setting like fandom, for corn’s sake) but this post is long enough, so I think I’ll put a lid on it here. ;) But frankly, if someone’s going to write dark fiction insensitively, in bad taste, or just plain poorly, there are worse places for it to exist than on AO3 tagged with content warnings, where nobody’s paid a hot cent for it and the way out is just clicking the back button.
24 notes
ntrending · 6 years
Text
Our stupid brains love spreading lies all over the internet
Social media allows us to catch news in real time, but it also makes it incredibly easy for false information to spread. A new study published this week in Science suggests that hoaxes, whether they be unintentional nuggets of misinformation or malicious pieces of propaganda, actually disseminate more quickly than pieces of true news.
But that’s not at all surprising, from a psychological standpoint. By their very design, these falsehoods prey upon some of humanity’s greatest cognitive weaknesses.
Because most people get information from social media without thinking about which traditional news outlets the stories come from, the researchers—Soroush Vosoughi, Deb Roy, and Sinan Aral of MIT’s Media Lab—didn’t just look at “fake news” published in full-article form. They instead tracked “rumor cascades,” or the spread of tweets involving any piece of information (texts, photos, links to articles) that purported to be true news. That leads us to a brief caveat: this study isn’t necessarily evaluating our ability to tell whether or not a news article is biased to the point of absurdity, or downright untrue. A tweet with a lie in it isn’t exactly the same thing as a news article full of misinformation. But while “fake news” is quite the hot topic, internet falsehoods in general are probably pretty dangerous, too.
Back to the methodology: Each individual fact counted as a rumor, and each time it was independently presented as news counted as a cascade. Cascades grew with every retweet. A rumor might have several “size one” cascades if multiple individuals tweeted the information without getting any retweets, or a rumor might have a single, large cascade if 1,000 people retweeted a single instance of the claim.
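The cascade bookkeeping described above can be sketched as a small data structure. This is a hypothetical illustration of the counting scheme only, not the MIT team's actual code; the class name and the rumor label are invented:

```python
from collections import defaultdict

# A "rumor" is one claim; each independent tweet of that claim starts a new
# cascade, and every retweet grows that cascade by one.
class RumorTracker:
    def __init__(self):
        # rumor id -> list of cascade sizes
        self.cascades = defaultdict(list)

    def new_cascade(self, rumor_id):
        """An independent tweet of the claim starts a size-one cascade."""
        self.cascades[rumor_id].append(1)
        return len(self.cascades[rumor_id]) - 1  # index of the new cascade

    def retweet(self, rumor_id, cascade_index):
        """Each retweet adds one to the cascade it belongs to."""
        self.cascades[rumor_id][cascade_index] += 1

    def summary(self, rumor_id):
        sizes = self.cascades[rumor_id]
        return {"cascades": len(sizes),
                "largest": max(sizes),
                "total_spread": sum(sizes)}

tracker = RumorTracker()
first = tracker.new_cascade("made-up-claim")  # one user tweets the claim
tracker.new_cascade("made-up-claim")          # another tweets it independently
for _ in range(3):                            # three retweets of the first tweet
    tracker.retweet("made-up-claim", first)
print(tracker.summary("made-up-claim"))
# {'cascades': 2, 'largest': 4, 'total_spread': 5}
```

This matches the examples in the paragraph above: two independent tweets of the same rumor give two cascades, and retweets inflate only the cascade they belong to.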
By mining Twitter data from 2006 to 2017, the MIT team found around 126,000 cascades spread by some 3 million people more than 4.5 million times. They then fact checked them with reliable sources (snopes.com, politifact.com, factcheck.org, truthorfiction.com, hoax-slayer.com, and urbanlegends.about.com) and calculated how big each rumor got—and how quickly.
According to their analysis, false information was 70 percent more likely to get an initial retweet. On the whole, they report, “falsehood diffused significantly farther, faster, deeper and more broadly than the truth in all categories of information.” But the effects were strongest for false political news, with the total number of false rumors peaking at the end of both 2013 and 2015 and again at the end of 2016, corresponding to the last U.S. presidential election.
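A back-of-the-envelope way to read a figure like "70 percent more likely" is as an odds ratio comparing false and true rumors. The counts below are invented purely for illustration; they are not numbers from the study:

```python
# Invented counts, for illustration only -- NOT figures from the study.
false_retweeted, false_total = 630, 1000  # 63% of false rumors got a retweet
true_retweeted, true_total = 500, 1000    # 50% of true rumors did

def odds(hits, total):
    """Odds = P(event) / P(no event)."""
    p = hits / total
    return p / (1 - p)

# "70 percent more likely" corresponds to an odds ratio of roughly 1.7
odds_ratio = odds(false_retweeted, false_total) / odds(true_retweeted, true_total)
print(round(odds_ratio, 2))  # 1.7
```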
And before you blame bots: the researchers used software to try to weed them out. At least based on the fake accounts they were able to catch, robo-tweeters had equal success in sharing true and false pieces of info; it was human users who saw runaway success when spreading fake news. Major influencers weren’t to blame either, apparently, because the MIT researchers saw falsehoods spread faster from Twitter users who followed and were followed by fewer people. After looking at the kinds of language used in sharing the most pervasive falsehoods, the authors concluded that the novelty and emotional charge of untrue rumors fueled their spread.
But this study doesn’t mean we’re all trapped in a hopeless cyclone of misinformation.
“I think that panicking about fake news on Twitter is the wrong reaction,” Duncan Watts, a specialist in social networks at Microsoft Research who wasn’t involved in the new study, told Nature. Other research, he argues, suggests that most people overwhelmingly consume real news in lieu of the fake stuff. Robinson Meyer at The Atlantic also rightfully points out that the study’s methods might inadvertently ignore non-contested news—stories so undeniably true they don’t get fact checked on websites like Snopes. But the findings are still disturbing.
“The key takeaway is really that content that arouses strong emotions spreads further, faster, more deeply, and more broadly on Twitter,” Rebekah Tromble, a professor of political science at Leiden University in the Netherlands, told The Atlantic. “This particular finding is consistent with research in a number of different areas, including psychology and communication studies. It’s also relatively intuitive.”
Indeed, the notion that we might be attracted to false information is far from surprising. The plain fact of the matter is that our simple brains mostly want to believe claims that fit our existing belief systems.
Our minds boast many a cognitive bias—ways we process information other than thinking about it rationally, probably as an evolutionary quirk meant to help us make snap decisions before getting, like, eaten or something—and they’re a real pain in the butt. We remember the past as being better than it was, we believe things more the more often they’re repeated, and we let new events color our recollection of old memories, among other things. But a lot of our love for lies comes from good old confirmation bias.
Most people wouldn’t share something they suspected to be untrue, so it’s probably safe to posit that a fake rumor is more likely to be successful if it makes someone say, “yeah, sure, makes sense to me!” Unfortunately, we have a hard-wired tendency to try to fit every new piece of information we learn into what we already know to be true. This can lead us to dismiss evidence of climate change as being obviously untrue, or encourage us to cherry-pick pieces of data that support the (debunked) notion that global warming is happening naturally, without human input. Our brains are desperate for evidence that what we already believe is true, which makes it oh-so-tempting to take for granted something that fits our favorite narrative. Admit it, it’s happened to you: even if you’re sure you’re not one of those people who spreads so-called fake news, it’s pretty dang likely you’ve accidentally shared something false at least once because something in your brain sang out “yessssss, I knew it.”
We also love finding patterns where there aren’t any, which allows all sorts of conspiracy theories to persist on Twitter and elsewhere. In short, our brains are often quite dumb.
The MIT researchers think that sheer surprise might also be a factor. Something made up is more likely to seem like it’s coming out of left field than something true (given how real-world events tend to unfold in a linear fashion, and all that) and that might make us more excited to spread the word. Of course, many would argue that our current news cycle is frequently surprising, so perhaps phonies won’t have an edge in this regard for much longer.
In any case, the only way forward—at least while the engineers behind our favorite social media platforms continue to fail to stop the spread of falsehoods—is to put that lazy brain of yours to work. We may be simple creatures with biases a-plenty, but pausing to think through the rationality of something (or, crazy thought here, checking other sources to see if it’s true) before hitting the retweet button can help elevate our powers of reasoning. Another recent study found that folks get worse at weeding out hoaxes when their newsfeed is too flooded, suggesting a little more time to think might make all the difference.
And eventually, maybe scientists and social media companies will figure out what else we can do about it. MIT’s team hopes to explore options in future studies. “Let’s not take it as our destiny,” study author Deb Roy told Reuters, “that we have entered into the post-truth world from which we will not emerge.”
Written By Rachel Feltman
0 notes