How an Ant Taught Me the Importance of Voting
“Elections belong to the people. It's their decision. If they decide to turn their back on the fire and burn their behinds, then they will just have to sit on their blisters.” – Abraham Lincoln (supposedly)
You've heard over and over again the words “your vote counts!” It's a cliché that quickly reaches the point of sounding sarcastic, especially because – if you live in America – it's provably false in many ways. However, just because your individual vote doesn't count doesn't mean you shouldn't cast it anyway. This may sound counter-intuitive (and it is), but it makes sense when you step away from the micro picture of one individual vote, and look at the macro picture of the state or the nation.
First, let's be perfectly clear: if you live in America, your vote – at least for president – doesn't count in most states. This is thanks to the winner-take-all aspect of the electoral college. To win all the electoral votes of a state (excluding Nebraska and Maine, which award two votes to the statewide winner and the rest by congressional district), a candidate needs only a plurality – one more vote than the runner-up. This means that unless your state was decided by literally one vote, you could have stayed home and the outcome would have been exactly the same. This is one reason why, if you support the candidate whom most of the people in your state don't support, even showing up on election day can feel like an exercise in futility. This does not, however, mean that your vote doesn't matter, or that you shouldn't cast it anyway.
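To see just how little the margin matters under winner-take-all, here's a minimal sketch in Python (the function name, candidates, and vote totals are all invented for illustration – this is not data from any real election):

```python
# Hypothetical winner-take-all elector allocation; all numbers are made up.
def allocate_electors(votes, electors):
    winner = max(votes, key=votes.get)  # the plurality winner takes every elector
    return {cand: (electors if cand == winner else 0) for cand in votes}

# A one-vote squeaker and a two-to-one landslide allocate identically:
print(allocate_electors({"A": 2_000_001, "B": 2_000_000}, 29))  # {'A': 29, 'B': 0}
print(allocate_electors({"A": 3_000_000, "B": 1_000_000}, 29))  # {'A': 29, 'B': 0}
```

Every vote for A beyond the one that breaks the tie – and every single vote for B – leaves the outcome untouched.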
Let's talk about ants for a moment. An ant is a stupid, mindless creature that makes no conscious decisions in its life, living only to serve the queen and colony. An ant *colony*, however, is a complex, multi-layered system that adapts to and interacts with the world in ways that seem almost impossible when you look at the individual parts. Let's focus on how ants communicate and select jobs. Ants (other than the queen) have four major jobs that must be filled: workers, soldiers, gatherers, and caretakers. These jobs have to exist in the proper proportions for the colony to survive and thrive. Ants communicate what role they fill by excreting identifiable chemicals, smelling each other as they meet, and (most importantly, for our purposes) keeping track of who they've met recently. If something happens to one section of the colony – say there's a large(ish) battle with a predator, or a hostile encounter with another colony, causing many soldier ants to die – the proportions of ant jobs fall out of balance. Ants aren't psychic, and despite what pop culture may have told you, ants don't have some kind of everlasting mental connection to the queen so she can talk to them all at once. So how do they put their roles back into balance?
Well, remember when I mentioned that they keep track of who they've met recently? The ants also know what proportions their jobs should be in, and when the ants they encounter are dramatically out of proportion to what they expect, they switch jobs to make up for the difference. Thus, without the colony as a whole having to be sentient, and without having to have a town hall meeting to assign new jobs, the colony can adapt and return to normal levels fairly quickly. To use another cliché, the whole is greater than the sum of its parts. Or, to use a more scientific description, this is a phenomenon called emergence – when larger effects come from the cooperative effort of smaller things. (For more on this phenomenon, and for my source for the ant example, check out the YouTube video that Kurzgesagt – In a Nutshell made on the topic.)
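Because that rebalancing rule is essentially a tiny algorithm, here's a toy sketch of it in Python. To be clear, this is my own illustration, not real myrmecology: the target proportions, the 10% switching threshold, and the function name are all invented for the example.

```python
# Toy model of the ant job-switching rule described above.
# TARGET proportions and the 0.1 threshold are invented for illustration.
TARGET = {"worker": 0.4, "soldier": 0.2, "gatherer": 0.25, "caretaker": 0.15}

def maybe_switch(my_role, recent_meetings):
    # What fraction of recently-met ants held each role?
    counts = {role: recent_meetings.count(role) / len(recent_meetings) for role in TARGET}
    # Find the role most under-represented relative to the target mix.
    neediest = max(TARGET, key=lambda role: TARGET[role] - counts[role])
    # Switch only if the shortfall is large and it isn't already my job.
    if neediest != my_role and TARGET[neediest] - counts[neediest] > 0.1:
        return neediest
    return my_role

# After a soldier die-off, this worker has met no soldiers at all...
print(maybe_switch("worker", ["worker"] * 8 + ["gatherer", "caretaker"]))  # -> soldier
```

No ant runs code, of course – the point is that each individual applies only a dumb local rule, and the colony-wide proportions come back into balance anyway.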
So how does that relate to voting? Well, in the ant example, the individual actions of any one ant are irrelevant. One ant changing its role won't put the proportions back into balance. But if none of them change, the colony will be in massive trouble. The same goes with voting. Your vote, as an individual part of the system, doesn't count. But you should do it anyway. One person refusing to vote doesn't throw the system out of balance, but if everyone were to make that choice, voter turnout would be zero. You are a precious, unique, special little snowflake, and your individuality is important and beautiful and makes the world go 'round. But you're also a cog in a machine. Acknowledge that, and play your part in the ant colony that is society. Accept that, and make the seemingly useless gesture of voting. Perform an exercise in futility, and watch as millions of people doing the same transforms the world around us.
Just because the whole is greater than the sum of its parts doesn't mean the parts don't matter.
P.S. If you haven’t registered to vote (or don’t know if you have), there’s still time! Click here to check, and click here to register online.
Every 9 Days
In February of this year, a lone gunman shot up Marjory Stoneman Douglas High School in Parkland, FL, killing 17 and injuring 14 others. 3 days later, one of the students voiced the wish that theirs be the last school shooting. 13 days later, a gunman killed 2 people (his mother and father) at Central Michigan University in Mount Pleasant, MI. 5 days later, Huffman High School in Birmingham, AL lost 1 student with 2 others injured at the hands of one of their students. 2 days after that, another student at Frederick Douglass High School in Lexington, KY accidentally wounded himself with a gun he had smuggled to school. 5 days later, again in Birmingham, AL, a disgruntled employee at the University of Alabama shot 2 of his coworkers (killing one) before killing himself. 6 days later, a high school student at Great Mills High School in Great Mills, MD killed his ex, wounded another student, and killed himself. We had a brief respite before exactly one month later, on the day of a national walkout about school shootings, a student was shot and injured at Forest High School in Ocala, FL. Nearly a month later, a student at Highland High School in Palmdale, CA fired a semiautomatic rifle and injured a fellow student. 6 days later, a student at Dixon High School in Dixon, IL shot a faculty member. The next day, 10 were killed and a further 10 injured at Santa Fe High School in Santa Fe, TX. That same day, at Mount Zion High School's graduation in Clayton County, GA, a gunman shot two people, killing one and injuring the other.
The Parkland shooting happened on February 14th, 2018. The Santa Fe and Clayton County shootings happened on May 18th, 2018. 
94 days. 
11 shootings. 
35 dead. 
33 injured.
And keep in mind, this is only what has happened on school campuses in that time. We can do better. We MUST do better. And shame on us all if we don't.
Every Common Anti-Birth-Control Argument Refuted
“The truth is that male religious leaders have had – and still have – an option to interpret holy teachings either to exalt or subjugate women. They have, for their own selfish ends, overwhelmingly chosen the latter.” – Jimmy Carter
With the Trump White House having repealed an Obamacare rule preventing employers from opting out of birth control coverage (under the guise of “religious freedom”), I think the time is ripe to discuss the true motives of the “religious freedom advocates” fighting against progress for women's rights. Setting aside the fact that many of the arguments against both abortion and birth control are religious ones (and we're a secular nation, no matter how much anyone tries to convince you that we're a Christian one), on the surface it simply makes no sense to be against birth control if you're against abortion, because birth control (surprise!) helps lower abortion rates. Besides that, the “moral” arguments against birth control are rather weak. I'll break them down one by one.
Contraception is inherently wrong.
“Contraception is unnatural.” I have yet to cover this topic in depth (I have an entire post planned), but I will at a later date. For now: “unnatural” is a null concept, and calling something unnatural doesn't mean we shouldn't do it. By the same logic, all of modern medicine is “unnatural”, as is literally almost every aspect of modern life. Unless you're Amish, if this is your argument, you're a hypocrite.
“Contraception is anti-life.” The phrasing of “anti-life” is a little harsh and smacks of Ad Hominem. If we translate this into more neutral terms, it basically says that contraception is against having babies. In a world that could easily be threatened with over-population very soon, that's not necessarily a bad thing. Besides which, having children when you can't afford them leads to a lower quality of life for both the parents and the child.
“Contraception is a form of abortion.” No. No it's not. Abortion removes an already-fertilized egg (or embryo or fetus, depending on when it occurs). Birth control prevents fertilization from occurring in the first place. There's a difference.
“Contraception separates sex from reproduction.” Ok. And? Sex is fun. If it wasn't, people wouldn't do it so much. Removing access to birth control isn't going to stop people from having sex for fun. It's just going to make recreational sex more likely to lead to unwanted pregnancy (and thus, abortion).
Contraception brings bad consequences.
“Contraception carries health risks.” So does pregnancy. So does alcohol, but you don't see people trying to ban it completely. So do most fun things, really. Risk is a part of life, and there are far more risky things that we don't even question whether or not to allow.
“The 'contraceptive culture' is dangerous.” Dangerous to whom? Contraception lowers the risk of disease and pregnancy, both of which can be dangerous. If what you mean is that easy access to contraception leads to people being more aggressive about pursuing sex, there's no evidence to suggest that. Humans want to bang. Get over it.
“Contraception prevents potential human beings being conceived.” Yep. That's the point. Of course, what I'm sure you actually mean is the classic “what if your potential kid cures cancer and you prevented them from being born” argument. To which I respond “what if the mother of an accidentally conceived child could have cured cancer and having the baby forces her to drop out of school?” That door swings both ways. Personally, I'm in favor of the already-living having more rights than those not yet even conceived.
“Contraception prevents people who might benefit humanity from being born.” See above.
“Contraception can be used as a eugenic tool.” So can selective breeding, which doesn't require contraception – just abstinence. Guns can be used as hammers, but that's not the primary use for them. Just because something can be used in a certain way doesn't mean we should ban it. Otherwise, cough syrup would be as restricted legally as meth.
“Contraception is often misused in mass population control programs in a racist way.” We don't have a mass population control program, and nobody rational is suggesting it in America. There are plenty of other racial issues to focus on – ones that actually exist in America. And yes, I do mean PLENTY. I really don't think this one is at the top of their list.
“Mass population control programs can be a form of cultural imperialism or a misuse of power.” See above.
“Contraception may lead to depopulation.” In a world with more than 7 billion people (a number rising rapidly), depopulation isn't really an issue we have to worry about. The real worry is what happens when we have a population we're incapable of supporting with the limited resources on our world (if we haven't made it to another resource-rich planet by then – and it seems unlikely).
Contraception leads to immoral behavior.
“Contraception makes it easier for people to have sex outside of marriage.” Ok. And? The only “moral” argument against this comes from religion, which should not be allowed to factor into the laws of our secular nation (see the First Amendment).
“Contraception leads to widespread sexual immorality.” You have no evidence of that. People are going to do what they want. Besides which, you haven't defined sexual immorality. There is exactly one clear-cut condition under which sex is immoral: when it occurs between parties who are not all consenting adults. Anything beyond that isn't inherently immoral – it's a personal moral code, and you have no right to hold anyone else to your standards.
“Contraception allows people (even married people) to have sex purely for enjoyment.” I'm sorry my response took so long – I was laughing my ass off. Contraception doesn't “allow” sex for enjoyment. People are going to do that regardless. It just helps protect against unwanted consequences.
The arguments against abortion pretty much boil down to one fundamental disagreement: at what point does life begin? At what point is removing the thing inside you murder instead of something more along the moral lines of clipping your toenails? The truth is, there's no good answer to this, because despite our desperate attempts to divide things into neat categories, that's not how most things in the world work. There IS no clear line. I personally lean in favor of a woman's right to choose whether or not to have an abortion, and draw the line for abortions (that aren't medically necessary) at 26 weeks, the point at which the fetus can feel pain. There are good people on both sides of this question, and the answer is one that really isn't clear. However, there is one caveat: those against even medically necessary abortions (where the life of either the fetus or the mother is at risk) have no non-religious leg to stand on. And once again, religion cannot be allowed to enter into our legal system.
However, there's more to it than just the surface arguments that they use. For religious fundamentalists, at heart, the issue isn't really whether birth control is risky, or whether abortion is murder. It's about the fact that in the book written by their sky lord, he said “no touching your naughty bits.” And as Jimmy Carter once said, it's about subjugating women.
I have a Patreon where you can give me money to do what I do anyway! Check it out here.
What Is Labor Day? And Why Do We Celebrate It?
“We must elevate the craft, protect its interests, advance wages, reduce the hours of labor, spread correct economic doctrines and cultivate a spirit of fraternity among the working people regardless of creed, color, nationality or politics. These principles are the foundation principles of our organization.” - Peter McGuire, the first Secretary of the AFL
Many people recognize Labor Day as the “unofficial end of summer” and not much more. It’s the last day where you can wear white pants and/or shoes, it’s one last day for a barbecue, a day off from work and school for Americans to throw a big end-of-summer bash. But have you ever wondered what Labor Day is, or why we celebrate it in the first place?
As Labor Day approached this year, I found myself not only struggling to remember the name of the holiday (I often used to mix it up with Memorial Day and Veterans Day, not having paid much specific attention to any of the three), but also wondering what it was created to celebrate. Labor, obviously, but what does that mean? I've never been big on holidays, so I decided to ask around to see if this was mirrored in my friends and family, and of the ~50 people I asked, only 2 could come up with a more specific answer than I did. Apparently, the problem is fairly widespread.
It's a real shame, too, because in this era of massive wealth inequality and stagnant wages, it's massively important that we know and understand this holiday. Labor Day was created for two main reasons: first, to honor the hard-working men and women who built this country and economy with their sweat and blood, and second, to honor the unions and workers who fought (and sometimes died) to secure rights for their children and grandchildren that they didn't have for themselves – things like child labor protections, a minimum wage, an 8-hour work day, a mandatory 30 minute lunch break, a 5-day work week, and so much more.
Labor Day has a fairly simple history; first celebrated in 1882 by the Central Labor Union in New York, it drew national attention as a celebration of the workers upon whose backs the country was built. Other cities soon followed suit, with Oregon being the first state to make Labor Day an official state holiday in 1887. By the time the federal government made it a national holiday in 1894, some 30 states already observed it.
Those who live outside the US might notice that, while around 80 countries worldwide celebrate Labor Day on May 1st (or May Day), the US celebrates this holiday on the first Monday in September each year. The 19th century saw somewhere in the neighborhood of 37,000 strikes in the US, and between 1870 and 1914, up to 800 American workers were killed during strikes. When Congress voted unanimously during the Pullman Strike of 1894 – following the deaths of 30 workers – to make Labor Day a national holiday, President Grover Cleveland supported the current placement for the holiday, fearing that placing it on May 1st would both encourage protests like the Haymarket Affair (during which a peaceful strike turned into a riot after an unknown person threw a bomb at police) and strengthen socialist movements in the US.
I would argue that in this day and age, it is imperative that we remember and recognize those who led us to where we are today, fighting fiercely not just for worker protections, but for the very right to organize at all – people like Cesar Chavez, Mother Jones, Peter McGuire, and A. Philip Randolph – only one of whom I had ever even heard of before researching this article.
Cesar Chavez, a migrant farm worker and US military veteran, helped found the National Farm Workers Association (now the United Farm Workers of America) and stood together with both home-grown and migrant farm workers to alleviate conditions that he once described as “just like being nailed to a cross.” As one of the founders of the NFWA, he helped organize several strikes and boycotts against California winemakers, struggling specifically against the Giumarra family from 1967-1969, eventually forcing not just Giumarra Vineyards to the table, but the other 28 major grape growers in California as well. By the time the three-day-long negotiation had ended, the growers had agreed to recognize the union, raise the grape pickers' pay, create a hiring hall, set up a joint labor-management committee to regulate pesticide use, and contribute to the farm workers' health and welfare plan. Over the next decade, however, Chavez's organization struggled to enforce the terms of the contracts, and eventually, he successfully convinced the California State Legislature and Governor Jerry Brown (yes, THAT Jerry Brown) to enact the California Agricultural Labor Relations Act in 1975.
Mary Harris "Mother" Jones, once labeled “the most dangerous woman in America” by a U.S. District Attorney, was a champion of mining unions and child labor laws, rising to prominence in the first two decades of the 20th century. She was a fiery orator, spurring men, women, and children alike to action. Perhaps one of her most notable techniques was staging parades with children holding signs that read “We Want to Go to School and Not to the Mines.” Over the course of her activist career, she was banished from more towns and held incommunicado in more jails than any other union leader of the time. She was especially concerned about child workers, and during the 1903 Philadelphia silk strike, she led a parade of 100 children all the way from Philadelphia to President Theodore Roosevelt’s Long Island home.
Peter J. McGuire was one of the most influential union organizers of the late 1800s. When he was 11, he was forced into working to support his family when his father went off to fight for the Union army. He attended classes at his local union, becoming quite active and, by the age of 21, was elected to the Cooper Union Committee of Public Safety, where he became their best-known public spokesperson and chief negotiator. Though he soon lost faith in the effectiveness of unions' reformist measures, moving into political circles to form the Democratic Socialist party (later the Socialist Labor party), he again returned to union leadership at the age of 25, when he helped organize St. Louis's carpenters union. Within two years, he had led the union so effectively and achieved so many gains that he attracted national attention, at which point he called for a national meeting of carpenters unions and was elected the head of the newly formed United Brotherhood of Carpenters. Later that year, he sent out a call for all labor unions to unite, which spurred the creation of the Federation of Organized Trades and Labor Unions, the precursor to the AFL-CIO, the largest federation of unions in America today. He also played a critical role in starting Labor Day itself, being one of those credited with suggesting the idea in 1882, and even organizing the original parade through New York City, as well as founding May Day, the international Labor Day.
A. Philip Randolph was not only a hero of labor, but also one of the fathers of the 1950s and 60s civil rights movement. He graduated from the Cookman Institute as valedictorian in 1907, having excelled in drama, public speaking, and literature. He was also an exemplary singer. He spent several years trying to make it as an actor in New York City, taking odd jobs and becoming involved in the Socialist Party, having been convinced by W.E.B. Du Bois' The Souls of Black Folk that the fight for social equality was of the utmost importance. In 1917, he and Chandler Owen were asked to take over a newspaper called the Hotel Messenger, which they renamed the Messenger. It quickly became known as "one of the most brilliantly edited magazines in the history of American Negro journalism." In 1925, he was asked to head the Brotherhood of Sleeping Car Porters, which he did for over a decade. In 1940, when President Franklin D. Roosevelt was refusing to sign an executive order banning racial discrimination in the defense industry, Randolph issued a call for 10,000 "loyal Negro American citizens" to march on Washington DC, which, thanks to Randolph's reputation, quickly grew to a call for 100,000 men instead. Pressure was so great that FDR signed the executive order guaranteeing racial protection in the defense industry, one of the first major strides forward in the fight for racial labor justice. Six years later, he issued a call to President Truman that the military itself be integrated. Pressure was so great that within months, Truman called for the integration of the military "as quickly as possible." Randolph was so well-known and respected in the civil rights movement that he was named chair of the 1963 March on Washington, where Martin Luther King Jr. gave his famous "I have a dream" speech. He was elected vice president of the AFL-CIO in 1955, helped found the Negro American Labor Council, and served as its president from 1960-1966, during which time he earned the Presidential Medal of Freedom from President Lyndon Johnson.
I would like to be clear – before researching this article, I only knew the name of one of these amazing people, and I had no real idea why he was famous. I label this an utter failure of the US education system. These people are American heroes, and should be celebrated as such. This Labor Day, I invite you to join me in honoring them, all their allies through history, and all those who carry on their legacy today, in groups like the AFL-CIO and Fight for Fifteen. In this age of horrible wealth inequality, new strides in labor protections and rights are critical, and those who pursue those ends should be held up as the modern-day heroes they are.
Sources:
https://aflcio.org/about/history/labor-history-people
https://www.dol.gov/general/laborday/history
http://www.newsweek.com/labor-day-2017-labor-day-weekend-labor-day-sale-659114
I have a Patreon where you can give me money to do what I do anyway! Check it out here.
North Korea Is Not A Threat – Unless We Make It One
With US intelligence confirming that North Korea has miniaturized nuclear warheads that are capable of reaching Alaska and possibly other parts of the US, there has been an uproar in the mainstream media, with news personalities, commentators, and politicians all panicking that an attack is imminent. Lindsey Graham was in the news a week ago saying that if we fail to bring war to North Korea, war will be fought on US soil. From the political climate, you would think nuclear winter was inevitable, and soon! And it is – if we don't start being practical about the situation.
The fact of the matter is North Korea has zero interest in an offensive war with America. You wouldn't know it from his rhetoric, but Kim Jong Un is fully aware that, though he might be able to hit America with a nuke, it would be the last thing he ever did. North Korea may have nuclear weapons capable of hitting California or Alaska, but America has enough nuclear weapons to leave North Korea nothing but a smoking, radioactive crater. So to attack the United States in an offensive manner would be akin to committing suicide. And cruel, fascist dictator though Kim Jong Un may be, he's not THAT stupid. He doesn't want to attack the US. He wants insurance.
“Insurance,” I hear you ask? Yes, insurance. In President George W. Bush's 2002 State of the Union address, the now-infamous “Axis of Evil” label was coined, with North Korea as one of the members, clearly marking it as an enemy state of the US. How has America treated enemy states in the last couple of decades, especially when they have (or are rumored to have) WMDs? If you recall, that was one of the original reasons we went into Iraq and toppled Saddam Hussein, despite the weapons not actually existing. The world saw that when a relatively weak enemy of ours was thought to have WMDs, we got rid of him. And as Qaddafi showed in 2011, even disarming won't stop America from taking out enemy states that have pursued nuclear weapons. The world saw, and the lesson was evident: there's no negotiating with these warmongers.
So far, two things are clear; North Korea has nuclear missiles, and there's little chance they'll give them up. So why are they not a threat? Well, once again, they would have to be ridiculously stupid and self-destructive to preemptively launch against the US or one of its allies. Given that the leaders of North Korea have a strong instinct for self-preservation, this seems highly unlikely. There's just one problem.
The United States can’t stop poking North Korea with a stick. Time and time again, leaders and newspeople of the nation with the most military power that the world has ever seen call out North Korea's pursuit of a bomb, openly musing about invading. And that's dangerous. If we actually were to invade, not only would South Korea almost certainly be attacked, but we might even spark open war with China, or at least a trade war. Given all this, and the fact that the US has been engaged in war continuously since October 2001 (nearly 16 years), it's no surprise that the US is considered the greatest threat to world peace.
While it is absolutely true that North Korea is a humanitarian crisis and basically a nation-wide hostage situation, it's also true that going to war or taking out the people in charge won't fix that. The people of North Korea are raised on a steady diet of propaganda, primarily claiming that the Kim family are essentially demigods. Nothing we can do militarily will win the hearts and minds of the people, which is the ultimate solution to the problem of North Korea. We need to stop discussing going to war with North Korea and stop worrying about a threat that isn't real unless we make it so.
War will lead only to death on all sides.
I have a Patreon where you can give me money to do what I do anyway! Check it out here.
The Political Difference Between Fighting for a Team and Fighting for Ideas
“If a political party does not have its foundation in the determination to advance a cause that is right and that is moral, then it is not a political party; it is merely a conspiracy to seize power.” - President Dwight D Eisenhower
Following the failed presidential campaign of Hillary Clinton, the Democratic party has been in disarray, with the two major wings of the party being the corporate Democrats (the Neo-Liberals) and the progressive base. The progressive base, led by groups like Our Revolution, Justice Democrats, and National Nurses United, among others, has been arguing that the failure of the Clinton campaign was one of ideology and message, not one of circumstance. The corporate Democrats seem dead set against introspection, choosing instead to point outward and place blame on everyone from James Comey to Vladimir Putin, to even Bernie Sanders.
In recent weeks, the progressive base has been calling for a litmus test for anyone running for office as a Democrat, saying that, among other things, candidates must be for getting money out of politics, a living wage, single-payer healthcare, ending the wars, free college, abortion rights, LGBTQ protections, and racial and gender equality. In short, the progressive base has laid out a very basic ideological platform that they want the party to represent. However, the corporate Democrats have answered this proposal with scorn. The answer, they claim, is to run further to the right and compromise more with the Republican party in the hopes of earning votes from moderate Republican voters (a strategy that has so far cost the Democrats over 1,000 seats in the last 10 years).
The corporate Democrats answer that the way to beat a Republican in a red state is to be a Republican in all but name. If the people have elected a right-wing politician, clearly they have a right-wing ideology, and the secret to success in those communities must be to run right-wing candidates with a D by their name, right? Well, we've tried that for 10 years now, and the Democrats have lost the Presidency and majorities in the Senate, the House of Representatives, governorships, and even most state legislatures. The strategy of running further and further to the right has wiped out Democrats at every level of elected government. However, when you look at the polling data, it's not hard to see why.
Bernie Sanders's proposals – the ones that the progressive base has set as its proposed litmus test – poll very well, even among self-described conservative Republican voters. At a recent town hall in West Virginia, he inspired cheers and applause from a room full of Trump voters in one of the most Republican states in the country. The public knows what will improve their lives, and can tell you clearly what they wish lawmakers would do.
When asked about the possibility of a litmus test recently, Rep. Ben Ray Luján, the chairman of the Democratic party’s campaign arm in the House, was absolutely clear that there would be “[no] litmus test for Democratic candidates,” continuing “As we look at candidates across the country, you need to make sure you have candidates that fit the district, that can win in these districts across America.” But what would that really get the Democrats, even if it was a winning strategy, despite all evidence to the contrary?
If all you care about is having people with the title of “Democrat” in office around the country, having no litmus test might seem like a good idea (it's not). The question is what that would actually accomplish. With “Democrats” like Joe Manchin and the rest of the “Blue Dog Democrats” in office, would progressive goals actually be achieved? If you will recall, when President Obama first took office, he had a supermajority in the House and Senate, even passing the Affordable Care Act (Obamacare) without a single Republican vote cast in his favor. So if they had to pass healthcare without any support from across the aisle anyway, why did they go with Richard Nixon's plan from the early 1970s? Obama originally pushed for single-payer, then a public option, but both plans were rejected, not just by Republicans, but by so-called Democrats like Joe Manchin. If you refuse to consider the plan that has been the stated objective of the party for over 40 years, the only thing the Democrats gain from having you in office is the illusion of winning the “game” of politics. If you stand for nothing, what is the point of winning, other than power and money?
If governing is supposed to be for the benefit of the governed, then it seems only reasonable that your stated ideology should be the one you think will best help the people. If you allow candidates who disagree with you on every point to say they're on your team, you toss ideology out the window and cannot defend the idea that you are governing with the people's best interests in mind. Further, if you believe that your ideas cannot win across the country, then you are either implying that people are too stupid to know what's good for them, or admitting that your ideas are weak. If you truly believe your plan will help the people, then all that remains is to sell people on your ideology and convince them you're right. 
And if you don't think you're right, what the hell are you doing in office?
I have a Patreon where you can give me money to do what I do anyway! Check it out here.
Comparing Apples and Oranges: An Introduction to Logical Fallacies
If you've ever argued on the internet (and let's face it – who hasn't?) then you've probably run across someone who definitely has an opinion, but can't seem to back it up with facts. I've often found that the convoluted logic they use sticks out like a sore thumb in my mind – I can tell their reasoning isn't quite sound, but I can't put my finger on why. If you've ever encountered that suspicion that someone's logic isn't all that logical, chances are, they're using a logical fallacy.
Logical arguments fall into two basic categories: valid ones and fallacious ones. A valid argument, like a classic syllogism, is one where the logic lines up (A=B, B=C, therefore A=C); a fallacious argument contains an error in reasoning that makes the argument – though not necessarily the conclusion – incorrect. That's an important distinction to make, by the way: just because someone's argument includes a fallacy doesn't mean the conclusion they reach is wrong by definition (assuming so is called the Fallacy Fallacy, which I'll discuss later).
The internet is rife with logical fallacies and the people who use them. There's no getting rid of them completely, but you can be on guard to make sure you yourself aren't among them. As a service to logical thought everywhere, I figured I'd make this handy guide to the common kinds of logical fallacy. Special thanks to yourlogicalfallacyis.com for the complete list of names (they aren't sponsoring me – I just happen to like their mission). If you want to get a Logical Fallacy Poster, click here.
Ad Hominem: “I like apples more than oranges.” “Yeah, most midgets do!” Ad Hominem is when, instead of providing evidence, the person simply insults whoever they're arguing with. It's usually a sign that they have nothing left to argue with.
Ambiguity: “I like looking at fruits.” “Here's a picture of an orange!” “That's... not what I meant.” The Ambiguity Fallacy occurs when some word or phrase used in an argument can be taken to have multiple meanings. The most common example of this is a sign reading “Fine for parking here.”
Anecdotal: “You'll totally like this apple! I know a guy who tried an apple once after saying he wouldn't like it, and he liked it, so you'll like it too!” Anecdotal evidence is not actually evidence, no matter how often those two words are put together. In order for something to be absolutely true, it must be true every time, not just “this one time.”
Appeal to Authority: “I heard Shaq likes orange juice, so oranges must be good!” Just because someone you admire has an opinion on something doesn't make it a fact. Our heroes can be wrong, too.
Appeal to Emotion: “Why won't you eat the apple? A poor immigrant worker worked hard to pick it for you!” Logic and emotion are not the same thing. Take the Trolley Problem, for example. While emotion leaves us running ourselves in circles about the right thing to do, logic (based on the assumption that people dying is bad) dictates that the lever should be thrown so that the five people are spared, and the one is killed. Whether logic should be the only factor in moral choices is a completely different question.
Appeal to Nature: “Oranges grow naturally, so they must be good for you.” Natural does not necessarily mean good. Cyanide is perfectly natural, but in even moderate doses, it's deadly. Targeted antibiotics are manufactured, but they are effective medicine.
Bandwagon: “All my friends prefer apples to oranges, so I guess I should eat apples too.” Just because “everyone” says something is true does not mean it is. During the Black Death, it was conventional wisdom that smoking purified the air so that you wouldn't get sick. Whether you like them or not, cigarettes are by no stretch of the imagination an air purifier.
Begging the Question: “Oranges are the best kind of fruit. They're my favorite, and naturally, I would only choose the best one as my favorite.” Also known as a circular argument, this particular fallacy will leave you dizzy as you try to figure out which came first, the chicken or the egg. A good way to pick this one out is if the argument references itself as evidence.
Black or White: “You either like apples or oranges – you can't enjoy both!” It's a sad truth, but a truth nonetheless that, while our brains have developed to see things in black and white, most things are shades of gray. Most things that don't fall under the umbrella of scientific or mathematical fact lie on a spectrum. You can, in fact, enjoy apples and oranges! And even one slightly more than the other without disliking either!
Burden of Proof: “More people eat apples than oranges.” “Actually, that's not a known statistic.” “Well, you can't prove me wrong, so I must be right!” If you make a claim, the burden of providing evidence to support that claim is on you. In a fair justice system, the defendant is “innocent until proven guilty” because the claim being made is that the defendant committed a crime, a claim made by the prosecution. Therefore, it is on the prosecution to provide the evidence, and if there is reasonable doubt, the default judgment is innocence.
Composition/Division: “Apple seeds are disgusting. Therefore, apples themselves are disgusting.” Just because a statement is true of a segment of a group, or a piece of a whole, does not mean that the statement is necessarily true of the entire thing or group (and vice versa). A common use of this fallacy is “guilt by association”, when an individual does something bad and critics of a group that individual belongs to color the whole group with the actions of that single person.
False Cause: “I started eating an orange just before sunset, and by the time I was done, it was dark. Therefore, eating the orange made the sun set.” Correlation does not equal causation. It's pretty easy, when combined with the Texas Sharpshooter fallacy, to show a link between any two trends you choose. Showing a causal link is more difficult, because you have to show exactly how one affects the other. The most famous example of the False Cause fallacy is the “vaccines cause autism” argument. A doctor who has since been stripped of his medical license did a study that showed a correlation between a rise in autism diagnoses and early childhood vaccinations. What he failed to mention is that signs of autism rarely surface noticeably before the age of 3-4 years, when children are going through a set of vaccinations. His published study decided correlation was causation, resulting in thousands, if not millions, of preventable illnesses and deaths as parents used the false study as a reason not to vaccinate. Logical fallacies can have serious consequences when the decisions matter.
Genetic: “Ugh! You like oranges? Don't you know they're from Florida? Everything from Florida is terrible!” Other names for this fallacy include Fallacy of Origins, Fallacy of Virtue, and racism. I'm kidding, of course – racism isn't another name for the Genetic fallacy – it's simply the most devastating example. Any time you make judgments about an individual based on traits like race, gender, sexual orientation, or even just height or eye color, you're committing a logical fallacy – and being pretty damn offensive about it most of the time. However, this fallacy can be applied to things as well as people.
Loaded Question: “Don't people who eat apples look stupid?”  Most commonly seen in misleading polling, a loaded question is really more of a statement than an inquiry. When a conclusion is implied in the question, the answer is often different than if the question was phrased neutrally.
Middle Ground: “Apples are invisible!” “No, they're definitely solid.” “Guys, don't argue. Maybe they're transparent?” Just because the truth is not always Black and White (see above) doesn't mean the truth is 50/50. Sometimes, one person is right, and the other is wrong. Making a compromise might help settle a disagreement in opinion, but a fact is rarely a compromise between two extremes. If one person says the sky is red, and one person says the sky is yellow, you're not helping if you suggest it might be orange. Try looking outside.
No True Scotsman: “I'm American, and I'm not a fan of apples.” “No true American doesn't love apples!” Also known as the Appeal to Purity fallacy, this particular misleading argument is often applied to back up absolutes. This one shows up a lot on the internet as some form of “a true fan would know [trivia]” or “if you don't know [trivia], you're not a real fan.” In a much more sensitive arena, this fallacy is also used to separate extremists from a group they claim to belong to. For instance, claiming that ISIS is not truly Muslim, or that the KKK and the Westboro Baptist Church are not real Christians. This argument is often used to protect against another thing on this list, the Composition/Division, or “guilt by association” fallacy.
Personal Incredulity: “So you're telling me that if you put an apple in the ground, it turns into a tree? It looks nothing like a tree.” Just because an idea seems impossible doesn't mean it is. The world is round, despite the fact that when you sail on the ocean, you can see that the horizon looks like an edge.
Slippery Slope: “So you like oranges? Do you know what would happen if everyone started eating oranges? Florida would run out of oranges and then everyone would get scurvy. Is that really a road you want to go down?” A favorite of politicians and activists alike, the Slippery Slope fallacy (often confused with the Strawman fallacy) takes a single step to its illogical conclusion. To create a Slippery Slope, simply assume that when your opposition subscribes to a single facet of an idea or argument, they subscribe to the most extreme aspects of that idea or argument. An excellent example of this was when, during the argument over gay marriage, critics claimed that allowing marriage between two men or two women would remove all restrictions on who (or what) could get legally married, with Representative Steve King famously saying that a lawyer told him that the court's decision meant that he could marry his lawnmower if he so chose. And more power to him, if that's what he wants.
Special Pleading: “I can peel an apple cleanly with my fingernails in one piece.” “Show me.” “I can't do it if anyone is watching. You'll just have to trust me.” Special Pleading asks for an exemption based on circumstance. A good example of this is when a law is bent to allow for a “special case” to pass through without punishment, whether by the judge, jury, or even the prosecution. A common phrase you'll hear for this one is “he's a good kid who made a mistake.”
Strawman: “Oh, you like apples? So you're saying apples are the best thing ever? Everyone should just eat apples 24/7, is that what you're saying? What about people who don't like apples? You're a monster.” To create a strawman, take a balanced and well-reasoned opinion, make it an absurd absolute, and mock it fiercely. It helps if you take it out of context as well. This one is a favorite more of people discussing politics on the internet than of the politicians themselves.
The Fallacy Fallacy: “I heard Shaq likes orange juice, and I figured it must be good.” “That's not a good enough reason to assume it's good.” “But I tried an orange, and it actually was good!” As I mentioned earlier, just because someone is incorrect in their logic doesn't mean their conclusion is inaccurate as well. Even a stopped clock is right twice a day. Unless it's a digital clock, in which case it's probably just blank. Hey! Stop picking apart my metaphors!
The Gambler's Fallacy: “I go to the store every day and have a random employee bring me an apple. I've been getting a lot of granny smith apples recently, so I must be due for a Fuji pretty soon.” A favorite among gamblers, as the name suggests, this misinterpretation of probability attempts to link independent chance events to each other as dependent. Flip a coin a thousand times and you'll likely end up with about 500 heads and 500 tails, but the first flip has the same 50/50 chance of heads or tails as the thousandth. Just because you're at 500 heads and 499 tails doesn't mean you're more likely to get tails on your next flip. (There's a short simulation of this just after the list.)
The Texas Sharpshooter: “The ten oldest people on earth prefer oranges to apples, so eating oranges makes you live longer than eating apples.” Cherry-pick your data, and you can “prove” pretty much anything you want. When examining data sets, it's important to look at all the data as a whole, not just the pieces that prove what you think is right.
Tu Quoque: “Apples are yummy.” “Oh yeah? If apples are so freaking awesome, why aren't you eating one as we speak?” Also referred to as the appeal to hypocrisy, Tu Quoque is best represented by the phrase “You talk the talk, but you don't walk the walk.” Just because someone isn't taking their own advice doesn't mean their advice is any less valid. Ask any smoker who wishes they had never started.
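As promised in the Gambler's Fallacy entry, here's a quick sanity-check simulation in Python (the streak length, flip count, and seed are arbitrary choices of mine, and a simulation is evidence, not proof):

```python
import random

# Does a streak of five heads make tails any more likely on the next flip?
random.seed(42)
flips = [random.random() < 0.5 for _ in range(1_000_000)]  # True = heads

next_after_streak = []
for i in range(5, len(flips)):
    if all(flips[i - 5:i]):              # five heads in a row just happened
        next_after_streak.append(flips[i])

rate = sum(next_after_streak) / len(next_after_streak)
print(f"P(heads after 5 heads) = {rate:.3f}")  # hovers right around 0.500
```

The streak changes nothing: each flip is independent, so the “due for tails” feeling is pure fallacy.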
This is far from a comprehensive list, but these are some of the more common logical fallacies we encounter in discussion. Steer clear of the logical fallacy, and the world is at your command! Well, actually, it's likely nobody will care either way. But you can at least have the satisfaction of making a logically consistent argument!
I have a Patreon where you can give me money to do what I do anyway! Check it out here.
Your Body is a Mouse and Keyboard: How to Turn a Bad Mood into a Good One
“I regard the brain as a computer which will stop working when its components fail.” – Stephen Hawking
A quick disclaimer before I begin: This refers to bad moods, not chronic depression. Chronic depression is a serious mental health issue and should be treated as such. If you suffer from depression, I highly recommend seeking professional help, as you would with any physical health condition.
Anyone who has had computer troubles knows that sometimes, things go wrong in the software and need troubleshooting. Maybe your screen froze, or something is taking longer than it should, or maybe some program isn't opening when it should. What do we do when something goes wrong in our computers? We either troubleshoot them ourselves, or bring them to professionals to do it for us. But whoever is fixing the computer has to have a way to interact with it. Whether this is the standard desktop that most of us are used to, the home screen on our phones, or even textual interfaces for more intricate programming, how we interact with the computer is called the User Interface, or UI. Generally, with a desktop computer, our interaction with the UI is via input devices we call a mouse and keyboard. Without some way to input commands into the computer, it would just sit there, changing nothing in how it runs.
So when something goes wrong with the computer that sits between our ears, it seems logical that we would need some kind of input device. Good news! You aren't limited to two when it comes to your brain. You have five. That's right; the senses are the input devices for the computer that is our brain.
I often run across these articles that advise people to “just think happy thoughts” to make everything better. I'm sorry, I have a question. Aren't thoughts coming from inside the brain? Do we expect a computer to troubleshoot itself and fix the problems with no input from outside? Of course not! So when something goes wrong with our thoughts (a bad mood), how can we expect to fix it with thoughts? Put that way, it sounds as ridiculous as it is.
Sometimes, admittedly, it does work. But every person who has been through a period of negativity has had some version or another of the “just cheer up” speech, and we all end up thinking some version of the same thing: “Congratulations! You fixed me! I just needed to cheer up manually!” laden, of course, with a heavy dose of sarcasm. While trying to think positive can work sometimes, it definitely doesn't work all the time, and getting the advice to just cheer up can actually make it more difficult.
What's the answer? Why, the trusty old input devices, of course! Whenever I'm in a funk and I need to get out of it, I have a 5-step checklist that I go through, each step of which stimulates one of my senses.
In no particular order, I:
Find a song I like to listen to. Whether it's a get-up-and-go type song (Fight Song, Roar, Happy, Shake It Off, etc) or a song that really empathizes with my bad mood (The Sound of Silence, Say Something, Fix You, Mad World, etc), having something to listen to can break the cycle of miserable thoughts.
Find something funny or happy to look at. Surfing memes on the internet, watching a TV show I like, or even just finding cool pieces of art. I find having something visual to focus on can refocus my mind onto happy things.
Eat something yummy. We all know the stereotypical weeping girl sitting and eating her feelings away with ice cream. And yes, it's not exactly the healthiest response when it comes to your diet, but A, it actually helps your mood, and B, it doesn't have to be ice cream. Try to have something around that's not too unhealthy, and treat yourself. Besides breaking the thought spiral a bit, your brain will actually release neurotransmitters associated with joy. So go ahead! Have a cookie! Or a lollipop! Or even just a minty or fruity piece of gum!
Smell something good. Whether it's a scented candle, a spice from your spice rack, some flowers outside, or even if you just like the smell of your shampoo or soap, find some scent your brain will enjoy, and take a couple of calming breaths.
Touch yourself. Hey! Get your mind out of the gutter! Rub your arms, take a shower, or even ask someone for a hug or cuddle with a pet. Find some feeling you find comforting, like a familiar sweater or fuzzy socks, and embrace your sense of touch. I'll be writing an entire post about this eventually, but the sense of touch is one of the most underrated and least stimulated senses in western society. Our sense of touch is the first to fully develop, and as infants, it's the most important one for comfort. When a baby cries, one of the first instincts her mother has is to pick her up and hold her close. What makes you think that kind of comfort won't work on an adult?
That's my generic 5-step checklist, but most importantly, if you can, remove whatever is causing the bad mood from your general vicinity. Whether that means logging off of Facebook, walking outside, or getting out of a bad social gathering, one of the most effective ways to stop a bad mood is to get rid of the thing that caused it. I know that's not always possible, but if you can, do it.
Again, if you have chronic depression, this may help some, but I definitely recommend seeking professional help. Psychologists and doctors have far better tools to help you as an individual than I do, and sometimes even just having someone unconnected to you hear you out can help a bit.
May you have happy thoughts!
I have a Patreon where you can give me money to do what I do anyway! Check it out here: https://www.patreon.com/SomeRandomBitsandPieces
A Dramatic Shift in Power
“Until they become conscious they will never rebel, and until after they have rebelled they cannot become conscious.” ― George Orwell, 1984
I've been looking at the overall trend of history's patterns, and there seems to be a common direction that all societies have moved in. Since the dawn of civilization, society has been organized in a pyramidal structure. In small tribes, this wasn't a problem; in fact, this was a benefit to society as a whole. And, when proper feedback systems are in place, it continues to be a benefit to society.
In society's early stages, even the most mundane tasks required human brains, and there was no way to specialize, because every task needed to be done by every person. As we gained organization, we gained the ability to specialize more and more. The basics of this kind of job diversity go back to simple hunters and gatherers, who needed different skill sets to do their jobs. As we grew, the specializations grew more specific, and we gained the ability to harness other brains: dogs for hunting and guarding, horses for travel, etc. The "brains" became more and more specific as we started actually making complex tools like looms, steam engines, and more, culminating (so far) in the computer – a process that is by no means finished.
But as our ability to make brains for ever more specific jobs increased, we necessarily needed more of them to do more jobs. A pyramid is a logical structure to organize the ever-increasing brain pool and harness them all for a single purpose – supposedly the betterment of the human condition both in general and specifically. The problem is that “supposedly” modifier; the major difference between a pyramidal society and an actual pyramid is that the ever-smaller number of (organizationally) top brains have a disproportionate amount of power to influence and drive society. Ideally, they drive it for the best interest of the greatest number. In practice, they often use that ability to focus and control society for their own ends. Traditionally, the way they've almost always done this is by having access to more information than those they are directing. Superior knowledge can create the illusion of a superior intelligence. The old saying that knowledge is power is scarily accurate.
However, two things are slowly shifting, both of which are unprecedented in all known history, and both of which have been in the works for a century or two. One is good, one is bad. I'll begin with the bad.
The worst thing that those at the top of their respective pyramids have historically been able to do to those below them is kill individuals, either directly or by sending them into battle. They could cause suffering, but for the most part, only among those they directly controlled. Serving their own interests could do nothing worse economically than impoverish their own subjects. Harming the environment was always against their own interests, because their own wealth and power was based in the output of their farmers' crops. In an agrarian society, to harm the environment was to harm society, and it was seen as such (hence salting your enemy's lands being seen as the ultimate victory from which he could not return). The unprecedented thing happening with this is that, along with the simple ability to harm the environment growing vastly since the industrial revolution began, western culture is gradually becoming less instinctively connected to the land, allowing a disconnect to grow between the land and survival. It also happens that, because fossil fuels are profitable, for the first time in history, it is in the short-term interests of those at the top to harm the environment. With no mitigating forces, this spells the end – certainly of society as we know it, but also possibly of humanity itself.
Now for the good:
As I said before, the way that power was almost always held was by restricting access to information. The disparity of knowledge between the average peon and one of the ruling elite was stark, and provided a good justification for such a strongly slanted power dynamic; it takes a certain amount of knowledge and experience to rule effectively. No peasant was educated more than was necessary to function in their role in society, but the ruling class received the best education available at the time. This was, obviously, in the interest of the ruling class; ignorance is bliss, and the less the peasants knew, the less they could complain about. Books were difficult to make, being hand-written, and therefore necessarily expensive, so reading was a skill that the poor did not learn, not only because the rich didn't want them to, but also because it wasn't practical. But then, as we started to make brains more and more complex and specific, two things happened.
First, the printing press shattered the notion that books were something only the rich could afford to own, and second, the jobs that had once been done by the peasants were increasingly being done by brains made for the purpose. The common man began to shift from being the brain in charge of doing the actual labor to being in charge of operating the brains that did the labor. This, to the discomfort of the ruling class, required educating peasants to the skill level needed to work the machines. In the early days, of course, this wasn't much of a change, requiring only specific instruction on the individual task. But as machinery became more and more complex, the skill necessary to operate it increased. Just as a chicken cannot return to the egg, progress moves inexorably forward.
The common man is not actually dumber than the ruling class (no comment on non-relative intelligence) – the disconnect has always been in education. We reached a tipping point where stopping the education of the common man would have caused riots, especially because people had become reliant upon the goods created by these ever more complex machines. Mass education broke the information monopoly that had always underpinned the power of those at the top. The unprecedented thing happening here is that people are beginning to be able to coherently resist the influence of the powerful.
Revolutions in the past never really changed the lives of the general populace substantially, because whoever ended up on top would inevitably return to steering the pyramid however they wished for their own gain. A revolution might make a temporary difference, but a couple of generations into power (or just right away), the new rulers would forget or abandon the promises that put them there. The novel 1984 demonstrated this perfectly. But Orwell could not predict the internet. Leftist populism – like the strain found in Sweden, Denmark, and the rest of Scandinavia, and yes, like what Bernie Sanders proposed – is gaining momentum. People are becoming educated enough to truly understand what will lead to the greatest good of the greatest number. And because we are many and they are few, and because our political system has a mechanism for revolution without violence, the powerful cannot stop us from putting people who will actually serve us – rather than themselves – at the heads of the pyramids.
But here's the scary part. We can do it eventually; only two questions remain. First, do we have the will to do it? Can we actually follow through? I think yes, but I cannot know for certain. And second (and this is the part that concerns me most), can we do it in time to save ourselves and our habitat? Can we put the right people in charge across the board in time to reverse the damage that will have been done by the time the shift happens? I hope so, but again, I cannot know for certain.
What do you think?
I have a patreon where you can give me money to do what I do anyway! Check it out here.
theleftexplained · 7 years
Text
Hopes High, Expectations Low
“Pessimist by policy, optimist by temperament — it is possible to be both. How? By never taking an unnecessary chance and by minimizing risks you can't avoid. This permits you to play out the game happily, untroubled by the certainty of the outcome.” – Robert Heinlein, Time Enough for Love
Optimism is a wonderful thing. It helps you to stay positive when negativity crashes over your head. “Maybe it will get better,” the optimist tells herself. “Maybe this time will break the cycle.” And hope, which is the optimist's key guiding principle, is important. It keeps us going when things seem pointless, or when we feel like we can't go on. It shines like a light in the darkness, keeping the demons at bay. But hope has a flaw – it can leave you unprepared.
Ask a pessimist why they seem so negative, and they'll tell you – they aren't negative, they're realistic. In some cases, of course, this is true. But a good chunk of the time (and I'm sure a certain Gloomy Gus pops into your head here), it's not realism, it's just dumping on the happiness of the people around them. “That will never work,” the pessimist gripes. “Things will never change.” Pessimism, when it's just griping and negativity, rarely, if ever, helps anything. But realism, which often looks similar, is extraordinarily useful. By taking a good, hard look at the things that can go wrong and cause something to fail, you can prepare for contingencies. Realism helps you be ready for the challenges you'll face in the hard times. But realism has a flaw – it can be a real drag if you do it all the time, and you can even think yourself into failure. One piece of wisdom I learned from a master marksman: if you worry you'll miss a shot, you often will.
The good news is that hope and realism each cover for the other's flaw. It can be difficult to use them together, but done correctly, they strike the best compromise between maximum happiness and minimum risk. A good example of how I use this policy is my car. I rarely (if ever) expect anything to go wrong with my car when I need it. But I've hit snags there, because my blind optimism didn't prepare me for the things that inevitably do go wrong in rare instances. So, as a compromise, I keep jumper cables, a hand jack, and a spare tire in my trunk, then forget about them. I also keep a backup of every fluid I might run out of if something starts to leak (oil, coolant, wiper fluid, etc.). That way, I can live with the same baseline hope that my car will run perfectly, while having the necessities at hand when (as I expect) things occasionally go wrong. Hopes high, expectations low.
This stretches to everything in life. When I go in for an interview, I sometimes spend hours in the days beforehand going over the questions I might be asked and making sure I know my answers, ensuring I have all the paperwork I might need, thinking up questions to ask the interviewer, researching the company, and doing anything else I might possibly need to do to prepare. I make sure I have time to get a good night's sleep the night before, then I go in with my head held high, certain that I've done everything I possibly can, focusing on my hope that it will earn me the job. Again, hopes high, expectations low.
In general, the way I use this compromise is that I prepare for what I can, then forget about it and enjoy the experience of living. Life is no fun if you're always worrying about what might go wrong, and it's even less fun when you haven't thought through how you'll respond when, inevitably, something goes kablooie.
And it will. But don't worry! Just prepare.
theleftexplained · 7 years
Text
Starstruck
“...the assumption being that the common man was the equal of anyone and better than most...” - Robert Heinlein, Stranger in a Strange Land
I'm a rather egalitarian fellow. When I look someone in the eyes, I do so as an equal. I don't consider anyone my better, nor do I consider myself better than anyone else. And yet, there's the interesting case of the celebrity. Society seems to worship them; they achieve an elevated status, and we certainly keep a close eye on them. Many people, when meeting their idols, are so overcome by this conceptualized separation, this imaginary podium we place our idols on, that they can barely manage to speak. I see my friends obsess over their celebrity crushes, imagining scenarios where they might meet, inventing a fantasy life where they're actually dating, and even getting alarmingly jealous when their crush starts dating someone in the real world. I find this odd for two reasons. First, your idol doesn't even know you. And second, how do you fall in love with someone you place above you?
I've long harbored a small crush on Taylor Swift. I find her attractive, I think her voice is wonderful, and she genuinely comes across as a decent person. However, I realized a few years ago that I have absolutely no specific interest in meeting her. I don't mean that I would turn down an opportunity to meet her and say hello, but I certainly wouldn't go out of my way to create such an opportunity, any more than I would with a random person I see on the street. Even when I used to picture meeting her, the conversation went a bit like this:
“Hi!”
“Hi!”
“..............”
I would have absolutely nothing to say. The only thing I really knew about her was what she's famous for – her music. My silence wouldn't be due to panic, but simply cluelessness. But then it occurred to me that I start out with about as much to go on when I meet a stranger and strike up a conversation. So why would I go any further out of my way to meet her than I would any random person?
Funnily enough, I actually did get the chance to meet someone “famous” (does 1.7 million subscribers count?) a few weeks ago. He runs two youtube channels: a Let's Play (and vlog) channel and a mod review channel. After watching videos of him and his girlfriend for several months, I came to feel like I would actually like him if I had a chance to talk to him. Then I found out he lives in the next city over. So I hatched a plan to meet him. It worked, and I found myself driving up to see him, full of curiosity about how it would feel to meet a celebrity, whether I would freeze up when I met him, and whether he would even be the person he presents himself as in his vlogs. What I found was somewhat odd.
He was just a guy. He seemed every bit as nervous to meet me as I felt to meet him. And as I shook hands with him in greeting, my nerves melted away. I found that I could talk with him as easily as I chatted with my friends at home. We grabbed a bite to eat, talked for a while about various things (mostly video games and youtube), and parted amicably. Through the whole encounter, I kept expecting something to feel odd or different about him, but he didn't sprout wings or start speaking in tongues. No supernatural glow or aura came from him. He really was just another guy.
I don't know what I expected, but it clearly wasn't that. On the drive home, I turned the encounter over in my head, and I realized that in the moment I shook hands with him, a key shift happened. I separated the man from the videos. We are not the things we produce. I have a blog. I am not a blog. He has a couple of youtube channels. He is not his channels. Taylor Swift has a music career. She is not her music career. And the same goes for every celebrity ever. The president plays golf. Actors go clubbing. Writers watch movies.
We're all just people. Not one of us is better or worse for how well we are known, or for the quality of the things we produce. I find it makes much more sense to base my judgment of people on who they are, rather than what they are known for. Nobody is ever simply what they have become known for. Gandhi was a football (soccer) fan. Martin Luther King Jr. was a Trekkie. Stalin was a meteorologist. We are all people underneath the exterior that the world sees. Just because a famous person's exterior seems overwhelming (good or bad) doesn't mean there isn't a human being beneath it.
And so I realized that the easiest way to avoid getting starstruck is to simply remember that famous people are still people, with all the same insecurities, nerves, and quirks as the rest of us, and to meet them on even footing, human to human.
Celebrities are people, too.
theleftexplained · 7 years
Text
Why Firsts Don't Matter (And Why They Do)
Let me be clear – firsts matter. Really, they do. But the things that people worry about when agonizing over firsts – be they first dates, first kisses, first jobs, or pretty much any first step into any new concept or world (like blogging) – are all the wrong things.
I wasn't sure exactly what to make my first post here about. I'd been thinking of writing a blog for a month or two, and ideas came and went, never staying long enough to really be appealing. I considered making this a political blog, or a blog about my life, or about news, or science, or really anything I cared about. But I could never really see myself making a focused blog long-term, so I drifted away from the idea. When it occurred to me that I could make a blog about everything, the idea really started to take hold. The only problem? I had no idea what to make my first post about.
I watched news story after news story fly by, I got upset and opinionated about topic after topic, I learned new things and wanted to share, but nothing really seemed right for my first post. So I waited. And waited. And I continued to wait, until one day I realized I wasn't waiting for the right topic.
I was scared.
Firsts are terrifying. And exhilarating. And, of course, everything in between. We stand upon the edge of that cliff which is the unknown, and we prepare to take a step. Will there be land beneath our foot? Or will it be naught but air, catapulting us head over heels into nothingness? We fear firsts because we don't know what's coming. We have no clue what to expect. We could do amazingly well, or we could suck in front of the whole world. Or it might be that nobody cares, and we go unnoticed.
So we focus on details. We plan a first date down to every single thing we're going to say. I nearly brought note cards to my first ever date because I was so nervous I wouldn't have anything to say. We imagine our first kiss for years, whether it be standing in the pouring rain on the sidewalk, or on the beach at sunset, or in the middle of a grassy field surrounded by a chorus of nature. We over-prepare for a first day at work, when all we're really expected to do, in most cases, is listen and learn. And the one thing I've found that all of the details of all of the firsts have in common is that none of it matters.
None of it matters! I was more terrified on my first date than I could possibly have imagined! I looked like a moron. The same was true of my first job, my first kiss, my first day at a new school in the third grade, and practically every first I've ever had. But I came through just fine. I might have been embarrassed, I might have beaten myself up over it, but in the end, none of it mattered.
The only thing that matters about a first is doing it in the first place.
A first is a tremendous step. When you cut wood with a handsaw, it's making the first couple of cuts that takes the most effort. The prototype takes eons longer than the next hundred units. The first drop of rain takes its lazy time coming down the window, but the drops that follow in its path don't hesitate. When you have a first, you open up an entirely new world to yourself, and doing it again is almost always much easier. You're more prepared, you know what you're getting into, and you have a basic idea of how to deal with the things that messed you up the first time.
I've recently been bingeing a productivity podcast called Cortex (you should go listen to it). It's hosted by youtuber CGP Grey and Relay.fm cofounder Myke Hurley, both of whom are self-employed, and both of whom have the same advice for anyone who has an idea and wants to turn it into a business: just start. It doesn't matter if the product is good from the beginning. It doesn't matter if you fall on your face a couple of times. If you don't start, you'll never succeed. It's about taking that first step, no matter how frightening it might be. I happen to agree with them. In fact, if it weren't for their advice, this blog likely wouldn't be happening.
I've said for a long time that I try not to care about firsts. What I meant was I try not to care about the details surrounding firsts. First steps are tremendously important. Don't do it tomorrow. Do it today.
What first step will you take today?