Saturday, November 18, 2017

Deaf Inventor of Leaf Blower Unveils New Noisy Devices

Dateline: PITTSBURGH—Manny Hornblower, the deaf inventor of the leaf blower, has unveiled a series of new inventions, including the gas-powered page-turner, the motorized tea bag dipper, and the jet-fuelled food-chewing aid.

The gas-powered leaf blower is popular with landscaping companies and, when in operation, can be heard from miles away, even indoors.

“We use it to blow leaves around, to clear lazy people’s lawns,” said professional landscaper Dillon Jerkwad. “Kind of makes you feel like a ghostbuster, since it’s this big old device you wear on your back and you hold this thick hose that comes out the side.

“Noisy AF, that’s for sure. That’s why some of us in landscaping also wear heavy-duty, noise-cancelling earphones. In the business, we call that the Full Asshole. You know, like the Full Monty, except instead of showing your junk on a stage you’re making a god almighty nuisance of yourself in suburbia. I mean, you might as well be going up to strangers and shouting at the top of your lungs in their ears for no reason, all while wearing big earmuffs so you can’t hear a thing and your ears are fully protected.

“I can’t tell you how many times I’ve had folks come up to me and give me dirty looks while I’m waking up the dead, blowing leaves off of some guy’s lawn with that gas-powered ghostbuster contraption. One old guy turned his dog loose on me to shut me the hell up or maybe just to knock my earphones off my thick head, to give me a taste of my own medicine, I guess. I aimed the leaf-blower hose at that dog and blew him back to his doghouse.”

Anthropologist Leah Mackelmire explained the appeal of this strange tool.

“The leaf-blower very nearly replaced the rake in suburban areas, in which a pristine lawn is a sign that even though the suburbanite couldn’t last an hour in the wild, and even though all wild places are fast disappearing from the face of the earth, he’s still king of his castle and can take comfort in the greenness of his lawn as a substitute for any connection between him and nature.

“The trick is you can’t leave even a single leaf or twig on your lawn for long in autumn, since that would spoil the effect of the green grass and desecrate the clean-cut lawn as an altar to phony masculinity. So the rake won’t work unless you want to be out there for hours and hours or you haven’t gotten fat and lazy from your desk job.”

Mr. Hornblower demonstrated his new devices at an indoor press conference. Sitting on a La-Z-Boy recliner on a stage with a novel in his lap and a glass of brandy by his side, he clipped the page-turning device to the book and pressed a red button on the side of the device. The small but surprisingly cacophonous page-turner sprang into action, emitting a deafening roar as it turned a single page of the novel, and then it fell still and silent. He pressed the button again, and the device once more filled the room with howling banshee screeches as it deftly turned the next page.

Mr. Hornblower couldn’t hear the reporters’ questions that were shouted at him while his machine was turning the pages, not just because he’s congenitally deaf but because the device itself was deafening. Instead, he laid the book aside and gave the reporters a self-satisfied grin.

The next invention was a variation of the page-turner, except that instead of turning pages it dunked a bag of tea in the hot water in his mug. Again, the reporters were astonished by the ear-splitting noise generated by the small device.

While it was dunking the tea bag, a reporter shouted at one of Mr. Hornblower’s aides, who were waiting at the side of the stage, “Don’t you expect most people won’t want to make quite so many explosive, buzzing, crashing noises while they’re preparing to have a cup of tea—or read a book, for that matter?”

The aide only shook his head reproachfully, as though it were politically incorrect to speak about noise in front of Mr. Hornblower.

The inventor attached the final machine to his head and proceeded to eat a sandwich, relying on the jet-fuelled motor to open and close his mouth for him. Like the rest of Mr. Hornblower’s inventions, the chewing aid could be heard from outside the building and from far down the street.


But Mr. Hornblower smiled and proudly gestured towards his three machines, indicating that they’re available for sale, and anticipating perhaps that the consumer’s laziness will compel him or her to overlook the products’ foibles. 

Friday, November 17, 2017

Lucky Thirteenth PDF Installment of RWUG

Here's the thirteenth collection of this blog's articles as a PDF file. This one was a little overdue. 

The other installments can be found here.

Cheers!  

Thursday, November 16, 2017

Is Theism or Atheism a Delusion?

Is the belief that there’s a personal creator of the universe a delusion? Is atheism a delusion? And just what is a delusion anyway? Should our overriding goal be to understand and accept reality? One amusing way into these questions is to consider the confusions in a YouTube video from a Christian apologetics website, innocuously called “Inspiring Philosophy.” Presumably, the name is meant to suggest neutrality in a quest for philosophical truth, even though the website just so happens to confirm something as anachronistic as Christianity. This rhetorical technique might be borrowed from American conservative politicians who call their Machiavellian schemes the Office of Special Plans or the Patriot Act, to fool gullible individuals who don’t look beneath the surface of things.

To that extent, these politicians and devious Christian evangelicals are also comparable to the folks at Goldman Sachs who likely agree with their CEO Lloyd Blankfein when he said, shortly after the American banking collapse of 2008, that his bank was “doing God’s work.” His stated reason was simply that he’s a banker, and banks help companies grow by helping them raise capital, which creates wealth and jobs and leads to a virtuous cycle. That, of course, is balderdash, since the wall between commercial and investment banks in the US came down with the repeal of Glass-Steagall in 1999, which allowed banks like Goldman Sachs to engage in massive fraud, along with much of the rest of America’s financial industry. The insider reading of Blankfein’s comment, then, must be that he’s doing God’s work by being the superior fraudster, which enables his bank to defraud the inferior fraudsters. His is a social Darwinian view of merit: the weak perish in a struggle for survival, which is God’s will, assuming God is the most terrifying beast in the animal kingdom. That is conceivably a neo-Jewish theology, based on synthesizing the tribal bloody-mindedness of most of the Old Testament with modern science-centered naturalism.

In any case, the point of that digression is that we must beware when entering the swamp of evangelical Christian discourse. The author of those videos, whom I’ll call Inspiring Philosopher or IP, proclaims that there’s a “mountain of data” and “overwhelming evidence” demonstrating the truth of theism and of Christianity in particular, and IP refers the interested viewer to cases he’s made such as his video on Plantinga’s ontological argument. IP’s defense of that argument is misleading, mind you, since he says the only controversial premise in the argument is the statement that God’s existence is possible; the rest, he says, “follows modal logic and is uncontroversial” (6:48). Apparently, IP is unaware of the problems of using the much-too-strong reduction rules of the S5 system, which led Plantinga himself to disavow the claim that his argument proves anything. Elsewhere, I explain those problems and some other flaws of the presumptuous ontological argument. If IP thinks this modal argument is part of a mountain of evidence for theism, we should expect the mountain is in fact a molehill.

IP’s discussion of delusions in the other video is full of confusion, but it does invite us to reflect on the issue of delusions in this context. IP argues that science shows theistic belief is “natural” and that atheism, on the contrary, is unnatural, since atheism requires “hard cognitive work” to sustain. The human brain is wired to believe in God from a young age, he says, and atheism “overrides our intuitions.” Moreover, atheists are angry at God, according to a study cited by IP, which shows no such thing. All this suggests to IP that disbelief in God is only a coping mechanism and itself a delusion. Before I turn to the reason why IP would cobble together such confusion and nonsense, let’s go ahead and demolish his claims.

Wednesday, November 15, 2017

China Officially Adopts Infantilized America in 2047

Dateline: PLAYGROUND 307, Year 2051—The eight hundred remaining adults in the United States are increasingly asking how the Age of Reason became the Age of Babies.

In the US, which has led the world in this decline, the early signs ranged from the surge of comic book movies, to the proliferation of butt pictures on social media, to the banning of old people in pop culture, to the pampering of kids by overprotective twenty-first century parents, to the retreat to safe spaces on liberal campuses, to the lame fantasies emanating from Christian fundamentalist churches, to the anti-intellectualism of blue collar America, to the opioid epidemic, to the election of the first Clown Presidents, George W. Bush and Donald Trump.

The 2006 film Idiocracy was only partly correct, according to the elite corps of American adults, since that film focused on the decline of American intelligence and this is only one aspect of infantilization. In addition to being naturally dumb, babies are self-absorbed, irritable, fickle, and cute as a button. The same is true of infantilized adults.

The remnants of American intellectuals blame what they call the wave of mass infantilization partly on Americans’ overreliance on machines, especially on television, the internet and communications devices, which started in the 1980s in the case of television. But other countries also had access to advanced technologies and didn’t collectively revert to a childish mindset. So the rise of machines was only a necessary condition.

The key to American mental regression was its formative culture of individualism, which high technology as well as corporate media saturation exacerbated. The early-modern Europeans who conquered North America were hardy and self-reliant. In the 1960s, capitalists exploited that ethos of defiant independence by spreading the myth in their advertising campaigns that, because Americans are each so great, they deserve the best and should consume a smorgasbord of manufactured products. Pioneers thus turned into consumers who depended on transnational corporations for their materialistic standard of living.

As more and more Americans were infantilized, their elected government became dysfunctional, the country’s infrastructure crumbled, its quality of education deteriorated, its skilled workforce disappeared, and so big businesses moved all their manufacturing facilities offshore. There was also an astonishing brain drain as millions of educated liberals fled to Canada or Europe. The leftover Americans no longer had steady jobs, so they couldn’t repay their credit card companies. Thus Americans suddenly found themselves unable to consume as spoiled, overgrown babies.

2047 was the year that startled the world, when Americans threw a collective temper tantrum and whined and cried for four days straight. The earsplitting cries could reportedly be heard from the northernmost tip of Canada and the southernmost tip of Mexico.

That was also the year when, via the United Nations, China officially stepped in and adopted America as its national child. But because few adults want to be around children for long periods, China had to care for America from afar. So the Chinese trained eight hundred overgrown American babies to be an elite class of nannies and baby-sitters.

These overseers would ensure that the Playgrounds set up by the Chinese were functioning properly. Each Playground is an enormous kindergarten, the size of a major city but geared towards mollycoddling a large class of the remaining two hundred million American men and women who think and behave at a fourth-grade level.

At Playground 307, which used to be known as Jacksonville, Florida, Americans are fed pain-killers at water fountains so they don’t have to endure even a split second of discomfort. Women’s butt pictures and comic book movies are posted or streamed on billboards at every street corner. Instead of the older fairytales which adult Americans used to read their children at bedtime, robots recite fundamentalist nonsense or conspiracy theories to daydreaming American pseudo-adults.  

China is currently debating the feasibility of raising Americans to mentally adult status, and whether twenty-first century Americans are even capable of growing up.

Monday, November 13, 2017

Defining God into Existence: The Presumptuous Ontological Argument

The ontological argument for God’s existence has tantalized theologians and philosophers for centuries because the argument seems at first glance to prove that God exists even though all the argument does is analyze the concept of God. Take, for example, the philosopher Alvin Plantinga’s modal version of the argument, which begins by stating that God’s existence is at least possible. The argument next points out that “God” is defined as a maximally great being, meaning not just that God is all-powerful, all-knowing, and all-good, but that “God” is defined as a necessary being rather than just another contingent thing that comes and goes. Anything dependent on something else wouldn’t be God, by definition. S5 modal logic, the system which specializes in simplifying strings of modal operators, includes a (very dubious) rule of inference that says if it’s possible something is necessary, the thing is simply necessary, meaning that if there’s a possible world in which the thing exists in all possible worlds, the thing must simply exist in all possible worlds. The only way it could exist necessarily in some possible world is if it really does exist in all possible worlds. And if that’s how the thing exists, it exists in the actual world, which means God exists as a matter of fact just because God’s existence as a necessary being is possible.
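To make the structure explicit, here is a minimal sketch of the modal skeleton in standard notation (my own rendering, not Plantinga’s exact wording), where G abbreviates “a maximally great being exists”:

(1) ◇□G. It is possible that a maximally great being exists necessarily.

(2) ◇□G → □G. An S5 theorem: whatever is possibly necessary is necessary.

(3) □G. From (1) and (2) by modus ponens.

(4) G. From (3), by the T axiom (□p → p): whatever holds necessarily holds in the actual world.

Everything in the proof hangs on granting (1) and on the S5 principle invoked in (2).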

So wasn’t that easy? God’s existence can be proven with just a few sentences. That’s as we would expect it to be if God wanted us to know easily that he exists. Unfortunately, God’s existence is intuitive to creatures like us who thrive on reading each other’s intentions and projecting mental properties onto everything in nature, as was commonplace in our animistic past and in our individual childhoods. By contrast, reason has conflicted more and more with how we naively intuit the world. We felt we were central to the universe, but empirical investigation proved that intuition is wrong. We naively trust in our clan’s religion, but then discover there are many cultures and so we acquire the perspective of postmodern irony, which compels us to doubt our myths even as we struggle to remain civilized rather than give in to multicultural vertigo. Thus, the ontological argument would be a strange bird indeed if the argument were rationally compelling.

As the Stanford Encyclopedia of Philosophy article on the argument points out (in section 7), the soundness of a deductive proof of anything is a trivial matter, if the issue is the more general question of whether we should accept the conclusion. Take this argument, for example:

(1) Either God exists or else 2 + 2 = 5.

(2) 2 + 2 do not equal 5.

(3) Therefore God exists.

The second premise is true, the first premise isn’t obviously false, and the argument is valid: given a true disjunction, eliminating one disjunct leaves the other, whether the “or” is read as inclusive or as exclusive (as in “Either it’s daytime here and now or it’s nighttime here and now,” which excludes the scenario in which both disjuncts are true). So because 2 + 2 do not equal 5, the other half of that disjunction must be true, if (1) as a whole is true, and so God exists.
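For comparison, the bare form of the trick is just a disjunctive syllogism (my schematic, with P standing for “God exists” and Q for “2 + 2 = 5”):

(1) P ∨ Q

(2) ¬Q

(3) Therefore P.

The step from (1) and (2) to (3) is valid on either the inclusive or the exclusive reading of “or”; the whole weight of the proof rests on why anyone should accept (1) in the first place.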

Again, wasn’t that easy? According to deductive logic, God exists! Yet the reason this argument wouldn’t convince anyone even though it might technically be sound is that it’s wildly incomplete. Again, there’s no reason to think (1) is true and it needn’t be up to logic or the analysis of concepts to determine the relation between “2 + 2 = 5” and “God exists.” A random falsehood is merely being attached to another dubious notion (since the concept of God is arguably incoherent), and then the dubious notion is proved by presuming that the pair counts as a logically decidable statement so that the falsity of the arithmetical part entails the truth of the other part. Likewise, I could “prove” the following:

(1) Either I have a trillion dollars or there is a dinosaur in my shirt pocket. 

(2) There is no dinosaur in my shirt pocket.

(3) Therefore I have a trillion dollars.

But of course I don’t have a trillion dollars. Or we could make a more obvious fiction real by the following wave of a magic wand:

(1) Either Darth Vader exists in reality or Mickey Mouse has square ears.

(2) Mickey Mouse does not have square ears (his ears are round).

(3) Therefore Darth Vader exists in reality.

We know on the contrary that Darth Vader is a fictional character, so this argument must be flawed even though technically, at first glance at least, the argument might seem sound. 

Sunday, November 12, 2017

Man Mocked for Alleging His Female Boss Sexually Assaulted Him

Dateline: NEW YORK CITY—Software engineer Timmy Whatawuss alleged on social media that he was sexually assaulted by his female employer, Olga Pololga, and has been met with laughter from all quarters.

Meanwhile, hundreds of thousands of women have accused powerful men in America of sexually harassing or assaulting them. In fact, there is no powerful American male deemed never to have taken advantage of his authority and exploited at least one of his female employees’ dependence, in some perverted manner.

Every executive or celebrity in Hollywood, stand-up comedy, government, Wall Street, and all other sectors in which wealth or power is concentrated in male hands has been accused by one or more female underlings of offering them financial compensation or promising them advancement at work for sexual favours.

These allegations have led to widespread condemnation of so-called Boys’ Club culture. Powerful men have been ostracized, fired, or prosecuted. In many cases, a woman’s mere coming forward with an allegation of a man’s unwanted sexual advances sufficed to damage the man’s reputation, so that the woman felt no further inquiry was needed and no charges needed to be pressed.

This revolution in American sexual relations encouraged Mr. Whatawuss to test the waters with his astonishing claim that a woman could sexually harass a man.

“First, I took my complaints to my boss’s superior, Steve Strittmatter. I told him Olga Pololga sexually assaulted me by stroking my penis in the hallway. He looked at me from across his desk and burst out laughing.

“So I announced on Facebook and Twitter that my female boss molested me by rubbing her boobs in my face and giving me a lap dance even though I told her ‘No.’ The responses were unanimous: laughter all around. I mean that literally: I got hundreds of messages saying nothing but ‘Hahahahaha!!’ The messages varied the number of ‘ha’s or exclamation points, but they were certainly united in their mockery of me.

“I took the matter to the police and filed a report, but the whole time the police couldn’t stop laughing. The officer who typed up the file was crying on the keyboard he was laughing so hard.”

But Mr. Whatawuss had proof of his ordeal, since he had recorded some of his employer’s threats and unwanted advances, and he posted them on YouTube.

“Again,” said Mr. Whatawuss, “just laughter. That was the only response. All the comments under the video were variations of ‘Hahahaha!’ or ‘Hardy har har!’ or ‘Heh heh heh.’ Several response videos were posted by other YouTube users, and all of them showed the men or women who posted them laughing hysterically. One female YouTuber fell off her chair, she was laughing so hard. The video shows her struggling to get back up and then promptly falling off her chair again in uproarious laughter.”

For her part, Mrs. Pololga has admitted to molesting Timmy Whatawuss. “Of course I did it,” she told reporters. “Why shouldn’t I admit it? No one can do anything about a woman’s sexual assault of a man, because you can’t do anything else while you’re laughing your ass off hearing or thinking about it.”

Her prediction proved accurate, because while attempting to televise Mrs. Pololga’s confession, reporters erupted with laughter, knocked over their equipment, and broadcast only dead air. 

Feminist ethicists have rarely debated whether it’s possible for a woman to sexually assault a man. They contend that while this sort of assault is logically and perhaps even physically possible, such an offense would be highly unlikely in practice because men tend to have more power than women, and women are more interested in emotional connections than in random, meaningless sexual encounters. 

Thursday, November 9, 2017

The Strangeness of Normality: Alan Moore on Art and Magic

If there’s a single principle that philosophers should live by in their pursuit of ultimate truth, it’s that the natural world uncovered by science maximizes irony to humiliate us, which suggests that the ultimate truth hides in plain sight. If the truth of how we should live and should fundamentally conceive of the world were hard to find, there would be no shame in ignorance. Only if the philosophical truth were obvious, at least in retrospect, requiring just a shift in mindset to appreciate what had been there all along, would we be embarrassed to have missed the truth or to have passed it by in our preference for distractions. Humiliation is the surest sign of someone’s cognitive awakening, which means there are no arrogant philosophers. Pride and confidence thrive on ignorance, because nature is indifferent to us, so the most probable outcome is a gulf between reality and the truth as we’re capable of intuiting it. Our intuitions lead us astray because we evolved not to understand reality, but to survive in a harrowing struggle for resources. The profound truth has always been there right in front of us, but we’re fixated on playing an altogether different game and so we blind ourselves.

The first irony, then, is that the more ingenious we are in our search for philosophical truth, the further we distance ourselves from it, because the philosophical upshot of living in a natural universe is likely obvious in hindsight. The second irony is that once the truth is seen, the answer becomes almost useless, because to understand the meaning of life is to think of ourselves as the inhuman cosmos would and thus to disassociate from our intuitions and preferences, to turn our nose up at all the carrots obtainable by our evolved thought patterns. To understand the truth is to stand under and thus apart from human nature, in which case we become self-alienated; the wake of that alienation is persistent humility.

Alan Moore on Art and Magic

What is this profound truth? Part of the truth, as I attempt to comprehend it, is captured by Alan Moore’s intriguing view that art and magic are identical. In an online interview, Moore the graphic novelist says (with my emphasis):
consciousness, preceded by language, preceded by representation (and thus art) were all phenomena arising at around the same momentous juncture of human development and that all of these would be perceived as magic, an umbrella term encompassing the radical new concepts born of our discovery of our new, inner world. This allows us to offer a definition of magic as a ‘purposeful engagement with the phenomena and possibilities of consciousness’. We then go on to argue that originally, all of human thought and culture was subsumed within the magic worldview, with the advent of urban society and the rise of specialised professionals gradually stripping magic of its social functions.
So Moore is saying that magic was taken for granted at the dawn of human self-awareness, sometime in the Upper Paleolithic, around 80,000-50,000 BP when subject was only first being distinguished from object. At that time, everything was perceived as magical, just as it is for children in any era, because we were only first trying out our cognitive ability to put everything in conceptual boxes, before this ability had become routine and we’d grown disenchanted with ourselves due to the dominator’s boredom. Consciousness was sublime but so was everything else because all we perceived were things from our point of view which we could alter at will, if only by shifting physical positions or altering our mood or intentionally modifying the outer world itself. That is, we perceived the world through the filter of our most naïve intuitions and feelings. That was the mythopoeic era, when we hadn’t yet become alienated from nature because we hadn’t learned enough to specialize in this or that fragment of human knowledge, when the world was encountered as an undivided whole in which we felt we belonged. That was the world of subjectobject, as analogous to spacetime. Magic then was the exploration of consciousness which included an exploration of the outer world, because we’d projected the inner onto the outer.

In another interview, Moore says,
Actually, art and magic are pretty much synonymous. I would imagine that this all goes back to the phenomenon of representation, when, in our primordial past, some genius or other actually flirted upon the winning formula of “This means that.” Whether “this” was a voice or “that” was a mark upon a dry wall or “that” was a guttural sound, it was that moment of representation. That actually transformed us from what we were into what we would be. It gave us the possibility, all of a sudden, of language. And when you have language, you can describe pictorially or verbally the strange and mystifying world that you see around you, and it’s probably not long before you also realize that, hey, you can just make stuff up. The central art of enchantment is weaving a web of words around somebody. And we would’ve noticed very early on that the words we are listening to alter our consciousness, and using the way they can transform it, take it to places we’ve never dreamed of, places that don’t exist.

Sunday, November 5, 2017

Footnote Added to the Hollywood Sign, Reminding Starlets to “Expect to be Molested”

Dateline: HOLLYWOOD, 2019—Movie producer Danny Fishman lamented that sometime during the 1960s, a footnote to the Hollywood Sign was removed, since that footnote used to warn prospective starlets that they should expect to be sexually molested as payment for receiving what Mr. Fishman called “their easy job of acting.”

“Young women used to know how the system worked,” said Mr. Fishman. “Hollywood is run by rich old Jews with small penises. But this town is also filled with beautiful young women who are desperate to become professional actors. Obviously, then, those old men are going to sexually exploit those young women. That’s just a law of nature.”

But according to Mr. Fishman, young actors used to understand that no woman deserves to be rich and famous just because she’s physically attractive.

“In the 1940s,” said Mr. Fishman, “you’d have your share of smoking hot starlets. But they knew you don’t get something for nothing. Beauties are born with their sexual appeal; it’s in their genes. So they wouldn’t demand to be hired to work on a big movie set just because of their facial symmetry, flawless skin, and hourglass figure.

“And talent? Don’t make me laugh! Anyone can act. It’s not a hard job at all, especially considering the huge payoffs of fame and fortune. That’s why there’s a constant flood of young people into Hollywood, because anyone can do what famous actors do. The actor’s mystique is bullshit. You memorize some lines, you get dressed up to look the part, you read your lines on camera, and if you don’t say them quite right the first time, you get to try again and you get to keep trying until the director’s happy, and then you go home to your mansion in Malibu. Case closed. Easy peasy.”

The problem for actors, according to Danny Fishman, is that because their job is easy to do and the potential rewards are astronomical, there are far more people seeking acting roles than there are legitimate roles to offer. There are even far more beautiful models in Hollywood than there are parts to appeal to the average viewer “who can’t stand the sight of old people” and who “expects to see only eye candy on the big screen,” said Mr. Fishman.

For those reasons, the executives who run the Hollywood movie studios expect to be paid something in return for gifting young actresses with roles and for setting them up for life. “The wannabe starlets start off with nothing except their looks. That’s all they have to offer. The production company invests in the young actor by marketing the movie and helping to create her mystique, but the starlet certainly doesn’t pay back the company with her great acting performance, because again, anyone can act.

“All she has to stand out from the crowd is her willingness to let herself be fondled by an old Jew who has a small penis.”

In the old system, young actors knew they’d have to perform those sexual favours, to justify their being hired to do a job that anyone could do, a job that nevertheless realizes their wildest dreams of fame and fortune.

“Some fool, however, removed that footnote from the Hollywood Sign,” said Mr. Fishman, “and that’s why it’s such a shock now that every single actress has been sexually molested by a movie producer or director or celebrity actor. It’s such a shock only because most people have forgotten how the system works.”

In early 2018, Mr. Fishman launched a campaign to return the addendum to the Hollywood Sign. “The footnote used to be right near the bottom of the big ‘D’ in The Hollywood Sign. You’d have to squint a little, but you could still make it out. Young actors were being warned that they shouldn’t expect the laws of nature to be overturned in Hollywood just because it’s allegedly the land where dreams are made true.

“The addendum used to read—and I quote—‘Young actors should expect to be molested.’ I mean, I’m old now, so I was there when the footnote sign was still posted, prior to the ‘60s.

“I’m old, but I’m also a rich Jewish movie producer with a minuscule penis, so naturally I’ve done my share of fondling young wannabe actors. How else are you supposed to weed out the thousands of beauties who don’t want the part badly enough? How else are the young actors supposed to stand out from the crowd of people who think they can get by just on their looks without their having to lick an old wrinkled ball sack or two? Do you want to rob the starlets of their only opportunity for fame and fortune for doing next to nothing at all—just for reading some lines and getting fondled now and again by an old fat Jew with a tiny penis?

“We’re doing the young actors the bigger favour, believe me! All she does is look at my nearly nonexistent schwantz once in a while. Meanwhile, I pick her out of the crowd and swiftly make her rich and famous. We movie producers are mostly Jewish, so trust us: we know how ethics works. Again, you don’t get something for nothing.”

Friday, November 3, 2017

The Sham of Philosophical Theism

[The following is drawn from my email exchange with Darwin Skeptic. I took thematically-related sections of my messages and assembled them into an article. So you can read my stand-alone case against philosophical theism without having to read the longer email exchange. Enjoy!]
*****
Philosophy, the relatively independent, objective exercise of reason in response to profound questions, can address theism or any other subject, but it doesn’t support theism well, by making theistic beliefs more rational than atheistic ones. Atheism is more rational than theism, as far as philosophers are typically concerned. Not everything that philosophers address is illuminated by their ruminations, because unlike science, philosophy is partly artistic and literary, which means it includes speculations and rhetorical rationalizations of cultural prejudices. At its best, though, philosophy provides arguments or illustrations that revolutionize culture or that at least separate the enlightened intellectuals from the hoi polloi. Analytic philosophers currently focus on science and rigorous analysis, minimizing speculation and rhetoric and thus the artistic side of philosophy, at the cost of making their tedious, hyper-detailed writings culturally irrelevant since they’ve had to overlook the bigger issues.

Atheism and the Danger of Freedom of Thought

In any case, even before offering an atheistic argument or looking at any theistic proof that a religious philosopher might provide, we shouldn’t expect philosophy to establish that a personal creator of the universe exists. After all, Western philosophy grew out of a rejection of popular religion. From Thales to Aristotle, the ancient Greek philosophers ridiculed the popular notions of gods. The Presocratics overthrew the Olympian pantheon, crediting various material or impersonal powers with being the foundations of all other things. They thus made nonsense of the self-serving metaphors we naively proffer to humanize that which would plainly have to be unlike anything in nature, to be the precondition of all knowable categories and particulars, including persons. Plato’s Parable of the Cave famously substitutes goodness for God. Aristotle’s divine being, the primary cause or unmoved mover, retained the personal quality of being able to think, but only because the essence of this being is to reflect on itself. Aristotle’s theology is thus deistic rather than theistic: his God doesn’t create nature but only inspires it as its final cause or purpose, as opposed to being nature’s efficient, mechanical cause. Aristotle’s deity can’t think about or perceive anything other than itself, because doing so would render it imperfect and thus it would cease to exist as the eternal, perfect being which all lower beings look up to.

Such is an example of a philosopher’s god. Of course, Aristotle was only meditating on the celestial motions of what we now know are planets, not perfect persons in any way. But the point is that philosophical reflection on the question of theism in the West has historically acted as a corrective to the intuitive, emotional, faith-based conceptions of divinity. Vulgar religion isn’t argumentative; instead, it’s tribal, the gods being mental projections that celebrate the character of the believers’ culture. To paraphrase the ancient Greek philosopher Xenophanes, if horses or lions could believe in gods, their gods would look like horses or lions. Likewise, aggressive cultures worship angry, jealous gods that threaten to annihilate their worshippers if they don’t conquer some earthly region or other. And as Nietzsche pointed out, victimized people such as the early Christians worship a forgiving deity that prefers weakness to strength, poverty to wealth, modesty to pride. Christianity thus begins with slave morality, because popular, exoteric theism in general is a pre-reflective cultural expression quite inseparable from its religious practices. Indeed, it’s philosophy that distinguishes theistic beliefs from the religion so that the religious ideas can be scrutinized without any social commitment to the religion.

In the West, then, philosophy has historically challenged conventional wisdom in so far as the latter was propped up by prejudices and mass confusions. For that reason Socrates was executed, and so he became the secular Christ figure, the martyr for the elite exercise of reason on behalf of truths which the mob is unwilling to accept, including the truths of naturalism and atheism. Ancient Greco-Roman philosophy was reborn in Europe during the Renaissance, after what historians call the Middle Ages. Why the division rather than historical continuity? Because what passed for philosophy during the Middle Ages was dogmatic and in the hands of the Roman Catholic Church which was opposed to philosophy as such; indeed, the Church effectively demonized philosophy and science as witchcraft and the like, because free-thinking tended to depart from Church teachings. The essence of philosophy was thus a crime punishable by death. For example, the Aristotelian proposition that God can’t think about anything other than himself was banned by the Church in 1270, and between 1210 and 1277 the Church banned many other philosophical statements. But these bans proved ineffective, and despite Aquinas’s grand synthesis of Christianity and Aristotelianism, which was meant to tame the latter for the glory of the former, the act of reading the ancient texts gave some Scholastics an inkling of genuine philosophy. They turned into skeptics who denied that reason could support faith, since they believed reason couldn’t even prove that the external world exists. These skeptics countered the rationalists who adhered to the Church’s bans by confining themselves to pondering how God could act contrary to the condemned parts of Aristotle’s philosophy. All of the Scholastics’ arguments, though, were necessarily limited and often whispered rather than written, since they had to be acceptable to the Church for both the books and the authors to avoid being burned. For example, Nicholas of Autrecourt had to recant his skepticism and burn all his writings in 1347.

When more of the ancient texts became available towards the end of the Medieval Period, they sparked a full-blown humanistic revolution in Europe that set off the Age of Reason, which included the rise of modern science and the Enlightenment, the latter being the enforcing of secular philosophy to establish a culture of liberal humanism. Enlightenment philosophers were authentically philosophical because they were free to pursue ideas wherever they led. By contrast, the classic theistic proofs by Aquinas and the Scholastic philosophers were stale and strained, because their approach to reason in general was artificially narrow-minded. (Aquinas even admitted as much on his death bed, calling his life’s work so much “straw.”) Unlike Socrates, the Church’s intellectuals didn’t love knowledge more than their skins; those that did were the pagans and heretics from Hypatia to Bruno who were tortured and murdered by the orthodox Christians. In between the Ages of Faith and Reason there was a grey area populated by such figures as Descartes, Leibniz, Spinoza, Locke, and even Newton and Kant. Being men of their time when the church was still politically powerful, these philosophers and scientists were partly dogmatists, but they were also partly free thinkers. Most early modern intellectuals thus argued for both esoteric theism (that is, for a version of theism or deism which doesn’t sit well with popular religion, the closer you examine it, seeing through the philosopher’s obfuscations and noble lies) and liberal secular humanism, because philosophy was still once again finding its footing. That development culminated in the rabid naturalism of Nietzsche who predicted the downfall of philosophy itself in our period of hypermodern malaise. 

Thursday, November 2, 2017

Does Philosophy Support Theism?

[The following is an email exchange I had with a philosophical theist known as Darwin Skeptic. He defended the proposition that philosophy supports theism better than it does atheism, while I argued against that proposition. We each wrote an opening statement and then several replies. We gave ourselves around a three-page limit for each message. For this presentation of the exchange, I cleaned up Darwin Skeptic’s typos and grammar a little, for the sake of readability. You can read his original messages on his Facebook page. I’d like to thank Darwin Skeptic for participating in the discussion/debate.]

*****

Ben Cain’s Opening Statement:

Philosophy, the relatively independent, objective exercise of reason in response to profound questions, can address theism or any other subject, but it doesn’t support theism well, by making theistic beliefs more rational than atheistic ones. Atheism is more rational than theism, as far as philosophers are typically concerned. Not everything that philosophers address is illuminated by their ruminations, because unlike science, philosophy is partly artistic and literary, which means it includes speculations and rhetorical rationalizations of cultural prejudices. At its best, though, philosophy provides arguments or illustrations that revolutionize culture or that at least separate the enlightened intellectuals from the hoi polloi. Analytic philosophers currently focus on science and rigorous analysis, minimizing speculation and rhetoric and thus the artistic side of philosophy, at the cost of making their tedious, hyper-detailed writings culturally irrelevant since they’ve had to overlook the bigger issues.

In any case, even before offering an atheistic argument or looking at any theistic proof that a religious philosopher might provide, we shouldn’t expect philosophy to establish that a personal creator of the universe exists. After all, Western philosophy grew out of a rejection of popular religion. From Thales to Aristotle, the ancient Greek philosophers ridiculed the popular notions of gods. The Presocratics overthrew the Olympian pantheon, crediting various material or impersonal powers with being the foundations of all other things. They thus made nonsense of the self-serving metaphors we naively proffer to humanize that which would plainly have to be unlike anything in nature, to be the precondition of all knowable categories and particulars, including persons. Plato’s Parable of the Cave famously substitutes goodness for God. Aristotle’s divine being, the primary cause or unmoved mover, retained the personal quality of being able to think, but only because the essence of this being is to reflect on itself. Aristotle’s theology is thus deistic rather than theistic: his God doesn’t create nature but only inspires it as its final cause or purpose, as opposed to being nature’s efficient, mechanical cause. Aristotle’s deity can’t think about or perceive anything other than itself, because doing so would render it imperfect and thus it would cease to exist as the eternal, perfect being which all lower beings look up to.

Such is an example of a philosopher’s god. Of course, Aristotle was only meditating on the celestial motions of what we now know are planets, not perfect persons in any way. But the point is that philosophical reflection on the question of theism in the West has historically acted as a corrective to the intuitive, emotional, faith-based conceptions of divinity. Vulgar religion isn’t argumentative; instead, it’s tribal, the gods being mental projections that celebrate the character of the believers’ culture. To paraphrase the ancient Greek philosopher Xenophanes, if horses or lions could believe in gods, their gods would look like horses or lions. Likewise, aggressive cultures worship angry, jealous gods that threaten to annihilate their worshippers if they don’t conquer some earthly region or other. And as Nietzsche pointed out, victimized people such as the early Christians worship a forgiving deity that prefers weakness to strength, poverty to wealth, modesty to pride. Christianity thus begins with slave morality, because popular, exoteric theism in general is a pre-reflective cultural expression quite inseparable from its religious practices. Indeed, it’s philosophy that distinguishes theistic beliefs from the religion so that the religious ideas can be scrutinized without any social commitment to the religion.