Tuesday, December 28, 2010
Sweet Thing, You've Got Me Singing...
I could never understand the popularity of "Ooo La La La." I remember my schoolmates singing it with such abandon and I just couldn't relate. Even when the Fugees did a remake of it with their hit "Fu-Gee-La," I was so against the original version that their decision to do this prevented me from taking them seriously (at least not until "Killing Me Softly" came around). Yes, I was that diametrically opposed to the song. It just seemed, I don't know, beneath Teena to do something like that. With her body of work--including hard-hitting hits like "Square Biz," "Portuguese Love" (my personal favorite), and "I Need Your Lovin'"--"Ooo La La La" just seemed cheesy and so much less than the soulful sound I had come to expect of her.
And it takes a LOT for me to expect high-quality soulful music from a white artist. But Teena was no ordinary white artist. She was one of the few that could truly capture the essence, not only of soul music, but of the black culture inextricably linked with it. At the same time, she didn't exploit R&B culture or treat it like a commoditized item to be studied, mastered, and regurgitated like other artists of her persuasion. This is best demonstrated by the fact that in many instances, you could hear the "white girl" shining through loud and clear in much of her singing. But intertwined with those "vanilla child" sensibilities were the "chocolate" influences, manifesting an effortless marbling that truly reflected her life.
I'd always figured she discovered black culture (including the music) as an adult, fell in love with it, became inspired by it, and created music within the framework of the black aesthetic as a natural progression. It wasn't until I read a recent news piece about her that I learned she was actually raised around black people. It's no wonder her personality, demeanor, and locution came off so authentically. She didn't model her sound and style off of any specific artist like many of the contrived "blue-eyed soul" artists do, and then have some marketing machine force-feed her down our throats. Her sound was uniquely her own and added to the rich fabric of R&B music rather than regurgitating a pale imitation of it. She had a genuine love for soul music and just let that love express itself naturally in her body of work. That authenticity--along with her skillful writing, singing, and musicianship--made fans of the genre gravitate towards her music without all the hype and PR peddling.
Every black fan of R&B music probably has their own list of white artists they'd consider truly and authentically soul singers/musicians. Everyone's overall list might vary, but I bet Teena would top most people's list as the epitome of "blue-eyed soul"--the standard-setter.
To give a sense of how "soulful" Teena was, picture the kind of white woman that would bring some bomb-ass potato salad to the neighborhood block party that would be all the rage--not that "German" stuff with the new potatoes and vinegar, but the good stuff with the mayo, mustard, hard-boiled eggs, and paprika! One minute you'd see her parting some child's hair with a jar of grease, barrettes, and ribbons nearby. The next, you'd see her leading the electric slide line, singing the lyrics to some obscure Funkadelic song, loud, strong, and pitch-perfect. Exhibiting the kind of behavior that would confuse all the newer black folks, having them falsely rationalize, "Oh, she must be Creole then. She Creole, right?" THAT was Teena!
There was just something genuine about her that made her easy to respect and treat as "one of our own." She wasn't one of those artists that hopped on the bandwagon of studying and imitating R&B music for the mere profit of it. And that type of artist is so dreadfully easy to spot. R&B music--at least up until the 2000's--was more than just music; it was an expression of the attitudes, personalities, and mannerisms of the artists producing it. And a lot of that was a reflection of the black culture in which these artists were raised. Their formation as R&B singers/musicians incorporated the lifetime of experiences they'd had as members of the African-American community, and it came through effortlessly and naturally in their music and mannerisms. This rich interconnection of cultural heritage, personal identity, and musical formation gets cheapened by the industry-manufactured "blue-eyed soul" artists whose aping often reduces us and our mannerisms to caricatures.
Words cannot express the level of insult and irritation I've felt seeing these outsiders--some not even American, let alone African-American--get the hype, praise, recognition, and oftentimes preferential treatment over those who were raised and continue to live in the culture that inspired the music in the first place. Seeing these artists use "urban" music as a stepping stone to their intended careers as pop artists, rock stars, or even actors--e.g. Pink, Mark Wahlberg, etc.--once the "Ooh-look!-A-white-person-who-can-sing/rap-like-a-black-person!" fad dies down only adds insult to injury. So the fact that Teena always stayed true to R&B music makes her even more endearing to fans of the genre like myself.
Teena's biggest hit was "Lovergirl," which was probably one of her least R&B-sounding songs. She could have ridden the wave of success by altering her sound and style to retain the popular audience she'd garnered with that hit, but instead she went back to her R&B roots with follow-up albums that produced hits like "If I Were a Bell," "Still in Love," and, you guessed it, "Ooo La La La." I actually could never get into these later songs; for me, the magic was in her earlier 80's tunes like "Behind the Groove," "Aladdin's Lamp," "Turnin' Me On," and "Now That I Have You." But even though her later songs were less appealing to me personally, I still respected her longevity and her ability to make hits that still resonated with her audience as a whole--and for decades after she debuted. Her body of work is nothing short of impressive, and mainly for that reason I am sorry to see her leave so soon and unexpectedly.
Teena was a genuinely talented artist who knew how to create a song--a soulful song--that really touched her fans. She is truly one of the few white artists I'd actually say made good R&B music--not good R&B music for a white person, but good R&B music PERIOD! She has my utmost respect, and so with her passing, I have now made it a point to listen to every song that is currently being played on the radio in her memory. As fate would have it, "Ooo La La La" is the one in the heaviest rotation nowadays. So instead of changing the station to avoid that corny "ooo la la la" riff, I'm leaving it on to play for the song's entire duration. And wouldn't you know it? The song has actually grown on me and has become an earworm of sorts, the kind that can only be satisfied by playing the song--over and over again. In other words, I can't get enough of that song now--after all this time!
So Teena, Sweet Thing, you've finally won me over with this one. Without a doubt, you've got me singing "Ooo La La La" and other songs along with countless others who appreciate your musicianship and your dedication to the R&B genre. Rest in peace, Soul Sista.
Thursday, December 23, 2010
Assez, Assez, ASSEZ!
Music producers of today just don't seem to value the art of good instrumentation. They literally combine insipid riffs played from several synthesized instruments (bass #6, piano #22, etc.), throw in a basic drumbeat, and voila--a song has been produced! With such a lazy attitude towards the instrumentation, I won't even bother holding my breath for groundbreaking lyrics that inspire self-reflection and change. That pipe dream died somewhere between India Arie's "Video" and this nonsense.
Back in the late 90's Lauryn Hill told us "MC's ain't ready to take it to the Serengeti." Well, today's MC's seem even less ready than before. But to be honest, at the time that Lauryn opined this sentiment, I wasn't really ready to take it there either--at least, not in the sense of truly listening to her songs' Afrocentric themes and learning from them.
I wasn't ready to hear Lauryn speak about Menelik and the other African czars who kept it "civilized" in "Forgive Them Father." I literally couldn't hear her call to the "bredren" and "sistren", telling us to "come again" and elevate ourselves above the self-destructive, single-minded pursuit of "that thing." Likewise, I wasn't ready to take the journey with Speech to revisit the ghosts of our painful past to understand God's plan for us. I was among those who didn't know what Caron Wheeler meant when she told us to "follow good feeling through and let the superficial pass [us] by, living in the light." And I wasn't ready to chant with Les Nubians at the end of their song "Desolee," crying "Enough! Enough! Enough!" to all the people in developed nations who turn a blind eye to Africans' suffering.
Assez de faire semblant pour celui qui se dit intelligent, ces geants sans sentiments. When Les Nubians first came onto the scene with "Makeda," I was ready for their French answer to the American-born R&B music genre. I was so ready that I spent a year learning French, just so that I could sing songs like "Sourire," "Princesses Nubiennes," "Bebela," and "Desolee" and learn the meaning of lyrics like the one heading this paragraph (which translates roughly to: "Enough pretense on the part of those who call themselves intelligent--giants without feelings").
I was heartened by Les Nubians' attempt to reach out to Americans in the African diaspora and share their stories from an Afro-French perspective. I appreciated their efforts to marry intricate beats and refined instrumentation with inspirational messages of Pan-Africanism, spirituality, and connection with their African roots. But while I was ready to appreciate all of this, I wasn't ready to really hear it. After translating the lyrics to "Desolee" along with "Demain," and comparing them to Les Nubians' only English-language track, "Sugar Cane," on their debut album, Princesses Nubiennes, I could recognize the connections among the different songs' tales of oppression, exploitation, and suffering in Sub-Saharan Africa, but I didn't take it seriously. I thought the subject matter was esoteric and something only people living in Africa (where Les Nubians were partly raised) or Pan-Africanists like Fugee and Soul II Soul alums would harp on. Basically, I was ignorant. I didn't realize that imperialism is alive and well today, even as we transition into 2011.
Embarrassingly, it took a mainstream publication like the New York Times to finally jolt me out of my (willfully?) ignorant stupor. In one of their recent articles, a reporter detailed how Malian farmers are being displaced by foreign investors whose only intentions are to till the land and produce food for their own countries. Raping and pillaging the Motherland is nothing new. I know this. I'm aware of the atrocities surrounding blood diamonds and other "conflict minerals," the rationale for fair trade, and the other casualties of developed countries' hegemonic pursuit of wealth and power. But I still couldn't help but be struck by this article, especially after reading the callous remarks of the Malian officials who said the foreign investors would actually make something useful and productive out of the arable land, despite the fact that most of the spoils would be exported out of the country. These officials had little to say about what would become of all the displaced rural inhabitants and their livelihoods.

Chaos organized by the law and its soldiers. Men and women condemned under oligarchic rule. They design their schemes and sinister plans, already lost before having the chance to effect positive change.
I wasn't ready to hear lyrics like these before, but I'm ready now. I'm ready to hear calls to arms like "Desolee," with its melodious lyrics harmonized over captivating instrumentation that commands your attention long enough for the words to sink in and resonate (sooner or later). I'm ready to groove AND learn what these crooning scholars have to say about the state of the world as they see it. But where are the Nubians, Lauryns, and Speeches of today? Where are the songs that speak on today's stories of struggle and oppression? The refrain in "Desolee" ends with the following sobering truth: "Sorry to break it to you, but the powers that be know how substandard our living conditions are. They won't do anything about it, but they know that we're struggling and we'll always be struggling." Michael Jackson put it more simply by saying, "all I wanna say is that they don't really care about us." That was in 1995. Does anyone have anything to say in 2010?
In Lauryn Hill's song "Superstar," she challenged late-90's recording artists, saying "music is supposed to inspire, so how come we ain't gettin' no higher?" I wonder the same thing. Why aren't any of the mainstream acts trying to infuse any kind of wisdom, spirituality, or cultural pride into their music? It can't be that difficult. I'm not asking them to elevate themselves to the level of Lauryn who "sparred with stars and constellations, then came down for a little conversation." I just want something halfway inspiring. They can even be lazy about it and just slap a modern-day spin on the refrain from Sounds of Blackness' "Optimistic" (the one that goes "You can win as long as you keep your head to the sky." And, no, that rebellion-promoting "Whip My Hair" doesn't count either!).
As with my former, less enlightened state of mind back in the day, today's youth may not be ready to hear the intended messages in the lyrics of "conscious music," but you never know when and how those pearls of wisdom might blossom in the young listeners' minds and what kinds of calls to action they might one day inspire.
I'm so sick and tired of the vapid noise permeating the airwaves today--noise that doesn't elevate us, but debases us and makes us ignorant and soulless. As all of these thoughts run through my mind, the ending refrain of "Desolee" takes on a new meaning:
Assez, assez, assez de souffrance.
Assez, assez, assez de l'ignorance.
Assez, assez, assez de misere.
Assez de pouvoir.
Translated, it means: Enough, enough, enough suffering. Enough, enough, enough ignorance. Enough, enough, enough misery. Enough power.
Yes, to the people in power causing all the suffering and misery in the world and paying these "superstar" entertainers to make empty, meaningless music that distracts us and keeps us ignorant of your organized chaos:
Enough, enough, ENOUGH!
Tuesday, December 21, 2010
Video Games: Programming Executives to Be Followers
It turns out that non-gamers use the parietal cortex--an area specialized for visual-spatial processing--while gamers use the frontal cortex--an area specialized for higher cognitive functions like planning, attention, and multitasking. So basically, since the neural networks in the gamer brain's executive centers are being underutilized from a dearth of real-world strategic planning and complex decision-making, this area is being recruited and re-purposed to handle the quick, reactionary planning and decision-making involved in high-stakes mortal combat. In other words, young American brains are being rewired for virtual military training. And since video gaming apparently narrows the gender gap when it comes to performance on spatial cognition tests, little girls can be heartened to know that their future as military drone operators is just as secure as that of their male counterparts. (In my best Spice Girl accent: Geh-ole pahwah!)
This finding is nothing more than confirmation for the people pulling the strings that video games can be used reliably and successfully to program young, malleable brains and entrain our youth to follow orders in an increasingly desensitized context where they're less capable of questioning what they're doing (in the grand scheme of things) and why they're doing it in the first place. What's worse is that video games, in one study (cited in an NYT article), have been linked to an increased inability to sleep, which could ultimately have a negative effect on a person's ability to develop a sense of self. All the more reason to use video games to ensure that our future drone operators are indoctrinated to conform to the status quo and follow orders!
The study cited in the NYT article suggests that gaming-induced sleep deprivation could lead to an inability to synthesize information, make connections between ideas, and set priorities. So while the NPR segment's producers are hailing video games' ability to "boost brain power," I'm wondering how much this alleged increase in cognition will serve the American citizenry when it's time to make the real-world decisions that matter in the broader society. And since the area designed for executive cognitive processing is being re-purposed to focus on the narrow-minded pursuit of virtual combat, what other brain area(s) will be recruited to handle complex decision-making? Will it be the now underutilized parietal cortex? Some other area? No area at all (and thus no complex decisions being made by the citizenry)? I'm sure the Powers That Be would LOVE that last scenario.
When NPR begins the segment by telling parents to fret not, "video gaming may have real-world benefits for your child's developing brain," you have to wonder who will be at the receiving end of these "real-world benefits"? I already have my sneaking suspicions, but do enough parents have even an inkling of a clue? Hell, do today's parents ever have a clue? Don't bet your civilian life on it. Our next crop of compliant precision fighters is well on its way.
Sunday, December 19, 2010
iSpy

I really wish people would exercise more caution when it came to these devices. Convenience ALWAYS has a cost. If it's cheap in the short run, it's guaranteed to be exorbitant and painful in the long run.
Caveat lector.
Sunday, December 12, 2010
Creative Destruction
Another reason I've avoided writing on this blog is its somewhat forced premise. When I first created this blog, I named it "Tacos and Fries" to reflect an interest in taking information from seemingly unrelated sources and synthesizing it all into a completely new understanding that would offer a unique perspective on a variety of topics. I won't go into the details of why this is a difficult standard to live up to, but suffice it to say that it is very time-consuming and quite limiting. As much as I still prefer writing posts within this framework and will continue to do so, I no longer want to be stifled by the pressure of having to adhere to such a demanding formula all of the time.
So to divest myself of this self-imposed pigeonhole, I'm changing the name of this blog to My Creative Destruction. This phrase from the economics discipline roughly refers to destroying older systems of wealth in order to create newer systems of wealth. Even though the notion reflects all of the ruthless nihilism that makes me despise capitalism--not to mention reminds me A LOT of the "New World Order"--it is still a good analogy for what I need to do on a personal level in order to evolve. I'm very cognizant of the periodic need to break open the shackles of comfort and predictability in order to progress. Now is the time to break free, and I hope to do so with this blog, not just at the present moment, but at any point in the future when it's necessary.
With this name change (and slightly different look) will likely come a different tone and overall purpose for how and why I will write future blog posts. I've decided to retain the previous entries because it will give the reader an honest picture of where I've been (mentally, psychologically, etc.) and how this blog has evolved. I'll probably ask the same questions and make the same observations as before, but with less of the screaming and fist-shaking at the "PTB" that was characteristic of those earlier posts. I don't know where this new direction will lead me, but I suspect it will be an interesting journey nonetheless. Bon appétit.
Sunday, September 19, 2010
The Sweet Deception Continues....
Just wanted to post this video for my own future reference the next time I hear of the Corn Refiners Association's attempt to LITERALLY sugar coat the harmful effects of high fructose corn syrup (HFCS). First they dole out their Sweet Surprise propaganda to downplay HFCS's deleterious effects (which the President of the Association calls "urban legend") by likening the poison to sugar. Now they want to further stress the synonymity by changing the name to "corn sugar."
Well, as Robert Lustig explains in the above YouTube video, "Indeed this is true. High fructose corn syrup and sucrose are exactly the same. They're both equally bad, okay? They're both dangerous. They're both poison." (Surprise!) For anyone interested in seeing how this is so, he provides a very thorough demonstration of his case.
Bottom line: HFCS is bad for you. And hoodwinking the public into believing otherwise by comparing it to sugar (an equally poisonous substance) is disingenuous and disgraceful. What's even more upsetting is the Association's laziness in finding an alternative name. I mean, come on, "corn sugar?" Are people really expected to place higher faith in a product that has the term "sugar" in it? We all know sugar is unhealthy too. Are we supposed to be stupid enough to fall for whatever they're trying to accomplish with this PR move? Apparently, yes.
Sigh....
Please, American consumers. For once, let's prove these corporate assholes wrong! If you start seeing "corn sugar" on the food labels, shun that mess just as quickly as you would "high fructose corn syrup." Your body will thank you for it.
Thursday, August 26, 2010
I Knew that Ebonics B.S. Would Come Back to Bite Us on the Butt

The above article cites an "interesting" assertion that Ebonics is a "combination of English vocabulary with African language." Really? Please explain how. I've heard this explanation of the patois spoken in the English-speaking Caribbean Islands. I never heard this of the English spoken by African-Americans. I could see if this were the early days of slavery, when the captives came fresh off the West African coast (or were raised by those who were) and their native tongues heavily influenced their acquisition and pronunciation of the English language. Today's "Black English" sounds nothing like that of the days of slavery. Heck, it doesn't even sound the way it did in the 1940's, or even the 1970's! Have these people seen "Gone with the Wind?" "Superfly?" Do the Black speech patterns heard in those movies sound anything like what one would hear in a Lil' Wayne or Soulja Boy song? I don't think so. (And thank God! Those songs sound like crap!) The African-American dialect (if it can even be monolithically categorized as such) is an ever-evolving way of speaking. The further we move in time from the days of the Middle Passage, the more tenuous that "combination of English and African" argument grows. So stop contriving some linguistic connection to Africa just to validate an American-bred variety of non-standard English!
But this isn't even what this article is talking about. What they're referring to here is slang. SLANG! They're spending Federal tax dollars on the interpretation of slang! Haven't these people heard of Urban Dictionary? It's free! Why are they wasting Federal funds on this? And why are they treating us like we're foreign speakers? We're Americans! We speak English! Stop treating us like we're "other" and "less than"!
I swear deep in someone's heart is the belief that we shall overcome someday. Can't we make an honest man or woman out of that person? Can we at least give it a try?
Saturday, July 31, 2010
And They Keep Sayin' That We're Free....
I guess I should no longer be surprised at how willing, hell, EAGER people are to just embrace any ole technology that'll make their lives "easier" without realizing the open book they're creating for the Powers That Be. Is this why health care is so expensive and dysfunctional--to drive people to use cheap and readily available devices like this one out of sheer desperation? And a "Band-Aid" of all things! A literal incarnation of what Americans tend to slap on festering problems like our ailing health care system. Can't knock the PTB's sense of irony I guess. (Jackasses....)
Sure these so-called Band-Aids might be convenient and economical, but at what cost? They might even prolong our lives, but what life is worth prolonging when it's being monitored or, worse, controlled? Do these people really think their health is all that will be monitored through these devices? Sadly, yes, so all I can do now is hope that the popularity of these surveillance devices doesn't get so out of hand that the traditional visit to the doctor's office becomes obsolete. Some of us would like to have the choice of avoiding the PTB's monitoring, thank you very much.
So in short, f$#k the PTB!
Or as my girl, Jill Scott, said (or rather sang) more eloquently:
Damn, can I get that democracy
And equality and privacy?
You busy watchin' me, watchin' me
That you're blind, Baby. You neglect to see
The drugs comin' into my community,
Weapons comin' into my community,
Dirty cops in my community.
And you keep sayin' that I'm free.
And you keep sayin' that I'm free.
And you keep sayin' that I'm free, busy watchin' me, AH!
Monday, June 28, 2010
They REALLY Want Us Dumb and Broke, Don't They?

The same article goes on to say that the biggest returns come from professional degrees. Well how the hell are people supposed to get into those professional schools if they've been scared away from the undergraduate degrees those schools require in the first place?
My favorite is when she ends the article saying that most people would fare better investing in stocks rather than getting a college degree. Yes, America, don't try to learn anything. Just throw all your money into the stock market where the "Powers That Be" can manipulate the fundamentals to shift the financial balance of power even further in their favor.
Don't believe the hype, people. Be strategic about your education; don't abandon it altogether. And if college just isn't your thing, get some kind of training or certification in a field that suits your personality and talents. Please don't let the PTB win this game of saturating the media with "reports" that mislead people and economically eviscerate them even further for the benefit of fattening the elite's overflowing pockets.
Friday, June 18, 2010
Hmm... I'll Go with "Step Too Far"

Thursday, June 17, 2010
Let Me Guess, They Won Right?

In spite of my usual apathy towards basketball, I did actually root for the Lakers this time. Ever since that day (some time last week) when my PI told me they were in the finals (or playoffs? whatever, who cares) and they lost later on that night, my interest in their trials and tribulations was piqued--partly because I wanted them to win for her sake. But even more so because of a realization that dawned on me: the Lakers HAVE to win. Otherwise, we have no other source of hope or pride for this city.
L.A. has been the butt of many jokes for quite some time. From New York to San Francisco, people relish looking down on L.A. residents and characterizing us as shallow and fake. Sure we have Hollywood, which is fun and glamorous, but that's the main reason why we're seen as shallow and fake. We may also have good weather, but along with that we have smog, bad traffic, and earthquakes. The only source of pride we could ever hope to find would be in a Lakers championship. Then and only then--for at least a day--people would have to recognize the greatness that is L.A. No one could tell us NOTHING!
Maybe I'm just in a weird mood where my pride needs to feed off of other people's accomplishments. I normally know better, but aye, no one's perfect and everyone has their weaknesses. For whatever reason, I really wanted them to win. (Never mind that I haven't watched a single game.) So when I heard that they did win, at first I felt, well, relieved actually. (After all, they did have a grueling finals/playoffs.) But then my relief soon gave way to fear because the manner in which I learned of their victory wasn't from the news or through word of mouth, but from TRAFFIC!
Between braking for frat brats jaywalking across Gayley chanting "Lakers! Lakers!"; dodging aggressive Crenshaw drivers who sped, honked, and weaved in and out of traffic; and pulling over for police cars and ambulances, sirens blaring, while Laker flags waved in the background--I got all the news I needed. And I was petrified! I had to dip out of Crenshaw traffic several blocks before my usual turn because I was so scared. There were even fireworks at one point in the evening.
I guess all this means everyone is happy? I suppose, but I'll never understand why celebrations have to escalate into mayhem and unrest. But because the championship put me in a good (enough) mood, as long as no serious harm is done I'll give everyone a pass this time. I'll even refrain from side-eyeing this ignorant fool right here (although he's really testing my patience with this (inadvertent?) throwback to Jim Crow.)

-------------------------------------------------------------------------------------------------------------------
Post script: Okay, I take back my pass. Riots are never justified. They weren't justified after the infamous acquittal of police brutality offenders in '92 and they certainly aren't justified now. Damn it, we won! What are we lashing out for?
Welp, I guess people can resume their fun-poking at L.A. With nonsense like this (which seems to happen repeatedly in this city), we're practically begging for it.
Sunday, June 13, 2010
Will Utopian Mirages Trap Shallow, Illiterate Americans into the Singularity's "Net"?

If you ask me, there are two problems with Americans today. One is that we don't read anymore. And this could pose a serious problem in the Digital Age, according to psychologist Maryanne Wolf, who fears if we don't first learn to read, "we may not form a brain...accustomed to automatic inference and critical thinking before the immersion in the digital world." Consequently, Wolf believes we need to deliberately form a reading brain that can transfer all its inferential skills to the digital environment where it can synthesize knowledge within an online context. Wolf goes on to quote Joseph Epstein who said "We are what we read," and argues that what our children read online is different from what was a part of our formation as readers in society. "Our brains have been trained to be critical minds that bring to bear everything that we know and everything that we've read.... Part of our formation was to probe and think deeply through deep reading. Our children may become great decoders of information, but they may not be set to think deeply and go below the surface, in part, because of the way that the presentation exists online."
All this talk of delving below the surface brings me to Americans' second problem: we're becoming tragically shallow. And that's a problem, if this study is any indication, because our collective well-being may suffer for it. What gives me the impression that we're becoming shallow? Well, playing with the numbers from the 1988 and 1998 Myers-Briggs Type Indicator (MBTI) tests, Americans are increasingly preferring action before reflection; desiring breadth over depth; making copious acquaintances rather than fewer substantive relationships; becoming less able to recognize patterns; leaving facts at face-value rather than integrating abstract constructs into the wider context; and making decisions based not on logic and reason but on what achieves the greatest balance among all stakeholders.
Comparing the results from the 1998 MBTI survey with those of its 1988 and 1991 predecessors, Americans are becoming less introverted (-3.0%) and more extroverted (+3.0%), less intuitive (-5.2%) and more sensing (+5.2%), less thinking (-12.7%) and more feeling (+12.7%), and less judging (-4.0%) and more perceptive (+4.0%). Specifically, we're increasingly acting before thinking; we're oriented more towards people and objects than concepts and ideas; we desire breadth over depth, even in our social interactions; we distrust intuition and abstraction and prefer concrete information; and we find meaning in the data itself rather than how the data relates to patterns and theories. We're also increasingly likely to keep decisions open and we make decisions less by following a given set of rules and more by empathizing with the situation and considering the needs of everyone involved in order to achieve the greatest harmony among those affected.
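For anyone who wants to eyeball those shifts in one place, here's a minimal Python sketch. The numbers are just the percentage-point changes quoted above from the MBTI survey comparison, and the pairings assume each MBTI dichotomy is complementary (one pole's gain mirrors the other's loss):

```python
# Percentage-point shifts in MBTI preferences quoted above
# (1998 survey vs. its 1988/1991 predecessors). Each key pairs the
# waning pole with the waxing pole of one dichotomy.
shifts = {
    ("Introversion", "Extroversion"): (-3.0, +3.0),
    ("Intuition", "Sensing"): (-5.2, +5.2),
    ("Thinking", "Feeling"): (-12.7, +12.7),
    ("Judging", "Perceiving"): (-4.0, +4.0),
}

for (waning, waxing), (loss, gain) in shifts.items():
    # Sanity check: complementary poles must sum to zero,
    # since every respondent lands on one side or the other.
    assert abs(loss + gain) < 1e-9
    print(f"{waxing} up {gain:+.1f} points at {waning}'s expense")

# The Thinking-to-Feeling swing dwarfs the other three.
largest = max(shifts, key=lambda pair: shifts[pair][1])
print("Largest swing:", largest)  # ('Thinking', 'Feeling')
```

As the tally shows, the 12.7-point Thinking-to-Feeling swing is more than double any of the other shifts, which is why the move away from logic-driven decision-making stands out in the paragraphs above.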
Looking at the personality types associated with famous people, one sees that with this increase in extroverted, sensing, feeling, and perceptive (ESFP) types, we're gaining more entertainers like Michael Jackson, Elvis Presley, Bill Cosby, and John Travolta. We're also seeing more charismatic politicians who energize their bases but repel their political opponents like Bill Clinton and Ronald Reagan. We're fortunately seeing more selfless humanitarians like Mother Teresa, but we're also seeing increasing superficiality with a surge in iconic beauties like Jacqueline Kennedy Onassis, Marilyn Monroe, and Brooke Shields.
With less introversion, intuition, thinking, and judgment (INTJ), we're losing influential historical leaders like George Washington, Thomas Jefferson, Abraham Lincoln, Alexander the Great, and Julius Caesar. We're seeing fewer non-violent human rights activists like Gandhi, Martin Luther King, and Nelson Mandela. We're witnessing the emergence of fewer scientists and other great thinkers like Charles Darwin, Albert Einstein, Isaac Newton, and Plato, and fewer inventors and innovators like Thomas Edison and Walt Disney.
In short, we're seeing more entertainment, charisma, physical beauty, and political polarization, and less foundational leadership, scientific discovery, and innovation. Interestingly, we're seeing fewer inspirational leaders who espouse ideas that compel us to improve the world, which may be less important with an increase in humanitarians. But perhaps all this charity work would be unnecessary if the nation weren't becoming a place where problems repeat themselves because patterns aren't recognized; solutions are arrived at in the most shallow and least analytical manner possible; and societal ills go largely unresolved as political polarization worsens, innovation decreases, and mindless entertainment placates the woes of superficial, social butterflies who act before they think. Our only saving grace is that more decisions will be made through consensus-building--that is, if they're made at all, which is less likely with the more indecisive ESFP types.
So what's causing this shift in personalities? Some believe the preference for breadth over depth and factoids over integrated knowledge is the hallmark of a postmodern society. But if so, what's driving this postmodern trend? A child psychologist by the name of Alison Gopnik believes it may be the increase in the time spent in the education system which arrests people's psychological development and keeps them in a state of expert fact-gathering and skillful memorization at the expense of integration of knowledge and decision-making.
Another perspective comes from Ray Bradbury, author of Fahrenheit 451 (published in 1953), who names television as the culprit, believing it causes Americans to lose interest in literature and to favor factoids over integration of information (e.g. knowing Napoleon's birthday without knowing his significance in history). If this line of reasoning is true, then the problem can only worsen with the increase in both the variety and frequency of media consumption today. We're already seeing more extroverts directing their energies outward toward people (idolizing and modeling after celebrities; social networking; etc.) and objects (iPhones, iPads, clothes, and other material items). Today's technology may further drive extroverts to be more action-oriented (participating in reality shows; posting their own YouTube videos; tweeting; and, yes, blogging); to desire breadth (Googling for factoids and multitasking); and to prefer more frequent social interactions (texting, IM chatting, and "friending" hundreds of casual acquaintances).
I'd love to see what the MBTI stats look like today. It would be interesting to see if the explosion of Web 2.0 technologies has yielded more shallow personality types. A recent Frontline documentary called "Digital Nation" may provide a glimpse into the Web's influence. One notable trend was the infantilization of adults and their detachment from reality as they increasingly engage in virtual worlds (gaming, Second Life, etc.). The other--perhaps less surprising--trend was the alarming increase in multitasking among children and adolescents. As with most debates on multitasking, this documentary featured the usual baby-boomer technological cheerleaders glibly accusing their nay-saying counterparts of being "old fogies" and technophobic Luddites.
As a Gen-X'er observing from the sidelines, I must say I share the concerns of those "Luddites." I've always had my qualms about multitasking because it encourages cursory attention to information, making it virtually impossible to go deeper and find meaning in the disparate sources of information. Sharing these concerns, author Nicholas Carr fears that the Internet with all its distractions and interruptions may turn us into "scattered and superficial thinkers." He noted the studies of developmental psychologist Patricia Greenfield who concluded that while certain computer tasks may enhance visual-spatial intelligence, this may be at the expense of losing "higher-order cognitive processes," such as "abstract vocabulary, mindfulness, reflection, inductive problem solving, critical thinking, and imagination."
Carr argues that the multitasking Internet environment deprives us of the chance to devote the requisite amount of attention necessary for associating new pieces of information with previously acquired and established knowledge so that we can master complex concepts. According to Carr, with all the Internet's distractions, "our brains are unable to forge the strong and expansive neural connections that give depth and distinctiveness to our thinking. We become mere signal-processing units, quickly shepherding disjointed bits of information into and then out of short-term memory."
Carr goes on to explain how book reading protects against this, saying that "reading a long sequence of pages helps us develop a rare kind of mental discipline" that gives us the "capacity to engage in the quieter, attentive modes of thought that underpin contemplation, reflection and introspection." Technophile Clay Shirky disagrees, countering that reading is "an unnatural act; [for] we are no more evolved to read books than we are to use computers." Indeed, Carr himself argues that our "innate bias is to be distracted. Our [evolutionary] predisposition is to be aware of as much of what's going on around us as possible" in order to reduce the likelihood of being attacked by a predator or overlooking an important food source. Deep reading helps train us to sustain our attention towards one task, which allows time for integration of ideas and big-picture thinking.
But unlike Carr who urges that we continue teaching this discipline of deep, critical reading, Shirky argues that "literate societies become literate by investing extraordinary resources, every year, training children to read. Now it's our turn to figure out what response we need to shape our use of digital tools.... We are now witnessing the rapid stress of older institutions accompanied by the slow and fitful development of cultural alternatives. Just as required education was a response to print, using the Internet well will require new cultural institutions as well, not just new technologies." (Typical sanguine libertarian drivel BTW. Just look at the source of the article: WSJ, where blind faith in markets, technology, and being a lemming are par for the course.)
So is that why people like Shirky (gotta love that name) harbor such an aversion to funding education? Because we need alternatives that educate children in a way that caters to their technological sensibilities? Well, one thing's for sure: Shirky certainly believes we shouldn't fear the future but embrace it. For, as he says, "we are living through [an] explosion of publishing capability today, where digital media link over a billion people into the same network. This linking together in turn lets us trap our cognitive surplus, the trillion hours a year of free time the educated population of the planet has to spend doing things they care about. In the 20th century, the bulk of that time was spent watching television, but our cognitive surplus is so enormous that diverting even a tiny fraction of time from consumption to participation can create enormous positive effects."
Hmm... That "cognitive surplus" idea sounds suspiciously like the Singularity which a recent NYT article defines as "a time, possibly just a couple decades from now, when a superior intelligence will dominate and life will take on an altered form" where "human beings and machines will so effortlessly and elegantly merge that poor health, the ravage of old age and even death itself will all be things of the past." So basically, it's a merging of humans and machines to create a super intelligence, or in Shirky's words, a "cognitive surplus," where people throughout the world can link their minds together through technology in a way that will improve the human condition. This is not as far-fetched as one might think if Carr's warning is correct and our brains are already being reduced to automatic "signal-processing units" that only find meaning in decoding the data itself rather than how the data relates to patterns and theories. Heck, we're already meeting the computers half way if that's the case!
Such automated thinking, however, may be our undoing if Daniel Pink's assertions are true. Pink, the author of A Whole New Mind, gave a recent lecture where he argued that rules-based, routinized work that can be automated (tax accounting, processing uncontested divorce papers, etc.) is already being made obsolete by the likes of programs like TurboTax and 123DivorceMe.com, and it will only get worse with time. One way to protect our viability is to tap into our right brain capabilities and develop qualities like inventiveness, empathy, and meaning in order to produce work that will be difficult to automate. Left-brain thinking is analytical, sequential, and logical. Unlike the left brain, the right brain processes everything at once, works with the context of the data (rather than the data in and of itself), and synthesizes (rather than merely analyzing) the data. He argued that machines replaced our muscles by taking over our manufacturing jobs during the last century. This century machines will replace our brains--not our empathetic, creative, big-picture-thinking right side, but our rules-based, logical left side.
If those MBTI stats are correct, then our one ray of hope lies in our increasing ability to empathize. But when it comes to addressing the "creative" and "big-picture thinking" parts of the puzzle, we have a lot of work to do. Carr would argue that deep reading could help develop many of those necessary qualities espoused by Pink, including increasing our creativity. But as I said at the outset, Americans aren't reading anymore, so I guess a lot of us will be out of work if Pink's forecasting is correct.
If children are to meet the needs of the 21st century workforce by developing creative minds that can integrate knowledge and engage in big-picture thinking, how are we to educate them? Should reading for deep, critical thinking still be stressed? I believe it should. But should we also transfer this deep-thinking approach from the print environment to the online realm? Most definitely! Maybe this is what Shirky meant when he said "using the Internet well will require new cultural institutions." If so, then what institutions, and how should they address the problem?
As of yet, I've heard no real answers to this conundrum, and this lack of real solutions is where I feel the agenda truly lies. My fear is that there won't be any attempt to address this problem because they WANT us to be shallow, noncritical thinkers. (And this, by the way, is the true reason behind the declining support for education, libraries, and literacy programs.) They want our brains to be reduced to shallow information processing centers, not only so that we won't question authority or possess the mental acumen for independence and self-determination, but also so that we'll--voluntarily or otherwise--meld our brainpower with that of computer technology. Why? Who knows? To fulfill some "Matrix" or Brave New World type of prophecy? Population surveillance? Subjugation? Universal mind control? You name it. But the glamorization of robots and cyborgs is undeniably omnipresent in pop culture from Beyonce and Christina Aguilera to autotune and the Transformers franchise. We're clearly being instructed to embrace this inevitability, and it just might work if the claims in this article and this news segment are any indication.
Interestingly, both WSJ articles (Carr's and Shirky's) contain at least one occurrence of the word "Net" rather than "Internet," which piqued my interest. This abbreviated form is less commonly used than the fuller form, so my increasingly paranoid mind couldn't help but notice and read into it as a deliberate choice of words. Nets entrap their intended victims, and I can't help but feel that the deliberate neglect of education and literacy needs coupled with the concerted universalization of technology in the name of attaining the Singularity is a trap. So I'll just wait it out on the sidelines with my fellow "cave-dwellers" and proceed with caution when it comes to blindly adopting "utopian" technologies. And all those technophiles pointing their fingers and sneering at us Luddites can go ahead and continue their dismissive derision. I'll happily let them be the guinea pigs and join in much later when the coast is indisputably clear because my fear is that nothing but bad will ultimately come of all of this. But believe me when I say I hope more than anything that I'm wrong. Unfortunately, I don't think I am....
Sunday, June 6, 2010
Trust Me, "Enuf" Is not Enough


Silly protesters. :-) Surely they jest, yes? No? Oh. Well, let me take this opportunity to school them on how misguided and absurd their efforts are. I mean don’t get me wrong, I’ve had many a vexation with the English language and its so-called "spelling system." Any language that would have “colonel” spelled and pronounced the way it does is asking for a picket sign or two. I vividly recall sympathizing with Ricky Ricardo’s confusion when he read and incorrectly pronounced various words ending in “-ough.” Along with the audience, I too laughed in acknowledgment of how ridiculous and unpredictable English spelling can be. That said, come on people! Did it ever occur to you that protesting a children's spelling contest might seem a bit drastic (not to mention futile)?
Look, I understand the concern that the current system makes reading difficult for many struggling children. But I’ve also witnessed my own spelling skills improve the more I read and write. In particular, the more literary-style pieces I read, the greater my exposure to complex words (and their spelling). And whenever I write a complex composition, my orthographic mistakes reveal how strong or weak my spelling skills are. So a lot of it is just a matter of reading and writing habits. With the exception of a few people with learning disabilities or other impediments, most children will eventually gain proficiency in reading and writing in English, as long as they practice it.
I say keep it complicated and unpredictable because you can tell a lot about an adult's reading habits by the way they write. Dumbing down English spelling conventions would only remove an obvious red flag that could reveal a lot about the way a person spends their leisure time, their level of intellectual curiosity, and their attention to detail. I say don't mess with a built-in warning system when you've got one.
Other people have come up with some great arguments against overhauling the English spelling system, the most important being: which spelling convention would we use? One that addresses the American pronunciation (which, for instance, vocalizes the terminal "r")? The British one (which omits the terminal "r")? Some other English-speaking country? The two groups protesting the spelling bee come from the U.S. and the UK. Did they even think this through before charging towards D.C. with their construction paper and magic markers?
Another (perhaps less convincing) argument is the rich history embedded within English spelling that would be lost if everything were normalized. My favorite "colonel" is a good example, reflecting two different sources of influence (one Latin, the other Norman) which created two different pronunciations and two different spellings. For whatever reason we've adopted the mix-n-match spelling and pronunciation that we know and love (to hate) today. While I wouldn't necessarily use this as justification to retain the spelling convention for this word, other differences (e.g. "great" vs. "grate"), while not always predictable, may reveal a bit about the evolution of the English language and the regional history of its speakers.
Still not convinced? Fine. Let me proceed with the following two arguments:
Exhibit A
Let's pretend the spelling conventions have been normalized and all spelling is phonetic and predictable. Consider the following sentence:
Thayr wundering if thayr kernol ate ate kernols uv korn over thayr.
Okay wait. I'm sorry. There's just TOO much fugly going on in that little bit of sentence right there. Why don't we forget the phonetics for now and just focus on predictability and consistency? Let's consider the following:
There wondering if there kernel ate ate kernels of corn over there.
Now with effort and logic ["there" standing in for a contraction of "they are" vs. a possessive pronoun vs. an adverb of place; "ate" as a verb vs. a number; people eat food, not the other way around (usually)] one could deduce the meaning of this sentence. But how much more quickly would they have understood it if they had simply read:
They're wondering if their colonel ate eight kernels of corn over there.
If we changed the spelling conventions just for the sake of consistency, we'd have a lot of similarly spelled homonyms to contend with, some of which occupy the same parts of speech. ("Do we have time/thyme?" "I need/knead dough for a living." "Those are some phat beats/fat beets!") It's true that context could provide the necessary clarification for some of these (admittedly corny) sentences. (After all, we seem to have little difficulty understanding them when heard in verbal dialogue.) But with inferential reading skills on the decline, even among college graduates, I wouldn't push the need to rely on context too heavily.
What we'd probably need to do is eliminate many of these words in favor of their more unique sounding synonyms (or make up new words if no synonyms exist). Do we really want to do this? If these people can't be bothered to learn (or teach) the current spelling conventions, do they honestly think they'll be willing and able to adopt not only a new spelling system but also a new vocabulary? Be careful what you wish for. It sounds to me like a lot more work than people like this would care to deal with.
Exhibit B
I read the urban (read: "black") blogs frequently and I'm beginning to see a correlation between people's value judgments and their spelling (as well as grammatical) errors. Case in point, there was a recent blog discussion about Ciara's "Ride" video, which got banned from BET for being too explicit. Many of the commenters thought BET was justified in pulling the video, while others thought the network was being hypocritical considering the garbage it usually airs. Most of the comments had passable-to-good spelling and grammar, but a few fell short. The following was the worst of them all:
ITZ RLLY MESSED UP BECAUSE ITZ NOT DAT ABD AT ALL. SHE DID HER THING AND SHE LOOKS FANTASTIC!!!! SMH HATERZ.....
Basically, this person is complaining about other people's gripes with yet another unnecessary display of brazen sexuality that would further contribute to the already heavily entrenched media-influenced hypersexual behavior of young black people today. How did this person express this point? By yelling in all caps [which obviates the need to observe capitalization rules and press the SHIFT key every once in a while (suggesting laziness and/or impatience)]; using the word "DAT" (because it's on an "urban" blog?); employing creative spelling conventions with the substitution of "Z" for "S" (just because?); and throwing in a bonus "HATERZ" (just for good measure). Believe it or not, I peruse the comments sections of these blogs for the unique perspectives, the insightful dialogue, and the witty repartee I often find therein. This pointless comment is not what I come for.
Another case in point comes from a different "urban" blog where a post mentioned that someone by the name of "Birdman" (yeah I never heard of him either) bought Lil' Wayne (a.k.a. Weezy) a $1,000,000 watch and a $200,000 diamond encrusted cake for his birthday. Many of the commenters felt this was excessive and wasteful. Take a wild guess at what the worst "writer" had to say. (Read it and weep.)
aa wat happen 2 sum people the man wanted 2 buy sum ting 4 his son wats wrong wit dat uh.birman u write 2 buy the million dollar wat 4 weezy.
*Shuddering violently here* At least the other comment didn't take as much effort to understand. With this one here, I'm frankly surprised at how many words were spelled correctly, given the person's blatant misspelling and conflation of the words "watch" and "what" and their confusing use of "write" in lieu of "right." More importantly, what this person has in common with the "Ciara stan" is something I see all too often on these blogs: the most gratuitous spelling, grammatical, and capitalization errors come from the loudest, most vehement defenders of gratuitous celebrity behavior, whether it's vulgar sexuality, shameless brandishing of wealth, or some other instance of an eroding value system being shoved down the black community's throat by the media. It's gotten to the point where absolving (if not outright praising) degenerate behavior seems to go hand in hand with bad writing. I can pretty much predict the level of insight (and scruples) that can be found in a blog's comments section based solely on the spelling errors alone. If more than 10% of the comments have those errors, I don't bother with the blog. (Looking at you, MTO!)
It's because of these experiences that I harbor a healthy dose of skepticism towards these spelling bee protesters. Contrary to what they proclaim, "Enuf" is not enough. (I mean, really! At least add another "f" to the ending. It may not be necessary phonetically, but leaving it at just one "f" would be a crime against humanity--not to mention humanity's eyes!) People with the most egregious spelling errors often defend the most debauched and unproductive behavior and use the angriest and most simple-minded arguments to express themselves. Catering to this crowd would only hasten our spiral downwards towards a society of increasingly impulsive, aggressive, and depraved simpletons who never question anything and shout down anyone else who does. Maybe I'm positing more negative implications than would actually transpire if we made these spelling changes, but do we really want to take that risk? Trust me, for more reasons than we dare fathom, "enough" is definitely enough. So leave the Bee be!
Sunday, May 16, 2010
Pay for Play not Dismay = Death of the Fourth Estate

NOW is just the latest casualty in an increasingly beleaguered journalism industry. While its story is not unique--I recently lamented with a colleague over how small the LA Times paper had become--there was still something special about NOW's approach to journalism that, for me, will leave a huge void in its absence.
The press is often referred to as the "fourth branch of government" or the "fourth estate." Its perceived role as the fourth branch of government stems from the belief that its responsibility to inform the public is crucial to a functioning democracy. The notion of it being the "fourth estate" emphasizes its independence from the three official branches of government. With its lack of critical reporting during the run-up to the Iraq War, we've seen the press act very much like a neglectful fourth branch of government, parroting whatever the White House press secretary chose to answer during his controlled press conferences. The country could have benefited greatly from a "fourth estate" scrutinizing the government's actions and exposing the manipulation of facts that led to the general acceptance of its actions.
That the press should act as the fourth estate seemed to be the ethos underlying NOW's brand of reporting. NOW quickly gained a reputation for focusing their critical eye on the powers that be, exposing government corruption and politicians' neglect of their responsibility to serve We the People. NOW unflinchingly called out corporations' collusion with the government in "Trivializing Corruption" and revealed the self-serving practices of corporations that fund ballot initiatives in "Taking the Initiative." They reported on laws that deliberately marginalized underserved segments of the electorate and explained how our national subsidies often contribute to the poverty of developing nations. I believe it was NOW that first taught me how the rising cost of health care affects jobs and the economy and that military officials are allowed to foist their recruitment efforts onto schoolchildren because of a provision in No Child Left Behind. They even took a critical look at themselves (as members of the media) in questioning the press' neglect in asking the tough questions during the months leading up to the Iraq War and in discussing media reform in general.
To be sure, NOW wasn't the only purveyor of whistle-blowing investigative journalism. But they did have a way of breaking down the details of a story: providing the background, historical context, and political and financial motivations behind an initiative as well as the conflicts of interest surrounding the perpetrators profiting from passing or blocking the initiative. They even seemed to exercise journalistic integrity by revealing their own potential conflicts of interest, disclosing if an interviewee was affiliated with PBS or one of its funding sources. Now it seems we only have the likes of Frontline to look to for such in-depth reporting, and even they sometimes seem to be headed for trouble with their infrequent reporting. (For that matter, NPR's website often seems to be featuring more blog posts and fewer audio pieces. I hope that's not an omen. Sigh...)
As NOW's original host Bill Moyers described: "Americans are saturated with events in the headlines, but in this pounding news cycle it is hard to grasp the bigger picture and the larger forces driving daily developments. NOW will report on the reality behind and beyond news-making events." Unfortunately, with NOW's demise, Americans will have one less news source to distill the behind-the-scenes details of the events that affect our lives. With the industry-wide struggle of news organizations in general, the onus increasingly falls upon us to do the in-depth investigation that gives us the nuanced understanding of our world.
Do I have faith that we'll pick up the slack? Well, maybe if I were convinced that people had the desire to learn about their world I could muster up a modicum of optimism. Unfortunately, if a glimmer of hope ever emerges, it gets immediately crushed by statements such as the following from the comments section of an On Point program discussing a virtual game called "World of Warcraft":

Sigh.... Jim, while it's true you don't get to interact with people while listening to the news, you still benefit by learning about the world you're actually living in! And instead of focusing all your energy on some contrived fairy tale land solving nonexistent problems, you can learn about the problems that exist in the real world and--oh I don't know--solve them instead!
Along with this gaming-is-more-interactive-than-TV-and-the-news argument were a lot of comments aimed at dispelling the gaming-people-are-nerds myth or that viciously defended this lifestyle by calling the naysayers "judgmental hypocrites" who likely waste their own time watching T.V. or reading fiction. Other comments centered on the benefits of communicating and "working" with people all over the world. Not one of the game's defenders addressed the comments that questioned the amount of energy and electricity wasted while playing these games or those that remarked on how improved the world would be if all the time and money spent on gaming were harnessed into eliminating society's ills. Why were these questions ignored? Because--at the risk of generalizing--these are the kinds of people who largely don't care about the real world and they don't want to acknowledge or address the real suffering going on in it. The following comment sums it all up:

To avoid my usual digressions, I'll ignore the anti-"it takes a village to raise a child" sentiment of this argument and focus on the "it makes me sick to see what this world has become" statement. This bury-your-head-in-the-sand mentality is the main reason why any grown person would ever want to spend hours a day, every day, immersed in gaming, virtual reality, or any other form of mindless entertainment. Unfortunately, recreational activities are on the rise in every demographic in America, so much so that there are now college degrees for game design and development. This, taken together with the decline in newspaper subscriptions and public radio contributions, suggests that more people would rather pay to bury their heads in the sand than consume news that might upset them even as it informs them. In the end, this "see no evil, hear no evil" attitude is the wrecking ball that is slowly demolishing the fourth estate. And the demise of the fourth estate, in turn, can only mean dire consequences for democracy itself. For in the words of Thomas Jefferson: If a nation expects to be ignorant and free... it expects what never was and never will be.