Monday, June 28, 2010

They REALLY Want Us Dumb and Broke, Don't They?

Wow. Yet another article discouraging the pursuit of higher education. Apparently college graduates will only out-earn their high-school graduate counterparts by a mere $400,000 rather than the million-dollar estimate previously reported. I'm sorry, but since when is $400,000 considered "paltry"?

The same article goes on to say that the biggest returns come from professional degrees. Well how the hell are people supposed to get into those professional schools if minions of the “Powers That Be” in the media, like this author, keep touting the worthlessness or intellectual inaccessibility of undergraduate education? (If I have to read one more headline asking if college is right for everyone I'm going to scream! Why don't they just come right out and say what they really mean?: "Idiots are lost causes, so don't waste your time (or ours) trying to achieve something that will never happen--learning.") Not once did she mention 2-year colleges, which would significantly reduce the price tag of a bachelor's degree. How would that factor into PayScale's ROI estimates?

My favorite is when she ends the article saying that most people would fare better investing in stocks rather than getting a college degree. Yes America, don't try to learn anything. Just throw all your money into the stock market where the "Powers That Be" can manipulate the fundamentals to shift the financial balance of power even further in their favor.

Don't believe the hype, people. Be strategic about your education; don't abandon it altogether. And if college just isn't your thing, get some kind of training or certification in a field that suits your personality and talents. Please don't let the PTB win this game of saturating the media with "reports" that mislead people and economically eviscerate them even further, all to fatten the elite's overflowing pockets.

Friday, June 18, 2010

Hmm... I'll Go with "Step Too Far"

Just saw this interesting bit of news several minutes ago. I haven't read all the arguments for or against it so I don't know all the details, but my gut reaction is: this doesn't look good. Hope I'm wrong....

Thursday, June 17, 2010

Let Me Guess, They Won, Right?

I guess congratulations are in order for the Lakers who won their 16th title. (At least, I think it's their 16th title. I wouldn't know; I don't follow sports. (Well, I did religiously follow the '92 and '96 US Women's Gymnastics teams, but since some condescending, sexist people don't consider that a sport, I'll just concede to being ignorant of all things athletic, for simplicity's sake.))

In spite of my usual apathy towards basketball, I did actually root for the Lakers this time. Ever since that day (some time last week) when my PI told me they were in the finals (or playoffs? whatever, who cares) and they lost later on that night, my interest in their trials and tribulations was piqued--partly because I wanted them to win for her sake. But even more so because of a realization that dawned on me: the Lakers HAVE to win. Otherwise, we have no other source of hope or pride for this city.

L.A. has been the butt of many jokes for quite some time. From New York to San Francisco, people relish looking down on L.A. residents and characterizing us as shallow and fake. Sure we have Hollywood, which is fun and glamorous, but that's the main reason why we're seen as shallow and fake. We may also have good weather, but along with that we have smog, bad traffic, and earthquakes. The only source of pride we could ever hope to find would be in a Lakers championship. Then and only then--for at least a day--would people have to recognize the greatness that is L.A. No one could tell us NOTHING!

Maybe I'm just in a weird mood where my pride needs to feed off of other people's accomplishments. I normally know better, but hey, no one's perfect and everyone has their weaknesses. For whatever reason, I really wanted them to win. (Never mind that I haven't watched a single game.) So when I heard that they did win, at first I felt, well, relieved actually. (After all, they did have a grueling finals/playoffs.) But then my relief soon gave way to fear because the manner in which I learned of their victory wasn't from the news or through word of mouth, but from TRAFFIC!

Between braking for frat brats jaywalking across Gayley chanting "Lakers! Lakers!"; dodging aggressive Crenshaw drivers who sped, honked, and weaved in and out of traffic; and pulling over for wailing police cars and ambulances while Laker flags waved in the background--I got all the news I needed. And I was petrified! I had to dip out of Crenshaw traffic several blocks before my usual turn because I was so scared. There were even fireworks at one point in the evening.

I guess all this means everyone is happy? I suppose, but I'll never understand why celebrations have to escalate into mayhem and unrest. But because the championship put me in a good (enough) mood, as long as no serious harm is done I'll give everyone a pass this time. I'll even refrain from side-eyeing this ignorant fool right here (although he's really testing my patience with this (inadvertent?) throwback to Jim Crow.)

Everyone gets a pass because with state and local budgets shrinking, government employees getting laid off in droves, and, frankly, nothing else exciting going on here, this Lakers win is all we've got going for us. Clearly!

-------------------------------------------------------------------------------------------------------------------

Post script: Okay, I take back my pass. Riots are never justified. They weren't justified for the infamous acquittal of police brutality offenders in '92 and they certainly aren't justified now. Damn it, we won! What are we lashing out for?

Welp, I guess people can resume their fun-poking at L.A. With nonsense like this (which seems to happen repeatedly in this city), we're practically begging for it.

Sunday, June 13, 2010

Will Utopian Mirages Trap Shallow, Illiterate Americans into the Singularity's "Net"?

I know no one's asking, but...

If you ask me, there are two problems with Americans today. One is that we don't read anymore. And this could pose a serious problem in the Digital Age, according to psychologist Maryanne Wolf, who fears that if we don't first learn to read, "we may not form a brain...accustomed to automatic inference and critical thinking before the immersion in the digital world." Consequently, Wolf believes we need to deliberately form a reading brain that can transfer all its inferential skills to the digital environment where it can synthesize knowledge within an online context. Wolf goes on to quote Joseph Epstein, who said "We are what we read," and argues that what our children read online is different from what was a part of our formation as readers in society. "Our brains have been trained to be critical minds that bring to bear everything that we know and everything that we've read.... Part of our formation was to probe and think deeply through deep reading. Our children may become great decoders of information, but they may not be set to think deeply and go below the surface, in part, because of the way that the presentation exists online."

All this talk of delving below the surface brings me to Americans' second problem: we're becoming tragically shallow. And that's a problem, if this study is any indication, because our collective well-being may suffer for it. What gives me the impression that we're becoming shallow? Well, playing with the numbers from the 1988 and 1998 Myers-Briggs Type Indicator (MBTI) tests, I see that Americans increasingly prefer action before reflection; desire breadth over depth; make copious acquaintances rather than fewer, more substantive relationships; are becoming less able to recognize patterns; take facts at face value rather than integrating abstract constructs into the wider context; and make decisions based not on logic and reason but on what achieves the greatest balance among all stakeholders.

Comparing the results from the 1998 MBTI survey with those of its 1988 and 1991 predecessors, Americans are becoming less introverted (-3.0%) and more extroverted (+3.0%), less intuitive (-5.2%) and more sensing (+5.2%), less thinking (-12.7%) and more feeling (+12.7%), and less judging (-4.0%) and more perceiving (+4.0%). Specifically, we're increasingly acting before thinking; we're oriented more towards people and objects than concepts and ideas; we desire breadth over depth, even in our social interactions; we distrust intuition and abstraction and prefer concrete information; and we find meaning in the data itself rather than how the data relates to patterns and theories. We're also increasingly likely to keep decisions open and we make decisions less by following a given set of rules and more by empathizing with the situation and considering the needs of everyone involved in order to achieve the greatest harmony among those affected.

Looking at the personality types associated with famous people, one sees that with this increase in extroverted, sensing, feeling, and perceiving (ESFP) types, we're gaining more entertainers like Michael Jackson, Elvis Presley, Bill Cosby, and John Travolta. We're also seeing more charismatic politicians who energize their bases but repel their political opponents like Bill Clinton and Ronald Reagan. We're fortunately seeing more selfless humanitarians like Mother Teresa, but we're also seeing increasing superficiality with a surge in iconic beauties like Jacqueline Kennedy Onassis, Marilyn Monroe, and Brooke Shields.

With less introversion, intuition, thinking, and judgment (INTJ), we're losing influential historical leaders like George Washington, Thomas Jefferson, Abraham Lincoln, Alexander the Great, and Julius Caesar. We're seeing fewer non-violent human rights activists like Gandhi, Martin Luther King, and Nelson Mandela. We're witnessing the emergence of fewer scientists and other great thinkers like Charles Darwin, Albert Einstein, Isaac Newton, and Plato, and fewer inventors and innovators like Thomas Edison and Walt Disney.

In short, we're seeing more entertainment, charisma, physical beauty, and political polarization, and less foundational leadership, scientific discovery, and innovation. Interestingly, we're seeing fewer inspirational leaders who espouse ideas that compel us to improve the world, which may be less important with an increase in humanitarians. But perhaps all this charity work would be unnecessary if the nation weren't becoming a place where problems repeat themselves because patterns aren't recognized; solutions are arrived at in the most shallow and least analytical manner possible; and societal ills go largely unresolved as political polarization worsens, innovation decreases, and mindless entertainment placates the woes of superficial, social butterflies who act before they think. Our only saving grace is that more decisions will be made through consensus-building--that is, if they're made at all, which is less likely with the more indecisive ESFP types.

So what's causing this shift in personalities? Some believe the preference for breadth over depth and factoids over integrated knowledge is the hallmark of a postmodern society. But if so, what's driving this postmodern trend? A child psychologist by the name of Alison Gopnik believes it may be the increase in time spent in the education system, which arrests people's psychological development and keeps them in a state of expert fact-gathering and skillful memorization at the expense of the integration of knowledge and decision-making.

Another perspective comes from Ray Bradbury, author of Fahrenheit 451 (published in 1953), who names television as the culprit, believing it causes Americans to lose interest in literature and to favor factoids over integration of information (e.g. knowing Napoleon's birthday without knowing his significance in history). If this line of reasoning is true, then the problem can only worsen with the increase in both type and frequency of media consumption today. We're already seeing more extroverts directing their energies outward toward people (idolizing and modeling after celebrities; social networking; etc.) and objects (iPhones, iPads, clothes, and other material items). Today's technology may further drive extroverts to be more action-oriented (participating in reality shows; posting their own YouTube videos; tweeting; and, yes, blogging); to desire breadth (Googling for factoids and multitasking); and to prefer more frequent social interactions (texting, IM chatting, and "friending" hundreds of casual acquaintances).

I'd love to see what the MBTI stats look like today. It would be interesting to see if the explosion of Web 2.0 technologies has yielded more shallow personality types. A recent Frontline documentary called "Digital Nation" may provide a glimpse into the Web's influence. One notable trend was the infantilization of adults and their detachment from reality as they increasingly engage in virtual worlds (gaming, Second Life, etc.). The other--perhaps less surprising--trend was the alarming increase in multitasking among children and adolescents. As with most debates on multitasking, this documentary featured the usual baby-boomer technological cheerleaders glibly accusing their nay-saying counterparts of being "old fogies" and technophobic Luddites.

As a Gen-X'er observing from the sidelines, I must say I share the concerns of those "Luddites." I've always had my qualms about multitasking because it encourages cursory attention to information, making it virtually impossible to go deeper and find meaning in the disparate sources of information. Sharing these concerns, author Nicholas Carr fears that the Internet with all its distractions and interruptions may turn us into "scattered and superficial thinkers." He noted the studies of developmental psychologist Patricia Greenfield who concluded that while certain computer tasks may enhance visual-spatial intelligence, this may be at the expense of losing "higher-order cognitive processes," such as "abstract vocabulary, mindfulness, reflection, inductive problem solving, critical thinking, and imagination."

Carr argues that the multitasking Internet environment deprives us of the chance to devote the requisite amount of attention necessary for associating new pieces of information with previously acquired and established knowledge so that we can master complex concepts. According to Carr, with all the Internet's distractions, "our brains are unable to forge the strong and expansive neural connections that give depth and distinctiveness to our thinking. We become mere signal-processing units, quickly shepherding disjointed bits of information into and then out of short-term memory."

Carr goes on to explain how book reading protects against this, saying that "reading a long sequence of pages helps us develop a rare kind of mental discipline" that gives us the "capacity to engage in the quieter, attentive modes of thought that underpin contemplation, reflection and introspection." Technophile Clay Shirky disagrees, countering that reading is "an unnatural act; [for] we are no more evolved to read books than we are to use computers." Indeed, Carr himself argues that our "innate bias is to be distracted. Our [evolutionary] predisposition is to be aware of as much of what's going on around us as possible" in order to reduce the likelihood of being attacked by a predator or overlooking an important food source. Deep reading helps train us to sustain our attention towards one task, which allows time for integration of ideas and big-picture thinking.

But unlike Carr who urges that we continue teaching this discipline of deep, critical reading, Shirky argues that "literate societies become literate by investing extraordinary resources, every year, training children to read. Now it's our turn to figure out what response we need to shape our use of digital tools.... We are now witnessing the rapid stress of older institutions accompanied by the slow and fitful development of cultural alternatives. Just as required education was a response to print, using the Internet well will require new cultural institutions as well, not just new technologies." (Typical sanguine libertarian drivel BTW. Just look at the source of the article: WSJ, where blind faith in markets, technology, and being a lemming are par for the course.)

So is that why people like Shirky (gotta love that name) harbor such an aversion to funding education? Because we need alternatives that educate children in a way that caters to their technological sensibilities? Well, one thing's for sure: Shirky certainly believes we shouldn't fear the future but embrace it. For, as he says, "we are living through [an] explosion of publishing capability today, where digital media link over a billion people into the same network. This linking together in turn lets us trap our cognitive surplus, the trillion hours a year of free time the educated population of the planet has to spend doing things they care about. In the 20th century, the bulk of that time was spent watching television, but our cognitive surplus is so enormous that diverting even a tiny fraction of time from consumption to participation can create enormous positive effects."

Hmm... That "cognitive surplus" idea sounds suspiciously like the Singularity which a recent NYT article defines as "a time, possibly just a couple decades from now, when a superior intelligence will dominate and life will take on an altered form" where "human beings and machines will so effortlessly and elegantly merge that poor health, the ravage of old age and even death itself will all be things of the past." So basically, it's a merging of humans and machines to create a super intelligence, or in Shirky's words, a "cognitive surplus," where people throughout the world can link their minds together through technology in a way that will improve the human condition. This is not as far-fetched as one might think if Carr's warning is correct and our brains are already being reduced to automatic "signal-processing units" that only find meaning in decoding the data itself rather than how the data relates to patterns and theories. Heck, we're already meeting the computers half way if that's the case!

Such automated thinking, however, may be our undoing if Daniel Pink's assertions are true. Pink, the author of A Whole New Mind, gave a recent lecture where he argued that rules-based, routinized work that can be automated (tax accounting, processing uncontested divorce papers, etc.) is already being made obsolete by programs like TurboTax and 123DivorceMe.com, and it will only get worse with time. One way to protect our viability is to tap into our right-brain capabilities and develop qualities like inventiveness, empathy, and meaning in order to produce work that will be difficult to automate. Left-brain thinking is analytical, sequential, and logical. The right brain, by contrast, processes everything at once, works with the context of the data (rather than the data in and of itself), and synthesizes (rather than merely analyzes) the data. He argued that machines replaced our muscles by taking over our manufacturing jobs during the last century. This century, machines will replace our brains--not our empathetic, creative, big-picture-thinking right side, but our rules-based, logical left side.

If those MBTI stats are correct, then our one ray of hope lies in our increasing ability to empathize. But when it comes to addressing the "creative" and "big-picture thinking" parts of the puzzle, we have a lot of work to do. Carr would argue that deep reading could help develop many of those necessary qualities espoused by Pink, including increasing our creativity. But as I said at the outset, Americans aren't reading anymore, so I guess a lot of us will be out of work if Pink's forecasting is correct.

If children are to meet the needs of the 21st century workforce by developing creative minds that can integrate knowledge and engage in big-picture thinking, how are we to educate them? Should reading for deep, critical thinking still be stressed? I believe it should. But should we also transfer this deep-thinking approach from the print environment to the online realm? Most definitely! Maybe this is what Shirky meant when he said "using the Internet well will require new cultural institutions." If so, then what institutions, and how should they address the problem?

As of yet, I've heard no real answers to this conundrum, and this lack of real solutions is where I feel the agenda truly lies. My fear is that there won't be any attempt to address this problem because they WANT us to be shallow, noncritical thinkers. (And this, by the way, is the true reason behind the declining support for education, libraries, and literacy programs.) They want our brains to be reduced to shallow information processing centers, not only so that we won't question authority or possess the mental acumen for independence and self-determination, but also so that we'll--voluntarily or otherwise--meld our brainpower with that of computer technology. Why? Who knows? To fulfill some "Matrix" or Brave New World type of prophecy? Population surveillance? Subjugation? Universal mind control? You name it. But the glamorization of robots and cyborgs is undeniably omnipresent in pop culture from Beyonce and Christina Aguilera to autotune and the Transformers franchise. We're clearly being instructed to embrace this inevitability, and it just might work if the claims in this article and this news segment are any indication.

Interestingly, both WSJ articles (Carr's and Shirky's) contain at least one occurrence of the word "Net" rather than "Internet," which piqued my interest. This abbreviated form is less commonly used than the fuller form, so my increasingly paranoid mind couldn't help but notice and read into it as a deliberate choice of words. Nets entrap their intended victims, and I can't help but feel that the deliberate neglect of education and literacy needs coupled with the concerted universalization of technology in the name of attaining the Singularity is a trap. So I'll just wait it out on the sidelines with my fellow "cave-dwellers" and proceed with caution when it comes to blindly adopting "utopian" technologies. And all those technophiles pointing their fingers and sneering at us Luddites can go ahead and continue their dismissive derision. I'll happily let them be the guinea pigs and join in much later when the coast is indisputably clear because my fear is that nothing but bad will ultimately come of all of this. But believe me when I say I hope more than anything that I'm wrong. Unfortunately, I don't think I am....

Sunday, June 6, 2010

Trust Me, "Enuf" Is Not Enough

Silly protesters. :-) Surely they jest, yes?
No? Oh. Well, let me take this opportunity to school them on how misguided and absurd their efforts are. I mean, don't get me wrong, I've had many a vexation with the English language and its so-called "spelling system." Any language that would have “colonel” spelled and pronounced the way it does is asking for a picket sign or two. I vividly recall sympathizing with Ricky Ricardo's confusion when he read and incorrectly pronounced various words ending in “-ough.” Along with the audience, I too laughed in acknowledgment of how ridiculous and unpredictable English spelling can be. That said, come on people! Did it ever occur to you that protesting a children's spelling contest might seem a bit drastic (not to mention futile)?

Look, I understand the concern that the current system makes reading difficult for many struggling children. But I’ve also witnessed my own spelling skills improve the more I read and write. In particular, the more literary-style pieces I read, the greater my exposure to complex words (and their spelling). And whenever I write a complex composition, my orthographic mistakes reveal how strong or weak my spelling skills are. So a lot of it is just a matter of reading and writing habits. With the exception of a few people with learning disabilities or other impediments, most children will eventually gain proficiency in reading and writing in English, as long as they practice it.

I say keep it complicated and unpredictable because you can tell a lot about an adult's reading habits by the way they write. Dumbing down English spelling conventions would only remove an obvious red flag that could reveal a lot about the way a person spends their leisure time, their level of intellectual curiosity, and their attention to detail. I say don't mess with a built-in warning system when you've got one.

Other people have come up with some great arguments against overhauling the English spelling system, the most important being: which spelling convention would we use? One that addresses the American pronunciation (which, for instance, vocalizes the terminal "r")? The British one (which omits the terminal "r")? Some other English-speaking country? The two groups protesting the spelling bee come from the U.S. and the UK. Did they even think this through before charging towards D.C. with their construction paper and magic markers?

Another (perhaps less convincing) argument is the rich history embedded within English spelling that would be lost if everything were normalized. My favorite "colonel" is a good example, reflecting two different sources of influence (one Italian, the other French) which created two different pronunciations and two different spellings. For whatever reason we've adopted the mix-n-match spelling and pronunciation that we know and love (to hate) today. While I wouldn't necessarily use this as justification to retain the spelling convention for this word, other differences (e.g. "great" vs. "grate"), while not always predictable, may reveal a bit about the evolution of the English language and the regional history of its speakers.

Still not convinced? Fine. Let me proceed with the following two arguments:

Exhibit A

Let's pretend the spelling conventions have been normalized and all spelling is phonetic and predictable. Consider the following sentence:

Thayr wundering if thayr kernol ate ate kernols uv korn over thayr.

Okay wait. I'm sorry. There's just TOO much fugly going on in that little bit of sentence right there. Why don't we forget the phonetics for now and just focus on predictability and consistency? Let's consider the following:

There wondering if there kernel ate ate kernels of corn over there.

Now with effort and logic ["there" as a verb phrase vs. possessive pronoun vs. preposition; "ate" as a verb vs. number; people eat food, not the other way around (usually)] one could deduce the meaning of this sentence. But how much more quickly would they have understood it if they had simply read:

They're wondering if their colonel ate eight kernels of corn over there.

If we changed the spelling conventions just for the sake of consistency, we'd have a lot of similarly spelled homonyms to contend with, some of which occupy the same parts of speech. ("Do we have time/thyme?" "I need/knead dough for a living." "Those are some phat beats/fat beets!") It's true that context could provide the necessary clarification for some of these (admittedly corny) sentences. (After all, we seem to have little difficulty understanding them when heard in verbal dialogue.) But with inferential reading skills on the decline, even among college graduates, I wouldn't push the need to rely on context too heavily.

What we'd probably need to do is eliminate many of these words in favor of their more unique sounding synonyms (or make up new words if no synonyms exist). Do we really want to do this? If these people can't be bothered to learn (or teach) the current spelling conventions, do they honestly think they'll be willing and able to adopt not only a new spelling system but also a new vocabulary? Be careful what you wish for. It sounds to me like a lot more work than people like this would care to deal with.

Exhibit B

I read the urban (read: "black") blogs frequently and I'm beginning to see a correlation between people's value judgments and their spelling (as well as grammatical) errors. Case in point: there was a recent blog discussion about Ciara's "Ride" video, which got banned from BET for being too explicit. Many of the commenters thought BET was justified in pulling the video, while others thought the network was being hypocritical considering the garbage it usually airs. Most of the comments had passable-to-good spelling and grammar, but a few fell short. The following was the worst of them all:

ITZ RLLY MESSED UP BECAUSE ITZ NOT DAT ABD AT ALL. SHE DID HER THING AND SHE LOOKS FANTASTIC!!!! SMH HATERZ.....

Basically, this person is complaining about other people's gripes with yet another unnecessary display of brazen sexuality that would further contribute to the already heavily entrenched media-influenced hypersexual behavior of young black people today. How did this person express this point? By yelling in all caps [which obviates the need to observe capitalization rules and press the SHIFT key every once in a while (suggesting laziness and/or impatience)]; using the word "DAT" (because it's on an "urban" blog?); employing creative spelling conventions with the substitution of "Z" for "S" (just because?); and throwing in a bonus "HATERZ" (just for good measure). Believe it or not, I peruse the comments sections of these blogs for the unique perspectives, the insightful dialogue, and the witty repartee I often find therein. This pointless comment is not what I come for.

Another case in point comes from a different "urban" blog where a post mentioned that someone by the name of "Birdman" (yeah, I never heard of him either) bought Lil' Wayne (a.k.a. Weezy) a $1,000,000 watch and a $200,000 diamond-encrusted cake for his birthday. Many of the commenters felt this was excessive and wasteful. Take a wild guess at what the worst "writer" had to say. (Read it and weep.)

aa wat happen 2 sum people the man wanted 2 buy sum ting 4 his son wats wrong wit dat uh.birman u write 2 buy the million dollar wat 4 weezy.

*Shuddering violently here* At least the other comment didn't take as much effort to understand. With this one here, I'm frankly surprised at how many words were spelled correctly, given the person's blatant misspelling and conflation of the words "watch" and "what" and their confusing use of "write" in lieu of "right." More importantly, what this person has in common with the "Ciara stan" is something I see all too often on these blogs: the most gratuitous spelling, grammatical, and capitalization errors come from the loudest, most vehement defenders of gratuitous celebrity behavior, whether it's vulgar sexuality, shameless brandishing of wealth, or some other instance of an eroding value system being shoved down the black community's throat by the media. It's gotten to the point where absolving (if not outright praising) degenerate behavior seems to go hand in hand with bad writing. I can pretty much predict the level of insight (and scruples) that can be found in a blog's comments section based solely on the spelling errors alone. If more than 10% of the comments have those errors, I don't bother with the blog. (Looking at you, MTO!)

It's because of these experiences that I harbor a healthy dose of skepticism towards these spelling bee protesters. Contrary to what they proclaim, "Enuf" is not enough. (I mean, really! At least add another "f" to the ending. It may not be necessary phonetically, but leaving it at just one "f" would be a crime against humanity--not to mention humanity's eyes!) People with the most egregious spelling errors often defend the most debauched and unproductive behavior and use the angriest and most simple-minded arguments to express themselves. Catering to this crowd would only hasten our spiral downwards towards a society of increasingly impulsive, aggressive, and depraved simpletons who never question anything and shout down anyone else who does. Maybe I'm positing more negative implications than would actually transpire if we made these spelling changes, but do we really want to take that risk? Trust me, for more reasons than we dare fathom, "enough" is definitely enough. So leave the Bee be!