Saturday, October 31, 2015

Uncle Sam wants you to be obese!

The other day, I went with the friend to eat at Chipotle.  Hey, the term is underway and I do not have the time anymore to cook every meal at home, and the friend is getting tired of cooking and baking for me ;)  Sanitas per escam is, thus, sometimes put to the test, not because I am worried that the restaurant staff might not have washed their hands, but because restaurant foods are not really intended to be healthy.

"Just because of their marketing gimmick about GMOs, do not be fooled into thinking that this is healthy food," I warned the friend.  I tell ya, I know how to be a party-pooper! ;)  But then I am not merely opining, especially when I read pieces like this in the New Yorker:
Chipotle, with nineteen hundred locations, is the most successful of the new chains. The company has admirable goals, but, more often than not, meals from Chipotle are high in both saturated fat and calories, and often exceed the recommended daily allowance of sodium.
Calories, fat, and sodium.  No wonder that restaurant foods are way tastier.  As my grandmother used to say, restaurants want to serve you tasty foods, while mothers want to serve you healthy food.  Of course, in my case I lucked out with my mother, grandmothers, and aunts, all making healthy food that was also wonderfully tasty.  When I spoke with my parents yesterday, my father said that mother had just finished making பொரிச்ச குழம்பு (Porichcha Kuzhambu) and I started drooling here ... hmmm, where was I?

Ah, yes, even a Chipotle often serves foods high in fat and calories, and one meal can deliver more salt than your body ever needs in a day.
“Chipotle has a health halo”—a term Wansink and a colleague coined several years ago to describe the general aura of eating at places that advertise themselves as healthy. “They are organic and use the word sustainable a lot. That’s not a bad thing, but it doesn’t make the food healthier.”
I know what you are thinking: just who the hell does Wansink think he is?

Brian Wansink is the director of Cornell University’s Food and Brand Laboratory, and more:
Wansink has also served as the executive director of the U.S.D.A.’s Center for Nutrition Policy and Promotion. (In addition, he is a member of McDonald’s Global Advisory Council, which makes nutritional recommendations to the company’s leadership.) His book “Mindless Eating” demonstrated the gulf between what we think we are eating and what we actually eat.
So, if Chipotle's food is not all healthy, then you can imagine the calories, fat, sodium, and sugar at the likes of McDonald's.  But, and this is a big butt--er, but--the food at those fast-food places is way less expensive too.  What gives?
[Taxpayers] heavily subsidize corn and soybeans, two crops that feed livestock and help create the processed food that public health officials have warned us for years to stop eating. Few federal incentives exist for farmers to grow a more varied selection of vegetables or to motivate consumers to eat them. And at least half of our calories come from food that is subsidized by the government, a figure that has held steady for years.
“A value meal at these places is a big burger, some fries, and a sugary beverage,’’ Kelly Brownell said. “Every time you buy one of those meals, Uncle Sam is standing there with his wallet open. The grain that feeds the cows is subsidized. The oil used to cook the fries is subsidized, and the high-fructose corn syrup used in sodas is subsidized, too. But if you walk in the next day and order a salad, a piece of fruit, and some tea you will be on your own. Uncle Sam will not be there to help you.”
We have created a system in which foods that are high in calories, fat, sodium, and sugar are less expensive than the healthier options.  Should we then be shocked at the ever-growing obesity rate in these United States?
fatty, salty meals remain far cheaper to produce, distribute, and buy than healthier alternatives. For that to change, America’s agricultural priorities will need to fall in line with its health priorities.
Don't you hold your breath and suck in your tummy waiting for those policy changes to happen.  Definitely not with this dysfunctional Congress.  


Friday, October 30, 2015

May we please stop pushing college degrees for all?

Consider the following, which are the concluding sentences from James Surowiecki's "Financial Page" column in the New Yorker:
We should also rethink our assumption that college is always the right answer, regardless of cost. Politicians love to invoke education as the solution to our economic ills. But they’re often papering over the fact that our economy just isn’t creating enough good jobs for ordinary Americans. The notion that college will transform your job prospects is, in many cases, an illusion, and for a while for-profit schools turned it into a very lucrative one.
Now, read the following sentences from the concluding paragraph in Rebecca Schuman's column at Slate:
The university classroom benefits from a diversity of backgrounds, including first-generation students who absolutely should get any support they need with college readiness. Every American who wants to attend college should be able to (and for a lot less money than they’re currently laying out). But those who don’t want to shouldn’t feel like they have to.
Here is the problem: Surowiecki and Schuman are not higher education "insiders."  Yes, Schuman is an adjunct faculty member after earning her doctorate.  After being on the doctoral path at Yale, Surowiecki became a journalist/writer.  For all practical purposes, both are "outsiders" looking in at higher education when they conclude that the push for college degrees for all is not worth the talk, let alone the money that we invest.

It is by now an open secret that we are unnecessarily pushing college degrees down the throats of young, and older, adults.  All we have managed to achieve through this is an enormous level of credential inflation.  Which is what Schuman notes as well:
the solution is to stop requiring a bachelor’s degree to be an office assistant, or a paralegal, or any number of professions that up until recently could be staffed—successfully—by the holder of an associate’s degree or high-school diploma.
My first op-ed along these lines was published about two decades ago.  Yep, back in California, not too long after earning my doctorate, I argued in that opinion piece that we were committing a double crime of pushing college degrees and undermining vocational education.  Nobody cared then because I was an outsider--I was not in academia at that time.  (Well, ok, there was one wonderful benefit from that--a Mike contacted me after reading the column, which later led to dinners as well.)

As an academic, of course, I have written quite a few op-eds by now criticizing the overselling of college degrees and the under-investment in vocational education.  The only time that a fellow academic responded to it, well, it was pretty much a personal attack.  Otherwise, the higher education professionals remain as silent as ever.  Why?  It should not surprise you by now--this college degree issue is merely one of the many examples of why I loved the George Bernard Shaw quote as the title for this blog itself: every profession is a conspiracy against the laity.



(BTW, keep in mind that the criticism is against the push for college degrees; having a degree is not the same as having been educated, wherein lies a huge part of the problem.)


Thursday, October 29, 2015

Why does science fail in the popularity contest?

A few years ago, when visiting with the parents, my mother complained that science one day reports that coffee is bad for us and then changes its mind another day.  I told her that science is a work in progress and that as we know more we might have to revise what we previously knew.  She seemed disappointed and unconvinced.

What disappointed my mother is precisely what always excites me about science.  There is a refreshing feeling that we don't know and that we are trying our best to find out.  What we know now is incredibly more than what people knew only a few generations ago.  And there is a good chance that quite a bit of what we now know will be overthrown in a couple of generations.  It is awesome.

But, we humans like a clear story, a story that does not keep changing.  We like narratives that provide us with a sense of certainty.  Religions offer that comforting feeling of certainty.  "Where did we humans come from?  God created us" is a simpler narrative with a remarkable sense of certainty.  To answer that question with "we evolved from apes, and in fact we could even be a separate category of apes.  We are still searching for the missing links in the evolution.  But then these are recent stories.  We have no idea where this universe came from" provides no certainty at all.

Writing about this incremental progress that science offers, David Barash, who is "an evolutionary biologist and Professor of Psychology and Biology at the University of Washington" notes:
The capacity for self-correction is the source of science’s immense strength, but the public is unnerved by the fact that scientific wisdom isn’t immutable. Scientific knowledge changes with great speed and frequency – as it should – yet public opinion drags with reluctance to be modified once established. And the rapid ebb and flow of scientific ‘wisdom’ has left many people feeling jerked around, confused, and increasingly resistant to science itself.
Exactly.  That's what my mother's question/comment was all about.
Perhaps we mourn the loss of certainty, of the sort that most religions offer to their followers. Perhaps it’s more a search for authority, of the sort once provided by our parents. Or a universal yearning for any reliable port – even if conceptual rather than maritime – in the storms of life’s unknowns. Whatever the underlying cause, people have difficulty accepting the unstable, shifting, impermanent reality of how the world is put together. And this difficulty, in turn, renders us uncomfortable with precisely the only stability and certainty that science offers: that paradigms come and go.
Paradigms come, and paradigms go.  That is how science works.
The loss of paradigms might be painful, but it is testimony to the vibrancy of science, and to the regular, unstoppable enhancement of human understanding as we approach an increasingly accurate grasp of how our world works.
I often mention in my classes this increasingly accurate understanding, which is why I am confident that a couple of generations from now, humans will have that much better an understanding of this universe than we now have--unfortunately, I won't be around at that time! ;)
Denied the comforting blanket of illusory permanence and absolute truth, we have the opportunity and obligation to do something extraordinary: to see the world as it is, and to understand and appreciate that our images will keep changing, not because they are fundamentally flawed, but because we keep providing ourselves with better lenses. Our reality hasn’t become unstable; it’s just that our understanding of reality is of necessity a work in progress.
A work in progress, unlike the well-defined and unchanging holy word of god that the faithful believe in.
Science is a process, which, unlike ideology, is distinguished by intellectual flexibility, by a graceful, grateful (albeit sometimes grudging) acceptance of the need to change our minds, as our understanding of the world evolves. Most people aren’t revolutionaries, scientific or otherwise. But anyone aspiring to be well-informed needs to understand not only the most important scientific findings, but also their provisional nature, and the need to avoid hardening of the categories: to know when it is time to lose an existing paradigm and replace it with a new one. What is more, they need to see this transition as progress rather than a sign of weakness, which is more difficult than one might think. 
I suppose we have quite a bit of work ahead of us then.


Wednesday, October 28, 2015

Where have you gone, Sandy Koufax?

There is an appropriate place in life for sports.  In life, that is.  I emphasize life because that is, after all, an all-consuming quest of mine--what is this life all about, and what does my existence mean?  This search for meaning then leads me to question the primacy that society grants to irrelevant aspects like, well, sports.

Today's exhibit:
News of Father’s Death Is Withheld From Royals Starter Edinson Volquez
Re-read that.

A man dies.  The news about his death is held back from his son, only because he is the starting pitcher in a World Series game.

No, it was not the team that put a cone of silence all around.  It was the pitcher's family:
According to the Royals, Roandy Volquez, the pitcher’s wife, told General Manager Dayton Moore what had happened and asked him not to tell her husband until after he had finished pitching the biggest game of his career, his first World Series start. The team then asked the broadcasters on Fox not to announce the news, because Volquez routinely goes into the clubhouse between innings, and the broadcasts of the game are usually on.
Let's see.  The wife and the family told the team not to let the pitcher know that his father had died.  And then the broadcasters were asked to keep their mouths shut as well.  Seriously?  Hello, the father is dead!

But, we live in a world where news is hard to contain, right?
Shortly after that, news reports began to circulate that Volquez’s father had died, and even as Volquez went to the mound unaware of what had happened, many people around him knew.
Many people around him knew that his father was dead.  The son was the last to know.  Seriously?  Hello, the father is dead!

It was not as if the father was a mean guy either; three years ago, the son said this:
“It was good for me because my mom and dad always took care of me,” he said in the article. “I started playing baseball when I was 9 or 10, and they took care of me. It was easy for me.”
Of course, it was the wife, the family, who made the decision not to inform Volquez.  But, I can't help thinking that it is so much a reflection of the contemporary values in society.  Yet another example of our messed up priorities.

This story is such a contrast to one of my favorites from the rich lore of baseball.  Another pitcher, another World Series, and a different call on priorities.  Fifty years ago, Sandy Koufax refused to pitch on the first game of the World Series because it was Yom Kippur.  
Koufax, who wasn't particularly observant, had no clue that his decision would carry so much weight—then or now.
"I believe he was thinking, 'I'm going to pitch the next day. What's the big deal? We have [star pitcher] Don Drysdale starting'," Leavy said in a Q and A with Sports Illustrated in 2002. "And, in a way, that makes it even sweeter. Yom Kippur is a day of sacrifice. .... And here's Koufax, who's doing this reflexively not out of his own great belief, but really more in deference to others. So it was a much greater sacrifice on his part. For a more religious man it might have been a no-brainer. For Koufax, it was the right thing to do."
And in doing the right thing, Koufax inspired a generation of Jewish players that came after him.
That was class.  That was demonstrating what was more important in life.

Perhaps far too many people in this world simply do not care about understanding life and its priorities.  They then spend a great deal of time and money on sports.   How messed up is this approach?  Here is another exhibit:
Five former University of Louisville basketball players and recruits told Outside the Lines that they attended parties at a campus dorm from 2010 to 2014 that included strippers paid for by the team's former graduate assistant coach, Andre McGee.
One of the former players said he had sex with a dancer after McGee paid her. Each of the players and recruits attended different parties at Billy Minardi Hall, where dancers, many of whom stripped naked, were present. Three of the five players said they attended parties as recruits and also when they played for Louisville.
Said one of the recruits, who ultimately signed to play elsewhere: "I knew they weren't college girls. It was crazy. It was like I was in a strip club."
It is one messed up life that we lead!


Tuesday, October 27, 2015

Only for the articles, ma'am. For real!

Last December, when I was in the old country, a middle-aged cousin in the extended family commented that his cohort of parents find the internet to be terrible for how it has made porn so easily accessible to their high school- and college-age children.  "Most of the time these kids go to the internet cafes only for this," he worried.

Life was easier when I was a kid, and then a college student.  There was no internet and no easy access to porn.  In the world that I inhabited, there were no porn magazines either, which is why some of the porn-reading stories that classmates joked about during the reunion surprised me.  But, their stories about porn were tamer than the sex and nudity of a typical "R" rated movie these days.

Porn is seemingly everywhere.  One has to be careful where one clicks when visiting webpages, or when typing a URL.  Run-of-the-mill porn is free and unlimited, while one has to pay to read the NY Times after the freebie offerings!

This has had one good result though--Playboy is in financial trouble:
From a peak of 7.2m copies in November 1972, its circulation has shrunk to a mere 800,000 today.
Ah, yes, the "shrinkage" is for real ;)

So, Playboy is "reinventing the brand by banning full nudity from the Playboy website."
On October 13th Playboy Enterprises said that from March next year Playboy will not publish full nudity any more, though it will continue to show “sexy, seductive pictorials of the world’s most beautiful women”. It will also continue to choose a “Playmate of the Month” and hire a “sex-positive female” as a sex columnist. And it says it will go on publishing long-form journalism, interviews and fiction.
“The quality of the content was always overshadowed by the nude pictures,” says Americus Reed at the Wharton School of the University of Pennsylvania.
What does the living American icon of feminism, Gloria Steinem, think about this?
it’s as if the NRA said we’re no longer selling handguns because now assault weapons are so available.
A good line ;)

So, about the abundance of porn that parents all over the world are worried about.
The more traditional critiques say that pornography is inherently degrading to women – or whoever happens to be the object of sexual activity – and fosters unrealistic expectations of sex. It decreases the quality of real relationships and the self-image of those involved – and increases negative sexual attitudes and actions. Porn-users compare real humans to the fantastical images, and either come out unimpressed and reluctant to have real sex, or, at worst, demanding the types of behaviours they see on screen, regardless of their desirability to their partner. One poll from the US Pew Research Center in 2007 quantified the feeling, finding that 70 per cent of Americans said pornography is harmful.
Do any of these criticisms hold water?
A good question to ask, right?

After reviewing a bunch of research, the conclusion is:
The negative behaviours we blame on pornography, in other words, might have emerged no matter what; porn is perhaps more symptom than cause.
Porn is, to use my favorite Hitchcock phrase, a MacGuffin!
The way to change that – and to change the negative effects such a misperception can have – isn’t to restrict or ban pornography. It’s to bring the discussion of sexual pleasure to the foreground, especially in sex-ed. ‘We need to supplement pornography with non-porn sexual education, so that porn becomes fantasy sex rather than a real-world template,’ Zhana Vrangalova, a psychologist at NYU who specialises in sexuality, told me. ‘We need to give people permission to enjoy sex. Until we do that, they will go to porn. Because you can’t kill curiosity.’
We cannot kill curiosity, which leads the young to the forbidden places in the old country.

Monday, October 26, 2015

Are you, too, waiting for good news from Afghanistan?

After all the traveling to and in California, it was great to be back home and sleep in my own bed.  I care not if the mattress and the pillows at the hotel were of higher quality than what I have, but the older I get the better the sleep is in my own home and on my bed.  East or west, home is best, indeed.

I went to sleep all pumped up about the classes--have I ever told you how much I love doing what I do!  As if my regular classes aren't exciting enough for me to head to campus, I had an additional incentive.  A wonderful student had emailed me.  Students emailing is not any man-bites-dog story, of course.  But, this email had something special: "My mother would like to sit in on Monday's class; is that all right?"  What an awesome idea!  And what a coincidence that just yesterday I had re-posted the piece from three years ago in which I had written:
I would love it if taxpayers routinely observed our classrooms and judged for themselves whether their hard-earned monies are being put to good use. 
This Monday morning, I woke up, as I always do, with NPR's news and commentaries.  It was about an earthquake in Afghanistan.  Can that country ever catch a break?  The later reports suggest that it is no minor shake:
A massive earthquake rocked northeastern Afghanistan on Monday with devastating tremors rippling across the region, leaving more than 215 dead amid collapsed buildings, panicked stampedes and fears of landslides. Officials braced for even more casualties.
The full extent of damage and human toll was not immediately clear as rescue teams tried to assess areas hardest hit by the quake, which had a preliminary magnitude of 7.5 and was centered in a remote area of the Hindu Kush mountains.
Even a person like me, who has been obsessed with the world's problems ever since I was a kid, simply needed some good news about this country that has been cursed by the cosmos, it seems.  I scanned for news about Afghanistan.  There is some good news.  You ready?  You sitting down?  Nothing breakable in your hand that could fall and shatter to pieces?  Get this:
Afghanistan have clinched a historic one-day international series over Zimbabwe after a 73-run victory
In case you don't understand that sentence, that is about the game of cricket.  Afghanistan beat Zimbabwe in a cricket match series.  That is a huge piece of good news from that country.  Why?  Let me explain with what I wrote in one of my newspaper columns six years ago about Pakistan:
The unrest and violence in Afghanistan is intricately linked to Pakistan’s. So, is there any simple metric that we could employ in order to understand whether things are getting better or worse in Pakistan, such that it can then feed into the decision-making process regarding Afghanistan?  Yes, there is: All we need to do is keep track of the game of cricket in Pakistan. As simple as that!
What is good for the goose is good for Afghanistan, too.  Well, ok, the games were not played in Afghanistan but in Zimbabwe.  But, here is the best part: to use American sports language, Afghanistan's cricket team is a minor league team and they beat the crap out of a powerful major league team.
The national team was only formed in 2001 and while they are yet to shock one of the eight leading international sides, a series win over an International Cricket Council full member represents a major landmark.
Unable to play at home in war-torn Afghanistan, they were granted ODI status in 2011. 
Isn't that a little ray of sunshine to make you feel good about Afghanistan?

Of course, this won't be of comfort to the hundreds who died from the quake, and the thousands who have to deal with the destruction of property.  We can help--by contributing to relief efforts, and by also wishing the country and its people well.

Sunday, October 25, 2015

This day ... three years ago (3)

(Re-posting from 2012)

No noise in the classroom, please. Students are asleep!
As enrollment has increased in the college where I teach, there has been an increase in the number of students who go through some of the classes that I teach.  The opportunity to help more students understand and interpret the world certainly pleases me.

However, along with the larger numbers, I notice a much sharper increase in the percentage of students who don't show up even though they are registered for the class.  Sometimes they are no-shows during the entire term.

Yesterday, at class time, only 15 out of the 36 were in attendance.  A couple more came in late, but we didn't even reach the 50%-plus-one that would be required for a quorum!

It worsened after the break--we were down to twelve.  Only a third of the class.

Yes, there was absenteeism even in the old days.  Ten years ago, when I started teaching here, of course, students did skip class.  I have never recorded attendance because, as I tell them, they are adults and I do not have to babysit anybody.  But, I don't need any hard data to understand that the trend has been one of decreasing attendance.

It sank to a new low yesterday.

One reason could very well be that I am a crappy teacher, and such teachers might not be making their classes exciting enough for students, who need to be "edutained."  But, as I joke with students, if they think I am crappy now, well, I was incredibly crappier years ago.  (One term, a student, who took my self-deprecating humor seriously, told me in all seriousness "if you think you are bad, Dr. Khé, you really don't know what bad teaching means."  That was one awesome compliment I received that term!)

Thus, absenteeism has been on a steady increase even as my teaching has gotten better; how about that!

My hypothesis to explain the rapid increase is simple: we now have a lot more students than before who are simply not interested in higher education.  Yet, they are here because of the societal contexts that force them to be in classrooms.

Is it worth all this trouble?

When 24 students decide to skip class, it is a huge waste of precious dollars too.  At the approximately sixty dollars it costs each student for each class meeting, that was $1,440 flushed down the toilet in a little more than an hour.  Mine is not the only class where we witness absenteeism.  Now think about all the other classes that are offered at this university alone.  And then all the classes across all the colleges and universities.

I do not mean to suggest that coming to class is always better than not coming to class.  We have to count the number of students who do come to class and promptly fall asleep.  Yes, asleep.  A few years ago, a colleague, who since left for greener pastures, described to me what he did when he saw a student fast asleep with his face down on the desk.  The colleague walked up to the classroom door, and banged it shut.  The sudden loud noise jolted the student from his sleep and the rest of the class apparently had quiet smiles on their faces.

I don't do anything like what that former colleague did.  I might ask students whether they think it will be ok with their boss if they didn't show up for work, or if they slept on the job.  That is the extent to which I go about reminding them that their habits are not healthy.  I don't have to, nor do I want to, babysit adults.

Do students ever ask themselves whether all this fun and sleep is worth the $25,000 debt that they graduate with, on average?

Should we voters ask ourselves whether we should worry at all about the student debt accumulated via such a process?

Given that there is also public money involved in universities like the one where I teach, I am sure taxpayers are not going to be thrilled to know that they are subsidizing students who choose to skip classes, or treat class time as nap time.

I would love it if taxpayers routinely observed our classrooms and judged for themselves whether their hard-earned monies are being put to good use.  I am confident that if they did, well, that will be the end of even the little bit of funding that we currently get for higher education.  I, for one, would not blame taxpayers if they chose to do that.

After that, the taxpayers should also sit in on faculty meetings.  I suppose I would be blamed if it results in them jumping off the nearest cliff.  I bet quite a few administrators will gladly push them over too!

Welcome to the university, and have a nice day!

Saturday, October 24, 2015

This day ... three years ago (2)

(Re-posting from 2012)

It is ok for Obama to kill people because he is not Bush?

This graphic (ht) says a lot, right, about how we have escalated the killing campaigns?:


As if the old stories on the kill list (have my colleagues nominated me to be on the list?) were not enough to make us worry about due process and the constitution, the Washington Post reports:
Over the past two years, the Obama administration has been secretly developing a new blueprint for pursuing terrorists, a next-generation targeting list called the “disposition matrix.”
The matrix contains the names of terrorism suspects arrayed against an accounting of the resources being marshaled to track them down, including sealed indictments and clandestine operations. U.S. officials said the database is designed to go beyond existing kill lists, mapping plans for the “disposition” of suspects beyond the reach of American drones.
WTF!

I thought "kill list" was enough of a new and unwanted phrase in our dictionaries.  "Disposition matrix"???

As I expected, Glenn Greenwald is furious about this, and he is way smarter and better informed than I can ever be:
The Post's Miller recognizes the watershed moment this represents: "The creation of the matrix and the institutionalization of kill/capture lists reflect a shift that is as psychological as it is strategic." As he explains, extra-judicial assassination was once deemed so extremist that very extensive deliberations were required before Bill Clinton could target even Osama bin Laden for death by lobbing cruise missiles in East Africa. But:
Targeted killing is now so routine that the Obama administration has spent much of the past year codifying and streamlining the processes that sustain it.
To understand the Obama legacy, please re-read that sentence. As Murtaza Hussain put it when reacting to the Post story: "The US agonized over the targeted killing Bin Laden at Tarnak Farms in 1998; now it kills people it barely suspects of anything on a regular basis."
Oh my freaking lord!

To hell with Obama and Romney!

More Greenwald:
It is literally impossible to imagine a more violent repudiation of the basic blueprint of the republic than the development of a secretive, totally unaccountable executive branch agency that simultaneously collects information about all citizens and then applies a "disposition matrix" to determine what punishment should be meted out. This is classic political dystopia brought to reality (despite how compelled such a conclusion is by these indisputable facts, many Americans will view such a claim as an exaggeration, paranoia, or worse because of this psychological dynamic I described here which leads many good passive westerners to believe that true oppression, by definition, is something that happens only elsewhere).
Need I remind you that Obama was awarded the Nobel Peace Prize?  What a wonderful living illustration of George Orwell's 1984 scenario of "war is peace" :(


Friday, October 23, 2015

This day ... three years ago (1)

(Re-posting from 2012)
Jon Stewart had a funny comment a couple of days ago (I think it was in the segment immediately before this one) when he did a satirical "USA!, USA!" When the audience joined in the chant, Stewart laughingly noted that he had become like a Mussolini.

There is a lot more to that quip: democracy doesn't work well, perhaps doesn't work at all, if all we do is merely chant a few slogans, repeat what the great leaders tell us, and not think for ourselves.  Martha Nussbaum wrote about this in the context of higher education and citizenship:
The first is the capacity of Socratic self-criticism and critical thought about one's own traditions.  As Socrates argued, democracy needs citizens who can think for themselves rather than deferring to authority, and who can reason together about their choices rather than simply trading claims and counterclaims.
Nussbaum then adds:
We will have a chance at an adequate dialogue across political boundaries if young citizens know how to engage in dialogue and deliberation in the first place.
One can easily see, then, that societies that do not want democracy will perhaps even make sure that their approaches to education do not encourage students to think.  Aha, you say, as you think about the old Soviet Union or even contemporary China.  But, here in the US, commentators routinely cite the "success" of students in China and any number of other countries where education serves completely different goals--goals that have very little to do with democracy and with what Nussbaum calls "human development," because:
The student's freedom of mind is dangerous, if what is wanted is a group of technically trained docile technicians to carry out the plans of elites who are aiming at foreign investment and technological development.
What a tragic irony then that we want to change our education systems in order to compete with the likes of China!  We want to institute national testing.  We want to focus on how education will immediately translate to jobs and economic growth. And, therefore, the "wasteful" expenditures on the arts, and the humanities, and the social sciences, ought to be diverted to more "useful" ones!

Doesn't work to improve and build on the dream of democracy, does it? 

Overlay this situation with slick marketing by politicians.  Well, I will leave it to Calvin to remind us that ignorance is the ultimate expression of patriotism :)

Thursday, October 22, 2015

This day ... two years ago (2)

(Re-posting from 2013)
In what seems like centuries ago, though it has been less than three decades, I came to USC for graduate studies.  In those days, work rules for us "aliens" were a lot less restrictive than they are now.  Thus, in order to supplement my meager graduate assistantship, I applied for a student worker position with the university's computing services.

My "home" at USC--the VKC and WPH buildings

The first day on the job, my supervisor--I think his name was Mike, who a year later had a horrible motorcycle accident that affected his motor and mental skills--took me around the facilities.  He led me to the inner sanctum and said something like, "here is our DARPA center."  He then explained that DARPA stood for Defense Advanced Research Projects Agency, and that USC was one of the very few universities and agencies connected to its network.

That connection, network, was, of course, the internet.

Those were the primitive days of the internet.  It was well before the days of the graphical user interface of Windows, when knowing the DOS commands was enough to impress a few.  A couple of Unix commands made quite a few swoon with admiration.  Within this primitive internet were user groups, through which I came to know about Zia's death in Pakistan--within minutes after the plane went down.  I loved reading the endless number of jokes in those groups. I easily adopted the internet.  I downloaded files from FTP sites.  I graduated.  It was still the pre-www world.

And then came the web and Mosaic and Netscape and AOL and Internet Explorer.  The world changed in a hurry.  I was now at the mercy of AOL and the slow dialup modem.  Then the faster dialup. And then DSL broadband.  It is one heck of a different world now.

Even the best minds had a difficult time figuring out what all those meant.  As this essay in the New York Review of Books notes, even the mighty Paul Krugman was dead wrong when he wrote:
“The growth of the Internet will slow drastically [as it] becomes apparent [that] most people have nothing to say to each other,” the economist Paul Krugman wrote in 1998. “By 2005 or so, it will become clear that the Internet’s impact on the economy has been no greater than the fax machine’s…. Ten years from now the phrase information economy will sound silly.”
It is not Krugman's fault that he got it dead wrong.  It is a measure of how rapidly things have changed.  The future has never been this difficult to predict when even next year could be dramatically different.

The internet has created quite a few monsters along the way.  The ease with which data can be collected about you and me and the seven billion others is that Faustian Bargain that we didn't quite imagine thirty years ago. Even twenty years ago.
[Not] obvious was how the Web would evolve, though its open architecture virtually assured that it would. The original Web, the Web of static homepages, documents laden with “hot links,” and electronic storefronts, segued into Web 2.0, which, by providing the means for people without technical knowledge to easily share information, recast the Internet as a global social forum with sites like Facebook, Twitter, FourSquare, and Instagram.
Once that happened, people began to make aspects of their private lives public, letting others know, for example, when they were shopping at H+M and dining at Olive Garden, letting others know what they thought of the selection at that particular branch of H+M and the waitstaff at that Olive Garden, then modeling their new jeans for all to see and sharing pictures of their antipasti and lobster ravioli—to say nothing of sharing pictures of their girlfriends, babies, and drunken classmates, or chronicling life as a high-paid escort, or worrying about skin lesions or seeking a cure for insomnia or rating professors, and on and on.
I certainly did not imagine this when I got to the internet 26 years ago.  The life we now live would have been science fiction to me then. 
you are not only what you eat, you are what you are thinking about eating, and where you’ve eaten, and what you think about what you ate, and who you ate it with, and what you did after dinner and before dinner and if you’ll go back to that restaurant or use that recipe again and if you are dieting and considering buying a Wi-Fi bathroom scale or getting bariatric surgery—and you are all these things not only to yourself but to any number of other people, including neighbors, colleagues, friends, marketers, and National Security Agency contractors, to name just a few.
When phrased thus, yes, it certainly would have been nothing but science fiction back then.  Back then, as in a mere 26 years ago.  How crazy is that!  How scary is that!
How all this sharing adds up, in dollars, is incalculable because the social Web is very much alive, and we keep supplying more and more personal information and each bit compounds the others.
Not only are we supplying the data; others are providing data about us as well.  It is important to keep in mind that:
Data—especially personal data of the kind shared on Facebook and the kind sold by the state of Florida, harvested from its Department of Motor Vehicles records, and the kind generated by online retailers and credit card companies—is sometimes referred to as “the new oil,” not because its value derives from extraction, which it does, but because it promises to be both lucrative and economically transformative.
In a report issued in 2011, the World Economic Forum called for personal data to be considered “a new asset class,” declaring that it is “a new type of raw material that’s on par with capital and labour.” 
So, how much is this new raw material worth?  You see that reflected, for instance, in the market valuation of Facebook at more than 80 billion dollars--from the more than 800 million users there.  As the quantity and quality of this raw material increases, the value of Facebook will also increase--ironically, we the people make the company worth that much by providing the data voluntarily!  Facebook is merely one example.  Google, Amazon, the NSA, ...
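The per-user value implied by those figures is easy to work out; here is a rough back-of-envelope sketch (the numbers are the approximations from the post, not precise market data):

```python
# Back-of-envelope calculation of the "value" each user's data implies,
# using the post's rough figures: ~$80 billion valuation, ~800 million users.
valuation_usd = 80e9   # approximate market valuation of Facebook
users = 800e6          # approximate number of users at the time

value_per_user = valuation_usd / users
print(f"Implied value per user: ${value_per_user:,.0f}")  # ~ $100
```

In other words, the market was pricing each of us, the voluntary suppliers of the raw material, at roughly a hundred dollars apiece.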
while we were having fun, we happily and willingly helped to create the greatest surveillance system ever imagined, a web whose strings give governments and businesses countless threads to pull, which makes us…puppets. The free flow of information over the Internet (except in places where that flow is blocked), which serves us well, may serve others better. Whether this distinction turns out to matter may be the one piece of information the Internet cannot deliver.
Not really what I imagined the world would be towards the end of 2013 back in 1987 when I was taken around the computing facilities at USC.  A mere 26 years ago that was!

Wednesday, October 21, 2015

This day ... two years ago (1)

(Re-posting from 2013)
The liberals have always been suspicious about me, and they have good reasons to--they know well that I will depart from many of their views more often than not.  That I am a card-carrying member of the ACLU doesn't convince them that I am a liberal.  A committed liberal is no different from a committed conservative, and they are often no different from a religious fundamentalist either--because I don't worship their gods, I have to be kept at a distance.

Works well for me.  Except that the decisions they make affect my life and the lives of millions of others too.  So, I blog!

First, food and Monsanto.  To the left, Monsanto is like Voldemort.  Perhaps even worse.  A few weeks ago, I blogged about the graffiti on the bike path:


Of course, that is not the only post where I have wondered about such dogmatic opposition to GM food and Monsanto (like here, here, and here.)  Now, I have one more in this post, thanks to this piece on argumentum ad monsantum in the Scientific American blog:
It’s fashionable to think that the conservative parties in America are the science deniers. You certainly wouldn’t have trouble supporting that claim. But liberals are not exempt. Though the denial of evolution, climate change, and stem cell research tends to find a home on the right of the aisle, the denial of vaccine, nuclear power, and genetic modification safety have found a home on the left (though the extent to which each side denies the science is debatable). It makes one wonder: Why do liberals like Maher—psychologically considered open to new ideas—deny the science of GM food while accepting the science in other fields?
You can imagine what happens when you point out to the left how dogmatically ideological they are on some issues, and you point out to the right how dogmatically ideological they are on some issues.  Soon, there is nobody to talk with.   So, I blog! ;)

So, what is the deal with the "Monsanto is evil" religion?
We tend to accept information that confirms our prior beliefs and ignore or discredit information that does not. This confirmation bias settles over our eyes like distorting spectacles for everything we look at. Could this be at the root of the argumentum ad monsantum? It isn’t inconsistent with the trend Maher has shown repeatedly on his show. A liberal opposition to corporate power, to capitalistic considerations of human welfare, could be incorrectly coloring the GM discussion. Perhaps GMOs are the latest casualty in a cognitive battle between confirmation bias and reality.
The confirmation bias continues with food deserts.

No, a food desert is not about lack of food in Darfur or one of those places. 
Food deserts can be described as geographic areas where residents’ access to affordable, healthy food options (especially fresh fruits and vegetables) is restricted or nonexistent due to the absence of grocery stores within convenient travelling distance. 
An informed person somewhere in India or Tanzania or anywhere on the planet will have a tough time imagining Americans not having access to food.  Not only because this is the land of plenty, but also because there are plenty of social institutions--public and private--to address food insecurity issues.  Keep in mind that food deserts are not the same as food insecurity--there will be an overlap between the two, yes, but the food desert concept is above and beyond the real and serious food insecurity issues.

Simple logic tells me that choices increase with affluence and that the poor have fewer choices.  As in life, so with food.  Will it, therefore, surprise us to find that the less affluent have limited access to healthy food options?  Should we worry that Bubba doesn't have access to arugula, that he eats way too much at the neighboring McDonald's instead, and that he bypasses the salads there?

Imagine, if you will, how easy politics will be if the liberals and the conservatives alike ditched their dogmatic and ideological passions and, instead, merely looked at how to solve problems.  Oh, wait, I see Ted Cruz coming to attack me with a hardbound edition of the Dr. Seuss collections for merely suggesting this ;)

Monday, October 19, 2015

Say it loudly? Be bold?

The long presidential campaign season is now well underway.  It has already been noisy, and it will only get worse.

It is a shout-fest out there, it seems.  What's the deal, right?  The loud ones are not necessarily the correct ones.  We have various expressions from the past, like "empty vessels make the most noise."

The explanation comes from centuries before modern communication technologies were even invented!
उच्चैरुच्चरितव्यं यत्किंचिदजानतापि पुरुषेण ।
मूर्खा बहु मन्यन्ते विदुषामपि संशयो भवति ॥
- सुभाषितसुधानिधि
One must talk loudly even when talking about whatever little one knows.
Fools will think you are right, and even the wise may be put in doubt.
- Subhashitasudhanidhi

I like how the couplet notes that "the wise may also be put in doubt."  While I am far from being one of the wise, I can certainly relate to being put in doubt. A few years ago, The Atlantic had a regular column about words and grammar.  The editor/columnist, Barbara Wallraff, featured an interesting question from a reader who was at a loss for a word that could best describe the situation the reader was in.  None of the words in the dictionary worked, and a word had to be invented.

I was terribly excited when they ran my question (thanks to "googling" myself, I found out that mine has even made it into the book--it is there in the collection that Barbara Wallraff has put together):
“I wonder if there is a word for what happens when teachers, like me, grade papers at the end of terms: the incorrect information in students’ papers makes me begin to question my own knowledge. For instance, after grading quite a few papers I begin to ask myself if it is effect or affect; does Switzerland really border a sea? Is there a word to describe this acute sense of ‘unlearning’?”
—Sriram Khe, Eugene, Ore.
It really doesn't take much to get me all excited about life!  As if my question featuring there wasn't enough excitement for me, they also mailed me four books autographed by the respective authors!  Hey, I might not have accomplished much in my career, but there has been enough and more excitement to make it more than a pedestrian existence ;)

The structure was that readers sent in their creative solutions to the grammar/word question, and the editor then published the best:
Temporary inanity is what college English teacher Laura Zlogar, of River Falls, Wis., calls the malady. Deborah Carter, of Walkersville, Md., wrote, “I’m a teacher too, and I’ve always thought of this phenomenon as wisdumb.”
Various people suggested factigue, examnesia, and misleducation— also amissgivings (Anutosh Moitra, of Sammamish, Wash.), bogmindling (Eunice Van Loon, of Biloxi, Miss.), contaminotion (Jim Lemon, of Gladesville, of New South Wales, Australia), errattled (Lisa Bergtraum, of New York City), nonsensery overload (C. Bernard Bar-foot, of Alexandria, Va.), numbleminded (Doug and Kay Overbey, of Maryville, Tenn.), and righter’s block (Carol DeMoranville, of Steward, Ill.).
Tom Dorman, of Sedro-Woolley, Wash., had yet another idea, and he knows whereof he speaks. He wrote: “As a high school teacher, I can sympathize. My ninth-graders have recently convinced me that the Norman Conquest took place in 1951, that Samson and Goliath had a torrid affair (don’t tell the school board), and that car pedium means ‘seize your movement.’ Correct tests like this late into the night to meet your grade deadline and you, too, will feel doubt-witted by your students.”
Be confident. Be bold. Say it like you believe you know it well. Let the others sink in their doubt.

Not!

What we need is "the courage to admit one's own ignorance."  Easier said than done, eh! :(

Sunday, October 18, 2015

Are you a man or a mouse? It could be difficult to tell anymore

During the weeks leading up to the Nobel Prizes, there was plenty of speculation that the researchers behind "Crispr" would win the award, like in this report:
Scientists behind the discovery of a technology called CRISPR-Cas9 that allows researchers to edit virtually any gene they target are among the top contenders for Nobel prizes next month, according to an annual analysis by Thomson Reuters.
Of course, that did not happen:
Two of those responsible for CRISPR existence, Jennifer Doudna (UC Berkeley) and Emmanuel (sic) Charpentier (now based in Berlin at the Max Planck Institute) have surprisingly not won the Nobel Prize for Chemistry, despite Reuter’s prediction. Instead the prize has gone to Thomas Lindahl, Paul Modrich and Aziz Sancar for their DNA repair mechanism with connections at the Francis Crick Institute.
So, it might be next year then.

CRISPR freaked me out right from the first time I heard about it.  It is darn creepy to target a gene and edit it.  As I noted in this post,
Most of the public don't have a clue. And worse, don't spend even a minute wondering, contemplating, studying, about what it means to be human.
The only cool thing is this: the co-discoverers of the particular technology that has apparently made gene editing easy are women scientists.

Dr. Emmanuelle Charpentier and Dr. Jennifer Doudna after their $3 million Breakthrough Prize

I should use the favorite GOP politician line of "I am not a scientist" and then go on to comment ;)

In this NY Times science piece on a recent National Academy of Sciences meeting, the author writes:
Among the scientists describing recent advances was one of Crispr’s pioneers, George Church of Harvard Medical School. In the midst of his presentation, packed with the fine details of biochemistry and genetics, Dr. Church dropped a bombshell.
In a typical experiment, scientists use Crispr to alter a single gene. But in recent work with pig cells, Dr. Church and his colleagues used Crispr to alter 62 genes at once. The researchers hope that this achievement may someday make it possible to use pig organs for transplantation into humans.
But the experiment also raises a deeper question: Could scientists someday alter complicated human traits by manipulating many genes at once?
Are you beginning to freak out now?  Well, hey, welcome to the club!
We are certainly free to imagine a world in which parents have hundreds of genes in their designer babies edited to alter everything from the color of their eyes to their scores on intelligence tests. This experiment on pigs is a far cry from that scenario.
But that doesn’t mean that scientists won’t learn how to alter many genes in one fell swoop.
You see why I have been harping on and on that the single greatest challenge for individuals and societies will be a tough question, but one that does not need too many words: what does it mean to be human?  Make up your own mind about the following sentences from Martine Rothblatt, who is "the Sirius satellite radio founder turned pharmaceutical magnate turned transhumanist":
“Weird does not mean unethical. There’s a 45-degree line on a graph—as long as the utility exceeds the yuckiness, social acceptance wins,” she said. “Taking organs from dead people and putting them in living people once seemed weird, it’s not weird anymore. It would be stupid to abjure nature’s greatest invention since chemistry.”
Oh, btw, Rothblatt's big plan? To grow human organs in pigs at a "factory," like in this rendering:



It's a brave new world!

Saturday, October 17, 2015

This land is my land, this land is NOT your land :(

Almost a decade ago, during a vacation in Alaska, we paid for the touristy plane ride from Fairbanks in order to land a kilometer north of the Arctic Circle. It was educational and exhilarating to understand how far--literally and otherwise--I had come from an original home near the equator.

The touristy package was a tie-in with a local venture, it turned out.  This tourism was an important economic activity for the "native village."  During the guided bus tour in the village, we listened to the guide's descriptions of the people's history and traditions.  In telling the story, he said, "we lived by ourselves for a long time.  One day, somebody told us that our lands belonged to Russians, and we had never seen Russians in our lives.  And then one day they told us that America had bought the lands from Russia, and we didn't know what that meant."

He said all that and more in a tone that was very matter-of-fact.  It was like listening to a voice in a documentary narrating the events.  The content of what he said was not new, of course.  But, somehow, to listen to him say that, and in that particular setting, was very moving.  Very troubling.  I wanted to apologize to him and his people right there.  It might have been because while it is one thing to read about something in history books, it is a completely different emotion to stand right there, in that part of history.

In contemporary life, we rarely ever think about the old stories anymore.  Thanksgiving is slightly more than a month away.  Yet again, I will wonder how a Native American thinks about all the old stories on that Thanksgiving Day.  I will feel awful that my fellow Americans and I have largely forgotten the original peoples.
[The] conquest of the continent is both essential to understanding the rise of the United States and deplorable. Acre by acre, the dispossession of native peoples made the United States a transcontinental power.


A power from the tip of Florida to the settlement above the Alaskan Arctic--by robbing and killing and wiping out the natives.
If slavery was a moral failing, said Lincoln in his second inaugural address, then the war was ‘the woe due to those by whom the offense came’. The rupture between North and South forced white Americans to confront the nation’s deep investment in slavery and to emancipate and incorporate four million individuals. They did not do so willingly, and the reconstruction of the nation is in many ways still unfolding. By contrast, there has been no similar reckoning with the conquest of the continent, no serious reflection on its centrality to the rise of the United States, and no sustained engagement with the people who lost their homelands.
For one, the Native Americans are such a small percentage of the population--a tiny minority (about one percent) in the very lands that once belonged to them.  There are almost as many Indian-Americans in the US as there are Native "Indian" Americans!  And, after having been exiled to reservations, they became a case of out of sight, out of mind.

Even the great Abe Lincoln was not flawless:
In July 1864, for example, President Abraham Lincoln created a reservation within present-day Washington state for the Chehalis people, reducing their once extensive homeland of 5,000,000 acres (by the measure of the Bureau of American Ethnology) to ‘about six sections, with which they are satisfied’ (according to a letter from the Office of Indian Affairs; the measure of ‘satisfaction’ must be judged by the alternative, which was removal and joint occupation of another reservation). As a section is 640 acres, ‘about six’ would have come to about 4,000 acres.
We humans all over the world have committed so many atrocities that there aren't enough minutes in our lives to apologize.  As one who was born into the "uppermost" caste of the atrociously tragic caste-ridden India, I know I have plenty to apologize for.  "What would American history look like if native peoples had been kept in sight and in mind?" is a question that perhaps we can ask ourselves as we sit down for Thanksgiving.