Saturday, September 29, 2012

From healthcare to education -- or is it all the same?

I'm back.

After a four-year sabbatical in healthcare services (more specifically, health insurance), my passion for applying emerging technologies to societal needs has brought me back to education and workforce development, a field I have had the pleasure of being engaged in many times before.

However, the excursion into healthcare was no waste of time, as challenging as it might have been. It unearthed many interesting parallels and correlations. My new colleagues must be sick of me constantly describing those parallels in order to create the permission space to apply previous experiences to current challenges. So, here are some of those observations -- in no particular order -- that lead me to believe that it is the educational system that needs to be fixed first before we can hope to successfully tackle other societal challenges like healthcare, poverty, or multiculturalism.

Both healthcare and education are very complex systems of systems. It is these ecosystems that make organic change hard, since the involved stakeholders have found mutually beneficial business models that allow them to co-exist. Changing those business models requires a significant amount of innovation -- and time -- which might represent too much of a risk for existing stakeholders to expose themselves to.

Both ecosystems address a fundamental and recognized human need. Historically, however, health and education were privileges of the wealthy, and the complexity of our ecosystems has prevented them from catching up with the modern recognition of the right to healthcare and education. Consequently, while everybody is talking about those rights, nobody is willing to pay or be accountable for them.

Both ecosystems are collecting significant amounts of data, which theoretically could be used to mine valuable insights toward improving the efficacy of the systems, empowering the individuals they serve, and developing new business models to change the fabric of those ecosystems from the ground up. However, the IT infrastructure of both industries is archaic, focused on transaction management, billing, and compliance. Major efforts are underway to integrate data from various historical data sources and create more transparency around newly created data -- so-called Big Data initiatives. However, the end users who could actually benefit from such data, and who could contribute to generating insights by injecting their unique experiences, have never been educated or trained in analyzing, correlating, mashing, and communicating data in an exploratory and audience-centered fashion. So there is an additional need for significant data literacy that first has to find its way into the educational system -- K-12 and post-secondary -- before any of those data and analytics democratization efforts can be successfully implemented.

From a human perspective, there is much work to be done in the area of behavioral economics in order to get to the roots of why individuals behave in a certain way -- and thus how we can motivate them (and then remind them) to take proper action. We all know that it would be better to eat the broccoli and carrots, but when a tray of cookies is sitting in the office, why can't we resist? We all know that we should plan ahead and not procrastinate, make our decisions based on information rather than hearsay, and choose a career that matches our skills and interests as well as economic requirements rather than what seems trendy and safe. So why do we get it all wrong most of the time?

Once individuals' motivations and triggers are identified, there is the opportunity to create much more effective and successful product and service offerings, and much more compelling and seductive user experiences, through user-centered design (UCD). However, both industries -- while heavily employing the term UX -- are driven by the misconception that their subject-matter experts "know what the customer wants," consequently reducing UCD to focus group consultations and justifying design decisions with limited anecdotal evidence. More often than not, even that 'customer' is merely another stakeholder in the system -- often a payor -- and not the ultimate consumer of the product or service. It is, therefore, no surprise that despite decades of published research in user-centered design, usability, and human factors, the majority of products and services addressing basic human needs are still designed to optimize transaction management rather than user experience.

Finally, both industries are starting to realize that taking a myopic view of their inherent challenges is insufficient to arrive at sustainable and scalable solutions, and that a more holistic view of people's well-being should guide how solutions are developed and their success assessed. In healthcare as well as education, financial economics dictate much of the potential success, whether through the quality of available specialists, schools, or colleges. At the same time, we are painfully aware of how school and job stress can have significant emotional and physical health implications, and, vice versa, how poor health inhibits school and career performance. We further realize how liberating it is, in any of those systems, to share one's own experiences with people in the same situation, and how rewarding it is to apply those experiences to help or advise others, whether through mentoring, tutoring, stewardship, or care. So there might even be interesting social business models out there in which one aspect of a person's well-being subsidizes addressing the needs of others. In the absence of such business models, however, both ecosystems -- healthcare and education -- struggle to take a more holistic view of people's life satisfaction.

What types of social business models come to mind to address any of the dilemmas outlined above?

Sunday, August 30, 2009

Do New Media Hinder Literacy or Create New Literacy?

I have been very critical of New Media in the past for the way they affect tween and teenage literacy and communication capabilities. The limited way in which tools like Twitter, Facebook, and SMS support inter-human communication, and the way in which young audiences respond to those limitations by adopting an abbreviated, cryptic, and highly ambiguous code as language, seems frightening -- especially when this code then also makes it into traditional written language: A few months back I received a pile of thank-you letters from my son's class for a Show & Tell I had given. To my surprise, the letters of these then seventh-graders were full of orthographic and grammatical errors. Moreover, many of the orthographic mistakes the students made were clearly related to the way they communicate in the aforementioned New Media, 'bcause', 'wat', and 'yu' being just a few of them.

However, maybe I am thinking too narrow-mindedly? Clive Thompson on the New Literacy references Andrea Lunsford, a Stanford University professor who is studying this phenomenon and actually postulates that New Media make children write more, more voluntarily, and for an existing audience instead of for the teacher alone. All of these findings are clearly positive trends that might in the long term improve children's cognitive, communication, and problem-solving abilities.

If this is the case, then does the language really matter, as long as everybody agrees on it?

Or is there a limit to the depth of thinking and articulation that the New Media language imposes on us, suited only for newscasting but not for critical analysis and opinionated discussions?

I guess only time will tell. In the meantime, I am taking the safe route: exposing my children to New Media tools while encouraging and supporting them in enjoying old media such as books and dinner-table conversations.

Wednesday, July 15, 2009

Cross Reality: When Sensors Meet Virtual Reality

Reality is time-bound, costly, and sometimes dangerous. Virtual Worlds could theoretically be a safer, more effective 'playground' to experiment with costly and potentially dangerous objects and situations, to meet and collaborate without travel, or just to socialize anonymously. However, Virtual Worlds still lack real use due to a lack of real-world connectivity -- the ability to tie real-world events (and thus data) to the virtual experience. Cross Reality holds the promise of overcoming those shortcomings. Using a variety of sensors, media feeds, and input modalities, Virtual Worlds are becoming a true platform for telepresence applications, an alternate interface to synchronous collaborative applications in cyberspace: Cross Reality: When Sensors Meet Virtual Reality.