Mobile devices are personal. They are ours, and we bring them to work with us. We personalize our home screens and wallpapers. We install apps and widgets. We add contacts and take photos of people we know.
These devices know our most intimate details. You bank with them. You check your medical data. You text family members daily updates on your latest news and whereabouts without a second thought. You make video calls and maybe even send Snapchats for someone’s private consumption.
Our learning on these devices has mostly been devoid of this level of personalization. In the pursuit of just-in-time, we have largely neglected the just-for-me aspects of mobile learning.
Learning on these highly personal devices should be personalized, as well.
Many sites and apps get this.
With Yelp, you log in and get a list of restaurants around you, with easy access to your friends’ reviews. The Weather Channel app knows your favorite locations, for instant retrieval of weather conditions at your parents’ home or your upcoming vacation destination.
Apple’s Passbook app is indispensable to me, and it may be the current ultimate example of inferring intent. It knows where I am going next, what coffee shop I frequent, and when my credit card bill is due. It knows what I want to do on the day I do it, and whether my flight is delayed.
This schism between our real-world activity and learning isn’t that surprising. Our previously created learning content rarely took personalization into account, either. One size fits all was de rigueur.
Our eLearning has made some small attempts at turning this around for a while, modest as they might be. Sure, most courses can restart us where we left off, know when we completed them, and probably even remember what we answered on practice exercises or the final assessments. But what does the course really know about us? What do we need? What are we trying to accomplish today?
Our mobile devices know this. Between providing our email, calendar events, and contacts, getting our location, and syncing with our time zone, the mobile screens we carry know more about us than our grade school friends ever did.
Mobile learning can leverage this in-depth knowledge of us for our benefit. Even the simplest HTML5 Web experiences have rich, powerful databases available to them to store data, preferences, and history. Web SQL (now deprecated), IndexedDB, and various tutorials on offline programming and databases for iOS and Android offer examples and tips on how to tailor a Web experience to the user, personalizing it and storing information about the user’s needs.
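To make this concrete, here is a minimal sketch of a client-side learner profile. In a real web app, the `store` object below would be backed by IndexedDB (or a similar persistent browser database); a plain in-memory `Map` stands in here so the logic stays easy to follow. All names (`recordVisit`, `resumePoint`, and so on) are illustrative, not from any particular library.

```javascript
// Stand-in for a persistent IndexedDB object store.
const store = new Map();

function saveProfile(userId, profile) {
  store.set(userId, profile);
}

function loadProfile(userId) {
  // Return a stored profile, or a fresh empty one for a new learner.
  return store.get(userId) || { userId, visits: [], preferences: {} };
}

// Record that the learner viewed a piece of content, with a timestamp,
// so the experience can be resumed and tailored later.
function recordVisit(userId, contentId) {
  const profile = loadProfile(userId);
  profile.visits.push({ contentId, at: Date.now() });
  saveProfile(userId, profile);
  return profile;
}

// "Restart where we left off": the most recently visited content item.
function resumePoint(userId) {
  const { visits } = loadProfile(userId);
  return visits.length ? visits[visits.length - 1].contentId : null;
}

recordVisit("ada", "module-1");
recordVisit("ada", "module-2");
console.log(resumePoint("ada")); // → "module-2"
```

The same shape of record, once persisted, is what lets a course answer questions beyond "where did you leave off": what you looked at, when, and how often.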
Just like many Web content management systems and social sites capture our history and reflect it back to us, we need to start tailoring our mobile learning, performance support, and job aid experiences to take into account the user’s true needs. On a large scale, this requires data storage and bandwidth, two things we currently have in abundance.
Data storage aside, there is a need for some powerful algorithms if we are truly going to provide information that is just for me. A growing list of articles, SDKs, engines, and APIs is emerging to help you do just that.
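Such an algorithm need not start out sophisticated. The toy sketch below ranks candidate learning resources by combining a few contextual signals: topics the learner has engaged with before, a declared job role, and time of day. The signal names and weights are invented for illustration; a production system would learn them from data rather than hard-code them.

```javascript
// Score one resource for one learner, given a context object describing
// that learner's history, role, and current time. Weights are arbitrary.
function scoreResource(resource, context) {
  let score = 0;
  // Past expressed purpose: boost topics the learner has engaged with.
  for (const tag of resource.tags) {
    score += (context.tagHistory[tag] || 0) * 2;
  }
  // Declared role: boost resources aimed at the learner's job role.
  if (resource.audience === context.role) score += 3;
  // Context: short-form content ranks higher during the workday.
  if (context.hour >= 9 && context.hour <= 17 && resource.minutes <= 5) {
    score += 1;
  }
  return score;
}

// Return the catalog sorted best-match-first for this learner.
function recommend(resources, context) {
  return [...resources].sort(
    (a, b) => scoreResource(b, context) - scoreResource(a, context)
  );
}

const catalog = [
  { id: "intro-safety", tags: ["safety"], audience: "technician", minutes: 4 },
  { id: "sales-201", tags: ["sales"], audience: "rep", minutes: 12 },
];
const context = { tagHistory: { safety: 3 }, role: "technician", hour: 10 };
console.log(recommend(catalog, context)[0].id); // → "intro-safety"
```

Even a crude linear score like this already produces a different course list for a technician at 10 a.m. than for a sales rep at midnight, which is the essence of just-for-me delivery.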
In 2011, Eric Schmidt took the stage and talked about the next logical step for search:
“Think of it as a serendipity engine,” Schmidt said. “Think of it as a new way of thinking about traditional text search where you don’t even have to type.”
What we are really talking about is inferring intent via context, past expressed purpose, taste, and our social network behavior. This is a powerful recipe for creating a perfect learning experience for anyone, no matter where they are. I won’t dive into all of the elements listed there; you should read the details yourself. Business Insider also covered this topic.
It doesn’t stop there. Natural language processing and digital agents will revolutionize the way we receive information that is important to us. Why do I need a USA Today on my hotel room’s doorstep when I have Flipboard and Siri? Almost all of the information in an average paper is not relevant to me, but in a user-agent-assisted or personalized account that knows my interests and history, nearly everything is something I want to read. Certainly, there is a danger that looking at the world only through a lens of your own making could induce a sort of tunnel vision, but with some filters and additions, I can get a blend that remains relevant with little extra filler. Tell Siri to find articles you might want to read? Maybe not yet, but soon.
There is a lot of research going on in this area, of course. Ruder Finn published an exhaustive survey that outlines a wide array of use cases segmented by demographics. Their Mobile Intent Index asked respondents how frequently they use their mobile phones to go online for 295 reasons. The primary reason to go online was instant gratification. While the study’s excerpt states that, “mobile phones are not a learning tool,” 64 percent of all users say they go online with their mobile device to educate or research.
Intent has been a human-computer interaction (HCI) area of research for some time, with a 2004 paper that covered a “method of programming robots to automate motor tasks by inferring the intent of users based on demonstrations of a task,” and a very interesting piece from 2012 on “Language intent models for inferring user browsing behavior.” Recall Schmidt’s quote on searching without having to type? Google I/O 2013 unveiled just that.
Not to be outdone (at least not by too much), Yahoo! Labs also has a strong research area on this topic. Tracking trends, providing context for search, and much more are just around the corner, if not already here in one form or another.
There are fantastic things right around the corner, and some tools for inferring your learners’ intent are already available to try out. Don’t neglect the just-for-me in your pursuit of just-in-time.