Employing Mobile Device Sensors for Enhanced Learning Experiences

The Seventh Sense: Using Haptics, Light Sensors, Accelerometers, Barometers, and More to Create Innovative Learning Solutions


We’ve been exploring some of the things that make mobile devices unique and particularly well suited to provide just-in-time information to the mobile learner. Everything from location services to security, the camera to stored preferences, and push notifications to gestural input has been featured on the Float blog.

We aren’t done yet, though.

In fact, we’re just getting to some of the most interesting pieces that constitute the devices in our pocket. Let’s explore them here.

The sensors you carry around with you on a daily basis in your smartphone and tablet are collecting data. This data can be used to learn an awful lot about your immediate surroundings and enhance your awareness to help you make better-informed decisions.

This should come as no surprise to anyone who works with lots of data, but of course, data by itself is not the important thing. Data, when interpreted as information, can provide insight. Insight, when applied, can create a platform for action.

So, in essence, the device acts as a conduit to our surroundings, and applications or services that interpret its data can give the user insight that leads to better, more successful actions. Fewer mistakes, better outcomes, more productivity and more safety can all come from successful use of the data gathered by the devices we carry with us everywhere.

I sense something, a presence I’ve not felt since…

It could seem like we’ve gone a bit off the ranch here, but as you may know, we’ve already mentioned several times that these devices are essentially the current-day equivalent of the Star Trek universe’s tricorder, which Wikipedia describes as “a multifunction hand-held device used for sensor scanning, data analysis, and recording data.”

The sensors we’ve discussed – cameras and GPS, among others – are pretty much universal. That is, all the devices we would consider smartphones or tablets have them on board, though many other sensors are still a little more fringe. It’s clear we are just at the beginning of the sensor age on these ubiquitous devices. A few non-typical sensors that we’ve spotted in the wild (with a sketch for detecting them just after this list):

  • light sensor
  • barometer
  • pressure sensor
  • thermometer
  • altimeter
  • humidity sensor
  • magnetometer
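
If you want to know which of these a given device actually carries, Android exposes the full inventory at runtime. Here’s a minimal sketch – the SensorManager API is standard Android, while the activity and log tag are just illustrative:

```java
import android.app.Activity;
import android.hardware.Sensor;
import android.hardware.SensorManager;
import android.os.Bundle;
import android.util.Log;

import java.util.List;

public class SensorInventoryActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        SensorManager manager = (SensorManager) getSystemService(SENSOR_SERVICE);
        // Ask the OS for every sensor this particular device has on board.
        List<Sensor> sensors = manager.getSensorList(Sensor.TYPE_ALL);
        for (Sensor sensor : sensors) {
            Log.d("SensorInventory", sensor.getName()
                    + " (vendor: " + sensor.getVendor() + ")");
        }
    }
}
```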

The Android world seems to be leading the way in adding “extra” sensors to devices. For example, if you want a thermometer-enabled device in the Android world, simply pick up a Samsung Galaxy S4, but if you want the same feature on your iPhone, you’re going to need to buy a Thermodo. Certainly doable, but considering the built-in sensors in the Android devices likely cost only a few dollars, you can see where adding sensors to non-enabled devices can become unnecessarily costly.
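
Reading that built-in thermometer is straightforward on Android. Here’s a minimal sketch using the standard SensorEventListener pattern – TYPE_AMBIENT_TEMPERATURE is the real Android constant, while the wrapper class is illustrative:

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

public class TemperatureReader implements SensorEventListener {
    private final SensorManager manager;
    private final Sensor thermometer;

    public TemperatureReader(SensorManager manager) {
        this.manager = manager;
        // Returns null on devices without the hardware, so always check before use.
        this.thermometer = manager.getDefaultSensor(Sensor.TYPE_AMBIENT_TEMPERATURE);
    }

    public void start() {
        if (thermometer != null) {
            manager.registerListener(this, thermometer,
                    SensorManager.SENSOR_DELAY_NORMAL);
        }
    }

    public void stop() {
        manager.unregisterListener(this);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        float celsius = event.values[0]; // ambient temperature in degrees Celsius
        // Hand the reading off to your app or learning content here.
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // Not needed for this sketch.
    }
}
```

That null check is the important part: write your app so it degrades gracefully on the many devices that don’t carry the sensor.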

We’re Just Getting Warmed Up

This is really just the tip of the iceberg. Where these sensors are now a luxury item or a hit-and-miss proposition, we’re going to see an influx of them across the buying spectrum as the technology becomes cheaper and smaller. With these additional sensors, we’ll see the growth of a software segment called “appcessories.” Of course, if you have sensors, you need the apps to provide information and insight from the new data you’re gathering, right?

With the next-next generation of devices and the third-party adapters and add-ons for them, we could see things like air quality sensors, motion detection, EKG/pulse and blood pressure monitors, and pH/alkalinity sensors in the not-too-distant future. Touch-free gestural inputs and the connected nature of wearable computing all fit right in with this line of thinking. Your entire body and the environment around you become the just-in-time information gatherer and delivery mechanism.

Though it may be a little frivolous, want to see something available today that is certainly a sign of things to come? Check out these brainwave-sensing cat ears you can buy right now to let everyone know just what you are feeling. Goofy? Definitely! But when similar technology can be used to provide brain scans or MRI data in real time via consumer technology avenues, we will have reached a breakthrough in health, science and human augmentation.

What sorts of things can we do with these sensors?

Of course, there are the immediate applications the sensors add directly to your user experience, such as being able to measure things like light levels, temperature and more. These immediate uses give your audience more inputs about the world around them and can help them make more accurate decisions on an individual basis.
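
For instance, the light sensor reports illuminance in lux, and an app could compare that reading against a threshold before signing off on a task. A small sketch – the 320-lux cutoff is purely illustrative, not a real standard:

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;

public class LightLevelChecker implements SensorEventListener {
    // Hypothetical threshold for "adequately lit" detail work; tune for your use case.
    private static final float MIN_LUX = 320f;

    @Override
    public void onSensorChanged(SensorEvent event) {
        float lux = event.values[0]; // illuminance in lux from Sensor.TYPE_LIGHT
        if (lux < MIN_LUX) {
            // e.g., surface a warning or a job-aid step in the app
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```

Register it for Sensor.TYPE_LIGHT with the same listener pattern shown for the thermometer above.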

But what if the apps driving these “appcessories” are gathering the data in aggregate form?

The creation of big data on your workers’ surroundings could change a lot about how we do business today.

Are your workers constantly in high-humidity situations? What are the temperatures like at the jobsite? Are your corridors adequately lit for workers to perform the maintenance you want them to do? Are your workers using their devices while driving even though they shouldn’t be?
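
On the technical side, aggregation could be as simple as each device periodically posting its readings to a collection service. A sketch under stated assumptions – the endpoint and payload shape are hypothetical, and real code would batch, retry and run off the main thread:

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class ReadingUploader {
    // Hypothetical collection endpoint; substitute your own service.
    private static final String ENDPOINT = "https://example.com/api/readings";

    public static void upload(String deviceId, float lux, float celsius)
            throws Exception {
        // Illustrative JSON payload; a production app would use a JSON library.
        String json = String.format(
                "{\"device\":\"%s\",\"lux\":%.1f,\"celsius\":%.1f}",
                deviceId, lux, celsius);
        HttpURLConnection conn =
                (HttpURLConnection) new URL(ENDPOINT).openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);
        OutputStream out = conn.getOutputStream();
        out.write(json.getBytes("UTF-8"));
        out.close();
        conn.getResponseCode(); // trigger the request; check and retry in real code
        conn.disconnect();
    }
}
```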

You can see where this quickly moves beyond the typical view of what learning is. Maybe the role of the learner gets flipped here. The user is the creator, and we may be the learner.

What’s Next, and How Do I Get Started?

What is going to drive the addition of these sensors to the devices? To understand this, I recommend reading this example of how the thermometer made it into the Samsung Galaxy S4. It’s going to require openness from the platform providers to include some of these required hooks in their OSes, and sensor manufacturers may also need to write their own driver code to persuade the hardware manufacturers to put the sensors in their phones. It’s going to start slowly, but competition will heat up (ahem).

Another interesting way to approach this is by creating the sensor device you need and then applying for something like Apple’s MFi program, through which you can create AirPlay- or Lightning-connector-enabled add-on devices for iOS. There are lots of requirements for such a program, so be sure to check whether you qualify first.

It’s clear there are many areas of growth here to pay attention to. The best thing you can do right now is keep your ears and eyes open. Take a look at your target platforms and devices if there is a chance you’ll want to access something in the user’s immediate surroundings and help them do their job better.

You may be pleasantly surprised to find out that you can assist them in ways that seemed like science fiction only months ago.

Use Wayfiler on Your Summer Vacation!

School starts back up again in a few weeks, but there’s still time left to soak in some of America’s greatest sites.

Download Wayfiler for free, and when you visit nearly any U.S. national park, you’ll see a park guide has been tagged for you for easy access.

Some of the more than 40 parks tagged include…

  • Carlsbad Caverns National Park
  • Grand Canyon National Park
  • Great Smoky Mountains National Park
  • Hawaii Volcanoes National Park
  • Rocky Mountain National Park
  • Yellowstone National Park



Training Magazine: mLearning Case Studies Webinar on August 20


Float’s director of client relations, Scott McCormick, presents a free Training Magazine webinar on mobile learning case studies at 12 p.m. CT on Tuesday, Aug. 20, 2013.

Scott will look at…

  • The original motivation for implementing mobile learning
  • The conclusions of the mLearning strategy
  • The goals and objectives of the application
  • The instructional and graphical user interface design choices
  • The development approach
  • How security was handled
  • The method of final delivery


Chapter 1 of Learning Everywhere Now Available in MP3 and PDF

Once an organization has decided to “go mobile,” it may seem as though the hardest decision has been made.

However, you must consider the kinds of content you’ll be putting on the many varieties of mobile devices.

In Learning Everywhere, Chad Udell, a seasoned expert on mobile learning, demystifies the many choices involved in developing mobile learning content, and provides real-world experience on how to get down to the business of creating mobile learning.

We are offering you the first chapter of Learning Everywhere for free in both PDF and MP3 format. Perhaps this chapter will help you think of another way mobile learning can help your organization.

In this excerpt, Chad discusses the following benefits of mobile learning:

  • Increased productivity
  • Increased sales
  • Improved communication
  • Reduced overall cost of learning
  • Measurable ROI
  • And more!



As managing director of Float, Chad Udell designs, develops and manages interactive Web and mobile projects. Chad has worked with industry-leading Fortune 500 companies and government agencies to concept, design and develop award-winning experiences. Chad is recognized as an expert in mobile design and development, and he speaks regularly at national and international events and conferences on related topics. In 2012, Chad released his first book, “Learning Everywhere: How Mobile Content Strategies Are Transforming Training.” In 2014, he co-edited the book, “Mastering Mobile Learning: Tips and Techniques for Success” with Dr. Gary Woodill, Ed.D.

