We found this link via the Wired article, and just couldn’t wait to share it with you.
Leafsnap may be one of the best, most pure forms of contextual mobile learning we’ve seen. This is right up there with WordLens in our book.
You can learn more from the article and via the iTunes store, but the best way to learn about it is to simply download it and try it out. It worked pretty well for us on the 6 leaves we tried it on around the office, correctly identifying about half of them. This may be because we’re located in the Midwest and the app is fairly East Coast-centric at this point. That will likely change as the database grows and the user base of crowdsourced botanists continues to fill it up. Can’t wait to take it out hiking in the park later!
The one thing that could perhaps be improved is its recognition on a non-white background, which would allow you to identify plants that you might not want to touch. As it is right now, it requires a well-lit environment and a nearly pristine white background. We also experienced a few crashes on one of the devices we were running it on.
Nonetheless, an impressive first release and a great statement on the power of using computer vision and crowdsourcing together for some great mobile learning.