“EYECAN+ is the result of a voluntary project initiated by our engineers, and reflects their passion and commitment to engage more people in our community,”
— SiJeong Cho, Vice President of Community Relations at Samsung Electronics, commenting on the launch of the EYECAN+
On November 25th, 2014, Samsung Electronics Co., Ltd. rolled out what may be its most innovative product yet: EYECAN+, an eye-tracking replacement for the mouse that is controlled entirely by the user’s eye movements.
It’s an upgrade of its predecessor, the EYECAN, which made its debut back in March 2012, with significant improvements in calibration and the User Interface (UI) thanks to the work of Hyung-Jin Shin, a graduate student in computer science at Yonsei University in Seoul.
A quadriplegic, Shin worked with Samsung on the original EYECAN and, over the course of seventeen (17) months, was the guinea pig for testing the burgeoning array of functions and commands that have since been distilled into a simple box, one that makes interacting with a computer practical and enjoyable for disabled users.
The full team that assisted Hyung-Jin Shin is listed on the very detailed EYECAN Project Website, which will also let some of my readers bone up on their Korean:
- 조성구 (Alex Sunggoo Cho): PM, HW, User Test
- 정진용 (Jinyong Chung): Connector
- 유경화 (Leena Kyunghwa Yu): Communication
- 이준석 (Jun Seok Lee): UX (planning & design)
- 이상원 (Sang-won Leigh): Software, Design
This, along with a dizzying list of YouTube videos charting each stage of progress they made during the development of the EYECAN+, makes the EYECAN Project Website worth a visit and some quality viewing time.
The newly updated EYECAN+ is the latest in a trend of peripherals hitting the market based on eye-tracking technology, allowing users to interact with their computer without having to physically move a mouse or any other peripheral with their hands.
A similar soon-to-be-released product, made by the Swedish company Tobii Technology for Pizza Hut, tracks your gaze and calculates in less than three (3) seconds which ingredients you want on your pizza. That commercial product suggests that Samsung’s EYECAN+ may have practical applications beyond assisting the disabled.
Samsung EYECAN+ – Upgrade to EYECAN with a dash of Samsung Galaxy Note 4
Interestingly, the user is not required to wear any special glasses to use the EYECAN+ system. You can basically think of EYECAN+ as a kind of Microsoft Kinect-esque device for the eye, as it requires the same type of calibration.
Housed in a portable box sitting below the monitor, EYECAN+ merely requires that the user sit between 60 cm and 70 cm from the screen.
No special posture or pose is required; once calibrated, the device recognizes each user based on the unique pattern of their iris or retina. You can also adjust the sensitivity of the EYECAN+ scanner so that it calibrates better and responds faster during actual use.
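To make the calibration idea concrete, here is a minimal sketch of how a per-user calibration step could map raw gaze-sensor readings to screen coordinates. This is my own illustration, not Samsung’s actual code: the function name, the linear (scale-plus-offset) model, and the sample numbers are all assumptions.

```python
# Illustrative sketch (not Samsung's implementation): per-user calibration
# fits a simple linear model, screen ≈ a * raw + b, for one axis using a
# least-squares fit over points gathered while the user looks at known targets.

def fit_axis(raw, screen):
    """Least-squares fit of screen ≈ a * raw + b for a single axis."""
    n = len(raw)
    mean_r = sum(raw) / n
    mean_s = sum(screen) / n
    var = sum((r - mean_r) ** 2 for r in raw)
    cov = sum((r - mean_r) * (s - mean_s) for r, s in zip(raw, screen))
    a = cov / var              # scale
    b = mean_s - a * mean_r    # offset
    return a, b

# Hypothetical calibration data: the user fixates three known screen targets.
raw_x = [100, 300, 500]        # raw sensor x readings
target_x = [0, 960, 1920]      # known on-screen x positions (1920-wide display)
a, b = fit_axis(raw_x, target_x)
print(round(a * 200 + b))      # a raw reading of 200 maps to ~480 on screen
```

A real tracker would fit both axes (and likely a fuller affine or polynomial model), but the principle of adjusting per user, as EYECAN+ does, is the same.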
How the user interface works after calibration seems to borrow from the floating menu of the Samsung Galaxy Note 4, as described in Arrival of the Samsung Galaxy Note 4 – a Selfie-Themed Affair, with a rectangular menu board being the less exciting option.
Both contain a matrix of eighteen (18) commands, which are selected by eye movements, with a single blink used to confirm the chosen command. The eighteen (18) commands include some “mouse”-obvious ones:
- Drag and drop
- Select all
- Zoom in
You can also create custom lists of additional commands, sorta like macros, built from combinations of the basic eighteen (18), such as “close program” (Alt+F4) and “print” (Ctrl+P).
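A macro of this kind boils down to a named mapping from one eye-selected command to a sequence of keystrokes. The sketch below is purely hypothetical, my own illustration of the idea; the names, structure, and key labels are assumptions, not EYECAN+’s actual command set.

```python
# Hypothetical sketch of EYECAN+-style custom commands: each macro maps a
# single eye-selected entry to the keystroke sequence it would send to the OS.

BASIC_COMMANDS = {
    "select_all": "Ctrl+A",
    "copy": "Ctrl+C",
    "zoom_in": "Ctrl++",
}

CUSTOM_MACROS = {
    "close_program": ["Alt+F4"],
    "print": ["Ctrl+P"],
    # Combining two basic commands into one eye-selectable macro:
    "copy_all": [BASIC_COMMANDS["select_all"], BASIC_COMMANDS["copy"]],
}

def run_macro(name):
    """Return the keystroke sequence a macro would emit (empty if unknown)."""
    return CUSTOM_MACROS.get(name, [])

print(run_macro("copy_all"))  # ['Ctrl+A', 'Ctrl+C']
```

One blink on “copy_all” would then stand in for two separate mouse-and-keyboard actions, which is exactly the kind of saving that matters when every command costs an eye movement.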
Samsung goes Open Source – Individuals and Companies to build applications for EYECAN+
Strangely, Samsung harbours no plans to commercialize the EYECAN+. Instead, Samsung plans to manufacture units for donation to charity organizations, to help make life more comfortable for people with similar disabilities.
To quote the developers of the EYECAN+ on their EYECAN Project Website: “Though EyeCan may seem like a simple device, we are hopeful it can help improve the quality of life for those suffering from Lou Gehrig’s disease (ALS) and Locked-in syndrome (LIS). We really enjoyed making the EyeCan, and since this is an open-source platform, we hope that more and more people will jump in to improve the device. EyeCan is currently not for sale. The EyeCan project team is providing only the technology. Our hope is that this technology can spread to reach people in need”.
But they might have commercialization in mind after all, just via an Open Source business model. Plans are afoot to make the EYECAN+ Open Source, with individuals and companies getting Software Development Kits (SDKs) to design and build products that they can commercialize.
This trend towards eye-tracking peripherals for controlling computers may one day be the way the rest of us able-bodied humans travel without passports, shop, and even get diagnosed for illnesses without visiting the doctor!
It’s a true herald of a world controlled by your voice or motion, as I’d predicted in my article Siri and Kinect: Heralds of a coming world free of Remote Controls. I’ll definitely be keeping my eye on this latest goodwill gesture from Samsung!