Apple has patented a system of interconnected sensors

When it comes to patents, Apple is among the most active companies in filing and amassing them; recent examples include a potential weightlifting tracker for the rumored iWatch and a smart cover with integrated LEDs. Cupertino is not content to stop there, as the USPTO has recently revealed three more patent applications filed by Apple. All three carry a December 14, 2012 filing date, and they touch on features that would seem more at home on the rumored iWatch than anywhere else. The iWatch wearable timepiece is said to be in trials, and if all goes well, it could be slated for an October launch later this year. One of the filings points to sensors located on an iPhone as well as on a wrist-worn device; these sensors can figure out whether the wearer's pulse is starting to race, even while the wearer remains stationary.
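The filing itself does not disclose an algorithm, but the stationary-pulse idea boils down to a simple check: flag a heart rate well above the wearer's resting rate while motion sensors report no activity. A minimal sketch, with a hypothetical threshold and function name of our own invention:

```python
def pulse_alert(heart_rate_bpm: float,
                resting_rate_bpm: float,
                is_stationary: bool,
                threshold: float = 1.3) -> bool:
    """Flag a racing pulse while the wearer is not moving.

    `threshold` (30% above resting rate) is an illustrative value,
    not something specified in Apple's filing.
    """
    return is_stationary and heart_rate_bpm > threshold * resting_rate_bpm
```

For example, a reading of 100 bpm against a 60 bpm resting rate would trip the alert only when the motion sensors say the wearer is still.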

A batch of Apple patent filings published on Thursday describes a system of interconnected sensors, some of them wearable devices, that work with an iPhone hub to monitor activity levels, dynamically set or cancel alarms, and manage push notification settings, among other automated tasks. The US Patent and Trademark Office published three Apple patent applications covering a method in which an iPhone, along with one or more remote wearable sensors, gathers and processes raw data to track a user's activity level and to control certain scheduling functions like alarms.

One filing, titled "Method and apparatus for personal characterization data collection using sensors," offers a comprehensive overview of Apple's system, which calls for a smartphone or similar portable device to respond automatically to data sent by various sensors. When an action or motion is sensed, a signal is relayed to the hub device for processing. An iPhone 5 is offered as an example device, with the usual assortment of sensors including a gyroscope, accelerometer, proximity sensor, ambient light sensor and location sensors, among others.

In Apple's proposed method, the iPhone receives incoming data from these on-board components as well as from at least one remote sensor worn by the user. After processing the raw motion data, the system can deduce what a user is doing (running, walking, sleeping, etc.) and execute a number of automated tasks. With the first application, the task is to generate a "personal scorecard" that sheds light on a user's lifestyle.
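The pipeline the filing describes, in which the hub pools raw readings, infers an activity, and maps it to actions like cancelling an alarm, can be sketched in a few lines. Everything here is an assumption for illustration: the class names, the variance thresholds, and the naive variance-based classifier stand in for whatever Apple's actual processing would be.

```python
from dataclasses import dataclass
from statistics import pstdev

@dataclass
class SensorReading:
    source: str               # e.g. "iphone_accelerometer" or "wrist_sensor"
    magnitudes: list          # accelerometer magnitude samples, in g

def classify_activity(readings: list) -> str:
    """Guess an activity from motion variance across all pooled sensors.

    A real system would use far more sophisticated models; this just
    illustrates the hub deducing running/walking/sleeping from raw data.
    """
    samples = [m for r in readings for m in r.magnitudes]
    spread = pstdev(samples) if len(samples) > 1 else 0.0
    if spread < 0.05:
        return "sleeping"
    if spread < 0.5:
        return "walking"
    return "running"

def automated_tasks(activity: str, alarm_set: bool = True) -> list:
    """Map the inferred activity to hub-side actions, per the filing's idea."""
    if activity in ("walking", "running") and alarm_set:
        # User is clearly awake: the wake alarm is no longer needed.
        return ["cancel_wake_alarm", "mute_push_notifications"]
    if activity == "sleeping":
        return ["enable_do_not_disturb"]
    return []
```

A wrist sensor streaming nearly constant magnitudes would classify as sleeping and enable do-not-disturb, while high-variance readings would classify as running and cancel the morning alarm.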
