Apple is researching ways to identify signs of childhood autism through its iPhones. In practice, that means an iPhone could screen your child, or your neighbor's, for autism. According to a report in the Wall Street Journal, the company is investigating how to use the iPhone's camera to detect early signs of autism, a condition that affects many American children. What we are reading is a bid to win user acceptance for body surveillance.
Apple has partnered with Duke University for this research, which relies on advanced medical technology powered by artificial intelligence and machine learning. The primary research team at Duke University will lead the study, while Apple's tech team will be involved on the data-training side.
So far, according to reports, the iPhone's camera would be used to gauge how young children focus, monitoring back-and-forth body movements and signs of disinterest. The technology is said to track a child's face and various facial expressions, looking for indicators of the condition before symptoms become apparent.
Before this partnership with Duke University, Apple partnered with Biogen, an American biotechnology company, on a similar effort: exploring how the iPhone and Apple Watch could screen users for signs of depression and cognitive decline.
Although it is too early to judge whether an iPhone can reliably detect early childhood autism, there is clearly a constant push from Apple to make its products viable in the health-technology sector. The same ambition is visible in Apple's efforts to monitor depression and blood pressure and to track alcohol and blood-sugar levels.
The point of concern for the privacy-minded is how Apple uses the data. None of these technologies can succeed without Apple tapping into sensitive data spanning childhood to adulthood. A child's behavioral patterns and an adult's physiological measurements are sensitive medical data, tightly regulated in most jurisdictions. This is not only a potential appropriation of health data but also behavioral profiling.
Moreover, Apple offers no transparency into its data flows. We trust Apple blindly because it stands before us as a formidable brand. It may not push sensitive data to the other apps it shares space with on the iPhone, and it may guard that data against theft by other businesses, but nothing guarantees that Apple itself won't use your data.
Earlier, Apple announced on-device scanning for child-abuse imagery, presenting it forcefully as a matter of responsibility toward child safety. Despite the noble intent, Apple learned that such a feature would damage its standing in the market, because the technology amounted to surveillance that would scan all of a user's personal images. Now, as Apple plans to program the iPhone to screen children for autism, we predict a similar withdrawal once the headlines fade.
It is worth waiting to see how Apple announces the bold benefits of a product and then quietly withdraws, moving its tech team on to the next launch. Trying is good, but enterprises like Apple should know that any kind of monitoring, whether of child abuse, childhood autism, or adult health, is unlikely to succeed while it carries the stigma of surveillance.