Wednesday, June 4, 2025

Artificial Intelligence programs use measurements of people's faces and other emotion indicators

I just read an interesting paragraph in the book Artificial Intelligence for Dummies:

Humans output an enormous number of clues while performing tasks. Unlike the average human, an AI actually pays attention to every one of these clues and can record them in a consistent manner to create action data [italics in original text]. The action data varies by the task being performed; it can include info like interactions with a device, sequences for making selections, body position, facial expression, manner of expression (such as attitude), and so on.

Our wearable tech, and the sensors in all the other tech we use and live around, can collect a great deal of information--the kind detailed above, as well as heart rate, blood oxygenation, facial feature changes, neural activity, eye tracking, and more.

Nearly all of us have been forced into clicking "accept" on vague terms and conditions that software companies can claim constitute "permissions" for them to utilize such personal biological information. But have we really given any kind of meaningful consent when companies have intentionally made it appear that they aren't collecting such data or using third parties to collect it? I don't think we have.

This is where consumers and governments need to push back: revoking consent and imposing penalties and fines for unconscionably sneaky behavior.

[Update 06/06/2025: As I continue reading the same book, I am learning very interesting things about newer AI-using technology. For instance, it can perform something called "sensor fusion," in which data from various sensors is combined "to obtain a unified measurement that's better than any single measurement" (Artificial Intelligence for Dummies, pp. 273-274).
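To make the idea concrete, here is a toy sketch of one common sensor-fusion technique, inverse-variance weighting (this example is my own illustration, not from the book; the sensor names and numbers are made up, and it assumes each sensor's error is independent Gaussian noise):

```python
# Toy sensor fusion: combine two noisy readings of the same quantity
# (say, heart rate from a smartwatch and from a camera-based estimate)
# by weighting each reading by the inverse of its noise variance.
# Assumes independent Gaussian noise; all values here are hypothetical.

def fuse(measurements):
    """measurements: list of (value, variance) pairs.
    Returns (fused_value, fused_variance)."""
    weights = [1.0 / var for _, var in measurements]
    total = sum(weights)
    fused_value = sum(w * v for w, (v, _) in zip(weights, measurements)) / total
    return fused_value, 1.0 / total

# Watch reads 72 bpm (variance 4); camera estimates 78 bpm (variance 16).
value, variance = fuse([(72.0, 4.0), (78.0, 16.0)])
print(round(value, 1), round(variance, 1))  # 73.2 3.2
```

The fused estimate lands closer to the more trustworthy sensor, and its variance (3.2) is lower than either sensor's alone--which is exactly the book's point that the combined measurement is "better than any single measurement."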

Common sensors in cell phones and our wearable tech include microphones (some of which can capture infrasound or ultrasound), cameras (utilizing a range of light wavelengths), and motion sensors. 

Cell phones can also produce magnetic fields in ranges that can now be harnessed to perform magnetic resonance imaging (MRI). See Vesanen, P.T., Nieminen, J.O., Zevenhoven, K.C.J., Dabek, J., Parkkonen, L.T., Zhdanov, A.V., Luomahaara, J., Hassel, J., Penttilä, J., Simola, J., Ahonen, A.I., Mäkelä, J.P., and Ilmoniemi, R.J. (2013). "Hybrid ultra-low-field MRI and magnetoencephalography system based on a commercial whole-head neuromagnetometer." Magn Reson Med, 69: 1795-1804. https://doi.org/10.1002/mrm.24413.

The Vesanen paper above, already over a decade old, describes successfully performing MRI with a field of only 22 mT (millitesla, a unit of magnetic field strength). The surfaces of AirPods, iPhones, and Apple smartwatches have magnetic fields that exceed 22 mT. See Xu, K., Sengupta, J., Casey, S., et al. (2022). "Magnetic field interactions between current consumer electronics and cardiac implantable electronic devices." J Interv Card Electrophysiol, 65: 133-139. https://doi.org/10.1007/s10840-022-01241-w.

Imagine the information that could be collected and processed about our bodies' actions and our current mental and physical states if sensor fusion were done using a combination of microphone (including ultrasound) input, camera (including visible-light and infrared) images, location and motion tracking data, and MRI information!

Two other interesting items mentioned in this book are telepresence and teleoperation:

Telepresence means being in one place while it seems like you're in another. The book's example of telepresence is virtual sightseeing. But wearable tech collects much more than just images of pretty scenery. I don't like to think about the level of voyeurism made possible by the intrusive use of sensors (which we are usually forced into blindly accepting) and the fusion of data from all of them. But someone needs to think about this issue, because there are bound to be wolves among the IT sheep.

Teleoperation is an even more pressing issue. Teleoperation is when a person interacts with a distant environment through another device or object. The book gives the example of acting through a "robot-like device," and I think at least one of the authors is hinting that humans can be the distant objects through which teleoperation is done. There is a decade-old TED video demonstrating the use of wired technology to let one person use their brain activity to command another person's arm to move: Greg Gage's "How to control someone else's arm with your brain," online at https://www.youtube.com/watch?v=rSQNi5sAwuc. If something can be done via wires, it can usually be done wirelessly, too.

As I read recent headlines about Joseph Biden's apparent diminished mental state at the end of his time as U.S. President, I have to wonder whether some people were taking advantage of him as a "robot-like device" for teleoperation purposes. We could be dealing with something much more serious than inappropriate use of an autopen machine. I think teleoperation of humans is already going on in the US and likely in other countries. I have personally observed at least two people involuntarily react physically in robotic-looking ways--typically suddenly bending forward partway--when they heard me talk about some of the unusual "conspiracy theory" or "hidden technology" items that I have posted on my blog. It seems as though there is a "background computer process" that is triggered or called up from a remote location in order to cause them distress when they hear about specific items that are being kept "in the shadows."

Why hide such cool technology? Teleoperation technology could help stroke victims and others move again. However, wouldn't you be furious to find out that your body and mind were being subjected to secretive nudges and commands? I think most people would be very angry and would respond appropriately with boycotts and lawsuits--big lawsuits, which I think they would win easily given the refusal of tech companies to be open about what is possible with powerful new algorithms and sensors.]