Google Photos, the photo-sharing service launched by Google in May 2015, has been updated to identify individuals even when they are facing away from the camera.
When questioned about this development, Google said it continually enhances its capabilities to help users organize and locate photos of themselves and their loved ones. The company recently improved its algorithms so that Google Photos can more effectively group people based on visual cues such as clothing, particularly across photos taken within a similar time frame, as reported by Android Authority.
The feature was initially observed by Rita El Khoury, a writer at Android Authority. El Khoury mentioned in an article that she noticed multiple pictures of her husband, captured from behind, appearing under his profile in Google Photos, with appropriate tags associated with the images.
However, El Khoury pointed out that these tags can look slightly different from those produced by standard facial recognition. Even when a person is correctly identified, users may see a message indicating that the “face is available to add.” This suggests that Google acknowledges the feature may not be completely accurate.
If a tag is incorrect, individuals have the option to modify the identified person or remove the tag entirely.
Does the feature work for all photos? Not quite: according to El Khoury, it worked for approximately 80-85% of her husband’s photos taken from behind.