What plant is this? iOS 16 has a hidden visual search engine that few people know about, and it goes far beyond gardening

What our iPhones can do with information from the camera is truly impressive: features such as Live Text and Visual Look Up can recognize plants, animals and monuments. These capabilities arrived with iOS 15, but in iOS 16 they improve remarkably.

One photo, and all the information at your fingertips

Several apps can recognize objects with reasonable precision, but what Apple achieved in iOS 15 with Visual Look Up is truly impressive. It is a free service, built into the system, that respects our privacy, and the quality of its results is remarkably good.

In iOS 15 this feature can recognize plants, cats, dogs, famous photos, paintings, books and monuments. In iOS 16 its capabilities expand to include the species of birds we photograph, as well as insects and statues. Identifying these real-world elements is simple:

  • We open the Camera app on our iPhone.
  • We photograph the object that interests us.
  • We tap the thumbnail of the photo we just took, at the bottom left.
  • We tap the "i" at the bottom, as long as a small star appears on it.
  • We tap the highlighted element of the image we want to identify.
  • Under Siri Knowledge, information from Wikipedia entries appears automatically, followed by similar images from the web that let us compare the result or find more information online about the item we identified.
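For developers, iOS 16 exposes this same pipeline through the VisionKit framework. The following is a minimal sketch, assuming VisionKit's `ImageAnalyzer` API and a UIKit view controller; the class name `LookupViewController` and the `analyze(_:)` method are illustrative, not part of Apple's API:

```swift
import UIKit
import VisionKit

// Sketch: attach Visual Look Up behavior to an image view (iOS 16+).
@MainActor
final class LookupViewController: UIViewController {
    private let imageView = UIImageView()
    private let interaction = ImageAnalysisInteraction()

    override func viewDidLoad() {
        super.viewDidLoad()
        view.addSubview(imageView)
        // The interaction adds the tappable overlays (text, subjects)
        // that the system's Visual Look Up uses.
        imageView.addInteraction(interaction)
    }

    // Run the analysis on a photo, requesting text and Visual Look Up results.
    func analyze(_ image: UIImage) async {
        let analyzer = ImageAnalyzer()
        let configuration = ImageAnalyzer.Configuration([.text, .visualLookUp])
        do {
            let analysis = try await analyzer.analyze(image, configuration: configuration)
            interaction.analysis = analysis
            interaction.preferredInteractionTypes = .automatic
        } catch {
            print("Image analysis failed: \(error)")
        }
    }
}
```

With the analysis attached, tapping a recognized plant or animal in the image brings up the same information sheet the steps above describe.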

The Visual Look Up feature helps us learn more about the plants around us, as well as birds, insects, cats, dogs and more, from nothing but a photo.