Meta’s smart glasses can now describe what you see in more detail


Rolling out to all users in the United States and Canada in the coming weeks, Meta AI can now be set to provide more detailed descriptions of what is in front of users when they ask the smart assistant about their environment. In a short video shared alongside the announcement, Meta AI goes into more detail about the features of a waterside park, including describing grassy areas as “well-manicured.”

The feature can be activated by turning on “detailed responses” in the Accessibility section of the Device settings in the Meta AI app. Although it is currently limited to users in the United States and Canada, Meta says detailed responses “will expand to additional markets in the future,” but has not shared details on when, or which countries will get it next.

First announced last September as part of a partnership with Be My Eyes, and released last November in a limited rollout covering the United States, Canada, the United Kingdom, Ireland, and Australia, the Call a Volunteer feature, Meta confirmed today, “will launch in all 18 countries where Meta AI is supported later this month.”

Blind and low vision users of Ray-Ban Meta smart glasses can use the feature to connect to a network of over 8 million sighted volunteers and get assistance with everyday tasks such as following a recipe or locating an item on a shelf. After the user says, “Meta, Be My Eyes,” a volunteer is able to see the user’s surroundings through the glasses’ camera feed and can provide descriptions or other assistance through its open-ear speakers.
