“At Google, we’re always dreaming up new ways to help you uncover the information you’re looking for - no matter how tricky it might be to express what you need,” Google said in a blog post about the announcement. “That’s why today, we’re introducing an entirely new way to search: using text and images at the same time. With multisearch on Lens, you can go beyond the search box and ask questions about what you see.”

Google says the new functionality is made possible by its latest advancements in artificial intelligence. By combining the image and the words into one query, Google may have a better shot at delivering relevant search results. The new functionality could be especially useful for the type of queries that Google currently has trouble with - where there’s a visual component to what you’re looking for that is hard to describe using words alone.

The company is also exploring ways in which multisearch could be enhanced by MUM, its latest AI model in Search. Multitask Unified Model, or MUM, can simultaneously understand information across a wide range of formats, including text, images and videos, and draw insights and connections between topics, concepts and ideas.

Today’s announcement comes a week after Google announced it’s rolling out improvements to its AI model to make Google Search a safer experience and one that’s better at handling sensitive queries, including those around topics like suicide, sexual assault, substance abuse and domestic violence. It’s also using other AI technologies to improve its ability to remove unwanted explicit or suggestive content from Search results when people aren’t specifically seeking it out.
With the new multisearch feature, you can ask a question about an object in front of you or refine your search results by color, brand or visual attributes. Google told TechCrunch that the new feature currently has the best results for shopping searches, with more use cases to come in the future. With this initial beta launch, you can also do things beyond shopping, but it won’t be perfect for every search.

In practice, this is how the new feature could work. Say you found a dress that you like but aren’t a fan of the color it’s available in. You could pull up a photo of the dress and then add the text “green” in your search query to find it in your desired color. In another example, you’re looking for new furniture, but want to make sure it complements your current furniture. You can take a photo of your dining set and add the text “coffee table” in your search query to find a matching table. Or, say you got a new plant and aren’t sure how to properly take care of it. You could take a picture of the plant and add the text “care instructions” in your search to learn more about it.
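Google hasn’t published how multisearch fuses the image and the text into a single query, but the dress example above can be sketched with a common multimodal-retrieval pattern: embed the image and the refinement text as vectors, fuse them into one query vector, and rank catalog items by cosine similarity. Everything below - the fusion-by-averaging strategy and the tiny hand-made embeddings - is a hypothetical illustration, not Google’s actual system.

```python
import math

def normalize(v):
    """Scale a vector to unit length."""
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def fuse(image_vec, text_vec):
    """One simple fusion strategy: average the two normalized embeddings."""
    a, b = normalize(image_vec), normalize(text_vec)
    return normalize([x + y for x, y in zip(a, b)])

def cosine(u, v):
    """Cosine similarity of two unit vectors is just their dot product."""
    return sum(x * y for x, y in zip(u, v))

# Toy embeddings; dimensions roughly mean (dress-ness, green-ness, orange-ness).
image_orange_dress = [1.0, 0.0, 0.9]  # photo of the dress, available only in orange
text_green = [0.0, 1.0, 0.0]          # the refinement word "green"

catalog = {
    "green dress":  [1.0, 0.9, 0.0],
    "orange dress": [1.0, 0.0, 0.9],
    "green sofa":   [0.0, 0.9, 0.0],
}

query = fuse(image_orange_dress, text_green)
ranked = sorted(catalog,
                key=lambda k: cosine(query, normalize(catalog[k])),
                reverse=True)
print(ranked[0])  # the fused query surfaces the green dress first
```

Neither signal alone would produce this ranking: the photo by itself matches the orange dress best, and the word “green” by itself matches the sofa as well as the dress. Combining both into one query is what lets the visually similar item in the requested color win.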