CNN Business —

Google wants to make it easier to search for things that are hard to describe with just a few words or an image.

On Thursday, Google (GOOG) rolled out a new search option that lets you combine text and images in a single query. With this feature, you could search for a shirt similar to one in a photo but type that you want it with “polka dots,” or take a picture of your couch and type “chair” to find ones that look similar.

The feature, which the company calls “multisearch” and previewed in September, is now available for US users within the Google Lens part of Google’s mobile app. Liz Reid, vice president of Google Search, told CNN Business the feature will be considered experimental at first. It is expected to be used for shopping-related searches initially, though it’s not limited to such queries.

“This will just be the start,” Reid said.

Multisearch marks Google’s latest effort to make search more flexible and less bound to words on a screen. Google has long offered an image search engine. There’s also Google Lens, a feature that debuted in 2017 and can identify objects in a picture or immediately translate text as it’s viewed through a phone’s camera lens. Another endeavor in 2020 gave users the ability to hum to search for specific songs.

To find multisearch in Google’s mobile app, you have to tap a camera icon on the right side of the search bar, which pulls up Google Lens. You can take or upload a picture, and then tap a little bar containing a plus sign and the phrase “add to your search.” This lets you type words to better explain what you want.

The feature works, essentially, by using artificial intelligence in a few ways. Computer vision deduces what’s in the image, while natural language processing determines the meaning of the words you type. Those results are then pulled together to train an overall system, Reid said.
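Google hasn’t published how these signals are merged, but the general idea behind combining an image and a text refinement can be sketched with a toy example. The snippet below is a minimal illustration, not Google’s actual system: it assumes the image and the typed words have each already been mapped into a shared vector space (here, hypothetical three-dimensional vectors), blends them into one query, and ranks catalog items by cosine similarity.

```python
# Hypothetical sketch of multimodal retrieval; none of these vectors or
# names come from Google -- they are toy values for illustration only.
import math

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def combine(image_vec, text_vec, weight=0.5):
    """Blend the image and text embeddings into a single query vector."""
    return [weight * i + (1 - weight) * t for i, t in zip(image_vec, text_vec)]

# Toy embeddings: dimensions loosely represent (shirt-ness, polka-dot-ness, chair-ness).
image_of_shirt = [0.9, 0.1, 0.0]   # photo of a plain shirt
text_polka_dots = [0.0, 0.9, 0.0]  # typed refinement: "polka dots"

catalog = {
    "plain shirt":     [0.9, 0.0, 0.0],
    "polka-dot shirt": [0.8, 0.8, 0.0],
    "polka-dot chair": [0.0, 0.8, 0.9],
}

query = combine(image_of_shirt, text_polka_dots)
best = max(catalog, key=lambda name: cosine(query, catalog[name]))
print(best)  # the polka-dot shirt scores highest against the combined query
```

The point of the sketch is only the shape of the problem: neither the photo alone nor the words alone pins down the polka-dot shirt, but the blended query does.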

When Google introduced this sort of search in September, the company explained that it would use a powerful machine-learning tool called MUM (which stands for “multitask unified model”) that it had unveiled last May. Reid said in an interview last week that this won’t be the case at first, but that multisearch may use MUM in the coming months, which she believes will help improve search quality.

Asked if Google will eventually make it possible for people to use search in even more varied ways — such as by letting you combine music and words in a query to find new kinds of music you might like — Reid said the company isn’t working on that specifically, but it is interested in combining various inputs in the future.