My spouse and I placed a wager a few weeks back. I bet that ChatGPT would never be able to replicate my writing style in a smartwatch review. I had asked the bot to do exactly that months earlier, and the results were hilarious. My spouse bet they could ask ChatGPT the exact same thing and get a better answer. My problem, they said, was that I wasn't asking the right questions to get the answers I wanted.
That memory flashed through my head while I was liveblogging Google I/O. This year's keynote was basically a two-hour thesis on AI, how it will reshape Search, and all the ways it can boldly and responsibly improve our lives. There was a lot of neat stuff. But what shocked me was Google's admission that it's hard to ask AI questions.
Google demonstrated a new feature called Sidekick, which offers prompts based on the document you're working on. It's basically the AI telling you how to ask it for what it can do.
The same idea showed up later in Google's keynote, when it showed off its new AI-powered search results, called Search Generative Experience (SGE). SGE generates a snapshot, a sort of mini-report, based on whatever query you type into the search box. At the bottom of the snapshot are suggested follow-up questions.
Both demos were disturbing for someone whose job it is to ask people questions. The queries and prompts Google typed on stage are nothing like what I type into my search bar. My search queries read like a toddler's. They're usually followed by the word "Reddit" so I can get answers that aren't from SEO content mills. Think "Bald Denis BlackBerry movie actor" or "Site:theverge.com Peloton-McCarthy ship metaphors." I rarely ask Google things like "What can I do for a weekend in Paris?"
When I stare at any kind of generative AI, I have no idea what to do. No matter how many demos I watch, the blank window taunts me. It's as if I were back in second grade and my grumpy teacher had just asked me a question I didn't know the answer to. When I do ask for something, I often get laughable results, the kind I'd have to spend more time fixing than if I'd just done the thing myself.
My spouse, on the other hand, has taken to AI like a fish to water. After we made our wager, I watched them use ChatGPT for an hour. What surprised me most was how different our prompts and questions were. Mine were broad and open-ended. My partner's left very little room for interpretation. "You have to hand-hold it," they said. "You have to give it everything you want." Their queries and commands are long and hyper-specific, often including links or data sets. Even then, they have to rephrase their prompts repeatedly to get exactly the information they need.
And that's just ChatGPT. Google's pitch goes a step further. Duet AI will pull context from your emails and documents to intuit what your needs are (which I find hilarious, since I often don't know what my needs are). SGE aims to answer your questions, even ones without a "right" answer, and to anticipate the next question you'll ask. To make AI that intuitive, programmers have to make sure it knows what questions to ask users, so that users can in turn ask it the right questions. Which means programmers have to anticipate the questions users will ask in the first place. I get a headache just thinking about it.
You could say that life is all about asking the right questions. The most unsettling thing about the AI age, for me, is that I don't believe anyone knows what we actually want from AI. Google claims it's what it demonstrated onstage at I/O. OpenAI thinks it's chatbots. Microsoft thinks it's an incredibly horny chatbot. When I talk to people about AI, they always ask the same question: how will AI impact my life?
Nobody, not even the bots, has a good answer to that yet. And I doubt we'll get a satisfying one until we all rewire our brains to speak more fluently with AI.