The creators of Oboe know what everyone’s thinking about AI. “Is AI going to make us all stupid?” the company asks in a recent ad. “Are we going to forget how to think for ourselves?”
Oboe’s founders think the answer to both of those questions is no, and their startup is meant to prove it.
The AI education platform, which launched this month, uses AI to craft “courses” about any topic that strikes a user’s fancy. It comes from the creators of Anchor, a DIY mini-podcast creation platform that was acquired by Spotify for $150 million. Nir Zicherman, Oboe’s CEO and cofounder, ended up running Spotify’s audiobooks vertical, while Mike Mignano, also an Oboe cofounder, ran Spotify’s podcast team.
Zicherman wants to “democratize access to a great learning experience” on the cheap with AI, he tells The Verge. That’s a lofty claim, especially considering how many products are branded as AI-powered learning tools already — and how rampant hallucinations continue to be across AI products.
Currently, people turn to ChatGPT, Google, YouTube, Wikipedia, and other internet platforms to learn, Zicherman argues. The process requires users to “piece it together” and go on “a very linear journey that is one-size-fits-all,” he says. Oboe, meanwhile, streamlines the information to a “single destination that you need to go to to actually learn effectively.”
Oboe’s website has a familiar chatbot feel, with a text box inviting users to type out what they want to learn. Unlike ChatGPT, however, Oboe won’t converse with users back and forth. Instead, it responds to a prompt with its signature AI-generated “course” on the topic, packaged in traditional educational formats such as a longer written text that reads like an introductory textbook chapter or a short bulleted list of “key takeaways.”
One could generate “courses” on the origin of artificial intelligence or forms of intelligence in nature, both of which were featured in Oboe’s ad.
Or, you could be like me and choose the unglamorous, opaque topic of concrete manufacturing and its environmental impacts, about which I know next to nothing.
This generated a “deep dive” essay for me that was labeled as a 10-minute read. “Imagine the handful of fresh concrete in your palm,” the section began. The text was broken up by headers like “environmental footprint” and interspersed with tables listing the types of something called “SCM.” What is SCM? I hit command-F to search the page for a definition of “SCM,” but there wasn’t one. Photos from Wikimedia Commons appeared every few paragraphs. After the “deep dive,” I was able to click through the same information packaged in different formats, like an FAQ and a podcast.
While I walked away with what felt like a cursory understanding of concrete manufacturing, Oboe didn’t solve the biggest problem in AI for me right now: I had no idea whether any of the information was accurate. Confirming details is left up to the learner. Oboe does not include links to original source materials when stating, for example, that “aggregates are granular materials such as sand, gravel, or crushed stone that make up the majority of concrete’s volume (typically 60–75%).” That turned out to be correct, according to the American Cement Association, but I had to track down the likely source myself to verify the statistic.
Oboe is working on adding citations, but they aren’t available today. “Citations and other means of accessing additional online resources are therefore things we are actively working on and hope to add to the platform in the coming months,” Zicherman wrote in an emailed comment to The Verge.
Other platforms do give you citations, when you ask for them. When I turned to Google’s Gemini to learn about concrete and included in my prompt a request for citations, I was given a link to a 2018 paper in the journal Nature Sustainability that calculated that global concrete production accounts for nearly 10 percent of industrial water use worldwide. Useful! Perhaps citations generally give me a false sense of security that the information is accurate and not being pulled from some AI slop website. But at least the option is there if I care to go looking for it.
Oboe does not train its own foundation AI models, Zicherman says, and instead picks models from other companies — he wouldn’t name them — that are optimized for certain tasks, such as text-to-speech synthesis or large language models. Oboe addresses the risk of hallucinated content by tasking some LLMs with correcting the outputs of other LLMs, Zicherman says. In other words, LLMs are checking other LLMs.
“For instance, you could have a single model outputting what it believes is a certain set of facts that should be incorporated into a course, and then another model from a different provider that’s been trained on a different dataset in a different way reviewing that, in a way that identifies the inaccuracies and help[s] us reduce the likelihood of hallucination and inaccuracy,” Zicherman says.
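Oboe hasn’t shared any implementation details beyond that description, so the snippet below is only a minimal sketch of the general generate-then-review pattern Zicherman is describing. The function names, the retry loop, and the stubbed models are assumptions for illustration, not Oboe’s actual code.

```python
from typing import Callable

# Hypothetical stand-ins: in a real pipeline these would call models from two
# different providers. This only sketches the "one model reviews another"
# pattern described above, not Oboe's implementation.
GenerateFn = Callable[[str], str]           # prompt -> draft set of facts
ReviewFn = Callable[[str, str], list[str]]  # (topic, draft) -> disputed claims


def build_course_facts(topic: str, generate: GenerateFn, review: ReviewFn,
                       max_rounds: int = 2) -> str:
    """Draft facts with one model, have a second model flag suspect claims,
    and regenerate until the reviewer raises no objections (or rounds run out)."""
    draft = generate(topic)
    for _ in range(max_rounds):
        flagged = review(topic, draft)
        if not flagged:
            return draft  # reviewer found nothing to dispute
        # Ask the generator to rewrite, explicitly listing the disputed claims.
        draft = generate(
            f"{topic}\n\nRevise your earlier answer. A reviewer disputed these "
            "claims, so correct or remove them:\n- " + "\n- ".join(flagged)
        )
    return draft  # best effort after max_rounds of review


if __name__ == "__main__":
    # Toy stubs so the sketch runs without any API keys.
    facts = build_course_facts(
        "concrete manufacturing",
        generate=lambda prompt: "Aggregates make up 60-75% of concrete's volume.",
        review=lambda topic, draft: [],  # a real reviewer model would flag claims
    )
    print(facts)
```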
Should that fail, there is not yet a feature that lets users flag inaccurate content within their results; for now, users need to report inaccuracies directly to the company. Zicherman says Oboe has a process for “reviewing and incorporating” corrections into its “pipeline.”
If you don’t want to read, Oboe also generates podcasts. For me, a pair of chipper, AI-generated “hosts” broke down concrete’s environmental footprint in a conversational style similar to NPR’s Planet Money.
If an AI-generated podcast sounds similar to Google’s NotebookLM, it is. The difference is that Oboe users can upload material to generate a podcast but don’t have to. That freedom comes at a cost, though: I trust NotebookLM’s podcast “hosts” more than Oboe’s because I know the information comes from my own documents, or at least it’s supposed to. Oboe’s podcasts, meanwhile, are generated by unnamed models, without citations, so I couldn’t bring myself to trust what I heard.
Oboe will continue to improve as more users prompt for more “courses” and the learning styles and struggles of users feed back into the models, Zicherman says. He likens Oboe to a human tutor, who learns how the student learns over time. “Oboe is not just a product that gives you a great personalized experience, it’s also a product that gets better the more you use it, the way that a human tutor would as you spent more time with them and they were able to understand more and more how you learn effectively.”
For now, I still prefer hearing directly from human experts — even if I need to work harder to identify them on the vast ocean of the internet, increasingly flooded with AI slop.