As a project leader at Boston Consulting Group, Rene Bystron quickly incorporated generative AI tools like ChatGPT, Claude, and Midjourney into his daily workflow. But when he sought to deepen his skills, he found that existing online courses weren't yet teaching people how to use AI effectively.
“I realized that there’s this huge gap,” said Bystron. “All the tools we’re getting right now are either not fun or semi-effective because they’re applying the same model they’ve applied to any other learning.”
Bystron decided to build a tool himself.
Seattle startup AI LaMo, launched earlier this year, is developing a mobile app that features prompt writing games, quizzes, and other peer-to-peer challenges.
The idea is to mirror learning apps such as Duolingo, with a mobile-friendly interface and gamification features. Bystron said this approach makes the app more engaging than courses on platforms such as LinkedIn Learning, Coursera, and Codecademy.
AI LaMo also offers skill-specific courses, such as “ChatGPT for consultants,” along with modules for lawyers and other business professionals. Entrepreneurs, for instance, can use the app to learn how to apply AI to tasks such as social media content creation and financial modeling.
The app makes money through a $29 monthly subscription for users who want unlimited daily content. Users can also complete challenges to earn “chips” that unlock additional access.
Bystron said the app has more than 20,000 active users. He said the user base has grown mostly through word-of-mouth, adding that the company has not raised any outside funding.
Bystron previously founded and led Meltingpot Forum, a large multidisciplinary speaker event in the Czech Republic.
AI tools have seen a swift uptake, though there is plenty of room for more adoption.
ChatGPT reportedly reached 100 million users in January, just two months after its launch. But a May survey from Pew Research Center found that while 58% of American adults had heard of the chatbot, only 14% reported using it for entertainment, education, or work-related purposes.
AI tools have also faced scrutiny for their role in disseminating misinformation. Chatbots can hallucinate, or produce false information disguised as fact, while AI image generators can craft convincing images that fuel disinformation.
Bystron said education is an important step in mitigating these risks. He pointed to headlines in May about a lawyer who was caught citing fake court cases that were generated by ChatGPT.
“When you think about it at an individual human level, he made a mistake leveraging this tool, and it has completely tarnished his reputation,” Bystron said. “But I’m convinced that if he took our course, he would have been in a different spot.”