
ChatGPT invented a product feature out of thin air, so this company built it

    On Monday, sheet music platform Soundslice said it had developed a new feature after discovering that ChatGPT had been wrongly telling users that the service could import ASCII tablature, a text-based guitar notation format the company had never supported. The incident reportedly marks what may be the first case of a business building functionality in direct response to an AI model's confabulation.

    Soundslice normally digitizes sheet music from photos or PDFs and synchronizes the notation with audio or video recordings, so that musicians can watch the music scroll by as they hear it played. The platform also offers tools for slowing down playback and practicing difficult passages.

    Adrian Holovaty, co-founder of Soundslice, wrote in a recent blog post that the feature's development began as a complete mystery. A few months ago, Holovaty started noticing unusual activity in the company's error logs. Instead of typical sheet music uploads, users were submitting screenshots of ChatGPT conversations containing ASCII tablature, simple text representations of guitar music that look like strings of dashes with numbers indicating fret positions.
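    For readers unfamiliar with the format, an ASCII tab is plain text: six lines of dashes stand for the guitar's strings (high e at top, low E at bottom), and numbers mark which fret to press. A fragment might look like the following illustrative example (invented for this article, not taken from any ChatGPT conversation or Soundslice document):

```
e|-----------------|
B|-----------------|
G|-------0---------|
D|-----2---2-------|
A|---3-------3-----|
E|-----------------|
```

    Because it is just text, ASCII tab is easy to type into a chat window but has no standard machine-readable specification, which is part of why scanning systems built for photographed sheet music do not handle it.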

    “Our scanning system wasn't meant to support this style of notation,” Holovaty wrote in the blog post. “Why were we being bombarded with so many ASCII tab ChatGPT screenshots? I was mystified for weeks, until I messed around with ChatGPT myself.”

    When Holovaty tested ChatGPT himself, he discovered the source of the confusion: the AI model was instructing users to create Soundslice accounts and use the platform to import ASCII tabs for audio playback, a feature that did not exist. “We've never supported ASCII tab; ChatGPT was outright lying to people,” Holovaty wrote. “And making us look bad in the process, setting false expectations about our service.”


    A screenshot of Soundslice's new ASCII tab import documentation, for a feature hallucinated by ChatGPT and later made real.


    Credit: https://www.soundslice.com/help/en/creating/importing/331/ascii-tab/

    When AI models such as ChatGPT generate false information with apparent confidence, AI researchers call it a “hallucination” or “confabulation.” The problem of AI models confabulating false information has plagued them since ChatGPT's public release in November 2022, when people began wrongly using the chatbot as a replacement for a search engine.