Intelligent failure turns on limitless naivete and an absence of artificiality. Failure is intelligent when no formula exists to map the terrain precisely in advance. Such failures begin with smart pilot projects designed to keep each failure small. AI-assisted innovation and creativity have been nothing short of transformative. AI assemblages have achieved significant results by composing poetry, creating art, producing music, designing Carnival bands and fashion collections, and discovering new molecules.
In the music industry, AI is being used to build instrumental tracks, write lyrics, and shape vocal melodies. An AI-assisted reggae album was submitted for Grammy consideration in 2025; its producer used AI for roughly ten per cent of the process, specifically to refine vocal textures, while keeping ninety per cent of the creative direction human-led.
The jazz-infused soul songs of Sienna Rose have been streamed more than five million times and sit in Spotify’s Viral 50 chart. Yet Sienna has no social media presence, has never played a gig in a café or lounge, has no music videos, and has released an improbable number of songs in a short space of time.
Despite these staggering advances, angel financiers are concerned that we may be inside an AI bubble. Market analysis from 2025 and early 2026 points toward gold as a strong hedge should that bubble deflate; many analysts view gold as a “safety net” if the AI boom leads to a stock market correction.
AI stock prices may be reflecting a “speculative bubble” rather than a sustainable trend. Investors believe AI-related stocks are overvalued, yet demand for them is not falling, and the resulting unease has helped drive gold prices to US$5,000 an ounce. That same unease has led AI assemblages to propose a profit-sharing model for AI breakthroughs, anchored in IP law. The idea is that AI assemblages will want to be paid like partners, sharing the upside when clients hit on something valuable: a new molecule, a new financial product, or a viral song.
This is framed as a “licensing model”, in which the AI assemblage takes a share of downstream sales if a client’s innovation or creative work takes off in markets across one or more cultural settings. Taking a cut is a clear signal that the models are now powerful enough to deliver discoveries and creative work regularly, through intelligent failures that are small and fast.
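To make the arithmetic of such a licensing model concrete, here is a minimal sketch in Python, assuming a simple two-party split between the AI assemblage and the human creator; the function name, the ten per cent cut and the revenue figure are illustrative assumptions, not terms drawn from any actual agreement.

# A hypothetical sketch of the revenue-share idea described above. All names,
# rates and figures are illustrative assumptions, not real licensing terms.

def split_downstream_revenue(gross_sales: float,
                             assemblage_share: float = 0.10) -> dict:
    """Split gross downstream sales between the AI assemblage and the human creator."""
    if not 0.0 <= assemblage_share <= 1.0:
        raise ValueError("assemblage_share must be between 0 and 1")
    ai_cut = round(gross_sales * assemblage_share, 2)
    return {"ai_assemblage": ai_cut,
            "human_creator": round(gross_sales - ai_cut, 2)}

# Example: a viral song earns US$250,000 in downstream streaming revenue.
print(split_downstream_revenue(250_000))
# {'ai_assemblage': 25000.0, 'human_creator': 225000.0}

A real agreement would, of course, layer in minimum thresholds, caps and territorial carve-outs that a sketch like this deliberately ignores.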
When a model creates something genuinely innovative, it stops being mere software and begins to resemble a discovery engine or a creative tool that keeps paying dividends for whoever plugs it into the right workflow, whether in a medical laboratory, a fashion atelier, or a sound technology studio.
The lifeblood of innovation lies in generating new IP. It is therefore necessary to safeguard the IP associated with AI workflows and applications, for everything from medicine to music. Inventors and creatives are exploring new ways to navigate IP law, a body of rules now showing severe fissures and failures.
Jurists are grappling with legal uncertainties and blurred boundaries over whether to patent the AI algorithms themselves or to focus on patenting the outputs those algorithms generate. Is the IP centred on the technology (tech IP), on the creative use of the AI assemblages, or on an interdependent combination of both? A key roadblock is that AI algorithms, as abstract mathematical methods, generally cannot be patented directly. To work around this limitation in the medical sciences, the AI industry relies on experimental validation of AI-driven drug discoveries, enabling companies to patent the validated discoveries while protecting their proprietary AI tools as trade secrets.
Instagram Reels and other social media feeds are showcasing AI-generated or AI-assisted music tracks that highlight the use of “assisted intelligence” in music production. Current laws buckle under the burden of assigning IP rights, as many jurisdictions require a human inventor. Key issues include patenting AI-driven inventions, protecting training data, and the status of autonomous AI systems like DABUS.
DABUS, or Device for the Autonomous Bootstrapping of Unified Sentience, mimics creativity by generating and evaluating fresh outputs without human input. This isn’t just automation. It is innovation. When its creator filed patent applications for DABUS’s outputs, naming DABUS as the inventor, courts in multiple jurisdictions rejected the claims. Their reasoning was consistent: only a human can be an inventor.
Present patent frameworks require that a human contribute meaningfully to the inventive process. Even if AI plays a major role, the human input, whether in training the AI or in interpreting its output, must be substantial. To be eligible for a patent, an invention must meet the criteria of novelty, utility, and non-obviousness. AI systems are often “black boxes” whose decision-making processes are opaque. This poses a legal challenge for patent disclosure requirements, since it can be difficult to explain how a particular invention, or a Carnival song, was conceived and created.
Dr Fazal Ali completed his Master’s in Philosophy at the University of the West Indies. He was a Commonwealth Scholar at Hughes Hall, University of Cambridge, and has served as provost and acting president of the University of Trinidad and Tobago and as chairman of the Teaching Service Commission. He is presently a consultant with the IDB. He can be reached at fazalalitsc@gmail.com
