Strange labels: Has tech lost its way with words?
As AI's influence has exploded over the past year, I worry it may be running headlong into a growing and pervasive naming crisis. I'm not talking about the brand names AI companies have chosen for their flagship products—though that's certainly a topic worth exploring in a future post. Rather, my concern is the terminology used for the various modes of interaction within the AI products themselves. We're at a crucial point in AI's development, one where we're establishing a new vocabulary to describe these tools.
Learning from history: The GUI revolution
This isn't a new challenge in tech. When personal computers with graphical user interfaces (GUIs) first emerged, designers had to create paradigms that would be intuitive to the general population. They introduced concepts like files, folders, desktops, and windows—metaphors anchored in physical objects that their digital counterparts resembled. These concepts proved so effective that they've stood the test of time, teaching new generations of users through familiarity.
Modern metaphors: The canvas concept
Some current AI tools have managed to find equally intuitive metaphors. Take the term "canvas," which I explored in a recent post. A canvas works well as a metaphor because it builds on widely understood concepts: it's a space for creativity and expression, whether you're using ChatGPT to work on a story or Ideogram to experiment with images. The term may be used broadly, but it's relatively intuitive, conveying its purpose to users without unnecessary explanation.
When metaphors miss: Artifacts and projects
However, other terms in the AI space are far less intuitive. Consider Claude's use of "artifacts"—a term that puzzles even experienced users. In Claude's interface, an artifact can be anything from an interactive chart to a functioning web application, including the underlying code, displayed in a dedicated output window within the application. The name might make sense to professional UX designers, who use "artifact" to describe intermediate deliverables in the design process, but it's likely unclear to most users why it applies here. A more accessible term like "prototyper" might better communicate its actual purpose.
The term "project" is another example of overloaded terminology. In Recraft, a project is akin to a canvas in Ideogram. It's essentially an open workspace for image generation. ElevenLabs employs the project label for a workflow tool that allows users to create long-form audio files, like audiobooks, with chapters and multiple voices. And in Claude, a project serves as a space for file uploads and custom instructions that can be saved and reused in future interactions. These projects even can be shared with colleagues if you're on the Claude Team plan. If this sounds familiar, it's because this is very similar to OpenAI’s "Custom GPTs" which fulfill a similar purpose but introduce yet another term for users to decipher. These discrepancies create a confusing landscape, especially for users who interact with multiple platforms.
The GPT conundrum
Speaking of confusing terminology, "GPT" is a prime example of a technical term going mainstream while losing much of its original meaning. I'd guess very few people know that GPT stands for Generative Pre-trained Transformer—a name that means nothing to the average user. Despite this, it became a household term, with "ChatGPT" now synonymous with conversational AI. The term has become so generic that OpenAI's attempt to trademark "GPT" was refused by the US Patent and Trademark Office, which viewed it as merely descriptive rather than distinctive. This case highlights the broader issue: technical jargon, when adopted by mainstream audiences, often loses its original meaning.
The ultimate misnomer
Perhaps the most problematic term of all is "AI" itself. What began as a specific scientific concept has evolved into a catch-all phrase for virtually any advanced computational technology that employs machine learning algorithms or neural networks. It's become so generic that we now casually pluralize it ("AIs"), using it to describe everything from simple automation tools to sophisticated language models. While there's a fascinating history behind the term "Artificial Intelligence," its current usage has diverged significantly from its academic origins. Interestingly, another intelligence-related abbreviation in the common lexicon has a similar issue: IQ. Most people know what the "I" stands for, but far fewer know what a quotient is. Perhaps this indicates that intelligence itself is a problematic label.
Looking forward
As AI technology continues to evolve and reach broader audiences, the industry needs to strike a better balance between technical accuracy and user understanding. We might take inspiration from those early GUI pioneers, who managed to create intuitive metaphors that have endured for decades. Perhaps instead of borrowing jargon from technical fields or using terms so broad or obscure they lose their meaning, AI companies should focus on developing terminology that resonates with an increasingly diverse user base.
The challenge goes beyond mere semantics. As these tools become more integrated into our daily lives, clear and intuitive terminology will only grow in importance. And as they spread globally, the terms need to maintain their meaning across languages and cultures.
To be fair, I don't think this is an easy job, especially as this technology is evolving so rapidly. But the industry would benefit from a more thoughtful approach to naming conventions—one that prioritizes user understanding over technical or marketing considerations. After all, if we're building tools that are meant to make technology more accessible, shouldn't we start by making them more understandable?
Images generated with Midjourney. Copy editing assistance by Claude 3.5 Sonnet and ChatGPT-4o with canvas.