In a previous discussion, I explained why many SaaS products have not significantly evolved in terms of features and UI/UX over the past fifteen years. However, this is starting to change with the integration of Generative AI (GenAI) into SaaS products. While I do not believe that GenAI is going to require a complete overhaul of SaaS products, this evolution introduces changes, particularly in UI/UX design, for which we probably need new approaches and solutions.
How AI-Powered Features Differ from Classical Features
Traditional SaaS product interactions have stayed relatively unchanged over the past 20 years: users perform actions by clicking buttons and input data by typing into forms (whether text or numbers). The emergence of GenAI introduces a novel set of interaction paradigms into SaaS products:
Prompt-based interactions.
Non-prompt-based interactions.
Prompt-Based Interactions: Increasing complexity in SaaS products
At their core, prompt-based interactions (whether text or voice) are not entirely new. Most users are familiar with entering text prompts in search engines or using voice commands with digital assistants like Siri or Alexa. However, GenAI's incorporation into SaaS products creates new levels of complexity and challenges.
Users are now required to articulate their needs with precision, navigating a potential trial-and-error process to refine their prompts for optimal outcomes. Some people compare this process to conversing with someone, but conversing with an AI in a professional context is actually demanding, and it often requires a nuanced understanding of what the AI is capable of in order to achieve the desired results. The challenge lies not just in asking the right questions, but in framing them in a way that aligns with the AI's processing capabilities, transforming prompt crafting from a simple task into an “art form” (more art than science).
Adding prompting to SaaS products can create interactions that are considerably more complex than clicking buttons and filling in form fields.
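One way products reduce this complexity is to replace the raw text box with a structured prompt builder: the UI collects a few well-defined fields and assembles a well-framed prompt on the user's behalf. Here is a minimal sketch of that idea; all names and fields are illustrative assumptions, not taken from any specific product.

```typescript
// Hypothetical sketch: a structured prompt builder that turns form fields
// into a well-framed prompt, so users do not have to master prompt phrasing
// through trial and error. Every name here is illustrative.

interface PromptFields {
  task: string;           // what the user wants done
  context?: string;       // background the AI should take into account
  format?: string;        // desired output format, e.g. "bullet list"
  constraints?: string[]; // hard requirements, e.g. "under 100 words"
}

function buildPrompt(fields: PromptFields): string {
  const parts = [`Task: ${fields.task}`];
  if (fields.context) parts.push(`Context: ${fields.context}`);
  if (fields.format) parts.push(`Respond as: ${fields.format}`);
  if (fields.constraints?.length) {
    parts.push(`Constraints:\n${fields.constraints.map(c => `- ${c}`).join("\n")}`);
  }
  return parts.join("\n");
}
```

The point is not the string concatenation itself but the shift in responsibility: the product, not the user, encodes the "framing" knowledge that makes a prompt effective.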
Non-Prompt-Based Interactions: From worker to quality control manager
For a deeper understanding of non-prompt-based interactions, you can check my previous article detailing examples of GenAI features operating under this model. Essentially, these interactions involve AI autonomously analyzing specific data types in the background to generate outputs without direct user input. Users can take these AI-generated outputs “out of the box” or modify them to fit their needs. It’s basically the “co-pilot” approach.
But I have the impression that this “co-pilot” approach is quite different from the traditional tool-based usage of SaaS products, where the user was in complete control, selecting and using SaaS tools based on specific needs. “I need to drive a nail? I will use a hammer.”
The transition to GenAI introduces a layer of unpredictability and variability because LLMs are non-deterministic and can hallucinate. Users must now assume the role of quality control managers, evaluating AI-generated outputs for accuracy and relevance, a task that adds a new layer to the user experience.
Now the user is no longer the one driving the nails. They are the one checking whether the nails have been driven properly (quality control).
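In product terms, this quality-control role can be made explicit in the data model: the AI's draft is never committed directly, and only an explicit user decision turns it into real product data. Here is a minimal sketch of that pattern, with hypothetical names; it is one possible way to model the flow, not a reference implementation.

```typescript
// Hypothetical sketch: AI output is wrapped in a review envelope.
// Only an explicit user decision (accept, edit, or reject) promotes
// the draft into the product's actual data.

type ReviewDecision = "accepted" | "edited" | "rejected";

interface AiSuggestion<T> {
  draft: T;                                    // what the model produced
  status: "pending" | ReviewDecision;          // where it is in the review flow
  final?: T;                                   // what the user actually approved
}

function review<T>(s: AiSuggestion<T>, decision: ReviewDecision, edited?: T): AiSuggestion<T> {
  if (decision === "accepted") {
    return { ...s, status: "accepted", final: s.draft };
  }
  if (decision === "edited" && edited !== undefined) {
    return { ...s, status: "edited", final: edited };
  }
  // Anything else (including an "edited" decision with no edit) is a rejection.
  return { ...s, status: "rejected", final: undefined };
}
```

The design choice worth noting: `draft` and `final` are separate fields, so the product can always show what the AI proposed versus what the human approved.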
The Implications for Product Management and Development
Changes for Product Designers and Managers
The changes I listed above necessitate adapting current UI/UX best practices. I believe that product managers and designers have exciting new opportunities to develop UX flows and adapt UI components to better suit these AI-powered interactions.
For instance, in the case of prompt-based interactions, I have seen several SaaS products with very interesting UI components that assist users in crafting and refining their prompts (not just raw text input forms). Such components will be crucial for facilitating smoother interactions with GenAI, enabling users to communicate their needs to the AI-powered feature more effectively.
Another area of innovation is UI components that help users evaluate the accuracy and reliability of AI-generated outputs. This has led to creative solutions, such as incorporating accuracy score estimations or adopting a “Tinder-like” mechanism where users swipe through AI suggestions, approving or rejecting them based on their perceived accuracy. These examples show the necessity for UX/UI designs that not only enhance usability but also instill confidence in the AI's outputs.
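The two ideas combine naturally: if each suggestion carries a confidence score, the swipe queue can surface the least-confident items first, so the user's attention goes where errors are most likely. A minimal sketch of that combination, under the assumption that the model (or a separate scoring step) provides a confidence value:

```typescript
// Hypothetical sketch of a "Tinder-like" review flow. Suggestions carry a
// confidence score; the queue sorts least-confident first, since those are
// the ones that most need human review. Names are illustrative.

interface Suggestion {
  id: string;
  text: string;
  confidence: number; // 0..1, assumed to come from the model or a scoring step
}

function reviewQueue(items: Suggestion[]): Suggestion[] {
  // Least confident first: human attention where errors are most likely.
  return [...items].sort((a, b) => a.confidence - b.confidence);
}

function swipe(accepted: string[], item: Suggestion, right: boolean): string[] {
  // Swipe right = approve; swipe left = discard.
  return right ? [...accepted, item.id] : accepted;
}
```

Whether to sort by ascending or descending confidence is itself a UX decision: ascending front-loads the risky items, descending lets users build trust with easy approvals first.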
At a higher level, GenAI-powered features might bring a psychological shift in users' perception of the tools they are using. Historically, SaaS users have expected deterministic results from their interactions with software: input X and receive a consistent output Y. The unpredictable nature of GenAI challenges this expectation, potentially eroding users' blind trust in a software product. Addressing this shift, i.e., ensuring users retain confidence in AI-enhanced products, becomes a crucial responsibility for product teams.
Changes for Developers
For developers, the creation of GenAI features also introduces a host of new challenges and considerations. This article explains very well how building GenAI-powered features brings new challenges in terms of prompt engineering, orchestration, testing, best practices, safety, compliance, and so on.
This is also what the Incident.io team shares in their awesome article about their experience developing AI-powered features. Their journey highlights some of the hurdles I mentioned, especially on the trust and accuracy side.
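On the testing side specifically, one consequence of non-determinism is that exact-match assertions stop working: the same prompt can yield different wording on every run. A common workaround is to assert properties of the output (shape, bounds, required fields) rather than exact strings. A minimal sketch of such a property validator, with hypothetical names and a made-up result shape:

```typescript
// Hypothetical sketch: validating an LLM-generated summary by its
// properties rather than by exact string comparison, since the model's
// wording varies between runs. The result shape is an assumption.

interface SummaryResult {
  summary: string;
  bulletPoints: string[];
}

function validateSummary(r: SummaryResult, maxWords: number): string[] {
  const errors: string[] = [];
  if (r.summary.trim().length === 0) {
    errors.push("summary is empty");
  }
  if (r.summary.trim().split(/\s+/).length > maxWords) {
    errors.push(`summary exceeds ${maxWords} words`);
  }
  if (r.bulletPoints.length === 0) {
    errors.push("no bullet points");
  }
  return errors; // empty array = the output passes
}
```

In a real test suite, a check like this would typically run against several sampled model outputs, since a single passing run proves little about a non-deterministic system.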
Conclusion
While I do not think that GenAI requires a complete rethinking of product management, its infusion into SaaS products will probably impact it at some level.
It seems like an opportunity for product designers, managers, and developers to reimagine UX/UI designs and to develop new UX frameworks for AI interaction.
Looking at the shared experiences of other developers and product managers, it is also clear that this shift will require new PM and dev skills, and even new roles, in the product department.