Oracle's move to Agents isn't just about AI – it's about solving user problems.
Oracle's new AI agents are a great example of the level of specificity needed in defining agent roles. They show a deep understanding of user needs. Are they using the Jobs to Be Done framework? If so, it's evidence of how user-centric thinking can shape design decisions in enterprise AI.
This advancement in enterprise AI is exciting, with its focus on augmentation over automation. Creating tools that enhance human capabilities, based on how people actually work, opens up new possibilities for us as designers. Go take a look at their more than 50 role-specific assistants.
It's interesting to see what they chose to prioritize.
Oracle AI Agents Help Organizations Achieve New Levels of Productivity
"What’s the real reason AI hasn’t yet delivered on its hype?"
The article's key angle is pretty interesting: current AI tools are falling short of truly revolutionizing productivity. It's a wake-up call for the tech industry. Generally speaking, AI-centered products are posting impressive growth numbers, but are we really moving the needle on innovation?
The critique is that AI's current approach to complex tasks simply reduces them to button-clicks - is that enough? We need to fundamentally rethink workflows, not just automate existing ones.
Redefine. Reimagine. Reconstruct.
There is a shift toward AI-native product design thinking, and it's something new to learn. The article argues that we need to move beyond mere AI enhancements and completely reimagine workflows for the best possible AI integration experience. It's not just about adding AI features; it's about fundamentally restructuring how we interact with technology.
He gives six workflow examples based on an episode of The Future of Prosumer: The Rise of "AI Native" Workflows - it's a good watch, but I liked Nielsen's summary too.
Filling in the blank page: Help the user get started.
Iteration: Easy onramps for variation.
Refinement and upscaling: Hooks for human refinement.
Multimodal: This one is exciting, but designing for many inputs rather than just one is challenging.
Remixing: Similar to iteration, but more about leaving room in the P/R curve for things to happen without explicit user instruction.
Guided exploration: Yes.
Native AI Workflows - Jakob Nielsen on UX
AI? In public service? In this economy?!
The integration of AI into unemployment appeal processes is concerning and a bit yucky. Nevada's move to implement a Google-powered AI system for analyzing unemployment appeals is, on the surface, a big step toward efficiency - which is appealing - BUT I have critical questions about the balance between speed and accuracy in decision-making.
Retrieval-augmented generation (RAG) improvements in models are promising for enhancing accuracy, but I can't help but wonder about the potential for bias and errors in such a sensitive area of public service.
It's a classic case of technology outpacing our ability to fully understand its implications.
Google's AI Will Help Decide Whether Unemployed Workers Get Benefits
The "Hydra Project Effect"
This is a great concept that resonates with my experience at work, at home - anywhere I have projects. It's not just about getting things done; it's about battling our own tendency to get lost in endless iterations and improvements. It's about the psychological challenges we face when trying to complete anything.
I appreciate the strategies outlined for overcoming this cycle. The idea of defining clear completion criteria and embracing an MVP as a first step is something I could be better at.
And finding balance between perfectionism and pragmatism is crucial for … getting things done.