During a session on chatbots at the Oracle UX event I made a comment that I think was taken the wrong way, and as we were pushed for time I didn't have the opportunity to explain what I really meant.
The example given was of an individual using an intelligent personal assistant such as Alexa to ask 'how much vacation do I have?', with the response coming from HCM Cloud.
I said that a salesman had just given me this very example, and that I had replied I didn't want a non-work-related interaction having access to my information. The reaction I got from the UX team made me feel like a complete Neanderthal. But I'm not; or am I?
Actually, I've been here before. When the Fusion Expenses mobile app first appeared, the voice recognition was quite poor; in fact we joked that unless you spoke just like George Bush it didn't understand you anyway. I did, however, demo the app at many conferences around the world, and if I'm honest, when doing that part of the presentation I said it was a bit of a gimmick and that I didn't see myself ever using it. Roll on four years and guess what's really funny? I am actually creating this blog using voice recognition on my iPhone and don't even think twice about it today.
Anyway, I was in India for OTN Yathra with a room full of people watching my presentation attentively. I said I didn't think I'd ever use voice recognition in the app, and one of them said he had an exact use case. He worked for GE, which has engineers who service wind turbines; they can use a barcode and GPS to know exactly which tower they are at. However, when they climb to the top and carry out their inspection, they then have to complete a service inspection report. Being able to enter transactional data into their enterprise application using voice recognition would revolutionise their work pattern. His whole team agreed this was wonderful.
He was right. This was a real use case that an enterprise customer needed, and there I was laughing at what I saw as a gimmick because I couldn't actually see myself in the use case that had been presented: sitting in the back of a taxi and saying 'taxi ten dollars'.
So I should have learnt then that the problem with adoption is that disruption doesn't feel personal until the use cases being talked about are real to us; only then can we see the true potential for adoption.
It isn't that long ago that we were laughing at Mark Rittman for trying to make his house smart-enabled, with his well-documented problems making a cup of tea. People then said, 'Why don't you just switch it on yourself?' Now these intelligent personal assistants are commonplace. In this UX session I was one of only a couple of people who did not have something like an Amazon Echo at home today.
Perhaps if I spent more than three days a fortnight (two weeks) in my house I might consider it.
Anyway, back to the session. The use case given was that of using a chatbot to interrogate HCM. It seemed foreign to me, I didn't feel comfortable with the concept, and that is the barrier to adoption: if an organisation doesn't see the use cases as relevant to them, they won't use it. Well, not today. But, as I said, once adoption becomes mainstream, all of a sudden it doesn't seem so foreign.
Anyway, that's what I was trying to say, but we had so much to cover that the discussion was cut short, and then all anybody remembered was that I was the philistine in the corner. I tweeted that, and when Jeremy Ashley created his story of UX at #OOW17, the only tweet from me was this:
And then on Twitter I had a response from Grant Ronald telling me how great this technology is and how people everywhere are adopting it, so even in San Francisco I was getting a hard time.
I'm going to have to think more carefully and express what I want to contribute more distinctly.
What is important is that Oracle invests a lot in R&D, not just in UX but in these new technologies. Jake and his team in the AppsLab have a mandate to understand emerging technologies, their use cases, and PoCs within the enterprise.
Oh, and guess what we are now working on back at the office?
Return to main #OOW17 Summary