Here at FinancialForce, we pride ourselves on embracing a world where technology continues to make our working and home life a bit easier and more flexible. We look for the newest, cutting edge technologies and ask ourselves, how can our products interact? Could this year’s new innovation positively impact how our customers interact with our apps someday? That’s what we did with wearables, and today we’re excited to share that’s what we’re doing with Artificial Intelligence, IoT and Voice Recognition technology via the wildly popular Amazon Echo.
The so-called “future of work” has indeed arrived! Today’s office can now be boiled down to a voice and a mobile device. How exciting is it to be able to ‘punch in’ to a client site via a tap of the watch or see how your projects are performing with a simple voice command? Now with the introduction of Salesforce Einstein, Amazon Echo and whatever the next big thing is, you can count on FinancialForce to be there.
I had the pleasure of sitting down with our dream team – AKA Andy Fawcett, FinancialForce CTO and Salesforce MVP and Kevin Roberts, FinancialForce Director of Platform Technology – to discuss the latest innovation project at FinancialForce: bringing ERP to life on Amazon Alexa and how AI and voice recognition might impact the way we engage with business applications in future.
Let’s take a look!
Interview with Andy Fawcett: How We Did It
Q: What prompted you to start thinking about connecting Amazon Echo to FinancialForce ERP applications and the Salesforce Platform?
Andy: Kevin Roberts, our Director of Platform Technology and resident evangelist on all things platform, pinged me in March of this year. He had seen a Tweet about an integration where Amazon Alexa (the voice service of Amazon Echo) was being used for paying your Capital One bill, and it planted the seed of how FinancialForce might be able to connect to Echo. He also knows I am a huge API junkie and knows the right buttons to press! How could I resist experimenting?
Q: What happened next?
Andy: I quickly got my hands on a device and headed back to my hotel room for a weekend of Amazon Echo hacking! Being in San Francisco at the time certainly helped with the creative juices of course. Knowing Kevin as I do, I knew whatever I did on the development side, I had to empower him to go crazy with use cases of what ERP skills we could bring to life. Doing so meant embracing a key tenet of the Salesforce Platform we and our customers love so much: “clicks not code.”
Q: For those of us who don’t know, what exactly is “clicks not code,” and how did this approach affect what you built?
Andy: Clicks not code allows processes to be automated by those who do not possess coding skills, by using visual process modelling tools and/or form-based configurations. So I started by identifying the most flexible “clicks not code” tool on the Salesforce Platform, Visual Workflow. Through its visual drag and drop editor, it has the power to retrieve and manipulate records from not only our products but any object on the platform. Critically it also has its own API, thanks to Salesforce! So it was a matter of bridging the gap between the Amazon Skill API and the Salesforce Flow API.
Q: Ah ha! So you did need to write “some” code?
Andy: Well yes, but not a lot! In total it’s just over 130 lines of NodeJS code. The Amazon Echo technology translates spoken phrases into intents and slots. For example, “Tell me what I am doing next Thursday?” is one of many sample phrases (utterances) you can teach it. Each phrase you teach it is assigned an id (intent), such as “calendar enquiry”, and the varying aspects of each phrase are defined as slots. In this example, our slot is the date aspect of the phrase. The date slot is cleverly resolved to whatever “next Thursday” actually is before it’s passed to the code. You can even say things like “28th” or “next week”. Very cool, Amazon!
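To make the intent/slot idea concrete, here is a minimal sketch (not the actual skill code) of how an Alexa request carries the intent id and the already-resolved slot value; the intent name “CalendarEnquiry” and slot name “Date” are illustrative assumptions:

```javascript
// A sample Alexa IntentRequest: Alexa has already resolved the spoken
// phrase "next Thursday" into a concrete ISO date in the Date slot.
const sampleRequest = {
  request: {
    type: 'IntentRequest',
    intent: {
      name: 'CalendarEnquiry', // the intent id taught to Alexa
      slots: {
        Date: { name: 'Date', value: '2016-09-29' }
      }
    }
  }
};

// Pull out the intent name and a flat map of slot values.
function parseIntent(event) {
  const intent = event.request.intent;
  const slots = {};
  for (const key of Object.keys(intent.slots || {})) {
    slots[key] = intent.slots[key].value;
  }
  return { name: intent.name, slots: slots };
}

const parsed = parseIntent(sampleRequest);
console.log(parsed.name);       // "CalendarEnquiry"
console.log(parsed.slots.Date); // "2016-09-29"
```

The handler never sees the raw phrase, only the intent id and the slot values, which is what makes the downstream mapping so clean.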
Q: So how does this map to the world of Salesforce Visual Workflow?
Andy: Simple! The result of my weekend hacking was that any skill/phrase Kevin came up with was assigned an intent id, which was automatically mapped to a corresponding Visual Flow he created in Salesforce with the same name. Any slot values, the date in the above use case for example, were passed as Flow parameters, which he also defined. All with the magic of clicks not code!
The NodeJS code (hosted in AWS Lambda) conforms to the Amazon Skill API and manages transferring the information to Salesforce Flow and back again! Ok, so maybe that does not sound so simple!?! If you would like to know more, including how to set up your own skill using this code, I have given the full lowdown in my blog here.
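As a rough illustration of the bridging idea (not the actual 130 lines), the sketch below builds the request a handler might send to the Salesforce Flow REST endpoint, mapping the intent name to a Flow of the same name and passing the slot values as Flow input variables. The instance URL, API version and names here are placeholder assumptions:

```javascript
// Sketch: translate an Alexa intent into a Salesforce Flow invocation.
// The endpoint shape follows the Salesforce Actions REST API
// (POST /services/data/vXX.X/actions/custom/flow/<FlowName>).
function buildFlowRequest(instanceUrl, intentName, slots) {
  return {
    method: 'POST',
    // The Flow is named after the intent, so no per-intent code is needed.
    url: instanceUrl + '/services/data/v38.0/actions/custom/flow/' + intentName,
    // Each slot becomes a named Flow input variable.
    body: { inputs: [slots] }
  };
}

const req = buildFlowRequest(
  'https://example.my.salesforce.com', // placeholder org URL
  'CalendarEnquiry',
  { Date: '2016-09-29' }
);
console.log(req.url);
console.log(JSON.stringify(req.body));
```

Because the Flow name is derived from the intent id, adding a new spoken command needs no new NodeJS code at all, just a new intent and a matching Flow.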
Interview with Kevin Roberts: How You Can Use It
Q: OK, so Andy walked us through how he developed an Amazon Echo skill that enables spoken conversations between a user of the Amazon Echo and the Salesforce Platform. You’ve been working on how that skill might be used with FinancialForce apps, correct?
Kevin: Yes, once Andy had built the Echo skill I was able to use the standard Flow tool in Salesforce to configure some proof-of-concept “ERP conversations.” For each question I wanted to ask Echo, I simply configured a Flow process that queried the relevant FinancialForce data and inserted that data into the sentence or sentences that became the spoken Echo reply. I think we were both quite shocked at just how quickly we could define new commands and associated responses from Echo once we had the underlying skill running.
Q: Can you give us an example of one of these conversations and how it would be configured?
Kevin: Happy to! When we looked at how Echo was being used in the consumer world the first commands we saw were simple questions such as “What’s the weather today?” or “What time is it?”, so an early FinancialForce command we wanted to try was a simple employee use case of “Who am I?”. The use case I had in mind was a new hire needing basic information to complete a task during their onboarding process. With a simple spoken command the user could check basic details from their user profile such as job title and department, their manager’s name and employee number.
To configure this new command I simply had to define a new “intent” for the FinancialForce skill in the Amazon Developer console and then create a new Flow process associated with that intent. Below you can see what the Flow looks like. The majority of the work is done in the Assignment step which constructs a spoken response that combines fixed text and variable data which has been read from the user’s profile.
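For readers who prefer code to Flow diagrams, the Assignment step’s job of combining fixed text with variable data from the user’s profile can be sketched like this; the field names and wording are illustrative assumptions, not the actual Flow variables:

```javascript
// Sketch of the Assignment step: stitch fixed text together with
// fields read from the user's profile to form the spoken reply.
function whoAmIResponse(user) {
  return 'You are ' + user.name + ', ' + user.title +
         ' in ' + user.department +
         '. Your manager is ' + user.managerName +
         ' and your employee number is ' + user.employeeNumber + '.';
}

const reply = whoAmIResponse({
  name: 'Jane Doe',
  title: 'Consultant',
  department: 'Services',
  managerName: 'Sam Smith',
  employeeNumber: 'E1234'
});
console.log(reply);
```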
Check out this short video to see this “Who Am I?” conversation in action.
Q: That’s very cool. Did you create any other examples in your proof of concept?
Kevin: The “Who am I?” use case was a single fixed question, so we then created more sophisticated interactions that included prompts and more complex queries. As a test of doing data entry via voice commands, we created an “Add Project Comment” flow that asks the user for a project name and a comment to be added to the project Chatter feed. The conversation looks like this:
User: “Alexa, ask FinancialForce to add project comment”
Echo: “Tell me the project name for your comment”
User: “Project name is greentech pilot”
Echo: “What is the comment you wish to add?”
User: “Great job team”
Echo: “Your comment has been added to the chatter feed of the Greentech pilot project”
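One way such a two-step conversation could be driven is with Alexa session state, keeping the partial answer (the project name) between turns until both values are collected. The sketch below is a hypothetical illustration of that pattern, not FinancialForce’s implementation; names are invented and the Chatter post itself is omitted:

```javascript
// Sketch: a two-turn dialogue driven by session state. On the first
// turn we store the project name and prompt for the comment; on the
// second turn both pieces are present, so we confirm completion.
function handleAddComment(session, slotValue) {
  if (!session.projectName) {
    session.projectName = slotValue;
    return { prompt: 'What is the comment you wish to add?', done: false };
  }
  // Both pieces collected: post slotValue to the project's Chatter
  // feed (omitted here) and confirm back to the user.
  return {
    prompt: 'Your comment has been added to the chatter feed of the ' +
            session.projectName + ' project',
    done: true
  };
}

const session = {};
const turn1 = handleAddComment(session, 'Greentech pilot');
const turn2 = handleAddComment(session, 'Great job team');
console.log(turn1.prompt);
console.log(turn2.prompt);
```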
News briefings (aka “tell me the news”) were another popular use of Amazon Echo we saw in the consumer world, so we built a “FinancialForce News” flow process that pulled together a summary of live data from CRM and ERP objects, including Opportunities, Projects, Cases and Sales Invoices, and dynamically built a live news briefing that included commentary from departmental leaders, maintained as records in a “news headlines” custom object.
The FinancialForce News briefing example to me, really demonstrates the power of having all your business data housed on the same platform. As a result, Echo can seamlessly share information pulled from Salesforce CRM, Professional Services Automation (PSA), Financial Management and Human Capital Management (HCM) applications in a single, easy to maintain, flow process.
You can see both of these use cases demonstrated in this video.
Q: What conclusions did you reach based on the work you’ve done building integration between the Amazon Echo and FinancialForce?
Kevin: Right now, Amazon Echo is predominantly focused on the consumer market rather than a business audience so I don’t expect FinancialForce customers to be asking Alexa questions about their ERP right away, but I do think what we’ve created gives us an indication as to how we might be interacting with our business applications in the future.
Working with the Echo has shown that voice recognition technology has matured to a level where accuracy is high and the spoken interface feels very natural. As adoption of Apple Siri, Microsoft Cortana and Google’s voice-controlled interfaces grows, we will see an expectation for these consumer technologies to be applied to business applications. It’s been very encouraging to see how the strong APIs of both Amazon and the Salesforce Platform made our early exploration of voice-driven ERP so quick to develop while maintaining our strong heritage of “clicks not code” configuration.
Want to learn more? Check out Andy Fawcett’s blog on the development side of Amazon Echo + FinancialForce ERP. We’ll be at Dreamforce showcasing our voice ERP commands so please don’t hesitate to reach out to our team – we’d love to meet you!