Monthly Archives: November 2020

What is the connection between IKEA, playful learning and TestingSaaS?

Playful learning at TestingSaaS


What does playful learning at IKEA have in common with TestingSaaS?

Yesterday I read an article on the Dutch mtsprout.nl about IKEA and how Bas van de Poel, creative director of SPACE10, IKEA’s innovation lab, looks for answers to tomorrow’s business questions and ways to create a better everyday life.

With the IKEA Place app, Space10 revolutionized shopping for furniture by letting you place virtual furniture in your own living room.

How? By augmented reality.

Ikea and Augmented Reality

Space10 saw this opportunity when Apple announced the launch of ARKit for iOS 11 in 2017, and development started.

Together with the Dutch company TWNKLS, a PTC company, they made this opportunity a reality and created the augmented reality IKEA Place app.

“Having all these different minds on the project made the difference. It meant we all pushed each other to our limits.”

Daniel van der Schoor, manager at TWNKLS

Great Cordny, but why are you so interested in this?

Not only because it’s about augmented reality or the Ikea brand.

Playful Learning and TestingSaaS

No, I’m interested in how Space10 fosters innovation.

Not with thick and elaborate research reports, but through playful research.

Like Space10 and Bas van de Poel I want to make research accessible to more people.

How? By first visualizing it. For this, augmented reality is a great method.

It’s key to combine the hard facts with appealing and simple graphics.

I do not want the reader to become an instant academic, but to let them experience the process of the research.

Frankly, that’s what I have done with TestingSaaS from the beginning:

show the reader what the online application is all about.

Software testing, security, data science, blockchain… I do not mind the subject.

I want to give the reader the experience.

How? By first experiencing it myself, writing about it and giving it to the world through a manual, whitepaper, blog, case study or even augmented and virtual reality.

Who knows? Maybe one of my readers will become inspired and create

the next innovative app or device.

Just like the companies I test for and write about.

Do you have a new innovative app like IKEA’s augmented reality app IKEA Place?

Do you NOT want boring, extensive documentation that your prospects dislike during onboarding and that makes them walk away?

Let’s have a chat and see how I can help you create documentation for your app your future customers will certainly like.

My first experience testing Google Assistant with the chatbot test tool Botium

Getting practical with Botium: Testing Google Assistant


How did I end up testing Google Assistant with Botium?

During my daily software tests I verify a lot of SaaS apps and platforms.

Examples are chatbots and virtual assistants.

And recently, clients asked me more about testing Google Assistant.

This can be done manually, but it takes a long time and errors can occur when testing the same cases over a longer period of time.

Is it also possible to use test automation?

Yes, with the help of Botium.

This article shows my first experiences when testing Google Assistant automatically with Botium.

But first what is Google Assistant?

What is Google Assistant?

Google Assistant is Google’s voice assistant. Through voice commands, voice search, and voice-activated device control you can complete a number of tasks after you’ve said the “OK Google” or “Hey Google” wake words. Conversational interaction through text or speech is its main goal, making Google Assistant a chatbot.

After the wake words you can start talking to Google without using a ‘trigger word’. Google listens and gives a response.

Google can even recognize different voices, knowing who is talking to it and responding accordingly. 

You can also ask for multiple things in one sentence.

Great, now we know what Google Assistant is. 

But do we want to test Google Assistant manually – with our own voice, or multiple voices – or can this testing be done automatically? 

Yes, by using Botium, the tool for testing, training and quality assurance of chatbots.

What is Botium?

Selenium is the de facto standard for testing web applications.

Appium is the de facto standard for testing smartphone applications.

Botium is the de facto standard for testing conversational AI.

And just like Selenium and Appium, Botium is free and open source, and available on GitHub.
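To give an idea of what a Botium setup looks like: Botium is configured through a `botium.json` file, where the `CONTAINERMODE` capability selects the connector to the chatbot under test. A minimal sketch for Google Assistant might look like this (assuming the separate `botium-connector-google-assistant` package is installed; the connector also needs Google API credentials, which I leave out here):

```json
{
  "botium": {
    "Capabilities": {
      "PROJECTNAME": "Google Assistant with Botium",
      "CONTAINERMODE": "google-assistant"
    }
  }
}
```

Check the connector’s documentation for the exact credential capabilities it expects.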

I can tell you a lot about Botium’s architecture, but that’s out of scope for this article.

Next to testing the conversation flow of a chatbot you can do a lot more:

  • Testing the NLP model of a chatbot
  • E2E testing of a chatbot based on Selenium and Appium
  • Load- and Stress testing
  • GDPR testing
  • Security testing
  • CI/CD integration (Jenkins, Bamboo, Azure DevOps Pipelines, IBM Toolchain, …)
  • and many more

For now we will focus on testing Google Assistant with Botium.

How can I test Google Assistant with Botium?

Instead of thinking how I could test Google Assistant with Botium I did a very modern thing.

I Googled the question.

And I got lucky. Botium already made a video on this with the appropriate name ‘Setting Up a Google Assistant Project in Botium Box’.

Awesome, now I can copy this completely for my own testing.

The steps in Google Actions, Assistant and Botium are very straightforward.

And soon I was making my own Botium tests and creating scripts for testing Google Assistant.
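These test scripts are written in BotiumScript, Botium’s plain-text conversation format: the first line names the test case, and `#me` / `#bot` sections describe what is sent and what response is expected. A minimal sketch (the bot’s answer here is just illustrative):

```
TC01 - Greeting

#me
Hello

#bot
Hi! How can I help you?
```

If Google Assistant’s actual reply does not match the `#bot` section, the test case fails.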

But, as with all software, it is not always as easy as it seems.

My experiences with Testing Google Assistant with Botium

From what I saw when testing Google Assistant, Botium as a tool is very intuitive.

What I really like is that you can see and hear (!) the results as they occur when using Google Assistant. Just search your test result for the following links.

The sound and screen-links overlap, but that’s a cosmetic issue.

The important thing is, it works. 

You can hear the text as it is displayed by using the play button as seen above.

And by clicking the weblink you are forwarded to a page resembling the results you get when using Google Assistant.

One of the things I noticed when testing Google Assistant is that its responses can differ from what you expected.

For example, see this result after saying ‘Hallo’ to Google Assistant:

Always update your test cases regularly; Google Assistant is changing fast and, as you can see, it even gets more personalized.

Also, what happens when the answer to your question can be random? Put differently, when the answer can be given in a different sequence?

Google Assistant will give appropriate prompts, depending on the question you ask.

But see what happens when I ask Google Assistant via Botium the following question:

‘Who is Jack Nicholson’?

Great, the test case worked fine, but let’s see what happens when I repeat it.

In the second test different buttons appear in the result, failing the test case.

Now you could delete the expected test results from the Botium code in your test case, but remember: you are testing a voice assistant, and Google Assistant returns these buttons so you can click them for more information or an answer to a question.

Therefore, in Botium you really have to think about how you define your test cases and their expected results, to avoid situations like this.
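For example, one pragmatic option is to assert only the stable part of the spoken answer and leave the volatile buttons out of the expected result. A sketch in BotiumScript (the response text is illustrative; Botium also ships asserters such as BUTTONS for when you do want to pin the buttons down):

```
TC02 - Jack Nicholson

#me
Who is Jack Nicholson?

#bot
Jack Nicholson is an American actor*
```

Depending on the `SCRIPTING_MATCHING_MODE` capability, a partial match like this may need a trailing wildcard (`*`). This way the test case keeps passing when the buttons change, while still failing if the core answer does.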

Wrap up

This article showed my first experience testing Google Assistant with Botium.

These results are not exhaustive; I only mentioned the findings that are important when testing Google Assistant with Botium for the first time.

Other future test cases could include multi-voice testing, asking multiple questions at the same time, etc. But that’s for a future article.

Botium is intuitive, easy to use and can be used for different testing processes with chatbots.

As with every tool, you have to stay critical, as a real tester should.

Test automating a chatbot is never easy, but with Botium you have a great tool to work with.

It should be in every chatbot tester’s toolbox!