Perceptron – AI Assistant for Observational Research
“Quant meets Qual” to give data-backed insights for real-world consumer behaviour
“95% of purchase decisions are made in the subconscious mind”
Famed author and Harvard Business School professor Gerald Zaltman says that 95% of our purchase decision-making takes place in the subconscious mind. This makes sense to us because Nobel prize-winning psychologist Daniel Kahneman has shown with hard data that we humans are essentially lazy users of our conscious brain. The subconscious part of our brain is powerful, and this is precisely where Perceptron adds value. For instance, driving happens in an automatic mode for us: we can easily drive from home to work, all the while thinking about the tasks ahead and without actually engaging our mind in the act of driving. Zaltman argues that our preference for the brands we choose, and how we use these products, also forms in the enigmatic layers of our deep subconscious.
“Tell me how AI can help us get implicit insights”
This is a common theme that has emerged in our conversations with clients. Today’s researchers want implicit insights, they want them fast and they want them backed by data. At Karna AI, this is the key theme on which we focus, and one solution that addresses this gap is Perceptron.
Companies in the digital space are extremely avid users of market research techniques. For instance, the entire industry builds websites and apps by analyzing raw data of user behaviour through interaction logging services like Google Analytics. Armed with this data, product managers test multiple designs, understand the customer flow and make data-backed improvements.
But when it comes to the physical world, things become different. How does a company selling shaving equipment, bicycles, cigarettes, coffee etc. get Google Analytics-like user behaviour and interaction data to base decisions on? The good news is that AI is bridging the gap between the physical and digital worlds to make this happen. Read on to learn how we address this market gap with Perceptron — the AI assistant for Observational Research.
Perceptron — AI Assistant for Observational Research
An overview of how Perceptron can deliver insights that were just not possible to derive before.
Observing how people interact with products can yield useful insights into consumer needs and motivations, why they prefer a certain product over others, and how a brand can evolve its messaging to generate higher impact. It is a common research practice, performed today with the help of ethnographers or domain-skilled researchers who spend days or weeks observing respondents, noting down data points, capturing videos and asking questions, and who later provide a report addressing the key questions in the research brief.
Humans have limitations
Observing all consumer interactions manually is a time-consuming and cumbersome process. Humans can capture high-level data (like the time taken to eat a Subway sandwich) but find it difficult to capture minute data points (the number of chews while eating a sandwich and the average duration of those chews).
The presence of a human keenly observing another human can be unsettling for the research subject (the consumer) and can alter their behaviour. “To err is human”, and researchers are no exception: they are prone to biases, fatigue, errors and judging respondents based on personal feelings.
“I want data and I want it fast”
Manual observations are not scalable. A researcher would want her decisions to be based on a large sample of data (typically ~200 respondents), while human-based observations don’t scale well (typically ~20 respondents). Manually intensive research projects have long turnaround times because collecting data, processing it and finding relevant video sections all take time. And of course, all these constraints add to the cost of the research exercise.
However, we acknowledge that ethnographers have a great deal of value to add. What we propose is that by teaming up man and machine, the entire research exercise can be improved.
Auto-Observational Research — A Case Study on trimming
Understanding how men trim beards and why certain products work!
This is a conceptual case study of how Perceptron is used for product testing and how it can help a beard trimmer brand get a better understanding of its customers. What Perceptron needs is hundreds of videos of bearded men using different trimmers (say, 100 respondents each for three trimmer variants A, B and C) for video analysis and product testing. Perceptron performs facial coding of these videos, which can be sourced by the client in-house, through an agency, or through an online panel provider where respondents record videos from their homes.
Once the videos are collected, the client research team and Karna AI engage with each other to understand the research objectives and accordingly develop a data collection protocol. For this particular case, the objective and protocol could be something like this:
- Understanding which of the trimmer variants (A, B and C) perform well and why
- Understanding the typical user flow of a trimming operation for different trimming objectives — full beard trimming, french beard styling, beard shortening etc
- Get data-backed answers for unique observations from the process
Typical Data Collection Protocol:
- Total time taken for the trimming operation and time spent on each area of the face.
- The number of trimmer strokes made along with the intensity and length of each stroke.
- Angle of the trimmer and face angle at each point in time.
- Emotions and facial distortions made during trimming.
- How the trimmer is held at each point in time.
- Number of times the trimmer was shaken to remove excess hair.
- The frequency of switching on/off during the trimming exercise.
- Areas of the face where the user comes closer to the mirror for delicate trimming.
- Beard density at each point in time for every part of the face.
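As an illustration, a protocol like this could be encoded as a simple checklist that an analysis pipeline validates exported data against. The metric names and units below are hypothetical placeholders, not Perceptron’s actual schema:

```python
# Hypothetical encoding of the data collection protocol as a checklist of
# metrics the exported data file is expected to contain (names illustrative).
PROTOCOL = {
    "total_time_per_face_area": "seconds",
    "stroke_count": "count",
    "stroke_intensity": "0-1 scale",
    "trimmer_angle": "degrees",
    "face_angle": "degrees",
    "emotion_label": "categorical",
    "on_off_switches": "count",
    "beard_density": "0-1 scale",
}

def missing_metrics(exported_columns, protocol=PROTOCOL):
    """List protocol metrics absent from an exported file's columns."""
    return sorted(set(protocol) - set(exported_columns))

print(missing_metrics(["stroke_count", "emotion_label"]))
```

A check like this lets the research team confirm, before analysis begins, that every agreed-upon data point was actually captured.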
Deliverables and Output
With raw videos and the data protocol as input, Perceptron performs the analysis and returns all interaction metrics in an Excel file. Armed with raw data that captures in-depth interaction metrics for each second in time, researchers can run analytics and answer their key questions.
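To illustrate the kind of analytics a researcher might run on such an export, here is a minimal Python sketch that aggregates per-second interaction records into per-variant summaries. The field names and values are illustrative assumptions, not Perceptron’s actual output format:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical per-second interaction records, one row per respondent
# per second, as an export like this might contain (fields illustrative).
records = [
    {"respondent": 1, "variant": "A", "second": 0, "strokes": 1, "emotion": "neutral"},
    {"respondent": 1, "variant": "A", "second": 1, "strokes": 2, "emotion": "negative"},
    {"respondent": 2, "variant": "B", "second": 0, "strokes": 0, "emotion": "neutral"},
    {"respondent": 2, "variant": "B", "second": 1, "strokes": 1, "emotion": "positive"},
]

def summarise_by_variant(rows):
    """Average strokes per second and share of negative-emotion seconds,
    grouped by trimmer variant."""
    grouped = defaultdict(list)
    for row in rows:
        grouped[row["variant"]].append(row)
    summary = {}
    for variant, group in grouped.items():
        summary[variant] = {
            "avg_strokes_per_sec": mean(r["strokes"] for r in group),
            "negative_share": sum(r["emotion"] == "negative" for r in group) / len(group),
        }
    return summary

print(summarise_by_variant(records))
```

The same grouping logic extends naturally to any of the protocol metrics, letting a researcher compare variants A, B and C on whichever dimension the brief calls for.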
Output options include:
- The report that captures key findings from the research exercise.
- Heat-maps (averaged across hundreds of users) of typical user behaviour.
- A behaviour flow map/funnel with quantified metrics at each stage.
- Filtered video snippets from hundreds of hours of raw video (for instance, snippets that club all the areas where users exhibit negative emotion for more than 2 seconds).
- Data backed answers to any potential research questions.
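The snippet-filtering deliverable above can be sketched as a simple scan for contiguous runs of negative emotion in a per-second timeline. The one-reading-per-second timeline and the “negative” label are assumptions made for illustration:

```python
def negative_snippets(emotions, min_run=3):
    """Return (start, end) second ranges where 'negative' persists for more
    than 2 seconds, i.e. at least min_run consecutive per-second readings."""
    snippets, start = [], None
    for t, label in enumerate(emotions):
        if label == "negative":
            if start is None:
                start = t  # a negative run begins here
        else:
            if start is not None and t - start >= min_run:
                snippets.append((start, t))  # run was long enough to keep
            start = None
    # handle a run that extends to the end of the timeline
    if start is not None and len(emotions) - start >= min_run:
        snippets.append((start, len(emotions)))
    return snippets

timeline = ["neutral", "negative", "negative", "negative", "neutral", "negative"]
print(negative_snippets(timeline))  # → [(1, 4)]
```

The returned ranges map directly to timestamps in the raw footage, so hundreds of hours of video can be reduced to only the moments that matter for the research question.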
The above graphic is a snapshot of user behaviour on ParallelDots’ website. This is a critical tool used by our digital marketing team to improve the user experience and drive higher conversion. With Perceptron, we are bringing such in-depth analysis to the domain of physical products.
Take your research game to the next level with Perceptron
By bringing Quant-like rigour into the largely Qualitative practice of observational research, Karna AI is pushing the boundaries of how researchers can tease out implicit insights.
Want to know how Perceptron will increase your revenue? Click here to schedule a free demo.