Deliver empathy with agility


We’re always innovating at Butterfly – looking for new and improved ways of getting to those little truths that make a big difference to your brand. The Hub is our proprietary online research platform that allows us to deliver our powerful methodologies in a digital environment.

What it is:

The Hub allows you to engage with your audience in different ways, wherever in the world they might be and at a time that is convenient for them: through asynchronous tasks developed specifically to untangle your challenge over a period of time, or through live conversations.




How it works:

Because all research that we conduct on The Hub is designed with your challenge in mind and moderated by Butterfly, it’s a highly flexible and agile tool that allows us to test and evolve as we learn.

Every interaction taps into people’s creativity and unlocks their emotional motivations and drivers, helping us to understand how people connect with your brand and category. We have a toolbox of ten different task types, including:


Camera – Generate rich content and challenges for your audience to share. Participants are asked to upload image or video responses.

Discussion – Primarily text-based interaction with respondents to get to know your audience’s routines and the drivers behind their choices. The functionality also offers a daily diary option.


Collage – Bring an idea or experience to life through the power of visuals. Work together with your audience to build a collage of materials to add colour and understanding to topics such as casting choices, brand world or packaging design briefs. There are three ways to use this task: an Unsplash API integration, consumer-provided visuals and a pre-defined library.




Evaluate – Test the big idea in all its detail through pinning and heatmapping. Our Evaluate task allows you to put forward text or image-based concepts, mood boards or canvases that help you unlock what your audience really thinks about your idea. Fully customisable, the task allows you to specify pin colours and labels, as well as the option to explore multiple ideas.

Blanks – Fine-tuning a concept and want to get the language perfect? Our Blanks task invites your audience to build a new concept by filling in the blanks, using either pre-defined terms or adding their own suggestions.

Ranking – Prioritise based on what works best for your audience using our Ranking task. Research participants can drag and drop different ideas, claims, designs or experiences into their preferred order, helping you gauge what to prioritise and how you might improve those ideas.

Gallery – Bringing Butterfly’s proprietary ‘cept’ approach into the digital world, our Gallery task allows us to unpack what really resonates by deconstructing ideas into micro stimuli and asking your audience to select the aspects of an idea that instinctively resonate with them.

Live – Interact on a human level with our Live task, which brings individuals or your whole community together in a video conference setting to discuss your challenge or query. Advanced technology allows us to share stimulus, arrange breakout groups and even conduct straw polls.

Collaboration – Bring Butterfly, clients and consumers together to work in the most collaborative of fashions, through a real-time virtual whiteboard where you can build ideas, run workshops and more.


Mission – From rapid feedback on your ideas to understanding what your audience is thinking and feeling while on the go, our Mission Task is made up of individual steps such as video or photo uploads, text-based responses or questionnaires.



The outputs:

We download, collate, ruminate, analyse, and tease out the interesting and the insightful to build robust recommendations for your brand, product or service. For the Mission task we have the help of our AI analysis features, which ensure our evaluation is rigorous and comprehensive:


  • Facial expression recognition can detect 7 different emotions (angry, disgusted, fearful, happy, neutral, sad, surprised) and calculate a facial expression score, enabling more agile and efficient ways of working as well as giving us deeper insight into consumers’ minds and feelings. Whether you’re analysing an individual or the entire community, you can now see what people are saying and feeling at a glance.

  • Video auto-transcription is supported in 32 languages. We know how difficult it is to analyse video content – but we also know how rich and powerful it can be. That’s why we’ve launched auto-transcription for videos in our Mission task to speed things up. This also means that any spoken content in videos can be analysed by our AI tools for sentiment analysis and word clouds. Transcripts are fully editable, highlightable and exportable.


  • Word clouds are generated from the text of video transcripts and all text submissions within each Mission step. The feature uses natural language processing to calculate the importance of words and their associated sentiment, allowing you to spot key themes at a glance.

  • Sentiment analysis collates all written answers in a Mission task step to give a percentage breakdown of positive, neutral and negative sentiment, providing a clear reading and supporting more efficient ways of working.
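To illustrate the kind of collation the sentiment feature performs, here is a minimal sketch in Python. It assumes an upstream classifier has already labelled each written answer "positive", "neutral" or "negative" (the function name and labels are illustrative, not The Hub's actual API), and simply turns those labels into the percentage breakdown described above.

```python
from collections import Counter

def sentiment_breakdown(labels):
    """Collate per-answer sentiment labels into a percentage breakdown.

    `labels` is a list of strings, each "positive", "neutral" or
    "negative" (assumed to come from an upstream classifier).
    Returns a dict mapping each sentiment to its share, in percent.
    """
    counts = Counter(labels)
    total = sum(counts.values()) or 1  # guard against an empty step
    return {s: round(100 * counts.get(s, 0) / total, 1)
            for s in ("positive", "neutral", "negative")}

# Example: five answers from one Mission step
answers = ["positive", "positive", "neutral", "negative", "positive"]
print(sentiment_breakdown(answers))
# {'positive': 60.0, 'neutral': 20.0, 'negative': 20.0}
```

The percentages always sum to 100 for a non-empty step, which is what gives the "clear reading" of overall sentiment at a glance.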


To find out more about The Hub, get in touch at: