To date, Q&A on ecommerce sites has mostly been a tag-along feature of customer reviews, provided by vendors that specialize in reviews. This approach results in a Q&A model that behaves more like customer reviews than a true social exchange between shoppers and past customers, missing the benefits that a truly social approach to ecommerce Q&A provides.
The key to Social Q&A is that shopper questions should reliably and quickly get answered by real customers, and participants should be able to continue the conversation beyond the initial question if they choose. If shopper questions only rarely receive customer answers, or receive them only after an extended delay, the shopper is disappointed and the store misses the chance to quickly remind the shopper about the purchase she was considering. Further, getting past customers to share their experience with active shoppers is a great way for stores to keep their relationships with the customer base fresh. The rise of social networks has conditioned people to expect a high level of interactivity from social applications - so if a Q&A tool isn't providing that, it's not really Social.
On many online stores' Q&A systems, we've observed that most answers come from store staff. That can be an OK supplement to social answers (especially if the staff are really experts), but the store may be better off directing those questions to a live chat or phone line so the staff can interact with the shopper in real time. And if a shopper wants to know something subjective - like how the product held up after 3 months, or how it felt, or just if it's really as fabulous as they hope it is(!) - they may only want an answer from someone like them who really bought the item. A Q&A system that relies heavily on staff answers also isn't really Social.
That's why TurnTo created an approach to Q&A for ecommerce that reliably provides a true Social experience - multiple, fast answers from real purchasers with continuing back-and-forth dialog. To measure the difference between the TurnTo approach and the approach offered by the leading customer reviews vendors, Bazaarvoice and PowerReviews, we conducted a simple test. We asked 16 shopper questions per vendor on a range of sites with Q&A powered by TurnTo, Bazaarvoice, or PowerReviews, and we tracked how long the answers took to arrive. Here are the aggregated results:
Methodology: In our test design, we tried to keep the playing field level. We asked general questions that could easily be answered by anyone with experience with the product. We tried to ask the identical question about identical products wherever possible. Where that was not possible, we tried to pick featured items on the Bazaarvoice and PowerReviews sites likely to have high traffic and to have been purchased many times (no new-arrival items were used). We tried to pick sites where the Bazaarvoice and PowerReviews Q&A tools were implemented in a highly visible way on the page.
That meant that the PowerReviews and Bazaarvoice sites were not always the largest in each vertical (in particular, in the photo gear category), but more often than not, the Bazaarvoice and PowerReviews sites had far more traffic than the TurnTo sites, and in aggregate they certainly did. We checked the item page where each question was asked at exactly the specified intervals and counted posted answers. We also provided our email address with each question asked and counted answers received by email. (The Bazaarvoice and PowerReviews stores often emailed answers well before those answers appeared on the sites - in some cases, even before the questions appeared on the sites.) None of the sites were alerted in any way about this test. All questions were submitted on Wednesday, August 10, 2011, between 9am and 11am Eastern Time. Here are the test sites that we used:
On each site, we asked 4 questions. So in total, we asked 16 questions per vendor. Here are the details of the answers received, by individual site. (All numbers are for social answers - answers from customers - except those in parentheses, which are answers from store staff.)
Staff answers: We also tracked answers from store staff; these are shown in parentheses in the table above. At the end of the two-week test period, the questions on PowerReviews sites had received a total of 10 staff answers vs. 7 social answers, and the questions on Bazaarvoice sites had received a total of 5 staff answers vs. 9 social answers. No staff answers were received on the TurnTo sites - note that 15 out of 16 questions on TurnTo sites received at least 1 social answer within 24 hours.
We encourage you to try this test for yourself.
The raw data: Here are the URLs for all the item pages for all questions in the test. The asker is "Andrew P", "Andrew RP", or "Anonymous" - also look for a submit date of August 10th where that is shown. Note that on the Bazaarvoice and PowerReviews sites, we counted answers received by email, even though some of those answers - in some cases, even the questions - were not posted on the site by the end of the test period.