Just Because You Can, Does That Mean You Should?

Facebook’s recent experiments in social-media mood contagion got us thinking about user-based testing in general, and especially about how it applies to healthcare technology that is intended to influence behavior.

The Experiment

For one week in January 2012, Facebook manipulated users’ news feeds to show content that was either predominantly positive or predominantly negative, and then measured whether the change affected what those users posted. The main point of contention is that this was human-subject research conducted without consent from the subjects and without the oversight of a review board, as would be expected for university research. If the results hadn’t been published in a scientific journal, there might not have been so much controversy.

How does this differ from A/B testing? In A/B testing, marketers show different landing pages or campaigns to different visitors and measure which variant best achieves a desired goal. Those consumers don’t know they are part of an experiment either, but they did freely follow a link that brought them to the content. The difference with Facebook is, first, that it has significant power, both because of the volume of its users and, more importantly, because of what it knows about them; and second, that although Facebook’s lawyers will tell you the terms of use covered it, users probably did not sign up expecting that Facebook itself would actively attempt to make them happy or sad.
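To make the A/B testing comparison concrete, here is a minimal sketch of how such a test is typically implemented. Everything in it (the experiment name, the variant labels, the conversion counters) is illustrative, not a description of Facebook’s or any particular vendor’s system: each user is deterministically assigned to a variant, and outcomes are tallied per variant for later comparison.

```python
import hashlib

def assign_variant(user_id: str, experiment: str, variants=("A", "B")) -> str:
    """Deterministically assign a user to a variant by hashing their ID.

    The same user always sees the same variant for a given experiment.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# Illustrative bookkeeping: count visitors and conversions per variant,
# then compare conversion rates to pick a winner.
visitors = {"A": 0, "B": 0}
conversions = {"A": 0, "B": 0}

def record_visit(user_id: str, converted: bool) -> None:
    variant = assign_variant(user_id, "landing-page-test")
    visitors[variant] += 1
    if converted:
        conversions[variant] += 1
```

The key property is that assignment is consistent per user, which is also why a visitor never knows there was a “B” version they didn’t see.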

How Do You Test Behavior Change?

It’s an interesting question for those involved in healthcare, and in particular for those trying to help people modify their behavior. In our case, at Wellpepper we help people be more adherent to home treatment programs. To do that we use a number of motivating factors, including personalization and notifications, and as part of building our application we test which features are effective in motivating people. We continually improve and change the application based on what we learn. Is this testing on human subjects? Yes. Did we get permission? Yes. It is part of our terms of use, and it is also an essential part of how the industry builds software that people will use: by testing that software with real users. When people start using our software, they use it to help them with a specific problem, and they are happy when we make improvements that solve that problem more effectively. We encourage user feedback and implement new features based on it. So while we may test new features, that testing is part of the implicit agreement of delivering software to users. (If you’ve ever used software that was not tested with real end-users, you’ll know the difference.)

When we test and add features that improve the user experience and help users become more adherent to their treatment programs, users are happy: we have helped them with their goals for using our software and honored our implicit contract with them. If we started testing and adding features that made them less adherent, or that targeted some other behavior they weren’t trying to change with our application, we would have broken that contract, and they might vote with their feet (or in this case, fingers) and stop using the application.

What’s Your Implied User Contract?

The same thing could happen to Facebook, and it comes back to the intention behind this research. The unfortunate thing is that Facebook probably had enough data to figure out that positive newsfeeds make you happy and negative newsfeeds make you unhappy without actually manipulating the feeds. The fact that they did it anyway, and did it without consent, raises a bigger question about their intentions and about what exactly your implicit contract with Facebook is. What is their motive in trying to manipulate your emotions? For marketing experiments the motive is pretty clear: get you to consume more of the product. For Facebook it might be the same, but the fact that they deliberately tested negative messages is cause for some alarm. Let’s hope they use their power for good.

For software developers who aim at healthcare behavior change, there is an additional challenge as we think about testing features with real users. In order to help someone change behavior you need to test what works, and that testing does need to happen with real users. In general software development there are industry best practices, for example testing different designs to find out which is most effective. This may be considered “experimentation,” since users will not all see the same features, and some of the features they do see may not make it into the final version of the product. When you are doing this type of testing, you are looking for what is most effective in helping users achieve their goals. In healthcare, however, this testing must be done while protecting personal health information and without any harmful impact on the patient. Software developers can partner with research organizations whose institutional review board (IRB) will ensure that research on human subjects is conducted properly. To prove out the efficacy of an entire application, this is often the best way to go, but it is not practical for feature testing.

Guidelines for User Testing in Consumer Healthcare Applications

When testing specific features, these guidelines can help make sure you respect your end-user testers:

  • Unless you have explicit consent, all user testing must be anonymous. If you are dealing with PHI and have signed a HIPAA business associate agreement (BAA), you have agreed to access PHI only when absolutely necessary. If you need to know the demographics of your users for user testing, err on the side of getting their explicit consent, whether via a consent form or a non-anonymous feedback form in your application or website: by providing you with direct feedback, the user has agreed not to be anonymous. (The good thing here is that patients can do whatever they want with their own data, so if they give you consent to look at it, you have it.) That said, if you are working with healthcare organizations you will also have an agreement with them about contacting their patients, so make sure they have agreed as well. Whenever possible, err on the side of making data anonymous before analyzing it (see the sketch after this list).
  • Think about the implicit contract you have with the user. If you provide an application that does one thing, but you discover it may have applications for something else, don’t test features for that something else without getting consent: that breaks the contract you have with the user. Let’s look at a purely hypothetical example. At Wellpepper we have an application that increases patient adherence to home treatment programs for people undergoing physical rehabilitation. Suppose we found out that people in physical rehabilitation are also often fighting with their spouses, and we started adding features or asking questions about the user’s relationship with his or her spouse. Users would find this both unnerving and intrusive, because helping them with marital issues was not their expectation when they signed up for the application. Obviously this is a bit far-fetched, but you get the point.
  • Don’t get in the middle of human-to-human communication. This is essentially where Facebook broke the implicit contract with users: by manipulating the newsfeed. Your expectation of Facebook is that it’s a way for you to communicate with people (and sometimes organizations) you like. By changing what showed up in your feed, Facebook got in the middle of that. In healthcare this is even more important: don’t get between healthcare professionals and their patients. Make sure it’s clear when it’s you (the application, the company) talking and when it’s the caregiver or the patient.
  • Consider where you’d get more value by partnering with a research organization. Sure, it will take longer and may require more effort, but with explicit research consent you will be able to learn a lot more about why and how people are using your features. I am not sure whether it’s a coincidence, but about a month ago I noticed that my Facebook newsfeed was full of extremely depressing stories. I remember wondering what was going on, both with Facebook and with the world in general, and I remember wanting to post something depressing but then thinking, “No, I don’t want to add to this. I will only post positive things.” It’s possible that I was part of another Facebook study, and if so, they didn’t get the full picture they would have gotten if they’d been upfront about it, gotten my consent, and been able to ask me about my thought process afterwards.
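As a concrete illustration of the first guideline, here is a minimal sketch of pseudonymizing usage records before analysis. The field names are hypothetical, and salted hashing is only pseudonymization: full HIPAA de-identification (for example, the Safe Harbor method) removes many more identifiers, so treat this as a starting point, not a compliance recipe.

```python
import hashlib
import secrets

# Hypothetical field names; the real PHI fields depend on your data model.
PHI_FIELDS = {"name", "email", "date_of_birth", "address"}

# A random salt kept separate from the analysis dataset, so hashed IDs
# can't be reversed by re-hashing a list of known user IDs.
SALT = secrets.token_hex(16)

def pseudonymize(record: dict) -> dict:
    """Return a copy of a usage record for feature analysis: PHI fields
    are dropped and the user ID is replaced with a salted hash."""
    clean = {k: v for k, v in record.items() if k not in PHI_FIELDS}
    clean["user_id"] = hashlib.sha256(
        (SALT + str(record["user_id"])).encode()
    ).hexdigest()
    return clean

# Example: analyze adherence by feature usage without exposing identity.
raw = {"user_id": 42, "name": "Jane Doe", "email": "jane@example.com",
       "completed_exercises": 9, "assigned_exercises": 12}
print(pseudonymize(raw))
```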

There is no doubt that we will see more discussion of ethics and consent in user testing, especially as it relates to consumer-facing health applications. Having no regulation or guidelines is not good for consumers. However, doing research only through IRBs and third-party researchers is also not good for consumers, because innovation that could really help them can be slowed dramatically. Most people, whether healthcare practitioners or entrepreneurs, got into this space because they wanted to help people. If we remember that, and consider the ethical implications of our actions, we should be able to balance the two worlds.

For more reading on this topic as it applies to the software industry, see:

http://en.wikipedia.org/wiki/A/B_testing

http://ai.stanford.edu/~ronnyk/2009controlledExperimentsOnTheWebSurvey.pdf

http://www.exp-platform.com/Pages/expMicrosoft.aspx

Anne Weiler is CEO and co-founder of Wellpepper, a clinically validated and award-winning platform for patient engagement that enables health systems to track patient outcomes in real time against their own protocols and personalize treatment plans for patients. Wellpepper patients are over 70% engaged. Prior to Wellpepper, Anne was Director of Product Management at Microsoft Corporation.