Ethical Online Research with Human Participants
The topic of ethics is too big to fit into one newsletter, so this month you’re receiving more than one. (See the first here.) In the first half of 2025 I’m looking at ethical studies with human participants using data collected in online interviews (theme for the May newsletter) or focus groups (theme for the June newsletter). This week I’m sharing thoughts and resources about minimizing risks in online research and making agreements with participants. Next week I’ll share another blast to discuss ethical issues specific to visual and creative methods. Stay tuned for April, and a new focus on reflexivity and journaling.
What is informed consent?
Informed consent is the term given to the formal agreement that verifies the nature and extent of the participants’ involvement in the study. The term has two complementary parts: individuals are informed about the study, then consent to participate. They need to know what is expected of them and what potential risks are involved. They need to know that their participation is voluntary and can be discontinued at any point. They need to be informed about what data will be collected and how material about them and/or from them will be used. Their agreement to terms set out by the researcher is verified by a written or spoken affirmation of consent.
Verbiage for the typical consent agreement is often spelled out by the institution or funder, in line with any governmental regulations. However, the form letters institutions require us to use tend to be formal and legalistic. You know as well as I do that such documents are quickly scanned and signed, much as we do when agreeing to terms for using software. You might get the participant’s signature on such a letter, but don’t assume that it means they are truly informed.
While I don’t disagree with the basic requirements, I think calling this set of activities informed consent gives the wrong impression. It implies that once the study has been explained and the forms signed, attention to ethics is complete. You were informed, you consented: my responsibility for research ethics is satisfied! I prefer to frame research ethics as an active process that occurs continuously at all stages of the study. Let’s replace the past tense with an active label: InformING and ConsentING.
An active process: InformING and ConsentING
While pondering how to explain my thinking on this process, I came up with an analogy. Let’s picture that we want to buy a new house. We have to complete all sorts of practical and legal steps: clearing the deed, having the house inspected to ensure that the roof doesn’t leak and the foundation is solid, figuring out financing, and scheduling when the transaction will be final so we can start putting books on the shelf. We will need to sign documents indicating that we are committed to buying this house. Beyond the business of accomplishing these steps, there are other essential questions. We must imagine living in this house: is our heart in it? Will we feel safe? Will we feel comfortable? Will we feel at home?
The first part corresponds to the “consenting” part of the process. Of course this is important, because we need to have a legal basis for the sale. We need to trust the owners and be sure that any known problems have been brought to our attention. But before we are willing to sign the papers, we need another kind of assurance. We know a house is not a home if it doesn’t feel right. If the second set of questions isn’t satisfactorily answered, it is unlikely we will go through with the purchase, no matter how good the house looks on paper. This more affective part of the experience corresponds to the informing part of the process. Let’s think about what the process of informing participants might entail in studies where data are collected through online interviews, focus groups, or other kinds of digitally mediated interactions.
What should be included when you are informing participants about online studies?
Online researchers need to consider additional points when informing potential participants about the kinds of data to be collected and how they will be used. Who will have access to the raw data? It is important to discuss specific uses of personally identifiable material such as audio recordings or images. Participants need to know the kinds of publications or presentations you intend to develop. Interestingly, some participants accept the fact that other academics might learn about the study but are unhappy to discover that you intend to publish or present in a more mainstream, public way. Additionally, carefully explain how participants’ identities will be protected during and after data collection.
Participants need to know specifics about the nature of the exchange. Naturally, this means we need to be specific about our designs and plans; we can’t wing it. Some features that might come naturally in a face-to-face interview must be intentionally planned when the exchange occurs online. For example, when a researcher conducts an interview in a physical setting, some level of observation occurs. Is the home or office quiet or chaotic? Whether or not we make formal notes, we take in the photos on the participant’s desk, the books on the shelf, the magazines on the coffee table, the art on the wall. Such common objects may convey information about family, cultural heritage, gender identity, hobbies, or social memberships. They might spur further questioning or serve as an entrée for rapport-building conversation: “Tell me what was happening in this picture?” or “Is this a handmade quilt? Tell me why you keep it in your office.” When such observations and conversations are central to an online study, the researcher needs to choose a tool such as a videoconference platform, inform the participant about the purpose for using it, and negotiate when cameras should be on and what setting or background is most appropriate.
What process should you use when informing participants about online studies?
One of the ironic challenges for researchers is that we have to learn how to speak in a scholarly voice, communicate in “academese,” and produce long, detailed pieces of writing. Then we have to learn how to translate scholarly erudition into succinct messages that people outside our field of study can understand. Such skills are put to the test when it comes to informing participants!
Avoid sending a lengthy, dense written document or a bare list of bullet points. Learn how people in the target demographic for your study prefer to receive information. If they prefer audio or video to a written document, create a short recording to explain study expectations. If they prefer visuals, create an infographic or comic-style graphics. Test your materials with people who meet your selection criteria or are familiar with the population of your study.
Use every interaction, from recruitment through completion of the study, for meaningful communication. Even an email that aims to schedule a meeting can be a chance to build credibility and rapport. Be friendly and respectful, do what you say you are going to do in a timely manner, and take any concerns seriously.
Consider making a blog or page dedicated to the study. If the study is about sensitive problems, protect access with a password. Include a jargon-free description of the purpose and potential value of the research as well as the points you want the participants to know. Embed a video introduction. Link to your prior research or your academic institution to show your credibility. For a study with multiple interactions or a longitudinal study, post updates about how the study is going. You can link to this page from posts or emails, so potential and active participants can feel that they are a part of a valuable project.
If you are conducting interviews or other data collection events that involve talking with your participants, begin your interaction by reviewing key points, and record responses that affirm they understand. Make time for a quick check-in: “Do you have any questions about the study before we get started?” If you are conducting a series of interviews, ask, “Is there anything from our last meeting you want to talk about today? Did anything come to mind after the interview that you’d like to discuss?” Use member checking; that is, ask participants to verify transcripts and, in some cases, drafts. In other words, keep your virtual door open.
Minimizing Risks by Selecting Protected Platforms
Before we can discuss risks with participants, we need to carefully think through and address potential risks. In online research we are inherently dependent on the platforms and tools we use. Commercial platforms are generally not developed with researchers’ interests and requirements in mind. Today it seems that their priorities are just the opposite: we need to protect the data; they want to take whatever data they can, sell it, or feed it into generative AI bots. We can never be 100% safe, but we want to avoid leaks or hacks to the extent possible. As noted in the example offered in the previous newsletter, omitting someone’s name is not enough, because their identity can be pieced together from key characteristics. Cyberbullying doesn’t stay virtual when names and addresses can be found. Researchers studying hot-button or sensitive issues must give careful thought to the platforms and communication channels they select.
In the early days of the internet there were many free platforms that student, emerging, or independent scholars could use for small studies or projects. Alas, few remaining free platforms allow researchers to collect and protect their data. When free versions exist, they often have less security than paid versions of the same platform. Some openly allow AI bots and data-mining bots to scrape the data. If you are planning synchronous audio or video chats or videoconferencing, look at where the recordings are saved. Choose a platform that allows you to record to your own hard drive, so the recording is not on the company’s server where you cannot protect it.
For example, according to multiple communications with Zoom staff, their paid platform allows you to opt-out of AI training, but the free version does not. Zoom allows you to record to your own computer and no file is saved on their servers. Microsoft Teams, in contrast, does not allow you to record on your own computer.
A couple of years ago I was lucky to be introduced to staff at a company called itracks. They offer both synchronous and asynchronous platforms designed specifically for qualitative research. I asked Krystal Rudyk, itracks’ Marketing Manager, a few questions about how itracks protects clients’ data:
While qualitative researchers have always been concerned about protecting data, the stakes are even higher now because of 1) AI scraping and 2) an increase in online bullying and targeting, which means that if sensitive or identifiable data is leaked there are real risks. Tell us what itracks does to protect the data collected on your platform.
Yes, we take data privacy very seriously! You can view our privacy policy here. itracks is proud to have an unblemished record thus far when it concerns data privacy. We will never share your data, it is stored securely, and should you request permanent deletion, we are happy to oblige. On top of a strict privacy and data security policy, there are a few specific features built into the itracks software that are there for data security purposes. For example:
There are no open links to any itracks synchronous or asynchronous projects. What this means is that nobody can ever access an itracks activity (focus group, interview, board) unless they log in with a profile that has specifically been registered to that project. This is in contrast to platforms like Zoom, where you can have links that allow you to join a meeting or webinar that you hadn’t previously registered for. That said, we do understand the importance of ease of use, which is why we’ve recently added other secure login methods like single sign-on.
All video and audio data is end-to-end encrypted and stored on secure servers.
Additionally, researchers can actually choose the geographic location of the servers that host and store their Realtime activities. This can be valuable to everyone, but especially government or academic organizations where it’s important that the data never leave the researcher’s geographic region/country.
While hacking/technology leaks are something we protect against, we also recognize that sometimes human error can lead to personally identifiable information (PII) leaks as well. Recently, we updated our platform to mask key pieces of PII within the software so that researchers don’t need to worry about accidentally or unnecessarily revealing this information to other people who may be helping with the research (like recruiters or additional moderators), without inhibiting them from helping with the research.
itracks Realtime has an optional Participant Privacy mode that, when activated, will allow video groups to be conducted where the moderator can see the participants’ video while conducting the group/interview, but the participants’ videos (and facial identities) are obscured in recordings, and for fellow participants and observers who are watching the group live.
itracks requires all employees to undergo monthly data security training, and is proud to be fully HIPAA and GDPR compliant.
We wrote a blog that has a security checklist you can use while vetting research software options, you can access it here.
Some of your tools have AI functionality. How does that work, and how are data protected?
They do! Specifically, all itracks Realtime subscribers now have access to the Analysis Assistant for itracks Realtime. This AI-enabled tool analyzes the recordings and transcripts of your Realtime projects, allowing you to do things like identify key phrases, topics and named entities, conduct sentiment analysis, translate your transcripts into different languages, and quickly create data visualizations and highlight reels. You can learn more about it here.
With respect to security, using the AI-enabled Analysis Assistant poses no additional risk to your itracks data. I find that when people are asking about the security/privacy implications of incorporating AI, there are two main questions they want to know the answers to:
What are you using to power the AI used in your platform? itracks Analysis Assistant is powered by Microsoft Azure AI Video Indexer. You can read more about it and its data security policies and track record here. In short, it’s a secure platform and Microsoft makes it a big priority to keep it that way.
Will the data from my research be used for machine learning/be fed into the AI/be used by Microsoft? Absolutely not. itracks knows our clients count on us to maintain the strictest levels of data privacy, and that applies to the AI-enabled Analysis Assistant as well. Customer data does NOT feed into the AI models and will never be used for AI training. All data will stay private according to Microsoft’s privacy standards, which you can learn more about here. We have asked direct questions about this in plain language to our partners at Microsoft and have double- and triple-checked multiple times to ensure it continues to be the case.
You’ve embedded consent and participant information into your tools. How do these and any other steps ensure that studies are conducted ethically?
While it will ultimately always be up to the researcher to ensure their studies are conducted ethically, what we can do as a tech provider is to a) ensure that all of the data is handled ethically on our part, and b) make it as easy as possible for the researcher to use our tech while executing ethics essentials like informed consent. Any participant who is taking part in an itracks project will, by default, be presented with the itracks Terms & Conditions so they know what they’re agreeing to as far as how their personal information, etc., is handled. On top of that, the researcher has the option to add their own custom terms and conditions (or, in research terms, a custom consent form). Importantly, this custom consent form can be shown separately on its own page, and requires the participant to scroll through to the end of the document before hitting that “accept” button. That way, instead of people hitting a checkbox beside a hyperlink without knowing what they’re agreeing to, they are forced to actually open and read the consent form before consenting, really hitting on that key “informed” part of “informed consent.” This all happens the first time the participant logs into the platform, so there’s no way for them to slip through the cracks or accidentally participate in the research before providing this consent. Plus, you’ve got a record/proof of this informed consent within the same platform you’re collecting data in, which makes for easy record-keeping.
Our CEO, Garnette Weber, and I hosted a session centered on ethics applications and some how-to’s within the itracks software that may be of additional use; it can be found here:
itracks offers discounts and fantastic support for academic users; contact them for more information.
Protect the data you’ve collected
Again, avoid uploading data to platforms or programs that are not secure. That means strictly avoiding any uploads of raw data to generative AI (LLM) programs, where anything uploaded may be retained and used to train future models. Check the policies and practices of the platforms you are considering for data collection, management, and/or analysis; inquire if the information is not readily available.
Assign pseudonyms or a numerical code so participants’ names are not visible, especially when raw data is shared with research supervisors, co-researchers, or peers.
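As a minimal illustration of this kind of masking (the names, codes, and helper function here are invented for the example, not part of any particular platform), a short script can swap names for codes before a transcript is shared:

```python
import re

# Hypothetical mapping of real names to participant codes.
# Keep this mapping in a separate, secured file, apart from
# the pseudonymized transcripts you share with others.
pseudonyms = {
    "Maria": "P01",
    "Jonas": "P02",
}

def pseudonymize(transcript, mapping):
    """Replace each whole-word occurrence of a name with its code."""
    for name, code in mapping.items():
        # \b word boundaries avoid touching names embedded in longer words
        transcript = re.sub(rf"\b{re.escape(name)}\b", code, transcript)
    return transcript

masked = pseudonymize("Maria: I agree with Jonas.", pseudonyms)
# masked is now "P01: I agree with P02."
```

A script like this only handles names you have listed; nicknames, misspellings, and other identifying details (places, employers) still need a manual read-through before sharing.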
Use a reflexivity framework to weigh technological consequences when choosing and using tools and platforms
Dr. Trena Paulus and Dr. Jessica Lester, co-authors of Doing Qualitative Research in a Digital World (2021), developed an excellent framework for making decisions about which tools and platforms will fit your study. They discuss four categories of consequences that researchers can continually attend to while crafting and revising a digital research workflow.
Consequences are defined as the positive and/or negative, anticipated and/or unanticipated, effects across these categories within a workflow. These categories include consequences for: 1) how research methods are enacted; 2) the design of the technologies; 3) the relationships between humans (researchers and participants); and 4) knowledge production and outcomes of the study. (Paulus & Lester, 2024)
See:
Paulus, T. M., & Lester, J. N. (2023). Digital qualitative research workflows: a reflexivity framework for technological consequences. International Journal of Social Research Methodology, 27(6), 621–634. https://doi.org/10.1080/13645579.2023.2237359 If you don’t have library access, request a copy here.
Paulus, T. M., Pope, E. M., & Bower, K. L. (2024). Ways That Qualitative Researchers Engage in “Technological Reflexivity”: A Meta-Synthesis. Qualitative Inquiry, 31(3-4), 305–318. https://doi.org/10.1177/10778004241231927
If you don’t have library access, request a copy here.
Lester and Paulus co-edited a special Qualitative Inquiry issue: “Qualitative Inquiry in the 20/20s: Exploring the methodological consequences of digital research workflows.” In the introduction the co-editors noted that authors contributing to this issue were asked to explore three questions: What theories should shape our engagement with the digital? How is the digital shifting how we conceptualize methodologies? How can researchers ethically work with/in inequities in technological access? While working at Sage Methodspace I hosted two roundtable discussions with contributors to this special issue.
Research Ethics Resources
The Association of Internet Researchers Ethical Guidelines are freely available online. You will note that there are multiple iterations: each builds on, but does not replace, the earlier set. I encourage you to look carefully at the 2012 and 2019 guidelines to learn more about ethical research with participants as well as with extant or Big Data.
Books
Learn more about ethics in Doing Qualitative Research Online (2022). Use the code UK25BOOKS for a 25% discount on my books when purchased via ebooks.com (digital versions), anywhere in the world.
Learn more about ethics in Research Ethics in the Real World: Euro-Western and Indigenous Perspectives (2018) or Qualitative and Digital Research in Times of Crisis: Methods, Reflexivity, and Ethics, edited by Helen Kara and Su-Ming Khoo. Use the code BUP03 for a 50% discount, valid until the end of March. If you order in April, please use BUP04, which will be valid until the end of April.
See the list compiled by Substack friend Vicky Loras: My PhD Bookstack: Ethics and the PhD - and the Saturday Scholar
Articles
aan het Rot, M., & Wessel, I. (2024). How Making Consent Procedures More Interactive can Improve Informed Consent: An Experimental Study and Replication. Journal of Empirical Research on Human Research Ethics. https://doi.org/10.1177/15562646241280208
Abstract. Prospective research participants do not always retain information provided during consent procedures. This may be relatively common in online research and is considered particularly problematic when the research carries risks. Clinical psychology studies using the trauma film paradigm, which aims to elicit an emotional response, provide an example. In the two studies presented here, 112–126 participants were informed they would be taking part in an online study using a variant of this paradigm. The information was provided across five digital pages using either a standard or an interactive format. In both studies, compared to the control condition, participants in the interactive condition showed more retention of information. However, this was only found for information about which they had been previously asked via the interactive format. Therefore, the impact of adding interactivity to digital study information was limited. True informed consent for an online study may require additional measures.
Araali, B. B. (2011). Perceptions of Research Assistants on How Their Research Participants View Informed Consent and Its Documentation in Africa. Research Ethics, 7(2), 39–50. https://doi.org/10.1177/174701611100700203
Abstract. This paper discusses the issue of informed consent from an African perspective with a particular emphasis on the problems posed by the documentation of consent in the African socio-cultural environment. The paper presents two small-scale surveys which typify and exemplify the African perspective with regard to procedures for obtaining consent (agreement) and for documenting it. To avoid causing moral pain to African research participants and the feeling of having been used as mere sources of data, this paper suggests, as a short-term solution, that African cultural values be incorporated in the procedures and regulations aimed at protecting their human rights. As a long-term solution, the paper encourages universities and research institutions operating in Africa to come up with clear guidelines of ethical conduct which would satisfy both the interests of ordinary Africans (particularly the rural and uneducated population) and legal requirements to which Western research institutions and funding agencies are subjected.
Biggs, J. S., & Marchesi, A. (2015). Information for consent: Too long and too hard to read. Research Ethics, 11(3), 133–141. https://doi.org/10.1177/1747016115583381
Abstract. The length of participant information sheets (ISs) for research and difficulties in their comprehension have been a cause of increasing concern. We aimed to examine the information sheets in research proposals submitted to an Australian HREC in one year, comparing the results with national recommendations and published data. Information sheets in all 86 research submissions were analysed using available software. The work of Flesch was used for Reading Ease or Readability and that of Flesch and Kincaid for the level of education required for comprehension, the Reading Grade Level. The mean length of 86 information sheets was 3110 words; many had more than 5000 words. Using the Flesch scale of 0 to 100, with 0 meaning most difficult and 100 very easy to read, the mean readability level was 47. The mean length of education needed to easily grasp the information was 11.6 years, equivalent to senior secondary school. Information sheets in research projects submitted to an HREC were often too long to be read in a reasonable time and too difficult to be easily understood. Recommended standards for information sheets were infrequently met.
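For readers curious how the readability scores in this abstract are computed, the Flesch Reading Ease and Flesch-Kincaid Grade Level formulas can be sketched in a few lines. This is a rough approximation (the vowel-group syllable counter below is a simplification, not the validated instrument used in published studies):

```python
import re

def flesch_scores(text):
    """Estimate Flesch Reading Ease (0-100-ish, higher = easier)
    and Flesch-Kincaid Grade Level (years of schooling needed)."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)

    def syllables(word):
        # Count vowel groups as a crude syllable estimate
        count = len(re.findall(r"[aeiouy]+", word.lower()))
        if word.lower().endswith("e") and count > 1:
            count -= 1  # drop a likely silent final 'e'
        return max(count, 1)

    wps = len(words) / len(sentences)                    # words per sentence
    spw = sum(syllables(w) for w in words) / len(words)  # syllables per word
    ease = 206.835 - 1.015 * wps - 84.6 * spw
    grade = 0.39 * wps + 11.8 * spw - 15.59
    return round(ease, 1), round(grade, 1)
```

Running this on a short consent paragraph gives a quick sense of whether it approaches the mean readability of 47 (and 11.6 years of schooling) reported above; word processors and readability libraries offer more reliable versions of the same scores.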
Curran, D., Kekewich, M., & Foreman, T. (2019). Examining the use of consent forms to promote dissemination of research results to participants. Research Ethics, 15(1), 1–28. https://doi.org/10.1177/1747016118798877
Abstract. It is becoming widely recognized that dissemination of research results to participants is an important action for the conclusion of a research study. Most research institutions have standardized consent documents or templates that they require their researchers to use. Consent forms are an ideal place to indicate that results of research will be provided to participants, and the practice of inserting statements to this effect is becoming more conventional. In order to determine the acceptance of this practice across Canada we conducted an assessment of 121 institutional consent document templates from 65 institutions (hospitals and universities) looking for language that endorsed results dissemination to participants. About half (51%) of the documents we examined had language included which stated that results should be made available. In an era where research participation in hospital settings and universities is becoming ubiquitous there should be a reciprocal expectation that results should be provided. The success of research should be measured in part by its accessibility and dissemination to all stakeholders.
Davies, H. (2022). Reshaping the review of consent so we might improve participant choice. Research Ethics, 18(1), 3–12. https://doi.org/10.1177/17470161211043703
Abstract. Consent is one necessary foundation for ethical research and it’s one of the research ethics committee’s major roles to ensure that the consent process meets acceptable standards. Although on Oxford ‘A’ REC (an NHS Research Ethics Committee based in the UK) we’ve been impressed by the thought and work put into this aspect of research ethics, we’ve continued to have concerns about the suitability and effectiveness of consent processes in supporting decision making, particularly for clinical trials. There’s poor understanding of what people want to help them decide; current processes don’t provide the best grounding for informed consent and there’s inadequate public involvement. We’ve also found a lack of proportionality with researchers failing to adapt consent procedures in proportion to the burdens and consequences of the study. As a result, people are often not best helped to make an informed choice when asked to join a research study. To address these concerns, we considered how we might improve this aspect of research ethics review. Recognising the central importance of the dialogue between the volunteer and researcher, we’ve drawn up a model or flowchart of what we deem good consent practice, proposing consent should be built around four simple steps:
Step 1: Introducing the study and the choices: helping the potential participants get an overview of the proposal and introducing the key issues.
Step 2: Explaining all the details of the study using the detailed Participant Information Sheet.
Step 3: After a gap, if necessary, reviewing and checking understanding.
Step 4: Reaching agreement and recording consent.
These steps, we believe, could help all involved and this article lays out ways we might improve participant choice while complying with accepted principles and current regulations.
Eeckhout, D., Aelbrecht, K., & Van Der Straeten, C. (2022). Informed Consent: Research Staff’s Perspectives and Practical Recommendations to Improve Research Staff-Participant Communication. Journal of Empirical Research on Human Research Ethics, 18(1-2), 3-12. https://doi.org/10.1177/15562646221146043
Abstract. Informed consent (IC) is the process of communication between research staff and potential research participants. However, ensuring that participants clearly understand what research participation entails, raises significant challenges. The aim of this study is to provide insight into some communication barriers that research staff are confronted with and make practical recommendations to improve communication between research staff and participants. A qualitative research study using semi-structured interviews (n = 13) with research staff from Ghent University Hospital was conducted. Data were transcribed verbatim and coded thematically. Our results indicate that communication- and process-related factors affect the IC process. Emergent recommendations include communication training, more interactive information materials and the use of digital alternatives, increasing general knowledge about research participation and patient- and public involvement.
Klykken, F. H. (2021). Implementing continuous consent in qualitative research. Qualitative Research, 22(5), 795–810. https://doi.org/10.1177/14687941211014366
Abstract. This article examines ways of approaching informed consent as a relationally constituted process in qualitative research practices. It argues that a researcher’s operationalization of informed consent should be coherent with the overall epistemological framework of the project. Based on empirical examples from an ethnographic inquiry in an educational setting, the principle of informed consent is discussed as a reflexive and ethical tool throughout the inquiry, including its pre-fieldwork, fieldwork and post-fieldwork phases. Strategies of explicitly and implicitly (re)negotiated consent and dissent are discussed and illustrated by drawing on some of the recent discussions of continuous consent practices. The article’s conceptualization of a continuous, situated and relational approach to informed consent is also supported by the concepts of response-ability and thinking with care in research ethics.
Holtz, B., Mitchell, K., Adams, R., Grier, C., & Wright, J. (2024). Enhancing comprehension of online informed consent: the impact of interactive elements and presentation formats. Ethics & Behavior, 1–14. https://doi.org/10.1080/10508422.2024.2318546
Abstract. Informed consent, a cornerstone of research ethics, ensures participant protection and informed participation, particularly in online settings. Despite its significance, engagement with online consent forms remains low, underscoring the need for improved presentation strategies. This study investigates the impact of interactive elements and diverse presentation formats on the comprehension and engagement of online informed consent documents among a broad demographic beyond the commonly studied student populations. Employing a between-subjects experimental design, we explored six versions of online consent forms varying in interactivity, readability, and visual formatting to identify optimal strategies for enhancing participant comprehension and engagement.
McInnis, B. J., Pindus, R., Kareem, D., & Nebeker, C. (2024). Considerations for the Design of Informed Consent in Digital Health Research: Participant Perspectives. Journal of Empirical Research on Human Research Ethics, 19(4-5), 175–185. https://doi.org/10.1177/15562646241290078
Abstract. The research team, prospective participants, and written materials all influence the success of the informed consent process. As digital health research becomes more prevalent, new challenges for successful informed consent are introduced. This exploratory research utilized a human-centered design process in which 19 people were enrolled to participate in one of four online focus groups. Participants discussed their experiences with informed consent, preferences for receiving study information and ideas about alternative consent approaches. Data were analyzed using qualitative methods. Six major themes and sixteen sub-themes were identified that included study information that prospective participants would like to receive, preferences for accessing information and a desire to connect with research team members. Specific to digital health, participants expressed a need to understand how the technologies worked and how the volume of granular personal information would be collected, stored, and shared.
Paasche-Orlow, M. K., Taylor, H. A., & Brancati, F. L. (2003). Readability standards for informed-consent forms as compared with actual readability. New England Journal of Medicine, 348, 721-726.
Abstract. Institutional review boards (IRBs) are charged with safeguarding potential research subjects with limited literacy but may have an inadvertent role in promulgating unreadable consent forms. We hypothesized that text provided by IRBs in informed-consent forms falls short of the IRBs' own readability standards and that readability is influenced by the level of research activity, local literacy rates, and federal oversight.
Perrault, E., & Nazione, S. (2016). Informed consent-uninformed participants: Shortcomings of online social science consent forms and recommendations for improvement. Journal of Empirical Research on Human Research Ethics, 11.
Abstract. As informed consent forms continue to lengthen, are these lengthening forms helping to create better informed participants? The aim of this research was to determine whether the length of consent forms affected reading frequency and comprehension, and to provide recommendations on how to improve consent forms in the social sciences so they are more likely to be read. A quasi-experiment was conducted using actual consent forms at two liberal arts schools, one requiring a long form (463 words, n = 73) and one requiring a shorter form (236 words, n = 57). Participants exposed to the shorter form reported fully reading, or at least skimming the form more frequently than those exposed to the longer form. Those exposed to the shorter form also comprehended more of the form's information. The majority of participants indicated consent forms need to be shortened if researchers want future participants to be more likely to read these forms' contents. Additional recommendations are discussed.
Perrault, E. K., & McCullock, S. P. (2019). Concise consent forms appreciated—still not comprehended: Applying revised Common Rule guidelines in online studies. Journal of Empirical Research on Human Research Ethics, 14(4), 299-306. https://doi.org/10.1177/1556264619853453 (Request full-text on ResearchGate)
Abstract. As informed consent documents have historically gotten lengthier, recent revisions to federal Common Rule guidelines now require consent forms that are “concise” and presented in ways that “facilitate comprehension.” The current research sought to apply these guidelines by developing a consent process for an online study that was only 71 words and also allowed participants a choice to either continue directly to the study or learn more about the study to which they were consenting. All participants (100%, N = 429) decided to continue directly to the study, choosing to forgo additional information about the study and the institutional review board (IRB) approval process.
Ripley, K. R., Hance, M. A., Kerr, S. A., Brewer, L. E., & Conlon, K. E. (2018). Uninformed consent? The effect of participant characteristics and delivery format on informed consent. Ethics & Behavior, 28(7), 517-543.
Abstract. Although many people choose to sign consent forms and participate in research, how many thoroughly read a consent form before signing it? Across 3 experiments using 348 undergraduate student participants, we examined whether personality characteristics as well as consent form content, format, and delivery method were related to thorough reading. Students repeatedly failed to read the consent forms, although small effects were found favoring electronic delivery methods and traditional format forms. Potential explanations are discussed and include participant apathy, participants trying to save time by not reading the consent form, and participant assumptions about consent forms.
Quinn, S. C., Garza, M. A., Butler, J., et al. (2012). Improving informed consent with minority participants: Results from researcher and community surveys. Journal of Empirical Research on Human Research Ethics, 7(5), 44-55. https://doi.org/10.1525/jer.2012.7.5.44
Abstract. Strengthening the informed consent process is one avenue for improving recruitment of minorities into research. This study examines that process from two different perspectives, that of researchers and that of African American and Latino community members. Through the use of two separate surveys, we compared strategies used by researchers with the preferences and attitudes of community members during the informed consent process. Our data suggest that researchers can improve the informed consent process by incorporating methods preferred by the community members along with methods shown in the literature for increasing comprehension. With this approach, the informed consent process may increase both participants' comprehension of the material and overall satisfaction, fostering greater trust in research and openness to future research opportunities.
This license enables reusers to copy and distribute the material in any medium or format in unadapted form only, for noncommercial purposes only, and only so long as attribution is given to the creator.
Thanks to Beth Spencer for this badge, indicating that no AI tools were used to create this post!