Chapter Four: Research
Research Terminology
Research is the systematic process of finding out more about something than you already know, ideally so that you can prove a hypothesis, produce new knowledge and understanding, and make evidence-based decisions. What this process looks like depends on the questions you want to answer and what techniques or strategies you use to find the related information. Techniques of collecting, sorting, and analyzing data (or bits of information) are called research methods. The better the tools and more comprehensive the techniques you employ, the more effective your research will be. By extension, the more effective your research is, the more credible and persuasive your argument will be.
The typical kinds of research sources you will use can be grouped into three broad categories:
Primary sources. Data from research you conduct yourself in lab experiments and product testing, or through surveys, observations, measurements, interviews, site visits, prototype testing, or beta-testing. Primary sources can also be published statistical data, historical records, legal documents, firsthand historical accounts, and original creative works.
Secondary sources. Sources that discuss, analyze, and interpret primary sources, such as published research and studies, reviews of these studies, meta-analyses, and formal critiques.
Tertiary sources. Reference sources such as dictionaries, encyclopedias, and handbooks that provide a consolidation of primary and secondary information. These are useful to gain a general understanding of your topic and of major concepts, lines of inquiry, or schools of thought in a field or discipline.
Categories of Data
From your sources, you will acquire primary and secondary data that you will use in your research-driven writing. Table 4.1 distinguishes between these two types of data.
Table 4.1. Primary and secondary data.
Primary Data | Secondary Data |
Data that have been directly observed, experienced, and recorded close to the event. You might create these data yourself through methods such as interviews, surveys, observations, experiments, simulations, or analysis of primary source documents (see “Research Methods and Methodologies” below).

Note: primary research done in an academic setting that includes gathering information from human subjects requires strict protocols and will likely require ethics approval. Ask your instructor for guidance and see the “Human Research Ethics” section below.
|
Data gathered from sources that record, analyze, and interpret primary data, such as published studies, literature reviews, meta-analyses, and formal critiques. It is critical to evaluate the credibility of these sources.
|
Two other common categories of data are quantitative and qualitative data. In general terms, quantitative data is numerically based whereas qualitative data is word-based. Different fields privilege different kinds of data and use them in different ways.
Quantitative data uses numbers to describe information that can be measured quantitatively. This data is used to measure, make comparisons, examine relationships, test hypotheses, explain, predict, or even control. Lab-based fields (such as many STEM fields) tend to emphasize quantitative data.
In contrast, qualitative data uses words to record and describe the data collected. This data often describes people’s feelings, judgments, emotions, customs, and beliefs, which can only be expressed in descriptive words, not in numbers. This data type includes “anecdotal data,” or personal experiences. Text-based fields (such as many humanities fields) tend to prefer qualitative data.
Remember, this distinction is general—there are plenty of excellent counterexamples of STEM fields effectively using qualitative data and humanities fields using quantitative data. Some fields, especially in the social sciences, use both data types.
Research Methods and Methodologies
Data alone, regardless of its type, does not mean anything until you interpret it. The processes that you use to collect, analyze, and organize your data are your research methods.
Research methods are often categorized as quantitative, qualitative, or mixed methods. Some projects, such as lab experiments, require the use of the scientific method of inquiry, observation, quantitative data collection, analysis, and conclusions to test a hypothesis. Other kinds of projects take a more deductive approach and entail gathering both quantitative and qualitative evidence to support a position or recommendation. The research methods you choose will be determined by the goals and scope of your project, and by your intended audience’s expectations.
In terms of data collection, there are a variety of qualitative and quantitative methods available. A list of several common primary data collection methods is provided below. Note that each method follows a specific protocol both to ensure the validity of the data and to protect any human or animal subjects involved. For more on research that uses human participants, see the “Human Research Ethics” section later in this chapter.
Interviews. Interviews are one-on-one or small-group question-and-answer sessions. Interviews will provide detailed information from a small number of people and are useful when you want to get an expert opinion on your topic.
Surveys/Questionnaires. Surveys are a form of questioning that is less flexible than interviews, as the questions are set ahead of time and cannot be changed. Surveys can be in print format or delivered electronically. This method can reach much larger groups of people than interviews, but it results in less detailed responses.
On-site research. This method involves taking organized notes about occurrences at a determined research site. Research sites may be physical locations, such as a local gym or building site, or they may be virtual, such as an online forum or event. Observations allow you to gain objective information, unlike the potentially biased viewpoint reflected in an interview or survey.
Experiments. Whether in the lab or in the field, experiments are designed to test hypotheses and verify previous results. Experiments are prepared by using standard protocols and careful testing in order to protect the researchers and their subjects, as well as to isolate specific variables.
Simulations. Typically designed and run using computer programs, simulations are a type of experiment that tests hypotheses and solutions in a virtual setting that approximates the real world. Simulations are often used when in-person experiments are not feasible (a minimal example appears after this list).
Primary source documents. More common in text-based fields, original written, visual, and/or audio sources can be used to locate specific data for further analysis and interpretation. In this method, the data collected could be words, images, sounds, or movements.
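To give a concrete sense of what a small simulation can look like, here is a minimal sketch in Python. The scenario (a single service counter with assumed arrival and service times) and all of the numbers are illustrative assumptions, not part of any particular study.

```python
import random

# Minimal Monte Carlo sketch: estimate the average customer wait time at a
# single service counter. The arrival and service time distributions are
# illustrative assumptions, not data from a real study.

def simulate_day(num_customers=200, seed=None):
    rng = random.Random(seed)
    counter_free_at = 0.0   # time at which the counter becomes free
    total_wait = 0.0
    arrival = 0.0
    for _ in range(num_customers):
        arrival += rng.expovariate(1 / 2.0)        # a customer arrives every ~2 minutes
        start = max(arrival, counter_free_at)      # service starts when the counter is free
        total_wait += start - arrival              # time spent waiting in line
        counter_free_at = start + rng.uniform(1.0, 4.0)  # service takes 1-4 minutes
    return total_wait / num_customers

# Run the simulation many times and average the results, as you might to test
# a hypothesis about staffing levels before changing anything in person.
runs = [simulate_day(seed=i) for i in range(100)]
print(f"Estimated average wait: {sum(runs) / len(runs):.2f} minutes")
```

In a report, you would document the assumptions behind the model and the number of runs you averaged, just as you would document any other research method.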
Effective Primary Research Design
In a technical and professional writing class, you will likely use a few common primary research methods involving human subjects: surveys, interviews, and on-site research (field, lab, or simulation). While you are not expected to be an expert in any of these methods, you should approach them ethically and thoughtfully so as to protect any participants and generate reliable data that can be generalized.
Survey Questions
When designing surveys, remember the rhetorical situation. What are the goals of your survey? Who are you hoping will complete the survey? What will they know? What will they not know? How long can you expect them to engage with your survey? What is the best method of surveying them (online, say through Google Forms, or in person)? How many responses do you hope to obtain? Use this information to inform the design of your survey and any preliminary materials you include with it. All surveys should feature a clear statement of purpose, specific directions for answering the questions, and information about how to contact the researcher if participants have any questions.
After determining your audience and purpose, you will need to design your questions. Remember, in all online surveys you will not be there to provide immediate clarification, so your questions need to be carefully worded to avoid confusion and researcher bias. As a rule, your survey questions should:
Be as specific as possible. Avoid ambiguity by providing specific dates, events, or descriptors as necessary.
Ask only one question at a time. Specifically, avoid survey questions that require the participant to answer multiple items at once. This will confuse the reader as to what you are looking for and will likely skew your data.
Be neutral. Present your survey questions without leading, inflammatory, or judgmental language. Common leading survey questions that you want to avoid include phrasing like “Do you agree that our enemies are a threat to our way of life?” You will also want to avoid using language that is sexist, racist, or ableist.
Be organized logically. Questions should be presented in a way that makes sense to the participant. For example, if you introduce a concept in Question 1, you do not want to wait to return to it again in Question 12. Follow-up questions and linked questions should be asked in succession rather than separated.
Allow participants to decline answering. In general, you will want to be wary of questions that require participants to divulge sensitive information, even if they are answering anonymously. This information could include details such as trauma, eating disorders, or drug use. For research projects that require these questions, consult your university’s IRB (Institutional Review Board). The IRB may need you to fill out special documentation that accounts for how you will protect your participants.
After designing the questions, you will also need to consider how your participants can answer them. Depending on your goals, you may opt for quantitative data, which includes yes/no questions, multiple choice questions, Likert scales, or ranking. Note that what makes this data “quantitative” is that it can be easily converted into numerical data for analysis. Alternatively, you may opt for qualitative data, which includes questions that require a written response from the participant. A description and some of the advantages of these answer styles follow below, along with a brief sketch after the list showing how such responses can be converted into numbers for analysis:
Yes/No questions (Quantitative). These simple questions allow for comparison but not much else. They can be useful as a preliminary question to warm up participants or open up a string of follow-up questions.
Multiple choice questions (Quantitative). These questions allow for pre-set answers and are particularly useful for collecting demographic data. For example, a multiple choice question might be phrased like this: “How many years have you attended your university?” Depending on the question and potential answers provided, you may wish to allow for a write-in response.
Likert scale (Quantitative). One of the most common answer types, the Likert scale is a rating, usually on a 1-5 scale. At one end of the scale, you will have an option such as “Definitely Agree,” and on the other you will have “Definitely Disagree.” In the middle, if you choose to provide it, is a neutral option. Some answers in this format may use a wider range (1-10, for example), offer a “Not Applicable” option, or remove the neutral option. Be mindful of what these choices might mean. A wider scale could, in theory, provide more nuance, but only if the distinctions between each option are clear.
Ranking (Quantitative). In a ranking-based answer, you provide a list of options and prompt your participant to place them in a certain order. For example, you may be offering five potential solutions to a specific problem. After explaining the solutions, you ask your reader to identify which of the five is the best, which is second best, and so on. Participants may assign these items a number or rearrange their order on a screen.
Written responses (Qualitative). Especially when you want detailed, individualized data, you may ask participants to provide written answers to your questions. This approach is beneficial in that you may receive particularly detailed responses or ideas that your survey did not address. You might also be able to privilege voices that are often drowned out in large surveys. However, keep in mind that many participants do not like responding to essay-style questions. These responses work best as follow-up questions midway or later in the survey.
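Because quantitative answers convert directly into numbers, they are straightforward to summarize. The following minimal sketch in Python shows one way to do that for Likert-scale responses; the scale labels, values, and responses are illustrative assumptions, not data from a real survey.

```python
from statistics import mean, median

# Minimal sketch: convert Likert-scale survey responses into numbers for
# analysis. The labels, values, and responses below are illustrative
# assumptions, not data from a real survey.

SCALE = {
    "Definitely Disagree": 1,
    "Disagree": 2,
    "Neutral": 3,
    "Agree": 4,
    "Definitely Agree": 5,
}

responses = [
    "Agree", "Definitely Agree", "Neutral", "Agree",
    "Disagree", "Agree", "Definitely Agree", "Neutral",
]

# Map each label to its numeric value, then summarize.
scores = [SCALE[r] for r in responses]
print(f"n = {len(scores)}, mean = {mean(scores):.2f}, median = {median(scores)}")

# Frequency counts are often more informative than a single average.
for label, value in SCALE.items():
    print(f"{label}: {scores.count(value)}")
```

Survey tools often perform this conversion automatically, but knowing what it looks like helps you decide how to report and interpret the results.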
Finally, before publishing your survey online or administering it in person, make sure that you conduct preliminary testing; this step is crucial. When seeking feedback, have your reviewers note any confusion or ambiguity in question wording, lack of clarity in question order, typographical errors, technical difficulties, and how long the survey took them to complete. Remember, surveys with unclear questions and sloppy formatting annoy participants and damage your credibility. Conversely, the more professional a survey looks and the easier it is for your reader to complete, the more likely you are to receive useful responses.
Writing Engaging Research Interview Questions
Preparing good interview questions takes time, practice, and testing. Many novice interviewers go into interviews assuming that they do not need to prepare and are merely having a conversation. While this approach can generate information, these interviewers often find that several important questions never get addressed. When designing interview questions, consider not only the content of those questions but also the order in which they appear.
When preparing for an interview, first contact your potential interviewee as soon as possible. Individuals, especially those who work outside academia, may operate on timelines that feel odd to college and university students. You will also want to prepare any equipment (such as a recorder or smartphone, but request permission before recording!), your questions, and IRB approval, if applicable.
Carter McNamara offers the following suggestions for wording interview questions:
Wording should be open-ended. Respondents should be able to choose their own terms when answering questions.
Questions should be as neutral as possible. Avoid wording that might influence answers, e.g., evocative, judgmental wording.
Questions should be asked one at a time. Avoid asking multiple questions at once. If you have related questions, ask them separately as follow-up questions rather than as part of the initial query.
Questions should be worded clearly. This includes using any terms particular to your context or to the respondents’ culture.
Be careful asking “why” questions. This type of question implies a cause-effect relationship that may not truly exist. These questions may also cause respondents to feel defensive, e.g., that they have to justify their response, which may inhibit their responses to this and future questions.[1]
If you choose to have a face-to-face interview or interview over Zoom or Skype, show up on time and dress according to how the interviewee might dress. Honoring the interviewee’s time by being punctual, having prepared questions, and not extending past an established time limit is crucial to both collecting good information and maintaining a positive relationship with the interviewee.
Field Research
When conducting field research, or research that takes you outside of a lab or simulation, you will need to do the following:
Gain appropriate permissions for researching the site. Your “site” is the location where you are conducting research. Sites could include potential locations for a community garden, a classroom where you’re observing student behaviors or a professor’s teaching strategies, or a local business. Certain sites will require specific permission from an owner or other individual to use. Depending on your study, you may also need to acquire IRB permission.
Know what you’re looking for. While people-watching is interesting, your most effective field research will be accomplished if you know roughly what you want to observe. For instance, say you are observing a large lecture from a 100-level class, and you are interested in how students in the class use their laptops, tablets, or phones. In your observation, you would be specifically focusing on the students, with some attention to how they’re responding to the professor. You would not be as focused on the content of the professor’s lecture or on whether the students are doing things that don’t involve electronics, such as doodling or talking to their classmates.
Take notes. Select your note-taking option and prepare backups. While in the field, you will be relying primarily on observation. Record as much data as possible and back up that data in multiple formats.
Be unobtrusive. In field research, you function as an observer rather than a participant. Therefore, do your best to avoid influencing what is happening at the research site.
Data Interpretation
Methods also include ways of interpreting and organizing data, either once it has been collected or simultaneously with data collection. More specific methodologies, such as ways to structure the analysis of your data, include the following:
Coding. Reviewing transcripts of interview data and assigning specific labels and categories to the data. A common social science method (a brief sketch appears after this list).
Cost/benefit analysis. Determining how much something will cost versus what measurable benefits it will create.
Life cycle analysis. Determining the overall sustainability of a product or process, from manufacturing, through lifetime use, to disposal. You can also perform comparative life cycle analyses or analyses of specific life cycle stages.
Comparative analysis. Comparing two or more options to determine which is the “best” given specific problem criteria such as goals, objectives, and constraints.
Process analysis. Studying each aspect of a process to determine if all parts and steps work efficiently together to create the desired outcome.
Sustainability analysis. Using concepts such as the “triple bottom line” or “three pillars of sustainability” to analyze whether a product or process is environmentally, economically, and socially sustainable.
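As a simple illustration of coding, here is a minimal sketch in Python that tallies how often each label appears once interview excerpts have been tagged. The excerpts and code labels are illustrative assumptions; in practice, a coding scheme is developed and refined by the researcher, often with a second coder to check consistency.

```python
from collections import Counter

# Minimal sketch: after assigning codes (labels) to interview excerpts,
# tally how often each code appears. The excerpts and codes below are
# illustrative assumptions, not real interview data.

coded_excerpts = [
    ("I never know when assignments are due.", ["communication", "scheduling"]),
    ("The lab equipment is always booked.", ["resources"]),
    ("Emails go unanswered for days.", ["communication"]),
    ("I can't find a quiet place to study.", ["resources", "facilities"]),
]

# Count every code across all excerpts.
code_counts = Counter(code for _, codes in coded_excerpts for code in codes)

for code, count in code_counts.most_common():
    print(f"{code}: {count}")
```

The counts themselves are not the finding; they point you toward the themes worth discussing and quoting in your report.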
In all cases, the way you collect, analyze, and use data must be ethical and consistent with professional standards of honesty and integrity. Lapses in integrity lead to poor-quality reports and carry serious consequences not only in an academic context (poor grades and academic dishonesty penalties) but also in the workplace, where they can result in lawsuits, job loss, and even criminal charges. Some examples of these lapses include:
- Fabricating your own data (making it up to suit your purpose)
- Ignoring data that disproves or contradicts your ideas
- Misrepresenting someone else’s data or ideas
- Using data or ideas from another source without acknowledgment or citation of the source
A Note About Not Citing
Failing to cite quoted, paraphrased, or summarized sources properly is one of the most common lapses in academic integrity, which is why your previous academic writing classes likely spent considerable time and effort to give you a sophisticated understanding of how and why to avoid plagiarizing, as well as the consequences of plagiarizing.
Human Research Ethics
As defined in the beginning of this chapter, primary research is any research that you do yourself in which you collect raw data directly rather than from articles, books, or Internet sources that have already collected and analyzed the data. If you are collecting data from human participants, you may be engaging in human subjects research. When conducting research with human participants, you must be aware of and follow strict ethical guidelines required by your academic institution. Doing this is part of your responsibility to maintain academic integrity and protect your research subjects.
In the United States, human subjects research is guided by three core principles outlined in the federal government’s Belmont Report: respect for persons, beneficence, and justice.[2]
The first principle, respect for persons, means that researchers must respect the autonomy of research participants and provide protections against coercion, particularly for vulnerable populations. The second principle, beneficence, means that researchers have an obligation to enact the following rules: “(1) do not harm and (2) maximize possible benefits and minimize possible harms.”[3] The third principle, justice, means that the burdens and benefits of research participation should be distributed fairly rather than concentrated heavily on one population. For example, the Belmont Report references the infamous Tuskegee Syphilis Study, in which researchers “used disadvantaged, rural black men to study the untreated course of a disease that is by no means confined to that population,”[4] along with committing a number of other serious ethical violations.[5] Respect for persons, beneficence, and justice guide researchers in ethical research practices.
There are a number of federal agencies that have guidelines and requirements governing human subjects research in addition to the Belmont Report.[6] For example, the Office for Human Research Protections (OHRP), the Food and Drug Administration (FDA), and the National Institutes of Health (NIH) all provide oversight of human subjects research. Colleges and universities have institutional review boards, or IRBs, that review research plans to ensure compliance with government regulations and ethical guidelines.[7] Researchers, including professors and students, are required to seek and receive approval from their campus IRB before conducting human subjects research projects.
The Office for Human Research Protections defines research as “a systematic investigation, including research development, testing, and evaluation, designed to develop or contribute to generalizable knowledge.”[8] The type of research conducted in a college class for the purposes of developing research skills does not always meet this formal definition, because it may not be systematic or generalizable. Students should check with their course instructors to determine whether IRB approval is required for a course research assignment.
Regardless of whether any specific study meets the specific criteria for human subjects research, researchers at every stage of their career should adhere to the core principles of ethical behavior. Even when completing assignments and studies that do not meet the formal definition of human subjects research, researchers and students should abide by the principles of respect for persons, beneficence, and justice.
Below are three common research methods that use human subjects, along with specific guidance that can help you conduct such studies ethically:
Interviews. Provide participants with information about the interview experience and have them sign an informed consent form before you begin. If you plan to record the interviews, either with audio or video, ask for specific consent for the recording. Be sure to inform participants that they may skip questions that they are uncomfortable with or end the interview at any time.
Surveys/Questionnaires. At the beginning of the survey/questionnaire, provide an informed consent section that includes a description of the research project, risks and benefits of participation, and researcher contact information. Participants must consent to participate in order to be included in the study.
Naturalistic observation in non-public venues (or field observations). In naturalistic observations, the goal is to be as unobtrusive as possible, so that your presence does not influence or disturb the normal activities you want to observe. If you want to observe activities in a specific workplace, classroom, or other non-public place, you must first seek permission from the manager of that place and let participants know the nature of the observation. Observations in public places may not require informed consent, though researchers should seek IRB approval. Photographs or videos require specific informed voluntary consent and permission.
These are the most common human subjects research methods used in undergraduate courses. There are many other methods, including engaging with people and their information via social media, organizing focus groups, engaging in beta-testing or prototype trials, etc. These other methods are generally not recommended because they involve additional ethical considerations.[9]
Finding and Evaluating Research Sources
In this “information age,” an overwhelming amount of information is readily available on the Internet. With so much at our fingertips, it is crucial to be able to critically search and sort this information in order to select credible sources that can provide reliable and useful data to support your ideas and convince your audience.
Popular Sources vs Academic Sources
From your previous academic writing courses, you are likely familiar with academic journals and how they differ from popular sources, such as magazines, as shown in Table 4.2. Academic journals contain peer-reviewed articles written by scholars for other scholars, often presenting their original research, reviewing the original research of others, or performing a “meta-analysis” (an analysis of multiple studies that analyze a given topic). Peer reviewing is a lengthy vetting process whereby each article is evaluated by other experts in the field before it is published.
Popular sources, in contrast, are written for a more general audience or following. While these pieces may be well researched, they are usually only vetted by an editor or by others if the writer seeks out additional review. Popular sources tend to be more accessible to wider audiences. They also tend to showcase more visible biases based on their intended readership or viewership. If you would like to refresh your memory on this topic, consult your library or writing center for resources.
Scholarly articles published in academic journals are usually required sources in academic research essays; they are also an integral source for engineering projects and technical reports. Furthermore, many projects require a literature review that collects, summarizes, and sometimes evaluates work that has been recognized as a valuable contribution to the field. Journal articles are usually cited as the major sources in a literature review, though other scholarly documents, including books, conference proceedings, and major reports, may also be incorporated depending on the writer’s discipline.
Journal articles are not the only kind of research you will find useful to your work. Since you are preparing for the workplace and for researching in a professional field, there are many credible kinds of sources you will draw on in a professional context. Table 4.2 lists several types of sources you may find useful in researching your projects.
Table 4.2. Typical research sources for technical projects.
Source Type | Description |
Academic Journals, Conference Papers, Dissertations, etc. | Scholarly (peer-reviewed) academic sources publish primary research done by professional researchers and scholars in specialized fields, as well as reviews of that research by other specialists in the same field.
For example, the Journal of Computer and System Sciences publishes original research papers in computer science and related subjects in system science; the International Journal of Robotics Research is one of the most highly ranked journals in its field. |
Reference Works | Specialized encyclopedias, handbooks, and dictionaries can provide useful terminology and background information.
For example, the Kirk-Othmer Encyclopedia of Chemical Technology is a widely recognized authoritative source. Do not cite Wikipedia or dictionary.com in a technical report. For general information, use accepted reference materials in your field. For dictionary definitions, use either a definition provided in a scholarly source or the Oxford English Dictionary (OED). |
Books, Chapters in Books | Books written by specialists in a given field and containing a References section can be very helpful in providing in-depth context for your ideas.
For example, Designing Engineers by Susan McCahan, et al. has an excellent chapter on effective teamwork. When selecting books, look at the publisher. Publishers affiliated with a university tend to be more credible than popular presses. |
Trade Magazines and Popular Science Magazines | Reputable trade magazines contain articles relating to current issues and innovations, and therefore they can be very useful in determining what is “state of the art” or “cutting edge” at the moment, or in finding out what current issues or controversies are affecting an industry. Examples include Computerworld, Wired, and Popular Mechanics. |
Newspapers (Journalism) | Newspaper articles and media releases can offer a sense of what journalists and people in an industry think the general public should know about a given topic. Journalists report on current events and recent innovations; more in-depth “investigative journalism” explores a current issue in greater detail. Newspapers also contain editorial sections that provide personal opinions on these events and issues.
Choose well-known, reputable newspapers such as The New York Times and The Washington Post. |
Industry Websites (.com) | Commercial websites are generally intended to “sell,” so you have to select information from them carefully. Nevertheless, these websites can give you insights into a company’s “mission statement,” organization, strategic plan, current or planned projects, archived information, white papers, technical reports, product details, cost estimates, etc. |
Government Publications and Public Sector Websites (.gov/.edu/.org) | A vast array of .org, .gov, and .edu sites can be very helpful in supplying data and information. These are often, but not always, public service sites designed to share information with the public. Remember, organizations can also have clear biases and agendas, so keep that context in mind when reviewing them. |
Patents | You may have to distinguish your innovative idea from previously patented ideas. You can look these up and get detailed information on patented or patent-pending ideas. |
Public Presentations | Representatives from industry and government speak to various audiences about current issues and proposed projects. These can be live presentations or video presentations available on YouTube or as TED Talks. |
Other | Some other examples of sources include radio programs, podcasts, social media, Patreon, etc. |
Searching for scholarly and credible sources available to you through an academic library is not quite as simple as conducting a search on a popular Internet search engine.
Evaluating Sources
The importance of critically evaluating your sources for authority, relevance, timeliness, and credibility cannot be overstated. Anyone can put anything on the Internet, and people with strong web and document design skills can make this information look very professional and credible—even if it isn’t. Since much research is currently done online and many sources are available electronically, developing your critical evaluation skills is crucial to finding valid, credible evidence to support and develop your ideas. Some books may be published by presses with specific political agendas, and some journals that look academic or scholarly at first glance do not actually use a robust peer-review process and instead focus on profit. Don’t blindly trust sources without carefully considering their whole context, and don’t dismiss a valuable source simply because it is popular, crowdsourced, or from a nonprofit blog.
A Research Tip
One of the best ways to make sure you establish the credibility of your information is to triangulate your research—that is, try to find similar conclusions based on various data and from multiple sources. Find several secondary source studies that draw the same overall conclusion from different data, and see if the general principles can be supported by your own observations or primary research. For instance, when you are trying to examine how a course is taught, you want to make sure that you speak with the students and the professor, not just one or the other. Getting the most important data from a number of different sources and different types of sources can both boost the credibility of your findings and help your ethos overall.
When evaluating research sources and presenting your own research, be careful to critically analyze the authority, content, and purpose of the material, using questions such as those in Table 4.3. When evaluating sources for use in a document, consider how they will affect your own credibility or ethos.
Table 4.3. Evaluating authority, content, and purpose of information.
Authority | Researchers, Creators, Authors: Who created this source, and what are their credentials and affiliations?

Authoritative sources: written by experts for a specialized audience, published in peer-reviewed journals or by respected publishers, and containing well-supported, evidence-based arguments. Popular sources: written for a general (or possibly niche) public audience, often in an informal or journalistic style, published in newspapers, magazines, and websites with the purpose of entertaining or promoting a product; this evidence is often anecdotal. |
Content | Methods and Data: How were the data collected and analyzed? Are the methods appropriate and clearly explained? Are the data sufficient, current, and relevant enough to support the conclusions drawn? |
Purpose | Intended Use and Intended Audience: Why was the source created (to inform, to persuade, to entertain, or to sell), and for whom? How does that purpose affect the way you can use it? |
When evaluating sources, you will also want to consider how you plan to use a source and why. There are several reasons why you might use a particular source. Perhaps it is a scientific study that supports your claim that emissions from certain types of vehicles are higher than those of others. Or maybe the writer uses particularly effective phrasing or an example that your readers will respond to. Or perhaps it is a graph that effectively shows trends from the past ten years. Source use, like any choice you make when producing a document, should be purposeful.
Confirmation bias refers to consulting (unintentionally or otherwise) only sources that you know will support your idea. Cherry-picking is the use of inadequate or unrepresentative data that supports your position while ignoring substantial amounts of data that contradict it. Comprehensive research instead addresses contradictory evidence from credible sources and uses all of the data to inform its conclusions.
Finally, you will want to consider if you are representing the collected data accurately. As a researcher, you are responsible for treating your sources ethically. Being ethical in this context means both attributing data to its sources and providing accurate context for that data. Occasionally, you may find a specific quotation or data point that seems to support one interpretation; however, once you read the source, you may realize that the writer was describing an outlier or critiquing an incorrect point. A common representation error is claiming that an author says something that they never actually say. In your text, you need to be clear regarding where your information is coming from and where your ideas diverge from those of the source.
Additional Resources:
“Research in Professional Writing and Communication,” by Dr. Teresa Henning. Southwest Minnesota State University.
This chapter was derived from:
Last, Suzan; and Nicole Hagstrom-Schmidt. Howdy or Hello: Technical and Professional Communication. 2nd Edition. College Station, TX: Texas A&M University Libraries, n.d. https://pressbooks.library.tamu.edu/howdyorhello/front-matter/introduction/. Licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
Last, Suzan, with contributors Candice Neveu and Monika Smith. Technical Writing Essentials: Introduction to Professional Communications in Technical Fields. Victoria, BC: University of Victoria, 2019. https://pressbooks.bccampus.ca/technicalwriting/. Licensed under a Creative Commons Attribution 4.0 International License.
1. Sarah LeMire, “ENGL 104 – Composition & Rhetoric (Spring 2022): Scholarly and Popular Sources,” Texas A&M University Libraries Research Guides, accessed February 4, 2022, https://tamu.libguides.com/c.php?g=715043&p=7697893.
2. National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research, The Belmont Report: Ethical Principles and Guidelines for the Protection of Human Subjects of Research, U.S. Department of Health, Education, and Welfare (18 April 1979), https://www.hhs.gov/ohrp/regulations-and-policy/belmont-report/read-the-belmont-report/index.html.
3. Belmont Report.
4. Belmont Report.
5. Elizabeth Nix, “Tuskegee Experiment: The Infamous Syphilis Study,” History, May 16, 2017, updated July 29, 2019, https://www.history.com/news/the-infamous-40-year-tuskegee-study.
6. “Resources,” Texas A&M University Division of Research, accessed February 4, 2022, https://rcb.tamu.edu/humansubjects/resources.
7. “Human Research Protection Program,” Texas A&M University Division of Research, accessed February 4, 2022, https://vpr.tamu.edu/human-research-protection-program/.
8. 45 C.F.R. § 46.102(l), Electronic Code of Federal Regulations: E-CFR Data (19 January 2017), https://www.ecfr.gov/cgi-bin/retrieveECFR?gp=&SID=83cd09e1c0f5c6937cd9d7513160fc3f&pitd=20180719&n=pt45.1.46&r=PART&ty=HTML#se45.1.46_1101.
9. “Human Research Protection Program.”