Analysis of the Effect of Peer Feedback on Written Argumentation
Mary Tess Urbanek
Author
07/28/2020
Description
This video is a poster presentation on the effect that providing peer feedback has on the rate of student revision in the general chemistry context.
- [00:00:00.970]Hello everyone.
- [00:00:01.920]My name is Mary Tess Urbanek.
- [00:00:03.170]And today I will be discussing an Analysis of the Effect
- [00:00:05.530]of Peer Feedback on Written Argumentation
- [00:00:07.360]in the General Chemistry Classroom.
- [00:00:09.350]Written argumentation in the science classroom
- [00:00:11.290]is a crucial step in developing students into scientists
- [00:00:13.740]who can not only analyze and interpret scientific data,
- [00:00:16.340]but also communicate those findings
- [00:00:17.710]to those in the community.
- [00:00:19.250]While written argumentation is known to be an important
- [00:00:21.240]practice for students to engage in,
- [00:00:22.970]students often find it difficult to support their assertions
- [00:00:25.300]with evidence and reasoning.
- [00:00:26.810]Peer review is a common and well-supported writing technique
- [00:00:29.270]that can be used to improve a student's ability to reflect
- [00:00:31.620]on their own writing and understanding.
- [00:00:33.450]However, little is known regarding how peer review can
- [00:00:35.570]support written argumentation in the science classroom.
- [00:00:39.340]This research study aims to investigate how students
- [00:00:41.500]participate in drafting, reviewing
- [00:00:43.160]and revising written arguments when given the opportunity
- [00:00:45.343]to provide peer feedback
- [00:00:47.290]and how those participation patterns correlate
- [00:00:49.170]to revision rates.
- [00:00:51.270]Students enrolled in an introductory chemistry course
- [00:00:53.470]were tasked with working through a series of prompts
- [00:00:55.300]concerned with finding the optimal conditions
- [00:00:57.060]for chlorine dioxide production.
- [00:00:58.830]The assignment consisted of five prompts in total.
- [00:01:00.960]In prompts one through four, students
- [00:01:02.420]were shown a dataset similar to the one shown here
- [00:01:04.419]and asked to draw a conclusion as well
- [00:01:06.380]as cite specific evidence from it.
- [00:01:08.170]The last prompt, shown here as question five,
- [00:01:10.040]asks the students to essentially combine all
- [00:01:11.840]their previous arguments
- [00:01:12.790]into one cohesive recommendation regarding the optimal
- [00:01:15.360]conditions for chlorine dioxide production.
- [00:01:17.735]Students submitted their initial answer to prompt five
- [00:01:20.590]through Canvas as draft one.
- [00:01:22.050]Students then were assigned three arguments to read
- [00:01:24.030]and critique as part of a peer review process.
- [00:01:26.180]The students were then given the opportunity
- [00:01:27.900]to revise their draft one and submit the revised draft
- [00:01:30.170]as draft two.
- [00:01:31.300]Analysis of the arguments was broken down
- [00:01:33.130]into three different categories.
- [00:01:34.670]The first category dealt with the structure of the argument.
- [00:01:37.160]In this step the arguments were analyzed for a claim,
- [00:01:39.150]evidence and reasoning.
- [00:01:40.470]The peer reviews and revisions made by the students were
- [00:01:42.530]also coded based off of their content and specificity.
- [00:01:45.364]Each analysis was completed using validated rubrics
- [00:01:48.660]and once each variable was analyzed cluster analysis
- [00:01:51.080]was applied to examine the data for correlations.
- [00:01:53.863]The first component of data analysis was focused
- [00:01:56.460]on the claim, evidence and reasoning
- [00:01:58.190]that are contained within an argument.
- [00:02:00.020]A claim is defined as something that the student asserts as
- [00:02:02.290]the truth, but would require more information in order
- [00:02:04.430]to be accepted as such.
- [00:02:06.050]A claim can be supported by evidence and reasoning.
- [00:02:08.530]Evidence is defined as data, descriptions, or conclusions,
- [00:02:10.860]and reasoning is defined as additional information
- [00:02:13.520]that is used to connect the evidence to the claim
- [00:02:15.610]or provide additional context to said claim.
- [00:02:17.930]In order to apply cluster analysis to the data,
- [00:02:20.660]each argument was assigned a numerical value based off
- [00:02:22.980]of the content of the argument shown here as Table one.
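The numerical coding described above can be sketched as a simple lookup. The specific values below are hypothetical, chosen only to be consistent with the narration (an argument with claim and evidence scores near 3, evidence alone near 2); the actual scheme is the poster's Table One.

```python
# Hypothetical numeric coding of argument structure, loosely modeled
# on the narration; the real rubric values are in the poster's Table One.
def code_argument(has_claim, has_evidence, has_reasoning):
    if has_claim and has_evidence and has_reasoning:
        return 4  # fully supported argument
    if has_claim and has_evidence:
        return 3  # claim backed by evidence, no reasoning
    if has_evidence:
        return 2  # evidence only
    if has_claim:
        return 1  # unsupported claim
    return 0      # none of the three components

print(code_argument(True, True, False))  # claim + evidence -> 3
```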
- [00:02:27.800]The peer reviews left by students were separated
- [00:02:29.700]into nine different categories based on both the content
- [00:02:32.060]and specificity of the peer review.
- [00:02:33.940]A peer review that consisted of a summary
- [00:02:35.630]of the student's argument or contained only positive
- [00:02:37.830]statements was classified as praise/summary.
- [00:02:40.550]An argument that pointed out a flaw
- [00:02:41.980]or offered a counterargument was classified
- [00:02:43.760]as a problem and a review that offered a suggestion
- [00:02:46.070]for the writer to implement was classified as a solution.
- [00:02:48.870]The specificity of the peer reviews was captured
- [00:02:50.645]using the terms low prose and high prose.
- [00:02:53.220]A peer review that was broad, general or superficial
- [00:02:55.500]was classified as low prose and a peer review
- [00:02:57.370]that was specific in nature was classified as high prose.
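One way to picture the review coding is as a mapping from (content, specificity) pairs to numbers. The codes below are hypothetical, chosen only to match the narrated averages (praise/summary near 1, a mix of low- and high-prose problems near 2.5); the transcript mentions nine categories in total, of which only a subset is sketched here, and the real scheme is the poster's Table Two.

```python
# Hypothetical peer-review codes by content and specificity;
# the actual nine-category scheme lives in the poster's Table Two.
REVIEW_CODES = {
    ("praise/summary", None):   1,
    ("problem", "low prose"):   2,
    ("problem", "high prose"):  3,
    ("solution", "low prose"):  4,
    ("solution", "high prose"): 5,
}

# one low-prose and one high-prose problem average out to 2.5,
# matching the narrated "between low prose and high prose problems"
reviews = [("problem", "low prose"), ("problem", "high prose")]
codes = [REVIEW_CODES[r] for r in reviews]
print(sum(codes) / len(codes))
```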
- [00:03:01.410]There are five different levels that student revisions could
- [00:03:03.470]be classified under in this analysis.
- [00:03:05.720]A revision that received a minor cosmetic code
- [00:03:07.925]meant that the revision included grammatical changes
- [00:03:10.950]or rewording of the sentence.
- [00:03:12.830]A revision that received a major revision code meant that
- [00:03:15.410]their revision included a change in the meaning
- [00:03:17.550]of the argument between drafts.
- [00:03:20.170]K-means cluster analysis was used to
- [00:03:21.920]uncover patterns within the data by separating the data
- [00:03:24.380]into clusters that maximize differences between clusters
- [00:03:26.950]and minimize differences within the clusters.
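The assign-then-update loop described above can be sketched in a few lines of numpy. This is a minimal Lloyd's-algorithm illustration on synthetic two-dimensional data, not the authors' actual analysis pipeline (the poster's workflow used NbClust, which is an R package).

```python
import numpy as np

def kmeans(X, k=2, iters=100):
    """Minimal K-means (Lloyd's algorithm): alternately assign each
    point to its nearest center, then move each center to the mean
    of its assigned points, until the centers stop moving."""
    # farthest-first initialisation: start from X[0], then repeatedly
    # take the point farthest from the centers chosen so far
    centers = X[:1]
    for _ in range(k - 1):
        d = np.min(np.linalg.norm(X[:, None] - centers[None], axis=2), axis=1)
        centers = np.vstack([centers, X[np.argmax(d)]])
    for _ in range(iters):
        # assignment step: index of the nearest center for every point
        labels = np.argmin(
            np.linalg.norm(X[:, None] - centers[None], axis=2), axis=1)
        # update step: mean of each cluster's members
        new = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers

# two synthetic blobs (sizes echoing the study's 52/45 split,
# but the data itself is made up) separate cleanly into k=2 clusters
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (52, 2)), rng.normal(4, 0.3, (45, 2))])
labels, centers = kmeans(X, k=2)
```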
- [00:03:29.410]The optimum number of clusters was determined
- [00:03:31.270]through NbClust and a Hopkins statistic was used
- [00:03:33.560]to assess the clustering tendency of the dataset.
- [00:03:36.100]A value of 0.61 was obtained,
- [00:03:38.220]which was above the needed threshold.
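The Hopkins statistic compares nearest-neighbor distances from uniformly scattered probe points against those from points sampled within the data itself; values near 0.5 suggest random structure, while values approaching 1 suggest a clusterable dataset. A minimal numpy sketch of the standard formulation follows (an illustration, not the implementation used in the study):

```python
import numpy as np

def hopkins(X, m=None, seed=0):
    """Hopkins statistic H = sum(u) / (sum(u) + sum(w)), where u are
    nearest-neighbor distances from uniform probe points to the data
    and w are nearest-neighbor distances within the data itself."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    m = m or max(1, n // 10)  # size of the probe sample
    lo, hi = X.min(axis=0), X.max(axis=0)

    # u: distances from m uniform points (in the bounding box) to the data
    U = rng.uniform(lo, hi, size=(m, d))
    u = np.array([np.min(np.linalg.norm(X - p, axis=1)) for p in U])

    # w: distances from m sampled data points to the rest of the data
    idx = rng.choice(n, size=m, replace=False)
    w = np.array([
        np.min(np.linalg.norm(np.delete(X, i, axis=0) - X[i], axis=1))
        for i in idx
    ])
    return u.sum() / (u.sum() + w.sum())

# two tight, well-separated synthetic blobs score clearly above 0.5
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.2, (50, 2)), rng.normal(5, 0.2, (50, 2))])
h = hopkins(X)
```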
- [00:03:40.010]The image shown here is a Bi-plot
- [00:03:41.550]for the two cluster solution that was used
- [00:03:43.150]to extract existing correlations from the data.
- [00:03:46.260]Table Four shows the average values each cluster earned
- [00:03:48.450]from the various criteria.
- [00:03:49.790]It's important to note that in this study,
- [00:03:51.270]the data analysis was done from the perspective
- [00:03:52.972]of the reviewer.
- [00:03:54.260]In total, there were 97 participants in the cluster analysis
- [00:03:56.908]where 52 participants were ultimately placed
- [00:03:59.067]in Cluster One and 45 participants were placed
- [00:04:01.830]in Cluster Two.
- [00:04:03.400]The Draft One CER column averaged 3.4 for Cluster One.
- [00:04:06.990]What this means is that the reviewers in Cluster One had
- [00:04:09.120]an average code of 3.4 for the structure
- [00:04:11.060]of their own argument.
- [00:04:12.330]From Table One in the data analysis section,
- [00:04:14.410]we can see that this corresponds to an argument
- [00:04:16.260]that contained claim and evidence.
- [00:04:18.040]The Peer Review One CER refers to the quality
- [00:04:20.270]of the first argument that the student was critiquing.
- [00:04:22.710]For Cluster One this value was 3.3,
- [00:04:24.610]which Table One would classify as an argument containing
- [00:04:26.830]claim and evidence.
- [00:04:28.060]The column Peer Review One Code refers to the quality
- [00:04:30.380]of the first peer review left by the reviewer.
- [00:04:32.470]Cluster One had a value of 2.5,
- [00:04:34.270]which according to Table Two indicated
- [00:04:36.010]that these peer reviews were between low prose
- [00:04:37.740]and high prose problems.
- [00:04:39.500]The Column Peer Review Two CER refers to the quality
- [00:04:42.180]of the second argument that the reviewer critiqued.
- [00:04:45.590]In the case of Cluster One, this value was 2.8,
- [00:04:47.960]which according to Table One would be an argument
- [00:04:49.810]that contained just evidence or claim and evidence.
- [00:04:52.350]The column Peer Review Two Code refers to the quality
- [00:04:54.700]of the peer review the reviewer gave
- [00:04:56.110]to the second argument they critiqued.
- [00:04:57.960]In this case, Cluster One received a score of 1.2,
- [00:05:00.580]which according to Table Two would indicate that
- [00:05:02.320]their second peer review consisted
- [00:05:03.760]of praise/summary reviews.
- [00:05:05.710]Peer Review Three CER refers to the quality
- [00:05:07.890]of the third argument that the reviewer analyzed.
- [00:05:10.180]For Cluster One this value was 2.9,
- [00:05:12.080]which according to Table One would indicate the
- [00:05:13.930]arguments mainly consisted of claim and evidence.
- [00:05:16.351]The column Peer Review Three Code refers to the quality
- [00:05:18.905]of the review that the reviewer left the third argument.
- [00:05:21.780]Cluster One had an average value of 1.3,
- [00:05:24.000]which according to Table Two means
- [00:05:25.330]that they left praise/summary reviews.
- [00:05:27.265]Cluster One had a revision value of 0.2,
- [00:05:30.380]which according to Table Three indicates
- [00:05:32.040]that the reviewers in Cluster One typically did
- [00:05:33.730]not make any revision between their drafts.
- [00:05:36.053]Figure Two shows the participation patterns
- [00:05:38.470]of each of the clusters extracted
- [00:05:39.940]from the cluster analysis data above.
- [00:05:42.040]Participants in Cluster One are in blue while participants
- [00:05:44.420]in Cluster Two are in yellow.
- [00:05:45.890]The Cluster One reviewer is assigned to review an argument
- [00:05:48.260]that has a lower quality score than their own argument,
- [00:05:50.740]which is referred to as a downward comparison.
- [00:05:53.070]Based on the results from the cluster analysis,
- [00:05:54.692]this reviewer is likely to offer non-substantive peer
- [00:05:57.490]reviews to the arguments they critique.
- [00:05:58.890]Once the peer reviews have been submitted,
- [00:06:00.330]the reviewers were given the opportunity
- [00:06:01.800]to revise their draft based off the arguments they critiqued.
- [00:06:04.370]Cluster One participants were unlikely
- [00:06:06.160]to make any revision between their drafts.
- [00:06:08.210]On the other hand, Cluster Two participants were assigned
- [00:06:10.610]to review an argument that was higher in argument quality
- [00:06:12.870]than their own argument, referred to
- [00:06:14.320]as an upward comparison.
- [00:06:15.910]These participants were likely
- [00:06:17.100]to leave substantive peer reviews
- [00:06:18.390]for the arguments they critiqued.
- [00:06:19.840]The participants in Cluster Two were likely
- [00:06:21.620]to make some revision between their two drafts.
- [00:06:24.540]The cluster analysis used to analyze the student
- [00:06:26.840]participation pattern suggests that a correlation exists
- [00:06:29.340]between the quality of the arguments the student reviews
- [00:06:31.570]and the revisions they're likely to make.
- [00:06:33.460]Future studies need to be done to see if the patterns exist
- [00:06:35.810]in a larger dataset and should incorporate a variable
- [00:06:37.924]regarding the quality of the content of the claim,
- [00:06:40.170]evidence and reasoning of the arguments.
- [00:06:42.910]To view sources and acknowledgements,
- [00:06:44.580]please scan the QR code seen on the screen.
- [00:06:46.560]Thank you for listening to my presentation.