{"id":331,"date":"2017-03-10T16:36:53","date_gmt":"2017-03-10T21:36:53","guid":{"rendered":"http:\/\/my.dev.vanderbilt.edu\/mathfollowup\/?page_id=331"},"modified":"2020-08-19T10:50:41","modified_gmt":"2020-08-19T15:50:41","slug":"student-direct-assessment-measures","status":"publish","type":"page","link":"https:\/\/my.dev.vanderbilt.edu\/mathfollowup\/resources\/student-direct-assessment-measures\/","title":{"rendered":"Student Direct Assessment Measures"},"content":{"rendered":"<h2><strong>Overall Data Collection Procedures<\/strong><\/h2>\n<p>Student direct assessment data for the Math Follow-Up Study were collected electronically, using either E-Prime or FileMaker software. For each of the computer-based assessments, E-Prime software was used to collect information about students\u2019 accuracy and response times. Assessors obtained student information for all other tasks using FileMaker software. The FileMaker system had built-in checks that notified data collectors instantly if any questions were skipped, if the criterion for the ceiling and\/or basal was not met, etc.<\/p>\n<p>Assessors saved copies of their data and shared them remotely using Accellion, Vanderbilt\u2019s secure file-sharing platform. Individual assessor files were checked and then imported into a master database.<\/p>\n<p>By using a paperless system, we were able to eliminate the need for double-entry of the data, which allowed for quicker access to the final dataset at the conclusion of the data collection period.<\/p>\n<p><strong><u>Student Direct Assessments<\/u><\/strong><\/p>\n<p>All student participants in the\u00a0Math Follow-Up Study also participated in a previous longitudinal study of early math skills. 
The following chart provides an overview of the student direct assessment data collection timepoints for the original study (\u201cScaling Up TRIAD\u201d), as well as the two follow-up grants.<\/p>\n<p><a href=\"https:\/\/cdn-dev.vanderbilt.edu\/t2-my-dev\/wp-content\/uploads\/sites\/1185\/2020\/08\/AssessmentTimeline_Update.png\"><img loading=\"lazy\" decoding=\"async\" class=\"alignleft size-full wp-image-728\" src=\"https:\/\/cdn-dev.vanderbilt.edu\/t2-my-dev\/wp-content\/uploads\/sites\/1185\/2020\/08\/AssessmentTimeline_Update.png\" alt=\"AssessmentTimeline_Update\" width=\"722\" height=\"462\" srcset=\"https:\/\/cdn-dev.vanderbilt.edu\/t2-my-dev\/wp-content\/uploads\/sites\/1185\/2020\/08\/AssessmentTimeline_Update.png 722w, https:\/\/cdn-dev.vanderbilt.edu\/t2-my-dev\/wp-content\/uploads\/sites\/1185\/2020\/08\/AssessmentTimeline_Update-300x192.png 300w, https:\/\/cdn-dev.vanderbilt.edu\/t2-my-dev\/wp-content\/uploads\/sites\/1185\/2020\/08\/AssessmentTimeline_Update-650x416.png 650w\" sizes=\"auto, (max-width: 722px) 100vw, 722px\" \/><\/a><\/p>\n<p>As the chart illustrates, data collection for the follow-up studies began in Spring 2014 when most students were in 5<sup>th<\/sup> grade. Since that time, project staff have directly assessed the math knowledge and math-related skills of more than 500 students annually. From T5 through T8, direct assessments with student participants were administered each spring in two sessions lasting approximately 45 minutes each. 
Beginning with T9 data collection, participants completed one 45-minute assessment session. Assessments were individually administered in a quiet room within the child\u2019s school.<\/p>\n<p>All data collectors completed a rigorous training, practice, and certification process before being sent out into the field to work with study participants. Assessors completed three days of formal training and then were given opportunities to practice the assessments with each other and with project staff. Following practice, data collectors administered their session to a member of the project team. If the pre-appraisal attempts were completed successfully, assessors next recorded a video of themselves administering their session to a consented non-study child. Videos were scored using a rubric, and assessors had to obtain a minimum score of 85% in order to become certified.<\/p>\n<p><strong><span style=\"color: #0000ff\"><a style=\"color: #0000ff\" href=\"https:\/\/cdn-dev.vanderbilt.edu\/t2-my-dev\/wp-content\/uploads\/sites\/1185\/2017\/03\/E-Prime-Certification-Video-Eval-2017.pdf\" target=\"_blank\">Click here to view an example of the video certification form.<\/a><\/span><br \/>\n<\/strong><br \/>\nThe student direct assessment measures administered for this project fit into several broad categories: standardized assessments, nonstandard math assessments, math surveys, domain-general cognitive measures, and math-specific cognitive measures. Information about each of the measures is presented below.<\/p>\n<p>&nbsp;<\/p>\n<h5><strong>Standardized Assessments<\/strong><\/h5>\n<p><strong>KeyMath 3<\/strong> (Connolly, 2007) \u2013 The KeyMath 3 Diagnostic Assessment is a comprehensive, norm-referenced measure of essential mathematical concepts and skills. 
KeyMath content covers the full spectrum of math concepts and skills that are typically taught in kindergarten through tenth grade and can be used with individuals aged 4 \u00bd through 21 years who are functioning at these instructional levels. The instrument is organized into three content areas. From T5 through T9, we administered three of the five subtests in the Basic Concepts area.<\/p>\n<ul>\n<li><strong>Numeration: <\/strong>This subtest measures an individual\u2019s understanding of whole and rational numbers. It covers topics such as identifying, representing, comparing, and rounding one-, two-, and three-digit numbers as well as fractions, decimal values, and percentages. It also covers advanced numeration concepts such as exponents, scientific notation, and square roots.<\/li>\n<li><strong>Algebra: <\/strong>This subtest measures an individual\u2019s understanding of pre-algebraic and algebraic concepts. It covers topics such as sorting, classifying, and ordering by a variety of attributes; recognizing and describing patterns and functions; working with number sentences, operational properties, variables, expressions, equations, proportions, and functions; and representing mathematical relationships.<\/li>\n<li><strong>Geometry: <\/strong>This subtest measures an individual\u2019s ability to analyze, describe, compare, and classify two- and three-dimensional shapes. It also covers topics such as spatial relationships and reasoning, coordinates, symmetry, and geometric modeling.<\/li>\n<\/ul>\n<p><strong>Woodcock-Johnson Achievement Battery III<\/strong> (Woodcock, McGrew, &amp; Mather, 2001) \u2013 The Woodcock-Johnson is a standardized assessment of a range of skills, designed to be used with people ages 2 to 90+. We used the Quantitative Concepts subtest to measure students\u2019 math knowledge during all years of the project. 
During T7 and T8 data collection, we also administered the Letter-Word Identification subtest, which provided information about students\u2019 verbal abilities.<\/p>\n<ul>\n<li><strong>Quantitative Concepts A &amp; B: <\/strong>Quantitative Concepts has two parts and assesses students\u2019 knowledge of mathematical concepts, symbols, and vocabulary, including numbers, shapes, and sequences. It measures aspects of quantitative math knowledge and recognition of patterns in a series of numbers.<\/li>\n<li><strong>Letter-Word Identification: <\/strong>This subtest measures students\u2019 ability to name letters and read words aloud from a list.<\/li>\n<\/ul>\n<p><strong>Comprehensive Mathematical Abilities Test<\/strong> (CMAT; Hresko, Schlieve, Herron, Swain, &amp; Sherbenou, 2003) &#8211;\u00a0The <em>Comprehensive Mathematical Abilities Test<\/em> (CMAT) was developed to assess a broad spectrum of mathematical abilities in the areas of comprehension (reasoning), calculation, and application. The instrument comprises six core subtests, as well as six supplemental subtests. Due to ceiling effects on the KeyMath 3, we\u00a0used the CMAT as a replacement for KeyMath beginning at T10. Of the 12 CMAT subtests, we administered one core subtest (Subtest 5: Problem Solving) and two supplemental subtests (Subtest 7: Algebra, and Subtest 8: Geometry). These measures were selected because they mapped most closely onto the KeyMath measures that were given in previous study years.<\/p>\n<ul>\n<li><strong>Subtest 5: Problem Solving (PS).<\/strong> The Problem Solving subtest measures a student&#8217;s ability to translate a problem stated in English text into a mathematical problem for ultimate solution. Problem situations are chosen from a broad cross-section of real-world situations. Included are items that require the manipulation of operations and combinations of operations, use of formulas, probability, or calculation of percentages and ratios. 
The use of a calculator is permitted on this subtest, which makes the computational items primarily a matter of understanding the nature of the problem rather than basic computational facility.<\/li>\n<li><strong>Subtest 7: Algebra (AL).<\/strong> The Algebra subtest includes solving equations for a single unknown, simplification of polynomials, factoring polynomials, solution of quadratics by factoring and\/or with the use of the quadratic equation, and solution of simultaneous linear equations. The use of a calculator is permitted on this subtest, which makes the computational items primarily a matter of understanding equations rather than basic computational facility.<\/li>\n<li><strong>Subtest 8: Geometry (GE).<\/strong> This subtest measures abilities in Euclidean geometry. Items measure concepts from plane geometry, area, volume, surface area, parallel lines, geometry of the triangle, and geometry of the circle. The use of a calculator is permitted on this subtest, which makes the computational items primarily a matter of understanding the geometric concepts rather than basic computational facility.<\/li>\n<\/ul>\n<p>&nbsp;<\/p>\n<h5><strong>Nonstandard Math Assessments<\/strong><\/h5>\n<ul>\n<li><strong>Functional Thinking<\/strong> \u2013 Developed by Bethany Rittle-Johnson (items taken from Lee, Ng, Bull, Pe, &amp; Ho, 2011), this task consists of 6 \u2018tables\u2019 in which the student has to fill in the missing Input number, Output number, and Rule. The maximum possible score is 18 (3 points per table). Two function rules are addition, two are multiplication, and two are multiply-then-add. 
Functional Thinking was administered to students during T5 and T6 data collection.<\/li>\n<\/ul>\n<p><strong><span style=\"color: #0000ff\"><a style=\"color: #0000ff\" href=\"https:\/\/cdn-dev.vanderbilt.edu\/t2-my-dev\/wp-content\/uploads\/sites\/1185\/2017\/03\/Functional-Thinking-Assessment.pdf\" target=\"_blank\">Click here to view a copy of the Functional Thinking assessment.<\/a><\/span><\/strong><\/p>\n<p>&nbsp;<\/p>\n<h5><strong>Math Surveys<\/strong><\/h5>\n<p><strong>Student\u2019s Feelings About Math (FAM)<\/strong><\/p>\n<p>In the first year of the Middle School Follow-Up Study (T5), we adapted a student survey from Karabenick and Maehr\u2019s (2007) MSP Motivation Assessment Program. Students were asked 10 questions about how much they liked math, how good they were at math, how much they felt like their math teacher cared about them, etc. Responses were on a 5-point scale. The results demonstrated little variability in student responses.<\/p>\n<p><strong>Trends in International Mathematics and Science Study (TIMSS)<\/strong><\/p>\n<p>To address concerns with the FAM survey, we replaced FAM with the TIMSS beginning with T6 data collection\u00a0(Martin, Mullis, &amp; Foy, 2008; Mullis, Martin, Foy, &amp; Arora, 2012). The TIMSS Math was administered at every subsequent assessment timepoint, and at T10, we also began administering the TIMSS Science.<\/p>\n<p>For the TIMSS Math, students were asked 26 questions on a 4-point scale (i.e., Agree a lot, Agree a little, Disagree a little, Disagree a lot) about how they feel about math, including how much they enjoy doing math, how well they think they are doing in math, etc. The TIMSS Math asks students questions very similar to the FAM items (and a greater number than the 10 FAM questions), but students circled their answers on paper rather than responding verbally. 
This process was adopted to provide students with a greater sense of confidentiality as they responded.<\/p>\n<p>The TIMSS Science asks similar questions to the TIMSS Math, but the content is science-focused instead of math-focused. The TIMSS Science also includes one additional question (\u201cI read about science in my spare time.\u201d) that is not included on the TIMSS Math.<\/p>\n<p>Both the TIMSS Math and the TIMSS Science provide national and international comparison data.<\/p>\n<p><strong><span style=\"color: #0000ff\"><a style=\"color: #0000ff\" href=\"https:\/\/cdn-dev.vanderbilt.edu\/t2-my-dev\/wp-content\/uploads\/sites\/1185\/2017\/03\/Student-Feelings-About-Math.pdf\" target=\"_blank\">Click here to view a copy of the Feelings About Math survey.<\/a><\/span><\/strong><\/p>\n<p><strong><span style=\"color: #0000ff\"><a style=\"color: #0000ff\" href=\"https:\/\/cdn-dev.vanderbilt.edu\/t2-my-dev\/wp-content\/uploads\/sites\/1185\/2017\/03\/TIMSS.pdf\" target=\"_blank\">Click here to view a copy of the TIMSS Math survey.<\/a><\/span><\/strong><\/p>\n<p>&nbsp;<\/p>\n<h5><strong>Domain-General Cognitive Measures<\/strong><\/h5>\n<ul>\n<li><strong>Backwards Corsi Blocks<\/strong> (Vandierendonck, Kemps, Fastame, &amp; Szmalec, 2004) \u2013 This task measures a student\u2019s working memory. Different numbers of squares light up in a sequence, and the student must then tap the squares in the reverse order from which they lit up. The task consists of 16 trials in total, organized as 8 items of 2 trials each. The sequence length of squares increases from 2 to 8 across the activity. The score of interest for this measure is the highest span reached, or the longest sequence length administered for which the student got at least 1 trial correct (i.e., the span with at least 50% accuracy). 
Corsi Blocks was administered to students during T5, T6, and T8\u00a0data collection.<\/li>\n<\/ul>\n<p><a href=\"https:\/\/cdn-dev.vanderbilt.edu\/t2-my-dev\/wp-content\/uploads\/sites\/1185\/2017\/03\/CorsiBlocksPicture.png\"><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter size-full wp-image-346\" src=\"https:\/\/cdn-dev.vanderbilt.edu\/t2-my-dev\/wp-content\/uploads\/sites\/1185\/2017\/03\/CorsiBlocksPicture.png\" alt=\"CorsiBlocksPicture\" width=\"203\" height=\"143\" \/><\/a><\/p>\n<ul>\n<li><strong>Hearts &amp; Flowers<\/strong> (Davidson, Amso, Anderson, &amp; Diamond, 2006) \u2013 This task tests a student\u2019s attention shifting and inhibitory control skills. Students are instructed to use the number pad to select either the congruent or incongruent side of the screen based on the following stimuli and rule combinations:<\/li>\n<\/ul>\n<p><a href=\"https:\/\/cdn-dev.vanderbilt.edu\/t2-my-dev\/wp-content\/uploads\/sites\/1185\/2017\/03\/HeartsFlowersPictures.png\"><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter size-full wp-image-347\" src=\"https:\/\/cdn-dev.vanderbilt.edu\/t2-my-dev\/wp-content\/uploads\/sites\/1185\/2017\/03\/HeartsFlowersPictures.png\" alt=\"Hearts&amp;FlowersPictures\" width=\"578\" height=\"104\" \/><\/a><\/p>\n<p style=\"padding-left: 30px\">The Hearts &amp; Flowers task consists of 12 congruent trials, 12 incongruent trials, and 48 mixed trials. It was administered to students at T5, T6, T7, and T8 data collection.<\/p>\n<p>&nbsp;<\/p>\n<h5><strong>Math-Specific Cognitive Measures<\/strong><\/h5>\n<ul>\n<li><strong>Symbolic Number Comparison<\/strong> (Rousselle &amp; No\u00ebl, 2007) \u2013 This task assessed students\u2019 Approximate Number System (ANS) acuity through the simultaneous presentation of two single-digit numbers. Before T6 data collection, the task was modified so that two double-digit numbers were presented. 
Students were required to select via button press which of the two numbers was larger. Scores of interest for this task include the total percent accurate (mean and slope across ratios of the two presented digits), response time for correct responses (mean and slope across ratios of the two presented digits), and a performance score, which includes both response time and error rate. This task was given to students at T5, T6, T7, and T8 data collection.<\/li>\n<\/ul>\n<p><a href=\"https:\/\/cdn-dev.vanderbilt.edu\/t2-my-dev\/wp-content\/uploads\/sites\/1185\/2017\/03\/NumbersPicture.png\"><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter size-full wp-image-350\" src=\"https:\/\/cdn-dev.vanderbilt.edu\/t2-my-dev\/wp-content\/uploads\/sites\/1185\/2017\/03\/NumbersPicture.png\" alt=\"NumbersPicture\" width=\"169\" height=\"112\" \/><\/a><\/p>\n<ul>\n<li><strong>Nonsymbolic Number Comparison<\/strong> (Rousselle &amp; No\u00ebl, 2007) \u2013 This task presents two arrays of dots simultaneously and requires the student to determine which side of the screen contains more dots. Scores of interest for this task include the total percent accurate (mean and slope across ratios of the two presented arrays), response time for correct responses (mean and slope across ratios of the two presented arrays), and a performance score, which includes both response time and error rate.<\/li>\n<\/ul>\n<p style=\"padding-left: 30px\">In T5, this task was administered to all students, and then a slightly modified task was given to roughly half of the students, starting in the middle of the assessment period.<\/p>\n<p style=\"padding-left: 30px\">For T6 and T7, only the modified task (\u2018Color Dots\u2019) was administered. The task was not given following T7. 
In the modified task, two arrays of dots were presented simultaneously, one on the left side of the screen containing yellow dots and one on the right side containing blue dots.<\/p>\n<p style=\"padding-left: 30px\">Example stimuli from the original and revised tasks are shown below.<\/p>\n<p style=\"padding-left: 30px\"><a href=\"https:\/\/cdn-dev.vanderbilt.edu\/t2-my-dev\/wp-content\/uploads\/sites\/1185\/2017\/03\/DotsPicture.png\"><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter size-full wp-image-351\" src=\"https:\/\/cdn-dev.vanderbilt.edu\/t2-my-dev\/wp-content\/uploads\/sites\/1185\/2017\/03\/DotsPicture.png\" alt=\"DotsPicture\" width=\"582\" height=\"126\" \/><\/a><\/p>\n<ul>\n<li><strong>Nonsymbolic and Symbolic Number<\/strong> <strong>Comparison<\/strong> (Lyons, Ansari, &amp; Beilock, 2012) \u2013 This task was introduced for T6 data collection, and a modified version of the task was administered to students during data collection at T7 and T8. Students were presented with a symbolic number and a nonsymbolic numerosity (i.e., a set of dots) and were required to select which of the two simultaneously presented stimuli represented the larger quantity.<\/li>\n<\/ul>\n<p style=\"padding-left: 30px\">During year 3 (T7) of the\u00a0Middle School Follow-Up Study, the task was revised, and this revised task was also given during T8 data collection. In the revised task, students were first presented with stimuli consisting of a symbolic number at the top of the screen and two nonsymbolic numerosities at the bottom of the screen. They were instructed to select via button press which of the nonsymbolic numerosities matched the symbolic number. For the second half of the task, the stimuli were changed to show a nonsymbolic numerosity at the top of the screen and two symbolic numbers at the bottom of the screen. 
Students were instructed to select via button press which of the symbolic numbers matched the nonsymbolic numerosity at the top of the screen.<\/p>\n<p style=\"padding-left: 30px\">Example stimuli from the original and revised tasks are shown below.<br \/>\n<a href=\"https:\/\/cdn-dev.vanderbilt.edu\/t2-my-dev\/wp-content\/uploads\/sites\/1185\/2017\/03\/MappingPicture.png\"><br \/>\n<img loading=\"lazy\" decoding=\"async\" class=\"aligncenter size-full wp-image-353\" src=\"https:\/\/cdn-dev.vanderbilt.edu\/t2-my-dev\/wp-content\/uploads\/sites\/1185\/2017\/03\/MappingPicture.png\" alt=\"MappingPicture\" width=\"583\" height=\"179\" \/><\/a><\/p>\n<ul>\n<li><strong>Numeral Ordering<\/strong> (Lyons &amp; Ansari, 2015; Lyons, Price, Vaessen, Blomert, &amp; Ansari, 2014) \u2013 Numeral Ordering was\u00a0administered during T8 data collection. For the first half of the task, students were instructed to indicate via button press whether or not a sequence of one-digit numbers was presented in order from left to right. During the second half of the task, students were shown two-digit numbers and asked to indicate whether or not they appeared on the screen in order from left to right.<\/li>\n<\/ul>\n<p><a href=\"https:\/\/cdn-dev.vanderbilt.edu\/t2-my-dev\/wp-content\/uploads\/sites\/1185\/2017\/03\/NumeralOrderingPicture.png\"><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter size-full wp-image-354\" src=\"https:\/\/cdn-dev.vanderbilt.edu\/t2-my-dev\/wp-content\/uploads\/sites\/1185\/2017\/03\/NumeralOrderingPicture.png\" alt=\"NumeralOrderingPicture\" width=\"320\" height=\"84\" \/><\/a><\/p>\n<ul>\n<li><strong>Nonsymbolic Enumeration Fluency<\/strong> (Starkey, 2014; Starkey &amp; McCandliss, 2014) \u2013 This task was given at T5 and T6. 
It comprises a practice block and two primary test components.<\/li>\n<\/ul>\n<p style=\"padding-left: 30px\">Before the test trials began, students completed 27 practice trials in which they were asked only to press the number on their keypad that matched the numeral on the screen; this was done to prime students for this method of responding.<\/p>\n<p style=\"padding-left: 30px\">During the first test section, students were presented with collections of dots to be enumerated under instructions for speeded responses. Students were presented with 64 (T5) or 76 (T6) trials in which they were instructed to rapidly indicate via button press the exact number of dots presented simultaneously on a computer screen. Arrays ranged from 1 to 9 in set size. Dot size, density, perimeter, and surface area were varied so as not to provide accurate cues to numerosity. Primary scores of interest from this measure are mean accuracy, mean response time (for correct responses only), and slopes of median response times across set sizes, computed separately for random presentations of 1-3 (an indicator of subitizing ability) and for random and grouped presentations of 5-7 (the contrast between the two indicates a student\u2019s groupitizing skill).<\/p>\n<p style=\"padding-left: 30px\">In the second test section, students were presented with 24 (T5) or 40 (T6) screens that had 2 or 3 numerals on them, and students had to press the number key corresponding to the sum of the numbers on the screen. 
Sums ranged from 5 to 9.<\/p>\n<p style=\"padding-left: 30px\">Example stimuli from each section of the task are shown below.<\/p>\n<p><a href=\"https:\/\/cdn-dev.vanderbilt.edu\/t2-my-dev\/wp-content\/uploads\/sites\/1185\/2017\/03\/GroupitizingPicture.png\"><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter size-full wp-image-355\" src=\"https:\/\/cdn-dev.vanderbilt.edu\/t2-my-dev\/wp-content\/uploads\/sites\/1185\/2017\/03\/GroupitizingPicture.png\" alt=\"GroupitizingPicture\" width=\"326\" height=\"404\" \/><\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Overall Data Collection Procedures Student direct assessment data for the Math Follow-Up Study were collected electronically, using either E-Prime or FileMaker software. For each of the computer-based assessments, E-Prime software was used to collect information about students\u2019 accuracy and response times. Assessors obtained student information for all other tasks using FileMaker software. The FileMaker system&#8230;<\/p>\n","protected":false},"author":552,"featured_media":0,"parent":365,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"footnotes":""},"tags":[],"class_list":["post-331","page","type-page","status-publish","hentry"],"_links":{"self":[{"href":"https:\/\/my.dev.vanderbilt.edu\/mathfollowup\/wp-json\/wp\/v2\/pages\/331","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/my.dev.vanderbilt.edu\/mathfollowup\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/my.dev.vanderbilt.edu\/mathfollowup\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/my.dev.vanderbilt.edu\/mathfollowup\/wp-json\/wp\/v2\/users\/552"}],"replies":[{"embeddable":true,"href":"https:\/\/my.dev.vanderbilt.edu\/mathfollowup\/wp-json\/wp\/v2\/comments?post=331"}],"version-history":[{"count":39,"href":"https:\/\/my.dev.vanderbilt.edu\/mathfollowup\/wp-json\/wp\/v2\/pages\/331\/revisions"}],"predecessor-version":[{"id":729,"hre
f":"https:\/\/my.dev.vanderbilt.edu\/mathfollowup\/wp-json\/wp\/v2\/pages\/331\/revisions\/729"}],"up":[{"embeddable":true,"href":"https:\/\/my.dev.vanderbilt.edu\/mathfollowup\/wp-json\/wp\/v2\/pages\/365"}],"wp:attachment":[{"href":"https:\/\/my.dev.vanderbilt.edu\/mathfollowup\/wp-json\/wp\/v2\/media?parent=331"}],"wp:term":[{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/my.dev.vanderbilt.edu\/mathfollowup\/wp-json\/wp\/v2\/tags?post=331"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}