|Posted by Sharon DeNunzio on July 25, 2017 at 9:40 AM||comments (0)|
|Posted by Sharon DeNunzio on May 25, 2017 at 9:15 PM||comments (0)|
According to Stanford PhD researcher Patricia Chen, incorporating the study "hacks" of metacognition and self-regulation can readily lift a B student into the A range. 1
Imagine what these tools can do for test prep!
Metacognition refers to students' thinking about HOW they are going to study, WHAT resources and tools to use, and HOW to prioritize and organize their habits. Self-regulation refers to students having strategies to review their own work, which helps them determine key ways to improve their learning.
These tools -- metacognition and self-regulation -- should be taught to your students when they participate in a structured test prep program with an experienced tutor or teacher. For example, students will learn that the verbal sections of the SAT and ACT are all about ELIMINATION, not just choosing the right answer. In other words, it is critical that a student can justify why each answer not chosen is wrong.
Let’s take an example in SAT Writing or ACT English. We have the following question —
Every summer, one of the country’s largest craft festivals bring together artists from every state….
A) NO CHANGE
B) bringing
C) brought
D) brings
Students who are "untrained" will simply read each choice and pick the answer they believe "sounds" better. That is an acceptable fallback, but to work in metacognition, a good tutor will train the student to skim the answer choices after identifying the part of speech (a verb) that is underlined. Noting that the choices include both plural and singular verbs is a tip-off that this question is testing Subject-Verb Agreement. Correctly identifying WHAT the question is testing is the first half of thinking about how to study and practice. The next step is to apply the rules for identifying the subject and to know that the pronoun "one" (the subject) is singular. Since "one" is singular, it must be paired with the singular verb "brings" -- answer D. Further, the trained student would eliminate B because an "-ing" verb is not an action verb. And although "brought" (C) agrees with the subject "one," a trained student would check the verbs in the rest of the paragraph, find that it is written in the present tense rather than the past, and rule out choice C.
Now another one –
I planned on gathering five of my friends for a 30-mile bike ride through the winding hills; however, I was having trouble with my brakes. Fortunately, one of my friends, Walt, agreed to come over and help me the next day.
A) NO CHANGE
B) one of my friends, Walt
C) one of my friends Walt,
D) one of my friends Walt
A trained student here will skim the answer choices and realize that she is dealing with a comma issue surrounding names and essential/non-essential clauses. Again, it is critical that a student has been trained to identify WHAT the question is testing! A trained student will know that the name will have either NO commas (essential clause) or a pair of commas (non-essential clause). That automatically eliminates B and C. Since the description "one of my friends" is not specific enough to refer to only one person, "Walt" is essential and should not have any commas surrounding it. This is answer D.
Beyond the teaching of HOW to think about EACH individual question on these standardized tests, an experienced tutor or teacher should help a student prioritize WHAT practice is most beneficial and HOW to organize study habits. Using the test-maker’s practice tests is key, but supplementing with the best materials for content drills and content review is also important.
Remember that even parents can help their children learn metacognition by offering observations and asking simple questions such as, "What you did last month to study for the test didn't work too well. What could you do differently or more effectively to prep for this test?"
How can you best help your children in their test prep journey?
Do the tutors and teachers you hire teach metacognition and self-regulation?
|Posted by Sharon DeNunzio on May 23, 2017 at 3:50 PM||comments (0)|
Tip #1 –
Plan to take the April ACT during junior year, and plan to make this at least your student’s second or third test. Why? The April test appears to be a favorable test based on my student history because
• most students are in a groove at that point in their junior year,
• the April test historically is somewhat “easier” than the other two released tests in December and June, and
• most students have experienced at least one test by then, kicking the anxiety to the curb and allowing a student to focus on the sections most needed for score improvement.
Tip #2 –
Plan to include several mock tests in the journey. Baseline testing helps the tutor focus on the most important material that will create the greatest score improvement opportunities. Mock testing during the journey will help monitor progress and continue to refine the areas of focus for practice and tutoring.
Here are a few highlighted case studies from this year that show why I recommend planning for an April test –
|Posted by Sharon DeNunzio on February 21, 2017 at 11:10 AM||comments (3)|
Nicole Oringer, Partner and College Counselor at Ivy Education Services, recently alerted us to inappropriate requests at NJ test sites during the February ACT administration to have students clear their calculators' RAM. 1
This request clearly goes against the ACT's existing calculator policy: https://www.act.org/content/act/en/products-and-services/the-act/taking-the-test/calculator-policy.html
This request would also go against the SAT's existing calculator policy: https://collegereadiness.collegeboard.org/sat/taking-the-test/calculator-policy.html
I frequently encourage my students to store strategies and formulas, some executable and some look-ups, under the PRGM key on their calculators. These look-ups are primarily helpful as a student practices and prepares for a real test. However, they can also be very useful when taking a real test – whether it is the ACT, the SAT, or the SAT math subject tests.
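For illustration, here is the kind of short "executable" formula program a student might store -- a quadratic-formula solver. It is sketched in Python rather than TI-BASIC purely for readability; on a real calculator the equivalent would live as a program under the PRGM key.

```python
import math

def quadratic_roots(a, b, c):
    """Quadratic formula: the kind of short stored program a student
    might keep handy. Returns both real roots, or None if the
    discriminant is negative (no real roots)."""
    disc = b * b - 4 * a * c
    if disc < 0:
        return None
    r = math.sqrt(disc)
    return ((-b + r) / (2 * a), (-b - r) / (2 * a))

# Roots of x^2 - 3x + 2 = 0
print(quadratic_roots(1, -3, 2))  # (2.0, 1.0)
```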
If you are a student taking an upcoming test, I recommend that you arm yourself with protection against such an inappropriate proctoring request by printing out one of these letters provided by Ivy Education and bringing it with you to the test site:
For the ACT -- https://ivyed.net/wp-content/uploads/2015/04/CalculatorLetter2017.pdf
And for the SAT -- https://ivyed.net/wp-content/uploads/2017/02/CalculatorLetter2017SAT.pdf
It is always good to stay informed, stay alert, and be prepared!
1 https://www.linkedin.com/pulse/calming-calculator-confusion-standardized-tests-nicole-oringer
|Posted by Sharon DeNunzio on February 21, 2017 at 10:25 AM||comments (0)|
Typically, the ACT has been upfront about changes to its test format and content. For example, the test-maker published white papers announcing changes to its Reading section, where it planned to introduce a paired passage (started in June 2014), and announcing a complete overhaul of the ACT essay to incorporate an analysis of three provided perspectives on an issue. The essay's scoring has changed twice: from the original 0-12 scale to a 0-36 scale and back to a 0-12 scale. Again, white papers explained to students, educators, and parents the reasoning behind and the execution of these changes.
However, more recently the ACT has “snuck in” some pretty substantial changes in the Math section and in its Essay prompt directions. We can only theorize that these changes are here to stay, just as the Paired Passage was in Reading and the new 6-Passage format was in Science. It is possible, though, that the new Math content was an anomaly on the December test and that the tests going forward will not reflect this new level of difficulty.
These are some of the new topics in Math that have shown up in the recent tests, with particular emphasis on the content that appeared for the first time in the December 2016 test:
• Calculating horizontal and slant asymptotes based on a graph and a function's equation
• Using either synthetic division or polynomial long division
• Complex Combinations and Probability when Order Matters
• Complex Combinations with multiple groups (previously only covered in Math2 subject tests)
• Vector Math
• Solving for the Determinant of a 2x2 matrix (you need to know the formula!)
• Circle Rotations and Coordinate Geometry
• Domain of Log functions and solving Complex Log equations by logging both sides
• Distance between two complex coordinates in the complex plane
• Identifying the greatest standard deviation in patterns of data options
• Abstract SAT-like expressions with multiple variables
• More complex sequence problems
• More problems with radians
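As a quick check on one of the formulas the list above calls out, the determinant of a 2x2 matrix is just ad - bc. A minimal sketch (in Python, simply to verify the arithmetic -- on the test this is a memorized formula):

```python
def det2(m):
    """Determinant of a 2x2 matrix [[a, b], [c, d]]: ad - bc."""
    (a, b), (c, d) = m
    return a * d - b * c

print(det2([[3, 1], [4, 2]]))  # 3*2 - 1*4 = 2
```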
See my previous blog, “Recent Changes to Format of and Increasing Complexity of ACT” written in April 2015.
The one good outcome of the ACT having "raised the bar" in Math is that the December ACT Math scale was extremely lenient. One of my students who received a 27 in Math missed 20 questions; on typical prior tests, a 27 would require no more than 13 misses. It was also possible to miss 5 questions and still receive a 34; on typical prior tests, no more than 2 misses would produce a 34.
For the ESSAY, the December 2016 test showed the same presentation of a topic that contains tension and provides three different perspectives, but a new set of directions emerged:
“Clearly state your own perspective on the issue and analyze the relationship between your perspective and at least one other perspective.”
Previous directions required a student to discuss all 3 perspectives. This is a favorable change for students, and as such, students should know ahead of time to READ the directions and include only those perspectives that can easily be woven into their own three supporting reasons.
|Posted by Sharon DeNunzio on February 21, 2017 at 10:15 AM||comments (0)|
Why take the SAT in March?
Well, just as in taking a standardized test, we’re going to examine the month of March in light of the other possible answers. And our goal is to eliminate those that make the least sense.
October 1, 2016 – Great date for seniors taking their final test; too early for most juniors to have enough practice or experience. You already missed this opportunity!
November 5, 2016 – Still a bit early, but not a bad first test date for a junior who started prepping over the summer. You already missed this opportunity!
December 3, 2016 – Favorable test date! Follows a short Thanksgiving break, perfect early test for a junior who started test prep in the late summer or early fall. Hopefully, you already took this test as your first one!
January 21, 2017 – Typically falls the same week as midterms for many students. Can be stressful academic month following the Holidays. You missed this one!
March 11, 2017 – Favorable test date! Usually follows a short mid-winter break. Students are in a good academic flow. CT public high school students have the opportunity to take a March SAT in school during the week, followed by the national test-center test on Saturday, March 11th.
May 6, 2017 – Same time as AP tests, spring tests, spring proms, spring flings, spring sports. There are just too many competing things happening at this time of the year for the test to be ideal.
June 3, 2017 – A favorable test for private school students, who often have a week after the end of school to do nothing but prep for this test. Public school students may find it busy, with final exam prep beginning over the following week(s).
August 26, 2017 – This is likely to become a favorable and popular test for rising seniors, allowing these students an August and/or October shot.
Ok, so we’ve pretty much eliminated all but the March, June, and August test dates. So let’s get going with the upcoming March 11th test!
Mike Bergin, of Chariot Learning, wrote a recent blog titled "The Case for the March SAT" that echoes my conclusion. 1
For those who are taking the March 11th SAT, make sure that you take advantage of sitting for at least one full mock test with my group or with another testing center. Good luck!
|Posted by Sharon DeNunzio on December 31, 2016 at 5:15 PM||comments (0)|
College Board is Deliberately Misleading the Public with National Percentiles –
PSAT scores were released on December 12, 2016. Many students were delighted when they saw their scores and National Percentiles. But did you know that College Board's "National Percentiles" are based on a fabricated population of students who are in high school, including those who never take the SAT? Did you know that College Board created this "National Percentile" definition beginning with its New PSAT in October 2015, and that these percentiles have continued to be reported this year with the New SAT and the New PSAT?
The one key difference between New SAT and New PSAT score percentiles is that College Board shows the National Percentile right next to the User Percentile in its SAT score reports. The User Percentile is nowhere to be found in the New PSAT score reports in 2016. At least they were reported in 2015! Why not now?
What are User Percentiles, and why are User Percentiles important?
User Percentiles are the standard for comparing scores between the SAT and ACT. They are the percentiles that College Board and the ACT historically reported until College Board revamped the SAT, and they are based on the real population of students who actually take the SAT or ACT. This is what makes the most sense! National Percentiles, by contrast, are inflated -- generally 4 to 6 percentage points higher than User Percentiles. If you want to compare your scores between the SAT and the ACT, you MUST use User Percentiles.
So why does College Board publish National Percentiles with its PSATs? It’s obvious -- the ACT surpassed College Board with the number of students taking its test in 2012. College Board is in the business of selling SATs!
If you believe that your percentiles are much higher on the PSAT than on a mock ACT you’ve taken, you’ll be duped into believing that the New SAT is a better test for you. Think again when you look carefully through these percentiles.
The type of comparison that you are really looking to do before you decide on one test or another is below.
Note this high-scoring student had a 97th percentile (National percentile) for her overall PSAT. One may falsely believe that this is equivalent to a 31.5 Composite on the ACT.
That false comparison is exactly why it is so important to go to the College Board documents and look up User Percentiles. One can even go back to concordance tables for the 2014 "old" PSAT reporting to look at equivalent percentiles. User percentiles and concorded percentiles are the best way to compare to the ACT because the percentile definitions are based on the same total population of students taking the actual tests. Note that the high-scoring student below is now similar to a 29.5 total score. Her Verbal scores, though, are really in the 78th to 84th percentiles. That's similar to a 26 on ACT Reading and 26.5 on ACT English. These are quite different from the immediate false assumption that the student's score is similar to a 31 to 32 ACT score.
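The comparison described above amounts to a percentile lookup: take the SAT/PSAT User percentile, then find the ACT score with the closest percentile. A minimal sketch of that lookup, using a small hypothetical table (the numbers below are illustrative placeholders, not College Board's or ACT's actual concordance data):

```python
# Hypothetical ACT composite -> percentile fragment (illustrative only).
ACT_COMPOSITE_PERCENTILES = {
    26: 83, 28: 88, 29: 91, 30: 93, 31: 95, 32: 96,
}

def closest_act_composite(user_percentile):
    """Return the ACT composite whose percentile is closest to a
    given SAT/PSAT *User* percentile -- the apples-to-apples
    comparison, rather than the inflated National Percentile."""
    return min(
        ACT_COMPOSITE_PERCENTILES,
        key=lambda score: abs(ACT_COMPOSITE_PERCENTILES[score] - user_percentile),
    )
```

Feeding in the inflated 97th National Percentile would point at a 32; feeding in a User percentile near the low 90s points at a 29 -- the same gap the case above illustrates.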
Make sure that you are working with an informed test prep provider and that you do your homework!
|Posted by Sharon DeNunzio on December 31, 2016 at 2:25 PM||comments (0)|
Here it is -- the final day of 2016!
I am so grateful for having had the opportunity to work with so many wonderful students and their families in their testing journeys as one key part of the college application process. I have witnessed such growth in self-confidence through our journeys. Some families have benefited from receiving merit scholarship awards or acceptance into honor programs or other benefits from their testing results.
I so enjoy having families come back and share their college news with me. I also enjoy re-connecting up to a year after I work with a family to learn where the student catapulted to next. I am so grateful to learn of my students' successes and their new transitions. BEST of LUCK to all!
Here is the College List that my tutoring students (over 90 of them!) graduating in 2015 and 2016 are attending:
Claremont McKenna College
College of Charleston
High Point University
Miami of Ohio U
New York University
Roger Williams (RI)
Syracuse
Trinity College
University of Connecticut
University of Delaware
University of Kansas
University of Maryland
University of Miami
University of Michigan
University of New Hampshire
University of Pennsylvania
University of Vermont
University of Virginia
University of Southern California
Washington & Lee
Whitworth University (WA)
|Posted by Sharon DeNunzio on October 15, 2016 at 10:30 AM||comments (0)|
Many of the retail prep books and SAT Essay books are written with recommendations that assume that every SAT persuasive article presented in a prompt will be well written and that a student need only find the appropriate tools of reasoning, evidence, and stylistic elements to praise.
Most of these books were written prior to the launch of the March 2016 New SAT. We have now seen the two essay prompts from May and June (Practice Tests 5 and 6, respectively). Some of us have also seen the March essay prompt. At least two of these three prompts contained flawed persuasive essays. How should a student prepare to critique such an essay? I believe that Shaan Patel's "New SAT Essay" book contains a wonderful acronym strategy to help students recall eight successful tools used to describe evidence, reasoning, and stylistic elements of a persuasive essay. His acronym is CREW SAID.
C – Contrast
R – Repercussions (consequences)
E – Emotion
W – Words
S – Similarities
A – Authority (citing experts)
I – Imagery
D – Data
Repercussions, contrast, and similarities describe effective reasoning techniques. Data and authority describe evidence. The remaining techniques fall under stylistic elements. These are important positive tools to learn, identify, and recall because even when the essay is flawed, the student needs to find at least two specific tools that the author uses effectively and expand on these in his first body paragraph.
The next body paragraph should describe and expand on at least two flawed tools that the author used. Here, the student can use CREW SAID to help recall the exaggerated, ineffective tools:
C → Either-Or
R → Cause-Effect
E → Slippery Slope
W → Loaded Words
S → Far-fetched Comparisons
A → Card-Stacking
I → Circular Reasoning
D → Hasty or Loose Generalizations
The third body paragraph should identify and expand on at least two successful tools that the author could have used to strengthen his overall argument. Again, having the CREW SAID acronym written as the first part of a student's notes will help the student recall many of the critical tools. What I have seen in past flawed essays is a lack of data. It is difficult to construct an effective argument without numbers, costs, or profits. Look for this evidence.
This summarizes an organizational template for a critique of a flawed essay. I recommend that a student use 10 to 15 minutes to read the essay twice, identify specific line references for the tools she plans to cite, and bullet-point the arguments. Don't forget to vary your sentence structure: add a semicolon a couple of times, use very short sentences once in a while, and experiment with a rhetorical question. And don't forget to bone up on great SAT vocabulary words that you can sprinkle throughout your essay.
Let me know what you think of this advice.
When you recently took the SAT, did you determine that the essay was flawed? If so, did you call it out or did you follow the mainstream advice to simply praise the author? How did your scores turn out?
I’ll let you know my essay scores from the October test later this month.
Now you have some new ways of looking at the SAT essay. Good luck on your next test!
|Posted by Sharon DeNunzio on July 30, 2016 at 9:00 PM||comments (0)|
In case you missed the latest in the ACT's Continuous Improvement series of announced changes to score reporting, I'm writing to let you know that you will be ecstatic about this change, particularly given all of the grumbling and articles complaining about the 2015 move to report essays on a 1-36 scale.
So, let's say you scored a 25 Essay on your December 2015 ACT, even though you had a 34 Composite. Let's say that this 25 essay was composed of three 8 sub-scores and one 10 sub-score across the four categories. You will be happy to know that the average of these four sub-scores is 8.5, which rounds up to a 9. You will be able to report a 9 overall score on your Common App. A 9 overall is the 88th percentile. Job well done!
If you had scored two 9s and two 10s in the four Writing categories, your overall average is 9.5, which rounds up to 10. A 10 overall score is the 98th percentile. You will be happier to report a 10 than a 28 on the 36-point scale. Keep in mind that even an overall 8 on the Essay is the 84th percentile.
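The averaging in the examples above can be sketched in a few lines. Note that halves round up here (8.5 → 9, 9.5 → 10), which plain Python `round()` would not do, since it rounds halves to even:

```python
import math

def act_essay_score(subscores):
    """Convert four ACT Writing domain sub-scores to the overall
    essay score: average the four, rounding halves up."""
    avg = sum(subscores) / len(subscores)
    return math.floor(avg + 0.5)

print(act_essay_score([8, 8, 8, 10]))   # average 8.5 -> 9
print(act_essay_score([9, 9, 10, 10]))  # average 9.5 -> 10
```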
The ACT is not changing the format of its new, more challenging analytical essay with three different perspectives. It is keeping the four category sub-scores. The only difference in scoring will be the overall score. The overall score will revert to the 0-12 score rather than one reported on the 1-36 scale. Keep this in mind as you are approaching Common App season!
This announcement will help you understand how to convert your 2015 ACT Essay scores into the 0-12 score: http://www.act.org/content/dam/act/unsecured/documents/WhitePaper_5_Ways_to_Compare_Writing_Scores.pdf.