Thursday 11 October 2012

How well do home educated children do academically?

We looked yesterday at a ‘briefing paper’ about the proposed Welsh law on home education. It was designed to bamboozle education professionals into believing that there is good evidence that home education in this country is likely to lead to strong academic outcomes. One way this was done was to point to a website page set up specifically to record home educated children’s examination results and then to pretend that these were the achievements of members of one home educating support group. We also saw evidence from the USA which apparently suggested that home educated children did well academically and were more likely than schooled children to go to college and university. This too is implausible, and for much the same reason.


The research quoted in the ‘briefing paper’ yesterday was that conducted by Larry Rudner and Brian Ray. Although extensive, it suffered from the same disadvantage as the statistics from the British website; that is to say, it was all self-selected. One sees at once the problem. Those volunteering to take part in such research are those for whom home education has been a success. Those who remain semi-literate after ten years of home education are unlikely in the extreme to offer to take part in any project looking at academic achievement. What is needed is a large group of young people, some of whom have been to school and others of whom have been home educated, so that we may compare their academic levels. Such a study has in fact been running in America for over ten years and the data from this provides a far more realistic and objective measure of the educational quality of home education than anything produced by Rudner or Ray.

Those wishing to attend college or university in the United States sit either the American College Testing assessment or the Scholastic Aptitude Test; the ACT or SAT. These measure such things as English (including reading ability), science and mathematics. Since the late 1990s, those taking these tests have been asked whether they were educated at home. Of course, in a sense this too is a self-selected sample, consisting only of those wishing to attend college, but even that reveals interesting information, as we shall see.

The first thing that one notices is that although home educated children taking these tests do tend to be slightly ahead of those who went to school, the differences are not dramatic. The ACT is scored from 1 to 36. The average schooled pupil's score is 21, but home educated teenagers come in a little higher, at almost 23. It gets really interesting when you break down the individual components of the scores. Home educated kids are quite a bit ahead on English, especially reading. This is not surprising really, since they spend much of their time in the company of adults; one would expect them to be more articulate and to have a better vocabulary than those who spend their days in the company of other children. There is no difference at all in science, and in mathematics the home educated children lagged noticeably behind those who had been to school.

Another point to consider is that the proportion of home educated teenagers applying to go into further education seemed to be lower than expected, given their numbers. In other words, it looks as though home educated children in America are less, not more, likely to go to university.

Of course, these figures must be treated with caution, but they are nevertheless intriguing. They do seem to suggest that the educational advantages of home education are not quite as pronounced as some would have us believe. The teenagers might have greater vocabularies, but they are worse at mathematics. There is no discernible advantage in science. The fact that a lower proportion of home educated children than expected is applying to go to university is also worth thinking about. This all gives a far more balanced picture of home education than one normally gains from looking at research financed by people like the HSLDA.

55 comments:

  1. " Those wishing to attend college or university in the United States sit either the American College Testing assessment or the Scholastic Aptitude Test; the ACT or SAT. These measure such things as English, including reading ability, science and mathematics."

    Any chance of a link?

  2. 'Any chance of a link?'

    I've only got various paper journals and magazines. I'll have a look later and see if I can find anything on the internet.

  3. Titles of papers and sources are fine, thanks, I have access to an academic library.

  5. Would science-orientated pupils be more likely to return to school to get access to lab facilities, which'd skew the figures? While you've proved you can do exam level science on a kitchen table till 16 and I'm not finding it anything like as hard as I thought I would, it's undoubtedly easier to do English and humanities there. And my experience is that children tend to have 'cluster' abilities and science and maths go together, like humanities and English do.

    Also, there is no evidence of how many children were part home-ed and part schooled. For instance, your Simone wouldn't show up as an HE child going to a top University, would she? And that doesn't make all the work you and she did when she was at home any less valid, does it?

    After spending our first month studying statistics, my son has come to the conclusion that they aren't a science because they don't prove anything except someone's theory. I think that counts as success.

    1. "After spending our first month studying statistics, my son has come to the conclusion that they aren't a science because they don't prove anything except someone's theory."

      Research and development of statistical methods - statistical methodology - could be considered to be science, in that it involves development of hypotheses, testing and ultimately formulation of statistical theory which becomes an important tool for other disciplines.

      However, most statistical work is an application of tried and tested tools, and in that sense your son is correct. In that case, it's an important component of the sciences, along with mathematics.

    2. Thanks, he's read that, and it backed me up!

  6. "Another point to consider is that the proportion of home educated teenagers applying to go into further education seemed to be less than expected, given the numbers."

    This sounds vague. Can you describe how this was calculated and give the figures? Or at least provide references. It's an interesting idea but difficult to discuss with so little information.

    1. Do you mean Further or Higher Education?

    2. I was quoting Simon but I assumed, as he was talking about American college entrance tests that are taken after graduating high school, that he meant higher education. SAT/ACT test rates can only give a rough indication of those going into higher education anyway, since you don't have to apply to college/university after taking a test and not all colleges/universities require them for entrance (especially art/technical/music schools). Hopefully he will correct me if it was a faulty assumption. I'm not sure that they even have further education in America. I think the closest equivalent is their community colleges, but this still counts as higher/tertiary education.

    3. It's rather confusing because many homeschoolers there have 'dual enrollment' at community colleges, which would be equivalent to a student doing some A level classes at an FE college and some at home.

  7. Like the earlier Anonymous, I'd also like the references for the original source that you used as the basis for your comments in this posting, please.

    TIA.

    1. Thanks, that's interesting. These numbers alone aren't enough to say anything definitive about the difference between schooled and home ed children; we need the individual distributions for a proper analysis.

      However, we know that the overall ACT scores show standard deviations of about 4-6 [1], so on the face of it we might say that there is no significant difference between the groups - and certainly a difference between the maths scores corresponding to less than about 0.1 sigma is utterly insignificant.

      Simon must have something much more than this for him to be able to claim that "in mathematics, the home educated children lagged noticeably behind those who had been to school."

      I await his references with bated breath.

      [1] http://www.act.org/newsroom/data/2012/pdf/profile/National2012.pdf
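
      To put a rough number on the gap, here is a quick sketch (the means of 21 and 23 come from Simon's post; the s.d. of 5 is my own round figure from the 4-6 range in the ACT report, so treat the result as indicative only):

```python
# Standardised difference (a Cohen's-d-style effect size) for overall
# ACT scores. Assumed figures: schooled mean 21, home educated mean 23
# (from the post), overall s.d. 5 (a round figure from the 4-6 range).
schooled_mean = 21.0
home_ed_mean = 23.0
overall_sd = 5.0

d = (home_ed_mean - schooled_mean) / overall_sd
print(f"standardised difference: {d:.1f} sigma")  # 0.4 sigma
```

      On that reckoning the overall gap is about 0.4 sigma, and the maths gap, at under 0.1 sigma, is smaller still.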

    2. It sounds as though you know more about statistics than I do, so this will probably be a silly question, but wouldn't the p-value be more useful to assess significance? Hopefully Simon will be able to point us in the direction of the original study/figures.

    3. The p-value - the probability of finding something as "extreme" as 0.1 sigma from the mean of the overall distribution (that is 0.1 sigma or beyond) - will be large; about 0.92. So, from these data, the math scores for the two groups are indistinguishable.

      Of course, we don't have much detail; we only have the means of the two scores and the likely s.d. for the overall sample; we don't know anything about the distribution of HE scores. Nevertheless, it would be surprising to find anything so pathological that it would alter the conclusion from these data, and Simon must be basing his statement on something additional or quite different.
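
      For anyone who wants to reproduce the figure, the two-sided probability for a standard normal variable can be computed with nothing more than the error function (a sketch only; the 0.1 and 0.4 sigma inputs are the rough figures discussed above, not official statistics):

```python
import math

def normal_cdf(z: float) -> float:
    """Standard normal CDF via the error function (no SciPy required)."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def two_sided_p(z: float) -> float:
    """P(|Z| >= z) for a standard normal variable Z."""
    return 2.0 * (1.0 - normal_cdf(abs(z)))

print(round(two_sided_p(0.1), 2))  # 0.92: a 0.1-sigma gap is unremarkable
print(round(two_sided_p(0.4), 2))  # 0.69: even a 0.4-sigma gap is common
```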

    4. I can see I'm going to have to add statistics to my OU studies! Thanks for the clarification.

  8. Arkansas has compulsory registration and testing of home schoolers, and 16,405 students were tested in 2012. Their average scores in every grade for both English and Maths were above the 50th percentile when compared to public school scores, ranging from the 52nd to the 64th percentile.

    Arkansas Home School Report

  9. There was an interesting study published in the Journal of College Admission in 2010, which addressed not the proportion of home educated children (in the US) that go to university, but how they do once they get there. It was based on students at a single PhD-granting institution, as detailed information of the type the study used isn't easily accessible. But with that qualification in mind it's an interesting study and worth reading.

    The article is Michael F. Cogan, "Exploring Academic Outcomes of Homeschooled Students", Journal of College Admission, Summer 2010, Issue 208, p18-25.

    I'll quote the abstract: "This exploratory study examines the academic outcomes of homeschooled students who enter a medium size doctoral institution located in the Midwest. Descriptive analysis reveals homeschool students possess higher ACT scores, grade point averages (GPAs) and graduation rates when compared to traditionally-educated students. In addition, multiple regression analysis results reveal that students who are homeschooled earn higher first-year and fourth-year GPAs when controlling for demographic, pre-college, engagement, and first-term academic factors. Further, binary logistic regression results indicate there is no significant difference between homeschooled student’s fall-to-fall retention and four-year graduation rates when compared to traditionally-educated students while controlling for these same factors."

    The indication from this one study, limited as it is, is that whether or not home educated students are more likely than other students to go to university, they do better when they get there. Arguably, though, this shouldn't be much of a surprise, as the kind of self-motivated and self-directed education that university study requires is a better match for what home educated students are used to than for what students know from secondary school. And surely how well students do at university is a more interesting statistic from an educational point of view than how well they do on a standardized test like the SAT or ACT.

    1. "And surely how well students do at university is a more interesting statistic from an educational point of view than how well they do on a standardized test like the SAT or ACT."

      I agree; in addition, we have no measures of success for those who don't go to university, whether they are educated in school or at home.

      However, Simon still has to provide the evidence to substantiate his claim here that home educated children are worse at maths.

  10. With respect to Simon's comment that the difference in ACT scores between home educated and non-home educated students is "not dramatic", it's important to remember that the ACT is graded on a fairly steep bell curve. As a result, a 21 score corresponds to the 55th percentile, while a 23 corresponds to the 68th percentile. This is not to say that Simon's other points about self-selection aren't correct, but a 13% differential is fairly notable.

    You can get a detailed analysis of the latest ACT statistics here: http://www.act.org/newsroom/data/2012/pdf/profile/National2012.pdf
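
    The score-to-percentile conversion can be approximated under a normal assumption (a sketch only; the mean of 21 and s.d. of 5 are my round figures, and ACT's published tables, which reflect the discrete, non-normal score distribution, are what give the 55th/68th percentiles quoted above):

```python
import math

def normal_cdf(z: float) -> float:
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

mean, sd = 21.0, 5.0  # assumed round figures for the national distribution
for score in (21, 23):
    pct = 100.0 * normal_cdf((score - mean) / sd)
    print(f"score {score}: roughly the {pct:.0f}th percentile")
```

    The approximation puts the two scores around the 50th and 66th percentiles, close to the published values; the differences come from the normality assumption.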

    1. This is the link I posted earlier as a reference for the s.d. in the overall ACT scores distribution. Unfortunately, it doesn't say anything about the distribution of scores for home educated children. Given that information, we could apply a test to determine whether there is a statistically significant difference, rather than describing the differences in vague language such as "a little higher", "quite a bit ahead" or "noticeably behind".


      Right now, on the basis of what I've read in Simon's post, subsequent comments here and the 2012 ACT score data, I wouldn't claim a significant difference either way.

      Simon's remarks - particularly the one about HE children being "worse at mathematics" - have to be supported by some quantitative evidence in the "various paper journals and magazines" he claims to have. We have seen nothing of substance so far.

    2. In theory, yes it would be good to have a statistical analysis to supplement other observations. But in reality a statistical analysis would only be meaningful if you have information that would let you control for other potentially relevant factors, such as family income, educational background of parents, and so on. Only once an analysis included controls for such factors could we reasonably conclude that any observable difference was actually due to the home education. Obviously, though, we're not going to have that sort of information.

      Which means, I would argue, that transparently vague comments along the lines of "a little higher", "noticeably behind", etc. are actually preferable to statistical analyses based on inadequate data. The former at least wear their inadequacy on their sleeve, so everyone knows that they are questionable. The latter hides behind a misleading precision, and easily deceives people into giving the conclusions claimed more weight than they deserve.

    3. Firstly, I'm not the one making claims based on these data; I'm saying that one can't make any claims simply by looking at differences between average scores. The data provided by Simon are inadequate.

      Secondly, while we might agree that other factors may play a role, the first thing to ask is: "is there a difference in the scores of the two groups?" Most scientific investigations begin by asking simple questions.

      Finally, inferring things from inadequate data while hiding behind those "transparently vague comments" lies somewhere on the road between hell and madness; it is suicide for one's credibility, unless, of course, you're talking to the kind of audience Simon described recently as "credulous and gullible fools". It plays into the hands of people who argue the toss about nothing; it is a world inhabited by people such as politicians - and indeed, Simon himself.

      Simon, of course, would not make any claims without having a sound statistical basis; I wonder when he's going to provide the references?

    4. Clearly we both agree that the statistical evidence is unavailable for solid conclusions. That isn't Simon's fault, it's just the reality of the evidence. Simon, as I read his post, is merely reporting some studies, and states clearly that he is doing so. He then, I agree, draws stronger conclusions than he should on the strength of the evidence available.

      The more fundamental issue, though, is what to do in situations of evidential uncertainty. There are at least three options available.

      Firstly, one can point at a method that is extremely useful and reliable when strong evidence is available, then note that strong evidence isn't available, and conclude that therefore we can't say anything.

      Secondly, one can just ignore the problems with the evidence and stake out a position as though the evidence was as solid as could be desired.

      Thirdly, one could take seriously the reality that there are different methods of justification for conclusions, and statistics is only one of them. It works wonderfully when good evidence is available, and in those situations should be deferred to. But good evidence isn't always available, and when it isn't, then presumptively less desirable types of justification become important.

      By its nature, home education involves a lack of solid evidence. Many people who do it don't want to share information on what they see as an intimate aspect of their lives. In some cases they don't want to share because they fear negative consequences.

      There will, that is, very rarely be the kind of solid data regarding home education from which sound statistical conclusions can be drawn. One response to that situation is just to throw up our hands and say "well, now we can't say anything". That, though, isn't true. We can say things, we just have to be open about the weakness of the evidence we are forced to use.

      In theory it's nice to be able to take a purist stand and refuse to reach conclusions until the evidence is perfect. In the real world we need to act, and sometimes that means drawing conclusions based on the best evidence and reasoning we have available.

    5. Then tell us, what do you infer from the numbers he's published so far?

    6. “Simon, as I read his post, is merely reporting some studies, and states clearly that he is doing so.”

      I don't think Simon is quoting figures from a study as such. The figures are given out by the ACT organisation and are simply the result of a question about place of education on the ACT application. No attempt is made to account for demographic data that is likely to cause higher scores in the home-school group, as 'proper' studies do. For instance, home-schoolers as a group are more likely to be white and middle class than schooled children, and if this carries through into those who take ACT tests, this will push up their average scores. Comparing the raw figures for schooled and home-schooled students in this way means that we are probably not comparing like with like. It may be that the self-selecting nature of the test for both schooled and home-schooled students smooths out some of these demographic differences (minority and poorer schooled students are possibly less likely to take the test), but without evidence it's impossible to reach any comparative conclusions.

      I have never viewed the figures from any of the HE studies as evidence that HE provides a better academic education than school. When I looked into HE initially, I used them simply as evidence that HE does not rule out the possibility of a good education if my children were academically inclined - that if hundreds or thousands of other families can achieve a good education, there's no reason to assume that we would fail. Academic achievement is only one aspect of why we home educate, so we didn't need conclusive evidence that it would definitely be superior in that respect. We just needed enough evidence to convince us that it was possible, because the other benefits were equally important – we didn’t home educate purely for academic excellence.

    7. Anonymous said,
      "When I looked into HE initially, I used them simply as evidence that HE does not rule out the possibility of a good education if my children were academically inclined"

      Interestingly this is more or less what Rudner said about his study looking at the results of 20,760 home schooled children. He says (my emphasis):

      "The superior performance of home school students on achievement tests can easily be misinterpreted. This study does not demonstrate that home schooling is superior to public or private schools. It should not be cited as evidence that our public schools are failing. It does not indicate that children will perform better academically if they are home schooled. The design of this study and the data do not warrant such claims. All the comparisons of home school students with the general population and with the private school population in this report fail to consider a myriad of differences between home school and public school students. We have no information as to what the achievement levels of home school students would be had they been enrolled in public or private schools. This study only shows that a large group of parents choosing to make a commitment to home schooling were able to provide a very successful academic environment."

      http://www.homeschoolworld.org/Pages/03_Questions_Concerns/HSvsPS/HSvsPS_Article_ERIC.html

  11. "Simon, of course, would not make any claims without having a sound statistical basis;"

    One would hope so, otherwise he is being just as manipulative as he claims others are being, and his comments about people, 'trying to pull the wool over people’s eyes', 'indications are that this was a deliberate falsehood' and 'presumably, he banked on nobody bothering to check the references', would apply to him.

    1. "I would argue, that transparently vague comments along the lines of "a little higher", "noticeably behind", etc. are actually preferable to statistical analyses based on inadequate data."

      Looks like Simon has been reduced to skulking anonymously around his own blog.

    2. No, someone using tesco.net left that comment, Simon uses BT.

    3. "No, someone using tesco.net left that comment, Simon uses BT."

      BTW, this isn't nearly as creepy as it sounds on re-reading it this morning. Most web sites have visitor counters and one of the most popular is Sitemeter. The free version, as used by Simon, allows anyone to view the results. I quite often view these as part of my web design work (information about referral sources is important, for instance) so it's become almost automatic. Here's the link to the information about Simon's visitors:

      http://www.sitemeter.com/?a=stats&s=s24bibles&r=8

  12. "In theory it's nice to be able to take a purist stand and refuse to reach conclusions until the evidence is perfect. In the real world we need to act, and sometimes that means drawing conclusions based on the best evidence and reasoning we have available."

    Oh dear, not the old "theory vs real world" one; that's just argumentative bait. My position is not that of a purist theorist, but simply an honest pragmatist, trying to avoid foolish claims. Using "transparently vague comments" to obfuscate things that one does not like about some data is at best foolish, and downright dishonest at worst - no argument.

    "we just have to be open about the weakness of the evidence we are forced to use."

    Agreed, but "transparently vague comments" should not be part of this. Don't go down the slippery slope of lying about what can be inferred from the data. As another anonymous said: "Academic achievement is only one aspect of why we home educate, so we didn't need conclusive evidence that it would definitely be superior in that respect."

    "Simon, as I read his post, is merely reporting some studies, and states clearly that he is doing so. He then, I agree, draws stronger conclusions than he should on the strength of the evidence available."

    Simon does not hesitate to pillory anyone who appears to be on shaky ground (I understand he's been busy with some of that lately) and should know better than to draw strong conclusions from inadequate data or report things blindly. Perhaps he'd like to substantiate his statements in this blog entry; when referring to "various paper journals and magazines" he said (amongst other things):

    "in mathematics, the home educated children lagged noticeably behind those who had been to school… The teenagers might have greater vocabularies, but they are worse at mathematics."

    Show us the references please, Simon; while you're at it, report the means and standard deviations, at least, for the math scores, if you have them.

    Then we can see whether we agree with the conclusions - and, perhaps, make our own inferences about Simon's use of this information.

    1. I'm sorry, but a "transparently vague" comment pretty much by definition can't be a lie, or deliberately obfuscating. That's part of what "transparent" means. It involves making a statement that avoids unwarranted precision (which would be a lie), and then being clear that this is as specific a claim as can be made.

      One may, of course, do a poor job of being transparent, and thereby obfuscate accidentally, but as I said, it's pretty much a definitional thing that if you're being transparent you're neither lying nor deliberately obfuscating.

      So I'm afraid I'm going to have to drop out of this back-and-forth now, as it's clear there's too much dogmatism involved and too little real interest in the issues. Rather than engaging with any substantive point that I made, you decided to misrepresent a single statement and then express astonishment that I would advocate a view that I patently never advocated. It is, of course, always easier to argue against a "straw man", but usually not very helpful.

    2. A different anonymous - I don't think he was 'transparently vague'. He claimed that a difference of 2 points in overall ACT scores was 'a little higher', yet describes a much smaller difference for maths, in the opposite direction, as lagging 'noticeably behind those who had been to school'.

      He also originally mentions 'various paper journals and magazines' as his sources, but in the end it's a couple of newspaper articles that include very little data - certainly there is not enough detail to justify his conclusions. I suspect he will now claim that it was an ironic attempt to copy the way 'other home educators' pull the wool over people's eyes, offer deliberate falsehoods as fact and bank on nobody bothering to check the references.

      His final offering, the Home School Test results published by the Tennessee Department of Education in 1987, showed that home schoolers outperformed schooled pupils in all subject areas, including maths.

      “In the spring of 1987, the Tennessee Department of Education found that homeschooled children in 2nd grade, on the average, scored in the 93rd percentile while their public school counterparts, on the average, scored in the 62nd percentile on the Stanford Achievement Test. Homeschool children in third grade scored, on the average, in the 90th percentile in reading on another standardized test, and the public school students scored in the 78th percentile. In math, the third grade homeschooled children scored, on the average, in the 87th percentile, while their public school counterparts scored in the 80th percentile. In eighth grade, the homeschooled students scored, on the average, in the 87th percentile in reading and in the 71st percentile in math while their public school counterparts scored in the 75th percentile in reading and the 69th percentile in math”

      It’s true that I can’t find the full results and they tested 4 school years, so maybe the school year not mentioned did worse on maths, but nothing in the articles Simon mentions seems to support his claims.

      http://www.hslda.org/docs/nche/000010/200410250.asp#xix
      http://www.oakmeadow.com/resources/articles/wsj-home-schooled-stereotypes.php

    3. This is what you said:
      "I would argue, that transparently vague comments along the lines of "a little higher", "noticeably behind", etc. are actually preferable to statistical analyses based on inadequate data. The former at least wear their inadequacy on their sleeve, so everyone knows that they are questionable. The latter hides behind a misleading precision, and easily deceives people into giving the conclusions claimed more weight than they deserve."

      The whole point about using statistical data, however inadequate, is that one can quantify the things one knows (just for a start: ever heard of error bars?) and then qualify them with other possible selection effects, biases etc. Some of these might be amenable to further treatment, but maybe not.

      The first and blindingly obvious thing about the numbers quoted by Simon is that the errors/s.d., whatever, were not quantified, so on their own the differences between the means tell us nothing. Saying something "transparently vague" is the thing that conveys a false sense of precision.

      Consider the following real example: I have just conducted two experiments, A and B, to measure positive events generated by some mechanism. I obtained 70% positives in A and 40% positives in B.

      Following something someone (you perhaps, I'm not sure) said earlier, namely "a 13% differential is fairly notable", one might say that there is a notable differential between the two, or that A is "a little higher" than B, or B is "noticeably behind" A (your words, I think). Yet, knowing the details of the distribution, I can confidently state that the difference - 70% vs 40% - is completely insignificant.

      Unfortunately, those transparently vague comments are worthless, and imply a degree of precision that is completely misleading without an understanding of the data. Not convinced? Each experiment comprised ten tosses of a coin, and "heads" was the positive event. Still not convinced? The same coin was used in both experiments.

      Contrary to your suggestion, the fluffy language of "transparently vague comments" is the thing that implies false precision. It might not always be a deliberate lie, but if it isn't, it's so foolish as to be as bad as a lie; either way, credibility is destroyed when the truth is uncovered.

      By all means, use all the positive things that can be said in favour of home education, but when the data don't say what you want, don't try to dress them up with words (and of course, the same thing applies for those arguing in the opposite direction).
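
      The coin-toss claim is easy to verify exactly: for two independent runs of ten fair tosses, a gap of three or more heads (70% vs 40%, or worse) turns up about a quarter of the time. A short sketch:

```python
from math import comb

# Exact distribution of heads in 10 tosses of a fair coin.
n = 10
pmf = [comb(n, k) / 2**n for k in range(n + 1)]

# Probability that two independent runs differ by 3 or more heads,
# i.e. a gap at least as large as 7 heads vs 4 heads (70% vs 40%).
p_gap = sum(pmf[a] * pmf[b]
            for a in range(n + 1)
            for b in range(n + 1)
            if abs(a - b) >= 3)
print(f"P(gap >= 3 heads) = {p_gap:.3f}")  # 0.263
```

      So a "notable differential" of 30 percentage points is, with this sample size, nothing out of the ordinary.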

    4. My comment at 12:49 (20:49 BST) was directed to Anonymous at 10:04 (18:04) rather than 12:02 (20:02 BST).

    5. Following up on my 12:49/20:49 comment to anonymous at 10:04/18:04:

      "Rather than engaging with any substantive point that I made"

      I'm sorry, but I don't see a substantive point on which I did not engage.

    6. An announcement of data in which you just say "I got 70% one time and 40% the other", but don't even tell the audience what the experiment was, strikes you as an example of transparency?

      And indeed, yes, I think you would be hard-pressed to find anyone who did not think that there was a "fairly notable" difference between 70% and 40% - when they are stated as bald numbers with no context of any type. Do you really not think there is?

      Of course, it's then possible to come up with examples where what seemed like a notable difference turns out not to be significant after all. Not "statistically significant", I'll note, which is a very different thing (a point you repeatedly elide), but genuinely significant.

      I deal with statistics on a regular basis in my job, so I'm quite familiar with both the strengths and weaknesses of their use. If used well they are a wonderful thing. If not used carefully, though, they can be extraordinarily misleading. As is the case with anyone who deals with statistics, it's quite easy to come up with cases in which an initially impressive statistical analysis fell apart on closer examination (because data was selected poorly, the data set was extremely small, questionable assumptions were made, etc.). Similarly, though, it's also quite easy to come up with situations in which statistical analysis provided real insight into an issue. Statistics are not a magical cure-all, they are just a really useful tool.

      Ultimately, though, it's just really not clear to me what your example is supposed to demonstrate. To repeat from an earlier post of mine: "Clearly we both agree that the statistical evidence is unavailable for solid conclusions...He then, I agree, draws stronger conclusions than he should on the strength of the evidence available."

      These are the comments that suggested to you that I would have a problem with an example that proves nothing other than that you should get as much information as possible before drawing your conclusions?

      As I have said before, the real issue is what to do when the evidence isn't there to be gotten - as will often be the case with home education. Your example, which involves deliberately omitting evidence and then spontaneously pulling it out of a hat, doesn't address that situation at all.

    7. With respect to the Anonymous at 14 October 2012 12:02, I agree with both your points: the differential description of positive and negative results, and the overstatement of the available sources. My concern has really been with the question of how data on home education can properly be approached and presented, rather than with Simon's specific comments. With respect to those specific comments, I think you're absolutely right in the problems you identify.

    8. @anonymous 16:40
      "I think you would be hard-pressed to find anyone who did not think that there was a "fairly notable" difference between 70% and 40% - when they are stated as bald numbers with no context of any type."

      That's the trap; there is no justification for describing that difference as "fairly notable", but you seem to be happy to add words that imply things from bald numbers like those presented by Simon and in my example. I think this is the crux of our disagreement.

      "Not "statistically significant", I'll note, which is a very different thing (a point you repeatedly elide), but genuinely significant."

      I've no idea what this means; it sounds as though it's bordering on the metaphysical!

      "I deal with statistics on a regular basis in my job...Statistics are not a magical cure-all, they are just a really useful tool."

      Snap! I've done so for over thirty years, and of course I agree that statistics is a useful tool and not a cure-all. However, you seem to be suggesting that when statistics fail to give the desired result, one should resort to quack medicine, in the form of those "transparently vague comments", in order to transform those pesky null results into something desirable.

      "As I have said before, the real issue is what to do when the evidence isn't there to be gotten - as will often be the case with home education. Your example, which involves deliberately omitting evidence and then spontaneously pulling it out of a hat, doesn't address that situation at all."

      I was demonstrating that you need some extra information before making any claims about the data. If you don't have it, then that's tough; look elsewhere.

      You may be familiar with Bayesian methods; I find the principle of maximum entropy to be a powerful tool, both for practical computation and also as a more general guiding principle that could be described as common sense. With this in mind, you can return to my example - or Simon's numbers - and reflect on whether it's safe to add those vague comments.

      When the data don't tell you anything, don't add spurious information to make them say what you want! It's as simple as that.
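      The Bayesian reading of the same numbers can be sketched in a few lines (the uniform Beta(1, 1) prior and all names here are assumptions for illustration only): starting from a flat prior, the posterior for each coin's heads-rate is a Beta distribution, and Monte Carlo sampling shows how much the two posteriors overlap.

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

def posterior_samples(heads, tails, n=100_000):
    """Draw from the posterior Beta(1 + heads, 1 + tails) for a
    coin's heads-rate, starting from a uniform Beta(1, 1) prior."""
    return [random.betavariate(1 + heads, 1 + tails) for _ in range(n)]

rate_a = posterior_samples(7, 3)   # experiment A: 70% positives
rate_b = posterior_samples(4, 6)   # experiment B: 40% positives

# Monte Carlo estimate of P(rate_A > rate_B | data)
p_a_greater = sum(x > y for x, y in zip(rate_a, rate_b)) / len(rate_a)
```

      The estimate comes out around 0.9: the data lean towards A having the higher rate, but they fall well short of the certainty that a bald "70% vs 40%" suggests.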

    9. 'Two statisticians walked into a blog...'

    10. "I've no idea what this means; it sounds as though it's bordering on the metaphysical!"

      Try consulting even the most basic statistics textbook. It will explain it for you.

    11. "Try consulting even the most basic statistics textbook. It will explain it for you."

      The ones I have are only written in English.

  13. Dear me, in the few days that I have been away, this seems to have been covered in a far more comprehensive fashion than I could have managed!
    Original reports about both the SAT and ACT as taken by home educated teenagers may be found in the Des Moines Register for 14/12/04 ('Home schooled do well at Iowa's universities') and in the Wall Street Journal for 11/2/2000 ('Home School Kids Defy Stereotypes'). Those wishing to see some earlier material, concerning the Stanford test, might look at the Home School Test results published by the Tennessee Department of Education in 1987.

    Replies
    1. Given your penchant for critical analysis of the use of statistics by other people, would you care to comment on the statistical significance of the difference between maths scores for school educated and home educated children?

      Do the data available justify the statement that home educated teenagers "are worse at mathematics"?

      Provide quantitative justification; marks will be subtracted for verbosity and needless digression.

      (N.B. Submissions that attempt to harass distressed gentlefolk and elderly guest-house owners will be disqualified.)

  14. 'Looks like Simon has been reduced to skulking anonymously around his own blog'

    How on earth does one skulk anonymously around one's own blog? Genuinely interested in this notion!

  15. '(N.B. Submissions that attempt to harass distressed gentlefolk and elderly guest-house owners will be disqualified.)'

    I shall have to get back to you on the other points; I am reeling at this comment! 'elderly'? Wendy Charles-Warner is younger than me! And harass? I might equally claim that you are harassing an elderly blog writer. This is so marvellous that I am going to have to think of a special way to mock you. If only you didn't insist upon remaining anonymous.

    Replies
    1. "I shall have to get back to you on the other points"

      Not the anonymous you were speaking to, but I'll not hold my breath for this; it's likely to endanger my health.

    2. "This is so marvellous..."
      I thought you would enjoy that!

      Now stop prevaricating and get on with the task I set for you.

      But like the previous anon, I shan't hold my breath for a straight answer.

  16. Simon has form for deleting pages when things get sticky. I recall he did it a few years ago, once he started to look even more foolish than usual.

    I think he claimed it was an accident on that occasion, but you can draw your own conclusions about his credibility.

  17. Simon wrote on one of the deleted threads,
    “Well, actually, I said that:

    'Home educated kids are quite a bit ahead on English, especially reading'

    This is true, they are a couple of points ahead.”

    No Simon, you’re getting muddled. You said, ‘the average schooled pupils score is 21, but home educated teenagers come in a little higher at almost 23’. You then went on, ‘It gets really interesting when you break down the individual components of the scores. Home educated kids are quite a bit ahead on English, especially reading’, so clearly the 2 point difference referred to overall ACT results, not the English results. In the year that home schoolers were 2 points ahead of school students (according to your own references), the home schoolers were 3 points ahead in English.


    Simon wrote,
    “They lag behind between a third of a point and just over a point in maths. I described this as 'noticeable'. This is also true; I noticed it and so did others.”

    You made this claim on the basis of a 0.3 point difference, or less than 1%. You cannot switch from one set of results to another as though you were talking about the same population. Are you really saying that a difference of 0.9% can be accurately described as, ‘the home educated children lagged noticeably behind those who had been to school’, especially after describing a 2 point difference (or 5.6% difference) as, ‘but home educated teenagers come in a little higher at almost 23’ in the same paragraph?

    Common usage of the phrase ‘lagged noticeably behind' implies far more than you suggest when you claim that you simply noticed that the scores were lower. It suggests that the difference is enough to justify highlighting it as a problem - a less than 1% difference! And again, we go back to the lack of evidence that any of these differences reaches statistical significance. None of your conclusions make sense without further statistical detail, since they cannot be supported by raw data alone.

  18. Simon wrote on one of the deleted threads,
    "I said on the post that in the ACT tests, home educated children scored higher on English, but lower on maths. There was no dispute about this."

    Firstly, you claimed that home schooled students scored a little higher on the basis of a 2 point difference, and also claimed that they lagged noticeably behind in maths on the basis of a 0.3 point difference that you failed to quantify in your blog post, a clear misrepresentation of the raw data.

    Also, raw scores in no way support your assertions that 'home educated kids are quite a bit ahead on English', or that 'there is no difference at all in science and in mathematics, the home educated children lagged noticeably behind those who had been to school'. These claims could only be made with further statistical details such as standard deviations or p-values, and ideally the comparison populations should be controlled for demographics. The differences in scores could be down to random chance as easily as to any real difference in the students’ skills. Making these statements without access to such figures is a clear misuse and distortion of the data, just as you have accused others of doing.
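    The dependence on those missing figures is easy to make concrete. Below is a rough sketch with entirely hypothetical numbers (a shared standard deviation of 5 and two made-up sample sizes; none of these come from the ACT reports): the same 0.3-point gap in mean scores is noise for small samples and highly significant for huge ones.

```python
from math import sqrt, erf

def two_sample_z(diff, sd, n1, n2):
    """z statistic for a difference in sample means, assuming both
    groups share a known standard deviation sd."""
    return diff / (sd * sqrt(1 / n1 + 1 / n2))

def two_sided_p(z):
    # Two-sided normal p-value via the error function
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Hypothetical ACT-style scores: shared standard deviation of 5,
# and the same 0.3-point gap in means at two very different sizes
z_small = two_sample_z(0.3, 5, 100, 100)             # z ≈ 0.42
z_large = two_sample_z(0.3, 5, 100_000, 100_000)     # z ≈ 13.4

p_small = two_sided_p(z_small)   # ≈ 0.67: indistinguishable from chance
p_large = two_sided_p(z_large)   # ≈ 0: highly significant, yet still tiny
```

    Without the standard deviations and sample sizes, raw means cannot tell you which of these situations you are in, and even when a gap is statistically significant, 0.3 points may still be practically trivial.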
