Posted on 03/29/2006 10:20:00 AM PST by conservativecorner
NEW YORK In a (perhaps) historic shift, more Americans now consider themselves Democrats than Republicans, the Gallup organization revealed today.
Republicans had gained the upper hand in recent years, but 33% of Americans, in the latest Gallup poll, now call themselves Democrats, with those favoring the GOP one point behind. But Gallup says this widens a bit more "once the leanings of Independents are taken into account."
Independents now make up 34% of the population. When asked if they lean in a certain direction, their answers pushed the Democrat numbers to 49% with Republicans at 42%. One year ago, the parties were dead even at 46% each.
This shift indicates, Gallup says, why its polls show Democrats leading in this year's congressional races.
The latest poll was taken from January to March 2006, with a national sample of about 1,000 adults.
All depends on your sample.
I don't believe this.
More wishful polling by the Dinosaur Media.
The next question - where were the polled individuals located?
Actually, from my experience, increasing numbers of Republicans are becoming Independents. Like myself.
I wonder about anyone who might go over to the Dark Side.
Sometimes I think more FReepers consider themselves Democrats than Republicans... ;)
What was the percentage of Dimocraps in this poll? Rubbish!
The poll is probably not too far off. The change is within the margin of error, though, so the shift may not be all that dramatic, if it is real at all.
It was bound to happen. These globalists that we currently have in power have betrayed this country NUMEROUS times and are lapdogs of foreigners.
2006 & 2008 elections will be depressing.
You are not alone.
UM, NO, they are not. The only real measurement of this was the last national election, in 2004. People do not quickly or easily change their party ID. What the Gallup people have done is manufacture a rationale to justify their grotesque manipulation of their sample to get a predetermined outcome. Classic garbage-in, garbage-out junk science.
This is relevant to your question:
Poll Methodology - A 2004 Guide
There has been intense interest in the polls this year, and the recent disagreement about the range of positions has only intensified the discussion. Some people like to support a poll with results they like, without any examination of why that poll differs from others. And some reject polls on a charge of outright bias or prejudice, which I can understand, given the partisan comments from supposedly objective people like John Zogby and Larry Sabato, but I must caution readers to be careful to consider the evidence before accepting or rejecting a poll.
Let's start with the obvious: more information is better, especially if it is relevant to how the numbers were derived. By relevant, I mean two things: the information should show valid evidence to support the poll's main conclusion, and the information should be consistent with past polls, so that trends and historical benchmarks may be seen. To that end, I discovered that in terms of methodology, we can separate the polls into three broad types: the polls which provide demographic internal data, the polls whose questions show mood on the main issues, and those polls which refuse to provide internal data.
The best way to find out how the polls developed their methodologies is to look for that information. Some publish their methodologies at the bottom of their poll releases; others are so proud of their methodologies that they wrote up special articles to explain their process. Others did not have their methodologies handy, but responded when I asked them how they did their polling. And others, well, they were neither forthcoming nor cooperative, and that speaks for itself. This article allows you to get to know the polls all over again, this time starting from the inside. I figure this guide will help you figure out for yourself whose word is worth listening to, and who is nothing but hooey. I am listing the polls in alphabetical order. All telephone polls referenced employ random-digit dialing (RDD): area codes and exchanges are pre-selected, then a randomizer chooses the last 3 or 4 digits, depending on the poll. When I say pure RDD, I mean that the respondent pool is new; some polls appear to reuse an initial pool of respondents for future polling, and I will note this where it shows up. All references to margin of error reflect a standard 95% confidence level. When I reference the NCPP, I mean the National Council on Public Polls, which published guidelines for demographic weighting and internal responsibility that it expects its members to follow. Another national group for pollsters is the American Association for Public Opinion Research (AAPOR), but it appears to be much smaller and to have looser standards than the NCPP. It's worth noting, though, that neither the NCPP nor AAPOR appears to have any deterrent in its policies; there is no specified penalty for not meeting their standards, nor any formal auditing authority. That, of course, is one reason I'm doing this review.
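The RDD procedure described above can be sketched roughly as follows. The area codes and exchanges here are hypothetical placeholders; real pollsters draw them from telephone-exchange databases, and this is only an illustration of the randomization step, not any pollster's actual implementation.

```python
import random

def rdd_number(area_codes, exchanges, suffix_digits=4):
    """Generate one random-digit-dialed (RDD) phone number.

    As the guide describes, the area code and exchange are
    pre-selected; only the last 3 or 4 digits are randomized.
    """
    area = random.choice(area_codes)
    exchange = random.choice(exchanges)
    suffix = "".join(str(random.randint(0, 9)) for _ in range(suffix_digits))
    return f"({area}) {exchange}-{suffix}"

# Hypothetical area codes and exchanges for illustration only.
# Drawing a fresh list like this each time is what the guide
# calls a "pure RDD" respondent pool.
sample = [rdd_number(["212", "415"], ["555", "867"]) for _ in range(5)]
```

Reusing `sample` for later waves, instead of drawing fresh numbers, is the panel-style approach the guide contrasts with pure RDD.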
Gallup: The gold standard of opinion polling. Gallup presents demographic and trend data for every poll they have anything to do with. Whether on their own or in combination with other groups (the CNN/USA Today/Gallup poll, for example), Gallup insists on uniform procedures to ensure consistency. Their respondent pool is pure RDD for the presidential trial heats. Gallup weights their polls in line with NCPP guidelines, and releases internal data on race, gender, party affiliation, age, region, education, economic strata, union/non-union, veteran/non-veteran, religious preference, and sexual orientation. Gallup polls are random telephone interviews with around 1,000 adults on average, of whom roughly 76-80% are registered voters. Announced margin of error is +/- 4 points. The downside to the demographic details is that they are generally available only to Gallup subscribers. With a 69-year track record, Gallup can show an impressive record for their predictions and tracking.
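As a rough sanity check on those figures (a sketch, not Gallup's actual computation): for a simple random sample of about 1,000 respondents, the textbook margin of error at 95% confidence is roughly +/- 3 points, so the announced +/- 4 presumably also absorbs design effects from weighting and clustering; that last inference is mine, not the guide's.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Textbook margin of error for a proportion p estimated from a
    simple random sample of size n, at 95% confidence (z = 1.96)."""
    return z * math.sqrt(p * (1 - p) / n)

# n ~ 1,000 adults, worst case p = 0.5: about +/- 3.1 points.
moe = margin_of_error(1000)
```

Quadrupling the sample only halves this figure, which is why national polls rarely go much beyond 1,000-1,500 respondents.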
I wish they would take a poll someday and no one would answer.
I don't know if any of these polls are worth the powder to blow them from here to perdition, but it would not surprise me if there were a shift left. When Republicans act more like Democrats by spending profligately, enacting enormous entitlements, stripping free-speech rights, backpedaling on the war, and supporting amnesty for illegals and open borders, why should anyone be surprised that there is a shift left? After all, those doing the shifting are simply reflecting the Republican party itself.
To prove how true your comment is, just start a topic on animal rights.
You'd think animals were above humans (a decidedly liberal perspective).
Disclaimer: Opinions posted on Free Republic are those of the individual posters and do not necessarily represent the opinion of Free Republic or its management. All materials posted herein are protected by copyright law and the exemption for fair use of copyrighted works.