Geek Gibberish? What does this mean?
Posted on 11/02/2004 11:50:20 AM PST by astonberry
To: Zack Nguyen
FOR GREAT JUSTICE!
61
posted on
11/02/2004 11:56:52 AM PST
by
Pyro7480
(Sub tuum praesidium confugimus, sancta Dei Genitrix.... sed a periculis cunctis libera nos semper...)
To: ctdonath2
Does that mean if someone types Free Republic into Google, they will not be able to access the site?
To: LakeLady
I DON'T KNOW WHAT IT MEANS. STOP FREAKING OUT FREEPERS. THE MESSAGE WAS MEANT FOR A FEW PEOPLE. IF YOU DON'T UNDERSTAND, IGNORE IT.
Sorry--a little stressed out today, as you can imagine.
63
posted on
11/02/2004 11:56:54 AM PST
by
cwiz24
(Hey Yankees fans---Now who's ya daddy?)
To: astonberry
It's FR code... you're supposed to use your decoder ring... PK Ã/bQ R MB5/RESTORE.COMí½}`Õõ0|çcg7»%ÈÇfÃGTFIø PËWA©v3ÔbXÜ6fwM±µZ+`ú³mJW¤HØ´Á h'vÞsîìn>l¿÷y÷ùãAgöÞsÏ=÷ÜsÏ=÷{g&-ÿ|Úºµå¥©t%
To: Chemist_Geek
A Webcrawler or spider is a computer program which automatically downloads every available page from a Web server. This ties up bandwidth and can prevent human browsers from getting their pages. OK, but do computer programs read the blurb that's appearing at the top of the latest posts page? Or could there be some type of program inserted in the blurb to alert the spider programs?
65
posted on
11/02/2004 11:57:05 AM PST
by
Sally'sConcerns
(It's painless to be a monthly donor!)
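The "downloads every available page" behavior described above can be sketched in a few lines of Python. This is a toy illustration of the link-harvesting step a spider performs on each page it fetches; the HTML and URLs are made up, and no real search engine's crawler is this simple:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects href targets from anchor tags, the way a spider
    discovers new pages to queue for download."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# A made-up page; a real crawler would have just downloaded this.
page = '<html><body><a href="/news">News</a> <a href="/chat">Chat</a></body></html>'
collector = LinkCollector()
collector.feed(page)
print(collector.links)  # ['/news', '/chat']
```

A spider loops this over every discovered link, which is exactly why an aggressive one can tie up a busy server's bandwidth.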
To: SandyInSeattle
Programs are coming to pull info from the site, which slows everything down. It's how search engines know where sites exist when you do a search. They have programs that take "snapshots" of websites.
66
posted on
11/02/2004 11:57:10 AM PST
by
AppyPappy
(If You're Not A Part Of The Solution, There's Good Money To Be Made In Prolonging The Problem.)
To: astonberry
I don't know what it means, but I'm getting duct tape and plastic sheeting ready, just in case!
Comment #68 Removed by Moderator
To: bwteim
No, no no... it means:
"This website has got to be the most fascist naziast place ive ever been! The Juws will win someday and when the Arabs are gone and when the Juws and rightous gentils go up top you will be atr the bottom! I demand an explanation for why you are doing this to me and my boss and the Juws!"
69
posted on
11/02/2004 11:57:26 AM PST
by
Thrusher
(Remember the Mog.)
To: SandyInSeattle
Spiders are programs that visit web sites to download and index their content for search engines. The file 'robots.txt' on a server tells which spiders are allowed; they are supposed to stay away if 'robots.txt' doesn't allow them in. It is a webmaster's tool to control how much programs are allowed to visit the site. It's being done to conserve bandwidth and CPU for human visitors, IMHO.
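A robots.txt along these lines (a hypothetical example, not Free Republic's actual file) is all it takes to tell compliant spiders to stay out entirely:

```
# Served from the site root, e.g. http://www.example.com/robots.txt
User-agent: *
Disallow: /
```

The "supposed to" is the catch: compliance is voluntary, so an ill-behaved crawler can simply ignore the file.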
To: cwiz24
Sorrrrrrrrrrrrrrrrrrrrrrrrrrry.
72
posted on
11/02/2004 11:58:04 AM PST
by
LakeLady
(I am the FEMALE ALPHA DOG POLITICAL ACTIVIST OF THE CENTURY! I hunt & tree demonRATS... WOOF! WOOF!)
To: b4its2late
73
posted on
11/02/2004 11:58:10 AM PST
by
E. Pluribus Unum
(I actually did vote for John Kerry, before I voted against him.)
To: astonberry
"Geek Gibberish? What does this mean?"
The mods don't have enough "R"s. It should be gReek gibberish. I have a few "R"s that I'll donate.
74
posted on
11/02/2004 11:58:14 AM PST
by
4CJ
(Laissez les bon FReeps rouler)
To: z3n
"robots.txt informs the webcrawlers not to spider the site. Usually this is called in the meta tags and header, but apparently you may have more success by placing a note in the body (I was not aware of this)."
And in English?
75
posted on
11/02/2004 11:58:14 AM PST
by
benice
(Democrats are all misfits.)
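The two mechanisms the quoted post runs together look like this in practice. The robots.txt file is a standalone text file fetched from the server root; the "meta tags" variant is a per-page tag like the following (a generic example, not this site's markup):

```html
<!-- Per-page variant, placed inside the document <head> -->
<meta name="robots" content="noindex, nofollow">
```

"noindex" asks search engines not to list the page; "nofollow" asks them not to follow its links. Like robots.txt, it is a request that well-behaved crawlers honor, not an enforcement mechanism.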
To: SandyInSeattle
"But what does that mean? English, for the rest of us?"
Sorry, my bad.
Spiders/Crawlers are automated servers that regularly "crawl" a site looking for updated information to include in the search engine databases.
Small sites can have very little or no spider traffic.
My site can have anywhere from 0 to 15 crawlers at any particular time.
Busy sites, particularly ones that are linked from a very large number of other spidered sites on the internet, are more likely to get spidered A LOT.
You can deny spiders/crawlers from sucking up your server resources using the robots.txt file.
76
posted on
11/02/2004 11:58:14 AM PST
by
z3n
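The deny-by-robots.txt mechanism described above can be exercised with Python's standard urllib.robotparser module, which is what a well-behaved spider consults before fetching anything. The rules, bot names, and URL here are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: Googlebot may crawl everything,
# all other spiders are shut out of the whole site.
robots_txt = """\
User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A polite spider asks before fetching each URL.
print(parser.can_fetch("Googlebot", "/index.asp"))      # allowed
print(parser.can_fetch("HungryCrawler", "/index.asp"))  # denied
```

An empty Disallow line means "allow everything" for that user-agent, while `Disallow: /` under `User-agent: *` blocks every path for everyone else.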
To: b4its2late
All your votes are belong to us.
77
posted on
11/02/2004 11:58:19 AM PST
by
pbear8
(We pray for a landslide)
To: Howlin
God, ain't THAT the truth!
78
posted on
11/02/2004 11:58:20 AM PST
by
AggieCPA
(Howdy, Ags!)
To: Lorianne
It's 'geek' to me. ROTFLMPJO
I needed a good laugh, thanks.
79
posted on
11/02/2004 11:58:22 AM PST
by
Mister Baredog
((Part of the Reagan legacy is to re-elect G.W. Bush))
To: Howlin
LOL!
Noooooooooo!!!!! Ungh! My spider's been beebered. I think I'll start a vanity thread.
80
posted on
11/02/2004 11:58:23 AM PST
by
SittinYonder
(Tancredo and I wanna know what you believe)
Disclaimer:
Opinions posted on Free Republic are those of the individual
posters and do not necessarily represent the opinion of Free Republic or its
management. All materials posted herein are protected by copyright law and the
exemption for fair use of copyrighted works.
FreeRepublic.com is powered by software copyright 2000-2008 John Robinson