Have We Entered a Post-Literate Technological Age?
Posted on 08/18/2009 4:50:05 PM PDT by Star Traveler
by Adam C. Engst
Not long ago, Google produced a video that's making the rounds on the Internet. In it, a Google employee asks people in Times Square in New York City a series of questions, such as "What is a browser?", "What browser do you use?", and "Have you heard of Google Chrome?" (Chrome is Google's new Web browser; it's available for Windows and in pre-release test versions for the Mac.)
Among the geek set, the video has gotten a lot of play because most of the people in the video - who appear to be functional adults and who use the Internet regularly - come off as highly clueless. According to the video, only 8 percent of people queried that day knew what a browser is.
The video is clearly not a scientific study, and suffers from horrible methodology. It's likely, for instance, that simply asking "What is a Web browser?" would have produced better results, and the middle of Times Square is undoubtedly not where most people are thinking about the names of programs on their computers. But let's leave aside such criticisms for the moment.
What's Your Browser? -- Instead, let's take the results at face value and consider their implications. What does it say about the technological world in which we live that 92 percent of the people asked could not name the program they use to access the Web? If other statistics are to be believed, browsing the Web is the primary use of computers today, which means these people couldn't name the program they use more than any other.
Worse, some of the answers in the video reveal that people don't even know what a program is. A number of them identified their browser as "a search engine" or "Google." When asked which browser he used, one guy said "the big E," undoubtedly meaning Microsoft Internet Explorer, which has a stylized lowercase letter E as its icon.
When the best someone can come up with is a vague recollection of a program's icon, it says to me that we've entered a "post-literate" technological society, one in which people have lost not just the ability to read and write about a topic, but also the ability to speak about it, all while retaining the ability to use it.
As someone who earns a living crafting text to help people learn how to use technology, I found myself profoundly troubled by Google's video. After all, if someone doesn't know what browser they use, or even that a browser is a program on their computer, how could I possibly expect them to be interested in buying my company's "Take Control of Safari 4" book (written, with infinite care, by the estimable Sharon Zardetto)? How could they even learn of its existence, if they had no idea that Safari is a Web browser or that they were using Safari?
(One concern that I don't explore further in this article is what a post-literate technological society implies for marketing technology itself - will even technology marketing be forced to rely solely on pretty pictures and emotional appeals? In fact, are we already there? Apple's "I'm a Mac" ads help customers identify with the actor playing the Mac but give little solid information, and Apple conceals many technical specifications about the iPhone.)
But perhaps I'm barking up the wrong tree, and Google's video in fact shows that we've taken great technological strides. TidBITS editor Glenn Fleishman, when we were discussing the video, suggested that it's a good thing that the Web browser has become so ubiquitous that people need not know what it's called to use it effectively.
(Linguistically, this same devolution has happened with the Web itself. Although it's TidBITS house style to capitalize "Web" - a proper noun that's a shortening of "World Wide Web" - it's commonplace to see even professionally edited publications lowercase the word, thus de-emphasizing the fact that it's a unique thing. I think they're wrong: "Web" should always be capitalized, as should "Internet.")
From a usability stance, I think I agree with Glenn - it's a good thing that using the Web has become so easy that a myriad of people can do so without even knowing the name of the tool they use to access it. Most people just use the browser that comes bundled with their computer, and despite the issues with Microsoft Internet Explorer over the years, Firefox has garnered only a bit over 20 percent of the browser market since 2004 - largely from the small subset of people who know what a browser is.
On a platform like the iPhone, it's even easier to see this trend toward obscuring the identity of the browser. Although Safari is the iPhone's Web browser, and its icon is clearly named, applications like Twitterrific can display Web content internally, and others, like Mail, can open a Web link in Safari without ever informing you that Safari is displaying your page. It would be difficult to quibble with someone who didn't realize that their iPhone browser was Safari, when in fact, much of the time they would be viewing the Web via some other app that piggybacks on top of OS X's WebKit core.
Tied up in all of this is the fact that if what's bundled with your computer or phone just works, you don't need to learn much more. Dissatisfaction is the mother of exploration - only if Safari or Internet Explorer isn't meeting your needs do you have much impetus to learn about and switch to Firefox. So the better technology works, the less we'll learn about how it works. I can't say that's entirely a bad thing.
When the Thing Breaks -- But I remain troubled by this post-literate inability to talk about everyday activities and the tools used to perform them, using the proper nouns that are not only generally agreed upon by those in the know, but that also clearly label the graphical representations of those tools. What happens when something goes wrong, and such a person can't connect to the Internet at all? Can you imagine the tech support call?
"Hi, this is tech support. How may I help you?"
"I can't get on the Google."
"OK, what browser are you using?"
"I told you - Google."
"Let's step back for a second. What program are you running on your computer to access the Web?"
"I don't know - I just Google when I want to find something."
"Perhaps we should go a bit further back. What icon do you click on when you want to use Google?"
"The picture? It's blue and kind of round, I think."
"OK, that's probably Internet Explorer. Can you load any Web sites other than Google?"
"If I can't get on Google, how can I load any other Web sites?!"
I could draw this out further, but it's not far-fetched (TidBITS staffer Doug McLean confirmed that my contrived dialog was painfully reminiscent of tech support calls he took in a previous job). In essence, the caller and the support rep don't share a common language. They may both be speaking English, but that's as far as it goes, and as soon as domain-specific words like "browser" come into play, communication breaks down. A good support rep would undoubtedly adjust his questions upon realizing that there's a terminology barrier, and like Captain Kirk meeting an alien, would attempt to build up some shared terminology based on visual appearance before attempting to solve the problem.
Generational Problem Solving -- If I asked you to tell me something about the caller in my fabricated script above, you might fall back on stereotypes and describe the caller as being elderly, or at least as someone who didn't grow up with technology and therefore has come to it, perhaps grudgingly, later in life. But what if I told you it could be a college student?
My neighbor Peter Rothbart teaches music at Ithaca College, and he's been noticing a disturbing trend among his students. Although they're capable of using the digital music software necessary for his courses, he says that many of them have trouble with the most basic of computer tasks, like saving files in a particular location on the hard disk. Worse, if something does go wrong, he finds, they have absolutely no idea how to solve the problem.
These aren't the sort of kids who are befuddled by high school - they're students at a well-respected institution of higher education. (It's the alma mater of Disney CEO Robert Iger, for instance.) No, they're not computer science majors, but they're not being asked to program, just to use off-the-shelf music software and perform commonplace tasks. And those commonplace tasks are now not only things they apparently have never had to do, but things they lack the skills to figure out on their own.
Could this inability to solve a problem with a device with which they are otherwise familiar be a result of losing some ability to talk about it? I wouldn't go so far as to say it's impossible to troubleshoot without terminology, but it's less radical to suggest that troubleshooting will become more difficult without being able to communicate effectively with people who are experts in the field.
Not all that long ago, when adults had trouble getting something working on a computer, they would sarcastically say that they needed a teenager to explain it to them. That was largely true of those of us who were teenagers in the 1980s and 1990s, but if Peter Rothbart's experience is at all representative, today you'd be better off finding a 30- or 40-year-old geek to help.
Don't get me wrong - I'm not saying that all young people are incapable of solving technical problems or going beyond the basics. My friend Dave Burbank, whose full-time job is as a fireman in the City of Ithaca, is also a serious geek known for taking hundreds of photos on his kids' class trips, posting constant updates via Twitter, and updating a photo Web site for the trip before turning in each night. His 15-year-old son Istvan is currently a 3D animator at Moving Box Studios in Ithaca and is perfectly capable of maintaining a technical discussion on the evolution of backup media and other such geeky topics.
In other words, there will always be geeks, and in my mind, that's a darn good thing. The technological sophistication of those people of my generation (I'm 41 now) who were interested in technology created the meme that young people were fluid with technology. But what we all missed was that being fluid with technology doesn't mean you understand how it works or can fix it when it breaks. Being able to dash off text messages on a mobile phone demonstrates fluidity; being able to troubleshoot a dead Internet connection down to a corrupted preference file or flaky cable demonstrates understanding.
So what will most members of society do when something on their computers or smartphones fails to work? Let's not pretend that problems won't happen - technology may have become more reliable over time, but the rate at which things go wrong even for undemanding users is still shamefully high.
Just recently, my father called because his iPod wouldn't show up in iTunes. After some back and forth, I suggested that he reset the iPod, and when he went to use it, he realized it was indeed entirely frozen. A hard reset brought it back to life and resolved his problem, but had he been on his own, it's possible that he - or at least someone less experienced than he is - would have concluded it was broken and bought another one.
This isn't a new concern. In 1909, E.M. Forster wrote a piece of early science fiction, "The Machine Stops," in which he imagined a future in which face-to-face contact was considered bizarre, humanity lived underground, and the "Machine" fed all our needs. Of course, one day... the machine stopped. More recently and amusingly, consider the Pixar movie "Wall-E."
Cars and Computers -- The obvious analogy in today's world, and one that several people have suggested in response to our discussions, is the car. At one time, knowledge of keeping a car running was a kind of patriarchal rite of passage. Failure to monitor oil levels, radiator fluids, and other factors could lead to a dead horseless carriage.
Few people know how cars work these days, and even those of us who do have a basic understanding of them can't really work on a modern car. If the car stutters when accelerating, or sometimes won't start, most of us simply take it in to the repair shop and get it fixed. Problem solved with the application of money, and of course, since cars work relatively well these days, much less monitoring is needed. When was the last time you checked your car's fluids?
Like so many automotive analogies, this one sounds good but suffers under scrutiny. In part, repairing cars has become a specialty not so much because intelligent people couldn't understand what's wrong or figure out how to troubleshoot it, but because the training and equipment necessary to diagnose problems and effect repairs have themselves become highly specialized. Gone are the days when you could fix a car with a few screwdrivers and a set of wrenches; repair shops now download data from the car's computer for diagnosis.
But the more serious problem with the analogy is that cars are single-purpose machines - they do one thing, and they do it moderately well. Thus, the type of problems they can suffer, while troubling, frustrating, and sometimes seemingly inexplicable, are still relatively limited in scope, more like a household appliance. How often do you have to check the inner workings of your washing machine or refrigerator?
In contrast, computers are general-purpose machines that can perform a vast number of wildly different tasks, such as browsing the Web, reading email, writing a book, developing a company budget, tracking a database of customers, composing music, editing video, and so on.
We have up-and-coming geeks like Istvan Burbank, but even bright young men like Istvan have their limits. While I'd happily ask him to fix a Mac that's not booting, I'm not sure he'd have any idea how to help if I showed him a PDF where the text on some pages appeared darker and bitmapped when viewed in certain PDF readers (even Adobe hasn't been able to fix that problem reliably for me). There's a limit to how much any one of us can learn, but there's no limit to what a computer can do.
In a way, this is an odd situation for those of us who grew up with the personal computer. Before Apple, before the IBM PC, we had mainframes and minicomputers that we interacted with via dumb terminals. You couldn't do all that much, and you were sharing resources with many other people, but you also didn't have to worry about things going wrong as much, because when they did, the computer operators would fix them.
They were the gatekeepers, the wizards who controlled access and could say who was allowed to do what. Personal computers were supposed to democratize computing so anyone and everyone could do their own work. While that's come to pass in some ways, it seems to me that we've returned to the days when you need a wizard to solve problems or do anything beyond the norm. It's a somewhat uncomfortable situation, since those of us who grew up with personal computers are finding that we're the new wizards.
Technological Illiteracy -- So how did we get here? I'd argue that Apple - and we Macintosh users - are perhaps more to blame for this state of affairs than any other group. After all, no one has championed usability like Apple, with the Mac's vaunted ease-of-use. For years, many Mac users scoffed at manuals. "Why would anyone need a manual when the program is so easy to use?" they'd ask. It was a fair point, for the users of the time, who were highly interested in the technology, well versed in how it actually worked under the hood, and amenable to poking and prodding when things didn't go right.
But then we got our wish, and ever more companies started writing software that was easy enough for most people to use without reading a manual, at least at some level. That was the death of documentation, a phrase I first coined more than 10 years ago (see "The Death of Documentation ," 1998-05-04). Of course, it was really the death of the manual, and technical books have remained popular, in part because of the lack of the manual (how else could David Pogue have made a mint on his Missing Manual series?).
Even still, back when I started writing technical books in the early-to-mid 1990s, the average computer book would sell about 12,000 copies. Today, despite a vastly larger audience (though with much more competition), 5,000 copies is considered acceptable.
I'd argue there was a more insidious effect from the loss of manuals - it caused an entire class of users to become technologically functional while remaining technologically illiterate. When I asked my mother-in-law, Linda Byard, what browser she used, she became somewhat flustered and guessed at Outlook. This is a woman who uses the Web fluidly and for all sorts of tasks far more sophisticated than simply browsing static Web pages. And yet, the fact that she used Internet Explorer to do so escaped her.
As the conversation proceeded (and keep in mind that my father-in-law, Cory Byard, helped design personal computers for NCR back in the 1980s and now consults on massive database projects for Teradata - Tonya didn't grow up in a technologically backward household), it came out that Linda had stopped reading about how to use technology when manuals gave way to inferior online help.
She didn't stop learning how to use various programs, but without any sort of formalized instruction or written reference, she lost the terminology necessary to talk about the technology she was using. Of course, she had Cory around to fix anything that went wrong, and she said that the same was true of all her peers too - there was always someone technologically adept in the family to deal with troubles.
Although it's harder to pin this loss of technological literacy on the lack of manuals when looking at schoolkids, the problem isn't necessarily being addressed there either. When my son Tristan was in second and third grade in the public schools in Ithaca, NY, the closest he came to being taught computer skills was typing (not a terrible idea, but tricky for kids whose hands aren't large enough to touch-type properly) and PowerPoint.
Although some level of presentation skills are certainly worthwhile, why would you have second graders focus on something that's guaranteed to be different (if not entirely obsolete) by the time they're in college?
I'd argue that some of the basics of technology - the concept of a program as a set of instructions and the essentials of networking - would be both more compelling for kids and more useful for understanding the way the world works later in life.
When TidBITS contributing editor Matt Neuburg tried to teach a group of his friends' kids REALbasic one summer, he found himself frustrated at almost every turn - they lacked the conceptual underpinning that they could make the computer do something. And more important, they didn't care, since they were accustomed to technology just working. It wasn't until he got them to draw a stick figure and, by changing the location of its parts repeatedly, make it walk across the screen, that one of them said, "Hey, this must be how my video games are made."
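Matt's stick-figure exercise captures the core concept of a program as a set of instructions: draw, move, redraw. Here's a minimal sketch of that idea, in Python rather than the REALbasic Matt actually used, with a single character standing in for the figure (the function name is invented for illustration):

```python
def walk_frames(width=10, figure="o"):
    """Return successive 'screen' states as the figure walks across.

    Animation is nothing more than redrawing the same figure at a
    position that changes a little each step.
    """
    frames = []
    for x in range(width):
        # Pad with spaces so the figure appears at column x.
        frames.append(" " * x + figure + " " * (width - x - 1))
    return frames

# Print each frame in turn; done quickly enough, this looks like walking.
for frame in walk_frames(5):
    print(frame)
```

The same loop, pointed at a graphics canvas instead of `print`, is exactly how the kids' video games move their characters.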
And networking? No, you don't need to know how it works to use the Internet, but isn't it wondrous that an email message sent to a friend on the other side of the globe in Australia is broken up into many small pieces, shuttled from computer to computer at nearly the speed of light, and reassembled at its destination, no more than seconds later? Wouldn't it be fun to act out a packet-switched network with an entire class of second graders and the pieces of a floor puzzle? Or at least more fun than PowerPoint?
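That act-it-out exercise maps directly onto a few lines of code. Here's a toy sketch in Python (the function names are invented for illustration, not any real networking API): the message is split into small numbered pieces that may arrive in any order, and the receiver puts them back together by sequence number.

```python
import random

def packetize(message, size=4):
    """Split a message into (sequence_number, chunk) packets."""
    return [(i, message[i:i + size]) for i in range(0, len(message), size)]

def reassemble(packets):
    """Rebuild the original message from packets in any arrival order."""
    return "".join(chunk for _, chunk in sorted(packets))

packets = packetize("G'day from Australia!")
random.shuffle(packets)     # packets may take different routes and arrive out of order
print(reassemble(packets))  # → G'day from Australia!
```

Each second grader holding a numbered puzzle piece is one of those packets; sorting by number at the far wall is `reassemble`.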
Luckily, this lack in the public education system isn't uniform. Glenn Fleishman's son Ben is about to enter a public elementary school in Seattle, where the beginning curriculum teaches kids about opening, saving, and printing files; later, it moves to task-based - not program-oriented - computer projects. That's much better.
But I digress.
Illiteracy Stifling Innovation? -- My more serious concern with our society's odd fluency with a technology that we cannot easily communicate about is that it might slowly stifle innovation. Already we're in a situation where browser innovation is almost the sole province of Apple and Microsoft, with contributions from Mozilla, Google, and maybe Opera.
Iterative changes from the incumbents can be worked in, since everyone will be forced to accept them, but does it become harder to convince most people to try a ground-breaking new technology because it's different, because it's talked about using strange new terminology, and perhaps because no paradigm-shifting new technology can by definition be so easy to use that it doesn't require some level of training? I fear that might be the case.
In the dawn of the computer age, the stakes weren't as high and the market wasn't as large, so I'd suggest that companies were more likely to take risks on innovative technologies that might appeal to only a small subset of the population. Today, with everyone using technology, I suspect that business plans and funding proposals all assume a large potential audience, which in turn causes the ideas to be vetted more on their business chances than their technological innovation.
Put another way, there have always been technological haves and have-nots, but since there was no chance of selling technology to the have-nots, the technology of the past was less limited by the literacy of its audience. Now that the technologically illiterate are not just buying technology but are its primary market, that fact has to be affecting which ideas get funded and developed.
Plus, think back to the point about dissatisfaction being the mother of exploration. We geeks may be willing to belly up to the new technology feeding trough since we're never satisfied. But once technology reaches a certain plateau of working well enough, if this lack of technological literacy is indeed a more general concern, spreading technological successes into the population as a whole may become all the more difficult.
I'm fully aware that my musings here are largely hypothetical and based on anecdotal evidence. But I think there's a new technology on the horizon that could serve as a test of my theory that anything sufficiently innovative will face an uphill battle due to the technological illiteracy of the user base: Google Wave.
For those who didn't see Google's announcement of Google Wave (we didn't cover it in TidBITS at the time because it was a technology announcement, not a service that people could use), it's a personal communication and collaboration tool that's designed to merge the strengths of email, instant messaging, wikis, and social networking services. (You can read more about it at Wikipedia.)
On the plus side, Google Wave has the power of Google behind it, and Google could potentially merge it into Gmail, thus introducing it to 146 million users nearly instantaneously. But Google Wave will undoubtedly be quite different from Gmail, and will require a learning curve. Will that hamper its adoption, since email and instant messaging and other services work well enough that people aren't sufficiently dissatisfied to learn about and try Google Wave? Only time will tell.
This article originally appeared in TidBITS on 2009-08-18 at 1:56 p.m.
Permanent article URL: http://db.tidbits.com/article/10493
Unless otherwise noted, this article is copyright © 2009 TidBITS Publishing, Inc. TidBITS is copyright © 2008 TidBITS Publishing Inc. Reuse governed by this Creative Commons License: http://www.tidbits.com/terms/.
A TidBITS article from Adam Engst...
I recently saw a sign on a vehicle in a parade that said something like:
“Find us by searching for [the_company_name].com on google.”
apparently not realizing that by giving their URL they already gave people all they needed to know to find them.
I’m just a dabbler, but know enough to help out friends and family who cannot communicate with tech support because of the language barrier with overseas tech support. Also, as a former IT coordinator, I witnessed new applications’ classes being taught to employees in a very inappropriate manner - very techy, with lots of IT jargon that laymen cannot comprehend. Every time, I would have to retrain the employees on the new applications because the software rep was an a@@.
I used to teach classes for the Macintosh at a Senior Center in Oregon, and I would definitely teach basic stuff and ideas without using the techy and IT jargon, too. I must have done it okay (in accomplishing that) because I always got rave reviews from those old coots... LOL...
Now, I’m one of them, too... :-)
This explains how DUmmies manage to get to their site
Like Biden referring to a web site address: “What’s the web number?” Idiot.
“Now, Im one of them, too... :-)”
That must mean we are the lucky ones! Got to love the old saying “there’s only one alternative to aging.”
I’ll tell you a big part of it is a lot of people cannot communicate. They have a poor vocabulary and are not used to using their brains to observe and figure something out. They are unable to express themselves and describe a problem, and conversely are unable to understand the solution.
And the generations of dummies are not passing on any information to their children.
My favorite latest example was the young mother I overheard in Walmart who was unable to explain to her daughter why you aren’t supposed to drink mouthwash.
Spare me another rant from a geeky technical writer. (I’m a tech writer, and I hear the same whining from co-workers all the time.)
Computer apps should be intuitive and transparent to users. People shouldn’t have to wade through steaming mounds of text or useless online help to figure out how to use a “graphical user interface.” Why should I have to parse a multi-page procedure when you could show me in a short screencam?
I used to teach computer apps, programming (Logo, Basic, and Pascal), and robotics to elementary/high school kids. He wants to teach something as abstract as networking to little kids? No. They’re better off playing outside.
You must not be familiar with Macintosh users then... :-)
A badge of honor with Macintosh Users, from my long experience with them, is how little they ever used a manual to figure out how to use the programs and operating system of the Macintosh computer, because of its excellent User Interface and the guidelines that Apple published for all its developers to follow...
I swear... I remember guys saying that they had never so much as ever opened a manual about the Macintosh Computer or any of its software — ever... LOL...
Now, that’s a good “User Interface” to be sure... :-)
for sure... :-)
I agree. If corporate IT departments bought Macs instead of PCs, lots of tech writers would be out of work!
It’s an animal that eats trees and shrubs.
If you want on or off the Mac Ping List, Freepmail me.
In case others don’t know...
TidBITS has been published weekly since April 16, 1990, which makes it one of the longest running Internet publications. TidBITS is published by Adam C. Engst, author of a number of computer books, including four editions of Internet Starter Kit for Macintosh, Eudora for Windows & Macintosh Visual Quickstart Guide, and five editions of iPhoto for Mac OS X: Visual QuickStart Guide.
TidBITS also publishes a series of electronic books in Adobe Acrobat (PDF) format that cover issues related to Mac OS X and the digital lifestyle. The “Take Control” series first appeared in October 2003 with the publication of Take Control of Upgrading to Panther which was issued at the same moment as the official launch of Mac OS X version 10.3 “Panther”.
And I’ve subscribed to their mailing list almost from the beginning. And I remember getting Adam Engst’s “Internet Starter Kit for Macintosh” when it wasn’t necessarily so easy getting on the Internet... :-)
I remember the one for Eudora, too. That was a favorite e-mail program of mine for a very long time. And then, I’ve got a series of the “Take Control” e-books put out by Adam Engst, too.
Practically all of what he has written there is very true. I had to laugh at the characterization of “teenagers” in the past and then “now”... That was right, in the past, if you wanted a computer fixed or tweaked or help with it — yeah..., “get a teenager”. But, nowadays, the teenagers have to get help from the older folks... LOL...
Heck, when I was a teenager, I was programming those new-fangled things in Base 2, with no monitor and only red lights for read-outs for Base 2 indications of on/off and paper tape to save the programming. I had to program everything, down to even telling it how to multiply and divide... Assembly language... ummm..., *I was the assembler*... LOL...
And so, these youngsters are getting quite computer illiterate these days. And a good User Interface is great (I believe in it), but it has probably contributed to the “dumbing down” of the average computer user, to the point where you get those kinds of “tech support” calls, as was indicated in that article.
I like the Mac OS X, because it can be as simple as you want — and then — as complex as you want, both at the same time and accommodating both kinds of people. That’s great. But, alas, there is a large group out there that have absolutely no clue as to what is going on... :-)
Hey, are you sure you haven't stolen my memories???
Those were the days my friend. We thought they’d never end... :-)
Actually, I was a bit old for that... but I was one of the teenagers that was selected by Bell Labs to make my own NPN transistor... They provided everything I needed including a 2” diameter silicon wafer, the doping chemicals (which I had to mix correctly), and a lot of theoretical texts that I had to extract HOW out of. It worked. The following year, I was selected to build a voice simulator... Wow... by changing capacitors I could get it to say AAAAHHHHH, EEEEEEHHHHH, OOOOOHHHHH, UUUUUUH, Ihhhhh. etc. Consonants were beyond it.
Well, my initial experience was that $50,000 “training computer” in our high school, in which there was no monitor, just red lights indicating the state of the registers, a TTY for input and output, and paper tape to save programming. We basically invented our own language on the fly as we programmed, using whatever mnemonics we invented as we went along for the base 2, but hexadecimal was useful too, as it would get tiresome writing all those 1s and 0s... LOL... (and very mistake-prone, too... of course...).
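The reason hexadecimal relieved the tedium is that each hex digit stands for exactly four binary digits, so the same value takes a quarter as many characters to write. A quick illustration in Python (any modern language would do the same):

```python
# Each hex digit encodes exactly four bits, so a 16-bit value that
# takes 16 binary digits to write needs only 4 hex digits.
value = 0b1101_0111_0010_1010  # 16 binary digits, grouped by fours

print(bin(value))  # → 0b1101011100101010
print(hex(value))  # → 0xd72a
```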
My first experience on the “real computers” was the GE TimeShare computer in Seattle, that was tied in with a university in Portland, where I was at. We also used TTYs for that, and dialed up the computer in Seattle from the TTY and connected. I learned BASIC on it (the “computer” taught me through its own tutorial program). I had reams and reams of rolled paper from that one... LOL...
After that, it was down to Oregon State and their computing center and learning FORTRAN, ALGOL, and OSCAR (something used on campus and a specialized language, “Oregon State Conversational Aid to Research”). At that time, I carried stacks of punch cards in to the computing center to be “run” and would come back later to see what happened... :-)
Another symptom of the dumbing-down of America?
You said — I disagree. A good user interface is supposed to make “power user” features easily accessible to the average person.
:-) ... I don’t know about that one... my idea is that if you give average users “power user functions” in an easy-to-use User Interface — then you’re gonna get “power disasters”... LOL...
I would rather have it the way it is with the User Interface of the Mac OS X right now. The “power” is right below the surface, but you have to *work at it* to get there. Otherwise, all the easy to use functionality is still there for the average user...
Well, again, I would see a “power disaster” as a failing of the interface. You can’t protect users from themselves, but you can protect the computer from the user. Remember, the graphical user interface was devised because it was unreasonable to expect the average end user to learn command line prompts.
I suppose it depends on your definition of a power feature, though. It wasn’t too long ago when having a 14.4kbps modem was considered the bleeding edge of geekdom...
Well, let’s say, for one example, not setting people up as “root” — at the very beginning, when they first get their computer. Now... someone might say that since a user has a right to their own system and what they do, all users should immediately set up the computer as “root” (that first account), then set up the subsequent accounts and use those, reserving root for when it’s needed.
BUT, with Apple, you can’t set it up as root right at the beginning. In fact, that’s shut off for most all users. You have to know how to “turn it on” and then sign up as root.
That would be one example of hiding some power user functionality and also making one “go through steps” even to take advantage of it.
You said — Another symptom of the dumbing-down of America?
Well, it is surprising when some Grandmas and Grandpas are more “geeky” than their teenaged grand-daughters and grand-sons... LOL...
You said — “Racism Fail”
Well, some people will grasp at any straw to make their argument... LOL...
While there is a dumbing down going on (and I see it), it doesn’t mean that it supports the argument made by that person.... :-)
Since the beginning of desktop computers, I’ve always thought that computers should do *more of the work* and the users should do less of it, and just let the users get on with the tasks that they want the computer to accomplish *for them* — as opposed as users having to do things with the computer in order to get the computer to do anything for them in the first place. The computer should eliminate all the menial and repetitive and (what I would refer to as) “useless tasks” (useless in the sense of not really needing the user for them).
Now, the power user should also be given access to get in behind the scenes if and when he (or she) wants to. So, that means having a computer that shows an interface for the mass of the public (ease of use and not requiring them to do a whole lot of “computer tasks”) and at the same time, having the ability to provide for the access of the power users.
I really do think that the User Interface that Apple has built on top of UNIX does that sort of thing that I’m talking about (at least for the most part, and as far as we’ve gotten with computers these days). I’m still waiting for the “Star Trek” talking computer when I’ll just be able to tell it to do what I want to do — even the power user things I want it to do... :-)
After that post, I don’t think our ideas are all that dissimilar. We just have slightly different expectations for power use. I can live with that. :)
I had my computer read this to me, and I have to say, this is siejva eap39m ,app.