Too big to trust? Google's growing credibility gap
Posted on 04/15/2014 3:30:52 AM PDT by markomalley
Remember when we all loved Google? Its search engine was both simple to use and an unbiased portal to anything you wanted to know. It was founded by two college students at a time when Silicon Valley was a shining beacon of what was right in the world, during sunny economic and political times.
We don't love Google so much any more, mainly because we trust it less and less. More and more people have realized that the Google search engine is hugely biased in favor of advertisers, and the results are increasingly manipulated by Google for inscrutable purposes. Google seems to track anything and everything we do -- it peruses our emails, our files stored on its servers, our locations, and our chats. Americans are getting nervous.
When Google bought smart thermostat maker Nest earlier this year, the public recoiled -- Nest owners didn't want their thermostats to become the latest spying portal in their homes for Google to use. That negative reaction drove home the growing Google trust problem. Likewise, no one really believed that Google wasn't participating in the NSA's spying on users; it seemed a clear case of the lady doth protest too much. Plus, we saw how much Google is spying on us, whether or not in support of the NSA. If anything, Google's response seemed to be indignation that the NSA was piggybacking on Google's own privacy-mining efforts.
For most people, Google is still a shining star. It ranks as the second-most valuable brand in the world, after Apple and before Coca-Cola, a ranking that has grown in recent years. It's also at the top of the rankings for best places to work. It's not as if Google has yet become Facebook, whose abuse of personal information is assumed. But the cracks in Google's reputation are growing.
Consider Google's recent policy update for the Google Play Store, which is where you get Android and Chrome OS apps. The latest policies forbid apps that mislead users into buying add-ins, releasing their personal data, or visiting websites -- common techniques for dubious advertisers and vendors, as well as cybercriminals.
But will Google enforce these policies? Google didn't respond to InfoWorld's query on the matter, but its past actions suggest it will not, other than occasionally as a sort of spring cleaning. Google has long had a hands-off approach to apps, doing little to weed out malware and other abusive apps. It trusted app makers to do the right thing.
Ironically, Google's own search engine would fail some of those new Play Store policies -- you can't always tell whether the search-result links you click are sponsored or neutral, and many of the advertised links lead to scam sites that surreptitiously steal user information. Google also plays games with the unsponsored search results, favoring content from people and organizations with active Google+ accounts, for example. Google Search and the Play Store are becoming more and more like Craigslist, the pioneering, once-virtuous online classified-ads system that is now a seedy venue favored by scammers for finding new victims.
The reality is that Google's business is and has always been about mining as much data as possible to be able to present information to users. After all, it can't display what it doesn't know. Google Search has always been an ad-supported service, so it needs a way to sell those users to advertisers -- that's how the industry works. Its Google Now voice-based service is simply a form of Google Search, so it too serves advertisers' needs.
In the digital world, advertisers want more than to know that 100,000 people might be interested in buying a new car. They now want to know who those people are, so they can reach out to them with custom messages that are more likely to be effective. They may not know you personally, but they know your digital persona -- basically, you. Google needs to know about you to satisfy its advertisers' demands.
Once you understand that, you understand why Google does what it does. That's simply its business. Nothing is free, so if you won't pay cash, you'll have to pay with personal information. That business model has been around for decades; Google didn't invent that business model, but Google did figure out how to make it work globally, pervasively, appealingly, and nearly instantaneously.
I don't blame Google for doing that, but I blame it for being nontransparent. Putting unmarked sponsored ads in the "regular" search results section is misleading, because people have been trained by Google to see that section of the search results as neutral. Those results are in fact not neutral. Once you know that, you never quite trust Google search results again. (Yes, Bing's results are similarly tainted. But Microsoft never promised to do no evil, and most people use Google.)
The issue gets trickier when you move away from search and into apps, whether Chrome OS or Android. Free apps are what people want, so app makers end up doing the same data-mining that sustains Google Search, using a shadowy network of companies to do the work for them. The result is that many mobile apps have the same kind of scams you see on the Web. Sometimes Google is in that mix (innocently, or at least not looking too hard), sometimes it is not. That's why opt-in permissions and clear disclosure are necessary -- so you don't feel fooled.
But many paid apps use these same services to increase their income -- you may think that by paying for the app or an in-app extension, your data and behavior are not being mined. But they often are, typically without your knowledge. That's extra income for the app maker, as well as the data miners it works with. Or it supports an artificially low price that drew your interest in the first place. If a deal seems too good to be true ...
Google is hardly alone in plying this murky data-mining trade. But it's the largest visible company in that business, so it's an easy, obvious target for distrust -- and user wrath. Many of us have given up on Facebook ever being honest, so we're looking at Google as the next line to hold.
Also, Google was a very optimistic, idealistic company in its youth. It really did want to change the world for the better, and it believed in freeing information for all as a way to empower individuals. It believed its early "don't be evil" motto. It really did see Android as a way to democratize smartphones, which until then were the province of the well-to-do who could afford BlackBerrys or iPhones. Yes, making Android freely available also created a large footprint for Google's services, so its moves were hardly selfless -- but they were oriented toward doing greater good while making money, a virtuous business approach.
Google employees still believe that's how their company works: a force for good that harmlessly uses personal data to both help individuals and make money that supports its many activities and innovations.
But as time goes on, the mercantile needs are coloring the do-gooder impulses. Google is a public company, and it has to satisfy shareholders' desire for profits every quarter. That creates a tension between its reputation and its economic reality. By sweeping that tension under the rug, Google only creates a place for distrust to grow. We can all see that the old Google is not the current Google, and the pretense that it is only heightens our suspicions.
It's time for Google to admit what it does and to act consistently on its policies (or withdraw policies it doesn't intend to enforce). That honesty will help stem the loss of trust. People know that companies exist to make money, but they need to know the true nature of the relationship they're entering so they don't end up feeling misled. We all know the promises that the banks, airlines, insurance companies, cellular providers, and cable companies make aren't real, and they routinely mislead us on pricing and services -- so we don't trust them. Does Google really want to be like those industries?
Trust comes from honesty, and the key to honesty is to be forthright. Google doesn't seem to understand that yet.
I never trusted Google from day one. It’s nice to know that people are starting to catch on.
Not only do you get unwanted ads or, worse, get redirected to malware websites, but the extension can be rewritten to send phishing ads in your name to your contact lists.
It’s still beyond me why people moved away from Firefox over some very vocal homosexuals pushing out a right-of-center CEO and moved towards Google’s Chrome browser which is basically an intelligence gathering platform.
Chrome is a virus, IMO. They glitz it up with “cool features” and promote extension/add-on development for “privacy” tools, but we all know what’s really going on.
As a test, I took a PC with Windows 7, installed Chrome, and put the machine behind a proxy that logged all of the traffic. (This could also be accomplished through netstat, but I had a proxy server available, which makes it easier.) Chrome had 40% more established connections to Google (straight out of the box, no extensions or add-ons) than Firefox did. Further, many of the IPs were not configured with reverse lookups, meaning Google either intentionally obfuscated the endpoints or was negligent in configuring its DNS. (A traceroute revealed the traffic going through Google's infrastructure portals but timing out thereafter.)
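The connection-counting half of that test doesn't strictly need a proxy. Below is a minimal, hypothetical Python sketch of the idea -- the function names and the sample netstat text are my own illustration, not the commenter's actual tooling. It counts ESTABLISHED connections per remote IP from `netstat -n`-style output and can check each IP for a reverse-DNS (PTR) record, which is the "not configured with reverse lookups" observation above:

```python
import socket
from collections import Counter

def count_established(netstat_output):
    """Count ESTABLISHED TCP connections per remote IP.

    Parses `netstat -n`-style text. On lines whose state is
    ESTABLISHED, the remote address is the second-to-last field
    (true of both the Windows and Linux netstat formats).
    """
    counts = Counter()
    for line in netstat_output.splitlines():
        parts = line.split()
        if len(parts) >= 4 and parts[-1] == "ESTABLISHED":
            remote_ip = parts[-2].rsplit(":", 1)[0]  # strip the port
            counts[remote_ip] += 1
    return counts

def reverse_lookup(ip):
    """Return the PTR hostname for an IP, or None if none is configured."""
    try:
        return socket.gethostbyaddr(ip)[0]
    except OSError:  # no PTR record, or unresolvable/bad address
        return None

# Demo on captured netstat-style text; in practice you would feed in
# the live output of `netstat -n` (e.g. via subprocess).
sample = """\
TCP    192.168.1.2:50000    216.58.4.78:443      ESTABLISHED
TCP    192.168.1.2:50001    216.58.4.78:443      ESTABLISHED
TCP    192.168.1.2:50002    93.184.216.34:80     TIME_WAIT
"""
for ip, n in count_established(sample).most_common():
    print(n, "connections to", ip)  # prints: 2 connections to 216.58.4.78
```

Running `reverse_lookup` over the counted IPs and flagging the `None` results would reproduce the reverse-lookup check; repeating the count with Firefox as the only running browser gives the comparison baseline.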
I don’t trust Google. I don’t want to do business with them. I intentionally block all of their infrastructure tools on my personal machines. And I will advocate against them until I’m blue in the face. I don’t believe they have anyone’s interests at heart except their own.