Free Republic
Browse · Search
General/Chat
Topics · Post Article


Microsoft manager admits to copying 'Mac look and feel'
tuaw ^

Posted on 11/11/2009 9:13:06 PM PST by Gomez

We've been saying it for years, but everyone thought that we Mac-heads had a chip on our shoulder from the once-sour relationship with Microsoft. Finally, however, the truth has come to light; a group manager from Microsoft has gone on record and admitted the source of some of Windows 7's prettier bits and pieces.

Simon Aldous sat down with PCR for an interview and when he was asked to comment on the agility of Redmond's latest operating system, he had this to say:

"What we've tried to do with Windows 7 – whether it's traditional format or in a touch format – is create a Mac look and feel."
While I am glad that people at Microsoft are finally accepting the superiority of OS X, it still stung a little when Simon started to backtrack saying that Vista was more stable than OS X.

Update: It looks like the Windows team had a few things to say about Simon's earlier remarks noting that "his comments were inaccurate and uninformed." It is, in my opinion, difficult to deny that the OS X dock has had a positive impact on how people use their computers whether they be PCs or Macs. That being said, I'm glad Microsoft didn't take this one lying down.


TOPICS: Computers/Internet
KEYWORDS: ilovebillgates; iwanthim; iwanthimbad; maccult; microsoftfanboys; windolts; wintrolls
To: politicket

Aww, is the show over?


81 posted on 11/12/2009 11:16:23 AM PST by Terpfen (FR is being Alinskied. Remember, you only take flak when you're over the target.)
[ Post Reply | Private Reply | To 80 | View Replies]

To: Terpfen
Aww, is the show over?

Clever...

Now we're up to three posts by you without addressing the concept of unpowered line-in ports.

I hope others reading this thread can now see through the shallow veneer of Mac zombies when they're confronted with reality.

82 posted on 11/12/2009 11:19:48 AM PST by politicket (1 1/2 million attended Obama's coronation - only 14 missed work!)
[ Post Reply | Private Reply | To 81 | View Replies]

To: Boucheau
It’s called Google. Enjoy.

So the virus you are cleaning off of all those Macs is called "Google?"

You keep trying to make others do your work for you as if it COULD be proved by a simple Google search.

YOU made an extraordinary claim... about all those Macs your shop supposedly spends time cleaning of the virus infections you claim exist. You are the one spreading the falsehoods. Therefore it is up to YOU to prove your assertions. I know you can't.

I have challenged YOU, not anyone else, to NAME THE VIRUSES YOU CLAIM EXIST. If you spend as much time as you claimed cleaning up those Macs, then it should be drop dead easy for you to name the viruses off the top of your head. You haven't and are now dancing around trying to avoid being labeled a liar.

I, on the other hand, working from what I know to be fact, have claimed there ARE NONE. Aside from two families of Trojan Horse applications, which are easy to detect and avoid (Snow Leopard will warn you when you download either of them), there is little malware in the wild. There have been a few "proof-of-concept" virus candidates put forward in computer security labs... but all of them have failed to work... and none of them are in the wild to "infect" any Macs. There are no self-replicating, self-transmitting, self-installing VIRUSES for Mac OS X in the wild... and there never have been.

Unless you come up with your list of viruses you, and you alone, claim you clean off of all those Macs, I will call you what you are... a LIAR.

Come on, Boucheau, provide the list... viruses that are actively infecting OS X Macs... now. Or admit you are a liar.

You made the claim. You prove it.

83 posted on 11/12/2009 11:22:48 AM PST by Swordmaker (Remember, the proper pronunciation of IE is "AAAAIIIIIEEEEEEE!)
[ Post Reply | Private Reply | To 48 | View Replies]

To: politicket

Why should I address something that isn’t my job? LOL.


84 posted on 11/12/2009 11:55:39 AM PST by Terpfen (FR is being Alinskied. Remember, you only take flak when you're over the target.)
[ Post Reply | Private Reply | To 82 | View Replies]

To: Terpfen
Why should I address something that isn’t my job? LOL.

You dance better than Fred Astaire.

You'd make a good politician the way you seek to avoid the challenge brought before you.

85 posted on 11/12/2009 12:17:53 PM PST by politicket (1 1/2 million attended Obama's coronation - only 14 missed work!)
[ Post Reply | Private Reply | To 84 | View Replies]

To: politicket

Actually, since you haven’t caught on, I’m merely reciprocating the silly attitude you showed to your employee.


86 posted on 11/12/2009 1:18:57 PM PST by Terpfen (FR is being Alinskied. Remember, you only take flak when you're over the target.)
[ Post Reply | Private Reply | To 85 | View Replies]

To: FredZarguna
It's called Objective-C.

It's a good language for desktop and mobile apps with a graphical user interface. In combination with the Cocoa frameworks, it is the preferred language for Mac and iPhone software development. It can interoperate with C++ code or whatever. Judging from the number of new iPhone apps at the iTunes Store, it is becoming a popular language.

The stupidest language ever was probably plain old "C" language, which, for example, had to scan an entire string to return a character count. Its weakness as a language has been the source of many security problems, especially on Windows.

87 posted on 11/12/2009 1:29:40 PM PST by HAL9000 ("No one made you run for president, girl."- Bill Clinton)
[ Post Reply | Private Reply | To 11 | View Replies]

To: HAL9000
The stupidest language ever was probably plain old "C" language...

Good grief...

Do you realize just how pathetically ignorant that sounds?

C was developed in 1972. Would you kindly point out a competing systems language from the same timeframe which is superior to C? ...and how and why it is?

Why would millions of professional programmers STILL use C for important projects if it's "the stupidest language ever"? Is it because YOU are the only one who has everything figured out?

Please, give us a break.

88 posted on 11/12/2009 1:40:34 PM PST by TChris ("Hello", the politician lied.)
[ Post Reply | Private Reply | To 87 | View Replies]

To: FredZarguna
What I will agree to is that Apple has better marketing and better advertising, and the fact that they're able to sell Intel equipment for much higher prices than their competition is more a tribute to P.T. Barnum's knowledge of human nature than it is to any real innovation: I've had Zunes and I've had iPod Touches and frankly, the Creative Zen -- which no longer exists -- was a better value than either.
I'll speak for myself: I got sick and tired of being paranoid about viruses. And even if you have an antivirus program, common sense tells you that for the AVP to have a virus in its database, the virus has to have been on the loose compromising machines for a non-zero period of time - so best practice cannot guarantee that your Windows box doesn't get hit. Hence, the paranoia, which actually increases your vulnerability to trojan attacks.

A memorable line in a television play I once saw said, "100% safe, nothing is." But I'd rather take my chances on an industrial-strength OS (Unix) than on the Antivirus band-aid on an OS which was not designed from the ground up with the possibility of bad software in mind. I suspect Win7 will still have virus problems due to its compatibility with prior versions of Windows which notoriously did have that problem. And, antivirus costs computer cycles even if you get the software and database updates free.

Consequently I like the Mac. It costs more, at least on a first-cost basis, but I am satisfied with the value proposition. And obviously I'm not alone. Like any product which you pay more for, you are the more determined to get the value you paid for - and that means that Macs are kept in service longer (and also, admittedly, that an element of denial of anything which questions that value tends to creep into the owner's thinking. So be it; that process can also work the other way around).


89 posted on 11/12/2009 1:45:06 PM PST by conservatism_IS_compassion (Anyone who claims to be objective marks himself as hopelessly subjective.)
[ Post Reply | Private Reply | To 70 | View Replies]

To: HAL9000
It's a good language for desktop and mobile apps with a graphical user interface.

No, it's not. It's awful. I've written apps in it for feed inventory programs, and it's hideous.

In combination with the Cocoa frameworks, it is the preferred language for Mac and iPhone software development.

Preferred because the fascists at Apple essentially block all other languages on the iPhone. How long would it survive if the JVM were permitted on the iPhone or iPod/Touch? About a week. As for Mac Apps, I believe you will find the big name apps on the Mac are written mostly in C++.

It can interoperate with C++ code or whatever.

No thanks to Apple. C++ interoperability, as far as I know, is a GNU innovation.

Judging from the number of new iPhone apps at the iTunes Store, it is becoming a popular language.

It's so popular that the estimated number of developers is <1%. Nobody sensible would code in it on the iPhone if there were allowed alternatives (although Novell has just ported C# to the platform.)

The stupidest language ever was probably plain old "C" language,

Not even close. Try COBOL, or any variant of FORTRAN <= FORTRAN IV. Or any version of Visual Basic in any implementation. I could go on and on.

had to scan an entire string to return a character count.

It is trivial to prove that there is, in general, no other way to do this, so I'm not sure what point you think you're making. That the compiler should do that for you? But it's still being done whether you think so or not. If you're of the opinion that the language should do it for you, and then store the length of the string, you would be violating one of the design objectives of C, which was to incorporate as few assumptions about application specific data as possible. There are plenty of libs for "C" that can do this. [Remarkably, even fairly talentless programmers can do it themselves.]

Its weakness as a language has been the source of many security problems, especially on Windows.

What nonsense. This is like the claim that we need to regulate guns because an Islamic terrorist has just murdered 13 people in Texas, or that the government gets to regulate or even outlaw what vegetables you're allowed to have in your pocket because some people smoke them. C doesn't kill programs, programmers kill programs. The weakness of the "language" is a weakness in the IQ and/or discipline of people writing code. 90% of coders shouldn't be. Unfortunately, there's too much code for the other 10% to write.

"Especially on Windows" is even sillier. What do you think Unix and its variants are written in? Objective-C?

90 posted on 11/12/2009 1:56:27 PM PST by FredZarguna (It looks just like a Telefunken U-47. In leather.)
[ Post Reply | Private Reply | To 87 | View Replies]

To: TChris

HAL's comment, which I address fully elsewhere, is almost as st00pid as Objective-C.


91 posted on 11/12/2009 1:58:35 PM PST by FredZarguna (It looks just like a Telefunken U-47. In leather.)
[ Post Reply | Private Reply | To 88 | View Replies]

To: coconutt2000
Office 2007 is a reason why OpenOffice is becoming more popular.

I agree. The docx format was an attempt to break backward compatibility and force an upgrade. The interface is bad, and I get calls from coworkers who have been using it for a couple of years now and still can't find things they used to do in 97-03.

92 posted on 11/12/2009 2:00:42 PM PST by Richard Kimball (We're all criminals. They just haven't figured out what some of us have done yet.)
[ Post Reply | Private Reply | To 71 | View Replies]

To: conservatism_IS_compassion
Your claim about vulnerabilities is, in fact, preposterous. If you're not worried anymore, then your Mac is about to become some North Korean or Chinese kid's zombie -- and you may never even know it.

As Apple market share increases, the number of exposed vulnerabilities will also. There are plenty of Unix exploits. Please have a look here. I have to deal with them on Unix systems on an (almost) daily basis. I love the ix variants, but anybody who is not patching routinely is vulnerable. I applied six patches to the Ubuntu 9.04 installation I am typing on at this minute just last night; and I apply patches to my linux box(es) every day, so it's not as if I was just catching-up.

93 posted on 11/12/2009 2:11:54 PM PST by FredZarguna (It looks just like a Telefunken U-47. In leather.)
[ Post Reply | Private Reply | To 89 | View Replies]

To: Swordmaker
As Apple market share increases, the number of exposed vulnerabilities will also. There are plenty of Unix exploits. Please have a look here. I have to deal with them on Unix systems on an (almost) daily basis. I love the ix variants, but anybody who is not patching routinely is vulnerable. I applied six patches to the Ubuntu 9.04 installation I am typing on at this minute just last night; and I apply patches to my linux box(es) every day, so it's not as if I was just catching-up.
Ping.

94 posted on 11/12/2009 3:30:01 PM PST by conservatism_IS_compassion (Anyone who claims to be objective marks himself as hopelessly subjective.)
[ Post Reply | Private Reply | To 93 | View Replies]

To: FredZarguna; conservatism_IS_compassion
As Apple market share increases, the number of exposed vulnerabilities will also.

First of all, "vulnerabilities" do not always translate into "exploits."

Secondly, we've been hearing that same "any time now" mantra from Windows fans for the past eight years and it has yet to come true. From my viewpoint, eight years and counting of no malware worries counts for a lot. I know my blood pressure is better for it.

Thirdly, Fred, exactly what is the magic number of Macs that will suddenly cause the cracker community to sit up and pay attention to all those Macs out there running without antivirus or antispyware of any kind? Many of them operate without even a firewall. Is it 5 million? 10? 20? 30? There are currently more than 40 million OS X Macs in the wild, 99% of them running bare naked, unprotected by anti-malware applications. Why have they not been successfully targeted yet?

It can't be because they are obscure... obscurity has not stopped crackers from writing viruses and worms that have targeted far smaller populations of vulnerable machines. The Witty Worm was written to target all 12,000 or so unpatched BlackIce firewall-protected PCs, and within 45 minutes of being released into the wild, all 12,000 PCs were infected. Viruses were written to infect 30,000 of one particular model of smart phone... and got all of them. Someone even wrote a virus to infect iPods that had been converted to run Linux... all couple dozen of them. Why, then, are 40,000,000 smug sitting ducks being ignored???

It has been reported that a spambot network of just 2,000 machines is worth $50,000 on the black market for just a two-week window of use. If we assume that all of those Macs could be converted into 2,000-machine spambots, that's 20,000 spambots that would be worth $1 billion on the black market before they could be patched. Why has no one mined this very lucrative field before? That's a lot of cash going begging.

The real reason is the extreme difficulty of finding a viable vector to spread OS X malware. Currently, the most dangerous are trojans that depend on social engineering to persuade a user into installing a malicious app and running it. That is limited by proper computing practices and the use of standard accounts rather than administrator accounts. There are currently only two families of viable Trojans, both of them URL hijackers, and OS X Snow Leopard will warn users when they attempt to download one of the variants.

95 posted on 11/12/2009 4:33:47 PM PST by Swordmaker (Remember, the proper pronunciation of IE is "AAAAIIIIIEEEEEEE!)
[ Post Reply | Private Reply | To 93 | View Replies]

To: FredZarguna
No, it's not. It's awful. I've written apps in it for feed inventory programs, and it's hideous.

Well, that's one man's opinion. Maybe object-oriented design isn't your specialty. Apple's development tools get generally good reviews in the industry - and they're free with each Mac sold.

Apple's market capitalization will exceed Microsoft's in the foreseeable future. Be prepared.

96 posted on 11/12/2009 6:27:05 PM PST by HAL9000 ("No one made you run for president, girl."- Bill Clinton)
[ Post Reply | Private Reply | To 90 | View Replies]

To: FredZarguna

“C++ usage is dropping because most programmers simply aren’t bright enough to code in it, and there are simpler viable alternatives for average coders now available.”

Exactly. You think you’re making a good point, but it’s one against C++. As we all know, any Turing-complete language is capable of expressing any algorithm. The tradeoffs involve complexity, readability, and expressiveness.

C++ fails in terms of excessive complexity, poor readability, and fragility in terms of its low-level object oriented approach. Only one fairly popular operating system was ever written in object-oriented C++ - BeOS. Due to the nature of vtables, the OS designers had to reserve slots in all the OS objects so adding methods later wouldn’t require all programs to be recompiled (and redistributed) for a new OS version. BeOS, for many reasons including the C++ dependencies, was a failure.

On the other hand, MacOS which heavily relies on Obj-C, is a healthy and growing force in computing.

“If most coders were not functioning at the level of 1950’s COBOL programmers, Java, to take just one example, would never have existed.”

Nonsense. You picked the worst possible example. COBOL was, and is, a dumbed-down language for business applications. While quite successful in some ways, most programmers found it distasteful due to its simplicity and verbosity. Java, on the other hand, was conceived almost entirely as a better alternative to C++ with similar syntax and has been stunningly successful.

“Please cite one major programming feature in the C++ standard that Visual C++ doesn’t conform to. Just one actually used, please.”

Looking at the official Microsoft page (possibly not the best source of Visual C++ nonconformance issues), they list eight areas where it's not compatible with the ANSI standard. I'll grant you that's a huge improvement over the last version I had the misfortune to encounter, VC 2002. It's still pretty pathetic that it took a company with Microsoft's resources so long to produce a conforming compiler, and it's a testament to the unnecessary complexity of the C++ specification.

“I’ve written code for money in just about every language there is, and there is no other with the expressive power or capability of C++.”

Nonsense. You’re stuck in an analog of the “if all I have is a hammer, every problem is a nail” truism.

Java, for instance, has surpassed C++ as an implementation language because its design was largely based on C++ shortcomings.

“And I have used just about everybody’s C++ compiler. They’re ALL conformant for practical purposes.”

Meaning if you stick to a subset, as the Mozilla team recommends for instance, you’re OK. But of course you’ve lost whatever expressiveness lies between the ANSI standard and whatever subset you choose to use.

“Please do cite for me ONE large scale application actually written in Objective-C. It’s a toy language, for applets.”

I will say you have a distinct flair for making strong claims that simply reveal your ignorance.

How about the entire MacOS X GUI framework? That's the equivalent of the entire Windows GUI API. Then there's Interface Builder, the MacOS X IDE, iTunes, iPhoto, the OmniWeb browser, and on and on. All Cocoa applications (modern MacOS applications exploiting all MacOS features) use Obj-C. Even the Finder, the linchpin of the MacOS X GUI, was moved to Cocoa in Snow Leopard (MacOS X 10.6).

“No one is going to code a major IT project or even a serious App in Objective-C. If you think even the major Mac applications are coded in Objective-C, you’re kidding yourself.”

Someone is kidding himself, all right. Go have a long look in the mirror.

You should really read more about MacOS X and Obj-C if you’re going to try and bash them effectively.


97 posted on 11/12/2009 7:05:55 PM PST by PreciousLiberty
[ Post Reply | Private Reply | To 74 | View Replies]

To: Swordmaker

Seems like a weird thing to do, what with Mac market share being a “rounding error” according to Ballmer. ;’)


98 posted on 11/12/2009 7:18:42 PM PST by SunkenCiv (https://secure.freerepublic.com/donate/__Since Jan 3, 2004__Profile updated Monday, January 12, 2009)
[ Post Reply | Private Reply | To 95 | View Replies]

To: FredZarguna
> Meme, as a concept with any scientific validity, is roughly as idiotic as the phlogiston theory of combustion. Except it doesn't even have the excuse that it was theorized in the 17th century.

Who said anything about "scientific validity"?? It's just a flippin' word of recent coinage, which happened to cover the particular concept I wanted to convey -- that of an idea which is transmitted from person to person (generally without the benefit of proof or questioning). I could have used another word, but I didn't happen to. Calm down...

> There are actual words you should consider that convey actually valid concepts: cliche and nostrum for example.

A "cliche" is a different thing from what I wanted to convey (which had to do with transmission). "Nostrum" is better, since it conveys falsehood (or at least exaggeration), but it likewise doesn't convey transmission.

Maybe I should have used "gossip".

> Nothing personal, but anyone using the word "meme" is stamping himself as an intellectual lightweight, and a poseur, plain and simple.

Ah, well, it's a free country, so if you want to make that kind of judgment, on the basis of one word, you're welcome to. OTOH, you're wrong, so I'll ask you to keep further explication of your position on this matter to yourself, since your name-calling is indeed rather personal, despite your disclaimer. Fortunately, I don't take it personally, because I know better. ;-)

99 posted on 11/12/2009 7:29:08 PM PST by dayglored (Listen, strange women lying in ponds distributing swords is no basis for a system of government!)
[ Post Reply | Private Reply | To 64 | View Replies]

To: PreciousLiberty
The ignorance of computer programming betrayed by your response is really just too huge for me to disabuse you of all your idiotic ideas. I'll just hit the highlights.

Objective-C: I've coded in it, and currently maintain shipping apps in it. It's crap. It's been extended to Objective-C++ because of its shortcomings. It would not be used on iPhone at all if Apple didn't require its use. If Sun gets a JVM that the Apple fascists permit to run on their little hot-house orchid, or the C# implementation is at all decent, it's history on the iPhone, and the Apple people know it. Why do you think they don't permit any runtime but their own?

BeOS: blather and nonsense. Please don't post crap like this thinking I'm some code-slave in the next cubicle you can spin your nonsense to. There are lightweight Unix implementations written entirely in C++, there are large parts of the NT code base and supporting apps in C++, and there are large parts of Unix (like CDE) written in C++.

Java: Please actually read what the inventor of Java has written about it. It's a dumbed-down version of C++, nothing more. Although he -- like you -- believes this is a virtue.

Java, for instance, has surpassed C++ as an implementation language because its design was largely based on C++ shortcomings.

An editorial opinion of Java's popularity, that has no basis in reality. Java is popular because 70% of programmers fall within one standard deviation of average. There is nothing that can be done in Java that cannot be done better in C++. And there are an enormous number of things: real-time control, operating systems, compiler design, libraries where performance actually matters -- like much of the Java IO -- that cannot possibly be done in Java. Talk about only having a hammer. Java is designed to do one thing: allow a new generation of COBOL programmers to continue coding in mediocrity.

C++ conformance: Yawn. Every compiler has lists of implementation specific and non-conforming behavior. I asked you for something specific. You last coded C++ in VC2002. Tell me, what language feature of VC C++ in 2001 did you want to use that you couldn't? Partial template specialization? Please enlighten me about how the failure of partial template specialization hindered you in your career. Especially when no other language actually being used even had generics at that point.

100 posted on 11/12/2009 8:05:49 PM PST by FredZarguna (Real men don't let hardware manufacturers dictate their language choices)
[ Post Reply | Private Reply | To 97 | View Replies]



Disclaimer: Opinions posted on Free Republic are those of the individual posters and do not necessarily represent the opinion of Free Republic or its management. All materials posted herein are protected by copyright law and the exemption for fair use of copyrighted works.


FreeRepublic, LLC, PO BOX 9771, FRESNO, CA 93794
FreeRepublic.com is powered by software copyright 2000-2008 John Robinson