
Apple Previews Mac OS X Snow Leopard to Developers
Apple.com ^ | June 9, 2008 | Apple

Posted on 06/10/2008 6:23:21 AM PDT by conservatism_IS_compassion

SAN FRANCISCO—June 9, 2008—Apple® today previewed Mac OS® X Snow Leopard, which builds on the incredible success of OS X Leopard and is the next major version of the world’s most advanced operating system. Rather than focusing primarily on new features, Snow Leopard will enhance the performance of OS X, set a new standard for quality and lay the foundation for future OS X innovation. Snow Leopard is optimized for multi-core processors, taps into the vast computing power of graphic processing units (GPUs), enables breakthrough amounts of RAM and features a new, modern media platform with QuickTime® X. Snow Leopard includes out-of-the-box support for Microsoft Exchange 2007 and is scheduled to ship in about a year.

“We have delivered more than a thousand new features to OS X in just seven years and Snow Leopard lays the foundation for thousands more,” said Bertrand Serlet, Apple’s senior vice president of Software Engineering. “In our continued effort to deliver the best user experience, we hit the pause button on new features to focus on perfecting the world’s most advanced operating system.”

Snow Leopard delivers unrivaled support for multi-core processors with a new technology code-named “Grand Central,” making it easy for developers to create programs that take full advantage of the power of multi-core Macs. Snow Leopard further extends support for modern hardware with Open Computing Language (OpenCL), which lets any application tap into the vast gigaflops of GPU computing power previously available only to graphics applications. OpenCL is based on the C programming language and has been proposed as an open standard. Furthering OS X’s lead in 64-bit technology, Snow Leopard raises the software limit on system memory up to a theoretical 16TB of RAM.

Using media technology pioneered in OS X iPhone™, Snow Leopard introduces QuickTime X, which optimizes support for modern audio and video formats resulting in extremely efficient media playback. Snow Leopard also includes Safari® with the fastest implementation of JavaScript ever, increasing performance by 53 percent, making Web 2.0 applications feel more responsive.*

For the first time, OS X includes native support for Microsoft Exchange 2007 in OS X applications Mail, iCal® and Address Book, making it even easier to integrate Macs into organizations of any size.

*Performance will vary based on system configuration, network connection and other factors. Benchmark based on the SunSpider JavaScript Performance test on an iMac® 2.8 GHz Intel Core 2 Duo system running Mac OS X Snow Leopard, with 2GB of RAM.


TOPICS: Computers/Internet
KEYWORDS: apple; mac; osx
Doesn't particularly look like a must-have item like Leopard - if you have an iMac or a notebook, there's certainly no danger that you will need access to 16TB of RAM!

Sounds like it'll be a while before OS X makes much of a splash again.

1 posted on 06/10/2008 6:23:21 AM PDT by conservatism_IS_compassion

To: Swordmaker

Is Steve Jobs letting Microsoft catch up on features, or is he sandbagging?


2 posted on 06/10/2008 6:25:42 AM PDT by conservatism_IS_compassion (Thomas Sowell for vice president.)

To: conservatism_IS_compassion

No, he’s releasing “Snow” Leopard so he can pick up some pocket change to subsidize his “snow” addiction.


3 posted on 06/10/2008 6:39:52 AM PDT by rivercat (The first thing we do, let's kill all the lawyers. - William Shakespeare)

To: conservatism_IS_compassion
Something's not right here. I calculate that 16 TB can be reached with 44 bits. A 64 bit address space spans approximately 18 million TB.
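Checking that arithmetic with a quick Python sketch (binary units, 1 TB = 2^40 bytes, as the poster assumes):

```python
# How many address bits 16 TB needs, and what a full 64-bit space spans.
TB = 2 ** 40  # binary terabyte

bits_for_16tb = (16 * TB).bit_length() - 1   # exact log2 of 16 TB
space_64bit_in_tb = 2 ** 64 // TB            # full 64-bit space, in binary TB

print(bits_for_16tb)      # 44
print(space_64bit_in_tb)  # 16777216
```

(2^64 bytes is about 16.8 million binary TB; dividing by 10^12 instead gives the poster's roughly 18 million decimal TB.)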
4 posted on 06/10/2008 7:11:08 AM PDT by Steely Tom (Without the second, the rest are just politician's BS.)

To: conservatism_IS_compassion
Is Steve Jobs letting Microsoft catch up on features, or is he sandbagging?

It looks like he's taking a release cycle to just concentrate on making things better and faster under the hood. It also looks like there are a lot of programming libraries coming to help developers easily take advantage of all of the power of a modern machine, like multiple cores and the GPU. If it's easy for developers, it means better, more powerful software for us.

Vista could definitely use that.

BTW, getting into OS X development I've noticed some things in applications. I got a library cataloging program for our obscene amount of books, movies, etc., and while using it I noticed all the cool things it did, and how they only had to leverage the built-in libraries to do them.
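The appeal of such libraries is easy to sketch with a task-parallel snippet. This is only the shape of the idea, done with Python's standard library - Grand Central is unreleased and its actual API is unknown, and `work` here is a made-up stand-in task.

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

# Describe independent tasks; let the runtime spread them across cores.
def work(task_id):
    return task_id, sum(range(task_id * 1000))

with ThreadPoolExecutor() as pool:
    futures = [pool.submit(work, i) for i in range(4)]
    results = dict(f.result() for f in as_completed(futures))

print(sorted(results))  # [0, 1, 2, 3] - all tasks done; completion order may vary
```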

5 posted on 06/10/2008 7:20:46 AM PDT by antiRepublicrat

To: conservatism_IS_compassion

From what I read, 10.6 (Snow Leopard) is supposed to be for the Intel chips only. The G4 and G5 are left out in the “cold”.


6 posted on 06/10/2008 7:35:36 AM PDT by CORedneck

To: antiRepublicrat

“It looks like he’s taking a release cycle to just concentrate on making things better and faster under the hood. It also looks like there are a lot of programming libraries coming to help developers easily take advantage of all of the power of a modern machine, like multiple cores and the GPU. If it’s easy for developers, it means better, more powerful software for us.”

Yes, as a developer this sounds pretty exciting. Writing software to really take advantage of multicore has been tough so far, it’ll be great if Apple can make it easier. Access to the GPU for compute power is a hot area too.

I’m very happy I made the move to Apple. I feel the added value MORE than makes up for any added cost. The great bundled software helps a lot! BTW, like a Mercedes or BMW, an Apple computer commands top resale value.


7 posted on 06/10/2008 7:42:08 AM PDT by PreciousLiberty

To: Steely Tom

“Something’s not right here. I calculate that 16 TB can be reached with 44 bits. A 64 bit address space spans approximately 18 million TB.”

The 16 TB limit is per process, and since it’s approximately 8,000 times larger than the memory in a decently equipped computer today it provides some serious headroom.

At current prices 16 TB of RAM runs approximately $280,000...also providing some headroom. ;-)
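Both headroom figures are easy to sanity-check (Python; the 2 GB baseline and the $280,000 total come from the posts above, not from current prices):

```python
GB = 2 ** 30
TB = 2 ** 40

limit = 16 * TB    # per-process limit quoted by Apple
typical = 2 * GB   # RAM in the benchmark iMac from the article

headroom = limit // typical             # how many times larger the limit is
price_per_gb = 280_000 / (limit // GB)  # implied cost per GB at the quoted total

print(headroom)                # 8192, i.e. "approximately 8,000 times"
print(round(price_per_gb, 2))  # 17.09 dollars per GB
```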


8 posted on 06/10/2008 7:48:46 AM PDT by PreciousLiberty

To: Steely Tom

The memory controllers and MMUs don’t support 64 bits of physical address space. Right now, I’m using a MacBook Pro with an Intel Core 2 Duo. While the OS and CPU support 64-bit virtual addresses, the physical address space supported is 40 bits, and the memory architecture tops out at 4GB.


9 posted on 06/10/2008 7:55:58 AM PDT by NVDave

To: antiRepublicrat

I agree — Apple has been pretty smart about how they’ve:

a) transitioned off the PPC platforms
b) created a new infrastructure
c) created a new app/programming framework (choosing something a bit more clueful than C++ as their default language)

As a guy who used to work on “big software” — I have to say that Apple is making some smart moves here. I’ve only dabbled with Cocoa so far, but I’m quite impressed by what I’ve seen to date.


10 posted on 06/10/2008 7:59:06 AM PDT by NVDave

To: conservatism_IS_compassion

Snow Leopard is a great strategy for Apple. It will improve OS X as an enterprise-class operating system and extend Apple’s lead over Microsoft in software technology.

I believe it will be a compelling upgrade for Intel-based Mac owners, especially MacBook owners who should see improved battery performance, thanks to the more efficient software.

Snow Leopard will also be great for independent developers, and will increase the availability of application software designed for Macs. Apple’s OS roadmap gives us assurance that our work today is a good investment in the future of the platform.


11 posted on 06/10/2008 8:01:40 AM PDT by HAL9000 ("No one made you run for president, girl."- Bill Clinton)

To: Steely Tom
16 TB can be reached with 44 bits. A 64 bit address space spans approximately 18 million TB.
Now that you mention it, that sounds exactly right . . . if you keep to the convention that 1K = 2^10 = 1024 rather than 1000, and 1K*1K = 1M, 1K*1M = 1G, 1K*1G = 1T, 1K*1T = 1P, and 1K*1P = 1E, then the full 64-bit space should read not 16T but 16E.

Unless they contemplate doing something really weird with the other 20 bits of available address space . . . like using the 20 most significant bits to assign the lion's share of the address space to mass memory other than RAM? That probably wouldn't do much that virtual memory concepts in Unix, I suppose, don't already do anyway? (source: http://www.sengpielaudio.com/ConvPrefe.htm)
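Walking the binary-prefix ladder makes the 16T-versus-16E point concrete; a minimal Python sketch:

```python
# Each binary prefix step multiplies by 2**10 = 1024.
PREFIXES = ["", "K", "M", "G", "T", "P", "E"]

def label(exp):
    """Express 2**exp as a small count plus a binary prefix."""
    step, rem = divmod(exp, 10)
    return f"{2 ** rem}{PREFIXES[step]}"

print(label(44))  # 16T - the limit Apple quotes
print(label(64))  # 16E - the full 64-bit address space
```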


12 posted on 06/10/2008 8:15:28 AM PDT by conservatism_IS_compassion (Thomas Sowell for vice president.)

To: PreciousLiberty
“Something’s not right here. I calculate that 16 TB can be reached with 44 bits. A 64 bit address space spans approximately 18 million TB.”
The 16 TB limit is per process,
. . . but that still seems arbitrary and not obviously necessary.
and since it’s approximately 8,000 times larger than the memory in a decently equipped computer today it provides some serious headroom.

At current prices 16 TB of RAM runs approximately $280,000...also providing some headroom. ;-)

Since RAM cost has been declining according to Moore's Law, I look at "headroom" on a log scale. It took 16 bits to address the 64K memories readily available in the early 1980s, and it takes 32 bits to address the 4Gig of RAM readily available now, one human generation later. Were Moore's Law to continue in effect for another human generation, that would suggest the need for about 48 bits to address the memory that would then be in common currency. And it would take another generation again to exhaust the full 64 bit address space.

Recall the big issue over the Y2K transition, and it gives one pause about assuming that Moore's Law will break down before reaching 44 bits of address space. Of course I have to admit that it would be no trick at all to go to 128-bit cores long before that - were there a reason - but still: what bang do they get for that buck? What it means is that your son could very easily see the time when an unnecessary software limitation creates a crisis in the operating system. Which, looked at in that way, is pretty optimistic after all. Why would OS X necessarily last two human generations?
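That log-scale argument can be sketched numerically. The doubling rate below is inferred from the post's own endpoints (64K around 1983, 4G around 2008), so the dates are illustrative, not predictions:

```python
# 16 address bits in 1983 to 32 in 2008: one extra bit every 25/16 years.
YEARS_PER_BIT = 25 / 16

def bits_needed(year):
    """Address bits implied by straight-line (log-scale) extrapolation."""
    return 16 + (year - 1983) / YEARS_PER_BIT

for year in (2008, 2033, 2058):
    print(year, round(bits_needed(year)))  # 32, then 48, then 64 bits
```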


13 posted on 06/10/2008 8:45:40 AM PDT by conservatism_IS_compassion (Thomas Sowell for vice president.)

To: conservatism_IS_compassion

“Which, looked at in that way, is pretty optimistic after all. Why would OS X necessarily last two human generations?”

Exactly. My point was that 16 TB of memory per process represents considerably more room for expansion than we’ve had in previous memory addressing jumps. Many other things are more likely to be an issue than this limitation.


14 posted on 06/10/2008 9:04:40 AM PDT by PreciousLiberty

To: 1234; 50mm; 6SJ7; Abundy; Action-America; acoulterfan; aristotleman; af_vet_rr; Aggie Mama; ...
OSX.6—Snow Leopard previewed at WWDC—PING!




If you want on or off the Mac Ping List, Freepmail me.

15 posted on 06/10/2008 9:22:42 AM PDT by Swordmaker (Remember, the proper pronunciation of IE is "AAAAIIIIIEEEEEEE!)

To: CORedneck
From what I read, 10.6 (Snow Leopard) is supposed to be for the Intel chips only. The G4 and G5 are left out in the “cold”.

There are PowerPC drivers in Snow Leopard according to some in the know. . . and the G5 is fully 64 bit. . . so they may be dropping support for the G3 and G4. Time will tell.

16 posted on 06/10/2008 9:26:23 AM PDT by Swordmaker (Remember, the proper pronunciation of IE is "AAAAIIIIIEEEEEEE!)

To: Swordmaker
g3 iphone (news search, hits!)
Google

17 posted on 06/10/2008 9:45:25 AM PDT by SunkenCiv (https://secure.freerepublic.com/donate/_________________________Profile updated Friday, May 30, 2008)

To: HAL9000
Snow Leopard is a great strategy for Apple. It will improve OS X as an enterprise-class operating system and extend Apple’s lead over Microsoft in software technology.

I believe it will be a compelling upgrade for Intel-based Mac owners, especially MacBook owners who should see improved battery performance, thanks to the more efficient software.

Snow Leopard will also be great for independent developers, and will increase the availability of application software designed for Macs. Apple’s OS roadmap gives us assurance that our work today is a good investment in the future of the platform.

. . . so you see Snow Leopard more from an SDK point of view - you see it empowering you to efficiently exploit all the number-crunching capability of the Mac hardware, so that you can develop good software with less pain. And the more developers see that opportunity, the more of a critical mass of developers will assemble around the Mac, and that will ensure your investment of your own effort into the platform, since it will be "where it's at" in software development. This then sells more Macs to enterprises, and to the public. Hmmm!

In my present superannuated state, I buy a Mac and the only apps I'm thinking of come with the OS or iWork. The one unmet use I see for the tremendous (by historical standards) computational power you are working to harness is speech processing. I'm familiar with Nuance's Dragon NaturallySpeaking, and that seems to be getting fairly mature. But it's not on the Mac, SFAIK. And I guess that I will always give at most two cheers to an OS which can't be run by voice. I hope that Snow Leopard will induce Nuance to port their product to the Mac. Because in the long run the keyboard must inevitably be looked back on as a clumsy, anachronistic input device.


18 posted on 06/10/2008 9:47:00 AM PDT by conservatism_IS_compassion (Thomas Sowell for vice president.)

To: conservatism_IS_compassion
I really, really wish they would have called it Liger.


19 posted on 06/10/2008 10:00:21 AM PDT by lesser_satan (Cthulu '08! Why vote for the lesser evil?)

To: PreciousLiberty
Many other things are more likely to be an issue than this limitation.

Like a motherboard that can take 16 TB of memory? Right now we're up to 4 GB chips, 16 slots on servers, 8 for high-end desktops. That's only 64/32 GB. Slots aren't likely to proliferate much more (except on very large servers) simply due to room on the board, so we're waiting for 1 TB memory chips, 2 TB chips for desktops. That'll be a while.

I know, virtual memory technically makes this possible today, but that much address space is usually reserved for high-performance applications, and it would be counter-productive to do that in virtual memory.
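The board-level arithmetic works out like this (Python; the slot counts and DIMM sizes are the ones assumed in the post):

```python
GB = 2 ** 30
TB = 2 ** 40

dimm = 4 * GB       # largest DIMM assumed above
server = 16 * dimm  # 16 slots
desktop = 8 * dimm  # 8 slots

print(server // GB, desktop // GB)  # 64 32 - GB per box today

# DIMM sizes needed to reach 16 TB without adding slots:
print(16 * TB // 16 // TB, 16 * TB // 8 // TB)  # 1 TB (server), 2 TB (desktop)
```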

20 posted on 06/10/2008 10:11:35 AM PDT by antiRepublicrat

To: conservatism_IS_compassion

I know this is OT but did anyone else think that Steve Jobs didn’t look well yesterday? He looked thin to the point of appearing gaunt. I hope he’s all right.


21 posted on 06/10/2008 10:11:51 AM PDT by jalisco555 ("My 80% friend is not my 20% enemy" - Ronald Reagan)

To: CORedneck; Swordmaker
What I read on 10.6 (Snow Leopard) it is suppose to be for the Intel chips only. The G4 and G5 are left out in the “cold”.

They already left my iMac G4 out in the cold. OS X 10.5 doesn't work on G4s below 867 MHz and my 17" iLamp G4 is 800 MHz.

I am running Leopard on my 17" MacBook Pro 2.5 GHz Intel Core 2 Duo and love it.

22 posted on 06/10/2008 10:18:19 AM PDT by 50mm (Inside every cynical person, there is a disappointed idealist.)

To: PreciousLiberty
Access to the GPU for compute power is a hot area too.
I notice mention of that, and I have to ask: how does the processing speed of a GPU relate to that of a core of a CPU?

Is a GPU effectively an array processor, or suchlike?


23 posted on 06/10/2008 10:24:49 AM PDT by conservatism_IS_compassion (Thomas Sowell for vice president.)

To: Swordmaker
There are PowerPC drivers in Snow Leopard according to some in the know. . . and the G5 is fully 64 bit. . . so they may be dropping support for the G3 and G4. Time will tell.

Also, there will be people with dual-core G5s still on AppleCare until August 2009. It would be a crappy move to no longer support hardware still under warranty.
24 posted on 06/10/2008 10:25:08 AM PDT by Gomez (trainer of insects)

To: conservatism_IS_compassion
Mac OS® X Snow Leopard sounds like a mainframe Unix op/sys to me.

25 posted on 06/10/2008 10:33:36 AM PDT by Uri’el-2012 (you shall know that I, YHvH, your Savior, and your Redeemer, am the Elohim of Ya'aqob. Isaiah 60:16)

To: 50mm
They already left my iMac G4 out in the cold. OS X 10.5 doesn't work on G4's below 867MHz and my 17" iLamp G4 is 800 MHz.
I had the same thing - and was salivating over Leopard for so long before it came out that when they sprung that 867MHz requirement on me I jumped and got a 20" iMac to be able to use Leopard.

I gave my G4 to my daughter for the granddaughters - they love it. When I was visiting them recently and tried to use it, I found that I am thoroughly addicted to the 20" screen and to the expandable text windows in the Leopard version of Safari.


26 posted on 06/10/2008 10:40:11 AM PDT by conservatism_IS_compassion (Thomas Sowell for vice president.)

To: conservatism_IS_compassion

OS X has speech recognition built-in. I discovered this when playing chess. I don’t know if it’s as advanced as these other products, but it’s there.


27 posted on 06/10/2008 10:46:29 AM PDT by antiRepublicrat

To: conservatism_IS_compassion

I’m just trying to figure out the economics of this. If I have an existing Mac and want to connect to an Exchange server, I will have to purchase a new OS? It isn’t a service pac or free upgrade?


28 posted on 06/10/2008 10:47:08 AM PDT by js1138

To: conservatism_IS_compassion

Scotty: “Hello, computer!”


29 posted on 06/10/2008 10:58:14 AM PDT by Erasmus (I invited Benoit Mandelbrot to the Shoreline Grill, but he never quite made it.)

To: conservatism_IS_compassion
Is a GPU effectively an array processor, or suchlike?

CPUs can do pretty much anything, but not so quickly. GPUs can do a limited set of computations, but they're highly parallel vector/matrix processors with massive memory bandwidth. So for those calculations that they can do, they are vastly more powerful than a CPU. Check out the Folding@Home site and see the relative speed of the GPUs involved in the effort.
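The shape of a GPU-friendly workload, sketched on the CPU in plain Python: a tiny "kernel" applied independently to every element, which is exactly what lets a GPU's many lanes run it in parallel. (`kernel` is a made-up example, not an OpenCL call.)

```python
# No element depends on any other, so every evaluation could run on its own
# GPU lane; a real implementation would hand this loop to OpenCL or CUDA.
def kernel(v):
    return v * v + 1.0

data = range(8)
result = [kernel(v) for v in data]

print(result)  # [1.0, 2.0, 5.0, 10.0, 17.0, 26.0, 37.0, 50.0]
```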

30 posted on 06/10/2008 10:59:12 AM PDT by antiRepublicrat

To: js1138

OS X has had three service packs so far, as we’re at 10.5.3 now. This will be the next version of the OS, probably $129. However, it looks like upgrading will in general extend the life of your hardware due to all the speed increases, so it’ll probably be worth it.


31 posted on 06/10/2008 11:04:48 AM PDT by antiRepublicrat

To: XeniaSt; Swordmaker
Mac OS® X Snow Leopard sounds like a mainframe Unix op/sys to me.
Does sound like it, when you say it. Good point. And when we get into arguments with the Windows fanboys, I like to point out that Unix wasn't designed for personal computers; it was designed for real computers - and now that personal computers are real computers, Unix makes sense for them.

And that's what OS X.5 officially is - and as you say, X.6 will be more so. And I guess that answers my question as to whether Jobs was giving up a tempo to Microsoft. To the contrary, he is putting pressure on the soft point of Windows - the fact that Windows wasn't industrial strength from the ground up, and would have to lose compatibility with its vaunted legacy apps in order to become so.

So the fact that Snow Leopard won't do anything for me as a personal home user reflects the fact that it is the long awaited (and despaired of) thrust by Apple at the corporate market?


32 posted on 06/10/2008 11:07:29 AM PDT by conservatism_IS_compassion (Thomas Sowell for vice president.)

To: conservatism_IS_compassion
Apple previews Mac OS X Snow Leopard Server

33 posted on 06/10/2008 11:12:59 AM PDT by Uri’el-2012 (you shall know that I, YHvH, your Savior, and your Redeemer, am the Elohim of Ya'aqob. Isaiah 60:16)

To: conservatism_IS_compassion

I believe that .6’s increased emphasis on threading/scheduling on multi-core products is aimed at the weak spot for both Windows and Linux. The only OS that has a *really* good threads/scheduling infrastructure out there just now in the Unix space is Solaris. Everything else is a step down from there.

If OS X could really go after this issue, it will be an entry into the database server market. Right now, if you want to run Oracle on multi-CPU/core iron and get your money’s worth, you choose Solaris for the threading.

That said, writing multi-threaded apps is something that is beyond the ken of many programmers. Apple is laying the groundwork first in the OS, but to capitalize on this, I think they’ll need to come out with some slick tools in the development suite to help programmers unfamiliar with multi-thread programming to ID the weak spots in their code; things like locking, resource contention, cache coherency, etc.
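The locking pitfall mentioned above, in miniature (Python threads; a stand-in for the kind of defect such tools would need to flag):

```python
import threading

counter = 0
lock = threading.Lock()

def add(n):
    global counter
    for _ in range(n):
        # Without the lock, two threads can read the same old value and
        # lose increments; with it, the result is deterministic.
        with lock:
            counter += 1

threads = [threading.Thread(target=add, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 40000
```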


34 posted on 06/10/2008 11:15:43 AM PDT by NVDave

To: conservatism_IS_compassion

You’re thinking like a software user.

The real limitation here is in the chips where the memory controller and MMU come together.

Carrying around a whole lot of extra bits that cannot possibly be used (and wouldn’t be - because there quite simply are not applications that demand that much memory - yet) clutters up the address bus, the speed of the bus, the size of the MMU internal data structures, you name it. So the designers strike a balance here — how many bits of physical address are really needed in, oh, say the next 10 years?


35 posted on 06/10/2008 11:24:17 AM PDT by NVDave

To: XeniaSt
Thanks for the link. "and read and write support for the high-performance, 128-bit ZFS file system." Apple is definitely pushing for the enterprise market since this has little utility for even most small business users. And the multiprocessor orientation -- Jobs latched onto SMP as the future harder than I thought. Snow Leopard will probably be the most SMP-oriented operating system on the general market, something the enterprise will love.

Apple was throwing out so many features that I was starting to get worried quality would suffer and bloat would ensue, especially with another bag of features I expected for the next version. I'll be up front in line for this version even if just to appease the purist side of me.

36 posted on 06/10/2008 12:56:57 PM PDT by antiRepublicrat

To: antiRepublicrat
Is a GPU effectively an array processor, or suchlike?
GPUs can do a limited set of computations, but they're highly parallel vector/matrix processors with massive memory bandwidth.
That's what I sort of suspected, from what was being said. My experience of twenty years ago was that that sort of thing was an expensive special-purpose box about the size of an old PC system unit, called an "array processor." Why would I expect that something which was the size of a PC back then would take any more than a board, if not indeed a single chip, today!

37 posted on 06/10/2008 1:01:29 PM PDT by conservatism_IS_compassion (Thomas Sowell for vice president.)

To: NVDave
I think they’ll need to come out with some slick tools in the development suite to help programmers unfamiliar with multi-thread programming to ID the weak spots in their code

I don't know what they plan in general, but right now OS X just does it for you in some cases. The one I know is in Core Animation, which automatically multithreads calls to it (and automatically GPU-accelerates it). Your program doesn't even have to be multi-thread aware.

But for general programming I would appreciate something better than in Visual Studio. VS helps a bit at run time, such as catching some illegal cross-thread data access, but it's not all-encompassing. The Background Worker simplifies basic threading, but is also mainly a solution to the Windows problem of your UI operating on the one and only main thread. Locking and race conditions are still possible, and hard to catch.

38 posted on 06/10/2008 1:03:57 PM PDT by antiRepublicrat

To: conservatism_IS_compassion
Why would I expect that something which was the size of a PC back then would take any more than a board, if not indeed a single chip, today!

Or part of a chip. The SSE units in Intel and the 3DNow! units in AMD are the same kind of thing, only a little more general purpose, and thus slower, than a GPU. The Cell processor in a Playstation 3 has seven such units (actually 8, but one deactivated to increase yield) in addition to a generic PowerPC CPU core. IBM is building the world's first petaflop supercomputer using Cell chips.

Seymour Cray used to ask: if you were plowing a field, which would you rather use, two strong oxen or 1,024 chickens? The problem with that is we can make a 16-legged chicken these days that's stronger than both of his oxen.

39 posted on 06/10/2008 1:15:31 PM PDT by antiRepublicrat

To: ShadowAce

Hey Shadow,

My above post has me thinking about yields. IBM is now seriously using the Cell for such applications. Wouldn’t it be a good idea for them to send all 8-core tested Cell processors to such use and then use all 7-core tested ones for the PS3? Right now they’re just deactivating the extra core even if it tests out, to make all PS3 Cells the same. Or maybe they’re already doing this...


40 posted on 06/10/2008 2:09:20 PM PDT by antiRepublicrat

To: antiRepublicrat
Wouldn’t it be a good idea for them to send all 8-core tested Cell processors to such use...

Technically speaking, I would think so. From a business standpoint? I don't know what's involved in activating that 8th core. Is it possible to do so after testing? What's the cost involved?

I'm not familiar with the Cell--does it really only have 7 cores? That seems like an odd number (pun semi-intended). To me, 8 cores would be more efficient from a scheduling and resource handling standpoint. I know HPL would probably run more efficiently on 8-core chips.

41 posted on 06/10/2008 2:17:13 PM PDT by ShadowAce (Linux -- The Ultimate Windows Service Pack)

To: Steely Tom; conservatism_IS_compassion; PreciousLiberty; NVDave; antiRepublicrat
This might be a Beckton (upcoming Intel chip) limit. That chip (and perhaps some other Nehalem chips ... I'm not sure) has 44 physical address bits coming off the die, and no doubt internally sizes TLB and cache memory address hardware to match those 44 bits.

As NVDave notes, such details do matter, down at the hardware level, when one actually has to spend transistors and (even worse) leads coming off the die for each such bit. Intel has to size such processors for the biggest case need; and the operating system people have to write to that hardware, making sure to manage all those 44 bits correctly.

And then the marketing people get to skim over the internal design docs, pick off some nice big number, and brag incoherently ;).

From my perspective, that 44 bits wasn't enough. I'm using that same chip in a system measured in petabytes of RAM, not terabytes. It takes some serious operating system and external hardware magic to add bits that aren't there.
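The gap being described is a couple of binary orders of magnitude; in numbers (Python, binary units; the 4 PB figure is just an example of "petabyte-class," not a spec from the post):

```python
import math

TB = 2 ** 40
PB = 2 ** 50

print(2 ** 44 // TB)                 # 16 - TB, all that 44 physical bits can name
print(math.ceil(math.log2(4 * PB)))  # 52 - bits a 4 PB machine actually needs
```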

42 posted on 06/10/2008 3:24:26 PM PDT by ThePythonicCow (By their false faith in Man as God, the left would destroy us. They call this faith change.)

To: ShadowAce
I don't know what's involved in activating that 8th core. Is it possible to do so after testing?

I'm sorry, I thought it was you who I was previously discussing the Cell with.

From what I heard in a usual batch some chips will be useless (errors in the PPC CPU, multiple SPE cores or elsewhere), some will have an error on just one SPE core (they comprise about half of the chip area) and some will be pristine. After testing they use all the chips in the middle case and blow one core on the last case to keep everything consistent. The alternative would be to set the PS3 standard to all eight cores and they'd have to throw away all those chips with just one defective core.

I'm not familiar with the Cell--does it really only have 7 cores?

Eight cores, seven active in the PS3 due to yield considerations as discussed above, six available to developers (the last reserved for the PS3's OS). I was just thinking to use the pristine ones for these applications and the seven-core ones for the PS3 market.

That center bus between the elements is over 200 GB/sec, the memory interface on the left is over 25 GB/sec, the I/O on the right is over 60 GB/sec. Pretty spiffy, huh? I like how the SPEs are lined up mirror-image.

I know HPL would probably run more efficiently on 8-core chips.

You can go one of two ways: Parallelize the task and have all six SPEs working on equal chunks of the larger problem, or serialize the task and have each SPE work on a part of the larger problem then pass the problem down for further processing while it receives its chunk of the next problem. Each SPE has IIRC 256K of fast local SRAM, so you can probably store a smaller program and fit more of a working dataset in there when serializing without having to go out to main memory as much.
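The two layouts give the same answer but move data through the cores differently; a toy Python sketch (the three `stages` are made-up stand-ins for real SPE kernels):

```python
stages = [lambda x: x + 1, lambda x: x * 2, lambda x: x - 3]
data = [1, 2, 3, 4, 5, 6]

# Parallelize: every core runs the whole job on its own chunk of the data.
def whole_job(x):
    for f in stages:
        x = f(x)
    return x

parallel = [whole_job(x) for x in data]

# Serialize: each core owns one stage and passes results downstream, so each
# small local store (256K per SPE) only has to hold one stage's code.
pipelined = data
for f in stages:
    pipelined = [f(x) for x in pipelined]

print(parallel, parallel == pipelined)  # [1, 3, 5, 7, 9, 11] True
```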

43 posted on 06/10/2008 3:28:43 PM PDT by antiRepublicrat

To: antiRepublicrat
BTW, getting into OS X development I've noticed some things in applications. I got a library cataloging program for our obscene amount of books, movies, etc., and while using it I noticed all the cool things it did, and how they only had to leverage the built-in libraries to do them

What program, if I may ask? I've been using Delicious Library, and it's got some pretty good features, including using an iSight to scan UPC codes. But I'd like something that does a better job with books too old to have a UPC (of which I have a lot), especially if it can do ISBN lookups and check against the Library of Congress.

44 posted on 06/10/2008 5:10:25 PM PDT by ReignOfError
[ Post Reply | Private Reply | To 5 | View Replies]

To: conservatism_IS_compassion
Doesn't particularly look like a must-have item like Leopard - if you have an iMac or a notebook, there's certainly no danger that you will need access to 16TB of RAM!

Make some of that RAM flash memory, and you're talking about all but eradicating the line between RAM and mass storage. That could be fairly revolutionary, though we're a non-trivial number of years away from it being practical, let alone mainstream.

We're a ways away from seeing how it plays out, but I can't blame Apple for taking one upgrade cycle to focus on tightening things up rather than release a raft of new features. It'd be nice if more companies took the time to do that every few revisions (I'm looking at you, Microsoft).

45 posted on 06/10/2008 5:27:47 PM PDT by ReignOfError
[ Post Reply | Private Reply | To 1 | View Replies]

To: antiRepublicrat

What does the Cell processor have to do with Apple’s upcoming operating system? Nothing; Apple already dumped IBM for their processors, and the Cell isn’t x86 or PPC compatible either. I’m sure Sony and IBM would rather you buy a PS3 and try to get some obscure version of Linux working on it, but I’d take a Mac Mini over a kludge like that any day.


46 posted on 06/10/2008 5:43:11 PM PDT by Golden Eagle
[ Post Reply | Private Reply | To 43 | View Replies]

To: conservatism_IS_compassion
Because in the long run the keyboard must inevitably be looked back on as a clumsy, anachronistic input device.

"A keyboard. How quaint."

I really don't see voice becoming the primary input mechanism for most computer users unless there are a whole lot of advances in AI under the hood. Speech between two humans is an efficient mode of communication only because humans are able to infer what should fill in the gaps. Even then, it's easily misunderstood; without miscommunication and wrong conclusions, we would have no basis for sitcoms.

If speech control of computers is based on crisp, sharply articulated commands issued in a consistent and logical temporal order every time, I don't see it replacing the keyboard (or mouse, or even handwriting) without a change in the programming philosophy behind it, not just the application of more computing horsepower under the hood, however impressive that horsepower may be.

47 posted on 06/10/2008 5:47:08 PM PDT by ReignOfError
[ Post Reply | Private Reply | To 18 | View Replies]

To: Swordmaker
There are PowerPC drivers in Snow Leopard according to some in the know... and the G5 is fully 64-bit... so they may be dropping support for the G3 and G4. Time will tell.

Didn't Leopard already drop the G3? Dropping support for the G4 and G5 at the same time would be unusual for Apple, but not unprecedented. The first release of OS X dropped support for the PPC601, 603 and 604 at the same time, IIRC.

48 posted on 06/10/2008 5:58:18 PM PDT by ReignOfError
[ Post Reply | Private Reply | To 16 | View Replies]

To: conservatism_IS_compassion
Recall the big issue over the Y2K transition, and it does give one pause over assuming that Moore's Law will break down before reaching 44 bits of address space.

I'm not a programmer or an engineer, so please forgive me if any of this is misinformed, over-simplistic, or just plain stupid.

The difference between pre-Y2K and today is that a programmer is less likely to hard-code limitations into critical software -- they'd be in subroutines or at the OS level, making it much easier to update or port a program than it was with the old, patched to the Nth degree, and mostly undocumented COBOL and FORTRAN code.

Another difference is that we might have actually learned from Y2K (stranger things have happened), and databases have come a long way. If the critical data is stored in some standardized form, it would be a lot easier to move to another program or platform, even running the old and new systems in parallel to make the switch smoother.

And finally, virtualization is a pretty mature technology. It's easier now than before to run old software in its own little sandbox while making a transition to the new hotness. Bringing it back to Apple, this is something they're old hands at -- 680x0 emulation on PPC, PPC emulation on Intel, and Classic on OS X all made those transitions shockingly smooth.

What it means is that your son could very easily see the time when an unnecessary software limitation creates a crisis in the operating system. Which, looked at in that way, is pretty optimistic after all. Why would OS X necessarily last two human generations?

I guess my point is that it's a lot more modular than it used to be. OS X might not be around in two generations, just like few modern-day admins have even seen the big iron the Internet was built on. But TCP/IP survives, and if you get a couple of beers in a cranky old-timer, he'll start ranting about how "Web 2.0" is really just telnet 5.0, or gopher with pictures. Or, for that matter, that it's all just an extension of the telegraph, which was a packet-switched digital network before the telephone gummed things up with all that analog stuff.

49 posted on 06/10/2008 6:41:51 PM PDT by ReignOfError
[ Post Reply | Private Reply | To 13 | View Replies]

To: conservatism_IS_compassion
Sounds like it'll be awhile before OS X makes much of a splash again.

You base this on the announcement of back-end changes made in private? Come on. I remember hearing this stuff about 10.5 when it was announced, and then the final release somehow had a bunch of additions, changes, and features that everyone loved.
50 posted on 06/10/2008 6:50:09 PM PDT by Terpfen (Romney's loss in Florida is STILL a catastrophe. Hello, McCandidate!)
[ Post Reply | Private Reply | To 1 | View Replies]



Disclaimer: Opinions posted on Free Republic are those of the individual posters and do not necessarily represent the opinion of Free Republic or its management. All materials posted herein are protected by copyright law and the exemption for fair use of copyrighted works.


FreeRepublic, LLC, PO BOX 9771, FRESNO, CA 93794
FreeRepublic.com is powered by software copyright 2000-2008 John Robinson