Samsung claims a 'Massive' Graphene Wafer breakthrough – Begins Prototype production of gFETs
Posted on 04/07/2014 11:23:16 AM PDT by Ernest_at_the_Beach
Graphene is slated as the major breakthrough of this century. In fact, it could easily propel the semiconductor industry forward a couple of decades compared with the performance trend of Moore's law. Graphene transistors are capable of being clocked at 500 GHz, so you get an idea of what Samsung is claiming to have achieved: a replicable production process for graphene nodes.
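To put that 500 GHz figure in physical perspective, here is a back-of-the-envelope sketch (purely illustrative; the 500 GHz number applies to individual lab-scale graphene transistors, not a whole processor):

```python
# What a 500 GHz clock means physically: a 2-picosecond cycle,
# during which light itself travels less than a millimeter.

SPEED_OF_LIGHT_M_S = 3.0e8  # approximate speed of light in vacuum

def clock_period_ps(freq_hz):
    """Clock period in picoseconds for a given frequency in Hz."""
    return 1.0 / freq_hz * 1e12

def light_travel_mm(freq_hz):
    """Distance light travels in one clock cycle, in millimeters."""
    return SPEED_OF_LIGHT_M_S / freq_hz * 1000

print(clock_period_ps(500e9))  # 2.0 ps per cycle
print(light_travel_mm(500e9))  # ~0.6 mm per cycle
```

At those speeds, signal propagation across the die itself becomes a design constraint, which is one reason transistor switching speed does not translate directly into chip clock speed.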
OK, I admit, I was being slightly sarcastic when I wrote the headline. It seems sort of ironic that if Samsung's claims turn out to be true, the first graphene processors will be in future wearable devices instead of, let's say, desktop computers. Of course, Samsung can be forgiven for saying this considering it's primarily a mobile company, but it still stings a PC enthusiast like me. Graphene, aka the miracle material of the century, to be used in future smart watches; seems like an extract from Douglas Adams' Hitchhiker's Guide to the Galaxy. Anyway, enough of my rant. On with the specifics.
This breakthrough was claimed at the Samsung Advanced Institute of Technology and Sungkyunkwan University in South Korea. Basically, they used a common silicon wafer with a germanium substrate to grow a monolayer, monocrystalline graphene film on top of it, and then separated the graphene from the silicon wafer (and germanium) below. The silicon wafer can then be reused, which is pretty great. Also, since the graphene is removed from the germanium using a completely dry process, it comes off completely wrinkle-free, which means the crystal structure is clean and low in defects. Since both the germanium substrate and the silicon wafer can now be reused (in previously known production processes the germanium substrate had to be burned off), this will dramatically increase mass-production capacity. Samsung has started production of graphene field-effect transistors (gFETs).
* by CPUs I mean SoCs, but that didn't sound so poetic up there.
The Telegraph ^ | 4/4/2014 | Sophie Curtis
Posted on Sat 05 Apr 2014 06:51:29 PM PDT by Vince Ferrer
Skynet thanks Samsung.
Dammit. My graphene phone briquetted itself!
Whoa, this is a big deal if true. Having to burn off the substrate was a big drawback and a major problem for mass-production. They’ve essentially made a reusable die for the graphene matrix, bumping the reproducibility of the product manifold. This is very exciting and could lead to a revolution in computing.
My prediction: 100+ cores on one server proc die by 2020.
And it wasn't that many years ago that the electronic stuff from Samsung was considered the cheapest junk you could buy! When they first hit the market here, they were considered junk. But my cheesy microwave still works, and I just got one of their TVs.
I thought IBM were the leaders on this one. Did they let it get away from them?
When you need raw processing power...faster is always better.
At 500 GHz a lot of strange stuff will be possible.
Carbon sequestration = GOOD!
It’s becoming obvious mankind didn’t invent any of this stuff. This is clearly stuff being reverse-engineered from the Area 51 spacecraft.
I went to a gas grill, so my yard’s intelligence has obviously dropped.
Most of this stuff is electrical in nature. Given the predisposition to anal probes, I guess that most aliens used waste to power their craft.
Or should that be Ass-End-ing?
You tell me, Laz. You seem to be educated in the art of dick-tion.
I think IBM is working on some sort of diamond chip.
Implantable? Coming soon.
I searched IBM and graphene. The latest seems to be over a year ago:
Weeeellll, 500 GHz, pretty nice.
I would appreciate it if somebody more knowledgeable than me (a very large group) could explain how much this would speed processing up relative to existing systems.
Can someone help me to understand the significance of this advance if it works—both in terms of current technology and what kind of work this speed up will enable. (presumably this does not enable quantum computer speeds.)
For example, to understand the meaning of 500 GHz .... correct me if I'm wrong — are we talking about typical desktops operating at about 4-5 GHz?
According to this article:
As our hardware analyst mentioned in his Haswell review, Intel's new parts struggle to get past 4.5GHz on air, while Ivy Bridge could reliably hit 4.7GHz, with some parts reaching 4.9GHz. In reality, the picture is even muddier than that: Early reports suggest that some Haswell chips can only reach 4.3GHz, while others can get to 4.7GHz or higher (again, on air).
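Putting the two sets of numbers from this thread side by side gives a rough sense of the gap (keep in mind this compares a transistor's switching speed against a whole chip's clock rate, so it is an upper bound, not a real-world speedup):

```python
# Rough ratio of the claimed graphene transistor speed (500 GHz)
# to the desktop overclocks quoted above. Illustrative only.

graphene_ghz = 500.0
desktop_ghz = {
    "Haswell (low bin, air)": 4.3,
    "Haswell (struggles past)": 4.5,
    "Ivy Bridge (reliable)": 4.7,
    "Ivy Bridge (good bin)": 4.9,
}

for name, ghz in desktop_ghz.items():
    print(f"{name}: ~{graphene_ghz / ghz:.0f}x the clock rate")
```

So even against the best air-cooled overclocks of the day, the raw clock ratio is on the order of 100x.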
The diamond chip is a quantum chip at USC, not IBM.
Gasp! This is a terrible, frightening event!
Carbon is POISONOUS!! All industrial use of carbon was well on its way to being properly suppressed as the awful Enemy of Nature that it is!!
STOP BIG CARBON FROM DESTROYING THE EARFF!!
(I can’t wait to hear some enviro say this seriously.)
I have my zoom up high and could not figure out what that was... had to drop down to make it out.... LOL
Want to take a crack at answering them?
We could have a good discussion.
Think superconductor. Low resistance, uses less power, less waste heat.
Science is equal opportunity... Samsung is no slouch in manufacturing.
If you had an 8-core CPU in your desktop machine and it were running at 500 GHz, then your PC would qualify as a supercomputer.
It could simulate the physics needed to do realistic real-time hi-res 3D-graphics without breaking a sweat. Your current PC would take weeks or months to generate a single frame.
It could do strange things like alter your image in real-time so that you could look and sound exactly like someone else on a live video chat.
It could handle simulations of nuclear devices.
If you could feed it the data it could monitor all US cell traffic in real-time and watch for keywords.
Creating the software to take advantage of such power would be the hardest part.
Just imagine 10,000 cores running at 500 GHz.
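A naive peak-throughput estimate for those hypothetical machines (the flops-per-cycle figure below is a made-up placeholder; real throughput depends on the microarchitecture, memory bandwidth, and software):

```python
# Naive peak-FLOPS estimate: cores * clock * FLOPs-per-core-per-cycle.
# The value of 8 FLOPs/cycle is a hypothetical assumption for illustration.

def peak_flops(cores, freq_hz, flops_per_cycle=8):
    """Theoretical peak floating-point throughput, ignoring all bottlenecks."""
    return cores * freq_hz * flops_per_cycle

eight_core = peak_flops(8, 500e9)        # the 8-core desktop above
ten_k_core = peak_flops(10_000, 500e9)   # the 10,000-core daydream

print(f"8 cores @ 500 GHz:      {eight_core / 1e12:.0f} TFLOPS")
print(f"10,000 cores @ 500 GHz: {ten_k_core / 1e15:.0f} PFLOPS")
```

Even with those generous assumptions, the 8-core figure (tens of TFLOPS) would have landed such a desktop near the top of the 2014 supercomputer rankings, which is the point the comment is making.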
IBM has a Powerserver that has the top cycle time.
This would do damage to all of the ...enthusiasts doing overclocking.
See 36 and 37.
Too bad the NSA has dibs on the first year's supply.
...graphene transistors ibm circuits ....
Just my casual impression that in the past every story about graphene was a story about IBM research. Then, out of the blue (to me), Samsung starts prototype production of a product.
I remember wafers... rows and rows of 'em. Those wafer testers were a bugger sometimes.
The whole globe has been working on researching graphene.
See my updates... links.
Incorrect. Not Area 51, Apollo 22.
Someone can correct me if I'm wrong, but I think the Intel limitation is thermal. Faster clock rates increase power dissipation, so there is a natural practical limit to the clock speed.
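That thermal limit follows from the standard dynamic-power relation for CMOS, roughly P = C · V² · f. A small sketch of the scaling (the capacitance and voltage values are made-up placeholders, purely to show the trend):

```python
# Dynamic (switching) power in CMOS scales roughly as P = C * V^2 * f,
# which is why raising the clock runs into a thermal wall.
# C and V below are hypothetical placeholder values for illustration.

def dynamic_power_w(capacitance_f, voltage_v, freq_hz):
    """Approximate dynamic switching power: P = C * V^2 * f."""
    return capacitance_f * voltage_v**2 * freq_hz

C, V = 1e-9, 1.0  # hypothetical effective switched capacitance (F), supply voltage (V)

p_4ghz = dynamic_power_w(C, V, 4e9)
p_5ghz = dynamic_power_w(C, V, 5e9)
print(p_5ghz / p_4ghz)  # 1.25: power grows linearly with frequency
```

In practice it is worse than linear: pushing the frequency up usually requires raising the supply voltage too, and power grows with the square of voltage, which is exactly why air-cooled overclocks stall around 4-5 GHz.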
Wow. Thanks much.