
Math problems too big for our brains
Ottawa Citizen via The Windsor Star ^ | November 8 2005

Posted on 11/08/2005 8:48:52 AM PST by RightWingAtheist



To: Bob

"Which finger do you use to represent 0 in base 10? If you don't need it in base 10, you don't need it in base 8 either."

OK. I get you. 1,2,3,4,5,6,7,10. You're right. I have a cold, so I'm not thinking as clearly as usual today.
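For anyone playing along at home, here's a quick sketch - Python, purely illustrative, not from the article - that prints the counting numbers in base 8 so you can check the finger math:

    def to_base(n, base):
        """Return n >= 0 written in the given base as a string of digits."""
        if n == 0:
            return "0"
        digits = []
        while n > 0:
            digits.append(str(n % base))
            n //= base
        return "".join(reversed(digits))

    # Counting 1 through 9 on eight fingers: 1, 2, 3, 4, 5, 6, 7, 10, 11
    print(", ".join(to_base(i, 8) for i in range(1, 10)))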


81 posted on 11/08/2005 9:59:22 AM PST by MineralMan (godless atheist)
[ Post Reply | Private Reply | To 79 | View Replies]

To: MineralMan

There are 10 kinds of people in this world: those who understand binary math, and those who don't! ;-P


82 posted on 11/08/2005 10:00:56 AM PST by MortMan (Eschew Obfuscation)
[ Post Reply | Private Reply | To 24 | View Replies]

To: Bob

I was interested to find that the Sumerians and Babylonians used Base 60 math. There's an interesting web site that discusses this:

http://www-groups.dcs.st-and.ac.uk/~history/HistTopics/Babylonian_numerals.html
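If you want to play with sexagesimal yourself, here's a rough Python sketch - my own illustration, not from that site - that splits a number into base-60 "digits" the way the Babylonian place-value system does:

    def to_base60(n):
        """Return the base-60 digits of n >= 0, most significant first."""
        if n == 0:
            return [0]
        digits = []
        while n > 0:
            digits.append(n % 60)
            n //= 60
        return list(reversed(digits))

    # 3661 seconds is 1 hour, 1 minute, 1 second - the base-60 legacy we still use
    print(to_base60(3661))  # [1, 1, 1]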


83 posted on 11/08/2005 10:04:43 AM PST by MineralMan (godless atheist)
[ Post Reply | Private Reply | To 79 | View Replies]

To: MortMan

There are only two kinds of people in the world: those who think there are only two kinds of people in the world and those who don't.


84 posted on 11/08/2005 10:06:25 AM PST by boojumsnark (Time flies like an arrow; fruit flies like a banana.)
[ Post Reply | Private Reply | To 82 | View Replies]

To: SlowBoat407
Yes. From http://www-groups.dcs.st-and.ac.uk/~history/Mathematicians/Godel.html

"Gödel is best known for his proof of "Gödel's Incompleteness Theorems". In 1931 he published these results in Über formal unentscheidbare Sätze der Principia Mathematica und verwandter Systeme. He proved fundamental results about axiomatic systems, showing in any axiomatic mathematical system there are propositions that cannot be proved or disproved within the axioms of the system. In particular the consistency of the axioms cannot be proved. This ended a hundred years of attempts to establish axioms which would put the whole of mathematics on an axiomatic basis. One major attempt had been by Bertrand Russell with Principia Mathematica (1910-13). Another was Hilbert's formalism which was dealt a severe blow by Gödel's results. The theorem did not destroy the fundamental idea of formalism, but it did demonstrate that any system would have to be more comprehensive than that envisaged by Hilbert. Gödel's results were a landmark in 20th-century mathematics, showing that mathematics is not a finished object, as had been believed. It also implies that a computer can never be programmed to answer all mathematical questions."

85 posted on 11/08/2005 10:06:41 AM PST by Faraday
[ Post Reply | Private Reply | To 77 | View Replies]

To: boojumsnark

I've never been wrong.
Wait... once I thought I was wrong, and THEN I realized I was wrong.


87 posted on 11/08/2005 10:15:18 AM PST by Tulsa Ramjet ("So far, so good. But this is only phase 1."--Captain America)
[ Post Reply | Private Reply | To 84 | View Replies]

To: RightWingAtheist

Heh, what's your mathematician's opinion about this?


88 posted on 11/08/2005 10:18:15 AM PST by SuziQ
[ Post Reply | Private Reply | To 1 | View Replies]

To: MineralMan
Really? I can prove that 2 + 2 = 11.

And I can prove that there are no uninteresting numbers.

89 posted on 11/08/2005 10:19:24 AM PST by bruin66 (Time: Nature's way of keeping everything from happening at once.)
[ Post Reply | Private Reply | To 15 | View Replies]

To: RightWingAtheist

Oops! Meant to send #88 to someone else!


90 posted on 11/08/2005 10:20:51 AM PST by SuziQ
[ Post Reply | Private Reply | To 88 | View Replies]

To: DuncanWaring

True, 2 + 2 does equal 11 in base 3, but it still equals 4.
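Easy to check with a couple of lines of Python (just an illustration): the numeral changes with the base, the quantity doesn't.

    # The string "11" read as a base-3 numeral is the quantity four.
    assert int("11", 3) == 4

    # And 2 + 2 is still that same quantity, however we write it.
    assert 2 + 2 == int("11", 3) == int("100", 2)
    print("2 + 2 = 4, written '11' in base 3 and '100' in base 2")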


91 posted on 11/08/2005 10:20:56 AM PST by John D
[ Post Reply | Private Reply | To 28 | View Replies]

To: SirKit

Mathematician's opinion?


92 posted on 11/08/2005 10:21:24 AM PST by SuziQ
[ Post Reply | Private Reply | To 90 | View Replies]

To: RightWingAtheist

"Or rather, math problems have grown too big to fit inside our heads"

Paging Seymour Cray!


93 posted on 11/08/2005 10:26:07 AM PST by Amish with an attitude (An armed society is a polite society)
[ Post Reply | Private Reply | To 1 | View Replies]

To: Darksheare

94 posted on 11/08/2005 10:29:23 AM PST by OSHA (I've got a hole in my head too, but that's beside the point.)
[ Post Reply | Private Reply | To 56 | View Replies]

To: MineralMan
2 + 2 = 5
(for extremely large values of 2)
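One common reading of that old joke is rounding: if the "2"s are really values like 2.4 displayed as whole numbers, the sum can look like 5. A throwaway Python sketch, just to illustrate the gag:

    a = b = 2.4  # "extremely large values of 2"
    print(round(a), "+", round(b), "=", round(a + b))  # prints: 2 + 2 = 5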

95 posted on 11/08/2005 10:29:57 AM PST by ItsForTheChildren
[ Post Reply | Private Reply | To 15 | View Replies]

To: OSHA

LOL

This must be a derivative of the same formula for establishing a liberal world view.


96 posted on 11/08/2005 10:37:38 AM PST by Amish with an attitude (An armed society is a polite society)
[ Post Reply | Private Reply | To 94 | View Replies]

To: RightWingAtheist
You mean we can't "prove" this?


97 posted on 11/08/2005 10:41:35 AM PST by Professional Engineer (Have you had your Breakfast yet?)
[ Post Reply | Private Reply | To 1 | View Replies]

To: Professional Engineer
It's proof enough for me. Unfortunately, my wife's not good at math.
98 posted on 11/08/2005 10:46:14 AM PST by ILikeFriedman
[ Post Reply | Private Reply | To 97 | View Replies]

To: RightWingAtheist
Not so; we're just readjusting how people think about math. The history of math is the history of how to think about the unthinkable. The Romans had no symbol for "zero", irrational numbers are exactly that, infinity is plainly too big to consider, etc. Yet those mathematical concepts are easily handled by everyone on this thread, not because "our brains are big enough", but because someone got creative and figured out yet another way to symbolically express the unknowable.

Joke (sort of):
A couple of guys are sitting around bored.
"What's the biggest number you can think of, Joe?"
"Um...er...two. You?"
"Uh...three!"

Likewise, there are legends of African tribes whose math system amounts to "1, 2, 3, many".

Sounds dumb, right? But have you ever seriously thought about how you count things at a glance? Given a bunch of stuff to count, aside from going "1, 2, 3, ...", the best most people can do is visually lump the items into groups of 2 or 3, then mentally add those groups up. Go ahead: drop 5 pennies on the desk in front of you and count them, paying strict attention to how you count them. You might go "1, 2, 3, 4, 5", effectively perceiving only one at a time, or you might mentally split the group into subgroups of 2 and 3 and then add 2 + 3 = 5. However you do it, you're isolating subsets no larger than what your mind can grasp in a single glance. Strictly speaking, the biggest number you can really think of is 3!

So if the human mind can REALLY only perceive "3" as the largest number, how do we get to pi, infinity, sqrt(-1), and proving the 4-color theorem? The same way we get to "5": divide, isolate, symbolize, combine, repeat - and through the use of tools.
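As a toy illustration of that loop (Python, my own example, nothing more): pi, which nobody can "picture", reduces to a long pile of tiny steps no bigger than the ones Og could handle; the machine's only contribution is repeating them millions of times without getting bored.

    # Leibniz series: pi/4 = 1 - 1/3 + 1/5 - 1/7 + ...
    # Each term is trivial arithmetic; the value emerges from sheer repetition.
    def approx_pi(terms):
        total = 0.0
        for k in range(terms):
            total += (-1) ** k / (2 * k + 1)
        return 4 * total

    print(approx_pi(1_000_000))  # about 3.141591..., creeping toward pi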

Once we have a mental set of symbols representing whatever it is we are doing mathematically, we again rapidly run out of mental processing space. It wasn't long in human history before Og the caveman went from counting stuff to marking rocks so he could be reminded of what he counted and what the number was. From mud on cave walls to impressions on clay tablets to abaci to ink on parchment to pencil on paper to magnetic field directions to transistor states to pits on DVDs, humans have used external tools to store expressions of mental symbolism. And it's not cheating, and it's not considered the end of mathematics.

So on we go to the next phase, tool-assisted computation - and it's not new either. Where one mathematician decided that proving something required checking all the permutations of a problem, he would work through an algorithmic process to check every possibility. Given a big enough problem, he might hire other people - whose professional title was literally "computers" - to work as a team on it. Then Charles Babbage and Ada Lovelace concocted a machine to do what those people were doing (at which point a politician, clueless as ever in history, asked "if you put the wrong figures in, will you get the right figures out?" - but I digress). Von Neumann extended the concept into a theory of electromechanical computation, and soon the modern electronic digital computer was born, followed by huge networks of incredibly fast sub-nanosecond-cycle computing devices solving enormously complicated mathematical problems in shockingly short periods of time (calculations that once consumed a mathematician's lifetime by hand are now done in milliseconds).

So what's the upshot? We are not outside the limits of the human mind with advanced math. We have simply figured out, again, how to divide up a problem, assign symbols to the pieces and their relationships, create an algorithm to process them, and, rather than doing the processing by hand, turn it over to a human-built machine for rapid processing.

I'm a software engineer with an affinity for mathematics. Trust me (it's my profession): it's all simple. It's just a matter of breaking the problem down into manageable chunks - ultimately down to rearranging 1s and 0s - and telling a machine in detail what process to follow. It's all still understandable by humans; we just get a machine to do the hard, tedious work for us. When someone claims we've hit the human limit, that's usually the time to get ready for a whole new way of doing things that goes far beyond that limit.

99 posted on 11/08/2005 10:47:18 AM PST by ctdonath2
[ Post Reply | Private Reply | To 1 | View Replies]

To: Faraday
One major attempt had been by Bertrand Russell with Principia Mathematica (1910-13).

Which, by the way, takes several hundred pages of groundwork before it gets around to proving that 1 + 1 = 2.
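For contrast, in a modern proof assistant - with all of that logical machinery already mechanized - the same fact is a one-liner. A minimal sketch in Lean 4, my example rather than anything from the thread:

    -- On the natural numbers, 1 + 1 = 2 holds by direct computation.
    example : 1 + 1 = 2 := rfl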

100 posted on 11/08/2005 10:48:53 AM PST by ctdonath2
[ Post Reply | Private Reply | To 85 | View Replies]

