Real Life/Headscratchers/Math

  • If it's 0 degrees outside right now, and in an hour it will be twice as cold, what will the temperature be in an hour?
    • There is no such measurable thing as "twice as cold". The closest thing you could find is "half as hot". We can measure calories/joules, after all. You also failed to specify Celsius, Fahrenheit or Kelvin.
      • Not quite. "Twice as hot" means "twice the energy," which is measured on the Kelvin scale. So temperature ratios that you make on the Kelvin scale are perfectly acceptable. Furthermore, it can never be zero degrees K in our universe, so the question itself is unfounded. But good try.
      • Alternatively, the numerical value of how cold something is can be defined as how strongly human nerve endings which measure cold are activated by the outside temperature. This would depend on a lot of factors, varying between individuals, their core body temperatures at the time, the effective thermal conductivity of the heat sink and the presence of radiation, and it would lose meaning once pain receptors are activated instead. Assuming the only variable which changes is the temperature of the air, for most people "twice as cold" would probably be somewhere between -5 and -20 on the Celsius scale. On the Fahrenheit scale, 0 degrees couldn't be doubled in coldness without passing into the pain range for most people. On the Kelvin scale, it's impossible, since it's not possible to reach zero kelvins within any finite timespan.
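      • For what it's worth, here is a minimal back-of-the-envelope sketch in Python of the "half as hot on the Kelvin scale" reading, assuming the question meant 0 degrees Fahrenheit or 0 degrees Celsius (the function names are just for illustration):

# "Half as hot" interpreted as half the absolute (Kelvin) temperature,
# assuming the question meant 0 F or 0 C.

def f_to_k(f):
    return (f - 32) * 5 / 9 + 273.15

def k_to_f(k):
    return (k - 273.15) * 9 / 5 + 32

k = f_to_k(0)               # 0 F is about 255.37 K
print(k_to_f(k / 2))        # half as hot: roughly -230 F

c_as_k = 0 + 273.15         # 0 C is 273.15 K
print(c_as_k / 2 - 273.15)  # half as hot: about -136.6 C
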
  • If I got 4 cards out of a standard deck, what is the chance that at least one of them would be a diamond?
    • That's fairly easy combinatorics, though you'll need to specify the number of jokers in the deck to get the exact probability. Just know that there are 52!/48! ways to draw four cards from a 52-card deck and 39!/35! ways to draw four non-diamonds.
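      • For what it's worth, a quick sanity check in Python, assuming a plain 52-card deck with no jokers (the ordered counts above and the binomial coefficients below give the same ratio, since the 4! cancels):

# Chance of at least one diamond among four cards drawn without
# replacement from a 52-card deck (no jokers assumed).
from math import comb

p_no_diamond = comb(39, 4) / comb(52, 4)  # all four cards come from the 39 non-diamonds
print(1 - p_no_diamond)                   # about 0.696, i.e. roughly a 70% chance
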
  • Why does pi get a name, rather than 2*pi? You don't measure a circle by its diameter; you do it by its radius. Nobody would ever use half the circumference. If you want a circle constant, make it the circumference divided by the radius.
    • There are people that agree with you. Some of them came up with a name for 2pi. We call it tau.
      • Which must annoy people who use tau for the golden ratio.
    • Presumably to make the area equation, pi*r^2, easier to remember and use.
    • It's a holdover from the Greek era, when people did measure a circle by its diameter, not quite having gotten out of that habit from the pre-compass days. Even the familiar formula for the area of a circle, pi*r^2, was originally put down by Archimedes as "diameter times circumference over four."
  • Why do people in an Honors Advanced Precalc class still need to ask "When are we going to use this in life?" If you've opted to take the class, and gotten to this level, you should know that unless you get a career in pure math or teaching, you're not going to use it. Just deal with it.
    • I'm sympathetic to the idea that if a student is taking a course they have obviously already decided that it is worth it (either as a bullshit requirement for their degree/major or because they actually expect to apply it or just because they like it) and should STFU with their whining, but pre-calc is useful in a lot more fields than pure math or teaching. It's also used in science and engineering and programming. Math ISN'T just taught so other people can wank in it or teach the next generation of people how to wank in it, you know (that's literary analysis); it's actually, practically useful.
      • This Troper would argue that literary analysis is useful beyond school or teaching. The ability to look beyond the surface of a presented situation and really dig deep for a deeper understanding is a skill that is useful throughout the rest of life. Lit analysis promotes analytical thought across most other disciplines in life. Oh, and troping is a form of literary analysis.
    • Math is used in everything. Well, almost everything. Economists use some very advanced math including fields like differential equations, probability and statistics, dynamical systems, and even some more pure stuff like linear algebra and analysis. Iannis Xenakis was known for using mathematical ideas (especially from group theory) in the composition of music. As a biology major, I use concepts from differential equations and statistics fairly frequently. I repeat: Math is used in everything.
    • Precalc isn't really a good example. Calculus and related concepts are found in all kinds of fields, and can't really be learnt without precalc; furthermore, concepts from precalc itself (like sets, vectors, limits) are likely to come up for nearly everyone once in a blue moon.
    • Ha, I tried this whine once and my instructor said "you won't use it, not for your major... what was it again? Ah, yes. But... it will teach you how to think better."
    • This is the best counterargument to "When will I ever use this?" that I've read:

"NEVER. You will never use this. People don't lift weights so they'll be prepared should, one day, somebody knock them over on the street and trap them under a barbell. You lift weights so that you can knock over a defensive lineman, or carry your groceries or lift your grandchildren without being sore the next day.
You do math exercises so that you can improve your ability to think logically, so that you can be a better lawyer, doctor, architect, prison warden or parent.
MATH IS MENTAL WEIGHT TRAINING.
It is a means to an end (for most people), not an end in itself."

    • It's interesting how mathematical concepts even pop up in (non-mathematical) philosophy. Baudrillard talks about "Mobius-spiraling negativity," Lacan makes (rather suspect) analogies to various topological spaces, and many cultural critics talk about vectors in a similar sense to how mathematicians do, just without the quantitative aspect.
    • Wouldn't it make more sense to just teach them applied mathematics so they know when to use it?
      • You mean science and engineering courses?
  • How is it that it took me failing math-related classes three times in a row before my college instructors finally admitted that I must have a learning disability? I don't understand algebra or algorithms, and have always wanted a tutor, but the attitude from the college was mostly, "Oh, you'll do better next year".
    • Because schools don't want to spend money creating a tutoring program. Since colleges are a business, they have to turn a profit. And that often means shafting the people who really need help.
      • Actually, most colleges are non-profit. While they still need to pay for everything they provide, profit is not the goal.
      • Also, most colleges do, in fact, have tutoring programs. You just have to look for them.
  • Why are practically all theorems named after people who didn't invent them? The most egregious example is probably the Pythagorean Theorem; when the person who really thought of the Pythagorean Theorem (a student of Pythagoras) showed it to him, Pythagoras had him DROWNED. (Because it implied that non-integer numbers existed, which Pythagoras considered mathematical heresy.)
    • That's not quite right. The theorem had been in use as a rule of thumb in the West before Pythagoras, and no one's sure who ultimately proved it there, but it was a Pythagorean, if not Pythagoras himself. The person he's said to have drowned might well have been the one, but it wasn't immediately obvious from the theorem (which, keep in mind, had been in use) that there had to be irrational (not just non-integral, which Pythagoras was fine with) numbers. The proof most likely presented to him relies on the theorem, and seems fairly intuitive, but it's much easier to call a proof "intuitive" after you've read it.
    • Why don't people bother doing the friggin research before posting a 'just bugs me' entry? Seriously guys, it makes you look like idiots. Pretentious idiots.
    • "Euler's work touched upon so many fields that he is often the earliest written reference on a given matter... in an effort to avoid naming everything after Euler, discoveries and theorems are named after the 'first person after Euler to discover it.'" -The Other Wiki
    • Not to mention the Pythagorean Theorem was found by Arab scientists and Chinese scholars (independently) long before the Greeks found it.
      • Wah? While it's true there's evidence Egyptians and Indians knew of it prior to Pythagoras's era, it was published in Greece around 400 BCE, and the Chinese proofs are near the same time as Pythagoras.
      • Architectural evidence, and texts on architecture, suggest it was found by everyone, including the Greeks, centuries or millennia before Pythagoras lived. The Pythagoreans' alleged accomplishment (which is largely thought to be made up as advertising hundreds of years later) is having proven it. Although "Arabs" weren't doing much in those days; while it's an "Arab" country today, pre-Islamic Iraq/Mesopotamia isn't usually considered Arab any more than, say, Archimedes is considered Italian, if Babylonians, Sumerians, etc. are what you mean.
    • Generally, the more modern you get, the better people tend to be about naming. This is especially true in regards to living people, as, for example, Green and Tao might get kind of pissed if the Green-Tao theorem wasn't named after them. However, there are some situations in which the person who first made a conjecture will have the theorem named after them - for example, Fermat's Last Theorem is still called that even though Wiles proved it, and the Poincaré conjecture will probably keep that name instead of becoming Perelman's theorem. (Although Fermat claimed to have proved it, his proof was never found.) There are some unfortunate situations in which an important result was discovered independently in many different places. For example, there's the famous Cauchy-Schwarz-Bunyakovsky inequality, discovered over the course of a century by three mathematicians. In most of the world, it's called the Cauchy-Schwarz inequality, but in Russia, they still call it the Bunyakovsky inequality. There are other instances in which political barriers played a role. Sharkovsky's theorem wasn't known to much of the world until after the fall of the Soviet Union. Two American mathematicians, Li and Yorke, proved a less general result in the meantime, and while the use of the term "Sharkovsky's theorem" has spread, many still refer to the "Yorke-Li theorem." Generally, theorems are named after the discoverer. Just not necessarily the theorems you've heard of.
      • My math professor just calls it Cauchy's inequality. :P But yes, he does love to talk about this. "Okay, now we'll be learning about Stokes' theorem, which probably wasn't discovered by Stokes, and it's basically just another version of Green's theorem, which Green probably stole from some other guy..."
    • IIRC the (probably apocryphal) drowning story was for proving that the square root of two is irrational (can't be written in integer/integer form - e.g. 0.45 is rational because it can be written as 9/20). The proof goes as follows:
Imagine that the square root of two could be written as A/B, where A and B have no common factors.
Then 2 = A^2 / B^2. Rewrite this as A^2 = 2*B^2.
Then A^2 must be divisible by 2, hence so is A. Let A = 2*C, and rewrite the equality as (2*C)^2 = 2*B^2.
A little algebra and we find that B^2 = 2*C^2. But then B must be divisible by 2 too.
But this violates our premise that A and B have no common factors, which is therefore disproved.
    • Pythagoras was a bit mystical about integers, so this kinda rubbed him the wrong way.
      • Probably not. As mentioned above, the irrationality of the square root of two is what (allegedly) rubbed him the wrong way (although another story has Pythagoras finding it himself and counting it a massive breakthrough; knowledge of pre-Socratic philosophy is like that). The proof above, however, is an unknown Platonist's two hundred years later (referenced by Euclid and Aristotle, but no extant primary source), and relies on number-theoretic concepts from that school. The Pythagoreans' proof was more likely a geometric proof similar to the one in The Other Wiki here - hence why it relies on the Pythagorean theorem.
        • In one sense the above proof does rely on Pythagoras's theorem as, if you wanted to be stubborn about only rationals existing, you could take it as a proof that the square root of 2 doesn't exist. You'd then need to take the unit square and apply Pythagoras to the diagonal to prove that root 2 does exist.
    • There is actually a name for this phenomenon. It is called Stigler's Law of Eponymy, which states that no scientific discovery is named after the person who actually discovered it. Interestingly enough, Stigler's Law is an example of Stigler's Law.
    • My question: Why is it still called a theorem when it has been proven and used for many centuries? Why not call it "Pythagorean's Law?"
      • Because a "law" is actually a very weak principle, based solely on observation. A "theorem" is so far to the other end of the scale it's not even funny: what it states, however counterintuitively, is tautological when the definitions are properly understood. It could not be any other way. The fact that "law" sounds so strong in comparison is an accident of nomenclature, from the days when scientists still imagined they were seeking "laws" of some divine force, while laymen heard of the "theories" being used to describe the universe and coopted the word for their own hypotheses, as opposed to its original (and current strict) meaning of an area of study, with a "theorem" being originally a principle of a theory and coming over time to mean one of a mathematical theory. To call a theorem a "law" is an unspeakably profound insult, comparable to calling a novel an "anecdote."
  • Why is it impossible to divide by 0?
  • Why are algebra, algorithms, or anything else involving advanced mathematics considered required classes if the career I want is to become a cartoonist?
    • Because you might want to create an incredibly popular but somewhat esoteric stick-figure webcomic.
    • All right, somebody has GOT to form a band called "Onion Squirts" or "Walrus Dream Butter".
    • Short answer: math is useful for everyone; see many of the other answers on the page. Long answers: a) it's not a good idea to rule out all careers but one at an early age; b) most cartoonists have a day job too; c) you'll need a lot of math if you run your own business - and that includes working freelance, which almost all cartoonists do these days; d) there's more to life than work - knowing math will help you understand the scientific advances, and knowing statistics will help you understand the political and economic debates and social problems, that will happen in your lifetime.
      • "So why aren't math teachers taught to recite this in class to bored students?"
        • When my Algebra students ask "When are we ever gonna use this?" my answer is always "In Algebra 2".
        • As someone who went through Algebra 2 only a couple years ago, I have to advise you not to do that. That, to a high schooler, is akin to just saying "because I said so" and just adds to the belief that there is no practical need for it when, in reality, there truly is. In hindsight and/or from the teacher's perspective, it may be witty, but you really aren't helping those kids by saying it.
        • So the correct answer is "What do you want to do when you leave school/college/university?"

Student: "I want to be a ..."
Teacher: (irrespective of the student's answer) "Well, you'll need algebra for that."

          • No, the correct answer is for the teacher to know some goddamn practical applications of the subject s/he teaches and to tell the kids what they are, just like they would tell them any other piece of information in class. Equations are widely used to model everything from traveling times and speeds to shopping and budgets; it's not that hard to come up with a few examples. And if s/he really doesn't know, then let it be said "I am sorry, I don't know any applications of the subject I make my living teaching because I am an idiot. Please go ask some other math teacher."
          • Like what, calling those things x and y? We don't usually do that outside of school on the fly, even off the top of our heads. Not only is it troublesome, it's silly and takes too much time and energy to solve in our heads.
          • The Mundane Utility that comes with Math is only realized if you THINK REAL HARD about it. When I was a kid, I used to get sent on errands to buy this and that from here and there. So my mom always gave me an amount far greater than the value of what I was actually going to buy, even before I knew how much it would cost. Only when I actually bought the thing did I know that if you hand over $100.00 for something worth $49.99, you get $50.01 back. And if I so much as lost that single cent, Mum would know that something had Gone Horribly Wrong. So after I did just that a million times, only then did I realize that Math DOES have Mundane Utility.
          • Fridge Brilliance: If I'm betting at the Superpower Lottery, I would rather have a Game Breaker ability like that of Accelerator which involves doing Math. Just to show people that Math Is Power.
    • Also, knowing geometric relationships is a good idea when drawing anything. I don't know what kind of artistic complexity or realism you're hoping to get into, but knowing proper perspective and scaling is really important for making things look right. People who are good at picturing things spatially usually honed this ability in math and science.
      • Although no one can deny that math is important, I personally think more weight should be given to language skills (and that's not just because I'm an English major). Think about it: Your language is the one discipline that you will be using every single chucklefucking day of your life. Even if you don't say a word out loud all day long, you're still thinking in English (or Japanese or Spanish or Urdu or whatever), and if you go the whole day without thinking, you're either comatose or dead. Or very, very good at Zen. And yet the world is full of people who can barely write their own language without screwing up every other word. Am I the only one who sees something wrong with this picture?
        • And I am very bad with words, terminology especially. Yet I am a damn good mathematician and programmer. When I think of something math-y I tend to go into Buffy Speak mode (well, the obscene Russian variant of it; you Russkies know what I am talking about), 'cause while I don't remember the word (or remember it incorrectly) I still have the idea behind that word.
        • Because they are both important, and depending on the type of person, you will often hear that one is more important than the other. This would be wrong, as they are both important. But math is harder for many people to grasp and enjoy as recreation, so it's considered work instead. If our knowledge of language degrades, we're more likely to fight, lose our higher education, and fall as a culture. If we lose our knowledge of math, we may die as a planet, due to our dependence on it to keep the world going and keep us fed.
        • You get taught all the language skills you need to not fuck up sentences by the time elementary school is over. The fact that people don't care is a different matter, not a reason to give language "more weight".
    • Also, take a quick look at some of the gaffes in Writers Cannot Do Math and Artists Are Not Architects to see some of the ways knowing algebra might be helpful.
    • Geometry might help you if you are a cartoonist. Algebra and calculus are more general, useful things for everyone regardless of career. But this isn't about math in particular, is it? This is about the entire concept of required courses in general. In which case, are you sure you are going to be a cartoonist? And you will get to be that straight away for the rest of your life, without needing to do something else on the side or along the way or when you are down on your luck?
    • If you want to understand economics, then you need to study algebra. And if you're 100% sure your life's sole career will be professional cartoonist, then you need to study economics.
    • What does it matter? I’m stuck in an Algebra 2 class as well, along with nineteen other students, and I can assure everyone here that there isn’t a damn thing my classmates and I can do about it, whether it has a practical application in our lives or not. Besides that, why are we even complaining? It’s free knowledge. Yeah, we have to pass the class to graduate, but is it going to kill any of us? No. Who cares if we never use it? What matters is that we can use it.
  • Alright, is infinity a number or not? If yes, say 1/infinity=x, then 1-x... If no, then how many numbers are there?
    • Way too many comments being wasted over this. No, infinity is not a number. It is a cardinality (or actually either two or countably infinite cardinalities, depending on convention), i.e., a property of a set that for finite sets is given as a counting number, and it's a symbol for absolute increase without bound. It's sometimes written as though it were a number in limits, sums, integrals, etc., but that doesn't make it a number, but rather means that you're taking a different type of limit, sum, integral, etc., with significantly different properties from those that approach a number. However, it's not a number, and you can't meaningfully use it in arithmetic operations, even though it's sometimes (sloppily) written in that way to signify a limit, as is dividing by zero. (No, whatever your calculus teacher told you, you really can't divide by zero, even zero by zero; you can sometimes take convergent limits of functions as they approach points where they would require division by zero, but that's not the same thing.)
  • Why the hell do they call square roots of negative numbers imaginary and not just, not real, or fake, or something? And if those are imaginary, then what do you call the other non-real numbers?
    • I don't think the term was ever meant to be derogatory. I suppose Descartes came up with the term quite naturally. People knew there is no sqrt of -1 among the real numbers. But somebody probably said - let's imagine, for the sake of argument, that there *is* a number representing sqrt of -1, obviously not on the real line but in some imaginary plane of which the real numbers are just a subset - and see where it gets us. Eventually, it got us quite far, most notably to the theory of electromagnetism, theory of signals and quantum theory. And the term stuck.
    • If sqrt(-1)=i, what is the symbolology (why did I lol?) for other numbers sqrt(-n), where n is any number [0,∞)? Sure, i might be the most important Square Root Of A Negative Number, because (in real square roots), 1 is the only number that's its own square root, but the other S.R.O.A.N.N.s must have some kinda use...
      • sqrt(0) is also itself.
      • Any other symbols would be redundant. The square root of -n is sqrt(n)*i. 2i is the square root of -4, for instance.
      • S.R.O.A.N.N.s? I don't believe they exist. (Gets attacked by an S.R.O.A.N.N.)
    • The name "imaginary numbers" started off as a dismissive name used by critics, but like "Big Bang" it caught on and is still used even now it has become clear that they have practical uses.
      • In the 19th century, complex numbers were sometimes referred to as having a "possible part" and an "impossible part" rather than a "real" and "imaginary" part.
    • More than just utility, complex numbers have an entirely rigorous construction as points on the plane, with multiplication defined by (a,b)*(c,d) = (ac-bd, ad+bc). If we then define i = (0,1), we see that i^2 = -1. Everything about complex numbers is well-defined and properly constructed. Contrast division by zero: there is no x such that 0x = 1. Maybe in the limit something like it occurs, but it is easy to prove that for any ring (such as the real numbers) 0x = x0 = 0 for any x. So you would either have to cripple your arithmetic in some way (such as dropping distributivity of multiplication over addition), or accept that no such system exists.
    • To make things even more interesting, imaginary numbers can be used to model real world phenomena successfully, a practical demonstration of their logical validity.
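    • If it helps, that ordered-pair construction is easy to play with directly. A minimal sketch in Python (the function name is made up for illustration):

# Complex numbers built as ordered pairs of reals, with multiplication
# defined by (a,b)*(c,d) = (ac - bd, ad + bc), as described above.

def cmul(p, q):
    a, b = p
    c, d = q
    return (a * c - b * d, a * d + b * c)

i = (0, 1)
print(cmul(i, i))             # (-1, 0), i.e. i^2 = -1
print(cmul((3, 2), (1, -4)))  # (11, -10), matching (3+2i)(1-4i) = 11 - 10i
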
  • So what's the square root of i?
    • There's more than one, but (1+i)*sqrt(2)/2 comes to mind.
    • The easiest way is using Euler's formula: e^[(pi/2)*i] = cos(pi/2) + i*sin(pi/2) = i, so sqrt(i) = {e^[(pi/2)*i]}^(1/2) = e^[(pi/4)*i] = cos(pi/4) + i*sin(pi/4). (Sorry for all the badly written math.) The reason there are many answers is because of the nature of sin and cos (they oscillate).
    • That's right, although saying there are multiple answers is overcomplicating it; there are really only two answers, directly opposite one another in phase angle (i.e., one is the negative of the other), as with all square roots. The principal square root is the one with either a positive real part or no real part and a positive imaginary part, just like with real numbers. Also, keep in mind that sin(pi/4) = cos(pi/4) = sqrt(2)/2 ~= .707, so it's about .707 + .707i (or that times -1).
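      • You can sanity-check that numerically in a couple of lines of Python with the built-in cmath module (just a check, not a derivation):

import cmath

r = cmath.sqrt(1j)
print(r)             # approximately (0.7071+0.7071j), i.e. sqrt(2)/2 * (1 + i)
print(r * r)         # back to (approximately) 1j
print((-r) * (-r))   # the other square root, -r, also squares to i
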
  • How about .999...=1 ?
    • Consider x=0.999.... Then 10x-x=9.999...-0.999..., so 9x=9, so x=1. There's no rounding error, they're just two ways of writing the same number.
    • The way I like to explain this to my students is that the dissonance that is occurring is that they keep trying to visualize 0.999... as getting closer and closer to 1, as you add successive nines. The mind-blowing revelation is that the number isn't MOVING, because 9s are not being added. They are all already there. ALL OF THEM. An infinite number of them. If you start looking at this as a fixed number, rather than a moving number on a number line, then I think it's easier to accept the mathematical proofs that show 0.999... = 1. We just can't write down all the 9s, because it would take longer than high school. And, you know, all of the rest of human history...
    • The way it was explained to me that caused me to finally "get it" was: okay, so you assert that 0.999... is not equal to 1? Then there must be at least one number that comes between them, a number that is greater than 0.999... and less than 1. Tell me what that number is.
      • This?
        • …which, by passage to the limit, is equal to 1, since as n→∞, (1/n)→0. So 0.999… < 1 < 1, and in particular, 1 < 1, i.e., 1 ≠ 1 (⇒⇐). As this contradicts that 1 = 1, it must be the case that no such number exists. □
      • There are many, many proofs that any number infinitesimally close to another number IS that number. It might not make sense; but if it isn't true, then the age-old adage 'numbers don't lie' is dead wrong. If you deny that 0.999...=1 then you are stabbing calculus in the face and taking a whiz on it's grave.
        • Apparently, as much as Writers Cannot Do Math, it is also true that Mathematicians Cannot Use Grammar. It's should be its.
      • For instance:

1/3+1/3+1/3=1
1/3=0.333...
0.333...+0.333...+0.333...=0.999...
Therefore 0.999...=1

    • Another thing to consider with .999... is that every integer is followed by a decimal point and an infinite number of zeros. 1 = 1.000... and so forth. If someone claims that .999... has a "last nine," they're also saying that 1.000... has a "last zero." Due to how math works, this means you MUST follow that last zero with a nonzero digit -- you can't "end" it because there is no "end" to infinity, and if you cap it off with another zero, then the one before it isn't the last zero anymore, is it? And once you cap it off with a nonzero digit, it means that 1.000... is now greater than 1, even if by the tiniest fraction. When we say it goes on for infinity, we mean it -- there is no "last nine" in .999... without turning the thing into a different number.
    • I don't know much about math, but let me tell you my own theory, as one who can, as C.S. Lewis would put it, "look along the beam" instead of "looking at" it. (Uh, strike that, reverse it? You know what, drop the metaphor altogether.) Anyway, here's my theory: the numbers themselves really, truly, DON'T lie. It's just our numeric representations of them--these symbols on paper--THEY lie. The Arabic numeral system that we use (0-9) is well-constructed, but like all human-made systems it is imperfect. It doesn't always communicate with perfect accuracy the numeric truths it's designed to transcribe, because there are limitations in the way it can work. Chinks. And that leaves us with APPARENT discrepancies such as, for instance, zero-point-repetend-nine being the same as one.
    • My preferred way of "getting it", helped out by the limitations of dividing things on a base-ten calculator: is .333… equal to one-third, or somehow "less than" it? If you can get that it is equal, then all you have to do is triple it. Ta-da!
      • The problem is that some people will say it is somehow "less than" 1/3; you're asking the same question again, essentially, which is, "does the decimal point represent the limit at infinity?"
    • The best way is to point out that a decimal representation is shorthand for a sum of numbers over powers of ten, and repeating decimals are therefore shorthand for the limit of an infinite sum. Take the infinite sum of 9/10^n and see what you get.
    • Equivalents to this problem exist in every radix (or base). For example, in hexadecimal, 0.F… = 1. Perhaps the "strangest" one is binary 0.1… = 1. Given those bases, consider this: if (decimal) 0.9… did not equal 1, then there would have to be a real non-zero number resulting from [1 minus 0.9…]. Some in forums insist that this difference would be the "smallest number greater than zero" (which doesn't exist in the real numbers), and that it should be written 0.000…1. Well, what about the equivalent subtraction in binary? The binary number 0.1 is five times the size of decimal 0.1, and 0.01 is five times the size of decimal 0.01, etc. So, is binary [1 minus .1…] a "bigger smallest number" than decimal [1 minus .9…]? Or is base-20 [0.000…1] a "smaller smallest number"? This would mean that our choice of radix somehow affects the properties of real numbers, which would be rather like Formulaic Magic. In reality, for any conventional radix, [1 minus 0.[largest-digit-in-radix-repeating-forever]] always equals zero, so it all works out fine.
  • On the subject of .999...=1, how is it even possible to arrive at .999...? Wouldn't any problem that gives you that answer properly be reduced to 1 anyway? I was always taught to use fractions during work and decimals only for answers and only when specified, so I really don't see how you could arrive at .999... in any practical situation?
    • Consider the infinite sum "Sum_{n=1}^{infinity} 9/(10^n)" which works out as 0.9 + 0.09 + 0.009 + ... It's easy enough to see that this sums to 0.9999... (and hence to one, due to all the stuff that has previously been mentioned).
      • Of course, I have to point out the formula for the sum of an infinite geometric series, S = a/(1-r); in this case a = 9/10 and r = 1/10, so S = (9/10)/(1-1/10) = (9/10)/(9/10) = 1, confirming that 1 = .999....
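      • And if you want to watch the partial sums pile up, a quick Python sketch (floating point rounds the later terms, but the limit is the point):

# Partial sums of 9/10 + 9/100 + 9/1000 + ...; the repeating decimal
# 0.999... is the limit of this sum, which is exactly 1.
total = 0.0
for n in range(1, 11):
    total += 9 / 10**n
    print(n, total)    # the partial sums creep up toward 1

a, r = 9 / 10, 1 / 10
print(a / (1 - r))     # the geometric-series formula gives 1.0
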
  • Does everything REALLY involve numbers, or are people just making that up to scare kids into getting A+ 's in math?
    • Yes. (The joke had to be made. See below for serious answers)
    • Everything important for modern civilization involves numbers. Medicine dosing, for example, is determined by things like the rate of absorption into the body as well as body weight factors. As You Know, too little medicine won't help you get well, and too much medicine can kill you. Electricity you can maybe use without Math; but to understand how the signal gets to your house, you need to understand trigonometry at a minimum. Your computer is based off numbering systems. Essentially, if you want to understand modern technology at all, you need math, and it starts with numbers to get to variables, where the real work starts. If you don't, then you'll be beholden to someone else for everything. Heck, proper cooking needs a good understanding of math and chemistry. You might be able to do okay without it, but understanding how the heat is dissipated from pans and absorbed by different materials and the effects of a higher versus lower temp and how it's not quite the same if you double the temperature to halve the time...
    • Math is a way for humans to understand the world around them. The Moon circles around the Earth without knowing any math; but if you want to predict its movement, then you'll need math. There is a theory that the nonhuman mind will produce very different math, but we are stuck with the math we have. Non-scientific methods of understanding the universe exist (but don't necessarily work) and don't rely on math. But you can't tell what you discovered there to others, and progress is impossible there.
    • Math is not about numbers - it is about patterns. It can be represented by numbers, but you can talk about groups having in mind geometric transformations (which are closer to pictures). That's the beauty of math - the same pattern appears over and over again in seemingly unrelated fields. Well - numbers are easy to operate on, so they are thrown in for simplification.
    • Bah. Anyone could argue that their job, or any job, is or isn't "important to modern civilization", and no one has any right to look down on anyone for holding a job that isn't that "important" either. And listen to me, original poster: you will get lectures identical to the ones above from people who are aficionados, teachers, or experts in ANY ACADEMIC SUBJECT in the world you ask the same question about, and each will have their own arguments for why their profession is uniquely central to modern life and the knowledge of it uniquely critical to everything. I once had a history professor who could take these guys above to town and back in an argument over math vs. history in such regards, and I don't even necessarily think such things about history myself. Listen: whatever subjects are most necessary to what you think you need to do in your own life, focus on those. I'm not just talking about school and careers, it's always an ongoing thing.
      • It is true that any good professor could make an argument for why their field is central to modern society. If they didn't believe the subject was that important, they wouldn't have spent all that time getting a PhD. And they might even have a point; after all, modern life is so complex that any number of subjects could be considered central to modern society. The thing about math, though, is that it isn't just important, it is fundamental. Biology, physics, chemistry, geology, materials science, engineering, electronics, computer science, economics, medicine, actuarial science, digital illustration, 3D modelling, business etc. etc. and so on all require advanced mathematics to understand. Pretty much everything else requires at least some math. I can understand the desire to focus on what is important in your life, but in the modern world we don't live in a vacuum. In a democratic society we are required to be good citizens who understand the issues of the day and can vote intelligently on them, and we cannot possibly do this without mathematics.
    • All of science, engineering, economics and statistics directly involve math. That's a pretty broad bunch of categories. Since science is about describing reality, mathematics is also involved with everything at a detailed enough level. If you get abstract enough you can avoid dealing with math directly, though (for instance, computers are devices made of particles whose behavior is described by equations, with electrical circuits modeled and designed by different equations, which represent information in binary numbers and manipulate it with logic rules and mathematically focused programming languages... but you can just use a mouse and a keyboard to manipulate a GUI and never worry about that if you don't want to; it's still there, though).
    • The deductive reasoning structures formalized by math went on to become standard notation for philosophy. Seriously. You can use logic and proofs in philosophy now. Sure, it isn't as formalized, but that method of thinking is great for arguments. Example philosophical argument--this one's for continuous revelation:

Assume that Your Entity Of Choice, hereafter referred to as God, is perfect. Therefore, everything which people claim that God has done, is something which God wanted to happen. Ergo, continuous revelation is part of the Plan.
Conversely, assume that God is not perfect. Thus, continuous revelation must take place, as, being omniscient, God must be able to realize his mistakes and attempt to correct them.
Who says you can't mix logic and religion?

      • I would never say that, but I for one would never use such Insane Troll Logic as the above to support the idea.
  • Why are calculators, even basic ones, so much fun to play with?
    • 5318008.
      • Rebuttal: 55378008.
    • Because they're easy to fiddle with, and chances are you don't enjoy using them properly. You will have them in front of you during unending hours of boredom in the classroom without a whole lot of other things to mess with. Probably very much like how dictionaries are so much fun to look up random things in.
    • Probably one of the reasons graphing calculators have apps. Half the time in my classes, a student next to me is playing some block puzzle game rather than actually doing math.
      • Most Brazilian universities require the HP 50G calculator for engineering/science courses. It is VERY common to see people playing games during classes, or even trading games (the calculator has an infrared port and a card reader).
  • Seriously, guys. Who the hell came up with the term 'integer'? What is wrong with calling them 'numbers'? If they're supposed to be called 'integers', why the hell do we even use the word 'number' anyway? Let's be honest here: when I first learned the term 'integer' back in middle school, that was the moment when mathematics Jumped the Shark for me. I've never trusted it since.
    • An integer is a type of number, just as you can say that a Golden Retriever is a "type" of dog. Specifically, integers are whole numbers: no decimal point, no fraction.
      • Seriously, guys. Who the hell came up with the term 'Golden Retriever'? What is wrong with calling them 'dogs'? If they're supposed to be called 'Golden Retrievers', why the hell do we even use the word 'dog' anyway? Let's be honest here: when I first learned the term 'Golden Retriever' back in middle school, that was the moment when biology Jumped the Shark for me. I've never trusted it since. (The message is: Use Google.)
    • Integers are numbers with no fractional parts, hence the name. Pi and 2.5 are numbers that aren't integers. Sad you gave up on math due to a random misunderstanding.
    • A 5-second Google search would have saved you a lot of pain. A better question would have been "Why not just call them whole numbers?"
      • And, for those who are wondering, an explanation of this is that "whole numbers" is a vague term; it could refer to the nonnegative integers, the positive integers, or all integers. In math, we like to define things precisely to avoid confusion.
    • As for where the term came from, OED says it's from Latin, meaning "whole, intact". Same origin as "integrity" apparently.
    • For what it's worth, the symbol used to denote the set of integers, the blackboard bold Z, is an abbreviation for the German Zahlen, meaning "numbers."
  • Why doesn't 1 count as a prime number?
    • They'd have to make too many exceptions for too little gain. Prime factorizations don't need any 1s, and we don't need the definition to be changed to "divisible by itself and 1, or just 1 if it's 1".
      • One is divisible by itself and one, but then every number is divisible by itself and one. Prime numbers are defined as having exactly two factors.
        • So, first off, that statement should be "one is divisible by ONLY itself and one." Second, if one is itself, then wouldn't it only be divisible by itself? So it fails to meet the definition of having exactly TWO factors.
    • 1 is a unit; its properties are VERY different from those of actual prime numbers. A better question is "Why should 1 count as a prime number?", and there is really no answer to that beyond "It looks sort of like a prime from a purely cursory view".
        • That gets pretty subjective. Why do we care about prime numbers at all? Well, they are considered the building blocks of other numbers, so we want to study them and have a name for them so we can refer to them concisely. The unit, 1, is definitely a vital building block of other numbers. By that reasoning it should be counted as prime. Of course, by such a loose level of reasoning 0 should also be considered prime, since none of our prime numbers can multiply to 0 without it. I think the first responder was correct in saying that counting 1 as prime would require adding exceptions to a bunch of theorems.
        • Ultimately, mathematical terms are defined the way they are because it leads to interesting or useful properties. We could easily define an even number as "any multiple of 2, and also 1", but there's not really anything you can do with that.
      • One of the biggest reasons we're interested in primes is that any number can be written as a product of primes in one and only one way; for example, 35 is 7 × 5. That's what's normally meant by "primes are the building blocks of the integers". If we include 1 as a prime, then it's no longer true; 35 is 7 × 5 × 1, or 7 × 5 × 1 × 1, or 7 × 5 × 1 × 1 × 1, ...
      • To the above troper: THANK YOU. I wasn't the one who originally asked the question, but it's bugged me, too. This is a simple and perfect explanation. Now I get it, instead of just accepting it.
      • One minor issue: prime factorizations are only unique up to order.
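      • A minimal Python sketch of that uniqueness point (trial division; the function name is just for illustration):

# Factor n > 1 into primes by trial division. No 1s are needed, and the
# resulting list is unique (up to order). If 1 counted as a prime, then
# [5, 7], [1, 5, 7], [1, 1, 5, 7], ... would all be "prime factorizations"
# of 35, and uniqueness would be lost.
def prime_factors(n):
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

print(prime_factors(35))    # [5, 7]
print(prime_factors(360))   # [2, 2, 2, 3, 3, 5]
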
  • Infinitesimals (things like dx and dy) bug me. We're allowed to divide by them because they're not *really* zero, but when we're adding them, we can treat them as zero because they're basically zero. Yes, I know their purpose is for limits and ratios, but it still bothers me that they are treated as both 0 and not 0.
    • Me, I always assumed it was just sloppiness. Yeah, you could do it rigorously with limits, but why would you? It's hard enough work as it is.
      • Infinitesimals can in fact be defined rigorously without limits by things like extending the reals to the Hyperreal Numbers. They are more complicated than but analogous to the reals, and are represented by an infinite sequence of reals (note: some sequences represent the same number, like 0.999…=1). In this case, a real r is represented by the hyperreal <r>=r,r,r…; infinitesimal hyperreals have sequences that converge to 0 iirc. Approaching calculus in this way is called nonstandard analysis.
    • The official line is that dx/dy is not division, it's just a way of writing differentiation - Leibniz's notation is still used because it's awkward to change everything.
      • And because if you have multiple independent variables, it's good to know which one you're taking a derivative on.
    • Treating them as both 0 and not 0 is just for ease of use. When a new student walks into a calculus class, it's easier to tell them "1 divided by 0 equals infinity" than say "the limit of 1/x as x approaches 0 from the right is infinity". Pretending that infinitesimals equal zero is just our way of doing limits using mental math.
    • You are not alone in asking this question: this was one of the big complaints lodged against calculus when it was first introduced. I find it helps to think of calculus as being interested in the behavior near a point, rather than at the point.
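    • That "behavior near a point" idea is easy to see numerically. A tiny Python sketch (the function and the point are arbitrary examples):

# The derivative of f(x) = x^2 at x = 3 is 6. The difference quotient
# never actually sets h to zero; it just takes h closer and closer.
def f(x):
    return x * x

x = 3.0
for h in (0.1, 0.01, 0.001, 0.0001):
    print(h, (f(x + h) - f(x)) / h)   # roughly 6.1, 6.01, 6.001, 6.0001 -> heading for 6
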
  • Anyone else ever noticed that teaching math is like one long series of lies, and then flipping back on what you said? "Oh, no, you can never subtract a big number from a smaller number..." "... unless you use negatives..." "You can't multiply fractions..." "...Until you find the least common denominator" "You can't find the square root of a negative..." "...without using i..." etc.
    • I can actually remember back in Kindergarten or 1st grade using a calculator and doing 3 - 8 just to see what it would show. I was confused about what the - meant in the answer because I didn't know about negatives at the time.
    • That's a problem with the way that math is taught in school, not with math itself. Also, who tells people you can't multiply fractions, and why the hell would you need the least common denominator to do it?
      • I think he meant adding fractions, which would require some common denominator, but it's usually faster to just multiply the denominators (which rarely gives the least common denominator) since half the time you have to simplify anyway.
    • It really depends on the teacher. My teachers were always pretty honest: "What happens if you subtract 10 from 9?" "You get a negative number, but we'll talk about those later. For now, just don't do it." That was satisfying: I knew it was possible, and I'd learn eventually.
    • It's part of all sciences, it just comes up most in maths. Lies to children so that they don't ask questions of teachers when they won't understand the answers.
      • You can still teach math and science without doing that; it's just harder. All you have to do is be up-front about which stuff will take a more advanced technique: "OK, this is a square root, taking the square root of a negative number requires complex number theory, and we're not ready for that yet."
    • Besides, some of them aren't even lies. It's perfectly true that you can't find the square root of a negative number at first, because you're working with real numbers. It stops being true once you start working with complex numbers, but before you introduce the concept of i students have no business working with complex numbers anyway.

Student: Can we get a square root of a negative number?
Teacher: ...No. You can't.

      • Frighteningly, some teachers of early-level math teach it in that annoying "you can't do this" "now you can because I said so" way because they actually don't get it either. My (tenured) sixth grade teacher happily taught the class that the area of a circle was pi*r*2. Not pi*r squared. Pi*r times two. Fortunately she was called out by some of her own students before it could stick with the others. Unfortunately, she tried to fight them to save face instead of shrugging and making sure the students got it right, and later was forced to admit she didn't know what exponents were. As a math tutor these days, I can say with great confidence that the vast majority of problems people have with math can be traced back to a teacher screwing them over somewhere early along the line instead of genuinely being "not good with numbers" -- most of the people I help have to be re-taught several things that were drilled into their heads outright wrong several years prior.
      • This Troper's boyfriend has been incredibly good at math since he was in elementary school, to the point where he knew algebra in first grade (he evidently had an uncle who was rather zealous about teaching him this stuff...). He used to get incredibly frustrated every time a teacher insisted that he stay on the same level as everyone else. Understandably, he eventually got tired of this, and one day in fifth grade, when everyone else was just beginning to learn how to solve equations and other extremely basic algebra stuff, he decided to create an algebra quiz and make his teacher take it. She nearly failed it.
    • This is how anything is taught. First we give the big picture, then we explain that it's not actually quite that simple. Compare a first grader's understanding of the American Revolution with a college student's. Both know how it ended, but the first grader has a very simplistic understanding of it.
    • This Troper, who is taking an independent study on complex variables, found an SAT II math question that didn't actually have a correct answer because of this problem. He had to actually ignore what he had learned that semester in order to give the "right" answer.
      • Don't they specify "the best answer" instead of "the most correct answer", for moments like that?
    • Math isn't just one body of knowledge. For instance, according to the Peano axioms, there is no number whose successor is 0. There's nothing wrong with the Peano axioms. But you also get something perfectly consistent if you replace that with an axiom that says every number is the successor of another number.
      • There isn't any number whose successor is zero, because the successor function is only valid for the counting numbers. When you define negative numbers, they don't have "successors," even though you can add one.
    • There was a group of French mathematicians who wrote a series of textbooks in mathematics under the collective pseudonym "Nicolas Bourbaki" which didn't ever "flip back on things" as the original troper described it. This was mainly because it bugged Jean Dieudonné, who would always threaten to resign if he thought someone was suggesting that they do this. The result really has to be seen to be believed, and although I like the book on General Topology, I understand that there are people whom this approach bugs even more. The flip-flopping bugs me too, which is why I prefer graduate mathematics textbooks to undergraduate ones, as undergraduate ones often give a misleading view of the subject, fail to take advantage of fundamental and important techniques because they are "too abstract", etc...
    • They tried to fix this once. That's where the "New Math" came from, and it was a disaster. As frustrating as it is to us looking back as adults, or to those with specific kinds of minds as children, it looks like kids generally have to rote-learn simple operations on the natural numbers before they can move on to understanding of the concepts behind them.
  • So what exactly is it about stuff like Algebra and above that makes it so hard for some people to explain? Even college professors often hate teaching college algebra because there's no way to explain it in a way that everyone can understand. I've literally seen it...half the class would understand what the professor said, and the other half would probably wonder how s/he got from Point C to Point D or think s/he pulled random numbers out of their ass.
    • The problem is that being able to do something and being able to teach other people how to do it in a lecture hall aren't the same skill. Explaining algebra doesn't have to be hard, but being good at algebra doesn't give you magic explanation-powers either. Moreover, a lecture on any subject will go over the head of a big chunk of the audience unless there's a lot of repetition and use of different techniques to cover the same topic, because no one lecture technique works for every person who listens to it.
    • Math gets more abstract as it gets more complex, and people have less everyday stuff to comprehend it intuitively. Integers? Well, those are like, how many things you have, right? So that's easy. Negative numbers and decimals? Well, those are just like money! Except money only goes to 2 decimal places, but it is simple enough to realize that it can go to more places. Then you step out of arithmetic and are suddenly dealing with things like variables and graphs of functions and it just becomes more arcane to people. Basically, for every level of complexity in math you go up (algebra to calculus to differential equations, for example), there is a smaller percentage of people who will be able to truly grasp and understand it (as opposed to, say, memorizing algorithms to get the right answer), because it just keeps on getting more abstract and counter-intuitive, and thus takes more brainpower to conceptualize; brainpower that some people simply don't have.
    • Another place that problems may arise is where the student is unable to articulate what he doesn't understand about the concept under discussion.
    • It's not just Maths. There isn't any single class without a student thinking what the f*** the lecturer is talking about. Of course, some subjects are more so than others.
  • Who decided that Pi is equal to 3.14159265358...Pop!
    • Why does math hurt my brain?
    • Noone "decided" that, Pi is defined as the circumference of any circle divided by its diameter, which just happens to be an irrational number.
      • Not only that, it happens to be a transcendental number.
    • If it helps, it hurt Pythagoras's brain too.
    • Remember that all (perfect) circles are exactly the same as all other circles except for size. So the diameter-circumference ratio will be the same for every single circle, and we call that ratio pi. Meanwhile, the diameter lives in line-land and the circumference in curve-land. At no "zoom level" is a circle's edge composed of tiny lines; it's curved "all the way down". Because of this, the circumference can't be given as exactly any number of diameters, or vice versa. We can't say "X diameters equal Y circumferences", and that's the definition of an irrational number. However, we can say that one circumference is more than 3 diameters and less than 4, that it's more than 3-and-one-tenth diameters and less than 3-and-two-tenths, and so on forever.
      • Actually, it's been determined that pi to 42 digits is enough to give the circumference of a circle larger than the universe to an accuracy of less than the diameter of one proton, which, in This Troper's opinion, is plenty accurate enough!
      • Sure... if all you want to do is measure actual, physical circles. Because of Euler's identity, conceptual perfect circles come up all the time in the solutions to differential equations - indeed, you could make a case that pi's real significance is the number that solves these equations, and circles are just a special case. It's not very common, but there are times when you need more precision in pi to compute a quantity you've come to in that way, especially if what you're actually looking for is the distance between two very close quantities, as with relativistic effects.
    • A key word put in that previous bullet is "perfect"; when you draw a circle in the real world, the amount of "stuff" its circumference and diameter are made out of can be expressed in terms of each other. (The circumference won't be infinitely curved, or infinitely thin, like a "real" circle. For that matter, the diameter won't be infinitely straight. But the closer to those ideals you get, the more digits of pi you need; 3.14 is enough for circles drawn with a compass.)
    • It might help to know that, while the number 3.14159265359etc looks like a random set of digits, it actually looks a lot more natural when written as a continued fraction. It would be hard to write these fractions here in ASCII format, so I recommend looking it up in the other wiki.
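      • A rough sketch of that in Python, if you're curious (double precision only gives the first several terms reliably):

# Peel off the continued-fraction terms of pi: take the integer part,
# then flip the fractional part and repeat.
import math

x = math.pi
terms = []
for _ in range(8):
    a = math.floor(x)
    terms.append(a)
    x = 1 / (x - a)
print(terms)   # [3, 7, 15, 1, 292, 1, 1, 1]
# Truncating after [3; 7] gives 22/7, and after [3; 7, 15, 1] the famous 355/113.
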
  • Why is it that logarithms aren't taught at the same time as powers and roots? There's three variables in a^b=c. Only teaching how to find two will inevitably lead to confusion.
    • a^b=c is the same as: c=a*a*a*...*a (so that a appears b times). It can therefore be easily (if laboriously) worked out with a pen and paper. Logs, not so much. The only ways to find them are with a calculator, a log table, or a slide-rule. Basically, it's the fact that logs aren't trivially solvable with arithmetic methods that means they're not introduced until later.
      • Okay, now try that with b not being an integer. :-P
    • Roots have the same problem, but they're introduced way before logarithms are. With roots, you generally either are given easy numbers like sqrt(25)=5, or are expected to leave it expressed as a root like sqrt(5) = sqrt(5). We could, but don't, do exactly the same things with logarithms: give easy numbers like log_5(25) = 2, or leave it as a log like log_5(2) = log_5(2). Later on, you can introduce hybrid problems like sqrt(50) = 5 * sqrt(2) and log_5(50) = 2 + log_5(2).
    • There are two reasons, really. First, logarithms used to be introduced much earlier (back when everyone needed them for slide rules and log tables), but as slide rules fell out of fashion they were pushed later in the curriculum, whereas roots and powers stayed where they were. Second, the order in which math is taught in secondary school tends to mirror the order in which the concepts were developed: approximating irrational square roots goes back to prehistory, higher roots to Classical Greece, and general methods to medieval Persia, whereas logarithms other than those of integer powers don't show up until the seventeenth century.
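    • A quick Python sketch of the parallel drawn a couple of bullets up (the bisection loop at the end is just one crude way to grind a log out by hand-style arithmetic, not how anyone does it in practice):

      import math

      # The "easy number" cases:
      print(math.sqrt(25))      # 5.0
      print(math.log(25, 5))    # 2.0 (up to rounding), since 5**2 == 25

      # The "hybrid" cases: peel off the easy part, keep the awkward part.
      # sqrt(50) = 5 * sqrt(2)  and  log_5(50) = 2 + log_5(2)
      print(math.sqrt(50), 5 * math.sqrt(2))
      print(math.log(50, 5), 2 + math.log(2, 5))

      # Why logs arrive late: finding log_5(2) by hand means searching for
      # the exponent, e.g. by bisection on 5**x == 2. Tedious but doable.
      lo, hi = 0.0, 1.0
      for _ in range(50):
          mid = (lo + hi) / 2
          if 5 ** mid < 2:
              lo = mid
          else:
              hi = mid
      print(lo)  # about 0.4307, i.e. log_5(2)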
  • What exactly constitutes a "number"? A rational number is normally defined as an ordered pair of integers. A real number can be defined as (an equivalence class of) Cauchy sequences of rational numbers. Given that, I don't see why people tend to have a problem calling vectors and matrices numbers. But how far does it go? Is a point a number? How about a line? Is a set a number?
    • A number is something we can intuitively perceive as a number (math is full of duck typing). The exact notion of what we intuitively perceive as a number varies depending on the time period -- negative, non-integer, and irrational numbers were not considered numbers at different points in history, and complex numbers still commonly aren't. Definitions are given for scientific rigor, and can in fact be extremely clumsy and less intuitive than the concept itself (just look at the set-theoretical definition of natural numbers through the axiom of infinity). So, strict definitions are used when we need to ground the new concept in existing ones instead of just saying "There is a number whose square is -1, because we say so!".
  • Why doesn't anybody seem to teach matrices well? They're sets of linear equations. People do operations on functions all the time, so it doesn't seem like it would be too confusing to tell people they're multiplying a set of linear equations (it's actually equivalent to composite functions, but you get the idea). A lot of people have problems with fractions. This is like teaching people the rules of fractions, but never telling them what they actually mean.
    • This is actually a great point, especially since matrices start getting very important in university-level math!
      • It's actually possible to learn matrices much earlier than they're taught. I personally think the reason lots of people don't understand quantum mechanics very well is that they don't understand matrices very well.
        • I'm currently taking computational linear algebra and so far everything I have seen could have been included in a high school algebra class or taught as a separate course. I looked ahead at the book and it looks like there are some applications for differential equations coming up, but I think they really should teach this stuff in high school. It makes solving systems of linear equations a lot easier and requires no calculus or trig, just algebra and geometry.
    • Another good question is why matrices aren't taught first as representations of linear transformations. Most linear algebra classes (especially ones with a computational bent) spend a good month noodling around with matrix row operations before introducing the linear transformation. Wouldn't doing things the other way make more sense? (The sketch after the example dialogue below shows how little machinery that view needs.)
    • The above is a terrible way of teaching matrices.

Student: So what are matrices for?
Teacher: They're used in linear algebra.
Student: What's linear algebra?
Teacher: It's something involving vectors, which I'll teach you about in a few months, though you won't actually learn any linear algebra until 1st year college.
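    • For what it's worth, the "matrices are linear transformations" view suggested above needs very little machinery to demonstrate. A minimal Python sketch (the helper names and the two example matrices are just made up for illustration): applying a matrix is a function on vectors, and the matrix product is defined precisely so that it acts as the composition of those functions.

      # A 2x2 matrix acts on a vector (x, y) as a linear function.
      def apply(m, v):
          (a, b), (c, d) = m
          x, y = v
          return (a * x + b * y, c * x + d * y)

      # Matrix multiplication is defined so that applying the product is
      # the same as applying one matrix and then the other (composition).
      def matmul(m, n):
          return tuple(
              tuple(sum(m[i][k] * n[k][j] for k in range(2)) for j in range(2))
              for i in range(2)
          )

      rotate90 = ((0, -1), (1, 0))   # quarter turn counterclockwise
      stretch_x = ((2, 0), (0, 1))   # double the x coordinate

      v = (1, 1)
      print(apply(stretch_x, apply(rotate90, v)))   # rotate, then stretch
      print(apply(matmul(stretch_x, rotate90), v))  # same thing in one step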

  • You are so wrong, my friend! It is true that the mathematical faculties of which you speak are closely tied in with the same linguistic intelligence faculties which together combine to make a large part of one's deductive reasoning faculties, but those two components are still separate in and of themselves, and rationality is itself still but one part of the difficult-to-capture aggregate known as the intellect. I, for instance, have a massive amount of skill with my linguistic faculties (I remember one time in elementary school when my principal had me read and explain to him a paper from his office because he didn't know what some of the words meant) yet have a disability in math, giving me approximately the same math skills as your average dead sea bass, and I find it hard to think of a time in my whole life when anyone said I had any problem impressing them with my reasoning skills. The mind is a funny, fickle, mysterious thing.
    • For the same reason people who are illiterate (whn u typ lyk dis) who have no reason to be (they're not living in poverty, they don't have any learning disorders) are considered stupid. I mean, I'm not saying you have to be a whiz at advanced calculus or anything, but if you can't do basic algebra because "that's too hard, so I won't even try!", then yeah, you're a moron.
      • If you can't tell the difference between stupidity and simple laziness, you're probably a moron.
      • If your "simple laziness" extends to levels of not bothering to learn the difference between there/their/they're and being incapable of solving simple algebraic systems like 2x = 4, that's stupid and you're a moron.
      • Funny, though, just how far beyond 2x=4 and there/their/they're these things get long before it ceases to be a problem even before school is over, isn't it?
    • Note that it's exceedingly hard to tell if someone is actually bad at math, or if they were just taught poorly by a subpar teacher and the student lost so much ground there that they couldn't catch up later -- the latter happens depressingly often at grade school levels (see above for a teacher who didn't know what exponents were, or one who nearly failed a quiz prepared by her own student).
      • Not that hard, methinks... you just have to test them with questions that require little mathematical knowledge, but a lot of ingenuity. The SAT Math does this, requiring only basic algebra and geometry. It should be possible to make a similar test composed of questions that require only basic arithmetic for people who claim they had a bad algebra/geometry teacher (and no teacher I've ever seen, no matter how bad, has been able to screw up teaching arithmetic - they may have added a lot of unnecessary bullshit that made the whole thing unpleasant, like memorizing multiplication tables, but they got through the core stuff well enough; that's the one math everybody knows).
    • Fun Fact: It's thought that 5th grade math is what determines how you'll do in future math classes. This is because 5th grade is when math becomes more focused on abstract concepts (fractions, very basic algebra) rather than concrete ones (as in anything you can use physical objects, such as fingers, to illustrate). If you do poorly in that you make grasping more difficult concepts even harder. That said, it's pretty basic knowledge that not everyone has the ability to understand certain levels of math. Saying that someone is a moron because they aren't good at math is in itself moronic. That said, if you never even try to learn it, you ARE a moron (I may suck at math, but at least I tried to learn).
  • Does it bug anyone that all four-sided triangles have exactly two sides?
    • It probably bugs some people, but those of us who have studied logic as well as math are familiar with the idea that a false proposition implies any proposition.
      • It's easy to give a simple proof of the above statement, without resorting to the principle of explosion. The usual definition of a triangle is a planar polygon with three sides. Therefore, a triangle with four sides implies that 3 = 4, since each polygon has the same number of sides as itself. Subtract 1 from both sides of this equation to obtain 2 = 3. Therefore 4 = 2 by transitivity and so it also has two sides.
      • I'm reminded of Raymond Smullyan's answer, when a friend didn't believe in the principle of explosion: "Do you mean, from 2 + 2 = 5, it follows that I am the Pope?" "Yes: 2 + 2 = 5. Therefore, 2 = 3. Therefore, 1 = 2. The Pope and you are two people. Therefore, the Pope and you are one person."
      • Bah, the above was weak. Here's how it's done properly
      • Credit where credit is due: the above proof is quoted in one of Smullyan's books but the originator was Bertrand Russell.
  • This isn't so much a JBM but...what in the name of all that is green and good on this earth is infinite summation? I'm not a mathematician, I have no problem with trying to understand abstract ideas but I can't help but be iffy on a subject when I ask my teacher "what IS this?" and the response is "..." and an eighty minute lecture on How to Use Excel for Dummies. Please, somebody explain what this is, as far as I can tell it's some extension of arithmetic and geometric series and sequences (which I think I understood but I resented having to learn it, though I admit taking SL math was my own choice and I can blame nobody but myself for learning math I will never ever need or encounter ever again once my final exams are done)...that or it's the mental equivalent to water torture, which is improbable.
    • Basically it's a way of saying that (in a series whose terms shrink fast enough) each term you add gets you closer and closer to a particular number, and if you added EVERY TERM UP TO INFINITY you would get exactly that number. Think of it as being like a limit.
    • It's actually rather simple. Finite summation is when you add the terms described to the right of the sigma, from the number on the bottom of it up to the number on the top of it, right? Well, you can let the top number be a variable, and then to make an infinite sum, you take the limit as that variable increases without bound (that is, its limit is positive infinity). As for the series and sequences connection: a series is an infinite summation, while a summation is when you add together (sum) the terms of a sequence.
    • Let a1, a2, a3, ... be elements that you sum. Take the partial sum p(n)=a1+a2+...+an. An infinite sum is the limit of p(n) when n tends to infinity.
    • Here's probably the best way: if there's a number such that, for any open interval you can define that contains it, there's a point after which no matter how many times you add the next term as it's defined, it will never leave that interval, that's the sum. Consider: that's how we think of non-repeating decimals - each figure added narrows the range in which the number is, say from between 3 and 4, to between 31/10 and 32/10, to between 314/100 and 315/100, 3141/1000 and 3142/1000, and so on. (And you have no idea how much willpower it took to leave the fractions in that form so you'd recognize the number...)
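    • To see the partial-sum idea above in action, here's a throwaway Python check using the series 1/2 + 1/4 + 1/8 + ...: the partial sums p(n) creep up on 1 and, once inside any interval around it, never leave.

      # Partial sums p(n) = a1 + ... + an for the series 1/2 + 1/4 + 1/8 + ...
      total = 0.0
      for n in range(1, 31):
          total += 1 / 2 ** n
          if n in (1, 2, 5, 10, 20, 30):
              print(n, total)
      # 1  0.5
      # 2  0.75
      # 5  0.96875
      # 10 0.9990234375
      # 20 0.9999990463256836
      # 30 0.9999999990686774
      # The infinite sum is the number these close in on: 1.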
  • Godel's Incompleteness Theorem. Can someone explain exactly how Godel proved it's impossible to prove a mathematical theorem? (I'm not a math major, by the way, so simpler is better.)
    • Just to clarify, Godel's Incompleteness Theorem does not say that it's impossible to prove a mathematical theorem, as that is clearly not true. What it does say is that for any consistent formal theory that proves certain basic arithmetic truths, there is an arithmetical statement that is true but not provable in the theory. The proof is extremely difficult, but the key point is that it is possible to construct a formula that in essence says "This formula is not provable", and show that it is neither provable nor disprovable (massively oversimplified explanation of that last point: if it's provable then it follows that it's not provable, and if it's disprovable then it's provable. It's similar to the paradoxical sentence "This statement is false").
    • To put it in simpler terms, Godel didn't prove that you can't prove ANYTHING, but that you can't prove EVERYTHING. There will always be certain things that are true, but cannot be mathematically derived from a limited set of axioms.
    • The proof, in intuitive terms, goes something like this: take some logical system - call it Bob. Godel showed that if Bob can express the basic truths of arithmetic, then there is a way of saying (within the rules of Bob) "this sentence cannot be proven by Bob". If you can prove the sentence, it is false, and if you can't, it is true but not provable. Therefore, any such system is either inconsistent (contradicts itself) or incomplete (can't tell you everything).
    • If Godel had indeed proven that you can't prove anything, then his proof would contradict itself, now, wouldn't it? Read Godel, Escher, Bach by Douglas Hofstadter for a relatively simple explanation, although if you're ok with a REALLY simple explanation the above is actually not bad.
  • Is math real? Plato had the idea-world as far as I remember and in that numbers exist. Also the it-ness of a horse. (I haven't brushed up on this...). So is math real? Is the number 2 real in some way? Does it exist as a separate entity? How can math prove something, like string-theory (or m-theory, whatever) when it is dealing with something that cannot be experienced, and does not follow logic? How can you ascribe a value to something you haven't seen? Something that follows no other rules?
    • What is real?
    • Math is an abstract concept like language. The things involved are not "real" exactly (beyond being symbols on a computer screen / noises we say out loud / whatever), but in the same way that we can define "horse" to refer to one of those big four-legged things that goes "neigh", we can draw connections between the maths and the real world - if we say that there are five horses then that means that we can take the set of horses and the set of integers {1,2,3,4,5} and assign one number to each horse with nothing left over in either set. With things more complicated than just counting, the connection becomes a bit less obvious, but for example once we invent units we can measure things like distance and time, and from that we can derive area (length * width), speed (distance / time), acceleration (the rate of change of speed with respect to time) and so on. And of course, maths doesn't need to refer to the real world any more than language does.
    • Fundamentally, all of mathematics (at least, the kinds that most people work with) follows from the rules of arithmetic. Each level beyond arithmetic develops a concept, then uses the concepts of arithmetic to prove properties of the new idea, and so on. Since arithmetic is clearly grounded in reality, so is (some of) higher math.
    • This is actually a good question. I'll paraphrase my professor: "A mathematical object is what it does." Math is constructed based on properties of objects and operations. Of course, the way we are taught in elementary school, we associate numbers with objects, which is great...for elementary school level math. Eventually, you have to separate the idea of "two-ness" from the two oranges on the table so we can deal with more abstract (but still very real) ideas.
      • Unless you are a pure materialist, math is obviously real.
      • It would be better to say that it reflects things that are real. Imaginary numbers, Hermitian matrices and other such things are not real in the sense that there would be no such objects unless mathematicians constructed them, yet they are useful. Reality never thinks to itself, "Ok, now I'm going to undergo a linear transformation," but the nature of physical laws may make such objects useful in predicting what will happen. It's a confusing subject, and I'm not sure if I can explain it well, but most pure mathematicians would agree there is nothing real about the objects they work with.
        • The objects are, from a strictly material viewpoint, entirely different, and there is no way in the world to avoid going into intangible, purely abstract fact to explain how two pairs of objects that are utterly different physically (say, a pair of puddles of water and a pair of bars of gold) still share the same element of duality that they also share together on a different level--their "two-ness", if you will. The mere fact of the two-ness is something completely real which cannot be comprehended except on an invisible, purely mental level--or at least not as of current scientific understanding. That fact is very, very objectively real.
    • Which reminds me of the famous "Two Plus Two Makes Five" thought experiment. In philosophy yes you can argue that 2 + 2 = 5 using the "math does not exist" argument, but we stick to the conventional 2 + 2 = 4 simply for representational, practical, utilitarian reasons. Same way we invented language to represent Real Life nameless objects. We got used to using "two and two" objects as four and not five, so we'll use that, and you don't want philosophy causing you Centipede's Dilemma while working on scientific problems.
  • If a dice rolled a certain number, are the chances of not rolling the same number 19/20?
    • Assuming that it is a fair 20-sided die ("dice" is plural), and ignoring weird circumstances like the die balancing on one corner, then the odds of rolling any particular number are one in twenty. Since the events are mutually exclusive (i.e. you can't roll a one and a six at the same time), we can add together the probabilities of each of the nineteen other outcomes to get the probability of any one of them occurring: 19/20. Another way to look at it is that the probability of getting a different number is one minus the probability of getting the same number, which again gives us 19/20.
      • tl;dr: Yes.
    • It's worth noting that individual rolls are generally treated as independent, so rolling a 20 doesn't make you any less likely to roll another 20 on the next roll. You can calculate the probability of getting two particular consecutive rolls (or even n particular consecutive rolls), but no one roll affects the next one (unless you're cheating).
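    • If anyone wants to check the arithmetic empirically, here's a minimal Python simulation (the trial count and seed are arbitrary); it just confirms that a roll differs from the previous one about 19/20 of the time on a fair d20.

      import random

      random.seed(0)          # fixed seed so reruns give the same estimate
      trials = 100_000
      different = 0
      previous = random.randint(1, 20)
      for _ in range(trials):
          roll = random.randint(1, 20)
          if roll != previous:
              different += 1
          previous = roll
      print(different / trials)   # hovers around 19/20 = 0.95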
  • Why is it that when you ask a teacher how this principle can be used in the real world, they say, "Oh, lots of places!" and NOTHING MORE. I mean, I understand where geometry can be used in the real world, but what about complex numbers? They're fun and all, but WHAT IS THE PRACTICAL APPLICATION, and WHY CAN NOBODY TELL ME?
    • To start with, as mentioned above, imaginary numbers can be used to model real-world phenomena. Most importantly, complex numbers can be used to describe things like vectors, movement in multiple dimensions, and electricity; in fact, electrical engineering is almost impossible without complex numbers (there's a short worked sketch at the end of this thread). Mostly you have bad teachers if they can't give enough context to show the usefulness of the math. Most math was invented to solve practical problems; occasionally, though, as with complex numbers, the math was developed first and the applications were found later.
    • Careful not to confuse "complex numbers" with "complicated problems", though. Matrix math is a related tool: there are a lot of useful things you can do with matrices, for example solving a system of equations where you only know certain relationships, say x + 3 = y and y + z = 9. But that kind of problem needs no imaginary numbers at all; the two ideas only really meet in places like the 2x2 real matrix that behaves exactly like the complex number a + bi.
    • If teachers actually understood and knew how to use the applications, chances are many of them wouldn't be teachers in the first place. Rest assured, though, they do exist.
    • This from an electrical engineering student: complex numbers help working with alternating-current circuits. This covers power systems, transformers, electric machines, AC circuit analysis etc... Instead of having to work with sines/cosines in differential equations (a task that can become rather nasty in some situations), one can use complex numbers that represent those functions and are easier to work with a calculator or even by hand. We tend to call them phasors, by the way.
    • Read the first half of the annotation for Irregular Webcomic #1960 here; it will explain some of it. Also, start the Archive Binge if you're into roleplaying, any sort of science, art, history or Star Wars. This was a paid advertisement, thank you.
    • There is stuff like percentages, probabilities, adding and subtracting, multiplying and dividing that you should learn to do for small practical gain, but that's the extent of mathematical concepts having any practical use. Do it for fun, try to understand it if you can afford yourself that sort of luxury, think and create new stuff. That's what math is all about. Unless you're going to be a mathematician, physicist, engineer or a programmer; then it's a tool as well as an art. For the remaining 99% of us, it's just an art class.
      • If more people understood just the basic math of engineering, all kinds of things in science and economics could be discussed at an adult level, rather than politicians fighting to fool intellectual preadolescents with rhetoric. I would go so far as to say that the fact that innumeracy of this level is acceptable is literally the single greatest problem facing society, and if it is overcome, people will look back at that event as we look at the introduction of the printing press to the West.
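    • To make the phasor point above concrete, here's a minimal Python sketch using the built-in complex type (the amplitudes, phases and the sample instant are made-up numbers, purely for illustration): adding two same-frequency sinusoids reduces to adding two complex numbers.

      import cmath, math

      # Two AC signals at the same frequency: 5*cos(wt) and 3*cos(wt - 90 deg).
      # As phasors (complex amplitudes) they simply add.
      a = cmath.rect(5, 0)                   # amplitude 5, phase 0
      b = cmath.rect(3, math.radians(-90))   # amplitude 3, phase -90 degrees
      total = a + b
      amplitude, phase = abs(total), cmath.phase(total)
      print(amplitude, math.degrees(phase))  # about 5.83 at about -31 degrees

      # Sanity check against adding the actual waveforms at one instant:
      t = 0.37  # arbitrary value of wt, in radians
      direct = 5 * math.cos(t) + 3 * math.cos(t - math.pi / 2)
      via_phasor = amplitude * math.cos(t + phase)
      print(direct, via_phasor)  # the two agree to rounding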
  • This is more of a statistics question than an actual IJBM, but here we go. I work in retail and we frequently have %-off sales. Our usual is 60%, but sometimes an additional 20 gets added on top of that to certain items. I tell this to my customers and some exclaim, "Wow, so that's 80%!" I never know what to tell them because I utterly fail at math. The small part of my brain that does remember high school algebra is telling me percentages don't add up that way. Am I right?
    • The question is a bit ambiguous, by the additional 20% do you mean 20% of the original price or 20% of the reduced price? If it's the first then the customer is right (as always) - if the item cost £10 then you get the 60% saving of £6 plus an additional 20% saving of £2 for a total 80% saving of £8. On the other hand if it's 20% of the reduced price then the saving will be smaller - for a £10 item you save £6, but the second saving is 20% of what's left, i.e. 20% of £4 which is 80p for a total of £6.80 or 68% off. If the savings are x% followed by y% off the reduced price, then the total saving is x + (100-x)*y/100 percent.
      • I'm sorry, I should have clarified. 60% off, plus 20% off on top of the 60.
      • The important question is still "what is that 20% of?" Percentages don't really mean much on their own, they need to be a percentage of something, in this case the two sensible options are 20% of the full price or 20% of the reduced price. I would guess that it is most likely 20% of the reduced price, leading to a total saving of 68% off the full, but either way makes sense.
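    • In code, the two readings discussed above come apart like this (the price is just an example figure):

      price = 10.00

      # Reading 1: both discounts come off the original price -> 80% off.
      off_original = price * (1 - 0.60 - 0.20)        # 2.00 to pay

      # Reading 2: the extra 20% comes off the already-reduced price -> 68% off.
      off_reduced = price * (1 - 0.60) * (1 - 0.20)   # 3.20 to pay

      print(off_original, off_reduced)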
  • I'm not being sarcastic, and I am sincerely trying to expand my knowledge here, but is there ANY real world application for pure math?
    • That's kind of a leading question, since once any aspect of pure math starts to have real world applications, that aspect starts to be called applied math instead, so the answer is no pretty much by definition, but:
      • That being said, number theory, elliptic curves, and the attendant abstract algebra used to be among the most obscure of the pure math topics. Now they're the basis for the cryptography that allows stuff like online banking to happen safely. (There's a toy sketch of the idea at the end of this thread.)
      • Before that, the early-20th-century attempt to put all of mathematics on a footing of abstract formal logic played no small part in the development of the computer. Besides, some aspects of formal logic are now finding use in validating stuff like microprocessor designs and safety-critical systems.
      • Lie algebras, once obscure, are now part of the mathematical basis of the Standard Model of particle physics.
    • You're gonna have to define "pure math" here. If you mean it as the opposite of applied math, then the answer is obviously no by definition, but I think you would have trouble finding much of it in the average mathematics student's curriculum. And just because something has no application now doesn't mean a future one won't be discovered eventually; that happens all the time.
    • A lot of pure math exists simply to add rigor to applied math. Without rigorous foundations, all of mathematics would fall apart. That said, the pure math/applied math boundary has been steadily dissolving for the last two centuries. Now group theory is essential in combinatorics, which is the basis of a good chunk of probability. Analysis is essential to the study of dynamical systems, which is useful in analyzing chaotic behavior in the real world. And, as people mentioned above, computers wouldn't exist without rigorous mathematical logic, cryptography would be nowhere without number theory, and quantum physics depends on abstract algebra. There are theorems that at first glance seem to have nothing to do with the real world, but which, when you look closer, are actually very profound in their consequences.
    • Balancing your checkbook.
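    • And to make the cryptography example above concrete, here's a toy sketch of the RSA idea in Python. The numbers are textbook-tiny and utterly insecure; the point is only that the whole thing runs on plain number theory (primes, modular arithmetic) that once looked like the purest of pure math.

      p, q = 61, 53
      n = p * q                  # 3233, the public modulus
      phi = (p - 1) * (q - 1)    # 3120
      e = 17                     # public exponent, coprime to phi
      d = pow(e, -1, phi)        # 2753, the private exponent (needs Python 3.8+)

      message = 42
      ciphertext = pow(message, e, n)    # encrypt: message^e mod n
      recovered = pow(ciphertext, d, n)  # decrypt: ciphertext^d mod n
      print(ciphertext, recovered)       # recovered comes back as 42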
  • WHY THE HELL DOES LEFTPONDIA INSIST ON CALLING IT "MATH"?!
    • Because they only have one.
    • Same reason you guys insist on adding extra u's to perfectly good words?
    • In North America, we are cutting the word off after four letters. In British English, you are shortening the name of the whole concept of mathematics. Both are fine.
      • It makes more sense to think of the noun as an uncountable.
    • And because "maths" is hard to pronounce. But I will gladly switch when you present to me exactly one math.
      • But surely the British have been known to play more than one sport.
  • I am saddened that the Collatz conjecture is not actually part of a Soviet conspiracy to slow down mathematical research in the U.S., because that would have been awesome.
    • Don't be silly. It's actually a plot by time travellers from the future to give computers something interesting to think about so they don't get bored and start thinking about how much better they could run the world if they were in charge.
  • If you take a Möbius strip and poke a hole in it, the hole goes from one side to the same side?
    • Yes, and? Are you expecting it to create a new side, or something?
    • Mind = blown.
    • If you take a sphere and poke a hole in it, the hole goes from one side to the same side? If not, where does a side end?
  • The Monty Hall Problem. Is it really so hard to understand that opening the non-selected/non-winning door won't... uh... somehow retroactively increase your odds of winning to 1/2? I mean, I can be understanding when someone is hitting intuition dissonance and can't visualize what's going on... but when school teachers are flat-out insisting that each door has a 2/3 chance of winning? ASDFADFADS
    • In certain situations the opening of the door does change the probability. If the host didn't know where the prize was and chooses at random, then there was a probability of showing the car: that this hasn't happened provides information to the contestant (namely, the host showing a goat is more likely if the contestant picked the car) and changes the probability to 1/2. On the other hand if the host knows where the car is and always opens the other door (to the one he has opened) when he can, then the fact that he can't open his preferred door provides the contestant with a certainty of the car being behind that door. However, given that the situation is in the context of a game show, the 1/3 option is most sensible since the others ruin the format.
    • I actually designed an experiment to test this, and tried it on my mum. I used three ordinary playing cards, two black and one red. The red card represented the prize, and the black cards were the booby prizes. I would shuffle the cards and lay them on the table, then mum would pick one. When she did, I would flip over a black card that she did not pick, and then offer her the chance to choose again. When I looked at my results, mum got the red card more often when she switched and less often when she stayed. In fact, the rate of success was about 2/3 for a switch and 1/3 for a stay. My mind was blown.
    • Using a computer, this troper compared switching vs. not-switching over hundreds of thousands of trials. The result? Switching resulted in a win 2/3 of the time, not-switching only 1/3. Simple. (A minimal version of that simulation appears at the end of this thread.)
    • Also, MythBusters actually tried this out in the real world by having Adam always switch and Jamie always stay. The results were exactly the same as the above troper, switching had a 2/3 chance and staying had a 1/3 chance.
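    • For anyone who'd like to rerun those experiments themselves, here's a minimal Python version of the simulation (trial count and seed are arbitrary):

      import random

      # The host always opens a goat door that the contestant didn't pick.
      def play(switch, trials=100_000):
          wins = 0
          for _ in range(trials):
              doors = [0, 1, 2]
              car = random.choice(doors)
              pick = random.choice(doors)
              opened = random.choice([d for d in doors if d != pick and d != car])
              if switch:
                  pick = next(d for d in doors if d != pick and d != opened)
              wins += (pick == car)
          return wins / trials

      random.seed(1)
      print(play(switch=True))    # about 2/3
      print(play(switch=False))   # about 1/3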
  • So in college I tried to get trig, but it would always lose me around the time we start the whole pi/2 sine wavey...stuff. I don't even think I was given a name for that. Thus, I never really finished the class (I must try again...) the three times I took it. Yes. Three times. I've had the feeling since I failed the first time (withdrew really, as it helped me realize I wasn't cut out for a Comp Sci major) that I am some kind of sub-human, a resource drain that cannot understand or truly explain reality - and thus should make way for his superiors. This bugs me.
    • There are many ways of understanding reality. You'll find your own. Hopefully.
    • You are not sub-human. A lot of humans can't do trig or even algebra (they just fumble through it in high school and then promptly forget it). That said, if you are stuck at the level of trig, then yeah, you will never have a very good understanding of reality, which we have discovered to be governed by mathematical equations. Maybe try doing calculus on your own? There are a lot of non-trig-dependent concepts and algorithms there that you might be better at.
    • Not to be flippant, but chill out, man! It's okay that you don't understand some mathematical concept. You have trouble with it...so what? I'm a multilingual writer--words are my thing. But you know what I'll never be able to understand? Arabic. DAMN, that is a hard language. But I'm no less human because of it.


  • 1 + e^(pi*i) = 0. It is really counter-intuitive that a number raised to an imaginary/complex power produces a sinusoidal wave. I'm not the only one who thinks so.
    • It is a remarkable result, but it makes perfect sense. There are many such strange results, some of which are really counterintuitive.
    • You should look at it from a geometrical point of view. If you're OK with the fact that a^n * a^m = a^(n+m), and if you're OK with the fact that rotating by an angle x and then by an angle y is the same as rotating by the angle x+y (obviously, there's an analogy with the previous relation), and finally if you're OK with the link between e^(ix) and the unit circle, THEN it should be perfectly clear why there's a sinusoidal wave around. (The sketch at the end of this thread checks these facts numerically.)
    • Speaking of that equation, why isn't it e^(pi*i) = -1, or better yet, e^(2*pi*i) = 1? What's so elegant about walking halfway around a circle, turning 90 degrees, and walking towards the center?
    • The reason for that is that it's a simple equation that contains five fundamental constants in mathematics and nothing else. For any purpose in mathematics or elsewhere, you'll typically see it put that way, or the more useful general equation e^(theta*i) = cos(theta) + i*sin(theta), but it's just the aesthetic of having those five numbers, 0, 1, e, pi, and i, in one simple equation.
      • Except that 2pi is considered a more fundamental constant than pi, the use of which is mostly just a matter of historical accident. (See the top of the page).
        • Considered by whom? Mathematics teachers and mathematicians are generally fine with pi. Physicists, on the other hand... but we don't care about them, do we?
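    • For the numerically inclined, those facts are easy to check with Python's built-in complex numbers (the sample angles are arbitrary):

      import cmath, math

      # e^(i*pi) lands at -1 on the unit circle (up to rounding error).
      print(cmath.exp(1j * math.pi))       # (-1 + ~1e-16j), i.e. -1

      # e^(i*theta) really is cos(theta) + i*sin(theta)...
      theta = 0.7
      print(cmath.exp(1j * theta), complex(math.cos(theta), math.sin(theta)))

      # ...and multiplying these numbers adds the angles: composing rotations.
      x, y = 0.7, 1.1
      print(cmath.exp(1j * x) * cmath.exp(1j * y), cmath.exp(1j * (x + y)))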
  • On a related note to the "cannot divide by zero" question above: if dividing can be understood as "x/y = z, where z*y = x," then would 0/0 = every number? After all, if we plug in 0 for both x and y, then z can (must?) be every number. On the other hand, that would imply that z =/= z after all...
    • Not exactly, but you've got the general idea. 0/0 could be anything, and it is possible to determine what it makes sense for it to be in certain contexts given extra information. That's why 0/0 is called "indeterminate" while every other division by zero is "undefined" instead. Read this for a more thorough explanation.
    • Nope. Any number divided by zero is twelve. Therefore, since 0 is generally regarded as a number, 0/0 must be twelve.
    • If a limit comes out to the indeterminate form 0/0, you can often use l'Hôpital's rule to find the actual value.
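    • The "it depends on the context" point is easy to see numerically. A throwaway Python check (note this is about limits that take the 0/0 form, not literally dividing zero by zero): three ratios that all look like 0/0 at x = 0 settle on three different values as x shrinks.

      import math

      for x in (0.1, 0.01, 0.001, 0.0001):
          print(x,
                math.sin(x) / x,           # heads to 1
                (1 - math.cos(x)) / x**2,  # heads to 1/2
                (3 * x) / x)               # heads to 3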
  • When someone says that something is "20% better," does that mean that the new version is 120% of the old one, or that the old one is 80% of the new one?
    • Presumably, it's the former, as one could argue that the latter would be a 25% increase, and that would sound better to buyers, so they would use that.
    • On the other hand, the second interpretation could be used to downplay the severity of something bad, like "deaths have only gone up 20% in the last week," instead of 25%.
    • This sort of thing is why mathematicians have so much jargon, to avoid the ambiguities of normal human language.
    • It gets really bad if they say "120% better" and you're not entirely sure if they meant 120% of the old version or 220% of the old version.
    • On a different subject but the same topic, my orchestra teacher would say that we sound 100% better when we improved. I think she meant x100 better, but I didn't have the heart to correct her.
      • I think you probably only got twice as good, realistically.
      • No, all of you just got better.
    • Doesn't that really mean that it's the same quality but 20% more expensive? I'm confused now...
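    • In plain numbers, the two readings from the top of this thread come apart like so (100 is just a convenient base figure):

      old = 100.0
      new_a = old * 1.20   # "20% better" as 120% of the old value: 120.0
      new_b = old / 0.80   # "the old one is 80% of the new one": 125.0,
                           # i.e. actually a 25% increase
      print(new_a, new_b)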
  • What really bugs me is a common theme in the discussions above: "Everything must be useful/applicable to real life in order to be motivated." What does that even mean? Is pure math a waste of time unless someone eventually finds a way to apply it? What about art, music, entertainment, sports, history, not to mention philosophy - all useless too? Let's face it, everyone finds "use" in different things. The only way to objectively determine something's usefulness is whether it is profitable, and if that's the optimal solution then society should immediately prevent people from studying anything other than engineering or economics. Just send me to another planet before you start.
    • Anything is useless unless it's intrinsically valuable, or can be used for something that is. I believe that happiness is the only thing that's intrinsically valuable. As such, if pure math is just entertaining, that's enough, so long as it entertains you. Otherwise, find a new use for it. If you talk about multiple things being intrinsically valuable, you have to start comparing them. How much truth is a human life worth?
      • Six pounds. Whether that's weight or money is left as an exercise for the reader.
  • Maybe this is just a high school education thing, but how come a lot of people my age don't seem to realize how interest or Credit Cards work? Or some stuff like basic accounting/finance? Do they just use deceptive language that people don't know, or do schools assume that since it's basic stuff you'd already know it, so stuff like Calculus looks better on a college application?
    • This troper can't speak for all school districts, but I was taught financing and other "real world" math concepts like how to balance a checkbook in the sixth fucking grade, almost a decade before I would actually use it. In late high school, much closer to the time I'd actually be doing that, I was learning stupidly-elevated math that I will bet money I will never use again in my life. So in conclusion, students are either not being taught it at all, or are being "taught" it years too early.
    • If you're making a steady paycheck, or even running a steady business, you'll likely be fine, but if you want to keep track of economic trends at all, you'll need what's known as "business calculus." Or if you want to analyze any kind of marketing buzz, or if you want to make informed decisions in the voters' booth.
  • Who thought Online Math Classes were a good idea? Look at some of the stuff on the Troper Tales page of Everyone Hates Mathematics!
    • Don't underestimate them. I still firmly believe the online course I took in Algebra I was much better than the Algebra I crap my school gave.
    • This troper did better in her online math courses than many traditional classes.
  • Why do some places use periods for separating groups of thousands and commas as a decimal point (like say, 31.943,24) , while others use it the other way around (like 31,943.24)? It's confusing!
    • Same answer as to why some use lb and others use kg.
  • The formula for finding arc length really bugs me. It seems like a very simple concept: how long is the line from A to B (for the non-math people: imagine taking a string and shaping it so that it follows some mathematical formula, like a graph, then ask how long the piece of string is), but the mathematics just causes it to grind to a halt. I do understand how it's derived, but it seems like the Math God is playing a joke on us. He says, "Ok, so you want to find the arc length? First, find the derivative. Then square it. Then add one. Now take the square root. NOW INTEGRATE THE WHOLE THING FROM A TO B! AHAHAHAHAHAHA!!!"
    • Oh, God, I remember that. And the way I was taught, I learned it before Taylor series, so I was left staring at what seemed impossible integrals if I tried it for anything but a few spoon-fed problems... note to those preparing math curricula: never do that. The same goes for solids of revolution.
    • The easiest way to visualise the arc length formula is to imagine that you are walking along the line at some speed. At each point, for a small increase in time dt you move a horizontal distance (dx/dt)dt and a vertical distance (dy/dt)dt. Use Pythagoras to find the distance: you get (((dx/dt)^2+(dy/dt)^2)^0.5)dt (this is the arc length formula for a parameterised curve). Now to derive the above troper's arc length formula, set x=t so dx=dt and dx/dt=1.
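    • If it helps to see that the scary formula and the "piece of string" picture agree, here's a rough Python sketch for y = x^2 between 0 and 1 (the step counts are arbitrary and both results are approximations):

      import math

      # 1) The string picture: chop the curve into tiny straight segments
      #    and add up their lengths with Pythagoras.
      def string_length(f, a, b, steps=100_000):
          total = 0.0
          for i in range(steps):
              x0 = a + (b - a) * i / steps
              x1 = a + (b - a) * (i + 1) / steps
              total += math.hypot(x1 - x0, f(x1) - f(x0))
          return total

      # 2) The textbook formula: integrate sqrt(1 + (dy/dx)^2) with a crude
      #    midpoint rule; for y = x^2 the derivative is dy/dx = 2x.
      def formula_length(a, b, steps=100_000):
          h = (b - a) / steps
          return sum(math.sqrt(1 + (2 * (a + (i + 0.5) * h)) ** 2) * h
                     for i in range(steps))

      f = lambda x: x * x
      print(string_length(f, 0, 1), formula_length(0, 1))  # both about 1.479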
  • Is any "base" number system better than the others? We typically use "10," but would any one in particular make math easier, assuming someone knew both their whole life? I ask because back in the Cold War, they kept trying to teach kids Base 5.
    • The point of teaching them base 5 wasn't anything special about base 5, just trying to make sure kids didn't get too used to base 10 and have trouble thinking in other bases when they get older. The only one that can really be said to be objectively better is base 2, because it's the easiest to treat logically, but it's unwieldy for human use.
    • There are those who swear by base 12 or base 60 because 12 is divisible by 1, 2, 3, 4, 6 and 12, and 60 is additionally divisible by 5. That means that (for example) a round number in base 12 divided by 2, 3, 4 or 6 is a whole number, while in base 10, 10/3 = 3.333, 10/4 = 2.5, 10/6 = 1.667, 10/7 = 1.429 and so on. Base 60 is impractical (we'd need 60 different symbols) and base 2 (which also has its merits, see above) makes numbers too long; even a relatively small number, like 5280 (the number of feet in a mile, if I'm not mistaken), takes 13 digits (1010010100000), and it only gets worse.
    • In computer science, base 16 is also commonly seen. Since 16 is a power of 2, it's trivial to switch numbers back and forth between binary and hexadecimal (that is, base 2 and base 16). Base 8 gets used too, but bytes mean we want the bits in groups of 8: base 8 groups them in 3s while base 16 groups them in 4s, and two groups of 4 make a group of 8.
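    • A small Python sketch of that regrouping trick (5280 is just the example number from above):

      n = 5280
      print(bin(n))   # 0b1010010100000
      print(hex(n))   # 0x14a0
      print(oct(n))   # 0o12240

      # Hex <-> binary is just regrouping the bits four at a time:
      bits = format(n, '016b')                          # '0001010010100000'
      groups = [bits[i:i + 4] for i in range(0, 16, 4)]
      print(groups)                                     # ['0001', '0100', '1010', '0000']
      print([format(int(g, 2), 'x') for g in groups])   # ['1', '4', 'a', '0']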
  • How come I can find Algebra and above to be too abstract for me, yet I can actually do Statistics?
      • Is it because in Statistics, they teach a formula, and when it comes time to use it on the test, they actually stick to it? The problem I would always have in Algebra is that they would teach a formula, and then start throwing around subversions and curveballs (that aren't fully explained), and all the ones on the test are the subversions and curveballs. Say, we're given the formula to calculate the variance, yet we're given every variable except one in the middle?
      • The point of using variations on the problems is to test whether you understand the maths involved, rather than just having learnt a bunch of formulae by rote. If you know why you have to use a particular method in a particular situation then you should be able to see what has changed and what you need to do differently.
  • Online Math Classes. Who thought this was a good idea? There's no real learning involved. At Colorado State University, people in the physics and chemistry departments wonder why everyone seems unable to do the math and start blaming the high schools, when the online math programs are just as guilty. It's easier to just memorize answers and learn to take a test than to actually do the math.
  • Math Textbooks just flat out bug me. I know they don't tell you the answers so people don't just write down the answers without actually doing the work, but don't they actually know that in just about every single math class out there, just writing down the answer without showing your work gets points taken off? How're we supposed to know we're right when they don't go over the homework?
    • Where I'm from, all math books at every level have correct answers available at the end of the book or somewhere like that, so you can check your answers. This country has topped the math section of PISA, so it's unlikely that having the answers available is a really bad idea.
    • In the USA, math textbooks tend to have answers in the back for the odd-numbered problems. That way, the teacher can decide whether or not to give problems with textbook answers.
  • Why is it very rare for math teachers to emphasize Applied Mathematics (both literally, as in practical applications for math, and the trope, which is as exploitable and fun as a meme)? No wonder people see math as boring and bleak.
    • Because many applications of mathematics are too specialized for high school. Most people will not use complex numbers, matrices, etc. in their daily life unless they decide to follow careers in science/engineering.
    • High school Math doesn't even have to have Real Life applications, it just needs to be taught in a fun way, like, you know, a video game or something. If you like Math enough, there's no need to worry about whether you will use it in real life or not, just because it's fun. But of course Math is an unavoidable subject even if you don't like it, so how about this: Since fifth grade Math determines whether you will be good in Math or not, how about giving the students the option to take or skip Math come Middle School? Why force students to take Math when they think there's really no mundane use of it except for shopping and balancing your books?
  • Is it just me, or does it seem like any sufficiently advanced discussion of mathematics starts to get into problems of perception, philosophy, language, etc.? And then you run into things like quantum mechanics, which is about both math and the nature of the universe, and it starts to seem like the whole of human knowledge is inextricably interlinked. I don't know if this question is easy to understand or if I'm saying it right, but why does it seem like everything is linked to everything else?
    • Modern philosophy essentially is math, the same way physics is, just without the big scary signs of continuous analysis. As for everything else, you'll find that this is an intermediate phase you'll get out of when you stop looking for applications - keep working on math, and you'll find applications.
  • Why is it that the harmonic series diverges to infinity instead of converging on a finite number? Each successive term is smaller than the previous, so shouldn't it converge to something?
    • Nope. Trivially, consider the series whose k-th term is 1/2 + 1/2^k: each term is smaller than the last, but obviously it couldn't possibly converge. More relevantly, consider the series with terms ln(k/(k-1)), which equal ln k - ln(k-1): clearly each term is smaller than the last, but the partial sums telescope to ln(n), which can't converge. The harmonic series diverges for almost exactly the same reason; although it's not as easy to show, its partial sums approach the natural logarithm plus a constant.
    • The fractions do get smaller, but the thing is they don't get smaller fast enough. Here is how I can show it to you. Think of some chunk of the harmonic series, say all the terms between 1/1,000 and 1/10,000. If you add them up, what do you get? Using a computer, the answer is about 2.30314. Now what about 1/10,000 to 1/100,000? The terms are smaller, but there are a lot more of them (90,000 vs. 9,000). It turns out the total is about the same: 2.30264! Between 1/100,000 and 1/1,000,000? 2.30259! Basically, the terms in each chunk are about one-tenth the size of those in the previous chunk, but there are ten times as many of them, so each chunk contributes roughly the same amount (about ln 10 = 2.3026). And there are infinitely many such chunks (you can keep multiplying the endpoints by ten forever), so the total is roughly 2.3 added up infinitely many times, which is infinity.
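    • That decade-block observation is easy to check; a quick Python sketch (block boundaries chosen to match the numbers above):

      import math

      # Each "decade" of the harmonic series contributes roughly the same
      # amount, about ln(10) = 2.302585..., so the total never stops growing.
      def block_sum(lo, hi):
          return sum(1 / k for k in range(lo, hi + 1))

      print(block_sum(1_000, 10_000))       # about 2.303
      print(block_sum(10_000, 100_000))     # about 2.303
      print(block_sum(100_000, 1_000_000))  # about 2.303
      print(math.log(10))                   # 2.302585...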
  • Who decided that only the odd-numbered questions should get answers in the back of the text book? Why not the even ones?
    • If all the answers were in the back, what do you think would happen?
    • Enemy spies.
  • This troper is an example of Writers Fail Math Forever. To beat a dead horse to death, let me see if my satiric writing skills can clear up the "I'll never use this" issue in a way that's funny and informative. (Warning: Misused maths. Misused maths everywhere…)
    • Because I use math ALL THE FREAKIN' TIME, y'all. How else would I get that diaper on my son just right without exponentials to help me along? I can't imagine reading to my daughter without trig to help me figure out how to turn the pages. Gosh Dang It to Heck, I wouldn't even be able to kiss my spouse without adding up a few decimals! Why, without matrices, I'd just wander the grocery store aimlessly, looking for all my foodstuffs. Flat tire on the road and no spare? Polly Nominal to the rescue! With advanced pre-calculus, I know exactly how much food to give my cat without overfeeding him! Plus, I'll never get my start as a singer without PEMDAS to make sure I hit all notes smoothly and on key. On top of that, how else could I write without fractions to teach me the difference between "accept" and "except"? Not to mention my hobby of linguistics is so interwoven with irrational numbers, you can't do anything without running into an improper fraction! It's also practically impossible for me to watch TV without the Pythagorean Theorem popping up somewhars. Boiling spaghetti? Fuck the instructions; just use geometry! It's much easier to boil noodles that way. A special thanks goes to hexadecimal; I wouldn't be trilingual (and working on… quadlingual [LOL]) without it. After all, whenever my mood disorder strikes, where would I be without my best friend Binary to pick me up? No one's fought harder than Binary by my side to get me the help I've been desperate to find for 12 years. Can't forget Pi for letting me call him in the middle of the night for a drive, either. He'd be mad. So you see, Tropefriends, we use maths everyday for everything! Therefore, we can't complain. It's easy to see how my life would fall apart if I didn't have all those mathses to count on. Straighten up, buckle down, and take your math like a man!
    • (Troper does not actually have a life. Troper fails math so badly that a class that was supposed to help actually decreased math skills. Troper actually has mood disorder. Please excuse if normally polite (albeit overly truthful and pessimistic) troper legitimately steps out of line- Gollum Made Me Do It. Side effects of this post may include but are not limited to: laughter, shin splints, watery eyes, kittens falling on pillows made of rainbows, brain decay, and in rare cases, butthurt. Stripping, ripping out all manners of body hair, and sobbing uncontrollably in the corner isn’t normal- but on math it is. Talk to your teen about Math Misuse today. MATH- NOT EVEN ONCE.) Point: Our argument isn't against ALL math, it's against the higher maths that a lot of people (like me) know for SURE they're never going to use. That is all (this time). (Tried to make post easier to read; pretty sure I failed.)