Back in college, I was a math major for one semester. (The next semester I had to take Advanced Calculus at 8 am five days a week. I didn't mind the actual course. At 19, getting up for an 8 am class five days a week was not an option. I quickly changed majors.) I am still fascinated by math. Here is a problem that we discussed in our math forums.
I totally stole this post from here.
Sometimes we make a habit of assuming things. Or sometimes we think we know things but we don't.
Let me ask you this. Does 1 + 1 = 2 all the time?
The correct answer is: it depends on the context. Add 1 cup of liquid (say, vinegar) to 1 cup of baking soda solution, and you'll end up with less than 2 cups of something.
You're not always on the same page as the person next to you. Sometimes your differences compound until it's not a 1 + 1 matter anymore.
The people you deal with aren't always transparent with you, either. They may seem to be on the same page as you, but at the back of their minds, they are several pages ahead of you.
How do you deal with these kinds of people?
1. Know the context of their actions. Learn to factor in other significant details to understand why their conclusion is different from yours.
2. Stick to the generally accepted truth. Sure, 1 + 1 = 2 may not always be correct, but most people I know will accept it as such.
3. Build your own contextual support. We may be looking at the same thing but think about it differently. If I can fully support my thoughts, I can give more weight to my side of the truth than you can to yours.
Also, in base 1, 1 + 1 = 11. In base 2, 1 + 1 = 10. In base 10, 1 + 1 = 2. :)
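The base trick above is easy to check with a few lines of Python. Here's a small sketch (the `to_base` helper is my own, not from the original post) that converts 1 + 1 into different bases, treating "base 1" as tally marks:

```python
def to_base(n, base):
    """Convert a non-negative integer to a digit string in the given base (2-10)."""
    if n == 0:
        return "0"
    digits = []
    while n:
        digits.append(str(n % base))  # peel off the lowest digit
        n //= base
    return "".join(reversed(digits))  # digits come out low-to-high, so reverse

print(to_base(1 + 1, 2))   # binary: 10
print(to_base(1 + 1, 10))  # decimal: 2
print("1" * (1 + 1))       # "base 1" as tally marks: 11
```

Same quantity every time; only the notation changes, which is the whole point.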
NOW, if you really, really want to stick with the you-idiot-of-course-I'm-referring-to-the-decimal-numbering-system kind of argument, here's a proof that 1 + 1 = 1:
Let a = 1 and b = 1.
Therefore a = b, by substitution.
If two numbers are equal, then their squares are equal, too:
a^2 = b^2.
Now subtract b^2 from both sides (if an equation is true, then if you subtract the same thing from both sides, the result is also a true equation) so
a^2 - b^2 = 0.
Now the lefthand side of the equation is a form known as "the difference of two squares" and can be factored into (a-b)*(a+b). If you don't believe me, then try multiplying it out carefully,
and you will see that it's correct. So:
(a-b)*(a+b) = 0.
Now if you have an equation, you can divide both sides by the same thing, right? Let's divide by (a-b), so we get:
(a-b)*(a+b) / (a-b) = 0/(a-b).
On the lefthand side, the (a-b)/(a-b) simplifies to 1, right? and the righthand side simplifies to 0, right? So we get:
1*(a+b) = 0,
and since 1* anything = that same anything, then we have:
(a+b) = 0.
But a = 1 and b = 1, so:
1 + 1 = 0, or 2 = 0.
Now let's divide both sides by 2, and we get:
1 = 0.
Then we add 1 to both sides, and we get:
1 + 1 = 1.
But of course, the above proof is flawed. I'll let you figure out where.
If you figure it out, leave a comment. Or if you glazed over at the sight of a number, let me know that too.