(obviously that equation isn't correct, the part where you split the root in two is illegal)
Actually no, splitting a root is not illegal. Think of the rules of powers:
(a*b)^x = a^x * b^x. This applies no matter what number you put in for x, including 1/2, so (a*b)^(1/2) = a^(1/2) * b^(1/2). This is the same as writing sqrt(a*b) = sqrt(a)*sqrt(b).
This can be done at any time. Let me ask you this, why is sqrt(27)=3*sqrt(3)? Why can we reduce a square root in this way? Simple, because sqrt(27) = sqrt(9*3) = sqrt(9)*sqrt(3) = 3 * sqrt(3).
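A quick numeric check of that sqrt(27) reduction (a minimal Python sketch; worth noting that the split is only guaranteed for nonnegative reals, which is presumably where the "illegal" objection comes from):

```python
import math

# Check (a*b)^(1/2) == a^(1/2) * b^(1/2) numerically for a = 9, b = 3.
# The identity is safe here because both factors are nonnegative reals.
a, b = 9.0, 3.0
lhs = math.sqrt(a * b)             # sqrt(27)
rhs = math.sqrt(a) * math.sqrt(b)  # sqrt(9) * sqrt(3) = 3 * sqrt(3)
print(lhs, rhs)  # both about 5.196152422706632
```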
In reaction:
Can you show me the transformation from R to C then, because if they are the same thing with different syntax there should be a transformation (isomorphism) between them.
Certainly:
A real number is any number contained in the reals. An imaginary number is any number b*i, where b is a real constant and i = sqrt(-1). The complex numbers are the set of all numbers a + b*i, where a and b are real constants.
Therefore, for every a in the reals, there exists a complex number z = a + 0*i which equals our original a. Similarly, every imaginary number b*i equals the complex number 0 + b*i.
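In Python terms (a small sketch using the built-in `complex` type), that embedding is completely literal:

```python
# Every real a maps to the complex number a + 0i, and Python's complex type
# even compares it equal to the original real.
a = 2.5
z = complex(a, 0)     # the image of a under the map a -> a + 0*i
print(z == a)         # True: the reals sit inside the complex numbers
print(complex(0, 3))  # the purely imaginary number 0 + 3i, written 3j
```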
My calculator spits out Sqrt(2). It kind of depends on the calculator.
(would be a nice analogy between the expensiveness of your calculator and the number fields)
Smart ***
You know full well what I'm talking about though.
And yes, equations never truly change anything; that's why they're equations.
anecdote: My physics teacher once said that it is perfectly fine to introduce a pink elephant in your equations as the end result would (read: should) not change.
I don't really see the point though, I mean Sqrt(2) is also just the answer to the equation x^2 = 2.
And 1.414... doesn't really count, as it is not exact. Approximation is fine for real-life purposes, but it has no place in pure mathematics.
Mathematics only exists to apply to the real world (imo). What's wrong with discrete sets, or approximating? What's wrong with linearizations? What's wrong with Taylor Series Expansions? They make life easier. If you can't apply something to the real world in some way it's just speculation for the sake of speculation. Its purpose becomes lost.
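As one concrete case of "making life easier" (a minimal sketch; the choice of function and truncation order are mine, just for illustration): a truncated Taylor series gives an inexact but arbitrarily accurate answer.

```python
import math

# Taylor series e^x = sum of x^k / k!, truncated after n terms. Exactly the
# kind of approximation being defended: not exact, but as close as you need.
def exp_taylor(x, n=20):
    total, term = 0.0, 1.0   # term starts at x^0 / 0! = 1
    for k in range(n):
        total += term
        term *= x / (k + 1)  # advance to x^(k+1) / (k+1)!
    return total

print(exp_taylor(1.0))  # close to math.e = 2.718281828...
```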
Could you elaborate on that? It sounds like an interesting PoV but i don't entirely grasp it.
It's hard to explain, but I'll give it my best shot. In general, I just think the way most people choose to teach math is wrong. First, there is this arrogance in math, about how it's for some people and NOT for others. I've tutored a lot of children in math. Granted, you might call my viewpoint biased, but many of the children I taught were gifted in other aspects of their lives (often writing), yet their math teachers always called them failures. Granted, that's the child's account of what the teacher said, but yet again, I'm digressing...
The point I'm trying to get to is that I was able to teach these kids math on a level I don't believe their teachers thought possible. And I don't think it's very hard to do. It's not about who you are, or how good a teacher you are... it's just about HOW you teach.
First, the concept of ONE RIGHT ANSWER AND METHOD needs to be removed. We need to explain clearly that in math there's almost always more than one way to skin a cat. This brings me to the next concept, and it comes from a book called "Outliers" by Malcolm Gladwell. A good read, and I definitely recommend it. He refers to a national test (the one everyone takes in 8th grade in the US, I believe, but it's been a while since I read the book) where people answer questions on all of the subjects offered in school. There's one extra, optional portion of the test, which consists of something like 25 or 50 questions (I don't remember exactly) about you, your school, your personal information, how your teachers tended to teach a subject, etc. Most people skip it. But guess what, lo and behold: those who were good at math were also the ones who filled in the most of those optional questions.
Being good at math is just about trying more things (according to the author of the book). Now, I don't think this is 100% true, but I do believe it's true to a very large extent. The biggest concept that succeeds for me in tutoring children is telling them this story, and then asking them to try again, because that's all it takes. Most people just stop trying in math, they say "I'm just not good at it". This usually causes them to doubt themselves further.
Then there's the way we teach math with a "things are true JUST BECAUSE I SAID SO" attitude. How many people do you know who can prove the Pythagorean theorem? How many people do you know who can prove ANYTHING in math, or even understand what that is asking? Why don't we teach proofs? Whenever I prove concepts to the kids I tutor, they understand them 10 times better. Things just start to click. It's how it's always been with me.
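Since the Pythagorean theorem came up: the classic rearrangement proof fits in a few lines (a sketch of one proof among many, not the only way to do it):

```latex
% Place four copies of a right triangle with legs a, b and hypotenuse c
% inside a square of side (a + b). The uncovered region in the middle is
% a tilted square of side c, so equating areas:
(a + b)^2 = 4 \cdot \tfrac{1}{2}ab + c^2
a^2 + 2ab + b^2 = 2ab + c^2
a^2 + b^2 = c^2
```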
We're so busy teaching the "what" part of math... but we don't bother about the "why"...
Does this make more sense? Now to suggest a curriculum with these concepts in mind is another story...
I also came across this:
http://www.math.toronto.edu/mathnet/answers/imaginary.html
an interesting read in general; it gives an argument for why fractional numbers exist and then applies the same argument to complex numbers.
Ah, but this author is taking on a different viewpoint as well, let me quote him below:
Since numbers are just abstract concepts anyway
I guess if one were to say "all numbers are abstract concepts," then to say imaginary numbers are just another abstract concept wouldn't be wrong. But honestly, man, how can you read his explanation and not just say "yes, the syntax works"? It's just like a programming language: you're defining that a number system just needs to satisfy certain rules. It has nothing to do with "existence" in the way you and I think of the word.
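To push the programming-language analogy (a hypothetical sketch; the class name and layout are mine): you can "define" the complex numbers as nothing more than pairs of reals with particular rules for + and *, and i^2 = -1 simply falls out of the definitions.

```python
from dataclasses import dataclass

# A number system as a data type plus rules: the pair (a, b) stands for
# a + b*i. Nothing here "exists" beyond the definitions, which is the point.
@dataclass(frozen=True)
class Cx:
    a: float  # real part
    b: float  # imaginary part

    def __add__(self, o):
        return Cx(self.a + o.a, self.b + o.b)

    def __mul__(self, o):
        # (a + bi)(c + di) = (ac - bd) + (ad + bc)i
        return Cx(self.a * o.a - self.b * o.b,
                  self.a * o.b + self.b * o.a)

i = Cx(0, 1)
print(i * i)  # Cx(a=-1, b=0): i squared is -1, purely by the rules above
```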
-blazed
Edit: Just so you know, the book Outliers explains that concept A LOT better than I did...