Tuesday, September 13, 2011

A plane flies northeast starting at the equator. How far does it fly until it reaches the north pole?

If a plane starts at the equator and flies northeast (45 degrees north of east) continuously, how far does it fly until it reaches the north pole?


Assume the plane is a mathematical point and the earth is a perfect sphere of radius 4000 miles.


Can this problem be solved without using calculus to compute arc length of a curve in spherical coordinates?

* * * *

Hmm, calling the Earth's circumference 25,000 miles for simplicity, it seems like it should be 6,250 * √2, but I'm not sure. Interesting question - maybe someone here can give a rigorous answer.





* * * *





OK, upon further research, 6,250 * √2 is indeed correct. The Wikipedia article on rhumb lines (http://en.wikipedia.org/wiki/Rhumb_line) tells you everything you want to know. As stated in that article: "The distance between two points, measured along a loxodrome, is simply the absolute value of the secant of the bearing (azimuth) times the north-south distance". For a northeast direction, the bearing is 45°, and sec(45°) = √2.
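As a quick sanity check, here is a short Python sketch (my own addition, assuming the post's figures: a perfect sphere of radius 4000 miles, plus the rounded 25,000-mile circumference) that applies the secant rule from the quote:

```python
import math

R = 4000.0                  # perfect-sphere radius in miles (from the post)
bearing = math.radians(45)  # northeast: 45 degrees off north

north_south = (math.pi / 2) * R                    # meridian distance, equator to pole
rhumb = abs(1 / math.cos(bearing)) * north_south   # |sec(bearing)| * north-south distance

print(f"equator to pole:     {north_south:.1f} miles")  # ~6283.2
print(f"rhumb-line distance: {rhumb:.1f} miles")        # ~8885.8

# With the rounded 25,000-mile circumference, the quarter arc is 6,250 miles:
print(f"rounded answer:      {6250 * math.sqrt(2):.1f} miles")  # ~8838.8
```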





* * * *

It never will [reach the pole], unless it flies due north.

* * * *

You don't need calculus.


Think of the plane's movement as a combination of movement to the north and movement to the east (not north THEN east, but north AND east at the same time). If you head northeast, you are moving north exactly as fast as you are moving east.


So, for every mile the plane moves north, it also moves one mile east, and the actual ground it covers is √2 miles. To find the total distance traveled, take the north-south distance from the equator to the pole and multiply it by √2: with a radius of 4000 miles, that is (π/2) × 4000 × √2 ≈ 8,886 miles.
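To see this argument play out numerically, here is a small Python sketch (my own illustration, not from the original post) that marches the plane north in tiny steps, pairing each northward step with an equal eastward step:

```python
import math

R = 4000.0         # perfect-sphere radius in miles (from the post)
steps = 1_000_000  # number of small northward increments
dn = (math.pi / 2) * R / steps   # miles moved north in each step

total = 0.0   # ground covered so far
lat = 0.0     # latitude in radians, starting at the equator
lon = 0.0     # accumulated longitude in radians
for _ in range(steps):
    de = dn                          # NE heading: east speed equals north speed
    total += math.hypot(dn, de)      # this step covers sqrt(2) * dn miles
    lon += de / (R * math.cos(lat))  # the same eastward miles span more longitude near the pole
    lat += dn / R                    # one mile north raises latitude by 1/R radians

print(f"distance flown: {total:.1f} miles")   # ~8885.8
print(f"sqrt(2) * quarter meridian: {math.sqrt(2) * (math.pi / 2) * R:.1f}")
print(f"longitude swept: {math.degrees(lon):.0f} degrees")  # grows as steps increase
```

The flown distance converges to √2 times the quarter meridian, while the swept longitude keeps growing as the step size shrinks. That is the point of the earlier comment: the path winds around the pole more and more as it closes in, even though its total length stays finite.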
