Dal J. answered 03/14/15
Presumably this is a trig problem. What you have to do is break each velocity into components and look at the individual vectors that add up to the plane's motion over the ground.
45 degrees from North toward East corresponds to an angle of 45 degrees measured from East on the standard circle. That means the eastward component of the airspeed is 500 KPH times the cos of 45 degrees (which is the square root of 2 over 2), and the northward component is 500 KPH times the sin of 45 degrees (which is also the square root of 2 over 2).
30 degrees from North toward West corresponds to an angle of 120 degrees measured from East on that same circle. That means the wind adds an eastward component of 60 KPH times the cos of 120 degrees (which is negative 1/2, i.e., 30 KPH toward the West) and a northward component of 60 KPH times the sin of 120 degrees (which is the square root of 3 over 2, about 52 KPH North).
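The two decompositions above can be sketched in a few lines of Python (the `components` helper is just for illustration; angles are measured from East, matching the circle convention used here):

```python
import math

def components(speed, angle_from_east_deg):
    """Split a speed into (east, north) components.

    The angle is measured counterclockwise from due East,
    matching the circle convention in the explanation above.
    """
    rad = math.radians(angle_from_east_deg)
    return speed * math.cos(rad), speed * math.sin(rad)

# Plane: 45 deg from North toward East -> 45 deg from East
plane_east, plane_north = components(500, 45)
# Wind: 30 deg from North toward West -> 120 deg from East
wind_east, wind_north = components(60, 120)

# plane components are each about 353.55 KPH;
# wind is about -30 KPH east (i.e., 30 KPH west) and about 51.96 KPH north
```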
The final vector over the ground in each direction is the sum of the corresponding components. There's no tidy exact form here, so you'll have to multiply it out with a calculator and add the results.
You should get a number a little over 400 KPH north and around 324 KPH east. Use the Pythagorean theorem on those two components to calculate the final ground speed, which comes out just under 520 KPH.
If you have to determine the actual bearing, divide the north component (the one a little over 400) by the total ground speed (just under 520) and that ratio is the sin of the angle, measured on our original circle with zero at East. Take the arcsin of that ratio, and that tells you how far the heading is off from East toward North. 90 minus that is how far it is off from North, toward East.
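That last step can be sketched as well (using the rounded component values from above):

```python
import math

# Ground-velocity components from the sums above (KPH, rounded)
ground_east, ground_north = 323.55, 405.51
ground_speed = math.hypot(ground_east, ground_north)

# Angle from East toward North, via arcsin of north / total
angle_from_east = math.degrees(math.asin(ground_north / ground_speed))  # about 51.4 degrees
# Convert to a bearing measured from North toward East
bearing_from_north = 90 - angle_from_east                               # about 38.6 degrees
```

So the plane ends up heading roughly 38.6 degrees East of North over the ground.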