Use the coordinate plane definitions of these trig functions:
With a circle of radius r centered at the origin, and (x, y) the coordinates of the point where the circle intersects the terminal ray of an angle Θ in standard position:
sinΘ = y/r
cosΘ = x/r
tanΘ = y/x
(The instructor here has named the angle x, which muddies the issue a bit. I have called it Θ in order to distinguish it from the x-coordinate of the point of intersection.)
Now draw a diagram of an angle Θ in Quadrant II (π/2 < Θ < π) whose cosine is -5/13. (The x-coordinate of the point of intersection is -5; the radius of the circle is 13.) Use the Pythagorean theorem to find the y-coordinate of the point of intersection, and note that it will be positive in QII. Then apply the definitions above.
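As a quick check of the arithmetic, here is a short sketch of that first method in Python (variable names like sin_theta are my own, not from the problem):

```python
import math

# Given: cos(theta) = x/r = -5/13, with theta in Quadrant II
x, r = -5, 13

# Pythagorean theorem: x^2 + y^2 = r^2, and y > 0 in QII
y = math.sqrt(r**2 - x**2)   # sqrt(169 - 25) = 12

# Apply the coordinate-plane definitions
sin_theta = y / r   # 12/13
cos_theta = x / r   # -5/13
tan_theta = y / x   # -12/5

print(sin_theta, cos_theta, tan_theta)
```

Running this confirms sinΘ = 12/13 and tanΘ = -12/5, with the signs expected in QII.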
Alternatively ...
Note that in QII, sinΘ > 0 and tanΘ < 0.
Calculate sinΘ from the Pythagorean identity sin²Θ + cos²Θ = 1, taking the positive square root.
Calculate tanΘ from the quotient identity tanΘ = sinΘ / cosΘ.
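The identity-based route can be sketched the same way; this is just a numerical check, assuming the same -5/13 cosine value:

```python
import math

cos_theta = -5 / 13

# Pythagorean identity: sin^2 + cos^2 = 1; sin > 0 in QII,
# so take the positive square root
sin_theta = math.sqrt(1 - cos_theta**2)   # approximately 12/13

# Quotient identity: tan = sin / cos (negative in QII, as expected)
tan_theta = sin_theta / cos_theta         # approximately -12/5

print(sin_theta, tan_theta)
```

Both methods agree: sinΘ = 12/13 and tanΘ = -12/5.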