I like the CORDIC (COordinate Rotation DIgital Computer) algorithm for
computing sin, cos, and arctan. It just uses shifts and adds, with
each iteration improving accuracy by about a bit.

For representing angles in a computer, I like Binary Angular
Measurement (BAM). Wikipedia describes the 8-bit version of this in http://en.wikipedia.org/wiki/Angle, where you split the circle into
256 angles, calling each of them a "binary degree" or "binary radian".
Personally I like to call them "octal grads", since there are 400
(octal) of them in a circle.

BAM can get you any resolution. For example, with a 32-bit integer
you get a resolution of one four-billionth of a circle, or about 1.5
billionths of a radian. BAMs are fixed-point numbers, so you don't
need those awful floats and doubles. Calculating modulo 360 degrees
is really easy: you just mask off the high bits. If you use an entire
word, you don't need to worry about whether you're using signed or
unsigned numbers. With unsigned numbers you get 0 to just less than
360; with signed numbers you get -180 to just less than +180.
Totally equivalent, and uses every bit combination. When you add and
subtract full-word angles you do it as normal integer arithmetic. The
overflows and underflows do the modulo 360 for you.

Why they're so rare in computing is something I don't understand. I
guess it's because the numerical side of computing developed in math
departments, and they love their radians. One of the most common uses
of BAMs is in optical shaft encoders, which use a disk with black and
white arcs, read by a linear array of optical sensors, to report position.
In this case they use a Gray code version of BAM.
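The conversions between plain binary BAM and its Gray-code form are a couple of lines each; this is the standard binary-reflected Gray code, with function names of my own choosing. The point of Gray code on an encoder disk is that adjacent angles differ in exactly one bit, so a read taken mid-transition is off by at most one count:

```c
#include <stdint.h>

static uint32_t binary_to_gray(uint32_t b) {
    return b ^ (b >> 1);
}

static uint32_t gray_to_binary(uint32_t g) {
    /* fold the running XOR down: O(log n) steps for 32 bits */
    g ^= g >> 16;
    g ^= g >> 8;
    g ^= g >> 4;
    g ^= g >> 2;
    g ^= g >> 1;
    return g;
}
```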