Now to algebraic numbers.
Let us consider some algebraic number, defined as a root of a rational-coefficient polynomial. If it is real, it can be found with an approximation algorithm like
Newton's method. Since each iteration uses only arithmetic on the polynomial's coefficients and the previous estimate, a rational initial guess keeps every iterate rational, and if the iteration converges, the iterates form a rational-valued Cauchy sequence.
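Here is a minimal sketch of that in Python (my own example): Newton's method for x^2 - 2 = 0, carried out entirely in exact rational arithmetic, so the iterates form a rational-valued Cauchy sequence converging to sqrt(2).

```python
from fractions import Fraction

def newton_sqrt2(steps):
    """Newton's method for f(x) = x^2 - 2, kept in exact rational arithmetic."""
    x = Fraction(1)                       # rational initial guess
    iterates = [x]
    for _ in range(steps):
        x = x - (x * x - 2) / (2 * x)     # x := x - f(x)/f'(x), still rational
        iterates.append(x)
    return iterates

for x in newton_sqrt2(5):
    print(x, float(x))                    # 1, 3/2, 17/12, 577/408, ... -> sqrt(2)
```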
Now consider a polynomial with algebraic-number coefficients, which I'll call P. Suppose those coefficients contain powers of some r that is not rational, an r defined as a root of some nth-degree rational-coefficient polynomial. Take the powers P, P^2, ..., P^n and reduce each of them modulo r's defining polynomial, keeping the remainder. These n quantities are then polynomials in r of degree at most n-1. Treat the powers of r as separate variables and eliminate them, leaving a polynomial with no powers of r in it.
Repeat this operation for every such r until one is left with all rational coefficients; the roots of P are then roots of a rational-coefficient polynomial, hence algebraic. That proves that the algebraic numbers are algebraically closed, that every algebraic-coefficient polynomial has an algebraic-number solution.
That is also true of the complex numbers: the fundamental theorem of algebra, which states that every complex-coefficient polynomial has a complex root.
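To make the elimination of r concrete, here is a hedged sketch using SymPy's resultant function, which packages that elimination in one step; it is a standard alternative to the power-by-power reduction described above, not a transcription of it. The example takes P(y) = y^2 - r with r a root of r^2 - 2, i.e. y is a fourth root of 2.

```python
from sympy import symbols, resultant, expand

y, r = symbols('y r')

P = y**2 - r        # a polynomial in y whose coefficients involve r
m = r**2 - 2        # r's defining rational-coefficient polynomial

# Eliminating r gives a rational-coefficient polynomial in y alone; its roots
# include the roots of P for every conjugate value of r.
print(expand(resultant(P, m, r)))   # y**4 - 2
```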
-
Now for defining arithmetic on algebraic numbers. One can do that without calculating the roots explicitly. One uses
Newton's identities, an interrelationship between "elementary symmetric polynomials" (sum of all products of a certain number of distinct elements of some set) and "power sums" (sum of a certain power of all elements of some set).
For r1, r2, ..., rn, the ESP's, indexed by k = 0, 1, 2, ...:
1
r1 + r2 + ... + rn
r1*r2 + r1*r3 + ... + r(n-1)*rn
...
and the PS's, likewise indexed by k = 0, 1, 2, ...:
n
r1 + r2 + ... + rn
r1^2 + r2^2 + ... + rn^2
...
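As a quick concrete check of both lists, here is a small Python snippet (my own toy example) computing them for the roots {1, 2, 3}:

```python
from itertools import combinations
from math import prod

roots = [1, 2, 3]
n = len(roots)

# ESP's: e_k = sum of products of k distinct roots
esp = [sum(prod(c) for c in combinations(roots, k)) for k in range(n + 1)]
# PS's: p_k = sum of the k-th powers of all the roots
ps = [sum(r**k for r in roots) for k in range(n + 1)]

print(esp)   # [1, 6, 11, 6]
print(ps)    # [3, 6, 14, 36]
```

Those ESP's are, up to sign, the coefficients of (x-1)(x-2)(x-3) = x^3 - 6x^2 + 11x - 6, which is the connection used next.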
To get the ESP's of the roots from the coefficients of a polynomial a(n)*x^n + a(n-1)*x^(n-1) + ... + a(0), do
a(ESP,k) = (-1)^k * a(n-k)/a(n)
with
a(ESP,k) = 0 for k > n
One then uses Newton's identities to calculate a(PS,k) to as high a value as one needs.
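Here is a minimal sketch of that computation, assuming the coefficients are supplied lowest-degree first as a(0), ..., a(n); the function name power_sums is my own.

```python
from fractions import Fraction

def power_sums(coeffs, kmax):
    """Power sums p_0..p_kmax of the roots of a(n)*x^n + ... + a(1)*x + a(0),
    given coeffs = [a(0), ..., a(n)], computed via Newton's identities."""
    n = len(coeffs) - 1
    lead = Fraction(coeffs[n])
    # a(ESP,k) = (-1)^k * a(n-k)/a(n), and a(ESP,k) = 0 for k > n
    e = [(-1) ** k * Fraction(coeffs[n - k]) / lead if k <= n else Fraction(0)
         for k in range(kmax + 1)]
    p = [Fraction(n)]   # p_0 = number of roots, counted with multiplicity
    for k in range(1, kmax + 1):
        # Newton: p_k = sum_{i<k} (-1)^(i-1) e_i p_(k-i) + (-1)^(k-1) k e_k
        p.append(sum((-1) ** (i - 1) * e[i] * p[k - i] for i in range(1, k))
                 + (-1) ** (k - 1) * k * e[k])
    return p

print(power_sums([-2, 0, 1], 4))   # power sums of x^2 - 2's roots: 2, 0, 4, 0, 8
```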
To get a polynomial back, reverse these operations.
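And a sketch of the reverse direction, rearranging the same identities to get the ESP's from the power sums and then reading off the coefficients of a monic polynomial (again, the helper name is my own):

```python
from fractions import Fraction

def poly_from_power_sums(p):
    """Given power sums p = [p_0, p_1, ..., p_n] of n roots (n = p_0),
    recover the monic polynomial's coefficients [a(0), ..., a(n) = 1]."""
    n = int(p[0])
    e = [Fraction(1)]
    for k in range(1, n + 1):
        # Inverse Newton: e_k = (1/k) * sum_{i=1..k} (-1)^(i-1) e_(k-i) p_i
        e.append(sum((-1) ** (i - 1) * e[k - i] * Fraction(p[i])
                     for i in range(1, k + 1)) / k)
    # For a monic polynomial, a(n-k) = (-1)^k * e_k
    return [(-1) ** k * e[k] for k in range(n, -1, -1)]

print(poly_from_power_sums([2, 0, 4]))   # coefficients -2, 0, 1, i.e. x^2 - 2
```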
I'll use A(k) for a(PS,k) in what follows.
For addition, x = a + b, the roots are rx = ra + rb, and their powers are rx^k = sum over l from 0 to k of k!/l!/(k-l)! * ra^(k-l) * rb^l (the binomial theorem).
For multiplication, x = a*b, the roots are rx = ra*rb, and their powers are rx^k = ra^k * rb^k.
Since the polynomials may have several roots, one sums over all selections of a root from each polynomial. This gives us
a + b: X(k) = sum over l from 0 to k of k!/l!/(k-l)! * A(k-l) * B(l)
a*b: X(k) = A(k)*B(k)
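Putting the pieces together, here is a sketch of both formulas, reusing the power_sums and poly_from_power_sums helpers from the sketches above; the worked examples are sqrt(2) + sqrt(3) and sqrt(2) * sqrt(3).

```python
from math import comb

def add_power_sums(A, B):
    """Power sums of all pairwise sums ra + rb of the two root sets."""
    return [sum(comb(k, l) * A[k - l] * B[l] for l in range(k + 1))
            for k in range(min(len(A), len(B)))]

def mul_power_sums(A, B):
    """Power sums of all pairwise products ra * rb."""
    return [A[k] * B[k] for k in range(min(len(A), len(B)))]

# sqrt(2) is a root of x^2 - 2, sqrt(3) of x^2 - 3; their sum lives among the
# 2*2 = 4 roots of a degree-4 polynomial, so power sums up to 4 suffice.
A = power_sums([-2, 0, 1], 4)
B = power_sums([-3, 0, 1], 4)
print(poly_from_power_sums(add_power_sums(A, B)))
# coefficients 1, 0, -10, 0, 1: x^4 - 10x^2 + 1, satisfied by sqrt(2) + sqrt(3)
print(poly_from_power_sums(mul_power_sums(A, B)))
# coefficients 36, 0, -12, 0, 1: x^4 - 12x^2 + 36 = (x^2 - 6)^2, satisfied by sqrt(6)
```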
These operations are obviously commutative, and for associativity, adding or multiplying a, b, and c in either grouping gives the same three-way formulas:
a + b + c: X(k) = sum over l, m >= 0 with l + m <= k of k!/l!/m!/(k-l-m)! * A(k-l-m) * B(l) * C(m)
a*b*c: X(k) = A(k)*B(k)*C(k)
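As a consistency check of the three-way addition formula (reusing the helpers above; the name add3_power_sums is my own), grouping the additions either way gives the same power sums:

```python
from math import factorial

def add3_power_sums(A, B, C):
    """Power sums of all three-way sums ra + rb + rc, via the multinomial formula."""
    kmax = min(len(A), len(B), len(C)) - 1
    out = []
    for k in range(kmax + 1):
        s = 0
        for l in range(k + 1):
            for m in range(k + 1 - l):
                s += (factorial(k) // (factorial(l) * factorial(m) * factorial(k - l - m))) \
                     * A[k - l - m] * B[l] * C[m]
        out.append(s)
    return out

A = power_sums([-2, 0, 1], 8)   # sqrt(2)
B = power_sums([-3, 0, 1], 8)   # sqrt(3)
C = power_sums([-5, 0, 1], 8)   # sqrt(5)
assert add3_power_sums(A, B, C) \
    == add_power_sums(add_power_sums(A, B), C) \
    == add_power_sums(A, add_power_sums(B, C))
```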
For multiplication distributing over addition:
a*(b+c): X(k) = sum over l from 0 to k of k!/l!/(k-l)! * A(k) * B(k-l) * C(l)
a*b + a*c: X(k) = sum over l from 0 to k of k!/l!/(k-l)! * A(k-l)*B(k-l) * A(l)*C(l)
These are not identical as polynomial operations: a*b + a*c selects a root of a twice, independently, so it has extra roots. The roots of a*(b+c) are exactly those selections where the same root of a is used in both terms, so distributivity does hold for the individual numbers.
Zero: A(0) = 1, A(k) = 0 for k > 0 -- additive identity, multiplicative zero
One: A(k) = 1 -- multiplicative identity
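A small check of these identity elements with the same helpers; the length 9 of the power-sum lists is an arbitrary choice for the test.

```python
zero = [1] + [0] * 8    # A(0) = 1, A(k) = 0 for k > 0: power sums of the root of x = 0
one  = [1] * 9          # A(k) = 1: power sums of the root of x - 1 = 0

A = power_sums([-2, 0, 1], 8)                        # sqrt(2), as the test value
assert add_power_sums(A, zero) == A                  # additive identity
assert mul_power_sums(A, one)  == A                  # multiplicative identity
assert mul_power_sums(A, zero) == [A[0]] + [0] * 8   # multiplicative zero
```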
When one adds or multiplies the roots of polynomials a and b, the resulting number of roots is (number of roots of a) * (number of roots of b). That means that if one is looking for a specific root, one has to select it out. One can do that with a Cauchy sequence for approximating it.
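Here is a sketch of that selection step for sqrt(2) + sqrt(3), using floating-point approximations to pick the intended root of the degree-4 polynomial found above (numpy.roots is used only as a convenient numeric root finder):

```python
import numpy as np

# x^4 - 10x^2 + 1 is the combined polynomial found above for sqrt(2) + sqrt(3);
# pick out the one root that matches a numeric approximation of that sum.
candidates = np.roots([1, 0, -10, 0, 1])       # all four roots, numerically
target = np.sqrt(2) + np.sqrt(3)               # approximation of the root we want
chosen = min(candidates, key=lambda z: abs(z - target))
print(chosen)                                   # about 3.1462643699
```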
In any case, this construction shows that the algebraic numbers are closed under arithmetic operations -- one can construct a rational-coefficient polynomial that has as a root the sum or the product of any two rational-coefficient-polynomial roots.