algebra, linear

Introduction

Mathematical discipline that deals with vectors and matrices and, more generally, with vector spaces and linear transformations. Unlike other parts of mathematics that are frequently invigorated by new ideas and unsolved problems, linear algebra is very well understood. Its value lies in its many applications, from mathematical physics to modern algebra and coding theory.

Vectors and vector spaces
Linear algebra usually starts with the study of vectors, which are understood as quantities having both magnitude and direction. Vectors lend themselves readily to physical applications. For example, consider a solid object that is free to move in any direction. When two forces act at the same time on this object, they produce a combined effect that is the same as a single force. To picture this, represent the two forces v and w as arrows; the direction of each arrow gives the direction of the force, and its length gives the magnitude of the force. The single force that results from combining v and w is called their sum, written v + w. Geometrically, v + w corresponds to the diagonal of the parallelogram formed from adjacent sides represented by v and w.

Vectors are often expressed using coordinates. For example, in two dimensions a vector can be defined by a pair of coordinates (a₁, a₂) describing an arrow going from the origin (0, 0) to the point (a₁, a₂). If one vector is (a₁, a₂) and another is (b₁, b₂), then their sum is (a₁ + b₁, a₂ + b₂); this gives the same result as the parallelogram rule described above. In three dimensions a vector is expressed using three coordinates (a₁, a₂, a₃), and this idea extends to any number of dimensions.
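The coordinate rule for addition is easy to carry out by machine. The following Python fragment is an illustrative sketch, not part of the original article; the function name add_vectors is invented here for convenience:

```python
# Componentwise addition of coordinate vectors of any dimension.
def add_vectors(a, b):
    if len(a) != len(b):
        raise ValueError("vectors must have the same number of coordinates")
    return tuple(x + y for x, y in zip(a, b))

# (1, 2) + (3, 1) gives (4, 3), the diagonal of the parallelogram
# with adjacent sides (1, 2) and (3, 1).
print(add_vectors((1, 2), (3, 1)))  # (4, 3)
```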

Representing vectors as arrows in two or three dimensions is a starting point, but linear algebra has been applied in contexts where this is no longer appropriate. For example, in some types of differential equations the sum of two solutions gives a third solution, and any constant multiple of a solution is also a solution. In such cases the solutions can be treated as vectors, and the set of solutions is a vector space in the following sense. In a vector space any two vectors can be added together to give another vector, and vectors can be multiplied by numbers to give “shorter” or “longer” vectors. The numbers are called scalars because in early examples they were ordinary numbers that altered the scale, or length, of a vector. For example, if v is a vector and 2 is a scalar, then 2v is a vector in the same direction as v but twice as long. In many modern applications of linear algebra, scalars are no longer ordinary real numbers, but the important thing is that they can be combined among themselves by addition, subtraction, multiplication, and division. For example, the scalars may be complex numbers, or they may be elements of a finite field such as the field having only the two elements 0 and 1, where 1 + 1 = 0. The coordinates of a vector are scalars, and when these scalars are from the field of two elements, each coordinate is 0 or 1, so each vector can be viewed as a particular sequence of 0s and 1s. This is very useful in digital processing, where such sequences are used to encode and transmit data.
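Arithmetic over the two-element field is simple enough to demonstrate directly. In this illustrative Python sketch (the helper name add_gf2 is invented here), coordinatewise addition of 0/1 vectors, with 1 + 1 = 0, reduces to the bitwise exclusive-or used throughout digital coding:

```python
# Over the field with two elements, 1 + 1 = 0, so coordinatewise
# addition of 0/1 vectors is the bitwise exclusive-or (XOR).
def add_gf2(a, b):
    return tuple((x + y) % 2 for x, y in zip(a, b))

print(add_gf2((1, 0, 1, 1), (0, 1, 1, 0)))  # (1, 1, 0, 1)
```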

Linear transformations and matrices
Vector spaces are one of the two main ingredients of linear algebra, the other being linear transformations (or “operators” in the parlance of physicists). Linear transformations are functions that send, or “map,” one vector to another vector. The simplest example of a linear transformation sends each vector to c times itself, where c is some constant. Thus, every vector remains in the same direction, but all lengths are multiplied by c. Another example is a rotation, which leaves all lengths the same but alters the directions of the vectors. Linear refers to the fact that the transformation preserves vector addition and scalar multiplication. This means that if T is a linear transformation sending a vector v to T(v), then for any vectors v and w, and any scalar c, the transformation must satisfy the properties T(v + w) = T(v) + T(w) and T(cv) = cT(v).
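These two properties can be checked numerically. The sketch below (illustrative Python, not part of the original article, using the standard rotation formula) verifies both for a rotation through 30 degrees:

```python
import math

# A rotation through angle t sends (x, y) to
# (x cos t - y sin t, x sin t + y cos t).
def rotate(v, t):
    x, y = v
    return (x * math.cos(t) - y * math.sin(t),
            x * math.sin(t) + y * math.cos(t))

v, w, t = (1.0, 2.0), (3.0, -1.0), math.pi / 6

# T(v + w) equals T(v) + T(w), up to floating-point rounding.
left = rotate((v[0] + w[0], v[1] + w[1]), t)
right = tuple(p + q for p, q in zip(rotate(v, t), rotate(w, t)))
print(all(abs(p - q) < 1e-12 for p, q in zip(left, right)))  # True

# T(cv) equals cT(v) as well.
c = 2.5
left = rotate((c * v[0], c * v[1]), t)
right = tuple(c * p for p in rotate(v, t))
print(all(abs(p - q) < 1e-12 for p, q in zip(left, right)))  # True
```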

When doing computations, linear transformations are treated as matrices. A matrix is a rectangular arrangement of scalars; two matrices of the same size are added entry by entry, and the product of two matrices is formed by combining the rows of the first with the columns of the second. The product of two matrices shows the result of doing one transformation followed by another (from right to left), and if the transformations are done in reverse order the result is usually different. Thus, the product of two matrices depends on the order of multiplication; if S and T are square matrices (matrices with the same number of rows as columns) of the same size, then ST and TS are rarely equal. The matrix for a given transformation is found using coordinates. For example, in two dimensions a linear transformation T is completely determined by its effect on any two vectors v and w that have different directions. Their transformations T(v) and T(w) are each given by two coordinates; therefore, only four coordinates, two for T(v) and two for T(w), are needed to specify T. These four coordinates are arranged in a 2-by-2 matrix. In three dimensions three vectors u, v, and w are needed, and to specify T(u), T(v), and T(w) one needs three coordinates for each. This results in a 3-by-3 matrix.
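A small numerical example makes the failure of commutativity concrete. This illustrative sketch (using the NumPy library; the two matrices are invented for the demonstration) multiplies a scaling and a quarter-turn rotation in both orders:

```python
import numpy as np

# S doubles the first coordinate; T rotates a quarter turn.
S = np.array([[2, 0],
              [0, 1]])
T = np.array([[0, -1],
              [1,  0]])

# S @ T means "apply T first, then S" (right to left).
print(S @ T)   # [[ 0 -2] [ 1  0]]
print(T @ S)   # [[ 0 -1] [ 2  0]]
print(np.array_equal(S @ T, T @ S))  # False: order matters
```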

Eigenvectors
When studying linear transformations, it is extremely useful to find nonzero vectors whose direction is left unchanged by the transformation. These are called eigenvectors (also known as characteristic vectors). If v is an eigenvector for the linear transformation T, then T(v) = λv for some scalar λ. This scalar is called an eigenvalue. The eigenvalue of greatest absolute value, along with its associated eigenvector, has special significance for many physical applications. This is because whatever process is represented by the linear transformation often acts repeatedly, with the output of one application fed back in as the input of the next. Under such iteration, almost every (nonzero) starting vector converges to the direction of the eigenvector associated with the largest eigenvalue, rescaled by a power of that eigenvalue. In other words, the long-term behaviour of the system is determined by its eigenvectors.
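This repeated feeding-back is known as power iteration, and it is easy to demonstrate. The sketch below (illustrative Python with NumPy; the matrix is invented for the example) converges to the dominant eigenvector:

```python
import numpy as np

# Power iteration: repeatedly apply the matrix and renormalize.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # eigenvalues 3 and 1

v = np.array([1.0, 0.0])     # an arbitrary nonzero starting vector
for _ in range(50):
    v = A @ v
    v = v / np.linalg.norm(v)

print(v)          # approaches (0.7071, 0.7071), the dominant eigenvector
print(v @ A @ v)  # Rayleigh quotient, approaches the eigenvalue 3
```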

Finding the eigenvectors and eigenvalues for a linear transformation is often done using matrix algebra, first developed in the mid-19th century by the English mathematician Arthur Cayley. His work formed the foundation for modern linear algebra.

Mark Andrew Ronan

