A **linear subspace** (or **vector subspace**) is an important concept in linear algebra and related fields of mathematics.
A linear subspace is usually called simply a *subspace* when the context serves to distinguish it from other kinds of subspace.


Let `K` be a field (such as the field of real numbers), and let `V` be a vector space over `K`.
As usual, we call elements of `V` *vectors* and call elements of `K` *scalars*.
Suppose that `W` is a subset of `V`.
If `W` is a vector space itself, with the same vector space operations as `V` has, then it is a **subspace** of `V`.

To use this definition, we don't have to prove that all the properties of a vector space hold for `W`.
Instead, we can prove a theorem that gives us an easier way to show that a subset of a vector space is a subspace.

**Theorem:**
Let `V` be a vector space over the field `K`, and let `W` be a subset of `V`.
Then `W` is a subspace if and only if it satisfies the following 3 conditions:

1. If **u** and **v** are elements of `W`, then the sum **u** + **v** of **u** and **v** is an element of `W`;
2. If **u** is an element of `W` and `c` is a scalar from `K`, then the scalar product `c`**u** is an element of `W`;
3. `W` is not empty; equivalently, `W` contains a zero vector.

**Proof:**
Looking at the definition of a vector space, we see that properties 1 and 2 above assure closure of `W` under addition and scalar multiplication, so the vector space operations are well defined.
By property 3, there is some element `w` of `W`.
Multiplying this by the scalar 0, we get an additive identity, 0`w` = **0** in `W`, and we get the additive inverse of any vector in `W` by multiplying it by the scalar −1.
Since elements of `W` are necessarily elements of `V`, the other properties of a vector space are satisfied *a fortiori*.
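The key step of the proof can be sketched numerically. The following illustration (not part of the article) shows how closure under scalar multiplication alone produces the additive identity (0·`w`) and additive inverses ((−1)·`w`), here for vectors in **R**^{3}:

```python
# Sketch of the proof's key step: from any w in W, scalar multiplication
# yields the zero vector and the additive inverse of w.

def scale(c, w):
    """Scalar product c * w, componentwise."""
    return tuple(c * x for x in w)

def add(u, v):
    """Vector sum u + v, componentwise."""
    return tuple(a + b for a, b in zip(u, v))

w = (3.0, -1.5, 2.0)          # an arbitrary element of W

zero = scale(0, w)            # 0 * w is the additive identity
neg_w = scale(-1, w)          # (-1) * w is the additive inverse of w

print(zero == (0, 0, 0))      # True
print(add(w, neg_w))          # (0.0, 0.0, 0.0)
```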

**Example I:**
Let the field `K` be the set **R** of real numbers, and let the vector space `V` be the Euclidean space **R**^{3}.
Take `W` to be the set of all vectors in `V` whose last component is 0.
Then `W` is a subspace of `V`.

*Proof:*

- Given **u** and **v** in `W`, they can be expressed as **u** = (`u`_{1}, `u`_{2}, 0) and **v** = (`v`_{1}, `v`_{2}, 0). Then **u** + **v** = (`u`_{1} + `v`_{1}, `u`_{2} + `v`_{2}, 0 + 0) = (`u`_{1} + `v`_{1}, `u`_{2} + `v`_{2}, 0). Thus, **u** + **v** is an element of `W` too.
- Given **u** in `W` and a scalar `c` in **R**, if **u** = (`u`_{1}, `u`_{2}, 0) again, then `c`**u** = (`c``u`_{1}, `c``u`_{2}, `c`0) = (`c``u`_{1}, `c``u`_{2}, 0). Thus, `c`**u** is an element of `W` too.
- **0** = (0, 0, 0) is immediately an element of `W`.
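Example I can be spot-checked numerically. This sketch (an illustration, not a proof) tests the three conditions for the hypothetical membership test `in_W`, which encodes "last component is 0":

```python
# Numerical spot-check of Example I: W = {(x, y, 0)} in R^3.

def in_W(v):
    """Membership test: last component is 0."""
    return v[2] == 0

u, v = (1.0, 2.0, 0.0), (-3.0, 0.5, 0.0)

s = tuple(a + b for a, b in zip(u, v))   # u + v
m = tuple(4.0 * a for a in u)            # c u with c = 4

print(in_W(s), in_W(m), in_W((0.0, 0.0, 0.0)))  # True True True
```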

**Example II:**
Let the field be **R** again, but now let the vector space `V` be the plane **R**^{2}. Take `W` to be the set of points (`x`, `y`) of **R**^{2} such that `x` = `y`. Then `W` is a subspace of **R**^{2}.

*Proof:*

- Let **p** = (`p`_{1}, `p`_{2}) and **q** = (`q`_{1}, `q`_{2}) be elements of `W`, that is, points in the plane such that `p`_{1} = `p`_{2} and `q`_{1} = `q`_{2}. Then **p** + **q** = (`p`_{1} + `q`_{1}, `p`_{2} + `q`_{2}); since `p`_{1} = `p`_{2} and `q`_{1} = `q`_{2}, then `p`_{1} + `q`_{1} = `p`_{2} + `q`_{2}, so **p** + **q** is an element of `W`.
- Let **p** = (`p`_{1}, `p`_{2}) be an element of `W`, that is, a point in the plane such that `p`_{1} = `p`_{2}, and let `c` be a scalar in **R**. Then `c`**p** = (`c``p`_{1}, `c``p`_{2}); since `p`_{1} = `p`_{2}, then `c``p`_{1} = `c``p`_{2}, so `c`**p** is an element of `W`.
- Again, the point **0** = (0, 0) belongs to `W`, since 0 = 0.
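The same kind of numerical spot-check works here (illustration only); `in_W` below is a hypothetical membership test encoding "both components equal":

```python
# Numerical spot-check of the diagonal line W = {(x, x)} in R^2.

def in_W(p):
    """Membership test: both components equal."""
    return p[0] == p[1]

p, q = (2.0, 2.0), (-1.5, -1.5)

s = tuple(a + b for a, b in zip(p, q))   # p + q
m = tuple(-3.0 * a for a in p)           # c p with c = -3

print(in_W(s), in_W(m), in_W((0.0, 0.0)))  # True True True
```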

**Example III:**
Again take the field to be **R**, but now let the vector space `V` be the set **R**^{R} of all functions from **R** to **R**.
Let C(**R**) be the subset consisting of continuous functions.
Then C(**R**) is a subspace of **R**^{R}.

*Proof:*

- We know from calculus that the sum of two continuous functions is continuous.
- Again, we know from calculus that the product of a continuous function and a number is continuous.
- Consider the function **0** from **R** to **R** defined by **0**(`x`) = 0 for all `x` in **R**. This *zero function* is continuous from **R** into **R**.
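The vector space operations on **R**^{R} are pointwise addition and pointwise scaling. The sketch below (an illustration, not from the article) shows these operations applied to continuous functions; continuity of the results is the calculus fact the proof cites, which the code does not attempt to verify:

```python
import math

# Pointwise operations on functions R -> R, the vector space structure
# used in Example III.

def add(f, g):
    """Pointwise sum: (f + g)(x) = f(x) + g(x)."""
    return lambda x: f(x) + g(x)

def scale(c, f):
    """Pointwise scalar multiple: (c f)(x) = c * f(x)."""
    return lambda x: c * f(x)

def zero(x):
    """The zero function, the additive identity of R^R."""
    return 0.0

h = add(math.sin, math.cos)   # sum of two continuous functions
k = scale(2.0, math.exp)      # scalar multiple of a continuous function

print(h(0.0))    # sin(0) + cos(0) = 1.0
print(k(0.0))    # 2 * e^0 = 2.0
print(zero(3.7)) # 0.0
```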

Examples that extend these themes are common in functional analysis.

Another way to characterise subspaces is that they are closed under linear combinations.
That is, `W` is a subspace iff every linear combination of (finitely many) elements of `W` also belongs to `W`.
Conditions 1, 2, and 3 for a subspace are simply the most basic kinds of linear combinations (where for condition 3, we remember that a linear combination of no vectors at all yields the zero vector).
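The linear-combination characterisation, including the empty combination yielding the zero vector, can be sketched as follows (illustration only, again using the plane `z` = 0 in **R**^{3} as the hypothetical `W`):

```python
# Closure under linear combinations, checked for W = {(x, y, 0)} in R^3.

def in_W(v):
    """Hypothetical membership test: last component is 0."""
    return v[2] == 0

def linear_combination(coeffs, vectors, dim=3):
    """Sum of c_i * v_i; the empty combination yields the zero vector."""
    total = (0.0,) * dim
    for c, v in zip(coeffs, vectors):
        total = tuple(t + c * x for t, x in zip(total, v))
    return total

ws = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (2.0, -3.0, 0.0)]
combo = linear_combination([5.0, -2.0, 0.5], ws)

print(in_W(combo))                 # True
print(linear_combination([], []))  # (0.0, 0.0, 0.0) — the empty combination
```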

Given subspaces `U` and `W` of a vector space `V`, then their intersection `U` ∩ `W` := {**v** ∈ `V` : **v** is an element of both `U` and `W`} is also a subspace of `V`.

*Proof:*

- Let **v** and **w** be elements of `U` ∩ `W`. Then **v** and **w** belong to both `U` and `W`. Because `U` is a subspace, **v** + **w** belongs to `U`. Similarly, since `W` is a subspace, **v** + **w** belongs to `W`. Thus, **v** + **w** belongs to `U` ∩ `W`.
- Let **v** belong to `U` ∩ `W`, and let `c` be a scalar. Then **v** belongs to both `U` and `W`. Since `U` and `W` are subspaces, `c`**v** belongs to both `U` and `W`, so `c`**v** belongs to `U` ∩ `W`.
- Since `U` and `W` are subspaces of `V`, **0** belongs to both `U` and `W` by a property in the previous section. Thus, `U` ∩ `W` is nonempty.
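The intersection property can also be spot-checked numerically. This sketch (illustration only) takes two hypothetical subspaces of **R**^{3}, the plane `z` = 0 and the plane `x` = `y`; their intersection is the line {(`t`, `t`, 0)}:

```python
# Spot-check that U ∩ W satisfies the three subspace conditions,
# for U = {(x, y, 0)} and W = {(x, x, z)} in R^3.

def in_U(v):
    """U: last component is 0."""
    return v[2] == 0

def in_W(v):
    """W: first two components equal."""
    return v[0] == v[1]

def in_intersection(v):
    return in_U(v) and in_W(v)

v, w = (2.0, 2.0, 0.0), (-1.0, -1.0, 0.0)

s = tuple(a + b for a, b in zip(v, w))   # v + w
m = tuple(3.0 * a for a in v)            # c v with c = 3

print(in_intersection(s), in_intersection(m), in_intersection((0.0, 0.0, 0.0)))
```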