Abhinav Anand Maths

"Shapes That Inspire, Angles That Amaze"


Friday, October 10, 2025

Determinant Notes in English by Abhinav Sir

Revision Notes for Class 12 Mathematics Chapter 4 - Determinants | By Abhinav Sir

Provided by Abhinav Anand Maths

Matrix Representation of Linear Equations

When a system of algebraic equations is given to us as:

\[ a_1 x + b_1 y = c_1 \] \[ a_2 x + b_2 y = c_2 \]

Then we can express them in the form of matrices as:

\[\begin{bmatrix} a_1 & b_1 \\ a_2 & b_2 \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} c_1 \\ c_2 \end{bmatrix}\]

To get the solution of a system of linear equations, we find all the values of the variables satisfying all the linear equations in the system.
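The matrix form above can also be handled numerically. A minimal sketch using NumPy, with coefficients chosen for illustration (x + 2y = 5, 3x + 4y = 11; not a system from the notes):

```python
import numpy as np

# Matrix form A X = C for the illustrative system x + 2y = 5, 3x + 4y = 11
# (example coefficients chosen for this sketch).
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
C = np.array([5.0, 11.0])

X = np.linalg.solve(A, C)  # solves A X = C
print(X)  # x = 1, y = 2
```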

Definition of Determinants

  • We can define the determinant of a matrix as a scalar value that can be calculated from the elements of a square matrix.
  • The scalar value for a square matrix \[\begin{bmatrix} a_1 & b_1 \\ a_2 & b_2 \end{bmatrix}\] is given by \(a_1 b_2 - a_2 b_1\).
  • It is represented as |A| or det (A) or \(\Delta\).
  • For a matrix \[\begin{bmatrix} a_1 & b_1 \\ a_2 & b_2 \end{bmatrix}\], the determinant is written as \[\begin{vmatrix} a_1 & b_1 \\ a_2 & b_2 \end{vmatrix}\].
  • Square matrices are those matrices that have the same number of rows and columns. Only such matrices have determinants.

Types of Determinants

  1. First Order Determinant – It is the determinant of a matrix of order one. The element of the matrix itself is the determinant value.

    For example, \[\begin{vmatrix} 2 \end{vmatrix} = 2\]

  2. Second Order Determinant – It is the determinant of a matrix of order two. If the matrix is \[\begin{bmatrix} a_1 & b_1 \\ a_2 & b_2 \end{bmatrix}\], then \[\begin{vmatrix} a_1 & b_1 \\ a_2 & b_2 \end{vmatrix} = a_1 b_2 - a_2 b_1\].

    For example, \[\begin{vmatrix} 1 & 3 \\ 5 & 3 \end{vmatrix} = (1)(3) - (5)(3) = 3 - 15 = -12\]

  3. Third Order Determinant – It is the determinant of a matrix of order three. Let us consider \[\begin{vmatrix} a_1 & b_1 & c_1 \\ a_2 & b_2 & c_2 \\ a_3 & b_3 & c_3 \end{vmatrix}\].
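The second-order formula \(a_1 b_2 - a_2 b_1\) can be sketched directly in code (det2 is a hypothetical helper name used for this sketch):

```python
# Second-order determinant a1*b2 - a2*b1 (det2 is a hypothetical helper).
def det2(a1, b1, a2, b2):
    """Determinant of the 2x2 matrix [[a1, b1], [a2, b2]]."""
    return a1 * b2 - a2 * b1

print(det2(1, 3, 5, 3))  # -12, matching the worked example above
```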

We have six ways to write the determinant, i.e. three ways to expand along rows and three ways to expand along columns.

Let us consider the expansion along the first row, which is the most common method.

So, first we consider the first element, \(a_{11}\), and delete row 1 and column 1. We end up with a second order matrix, so we take its determinant and multiply it by \(a_{11}\) and by \((-1)^{1+1} = 1\). In general, the sign factor for an element \(a_{ij}\) is \((-1)^{i+j}\), where \(i + j\) is the sum of its row and column indices.

\[ a_1 \begin{vmatrix} b_2 & c_2 \\ b_3 & c_3 \end{vmatrix} = a_1 (b_2 c_3 - b_3 c_2) \]

Then we move on to element \(a_{12}\) and delete row 1 and column 2. Again, we end up with a second order matrix, so we take its determinant and multiply it by \(a_{12}\) and by \((-1)^{1+2} = -1\).

\[ -b_1 \begin{vmatrix} a_2 & c_2 \\ a_3 & c_3 \end{vmatrix} = -b_1 (a_2 c_3 - a_3 c_2) \]

At last, we move on to element \(a_{13}\) and delete row 1 and column 3. Again, we end up with a second order matrix, so we take its determinant and multiply it by \(a_{13}\) and by \((-1)^{1+3} = 1\).

\[ c_1 \begin{vmatrix} a_2 & b_2 \\ a_3 & b_3 \end{vmatrix} = c_1 (a_2 b_3 - a_3 b_2) \]

Now, we add them up to get the determinant of matrix \[\begin{vmatrix} a_1 & b_1 & c_1 \\ a_2 & b_2 & c_2 \\ a_3 & b_3 & c_3 \end{vmatrix}\] as \( a_1 (b_2 c_3 - b_3 c_2) - b_1 (a_2 c_3 - a_3 c_2) + c_1 (a_2 b_3 - a_3 b_2) \).
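The first-row expansion derived above can be written as a small function (det3 is a hypothetical helper name; its three terms mirror the derivation step by step):

```python
# Third-order determinant by expansion along the first row
# (det3 is a hypothetical helper mirroring the derivation above).
def det3(m):
    (a1, b1, c1), (a2, b2, c2), (a3, b3, c3) = m
    return (a1 * (b2 * c3 - b3 * c2)
            - b1 * (a2 * c3 - a3 * c2)
            + c1 * (a2 * b3 - a3 * b2))

print(det3([[1, 2, 1], [3, 4, 1], [1, 2, 3]]))  # -4
```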

In the same manner, we can expand along other rows and columns. We will get the same value of determinant irrespective of the kind of expansion we opt for.

A tip to keep in mind while choosing the expansion is to go for the row or column containing the maximum number of zeroes; if there are no zeroes, prefer the row or column with the most ones. This will make calculations easier.

Another interesting point to keep in mind is that if we have two square matrices A and B of order n such that A = kB, then |A| = k^n |B|.
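This scaling rule can be spot-checked numerically; a sketch with NumPy on an arbitrary 3×3 example (n = 3, k = 2):

```python
import numpy as np

# Checking |kB| = k^n |B| for an arbitrary 3x3 matrix (n = 3, k = 2).
B = np.array([[1.0, 2.0, 1.0],
              [3.0, 4.0, 1.0],
              [1.0, 2.0, 3.0]])
k = 2
lhs = np.linalg.det(k * B)      # determinant of the scaled matrix
rhs = k ** 3 * np.linalg.det(B) # k^n times the original determinant
assert np.isclose(lhs, rhs)
```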

Properties of Determinants

The below properties are true for determinants of all orders.

  1. Property 1 - The value of the determinant remains unchanged if its rows and columns are interchanged. Let us verify with the help of an example,
  2. \[ \begin{vmatrix} 1 & 2 & 1 \\ 3 & 4 & 1 \\ 1 & 2 & 3 \end{vmatrix} = 1(12 - 2) - 2(9 - 1) + 1(6 - 4) = 10 - 16 + 2 = -4 \]

    Exchanging rows and columns, we get

    \[ \begin{vmatrix} 1 & 3 & 1 \\ 2 & 4 & 2 \\ 1 & 1 & 3 \end{vmatrix} = 1(12 - 2) - 3(6 - 2) + 1(2 - 4) = 10 - 12 - 2 = -4 \]

    Hence verified.

  3. It follows from the above property that if A is a square matrix, then det(A) = det(A').
  4. Here, A' is the transpose of A.

  5. For interchange of row and columns, say R_i = i^{th} row and C_i = i^{th} column, we represent it symbolically as R_i ↔ C_i.
  6. Property 2 - If any two rows (or columns) of a determinant are interchanged, then sign of determinant changes. Let us verify with the help of an example,
  7. \[ \begin{vmatrix} 1 & 2 & 1 \\ 3 & 4 & 1 \\ 1 & 2 & 3 \end{vmatrix} = -4 \]

    Interchanging first and second rows, we get

    \[ \begin{vmatrix} 3 & 4 & 1 \\ 1 & 2 & 1 \\ 1 & 2 & 3 \end{vmatrix} = 3(6 - 2) - 4(3 - 1) + 1(2 - 2) = 12 - 8 + 0 = 4 \]

    Hence verified.

  8. For interchange of two rows/columns, say R_i and R_j rows or C_i and C_j columns, we represent it symbolically as R_i ↔ R_j or C_i ↔ C_j.
  9. Property 3 - If any two rows (or columns) of a determinant are identical (all corresponding elements are the same), then the value of the determinant is zero. Let us verify with the help of an example,
  10. \[ \begin{vmatrix} 1 & 3 & 1 \\ 3 & 4 & 3 \\ 1 & 2 & 1 \end{vmatrix} = 1(4 - 6) - 3(3 - 3) + 1(6 - 4) = -2 + 0 + 2 = 0 \]

    Hence verified.

  11. Property 4 - If each element of a row (or a column) of a determinant is multiplied by a constant k, then its value gets multiplied by k. Let us verify with the help of an example,
  12. \[ \begin{vmatrix} 1 & 2 & 3 \\ 1 & 3 & 3 \\ 1 & 2 & 1 \end{vmatrix} = 1(3 \cdot 1 - 3 \cdot 2) - 2(1 \cdot 1 - 3 \cdot 1) + 3(1 \cdot 2 - 3 \cdot 1) = -3 + 4 - 3 = -2 \]

    Now, first row of the same determinant is multiplied by a constant 2 to get

    \[ \begin{vmatrix} 2 & 4 & 6 \\ 1 & 3 & 3 \\ 1 & 2 & 1 \end{vmatrix} = 2(3-6) -4(1-3) +6(2-3) = -6 +8 -6 = -4 \]

    which is 2 * (-2).

    Hence verified.

  13. Property 5 - If some or all elements of a row or column of a determinant are expressed as sum of two (or more) terms, then the determinant can be expressed as sum of two (or more) determinants. Let us verify with the help of an example,
  14. \[ \begin{vmatrix} 1 & 2 & 3 \\ 1 & 3 & 3 \\ 1 & 2 & 1 \end{vmatrix} = -2 \]

    Now, we write each element of the first row of the same determinant as a sum of two terms and get

    \[ \begin{vmatrix} 2+1 & 2+2 & 1+3 \\ 1 & 3 & 3 \\ 1 & 2 & 1 \end{vmatrix} = 3(3-6) -4(1-3) +4(2-3) = -9 +8 -4 = -5 \]

    By Property 5, the value of this determinant is

    \[ \begin{vmatrix} 2 & 2 & 1 \\ 1 & 3 & 3 \\ 1 & 2 & 1 \end{vmatrix} + \begin{vmatrix} 1 & 2 & 3 \\ 1 & 3 & 3 \\ 1 & 2 & 1 \end{vmatrix} = -3 + (-2) = -5 \]

    Hence verified.

  15. Property 6 - If, to each element of any row or column of a determinant, the equimultiples of corresponding elements of another row (or column) are added, then the value of the determinant remains the same, i.e., the value is unchanged under the operation R_i → R_i + k R_j or C_i → C_i + k C_j. Let us verify with the help of an example,
  16. \[ \begin{vmatrix} 1 & 2 & 3 \\ 1 & 3 & 3 \\ 1 & 2 & 1 \end{vmatrix} = -2 \]

    Now, we apply \(R_1 \to R_1 + R_3\), i.e., add the third row to the first row of the same determinant, and get

    \[ \begin{vmatrix} 2 & 4 & 4 \\ 1 & 3 & 3 \\ 1 & 2 & 1 \end{vmatrix} = 2(3 - 6) - 4(1 - 3) + 4(2 - 3) = -6 + 8 - 4 = -2 \]

    This agrees with the original value. Alternatively, by Property 5 the new determinant splits into the original determinant plus a determinant whose first and third rows are identical, and the latter is zero by Property 3.

    Hence verified.

  17. Property 7 - If each element of a row (or column) of a determinant is zero, then its value is zero. For example,
  18. \[ \begin{vmatrix} 0 & 12 & -7 \\ 0 & 8 & 1 \\ 0 & -5 & 13 \end{vmatrix} \]

    If we expand this along the first column, then the value will be zero.

  19. Property 8 - In a determinant, if all the elements on one side of the principal diagonal are zeroes, then the value of the determinant is equal to the product of the elements in the principal diagonal. For example, the determinant
  20. \[ \begin{vmatrix} 3 & -3 & 2 \\ 0 & 8 & 1 \\ 0 & 0 & 1 \end{vmatrix} = 3 \times 8 \times 1 = 24 \]

    Expanding along the first column gives 3(8 \cdot 1 - 1 \cdot 0) = 24, which is the product of the elements in the principal diagonal, 3 × 8 × 1 = 24. Hence verified. (Not in the current syllabus)
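The properties above can also be spot-checked numerically. This sketch verifies Properties 1, 2, and 4 on the example matrix used earlier (a numerical check, not a proof):

```python
import numpy as np

# Spot-checking Properties 1, 2 and 4 on the matrix from the examples above.
A = np.array([[1.0, 2.0, 1.0],
              [3.0, 4.0, 1.0],
              [1.0, 2.0, 3.0]])
d = np.linalg.det(A)

assert np.isclose(np.linalg.det(A.T), d)      # Property 1: |A'| = |A|

B = A[[1, 0, 2], :]                           # interchange rows 1 and 2
assert np.isclose(np.linalg.det(B), -d)       # Property 2: sign changes

C = A.copy()
C[0] *= 2                                     # multiply first row by 2
assert np.isclose(np.linalg.det(C), 2 * d)    # Property 4: value doubles
```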

Area of a Triangle

  • Consider a triangle with vertices as (x_1, y_1), (x_2, y_2) and (x_3, y_3). We know that the area of the triangle can be found as
  • \[ A = \frac{1}{2} [x_1 (y_2 - y_3) + x_2 (y_3 - y_1) + x_3 (y_1 - y_2)] \]
  • We can represent the same using determinants as \(\Delta = \frac{1}{2} \begin{vmatrix} x_1 & y_1 & 1 \\ x_2 & y_2 & 1 \\ x_3 & y_3 & 1 \end{vmatrix}\).
  • While computing the area, we always take the absolute value of the determinant, since area is a positive quantity.
  • If the area is given and an unknown coordinate must be found, we use both the positive and negative values of the determinant, since its sign depends on the order of the vertices.
  • We know that three collinear points cannot form a triangle and hence we can say that the area of the triangle formed by three collinear points is zero.
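The area formula can be sketched as a small function (triangle_area is a hypothetical helper for this sketch; it takes the absolute value, as noted above):

```python
import numpy as np

# Area of a triangle from its vertices using the determinant formula above
# (triangle_area is a hypothetical helper for this sketch).
def triangle_area(p1, p2, p3):
    m = np.array([[p1[0], p1[1], 1.0],
                  [p2[0], p2[1], 1.0],
                  [p3[0], p3[1], 1.0]])
    return abs(np.linalg.det(m)) / 2.0

print(triangle_area((0, 0), (4, 0), (0, 3)))   # ≈ 6 for this right triangle
print(triangle_area((0, 0), (1, 1), (2, 2)))   # ≈ 0: collinear points
```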

Minors

If we delete the i^{th} row and j^{th} column of a determinant in which the element a_{ij} lies, then we get the minor of that element.

  • Minor is represented as M_{ij}.
  • Minor of an element of a determinant of order n (n ≥ 2) is a determinant of order n-1.
  • If we have to find M_{21} of the determinant \[\begin{vmatrix} 1 & -4 & 0 \\ 2 & 5 & 3 \\ -1 & 2 & 1 \end{vmatrix}\], then we delete row 2 and column 1 to get \(M_{21} = \begin{vmatrix} -4 & 0 \\ 2 & 1 \end{vmatrix} = -4 - 0 = -4\).

Cofactors

  • We denote the cofactor of an element a_{ij} as A_{ij}.
  • Multiplying the minor of an element with a factor (-1)^{i+j} gives the cofactor.
  • It can be defined as A_{ij} = (-1)^{i+j} M_{ij}, where M_{ij} is minor of a_{ij}.
  • When the elements of a row/column are multiplied with the cofactors of any other row/column, then their sum is zero.
  • If we must find A_{11} of the determinant \[\begin{vmatrix} 1 & -4 & 0 \\ 2 & 5 & 3 \\ -1 & 2 & 1 \end{vmatrix}\], then we get \(A_{11} = (-1)^{1+1} \begin{vmatrix} 5 & 3 \\ 2 & 1 \end{vmatrix} = 1(5 - 6) = -1\).
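Minors and cofactors can be computed by deleting the appropriate row and column; a sketch with hypothetical helpers minor and cofactor, 1-indexed to match the notation above:

```python
import numpy as np

# Minor M_ij: delete row i and column j, then take the determinant.
# Cofactor A_ij = (-1)^(i+j) * M_ij.  (Hypothetical helpers, 1-indexed.)
def minor(a, i, j):
    sub = np.delete(np.delete(a, i - 1, axis=0), j - 1, axis=1)
    return np.linalg.det(sub)

def cofactor(a, i, j):
    return (-1) ** (i + j) * minor(a, i, j)

A = np.array([[1.0, -4.0, 0.0],
              [2.0, 5.0, 3.0],
              [-1.0, 2.0, 1.0]])
print(round(minor(A, 2, 1)))     # -4, as in the worked example
print(round(cofactor(A, 1, 1)))  # -1
```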

Adjoint of a Matrix

  • The matrix obtained after taking the transpose of the matrix of cofactors of the given matrix is called the adjoint of that matrix.
  • For example, if we have the cofactor matrix \[\begin{bmatrix} a & b & c \\ d & e & f \\ g & h & i \end{bmatrix}\], then the adjoint would be \[\begin{bmatrix} a & d & g \\ b & e & h \\ c & f & i \end{bmatrix}\].
  • For a square matrix \(A = \begin{bmatrix} a & b \\ c & d \end{bmatrix}\) of order two, we can use the following shortcut (interchange the diagonal elements and change the signs of the off-diagonal elements):
\[ \text{adj}\, A = \begin{bmatrix} d & -b \\ -c & a \end{bmatrix} \]
  • Theorem 1 - If A be any given square matrix of order n, then A (adj A) = (adj A) A = |A| I, where I is the identity matrix of order n.
  • If we have a matrix \[\begin{bmatrix} a & b \\ c & d \end{bmatrix}\] and its adjoint as \[\begin{bmatrix} e & f \\ g & h \end{bmatrix}\], then we can say that the sum of product of elements of a row/column with corresponding cofactors is equal to |A| and zero otherwise. So, we can write,

    \[ A \, (\text{adj}\, A) = \begin{bmatrix} |A| & 0 \\ 0 & |A| \end{bmatrix} = |A| I \]
  • Singular Matrices – If the determinant of a square matrix is zero, then it is said to be a singular matrix.
  • Non-Singular Matrices – If the determinant of a square matrix is a non-zero value, then it is said to be a non-singular matrix.
  • Theorem 2 - If A and B are non-singular matrices of the same order, then AB and BA are also non-singular matrices of the same order.
  • Theorem 3 - The determinant of the product of matrices is equal to the product of their respective determinants. It can be written as |AB| = |A| |B|, where A and B are square matrices of the same order.
  • Using Theorem 1 together with Theorem 3, we can derive a useful result, as shown below:

    From Theorem 1, we have A (adj A) = \[\begin{bmatrix} |A| & 0 \\ 0 & |A| \end{bmatrix}\].

    Now taking determinants on both sides and using Theorem 3,

    |A (adj A)| = |A| |adj A|, while ||A| I| = |A|^2 |I| = |A|^2, since this A is of order 2.

    So |adj A| = |A|. This leads us to the general conclusion that if A is a square matrix of order n, then |adj A| = |A|^{n-1}.
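Theorem 1 and the result |adj A| = |A|^{n-1} can be checked numerically; this sketch builds adj A as the transpose of the cofactor matrix (adjoint is a hypothetical helper):

```python
import numpy as np

# adj A = transpose of the cofactor matrix (adjoint is a hypothetical helper).
def adjoint(a):
    n = a.shape[0]
    cof = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            sub = np.delete(np.delete(a, i, axis=0), j, axis=1)
            cof[i, j] = (-1) ** (i + j) * np.linalg.det(sub)
    return cof.T

A = np.array([[1.0, -4.0, 0.0],
              [2.0, 5.0, 3.0],
              [-1.0, 2.0, 1.0]])
adjA = adjoint(A)
dA = np.linalg.det(A)

assert np.allclose(A @ adjA, dA * np.eye(3))      # Theorem 1: A(adj A) = |A| I
assert np.isclose(np.linalg.det(adjA), dA ** 2)   # |adj A| = |A|^(n-1), n = 3
```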

  • Theorem 4 - A square matrix is invertible if and only if it is a non-singular matrix.
  • So, for a non-singular matrix A, we can write the inverse of the matrix as A^{-1} = (1/|A|) adj A.

    Looking into the proof,

    Let A be an invertible matrix of order n. Let I be the identity matrix of order n. Then, there exists a square matrix B of order n such that AB = BA = I.

    So, we have |AB| = |I|. We can write |AB| = |A| |B|. Since |I| = 1, |A| |B| = 1.

    This gives |A| ≠ 0 and hence A is non-singular.

    Conversely, if we let A as a non-singular matrix, then |A| ≠ 0.

    From Theorem 1, A (adj A) = (adj A) A = |A| I. Rearranging terms,

    (1/|A|) A (adj A) = (1/|A|) (adj A) A = I

    It is the same as AB = BA = I.

    So, here B = (1/|A|) adj A, which is the inverse of matrix A.
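For a 2×2 matrix, the formula A^{-1} = (1/|A|) adj A together with the adjoint shortcut gives a quick sketch (the matrix entries are arbitrary example values):

```python
import numpy as np

# Inverse of a 2x2 matrix via A^{-1} = (1/|A|) adj A, with
# adj [[a, b], [c, d]] = [[d, -b], [-c, a]].  (Arbitrary example values.)
A = np.array([[1.0, 3.0],
              [5.0, 3.0]])
adjA = np.array([[3.0, -3.0],
                 [-5.0, 1.0]])
Ainv = adjA / np.linalg.det(A)   # |A| = -12, so A is non-singular

assert np.allclose(Ainv, np.linalg.inv(A))
assert np.allclose(A @ Ainv, np.eye(2))
```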

    Applications of Determinants and Matrices

    • They can be used for solving systems of linear equations in two or three variables. They can also be used for checking the consistency of a system of linear equations.
    • Consistent system is a system of equations whose solution (one or more) exists.
    • Inconsistent system is a system of equations whose solution does not exist.
    • We can say that the determinant is a number that determines the uniqueness of the solution of a system of linear equations.

    Solution of a System of Linear Equations Using Inverse of Matrix

    Let us consider system of equations with three variables as

    \[ a_1 x + b_1 y + c_1 z = d_1 \] \[ a_2 x + b_2 y + c_2 z = d_2 \] \[ a_3 x + b_3 y + c_3 z = d_3 \]

    Writing it in matrix form, we have

    \[\begin{bmatrix} a_1 & b_1 & c_1 \\ a_2 & b_2 & c_2 \\ a_3 & b_3 & c_3 \end{bmatrix} \begin{bmatrix} x \\ y \\ z \end{bmatrix} = \begin{bmatrix} d_1 \\ d_2 \\ d_3 \end{bmatrix}\]

    This can be expressed as AX = B.

    • Now, we look at two cases:
    • Case 1: If A is a non-singular matrix, then its inverse exists.
    • From AX = B, we pre-multiply by A^{-1},

      A^{-1} AX = A^{-1} B

      Using associative property,

      (A^{-1} A) X = A^{-1} B
      I X = A^{-1} B
      X = A^{-1} B

      The above matrix equation provides a unique solution for the system of equations, since we know that the inverse of a matrix is unique. We call this method the Matrix Method.
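The Matrix Method in code, for an illustrative system chosen for this sketch (x + y + z = 6, 2y + 5z = −4, 2x + 5y − z = 27):

```python
import numpy as np

# Matrix Method X = A^{-1} B for the illustrative system
#   x + y + z = 6,  2y + 5z = -4,  2x + 5y - z = 27.
A = np.array([[1.0, 1.0, 1.0],
              [0.0, 2.0, 5.0],
              [2.0, 5.0, -1.0]])
B = np.array([6.0, -4.0, 27.0])

assert not np.isclose(np.linalg.det(A), 0)   # A is non-singular
X = np.linalg.inv(A) @ B                     # unique solution X = A^{-1} B
print(X)  # x = 5, y = 3, z = -2
```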

    • Case 2: If A is a singular matrix, then |A| = 0.
    • For this case, first we calculate (adj A) B.

      If (adj A) B is a non-zero matrix, then the solution does not exist, and the system of equations is called inconsistent.

      If (adj A) B is a zero matrix, then the system of equations may be either consistent (with infinitely many solutions) or inconsistent (with no solution).
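Case 2 can be illustrated with a singular 2×2 system (x + y = 2 and 2x + 2y = 3, which are parallel lines; adjoint is a hypothetical helper built from cofactors):

```python
import numpy as np

# Singular case: x + y = 2 and 2x + 2y = 3 (parallel lines, no solution).
def adjoint(a):
    n = a.shape[0]
    cof = np.array([[(-1) ** (i + j) *
                     np.linalg.det(np.delete(np.delete(a, i, 0), j, 1))
                     for j in range(n)] for i in range(n)])
    return cof.T

A = np.array([[1.0, 1.0],
              [2.0, 2.0]])
B = np.array([[2.0],
              [3.0]])

assert np.isclose(np.linalg.det(A), 0)       # |A| = 0: A is singular
assert not np.allclose(adjoint(A) @ B, 0)    # (adj A) B != 0 => inconsistent
```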
