This is a (very) brief summary of what was (or will be) covered in each class.
December 9
Lecture overview: Defined diagonally dominant matrices and proved that such matrices are invertible and admit an LU factorization. Defined symmetric and positive definite matrices; proved that a symmetric matrix that admits an LU factorization also admits a factorization of the form A = LDL^T, where D is diagonal and contains the pivots of A; when A is positive definite, all the pivots are positive, so D admits a square root and A has a Cholesky factorization A = LL^T; this has a nice application to least squares problems.
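Not part of the course materials, but for those who want to experiment: a minimal Python/NumPy sketch of a Cholesky factorization and of its use in solving the normal equations of a least squares problem. The matrices and data below are made up for illustration.

    import numpy as np

    # A symmetric positive definite matrix (made up for illustration).
    A = np.array([[4.0, 2.0, 2.0],
                  [2.0, 5.0, 1.0],
                  [2.0, 1.0, 6.0]])

    # NumPy returns the lower-triangular L with A = L L^T.
    L = np.linalg.cholesky(A)
    print(np.allclose(L @ L.T, A))        # True

    # Least squares use: solve the normal equations M^T M x = M^T b
    # by two triangular solves with the Cholesky factor of M^T M.
    M = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0], [1.0, 4.0]])
    b = np.array([2.1, 3.9, 6.2, 8.0])
    G = np.linalg.cholesky(M.T @ M)       # M^T M = G G^T
    y = np.linalg.solve(G, M.T @ b)       # forward solve G y = M^T b
    x = np.linalg.solve(G.T, y)           # back solve G^T x = y
    print(x)                              # the least squares coefficients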
The lecture material is contained in Section 6.6 of your text.
Quiz 5 was returned and solutions were distributed.
Problem Set 6 is due on Wednesday, December 11. See Problem Sets for more information.
Our final presentations for the graduate students will be on Friday, December 13, 6:00 - 8:00 pm in Bannow 139. See Announcements for more information.
Please review your notes and look over Section 6.6 for our next meeting.
December 4
Lecture overview: Completed the theorem describing when a square matrix admits an LU factorization; worked out a small example; demonstrated ALG064, which performs an LU factorization for an n x n matrix input as a .DTA file; when a matrix does not admit an LU factorization, we can introduce a permutation matrix P so that PA admits an LU factorization; then A = P^T LU, and this more general factorization is available for all square matrices.
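As an illustration (this is a library routine, not the course's ALG064), SciPy's LU factorization returns exactly the permuted factorization described above; the matrix below, whose (1,1) entry is 0, is made up.

    import numpy as np
    from scipy.linalg import lu

    A = np.array([[0.0, 2.0, 1.0],
                  [1.0, 1.0, 0.0],
                  [2.0, 1.0, 3.0]])    # a11 = 0, so A itself has no LU factorization

    P, L, U = lu(A)                    # SciPy returns A = P @ L @ U
    print(np.allclose(A, P @ L @ U))   # True
    print(np.allclose(P.T @ A, L @ U)) # i.e. PA = LU, with the lecture's P being this P transposed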
The lecture material is contained in Section 6.5 of your text.
Quiz 5 was given during the first 20 minutes of class.
Problem Set 6 is due on Wednesday, December 11. See Problem Sets for more information.
Please review your notes and look over Section 6.6 for our next meeting.
December 2
Lecture overview: Introduced the determinant of a square matrix as a measure of volume; proved the area formula in dimension 2; recalled some convenient properties of determinants; the algorithm for computing determinants from the definition is O(n!), but if we use Gaussian elimination to reduce to a triangular matrix and then compute the determinant of the triangular matrix, then this process is O(n^3); gave an example. Introduced the idea of matrix factorizations: if we can express a matrix as a product of simpler matrices, we can simplify some computations; started with the LU factorization; with an LU factorization, solving a linear system is an O(n^2) process; stated the theorem describing when a square matrix admits an LU factorization and started the proof; the main idea is that each row operation corresponds to matrix multiplication by a lower triangular matrix.
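A quick numerical check of the O(n^3) idea, using SciPy's LU factorization (illustrative only, with a random matrix): since L has unit diagonal, det(A) is det(P) times the product of the diagonal entries of U.

    import numpy as np
    from scipy.linalg import lu

    rng = np.random.default_rng(0)
    A = rng.standard_normal((5, 5))

    P, L, U = lu(A)                                       # A = P L U, L has unit diagonal
    det_via_lu = np.linalg.det(P) * np.prod(np.diag(U))   # det(P) is +1 or -1
    print(det_via_lu, np.linalg.det(A))                   # the two values agree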
The lecture material is contained in Sections 6.4 and 6.5 of your text.
Quiz 5 will be given during the first 20 minutes of class on Wednesday, December 4. See Announcements for more information.
Problem Set 6 is due on Wednesday, December 11. See Problem Sets for more information.
Please review your notes and look over Section 6.5 for our next meeting.
November 27
Lecture overview: No class - Happy Thanksgiving!
Quiz 5 will be given during the first 20 minutes of class on Wednesday, December 4. See Announcements for more information.
Problem Set 6 is due on Wednesday, December 11. See Problem Sets for more information.
Please review your notes and look over Section 6.4 for our next meeting.
November 25
Lecture overview: Gave an example of when partial pivoting may fail; this motivates the technique of scaled partial pivoting, which chooses the pivot whose size is largest relative to the entries in its own row; worked out an example of scaled partial pivoting in detail; described complete pivoting; scaled partial pivoting is often implemented instead of complete pivoting since it requires only O(n^2) comparisons while complete pivoting requires O(n^3). Reviewed matrix inverses and an algorithm to compute the inverse of a given square matrix; this is also O(n^3).
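A minimal sketch (an assumed illustration, not the text's algorithm) of the pivot-row selection step in scaled partial pivoting; in a full implementation the scale factors stay attached to their original rows as rows are interchanged.

    import numpy as np

    def scaled_partial_pivot_row(A, k, s):
        # Return the row index i >= k maximizing |A[i, k]| / s[i].
        ratios = np.abs(A[k:, k]) / s[k:]
        return k + int(np.argmax(ratios))

    # Illustrative 2 x 2 system with one very large coefficient.
    A = np.array([[30.0, 591400.0],
                  [ 5.291,    -6.130]])
    s = np.abs(A).max(axis=1)                  # scale factors: largest entry in each row
    print(scaled_partial_pivot_row(A, 0, s))   # row 1: 5.291/6.130 beats 30/591400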
The lecture material is contained in Sections 6.2 and 6.3 of your text.
Quiz 5 will be given during the first 20 minutes of class on Wednesday, December 4. See Announcements for more information.
Problem Set 6 is due on Wednesday, December 11. See Problem Sets for more information.
Please review your notes and look over Section 6.4 for our next meeting.
November 20
Lecture overview: Showed two examples of how the Gaussian elimination algorithm can fail: there can either be no solutions or infinitely many solutions; explained the geometric interpretation of these ideas; derived an operations count for the Gaussian elimination algorithm, which is O(n^3) for an n x n matrix; gave an example in which a small pivot greatly magnified a small rounding error; introduced partial pivoting as a simple method to address the danger posed by small pivots.
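The danger posed by a small pivot can be seen in a tiny Python sketch (the numbers are illustrative, not the example worked in class): eliminating with the pivot 1e-17 destroys the answer, while interchanging the rows first does not.

    eps = 1e-17
    # System:  eps*x + y = 1
    #              x + y = 2     (the exact solution is very close to x = y = 1)

    # Elimination with the tiny pivot eps (no row interchange):
    m = 1.0 / eps
    y = (2.0 - m * 1.0) / (1.0 - m * 1.0)   # rounds to exactly 1.0
    x = (1.0 - y) / eps                     # catastrophic: gives 0.0
    print(x, y)

    # Interchanging the rows first (partial pivoting) uses the pivot 1:
    m = eps / 1.0
    y = (1.0 - m * 2.0) / (1.0 - m * 1.0)
    x = 2.0 - y
    print(x, y)                             # close to the true solution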
The lecture material is contained in Sections 6.1 and 6.2 of your text.
Problem Set 5 was collected. Problem Set 6 was assigned and is due on Wednesday, December 11. See Problem Sets for more information.
Please review your notes and look over Section 6.3 for our next meeting.
November 18
Lecture overview: Demonstrated the use of ALG044, ALG045 and ALG046 for computing double and triple integrals using Composite Simpson's Rule (44) for double integrals on domains of Type I, and Gaussian quadrature for double integrals (45) and for triple integrals (46). Began a new chapter on matrices: reviewed Gaussian elimination and basic ideas from linear algebra.
The lecture material is contained in Sections 4.8 and 6.1 of your text.
Problem Set 5 is due on Wednesday, November 20. See Problem Sets for more information.
Quiz 4 was returned and solutions were distributed.
Please review your notes and look over Section 6.2 for our next meeting.
November 13
Lecture overview: Described how to adapt some quadrature methods to double integrals; showed how this is done explicitly for the composite Simpson's rule and for Gaussian quadrature on rectangular domains; worked out an example of Gaussian quadrature to approximate a double integral with error 4 × 10^-7 using only 9 function evaluations; briefly described how this procedure can be generalized to nonrectangular domains, specifically domains of Type I.
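A sketch (illustrative, not ALG045) of tensor-product Gauss-Legendre quadrature on a rectangle, using NumPy's leggauss nodes and weights; the integrand and rectangle below are made up.

    import numpy as np

    def gauss_2d(f, a, b, c, d, n=3):
        # Approximate the integral of f over [a,b] x [c,d] with an n x n
        # tensor product of Gauss-Legendre nodes (n*n function evaluations).
        t, w = np.polynomial.legendre.leggauss(n)   # nodes and weights on [-1, 1]
        x = 0.5 * (b - a) * t + 0.5 * (a + b)       # map nodes to [a, b]
        y = 0.5 * (d - c) * t + 0.5 * (c + d)       # map nodes to [c, d]
        X, Y = np.meshgrid(x, y, indexing="ij")
        W = np.outer(w, w)
        return 0.25 * (b - a) * (d - c) * np.sum(W * f(X, Y))

    # Example integrand (made up): the exact value of the integral is (e - 1)^2.
    f = lambda x, y: np.exp(x + y)
    approx = gauss_2d(f, 0.0, 1.0, 0.0, 1.0, n=3)
    print(approx, (np.e - 1.0)**2)      # 9 evaluations, already very accurate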
The lecture material is contained in Section 4.8 of your text.
Problem Set 5 is due on Wednesday, November 20. See Problem Sets for more information.
Quiz 4 was given during the first 30 minutes of class.
Please review your notes and look over Section 4.8 for our next meeting.
November 11
Lecture overview: Happy Veterans Day! Introduced Gaussian quadrature as a quadrature method which seeks to choose weights and nodes in such a way as to maximize the degree of the polynomials for which the method provides an exact answer; worked out an example with 2 nodes; in general, we can expect n nodes and n weights to give an exact answer for polynomials of degree at most 2n-1; defined the Legendre polynomials on [-1,1] and used their roots and the corresponding Lagrange coefficient functions to prove the above theorem for Gaussian quadrature; worked out an example with n=3 and noted the accuracy.
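A quick numerical check (illustrative) that n Gauss-Legendre nodes and weights integrate polynomials of degree up to 2n-1 exactly on [-1,1]; the polynomial below is chosen arbitrarily.

    import numpy as np

    n = 3
    x, w = np.polynomial.legendre.leggauss(n)   # roots of P_3 and their weights

    # A degree-5 (= 2n - 1) polynomial, chosen arbitrarily.
    p = lambda t: 7*t**5 - 4*t**3 + 2*t**2 - t + 3
    exact = 2*(2/3) + 2*3                       # odd powers integrate to 0 on [-1, 1]
    print(np.sum(w * p(x)), exact)              # agree to machine precision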
The lecture material is contained in Section 4.7 of your text.
Problem Set 4 was returned in class. Problem Set 5 is due on Wednesday, November 20. See Problem Sets for more information.
Quiz 4 will be given during the first 15 minutes of class on Wednesday, November 13. See Announcements for more information.
Please review your notes and look over Section 4.8 for our next meeting.
November 6
Lecture overview: Demonstrated the use of ALG042 to apply Romberg integration to a given function f. Introduced the idea of adaptive quadrature methods, which concentrate refinement where it is needed in order to increase efficiency; derived an algorithm for an adaptive composite Simpson's Rule; this forms the basis of ALG043; worked out an example and compared the efficiency of the adaptive version with the standard composite Simpson's Rule.
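A minimal recursive sketch of adaptive Simpson quadrature in Python (the same idea as ALG043, though not that code); the factor 15 in the acceptance test and the integrand are common illustrative choices, not necessarily the ones used in class.

    import numpy as np

    def simpson(f, a, b):
        c = 0.5 * (a + b)
        return (b - a) / 6.0 * (f(a) + 4.0 * f(c) + f(b))

    def adaptive_simpson(f, a, b, tol):
        # Refine recursively until the two-half estimate agrees with the
        # whole-interval estimate to within about 15 times the tolerance.
        c = 0.5 * (a + b)
        whole = simpson(f, a, b)
        left, right = simpson(f, a, c), simpson(f, c, b)
        if abs(left + right - whole) < 15.0 * tol:
            return left + right + (left + right - whole) / 15.0
        return (adaptive_simpson(f, a, c, 0.5 * tol) +
                adaptive_simpson(f, c, b, 0.5 * tol))

    # Example (made up): integral of (100/x^2) sin(10/x) on [1, 3].
    f = lambda x: (100.0 / x**2) * np.sin(10.0 / x)
    print(adaptive_simpson(f, 1.0, 3.0, 1e-6))   # about -1.426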
The lecture material is contained in Sections 4.5 and 4.6 of your text.
Problem Set 4 was collected. Problem Set 5 was assigned and is due on Wednesday, November 20. See Problem Sets for more information.
Quiz 4 will be given during the first 15 minutes of class on Wednesday, November 13. See Announcements for more information.
Please review your notes and look over Section 4.7 for our next meeting.
November 4
Lecture overview: Derived composite Simpson's Rule with error term; worked out an example using ALG041, using the error term to determine the number of subintervals needed in the algorithm; discussed stability of numerical integration with respect to roundoff error. Introduced Romberg integration, which uses Richardson's extrapolation to increase the accuracy of a lower order numerical approximation method; worked out an example by hand which computed 5 levels of Romberg integration.
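A short Python sketch of the Romberg table (the idea behind ALG042, not that code); the integrand below is made up.

    import numpy as np

    def romberg(f, a, b, levels=5):
        # Return the Romberg table R[i][j], 0 <= j <= i < levels.
        R = [[0.0] * levels for _ in range(levels)]
        h = b - a
        R[0][0] = 0.5 * h * (f(a) + f(b))
        for i in range(1, levels):
            h *= 0.5
            # Composite trapezoid on 2**i subintervals, reusing R[i-1][0]:
            new = sum(f(a + (2*k - 1) * h) for k in range(1, 2**(i - 1) + 1))
            R[i][0] = 0.5 * R[i - 1][0] + h * new
            for j in range(1, i + 1):
                # Richardson extrapolation across the row:
                R[i][j] = R[i][j - 1] + (R[i][j - 1] - R[i - 1][j - 1]) / (4**j - 1)
        return R

    R = romberg(np.sin, 0.0, np.pi, levels=5)   # the exact integral is 2
    print(R[4][4])                              # accurate to many digits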
The lecture material is contained in Sections 4.4 and 4.5 of your text.
Problem Set 4 is due on Wednesday, November 6. See Problem Sets for more information.
Please review your notes and look over Section 4.5 for our next meeting.
October 30
Lecture overview: Derived Simpson's Rule with error term; stated the general theorem for closed and open Newton-Cotes formulas, which take advantage of a cancellation to obtain an order of convergence one higher than expected when n is even; defined the degree of precision of a quadrature formula, which can differ from that of the Newton-Cotes formulas when the spacing of the nodes is not even; defined composite integration and derived the composite Trapezoidal Rule with error term.
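For reference, an assumed Python sketch of the composite Trapezoidal Rule, whose error is O(h^2); the test integral is made up.

    import numpy as np

    def composite_trapezoid(f, a, b, n):
        # Composite trapezoidal rule with n subintervals of width h = (b-a)/n.
        x = np.linspace(a, b, n + 1)
        h = (b - a) / n
        return h * (0.5 * f(x[0]) + f(x[1:-1]).sum() + 0.5 * f(x[-1]))

    print(composite_trapezoid(np.sin, 0.0, np.pi, 200))   # close to 2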
The lecture material is contained in Sections 4.3 and 4.4 of your text.
Problem Set 4 is due on Wednesday, November 6. See Problem Sets for more information.
Please review your notes and look over Section 4.5 for our next meeting.
October 28
Lecture overview: Discussed Richardson's Extrapolation, which is a method to use low order approximations to derive higher order approximations through a recursive process; showed how to accomplish this when the quantity we are approximating has an expansion with all powers of h and an expansion with only even powers of h; worked out an example of each case. Introduced numerical quadrature as a method of numerical integration based on Lagrange polynomials of a certain degree; in the simplest cases, these lead to the familiar Trapezoidal Rule and Simpson's Rule from Calculus; showed how the integration of the first order Lagrange polynomial yields the Trapezoidal rule with error term.
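A small sketch (illustrative) of Richardson's extrapolation applied to the central difference approximation of f'(x0), whose error expansion contains only even powers of h, so each column of the table gains two orders of accuracy; the function and point below are made up.

    import numpy as np

    def richardson_derivative(f, x0, h, levels=4):
        # Extrapolate the central-difference estimates N[i][0] to higher order.
        N = [[0.0] * levels for _ in range(levels)]
        for i in range(levels):
            hi = h / 2**i
            N[i][0] = (f(x0 + hi) - f(x0 - hi)) / (2 * hi)   # O(h^2) estimate
            for j in range(1, i + 1):
                # Even-power expansion: divide by 4^j - 1 at stage j.
                N[i][j] = N[i][j - 1] + (N[i][j - 1] - N[i - 1][j - 1]) / (4**j - 1)
        return N[levels - 1][levels - 1]

    print(richardson_derivative(np.exp, 1.0, 0.4), np.e)    # nearly machine precision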
The lecture material is contained in Sections 4.2 and 4.3 of your text.
Problem Set 3 was returned. Problem Set 4 is due on Wednesday, November 6. See Problem Sets for more information.
Please review your notes and look over Section 4.3 for our next meeting.
October 23
Lecture overview: Introduced the problem of numerical differentiation; using a first order Lagrange polynomial gives us an error term for the difference quotient proportional to the step size h. We can also use higher degree Lagrange polynomials to derive n-point approximations to the derivative with an error term of order h^(n-1); derived the 3-point endpoint and 3-point midpoint approximations for the derivative; stated the 5-point midpoint approximation; discussed the role of rounding errors: generally, we should keep the rounding error at least one order of h smaller than the truncation error, otherwise the rounding error will dominate the calculation.
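A sketch of the 3-point formulas and of the step-size tradeoff just described: as h shrinks, the truncation error falls like h^2 but the rounding error grows like 1/h, so the computed error improves and then worsens. The function and evaluation point are made up.

    import numpy as np

    def midpoint_3pt(f, x0, h):
        # 3-point midpoint formula; the error term is O(h^2).
        return (f(x0 + h) - f(x0 - h)) / (2.0 * h)

    def endpoint_3pt(f, x0, h):
        # 3-point endpoint formula; the error term is O(h^2).
        return (-3.0 * f(x0) + 4.0 * f(x0 + h) - f(x0 + 2.0 * h)) / (2.0 * h)

    # Error of the midpoint formula for f = exp at x0 = 1 as h shrinks:
    for h in [1e-1, 1e-4, 1e-7, 1e-10, 1e-13]:
        err = abs(midpoint_3pt(np.exp, 1.0, h) - np.e)
        print(f"h = {h:.0e}   error = {err:.2e}")   # improves, then worsens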
The lecture material is contained in Section 4.1 of your text.
Problem Set 3 was collected and Problem Set 4 was assigned. Problem Set 4 is due on Wednesday, November 6. See Problem Sets for more information.
The Class Project was returned graded.
Please review your notes and look over Section 4.2 for our next meeting.
October 21
Lecture overview: Recalled Cubic Spline Interpolations of two types: natural spline and clamped spline, which differ in the boundary conditions we specify; derived the linear system associated with the clamped spline and showed how to incorporate the necessary boundary conditions; demonstrated the use of ALG035 to find the coefficients and to use them to graph the corresponding solution. Reviewed and collected the project completed last class.
The lecture material is contained in Section 3.5 of your text.
Problem Set 3 is due on Wednesday, October 23. See Problem Sets for more information.
Quiz 2 was returned and solutions were distributed.
Please review your notes and look over Section 4.1 for our next meeting.
October 16
The class worked on a project involving the production of a clamped spline for the normal distribution using ALG035. This project will be collected on Monday, October 21.
Problem Set 3 is due on Wednesday, October 23. See Problem Sets for more information.
Please review your notes and look over Section 3.5 for our next meeting.
October 14
Columbus Day: No Class.
October 9
Lecture overview: Defined Cubic Spline Interpolations of two types: natural spline and clamped spline, which differ in the boundary conditions we specify; presented the form of the cubic polynomials on each interval; for n+1 data points, this consists of 4n conditions for 4n unknowns. Derived relations between coefficients and eliminated all but one set of coefficients; we can write a linear system in this set of coefficients; we set up this linear system in the case of the natural spline and reduced it to a linear algebra problem; this is implemented in ALG034; computed an example using ALG034 and graphed the spline on Desmos.
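For comparison with ALG034 (this is a library call, not the course code), SciPy's CubicSpline can produce the natural spline directly; the data below are made up.

    import numpy as np
    from scipy.interpolate import CubicSpline

    # Illustrative data points (made up).
    x = np.array([0.0, 1.0, 2.0, 3.0])
    y = np.exp(x)

    s = CubicSpline(x, y, bc_type="natural")   # natural spline: S''(x0) = S''(xn) = 0
    print(s.c)        # piecewise cubic coefficients, one column per interval
    print(s(1.5))     # evaluate the spline between nodes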
The lecture material is contained in Section 3.5 of your text.
A project involving the production of a clamped spline for the normal distribution using ALG035 was handed out in class. This project will be completed during our next class on Wednesday, October 16.
Problem Set 3 is due on Wednesday, October 23. See Problem Sets for more information.
Quiz 2 was given during the first 20 minutes of class.
Please review your notes and look over Section 3.5 for our next meeting.
October 7
Lecture overview: Recalled the definition of the Hermite polynomial and started to work out an example by hand with only 3 data points; it becomes immediately clear that we need an algorithm to do a calculation of any reasonable size; introduced divided differences for the Hermite polynomial and identified an algorithm to produce the coefficients; contrasted this with the divided differences for Lagrange polynomials; computed an example using ALG033. Introduced cubic splines as another way to interpolate between data points; splines take a different approach: instead of a single interpolating polynomial whose degree varies with the number of data points, splines use a polynomial of fixed degree (3 in the cubic case) to interpolate between every two consecutive data points, with the pieces matching smoothly at each point. Motivated the need for this different approach using a data set whose Lagrange polynomial oscillates wildly, although the data is fairly regular.
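A sketch (not ALG033) of the divided-difference construction of the Hermite polynomial: each node is listed twice, and the first divided difference at a repeated node is replaced by the given derivative value. The data below are made up.

    import numpy as np

    def hermite_coefficients(x, f, fp):
        # Newton-form coefficients of the Hermite polynomial via divided
        # differences with repeated nodes; x, f, fp all have length n+1.
        z = np.repeat(x, 2)                     # each node listed twice
        m = len(z)
        Q = np.zeros((m, m))
        Q[:, 0] = np.repeat(f, 2)
        for i in range(1, m):
            if i % 2 == 1:                      # repeated node: use f'(x_k)
                Q[i, 1] = fp[i // 2]
            else:
                Q[i, 1] = (Q[i, 0] - Q[i - 1, 0]) / (z[i] - z[i - 1])
        for j in range(2, m):
            for i in range(j, m):
                Q[i, j] = (Q[i, j - 1] - Q[i - 1, j - 1]) / (z[i] - z[i - j])
        return z, np.diag(Q)                    # the coefficients are Q[i][i]

    # Illustrative data: f(x) = sin(x) and its derivative at three nodes.
    x = np.array([0.0, 0.5, 1.0])
    z, c = hermite_coefficients(x, np.sin(x), np.cos(x))

    # Evaluate the Newton form at t = 0.25:
    t, p, prod = 0.25, c[0], 1.0
    for k in range(1, len(c)):
        prod *= (t - z[k - 1])
        p += c[k] * prod
    print(p, np.sin(0.25))                      # very close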
The lecture material is contained in Sections 3.4 and 3.5 of your text.
Problem Set 2 was returned in class.
Problem Set 3 is due on Wednesday, October 23. See Problem Sets for more information.
Quiz 2 will be given during the first 15 minutes of class on Wednesday, October 9. See Announcements for more information.
Please review your notes and look over Section 3.5 for our next meeting.
October 2
Lecture overview: Reviewed the definition of divided differences and their role in determining the coefficients of Lagrange polynomials; worked out an example by hand in which we computed the full table of divided differences; demonstrated the use of ALG032 to compute the required coefficients; the algorithm only displays the top row of the table, although the other entries are implicitly computed. Defined Hermite polynomials as polynomial approximations that match both the function values and the slopes of the function at n+1 given points; this uniquely determines a polynomial of degree at most 2n+1; proved a formula for the degree 2n+1 Hermite polynomial in terms of the Lagrange coefficient functions L_{n,j}.
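A small Python sketch (not ALG032) of the divided-difference table; the diagonal entries are the coefficients of the Newton form. The data below are made up.

    import numpy as np

    def divided_differences(x, y):
        # Full table F with F[i][j] = f[x_{i-j}, ..., x_i];
        # the Newton-form coefficients are the diagonal entries F[i][i].
        n = len(x)
        F = np.zeros((n, n))
        F[:, 0] = y
        for j in range(1, n):
            for i in range(j, n):
                F[i, j] = (F[i, j - 1] - F[i - 1, j - 1]) / (x[i] - x[i - j])
        return F

    # Illustrative data from f(x) = exp(x):
    x = np.array([0.0, 0.5, 1.0, 1.5])
    F = divided_differences(x, np.exp(x))
    print(np.diag(F))        # Newton-form coefficients a_0, a_1, ..., a_n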
The lecture material is contained in Sections 3.3 and 3.4 of your text.
Problem Set 2 was collected. Problem Set 3 was assigned and is due on Wednesday, October 23. See Problem Sets for more information.
Quiz 2 will be given during the first 15 minutes of class on Wednesday, October 9. See Announcements for more information.
Please review your notes and look over Section 3.4 for our next meeting.
September 30
Lecture overview: Recalled Neville's Method, which allows us to recursively work out approximations using Lagrange polynomials; worked out a short example by hand; demonstrated the use of ALG031 to derive an approximation quickly using Neville's Method. Introduced the definition of divided differences of order 0, 1, 2, and so on; these turn out to provide the coefficients of Lagrange polynomials written in a particular form; derived expressions in terms of divided differences for the coefficients a_0, a_1 and a_2.
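A minimal Python sketch of Neville's Method (not ALG031); the tabulated data below are made up.

    import numpy as np

    def neville(x_nodes, y_nodes, x):
        # Neville's tableau: Q[i][j] is the value at x of the polynomial
        # interpolating on x_{i-j}, ..., x_i; the final answer is Q[n][n].
        n = len(x_nodes)
        Q = np.zeros((n, n))
        Q[:, 0] = y_nodes
        for j in range(1, n):
            for i in range(j, n):
                Q[i, j] = ((x - x_nodes[i - j]) * Q[i, j - 1]
                           - (x - x_nodes[i]) * Q[i - 1, j - 1]) / (x_nodes[i] - x_nodes[i - j])
        return Q

    # Illustrative: approximate cos(0.45) from tabulated values of cos.
    xs = np.array([0.0, 0.3, 0.6, 0.9])
    Q = neville(xs, np.cos(xs), 0.45)
    print(Q[3, 3], np.cos(0.45))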
The lecture material is contained in Sections 3.2 and 3.3 of your text.
Quiz 1 was returned and the solutions were briefly discussed.
Problem Set 2 is due on Wednesday, October 2. See Problem Sets for more information.
Please review your notes and look over Section 3.3 for our next meeting.
September 25
Lecture overview: Recalled the definition of Lagrange polynomials; stated a theorem which provides an error term resembling the error term for Taylor series. There are two main difficulties with using Lagrange polynomials: determining the error when working with data without a known function, and the fact that computing successive approximations entails increasing amounts of work. For the first, we adopt the convention that we compute successive approximations with increasing numbers of data points until the desired number of decimal places stabilizes. To reduce the computational work, we proved a recursive relation for Lagrange polynomials which allows us to leverage the work of computing degree n approximations to find degree n+1 approximations; this forms the basis for Neville's Method, which was briefly described.
The lecture material is contained in Sections 3.1 and 3.2 of your text.
Quiz 1 was given during the first 20 minutes of class.
Problem Set 2 is due on Wednesday, October 2. See Problem Sets for more information.
Please review your notes and look over Section 3.2 for our next meeting.
September 23
Lecture overview: Defined Lagrange polynomials interpolating between given data points. Worked out two examples of computing Lagrange polynomials. Compared their convergence and approximation properties to those of Taylor polynomials; since Taylor polynomials only converge within a determined radius of convergence, they are not useful for approximating certain functions; in these cases, Lagrange polynomials can be more useful.
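A direct (if inefficient) Python sketch of evaluating the Lagrange interpolating polynomial straight from its definition; the data below are made up.

    import numpy as np

    def lagrange_eval(x_nodes, y_nodes, x):
        # Sum y_k * L_{n,k}(x), where L_{n,k} is the k-th Lagrange coefficient function.
        total = 0.0
        n = len(x_nodes)
        for k in range(n):
            Lk = 1.0
            for j in range(n):
                if j != k:
                    Lk *= (x - x_nodes[j]) / (x_nodes[k] - x_nodes[j])
            total += y_nodes[k] * Lk
        return total

    # Illustrative: interpolate 1/(1+x) at a few nodes and evaluate at x = 1.5.
    xs = np.array([0.0, 1.0, 2.0, 3.0])
    print(lagrange_eval(xs, 1.0 / (1.0 + xs), 1.5), 1.0 / 2.5)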
The lecture material is contained in Section 3.1 of your text.
Problem Set 1 was returned and solutions were distributed. Several key points in the solutions were reviewed in class.
Problem Set 2 is due on Wednesday, October 2. See Problem Sets for more information.
Quiz 1 will be given during the first 15 minutes of class on Wednesday, September 25. See Announcements for more information.
Please review your notes and look over Section 3.2 for our next meeting.
September 18
Lecture overview: Introduced Muller's Method as a way to numerically approximate complex roots of polynomials; the motivation for this is that Newton's method yields real outputs whenever the input is real, so one has to know a priori to test complex inputs; the advantage of Muller's Method is that it will locate complex roots starting from real inputs. The main idea in Muller's algorithm is to use 3 points on the graph of f to determine a parabola and then use one of the roots of the parabola to approximate a root of f; update the 3 points with the approximate root and repeat. Demonstrated an application of this algorithm in MATLAB to find the 4 roots of a polynomial using 3 different starting triples as inputs.
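A Python sketch of Muller's Method (not the MATLAB code demonstrated in class): because the square root is taken in complex arithmetic, real starting values can lead to complex roots. The polynomial below is illustrative.

    import cmath

    def muller(f, p0, p1, p2, tol=1e-10, max_iter=50):
        # Fit a parabola through (p0, p1, p2) and take the root of the
        # parabola closer to p2; repeat with the three most recent points.
        for _ in range(max_iter):
            h1, h2 = p1 - p0, p2 - p1
            d1, d2 = (f(p1) - f(p0)) / h1, (f(p2) - f(p1)) / h2
            d = (d2 - d1) / (h2 + h1)
            b = d2 + h2 * d
            disc = cmath.sqrt(b * b - 4.0 * f(p2) * d)
            # Choose the denominator of larger magnitude (the closer root).
            denom = b + disc if abs(b - disc) < abs(b + disc) else b - disc
            p = p2 - 2.0 * f(p2) / denom
            if abs(p - p2) < tol:
                return p
            p0, p1, p2 = p1, p2, p
        return p

    # Illustrative quartic with a pair of complex conjugate roots:
    f = lambda x: x**4 - 3*x**3 + x**2 + x + 1
    print(muller(f, 0.5, -0.5, 0.0))    # converges to a complex root of f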
The lecture material is contained in Section 2.6 of your text.
Problem Set 1 was collected. Problem Set 2 was assigned and is due on Wednesday, October 2. See Problem Sets for more information.
Quiz 1 will be given during the first 15 minutes of class on Wednesday, September 25. See Announcements for more information.
Please review your notes and look over Section 3.1 for our next meeting.
September 16
Lecture overview: Reviewed Newton's method and discussed ways in which it can fail to converge; demonstrated the related MATLAB algorithm provided on the textbook website; proved that Newton's method converges quadratically to a root p when f'(p) is not 0; defined order of convergence and contrasted error bounds for linear and quadratic convergence; stated the theorem that fixed point iteration converges only linearly unless g'(p) = 0, which is a very special condition that Newton's method exploits.
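A few lines of Python illustrating Newton's method and its quadratic convergence; the function cos(x) - x and the starting value are made up for illustration.

    import numpy as np

    # Newton's method p_{n+1} = p_n - f(p_n)/f'(p_n) on f(x) = cos(x) - x;
    # the number of correct digits roughly doubles at each step,
    # reflecting quadratic convergence.
    f = lambda x: np.cos(x) - x
    fp = lambda x: -np.sin(x) - 1.0

    p = 1.0                      # starting approximation (made up)
    for _ in range(5):
        p = p - f(p) / fp(p)
        print(p)                 # converges to about 0.7390851332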
The lecture material is contained in Sections 2.3 and 2.4 of your text.
Problem Set 1 is due on Wednesday, September 18. See Problem Sets for more information.
Please review your notes and look over Section 2.6 for our next meeting.
September 11
Lecture overview: Discussed the connection between fixed points of functions and roots of a related function; proved a theorem guaranteeing the existence and uniqueness of a fixed point on an interval; introduced Fixed Point Iteration algorithm and under the same hypotheses as the theorem, showed that the algorithm converges, also deriving an error bound; demonstrated ALG023 in MATLAB to show how given a root-finding problem, different choices of the function g to be iterated can produce divergent or convergent sequences of approximations to the root. Introduced Newton's Method, stated the algorithm and worked through a brief example showing fast convergence.
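A tiny Python sketch of fixed point iteration; the equation and the particular choice of g below are illustrative (one convergent rearrangement among several possible ones, some of which diverge).

    import numpy as np

    # Fixed point iteration p_{n+1} = g(p_n).  As an illustration, a root of
    # x^3 + 4x^2 - 10 = 0 on [1, 2] can be found from the convergent choice
    # g(x) = sqrt(10 / (x + 4)).
    g = lambda x: np.sqrt(10.0 / (x + 4.0))

    p = 1.5                          # initial guess
    for _ in range(20):
        p = g(p)
    print(p, p**3 + 4*p**2 - 10)     # p is near 1.3652, residual near 0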
The lecture material is contained in Sections 2.2 and 2.3 of your text.
Problem Set 1 is due on Wednesday, September 18. See Problem Sets for more information.
Please review your notes and look over Sections 2.4 and 2.5 for our next meeting.
September 9
Lecture overview: Introduced terminology around the stability of an algorithm and the order of convergence of an algorithm; worked through an example of an unstable algorithm to show how errors can grow exponentially. Introduced the Bisection Method as the first numerical method to find roots of a continuous function on an interval; introduced the textbook-based webpage containing algorithms and demonstrated ALG022 for MATLAB, which is an algorithm that implements the Bisection Method.
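A minimal Python sketch of the Bisection Method (not ALG022, which is the MATLAB version demonstrated in class); the test function below is made up.

    def bisection(f, a, b, tol=1e-8, max_iter=100):
        # f must be continuous with f(a) and f(b) of opposite sign;
        # halve the bracketing interval until it is shorter than tol.
        fa = f(a)
        for _ in range(max_iter):
            p = a + 0.5 * (b - a)
            fp = f(p)
            if fp == 0.0 or 0.5 * (b - a) < tol:
                return p
            if fa * fp > 0:
                a, fa = p, fp     # the root lies in [p, b]
            else:
                b = p             # the root lies in [a, p]
        return p

    # Illustrative: a root of x^3 + 4x^2 - 10 on [1, 2].
    print(bisection(lambda x: x**3 + 4*x**2 - 10, 1.0, 2.0))   # about 1.36523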
The lecture material is contained in Sections 1.3 and 2.1 of your text.
Problem Set 1 is due on Wednesday, September 18. See Problem Sets for more information.
Please review your notes and read Sections 2.2 and 2.3 of your text for our next meeting.
Lecture overview:
The lecture material is contained in Section 1.1 of your text.
Problem Set 1 was assigned and is due on Wednesday, September 18. See Problem Sets for more information.
Please review your notes and read Sections 1.3 and 2.1 of your text for our next meeting.