
Master Numerical Optimization with Chegg: Homework Solution and Online Help



Exam problems will reflect the homework problems in style and content coverage. Problems can cover material presented in the slides, in addition to material from the homework. You are expected to understand slide material to the level of detail presented in the slides (but no more). You can be tested on the content of and solutions to homework problems.


Course: MATH 164, Optimization, Lecture 3, Fall 2016
Prerequisite: Math 115A. Not open for credit to students with credit for Electrical Engineering 136.
Course Content: Fundamentals of optimization. Linear programming: basic solutions, algorithms (simplex method, ellipsoid method, interior-point methods). Gradient descent, Newton's method, conjugate gradient methods. Least squares. Unconstrained optimization. Semidefinite programming. Calculus of variations.
Last update: 15 October 2016




Numerical Optimization Homework Solution



  • Honor code: You are encouraged to discuss homework assignments with each other, but write up solutions on your own! Sharing or copying a solution will result in a zero score for the relevant assignment for both parties.

  • If a significant part of your solution is due to someone else or from other sources (books, forums, etc), you should acknowledge the source! Failure to do so will result in a zero score for the relevant assignment.



Goals: The course focuses on the formulation, solution, and analysis of nonlinear optimization problems. It illustrates the difference between well-posed and ill-posed problems and how their solutions are characterized. Modern techniques for solving nonlinear optimization problems are discussed in detail.


This course will cover a wide range of topics in numerical optimization. The major goal is to learn a set of tools that will be useful for research in Artificial Intelligence and Computer Graphics. This graduate-level course combines instruction in basic material, written homework assignments, and a final project. The course material integrates the theory of optimization with concrete real applications. Grading is based on homework (50%) and the final project (50%).


We will see that the most expensive step in these optimization algorithms is invariably the solution of large systems of linear equations Ax = b, where A is often a sparse matrix. This brings us to the subject matter of the second part of this course.
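
As a concrete illustration (a minimal sketch, not taken from the course materials; the tridiagonal matrix below is an arbitrary stand-in for the sparse systems that arise in practice), here is how such a system can be solved with SciPy, both by a direct sparse factorization and by the conjugate gradient method that appears in the course outline:

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import cg, spsolve

n = 1000
# A sparse, symmetric positive-definite tridiagonal matrix: a stand-in
# for the Hessian or KKT systems that arise inside optimization methods.
A = diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csc")
b = np.ones(n)

x_direct = spsolve(A, b)            # direct sparse factorization
x_cg, info = cg(A, b, atol=1e-10)   # iterative conjugate gradient; info == 0 on success

print(np.linalg.norm(A @ x_direct - b), info)
```

For large, well-conditioned symmetric positive-definite systems, the iterative solver often scales better than the direct factorization, since it only needs matrix-vector products with A.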


The optimization result is represented as an OptimizeResult object. Important attributes are: x, the solution array; success, a Boolean flag indicating whether the optimizer exited successfully; and message, which describes the cause of the termination. See OptimizeResult for a description of other attributes.
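
For instance (an illustrative example, assuming SciPy's scipy.optimize.minimize; the Rosenbrock objective is a standard test problem, not one from the homework):

```python
import numpy as np
from scipy.optimize import minimize

# Rosenbrock function, a classic unconstrained test problem.
def rosen(v):
    x, y = v
    return (1 - x) ** 2 + 100 * (y - x ** 2) ** 2

res = minimize(rosen, x0=np.array([-1.2, 1.0]), method="BFGS")

print(res.x)        # the solution array
print(res.success)  # Boolean flag: did the optimizer exit successfully?
print(res.message)  # human-readable cause of termination
```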


Method trust-constr is a trust-region algorithm for constrained optimization. It switches between two implementations depending on the problem definition. It is the most versatile constrained minimization algorithm implemented in SciPy and the most appropriate for large-scale problems. For equality-constrained problems it is an implementation of the Byrd-Omojokun trust-region SQP method described in [17] and in [5], p. 549. When inequality constraints are imposed as well, it switches to the trust-region interior point method described in [16]. This interior point algorithm, in turn, handles inequality constraints by introducing slack variables and solving a sequence of equality-constrained barrier problems for progressively smaller values of the barrier parameter. The previously described equality-constrained SQP method is used to solve the subproblems with increasing levels of accuracy as the iterate gets closer to a solution.
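
A short sketch of invoking this method through scipy.optimize.minimize; the objective and the inequality constraint below are illustrative choices, not taken from the course:

```python
import numpy as np
from scipy.optimize import minimize, NonlinearConstraint

# Simple quadratic objective whose unconstrained minimum lies outside
# the feasible region, so the constraint will be active at the solution.
def f(v):
    x, y = v
    return (x - 1) ** 2 + (y - 2.5) ** 2

# Inequality constraint x^2 + y^2 <= 4, handled internally via slack
# variables and a sequence of barrier subproblems, as described above.
circle = NonlinearConstraint(lambda v: v[0] ** 2 + v[1] ** 2, -np.inf, 4.0)

res = minimize(f, x0=np.array([0.0, 0.0]), method="trust-constr",
               constraints=[circle])
print(res.x, res.status, res.message)
```

Gradients and constraint Jacobians are estimated by finite differences here; supplying them analytically typically speeds up convergence.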


Many situations of interest to researchers require solutions to equations that cannot be solved exactly or are too complicated to be modeled quickly and easily. An example of the latter is the three-dimensional simulation of particle motion in flowing liquids. In situations like this, computational methods are used to produce theoretical models that provide behavioral insight and predictive power under new conditions. It's not unusual for introductory courses in computational methods to be tied in with a numerical computing platform like MATLAB, covering topics such as those in the course outline below.


  • Each homework assignment contains both theoretical questions and programming components. You are required to use Python for the programming portions. There are a number of excellent tutorials for getting started with Python. You may use any numerical linear algebra package, but you may not use machine learning libraries (e.g., sklearn, pytorch, tensorflow) unless otherwise specified (later in the course). Your code should be submitted as an executable script (e.g., *.py) and not pasted into a typeset document.

  • You must submit your HW as a typed PDF document typeset in LaTeX (not handwritten). See "Learn LaTeX in 30 minutes." Use an online editor or install and use LaTeX on your local machine (recommended). Also note that LaTeX is installed on department-run machines.

The first homework (10 points) is designed to be very easy; its purpose is to get you comfortable with Python and LaTeX. There will be generous office hours for assistance.


HONOR CODE: As we sometimes reuse problem set questions from previous years, whose solutions may be covered by papers and webpages, we expect students not to copy, refer to, or look at the solutions in preparing their answers (referring to unauthorized material is considered a violation of the honor code). Similarly, we expect students not to google directly for answers. The homework is there to help you think about the material, and we expect you to make an honest effort to solve the problems. If you do happen to use other material, it must be acknowledged clearly with a citation in the submitted solution. For more information, please see the CSE Academic Misconduct policy that this course adheres to.


Weekly homework assignments, due each Friday at midnight, starting the second week. Homework assignments (and later, solutions) are posted on Ed. We will use Gradescope for homework submission, with the details on Ed. Late homework will not be accepted. You are allowed, even encouraged, to work on the homework in small groups, but you must write up your own homework to hand in. Each question on the homework will be graded on a scale of 0, 1, 2.


Good knowledge of linear algebra (as in EE263) and probability. Exposure to numerical computing, optimization, and application fields is helpful but not required; the applications will be kept basic and simple.


  • Description of course: We will cover as many topics as possible, but several of them will necessarily be skipped due to time limits. Characterization of solutions (such as optimality conditions in optimization) and convergence analysis of the algorithms will be essential to this course. We give below a partial list of topics and algorithms to be treated in connection with three general classes of problems. Unconstrained optimization:

  • Steepest-descent method

  • Newton-like methods

  • Quasi-Newton methods

  • Linear/nonlinear conjugate gradient methods

  • Interval reduction methods

  • Line-search methods

  • Trust-region methods

  • Local and global convergence

  • Nonlinear equations:

  • Newton's method

  • Modified Newton's methods

  • Broyden's (quasi-Newton) method

  • Inexact Newton methods

  • The bisection method

  • Line-search methods and merit functions

  • Trust-region methods

  • Local and global convergence

  • Constrained optimization:

  • Lagrange multipliers

  • Karush-Kuhn-Tucker conditions

  • Line-search methods and merit functions

  • Active-set methods (for inequality constraints)

  • Penalty function methods (for equality constraints)

  • Reduced-gradient and gradient-projection methods

  • Augmented Lagrangian and projected Lagrangian methods

  • Barrier methods (for inequality constraints)

  • Interior-point methods (for inequality constraints)

  • Sequential linearly constrained programming

  • Sequential quadratic programming

  • More topics in optimization:

  • Convexity

  • Linear programming and the simplex method

  • Quadratic programming

  • Duality

  • Nonlinear least-squares problems

  • Variational calculus

  • Nonsmooth optimization

  • Dynamic optimization and the maximum principle of Pontryagin

  • Dynamic programming and the Hamilton-Jacobi-Bellman equation

  • Neural networks and the backpropagation algorithm

  • Stochastic optimization

  • Simulated annealing

  • Genetic algorithms


Objectives and goals of the course: This course is at a graduate level, and it is assumed that you can work through the course in an independent fashion. The course will cover modern optimization techniques for both constrained and unconstrained optimization with continuous (as opposed to discrete) variables. Given its strong links to optimization techniques, the numerical solution of nonlinear equations will also be considered. At the end of the course the student should master some essential issues in numerical optimization.


Homework: Will be assigned approximately weekly. Presentation of your results is very important; scratch paper will not be accepted. Do not expect good grades if your solution to a problem is poorly communicated. As with everything, if you cannot explain something in great detail, you certainly have not fully understood it. The importance of doing homework cannot be overemphasized: most people learn by doing, not only by watching and/or listening. Late homework may not be accepted; you need to request permission first or provide a reasonable justification. Late homework is not accepted once a correction has been given out.

