      Distributed Algorithms for Optimization in Networks
                 Angelia Nedić
                 January 27, 2022
                Angelia.Nedich@asu.edu
         Lecture 1, 27th LIDS Student Conference, MIT
             Distributed (Large-Scale) Optimization Problems: Sources
          ◮ Automatic Control Systems (robot networks)
             • Energy Systems
             • Envisioned Smart Grids and Smart Cities
          ◮ Signal and Image Processing (Image Reconstruction, Pattern Recognition)
          ◮ Data Science (Learning from Data)
                                  Machine Learning Problem
              ◮ Consider a prototype problem arising in supervised learning, where a machine (or
                 neural net) is trained from a large data set.
              ◮ The problem typically consists of minimizing some objective cost subject to a large
                 number of constraints of the following form:

                     minimize    ρ(x)
                     subject to  g(x; y_i, z_i) ≤ 0,   i = 1, ..., m,   x ∈ ℝⁿ,        (1)

                 where m is the number of data points (m ≫ 1), x ∈ ℝⁿ is a decision vector (the vector
                 of weights in neural nets), and the function ρ(·) is used to promote certain properties
                 of the solutions, such as sparsity or robustness.
              ◮ The function g(x; y_i, z_i) represents a constraint imposed by the data point
                 (y_i, z_i) ∈ ℝⁿ⁺¹, where y_i is a measurement and z_i is the label associated with
                 the measurement.
              ◮ For example, for linear classifiers, each data constraint is linear, i.e.,

                     g(x; y_i, z_i) = 1 − z_i ⟨y_i, x⟩,

                 while the labels z_i are binary (a small sketch of evaluating these constraints
                 follows this list).
              ◮ The difficulty in solving problem (1) lies in the large number m of constraints.
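To make the constraint structure concrete, here is a minimal NumPy sketch of evaluating the linear-classifier constraints g(x; y_i, z_i) = 1 − z_i ⟨y_i, x⟩ from the example above. It is not from the lecture; the synthetic data generation and all names (features, labels, x_true) are illustrative assumptions.

```python
# Minimal sketch (not from the lecture): evaluating the linear-classifier
# constraints g(x; y_i, z_i) = 1 - z_i * <y_i, x> of problem (1) on synthetic
# data. All names and the data generation below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, m = 5, 1000                            # decision dimension, number of data points
features = rng.standard_normal((m, n))    # measurements y_i, stacked row-wise
x_true = rng.standard_normal(n)
labels = np.sign(features @ x_true)       # binary labels z_i in {-1, +1}

def g(x, y, z):
    """Constraint values g(x; y_i, z_i) = 1 - z_i * <y_i, x>, one per data point."""
    return 1.0 - z * (y @ x)

x = np.zeros(n)                           # a candidate decision vector
violations = g(x, features, labels)
print("violated constraints:", int(np.sum(violations > 0)), "of", m)
```

Evaluating all m constraint values already costs one pass over the full data set, which is the bottleneck the strategies on the next slide try to avoid.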
                                Strategies
          ◮ Existing methods, developed prior to the emergence of such large problems, could
            not cope with this scale.
          ◮ To cope with the large number of constraints, there are two main conceptual
            approaches related to problem (1):
             • Penalty-Based Reformulation, which essentially replaces problem (1) with an
              unconstrained problem obtained by penalizing the constraints to form a new objec-
              tive function. The resulting unconstrained problem is not necessarily equivalent
              to the original constrained problem (1). (A sketch follows after this list.)
             • Sampled-Constraint Approximation, where the problem is either approximated
              or addressed directly by sampling the constraints “on-the-go” (within an algorithm).
              (See the second sketch below.)
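As a concrete illustration of the penalty-based reformulation, the sketch below folds the constraints of (1) into hinge penalties max(0, g_i(x)) and minimizes the resulting unconstrained objective by plain subgradient descent. This is an assumed instance, not the lecture's algorithm: ρ(x) is taken to be a squared-norm term, and the function and parameter names are illustrative.

```python
# Minimal sketch (an illustrative assumption, not the lecture's algorithm) of a
# penalty-based reformulation of problem (1): each constraint g_i(x) <= 0 is
# replaced by the hinge penalty max(0, g_i(x)), and the unconstrained objective
# rho(x) + lam * sum_i max(0, g_i(x)) is minimized by subgradient descent.
import numpy as np

def penalty_subgradient_descent(features, labels, lam=1.0, rho_weight=0.1,
                                steps=500, step_size=0.01):
    """rho(x) = rho_weight * ||x||^2 promotes small (robust) weights;
    g_i(x) = 1 - z_i * <y_i, x> are the linear classification constraints."""
    m, n = features.shape
    x = np.zeros(n)
    for _ in range(steps):
        margins = 1.0 - labels * (features @ x)   # g_i(x) for all i at once
        active = margins > 0                      # currently violated constraints
        # subgradient: grad rho(x) plus -z_i * y_i summed over active constraints
        grad = 2.0 * rho_weight * x \
               - lam * (labels[active, None] * features[active]).sum(axis=0)
        x -= step_size * grad
    return x
```

Consistent with the caveat in the bullet above, a minimizer of this penalized objective need not be feasible for the original problem (1); increasing lam enforces the constraints more strongly.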
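For the sampled-constraint approach, one simple instance (an assumption for illustration; the lecture names the idea but not a specific method) is a random constraint-projection scheme: at each iteration a single constraint is sampled and, if violated, the iterate is projected onto the half-space that constraint defines, so no iteration ever touches all m constraints at once.

```python
# Minimal sketch (an illustrative assumption; the lecture names the idea but not
# a specific method) of sampled-constraint approximation: at each iteration one
# constraint index i is drawn at random and, if g_i(x) > 0, the iterate is
# projected onto the half-space {x : 1 - z_i * <y_i, x> <= 0}.
import numpy as np

def random_constraint_projection(features, labels, steps=5000, seed=0):
    rng = np.random.default_rng(seed)
    m, n = features.shape
    x = np.zeros(n)
    for _ in range(steps):
        i = rng.integers(m)                   # sample one constraint "on-the-go"
        y_i, z_i = features[i], labels[i]
        violation = 1.0 - z_i * (y_i @ x)     # g(x; y_i, z_i)
        if violation > 0:
            # exact projection onto the sampled half-space
            x += violation * z_i * y_i / (y_i @ y_i)
    return x
```

The per-iteration cost here is O(n) regardless of m, which is what makes the sampling strategy attractive at large scale; handling the objective ρ(x) would add a subgradient step alongside the projection.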