08 Optim: Univariate (GSS, Brent)

[Photo: Sevan, author: Armen (photo link)]

📚 Material

ToDo
    • 🛠️📺 Practice session recording
    • 🛠️🗂️ Practice session PDF

🏡 Homework

Note
  1. ❗❗❗ DON’T CHECK THE SOLUTIONS BEFORE TRYING TO DO THE HOMEWORK BY YOURSELF❗❗❗
  2. Please don’t hesitate to ask questions; never forget about the 🍊karalyok🍊 principle!
  3. The harder the problem is, the more 🧀cheeses🧀 it has.
  4. Problems with 🎁 are just extra bonuses. It would be good to try to solve them, but it’s not the highest-priority task.
  5. If a problem involves many boring calculations, feel free to skip them; the important part is understanding the concepts.
  6. Submit your solutions here (even if they’re unfinished)

Golden Section Search Implementation 🧀🧀

Problem 1 🧀🧀 Implement Golden Section Search (GSS) in Python.

Given the function \(f(x) = x^4 - 3x^3 + 2x\) on the interval \([0, 3]\):

Starting point: You have 3 points:

    • Left endpoint: \(a\)
    • Right endpoint: \(b\)
    • Current guess: \(x_c\) (initially, could be the midpoint)

In each iteration: Generate one new guess \(x_{\text{new}}\) and evaluate \(f(x_{\text{new}})\). Compare with \(f(x_c)\) to decide which sub-interval to keep.

Part (a): Random vs Golden Ratio Selection

  1. Implement GSS with random guess selection: Generate the new guess \(x_{\text{new}}\) randomly within the current interval \([a, b]\) using np.random.uniform(a, b).

  2. Implement GSS with golden ratio (\(\phi = \frac{1 + \sqrt{5}}{2}\)): Generate the new guess \(x_{\text{new}}\) using the golden ratio formula.

  3. Compare both methods (a sketch of both selection rules follows this list):

    • Run each method with max_iter=50
    • Plot the convergence: iteration number vs. interval length
    • Plot the iteration number vs distance to true minimum
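
A minimal sketch covering items 1 and 2, using the one-interior-point formulation described above. The name gss_one_point, the rule keyword, and the seeded generator are our choices, not part of the problem statement:

```python
import numpy as np

PHI = (1 + np.sqrt(5)) / 2               # golden ratio

def gss_one_point(f, a, b, max_iter=50, rule="golden", seed=0):
    """One-interior-point GSS.

    rule="golden": x_new mirrors x_c across the interval (a + b - x_c),
    which keeps both interior points at the golden positions.
    rule="random": x_new ~ Uniform(a, b), as in np.random.uniform(a, b).
    Returns the history [(a, b, x_c), ...] for convergence plots.
    """
    rng = np.random.default_rng(seed)    # seeded for reproducibility
    x_c = a + (b - a) / PHI              # start at a golden interior point
    f_c = f(x_c)
    history = [(a, b, x_c)]
    for _ in range(max_iter):
        x_new = a + b - x_c if rule == "golden" else rng.uniform(a, b)
        if np.isclose(x_new, x_c):       # degenerate random draw: skip it
            continue
        f_new = f(x_new)
        # Order the two interior points and keep the sub-interval that
        # still brackets the smaller function value.
        lo, hi = sorted((x_c, x_new))
        f_lo, f_hi = (f_c, f_new) if x_c < x_new else (f_new, f_c)
        if f_lo < f_hi:
            b, x_c, f_c = hi, lo, f_lo   # minimum is in [a, hi]
        else:
            a, x_c, f_c = lo, hi, f_hi   # minimum is in [lo, b]
        history.append((a, b, x_c))
    return history

f = lambda x: x**4 - 3 * x**3 + 2 * x
hist_gold = gss_one_point(f, 0.0, 3.0, max_iter=50, rule="golden")
hist_rand = gss_one_point(f, 0.0, 3.0, max_iter=50, rule="random")
print(hist_gold[-1])                     # final bracket and guess
```

For the plots, the interval lengths are [b - a for a, b, _ in history]; on a semilog-y scale the golden variant should trace a straight line (each iteration multiplies the length by \(1/\phi\)), while the random variant shrinks erratically.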

Part (b): Maximum Iterations for Tolerance

For Golden Section Search on an interval \([a, b]\):

  1. Derive the formula for the number of iterations \(n\) needed to reduce the interval to length \(\epsilon\).

  2. Given \([a, b] = [0, 10]\) and tolerance \(\epsilon = 10^{-6}\), how many iterations are required?

  3. Implement a function max_iterations_for_tolerance(a, b, epsilon) that returns the required number of iterations (a sketch follows this list).

  4. Verify your formula by running GSS and checking when the interval length first becomes \(< \epsilon\).
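
For reference: each GSS iteration keeps a fraction \(1/\phi\) of the current interval, so after \(n\) iterations the length is \((b - a)/\phi^n\). Requiring \((b - a)/\phi^n \le \epsilon\) gives

\[n \ge \frac{\ln\big((b - a)/\epsilon\big)}{\ln \phi}.\]

For \([a, b] = [0, 10]\) and \(\epsilon = 10^{-6}\) this is \(n \ge \ln(10^7)/\ln\phi \approx 33.5\), i.e. 34 iterations. Below is a minimal sketch of the function from item 3; the exact count can differ by one depending on how your loop treats the initial evaluations:

```python
import math

PHI = (1 + math.sqrt(5)) / 2  # golden ratio

def max_iterations_for_tolerance(a, b, epsilon):
    """Smallest n with (b - a) / PHI**n <= epsilon, assuming the interval
    shrinks by a factor of 1/PHI per GSS iteration."""
    if b - a <= epsilon:
        return 0
    return math.ceil(math.log((b - a) / epsilon) / math.log(PHI))

print(max_iterations_for_tolerance(0, 10, 1e-6))  # -> 34
```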

Part (c): Early Stopping Criteria

Add both absolute and relative stopping criteria to your Golden Section Search implementation:

Absolute criterion: Stop if \(|f(x^{(k+1)}) - f(x^{(k)})| < \varepsilon_{\text{abs}}\)

Relative criterion: Stop if \(\frac{|f(x^{(k+1)}) - f(x^{(k)})|}{|f(x^{(k)})| + \delta} < \varepsilon_{\text{rel}}\)

where \(\delta = 10^{-10}\) is a small constant to avoid division by zero.

  1. Implement golden_section_search_with_stopping(f, a, b, max_iter, eps_abs, eps_rel) (a sketch follows this list)

  2. Test on \(f(x) = (x-2)^2 + 0.001\) on \([0, 4]\) with:

    • eps_abs = 1e-6
    • eps_rel = 1e-4
    • max_iter = 100
  3. Report which stopping criterion was triggered and at which iteration.
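
A minimal sketch of item 1, assuming the standard two-interior-point GSS update. The delta keyword and the order in which the criteria are checked (absolute first) are our choices:

```python
import math

PHI = (1 + math.sqrt(5)) / 2

def golden_section_search_with_stopping(f, a, b, max_iter=100,
                                        eps_abs=1e-6, eps_rel=1e-4,
                                        delta=1e-10):
    """Two-interior-point GSS with the stopping criteria from Part (c).

    Returns (x_best, reason, iteration), where reason is "absolute",
    "relative", or "max_iter". If both criteria hold at the same
    iteration, the absolute one is reported (it is checked first).
    """
    x1 = b - (b - a) / PHI            # ~ a + 0.382 (b - a)
    x2 = a + (b - a) / PHI            # ~ a + 0.618 (b - a)
    f1, f2 = f(x1), f(x2)
    x_best = x1 if f1 < f2 else x2
    f_prev = min(f1, f2)              # f(x^(k)) at the current best point
    for k in range(1, max_iter + 1):
        if f1 < f2:                   # minimum lies in [a, x2]
            b, x2, f2 = x2, x1, f1
            x1 = b - (b - a) / PHI
            f1 = f(x1)
        else:                         # minimum lies in [x1, b]
            a, x1, f1 = x1, x2, f2
            x2 = a + (b - a) / PHI
            f2 = f(x2)
        x_best = x1 if f1 < f2 else x2
        f_curr = min(f1, f2)
        if abs(f_curr - f_prev) < eps_abs:
            return x_best, "absolute", k
        if abs(f_curr - f_prev) / (abs(f_prev) + delta) < eps_rel:
            return x_best, "relative", k
        f_prev = f_curr
    return x_best, "max_iter", max_iter

x, reason, k = golden_section_search_with_stopping(
    lambda t: (t - 2) ** 2 + 0.001, 0.0, 4.0,
    max_iter=100, eps_abs=1e-6, eps_rel=1e-4)
print(f"stopped by the {reason} criterion at iteration {k}, x = {x:.6f}")
```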

Brent’s Method with Parabolic Interpolation 🧀🧀🧀

Problem 2 🧀🧀🧀 Implement Brent’s method using parabolic interpolation.

Key idea: Fit a parabola through three points and find its minimum.

Given three points \((x_1, f_1)\), \((x_2, f_2)\), \((x_3, f_3)\), we want to fit \(p(x) = ax^2 + bx + c\).

  1. Set up the system of linear equations (SLE): \[\begin{bmatrix} x_1^2 & x_1 & 1 \\ x_2^2 & x_2 & 1 \\ x_3^2 & x_3 & 1 \end{bmatrix} \begin{bmatrix} a \\ b \\ c \end{bmatrix} = \begin{bmatrix} f_1 \\ f_2 \\ f_3 \end{bmatrix}\]

  2. Solve using scipy.linalg.solve to get coefficients \([a, b, c]\).

  3. Implement a simplified Brent’s method (sketched after this list):

    • Start with three points from the bracket \([a, b]\)
    • Try parabolic interpolation
    • If it fails or gives an invalid result, fall back to Golden Section Search
    • Repeat until convergence
  4. Test on \(f(x) = x^4 - 3x^3 + 2\) on \([0, 3]\) and compare convergence with pure GSS.
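
A minimal sketch following the steps above. The names parabola_vertex and brent_simplified are ours, and the acceptance/fallback rules are deliberately simpler than real Brent’s method (no bookkeeping of previous step sizes):

```python
import numpy as np
from scipy.linalg import solve

PHI = (1 + np.sqrt(5)) / 2

def parabola_vertex(xs, fs):
    """Solve the 3x3 SLE for p(x) = a x^2 + b x + c through (xs, fs) and
    return the vertex -b / (2a), or None if there is no interior minimum."""
    A = np.array([[x * x, x, 1.0] for x in xs])
    a, b, _c = solve(A, np.asarray(fs, dtype=float))
    return None if a <= 0 else -b / (2 * a)

def brent_simplified(f, a, b, max_iter=100, tol=1e-8):
    """Try a parabolic step through (a, x_c, b); if the vertex is invalid
    (no minimum, outside the bracket, or on top of x_c), fall back to a
    golden-section point in the larger sub-interval."""
    x_c = a + (b - a) / PHI
    f_a, f_c, f_b = f(a), f(x_c), f(b)
    for _ in range(max_iter):
        if b - a < tol:
            break
        u = parabola_vertex([a, x_c, b], [f_a, f_c, f_b])
        if u is None or not (a < u < b) or abs(u - x_c) < 1e-12:
            # Golden-section fallback inside the larger sub-interval.
            u = x_c - (x_c - a) / PHI if x_c - a > b - x_c else x_c + (b - x_c) / PHI
        f_u = f(u)
        # Same bracket update as in Problem 1.
        lo, hi = sorted((x_c, u))
        f_lo, f_hi = (f_c, f_u) if x_c < u else (f_u, f_c)
        if f_lo < f_hi:
            b, f_b, x_c, f_c = hi, f_hi, lo, f_lo   # keep [a, hi]
        else:
            a, f_a, x_c, f_c = lo, f_lo, hi, f_hi   # keep [lo, b]
    return x_c, f_c

x_min, f_min = brent_simplified(lambda x: x**4 - 3 * x**3 + 2, 0.0, 3.0)
print(x_min, f_min)   # this quartic's minimum on [0, 3] is at x = 9/4
```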

🛠️ Practice session

The code is here

🎲 44 (08)
