08 Optim: Univariate (GSS, Brent)
Sevan, photo link, author: Armen
📚 Materials
ToDo - 🛠️📺 Practice session recording - 🛠️🗂️ Practice session PDF
🏡 Homework
- ❗❗❗ DON’T CHECK THE SOLUTIONS BEFORE TRYING TO DO THE HOMEWORK BY YOURSELF❗❗❗
- Please don’t hesitate to ask questions, never forget about the 🍊karalyok🍊 principle!
- The harder the problem is, the more 🧀cheeses🧀 it has.
- Problems with 🎁 are just extra bonuses. It would be good to try to solve them, but they are not the highest-priority task.
- If a problem involves many boring calculations, feel free to skip them; the important part is understanding the concepts.
- Submit your solutions here (even if they're unfinished)
Golden Section Search Implementation 🧀🧀
Problem 1 🧀🧀 Implement Golden Section Search (GSS) in Python.
Given the function \(f(x) = x^4 - 3x^3 + 2x\) on the interval \([0, 3]\):
Starting point: You have 3 points:
- Left endpoint: \(a\)
- Right endpoint: \(b\)
- Current guess: \(x_c\) (initially, this could be the midpoint)
In each iteration: Generate one new guess \(x_{\text{new}}\) and evaluate \(f(x_{\text{new}})\). Compare with \(f(x_c)\) to decide which sub-interval to keep.
Part (a): Random vs Golden Ratio Selection
Implement GSS with random guess selection: generate the new guess \(x_{\text{new}}\) randomly within the current interval \([a, b]\) using `np.random.uniform(a, b)`.
Implement GSS with the golden ratio (\(\phi = \frac{1 + \sqrt{5}}{2}\)): generate the new guess \(x_{\text{new}}\) using the golden ratio formula.
Compare both methods:
- Run each method with `max_iter=50`
- Plot the convergence: iteration number vs. interval length
- Plot the iteration number vs. distance to the true minimum
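A minimal sketch of the three-point scheme described above, keeping the bracket \([a, b]\) and one interior guess \(x_c\); the constant \(2 - \phi \approx 0.382\) places the golden-ratio guess that fraction of the way into the larger sub-interval. The name `gss` and the `rule` parameter are illustrative, not part of the assignment:

```python
import numpy as np

PHI = (1 + np.sqrt(5)) / 2      # golden ratio
C = 2 - PHI                     # ~0.382: golden-ratio placement fraction

def gss(f, a, b, max_iter=50, rule="golden"):
    """Minimize a unimodal f on [a, b]; rule is 'golden' or 'random'.
    Returns the final guess and the interval length after each iteration."""
    x_c = (a + b) / 2           # initial guess: the midpoint
    f_c = f(x_c)
    lengths = []
    for _ in range(max_iter):
        if rule == "random":
            x_new = np.random.uniform(a, b)
        elif b - x_c > x_c - a:  # put the new point in the larger side
            x_new = x_c + C * (b - x_c)
        else:
            x_new = x_c - C * (x_c - a)
        f_new = f(x_new)
        if f_new < f_c:          # x_new is the new best: drop the far side of x_c
            a, b = (x_c, b) if x_new > x_c else (a, x_c)
            x_c, f_c = x_new, f_new
        else:                    # x_c stays best: drop the far side of x_new
            a, b = (a, x_new) if x_new > x_c else (x_new, b)
        lengths.append(b - a)
    return x_c, lengths

f = lambda x: x**4 - 3 * x**3 + 2 * x
x_best, lengths = gss(f, 0.0, 3.0)
```

Plotting `lengths` for both rules on a log scale (e.g. `matplotlib.pyplot.semilogy`) makes the convergence comparison easy to read.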
Part (b): Maximum Iterations for Tolerance
For Golden Section Search on an interval \([a, b]\), derive the formula for the number of iterations \(n\) needed to reduce the interval to length \(\epsilon\).
Given \([a, b] = [0, 10]\) and tolerance \(\epsilon = 10^{-6}\), how many iterations are required?
Implement a function `max_iterations_for_tolerance(a, b, epsilon)` that returns the required number of iterations. Verify your formula by running GSS and checking when the interval length first becomes \(< \epsilon\).
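One possible derivation, assuming each GSS iteration shrinks the bracket by the standard factor \(1/\phi\): after \(n\) iterations the length is \((b-a)/\phi^n\), so we need \(n \geq \log_\phi \frac{b-a}{\epsilon}\). A sketch:

```python
import math

def max_iterations_for_tolerance(a, b, epsilon):
    """Smallest n with (b - a) / phi**n < epsilon, assuming the bracket
    shrinks by a factor of 1/phi per iteration (the standard GSS rate)."""
    phi = (1 + math.sqrt(5)) / 2
    return math.ceil(math.log((b - a) / epsilon) / math.log(phi))

print(max_iterations_for_tolerance(0, 10, 1e-6))
```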
Part (c): Early Stopping Criteria
Add both absolute and relative stopping criteria to your Golden Section Search implementation:
Absolute criterion: Stop if \(|f(x^{(k+1)}) - f(x^{(k)})| < \varepsilon_{\text{abs}}\)
Relative criterion: Stop if \(\frac{|f(x^{(k+1)}) - f(x^{(k)})|}{|f(x^{(k)})| + \delta} < \varepsilon_{\text{rel}}\)
where \(\delta = 10^{-10}\) is a small constant to avoid division by zero.
Implement `golden_section_search_with_stopping(f, a, b, max_iter, eps_abs, eps_rel)`. Test on \(f(x) = (x-2)^2 + 0.001\) on \([0, 4]\) with:
- `eps_abs = 1e-6`
- `eps_rel = 1e-4`
- `max_iter = 100`
Report which stopping criterion was triggered and at which iteration.
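A sketch of one reasonable reading, where \(x^{(k)}\) is the best point after iteration \(k\) and the criteria are checked whenever it improves (an assumption; the assignment leaves the exact bookkeeping open). The initial interior point is placed by the golden ratio rather than at the midpoint, since the midpoint of \([0, 4]\) happens to be the exact minimizer of the test function:

```python
import numpy as np

def golden_section_search_with_stopping(f, a, b, max_iter=100,
                                        eps_abs=1e-6, eps_rel=1e-4):
    """GSS with early stopping on the change in the best function value.
    Returns (x_best, reason, iterations_used)."""
    C = 2 - (1 + np.sqrt(5)) / 2     # golden-ratio placement fraction
    delta = 1e-10                    # guards the relative test against /0
    x_c = a + C * (b - a)            # initial interior point
    f_c = f(x_c)
    for k in range(1, max_iter + 1):
        # new guess in the larger sub-interval (golden-ratio rule)
        if b - x_c > x_c - a:
            x_new = x_c + C * (b - x_c)
        else:
            x_new = x_c - C * (x_c - a)
        f_new = f(x_new)
        if f_new < f_c:              # best point improves: x^(k+1) = x_new
            change = f_c - f_new     # |f(x^(k+1)) - f(x^(k))|
            rel = change / (abs(f_c) + delta)
            a, b = (x_c, b) if x_new > x_c else (a, x_c)
            x_c, f_c = x_new, f_new
            if change < eps_abs:
                return x_c, "absolute", k
            if rel < eps_rel:
                return x_c, "relative", k
        else:                        # best point unchanged; shrink toward x_c
            a, b = (a, x_new) if x_new > x_c else (x_new, b)
    return x_c, "max_iter", max_iter

f = lambda x: (x - 2) ** 2 + 0.001
print(golden_section_search_with_stopping(f, 0, 4))
```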
Brent’s Method with Parabolic Interpolation 🧀🧀🧀
Problem 2 🧀🧀🧀 Implement Brent’s method using parabolic interpolation.
Key idea: Fit a parabola through three points and find its minimum.
Given three points \((x_1, f_1)\), \((x_2, f_2)\), \((x_3, f_3)\), we want to fit \(p(x) = ax^2 + bx + c\).
Set up the system of linear equations (SLE): \[\begin{bmatrix} x_1^2 & x_1 & 1 \\ x_2^2 & x_2 & 1 \\ x_3^2 & x_3 & 1 \end{bmatrix} \begin{bmatrix} a \\ b \\ c \end{bmatrix} = \begin{bmatrix} f_1 \\ f_2 \\ f_3 \end{bmatrix}\]
Solve using `scipy.linalg.solve` to get the coefficients \([a, b, c]\); the fitted parabola's minimum is then at \(x^{*} = -b/(2a)\) (using the fitted coefficients, and requiring \(a > 0\)). Implement a simplified Brent's method (a sketch follows at the end of this problem):
- Start with three points from the bracket \([a, b]\)
- Try parabolic interpolation
- If it fails or gives invalid result, use Golden Section Search
- Repeat until convergence
Test on \(f(x) = x^4 - 3x^3 + 2\) on \([0, 3]\) and compare convergence with pure GSS.
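A minimal sketch of the fallback scheme above; `fit_parabola` and `simple_brent` are illustrative names, and the rejection tests (non-convex fit, vertex outside the bracket, vertex nearly coinciding with an existing point) are one reasonable choice rather than the full safeguards of Brent's method:

```python
import numpy as np
from scipy.linalg import solve

def fit_parabola(xs, fs):
    """Coefficients [a, b, c] of p(x) = a*x**2 + b*x + c through 3 points."""
    A = np.column_stack([np.square(xs), xs, np.ones(3)])
    return solve(A, fs)

def simple_brent(f, a, b, tol=1e-8, max_iter=100):
    """Try the fitted parabola's vertex; fall back to a golden-section
    step whenever the fit is unusable."""
    C = 2 - (1 + np.sqrt(5)) / 2
    x2 = a + C * (b - a)                      # interior point of the bracket
    for _ in range(max_iter):
        if b - a <= tol:
            break
        pa, pb, _ = fit_parabola(np.array([a, x2, b]),
                                 np.array([f(a), f(x2), f(b)]))
        x_new = -pb / (2 * pa) if pa > 0 else None   # vertex of the parabola
        # reject vertices outside the bracket or (nearly) on top of x2
        if x_new is None or not (a < x_new < b) or abs(x_new - x2) < 1e-12:
            x_new = x2 + C * (b - x2) if b - x2 > x2 - a else x2 - C * (x2 - a)
        if f(x_new) < f(x2):                  # keep the side containing x_new
            a, b = (x2, b) if x_new > x2 else (a, x2)
            x2 = x_new
        else:                                 # keep the side containing x2
            a, b = (a, x_new) if x_new > x2 else (x_new, b)
    return x2

f = lambda x: x**4 - 3 * x**3 + 2
print(simple_brent(f, 0.0, 3.0))   # the true minimizer is x = 9/4
```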
🛠️ Practice
The code is here