Commit d7befd9

Merge branch 'main' of github.com:Pyomo/pyomo into finalize-release
2 parents 592f3d4 + 6edb983

File tree

17 files changed: +1511 additions, -252 deletions

doc/OnlineDocs/_templates/recursive-module.rst

Lines changed: 1 addition & 1 deletion
@@ -94,7 +94,7 @@ Library Reference
    :toctree:
    :template: recursive-module.rst
    :recursive:
-   {% for item in modules %}
+   {% for item in all_modules %}
    {# Need item != tests for Sphinx >= 8.0; !endswith(.tests) for < 8.0 #}
    {% if item != 'tests' and not item.endswith('.tests')
       and item != 'examples' and not item.endswith('.examples') %}
Lines changed: 117 additions & 12 deletions
@@ -1,16 +1,121 @@
 Covariance Matrix Estimation
-=================================
+============================
 
-If the optional argument ``calc_cov=True`` is specified for :class:`~pyomo.contrib.parmest.parmest.Estimator.theta_est`,
-parmest will calculate the covariance matrix :math:`V_{\theta}` as follows:
+The uncertainty in the estimated parameters is quantified using the covariance matrix.
+The diagonal of the covariance matrix contains the variances of the estimated parameters.
+Assuming Gaussian, independent, and identically distributed measurement errors, the
+covariance matrix of the estimated parameters can be computed using the following
+methods, which have been implemented in parmest.
 
-.. math::
-   V_{\theta} = 2 \sigma^2 H^{-1}
+1. Reduced Hessian Method
 
-This formula assumes all measurement errors are independent and identically distributed with
-variance :math:`\sigma^2`. :math:`H^{-1}` is the inverse of the Hessian matrix for an unweighted
-sum of least squares problem. Currently, the covariance approximation is only valid if the
-objective given to parmest is the sum of squared error. Moreover, parmest approximates the
-variance of the measurement errors as :math:`\sigma^2 = \frac{1}{n-l} \sum e_i^2` where :math:`n` is
-the number of data points, :math:`l` is the number of fitted parameters, and :math:`e_i` is the
-residual for experiment :math:`i`.
+When the objective function is the sum of squared errors (SSE),
+:math:`\text{SSE} = \sum_{i = 1}^n \left(y_{i} - \hat{y}_{i}\right)^2`,
+the covariance matrix is:
+
+.. math::
+   V_{\boldsymbol{\theta}} = 2 \sigma^2 \left(\frac{\partial^2 \text{SSE}}
+   {\partial \boldsymbol{\theta} \partial \boldsymbol{\theta}}\right)^{-1}_{\boldsymbol{\theta}
+   = \boldsymbol{\theta}^*}
+
+When the objective function is the weighted SSE (WSSE),
+:math:`\text{WSSE} = \frac{1}{2} \left(\mathbf{y} - f(\mathbf{x};\boldsymbol{\theta})\right)^\text{T}
+\mathbf{W} \left(\mathbf{y} - f(\mathbf{x};\boldsymbol{\theta})\right)`,
+the covariance matrix is:
+
+.. math::
+   V_{\boldsymbol{\theta}} = \left(\frac{\partial^2 \text{WSSE}}
+   {\partial \boldsymbol{\theta} \partial \boldsymbol{\theta}}\right)^{-1}_{\boldsymbol{\theta}
+   = \boldsymbol{\theta}^*}
+
+Here, :math:`V_{\boldsymbol{\theta}}` is the covariance matrix of the estimated
+parameters, :math:`y` are the observed measured variables, :math:`\hat{y}` are the
+predicted measured variables, :math:`n` is the number of data points,
+:math:`\boldsymbol{\theta}` are the unknown parameters, :math:`\boldsymbol{\theta}^*`
+are the estimates of the unknown parameters, :math:`\mathbf{x}` are the decision
+variables, and :math:`\mathbf{W}` is a diagonal matrix containing the inverse of the
+variance of the measurement errors, :math:`\sigma^2`. When the standard
+deviation of the measurement errors is not supplied by the user, parmest
+approximates the variance of the measurement errors as
+:math:`\sigma^2 = \frac{1}{n-l} \sum e_i^2`, where :math:`l` is the number of
+fitted parameters and :math:`e_i` is the residual for experiment :math:`i`.
+
+In parmest, this method computes the inverse of the Hessian by scaling the
+objective function (SSE or WSSE) with a constant probability factor.
+
+2. Finite Difference Method
+
+In this method, the covariance matrix, :math:`V_{\boldsymbol{\theta}}`, is
+calculated by applying the Gauss-Newton approximation to the Hessian,
+:math:`\frac{\partial^2 \text{SSE}}{\partial \boldsymbol{\theta} \partial \boldsymbol{\theta}}`
+or
+:math:`\frac{\partial^2 \text{WSSE}}{\partial \boldsymbol{\theta} \partial \boldsymbol{\theta}}`,
+leading to:
+
+.. math::
+   V_{\boldsymbol{\theta}} = \left(\sum_{i = 1}^n \mathbf{G}_{i}^{\mathrm{T}} \mathbf{W}
+   \mathbf{G}_{i} \right)^{-1}
+
+This method uses central finite differences to compute the Jacobian matrix,
+:math:`\mathbf{G}_{i}`, for experiment :math:`i`, which is the sensitivity of
+the measured variables with respect to the parameters, :math:`\boldsymbol{\theta}`.
+:math:`\mathbf{W}` is a diagonal matrix containing the inverse of the variance
+of the measurement errors, :math:`\sigma^2`.
+
+3. Automatic Differentiation Method
+
+Similar to the finite difference method, the covariance matrix is calculated as:
+
+.. math::
+   V_{\boldsymbol{\theta}} = \left( \sum_{i = 1}^n \mathbf{G}_{\text{kaug},\, i}^{\mathrm{T}}
+   \mathbf{W} \mathbf{G}_{\text{kaug},\, i} \right)^{-1}
+
+However, this method uses the model optimality (KKT) condition to compute the
+Jacobian matrix, :math:`\mathbf{G}_{\text{kaug},\, i}`, for experiment :math:`i`.
+
+The covariance matrix calculation is only supported with the built-in objective
+functions "SSE" or "SSE_weighted".
+
+In parmest, the covariance matrix can be calculated after defining the
+:class:`~pyomo.contrib.parmest.parmest.Estimator` object and estimating the unknown
+parameters using :meth:`~pyomo.contrib.parmest.parmest.Estimator.theta_est`. To
+estimate the covariance matrix (the default method is "finite_difference"), call
+the :meth:`~pyomo.contrib.parmest.parmest.Estimator.cov_est` function, e.g.,
+
+.. testsetup:: *
+    :skipif: not __import__('pyomo.contrib.parmest.parmest').contrib.parmest.parmest.parmest_available
+
+    # Data
+    import pandas as pd
+    data = pd.DataFrame(
+        data=[[1, 8.3], [2, 10.3], [3, 19.0],
+              [4, 16.0], [5, 15.6], [7, 19.8]],
+        columns=['hour', 'y'],
+    )
+
+    # Create the Experiment class
+    from pyomo.contrib.parmest.examples.rooney_biegler.rooney_biegler import RooneyBieglerExperiment
+
+    exp_list = []
+    for i in range(data.shape[0]):
+        exp_list.append(RooneyBieglerExperiment(data.loc[i, :]))
+
+.. doctest::
+    :skipif: not __import__('pyomo.contrib.parmest.parmest').contrib.parmest.parmest.parmest_available
+
+    >>> import pyomo.contrib.parmest.parmest as parmest
+    >>> pest = parmest.Estimator(exp_list, obj_function="SSE")
+    >>> obj_val, theta_val = pest.theta_est()
+    >>> cov = pest.cov_est()
+
+Optionally, one of three methods, "reduced_hessian", "finite_difference",
+or "automatic_differentiation_kaug", can be supplied for the covariance
+calculation, e.g.,
+
+.. doctest::
+    :skipif: not __import__('pyomo.contrib.parmest.parmest').contrib.parmest.parmest.parmest_available
+
+    >>> pest = parmest.Estimator(exp_list, obj_function="SSE")
+    >>> obj_val, theta_val = pest.theta_est()
+    >>> cov_method = "reduced_hessian"
+    >>> cov = pest.cov_est(method=cov_method)
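The finite difference calculation described above can be sketched independently of parmest. The sketch below uses the Rooney-Biegler data from these docs with the response function :math:`y = \theta_1 (1 - e^{-\theta_2 t})`; the parameter values are illustrative stand-ins for what ``theta_est()`` would return, and the helper names are hypothetical, not parmest API:

```python
import math

# Rooney-Biegler data (hour, y) from the example above
t_data = [1.0, 2.0, 3.0, 4.0, 5.0, 7.0]
y_data = [8.3, 10.3, 19.0, 16.0, 15.6, 19.8]

def y_hat(theta1, theta2, t):
    # Response function: y = theta1 * (1 - exp(-theta2 * t))
    return theta1 * (1.0 - math.exp(-theta2 * t))

def jacobian_row(theta1, theta2, t, h=1e-6):
    # Central finite differences of y_hat with respect to each parameter
    g1 = (y_hat(theta1 + h, theta2, t) - y_hat(theta1 - h, theta2, t)) / (2 * h)
    g2 = (y_hat(theta1, theta2 + h, t) - y_hat(theta1, theta2 - h, t)) / (2 * h)
    return g1, g2

# Illustrative parameter estimates (stand-ins for theta_est() output)
theta1, theta2 = 19.14, 0.53

# sigma^2 = SSE / (n - l), used when no measurement-error stdev is supplied
residuals = [y - y_hat(theta1, theta2, t) for t, y in zip(t_data, y_data)]
n, l = len(t_data), 2
sigma2 = sum(e * e for e in residuals) / (n - l)

# Accumulate A = sum_i G_i^T W G_i, with W = diag(1 / sigma^2)
a11 = a12 = a22 = 0.0
for t in t_data:
    g1, g2 = jacobian_row(theta1, theta2, t)
    a11 += g1 * g1 / sigma2
    a12 += g1 * g2 / sigma2
    a22 += g2 * g2 / sigma2

# V_theta = A^{-1}, via the closed-form inverse of a symmetric 2x2 matrix
det = a11 * a22 - a12 * a12
V = [[a22 / det, -a12 / det], [-a12 / det, a11 / det]]
```

The diagonal of ``V`` gives the parameter variances; in practice ``cov_est()`` performs this computation on the Pyomo model itself.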

doc/OnlineDocs/explanation/analysis/parmest/datarec.rst

Lines changed: 2 additions & 2 deletions
@@ -44,8 +44,8 @@ The following example returns model values from a Pyomo Expression.
 
     >>> # Define objective
     >>> def SSE(model):
-    ...     expr = (model.experiment_outputs[model.y]
-    ...             - model.response_function[model.experiment_outputs[model.hour]]
+    ...     expr = (model.experiment_outputs[model.y[model.hour]]
+    ...             - model.y[model.hour]
     ...     ) ** 2
     ...     return expr

doc/OnlineDocs/explanation/analysis/parmest/driver.rst

Lines changed: 28 additions & 23 deletions
@@ -16,19 +16,21 @@ the model and the observations (typically defined as the sum of squared
 deviation between model values and observed values).
 
 If the Pyomo model is not formatted as a two-stage stochastic
-programming problem in this format, the user can supply a custom
-function to use as the second stage cost and the Pyomo model will be
+programming problem in this format, the user can choose either the
+built-in "SSE" or "SSE_weighted" objective functions, or supply a custom
+objective function to use as the second stage cost. The Pyomo model will then be
 modified within parmest to match the required specifications.
-The stochastic programming callback function is also defined within parmest. The callback
-function returns a populated and initialized model for each scenario.
+The stochastic programming callback function is also defined within parmest.
+The callback function returns a populated and initialized model for each scenario.
 
-To use parmest, the user creates a :class:`~pyomo.contrib.parmest.parmest.Estimator` object
-which includes the following methods:
+To use parmest, the user creates a :class:`~pyomo.contrib.parmest.parmest.Estimator`
+object which includes the following methods:
 
 .. autosummary::
    :nosignatures:
 
    ~pyomo.contrib.parmest.parmest.Estimator.theta_est
+   ~pyomo.contrib.parmest.parmest.Estimator.cov_est
    ~pyomo.contrib.parmest.parmest.Estimator.theta_est_bootstrap
    ~pyomo.contrib.parmest.parmest.Estimator.theta_est_leaveNout
    ~pyomo.contrib.parmest.parmest.Estimator.objective_at_theta

@@ -65,16 +67,9 @@ Section.
         columns=['hour', 'y'],
     )
 
-    # Sum of squared error function
-    def SSE(model):
-        expr = (
-            model.experiment_outputs[model.y]
-            - model.response_function[model.experiment_outputs[model.hour]]
-        ) ** 2
-        return expr
-
     # Create an experiment list
     from pyomo.contrib.parmest.examples.rooney_biegler.rooney_biegler import RooneyBieglerExperiment
+
     exp_list = []
     for i in range(data.shape[0]):
         exp_list.append(RooneyBieglerExperiment(data.loc[i, :]))

@@ -83,15 +78,15 @@ Section.
     :skipif: not __import__('pyomo.contrib.parmest.parmest').contrib.parmest.parmest.parmest_available
 
     >>> import pyomo.contrib.parmest.parmest as parmest
-    >>> pest = parmest.Estimator(exp_list, obj_function=SSE)
+    >>> pest = parmest.Estimator(exp_list, obj_function="SSE")
 
 Optionally, solver options can be supplied, e.g.,
 
 .. doctest::
     :skipif: not __import__('pyomo.contrib.parmest.parmest').contrib.parmest.parmest.parmest_available
 
     >>> solver_options = {"max_iter": 6000}
-    >>> pest = parmest.Estimator(exp_list, obj_function=SSE, solver_options=solver_options)
+    >>> pest = parmest.Estimator(exp_list, obj_function="SSE", solver_options=solver_options)
 
 
 List of experiment objects

@@ -137,17 +132,20 @@ expressions that are used to build an objective for the two-stage
 stochastic programming problem.
 
 If the Pyomo model is not written as a two-stage stochastic programming problem in
-this format, and/or if the user wants to use an objective that is
-different than the original model, a custom objective function can be
-defined for parameter estimation. The objective function has a single argument,
-which is the model from a single experiment.
+this format, the user can select the "SSE" or "SSE_weighted" built-in objective
+functions. If the user wants to use an objective that is different from the built-in
+options, a custom objective function can be defined for parameter estimation; however,
+covariance matrix estimation does not support custom objective functions. The objective
+function (built-in or custom) has a single argument, which is the model from a single
+experiment.
 The objective function returns a Pyomo
 expression which is used to define "SecondStageCost". The objective
 function can be used to customize data points and weights that are used
 in parameter estimation.
 
-Parmest includes one built in objective function to compute the sum of squared errors ("SSE") between the
-``m.experiment_outputs`` model values and data values.
+Parmest includes two built-in objective functions ("SSE" and "SSE_weighted") to compute
+the sum of squared errors between the ``m.experiment_outputs`` model values and
+data values.
 
 Suggested initialization procedure for parameter estimation problems
 --------------------------------------------------------------------

@@ -162,4 +160,11 @@ estimation solve from the square problem solution, set optional argument ``solve
 argument ``(initialize_parmest_model=True)``. Different initial guess values for the fitted
 parameters can be provided using optional argument `theta_values` (**Pandas Dataframe**)
 
-3. Solve parameter estimation problem by calling :class:`~pyomo.contrib.parmest.parmest.Estimator.theta_est`
+3. Solve the parameter estimation problem by calling
+   :meth:`~pyomo.contrib.parmest.parmest.Estimator.theta_est`, e.g.,
+
+.. doctest::
+    :skipif: not parmest_available
+
+    >>> pest = parmest.Estimator(exp_list, obj_function="SSE")
+    >>> obj_val, theta_val = pest.theta_est()
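As described above, a custom objective takes the model for a single experiment and returns an expression used to define "SecondStageCost". The sketch below is hypothetical: the attribute names mimic the Rooney-Biegler example, and a plain ``SimpleNamespace`` stands in for a populated Pyomo model so the arithmetic can be shown without building a real model:

```python
from types import SimpleNamespace

def custom_sse(model):
    # Custom objective: single argument (the model from one experiment),
    # returning the squared deviation between the observation and the
    # model response. The attribute names here are illustrative only.
    return (model.y - model.response_function[model.hour]) ** 2

# Plain-Python stand-in for one populated experiment model (a real
# experiment would supply a Pyomo block with these attributes)
m = SimpleNamespace(y=10.3, hour=2, response_function={2: 9.8})
residual_sq = custom_sse(m)  # squared residual for this experiment
```

With parmest, the same function object would be passed as ``obj_function=custom_sse`` when constructing the ``Estimator``.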

doc/OnlineDocs/explanation/analysis/parmest/overview.rst

Lines changed: 17 additions & 8 deletions
@@ -41,23 +41,32 @@ that for most experiments, only small parts of :math:`x` will change
 from one experiment to the next.
 
 The following least squares objective can be used to estimate parameter
-values, where data points are indexed by :math:`s=1,\ldots,S`
+values assuming Gaussian, independent, and identically distributed measurement
+errors, where data points are indexed by :math:`s=1,\ldots,S`
 
 .. math::
 
    \min_{{\theta}} Q({\theta};{\tilde{x}}, {\tilde{y}}) \equiv \sum_{s=1}^{S}q_{s}({\theta};{\tilde{x}}_{s}, {\tilde{y}}_{s}) \;\;
 
-where
+where :math:`q_{s}({\theta};{\tilde{x}}_{s}, {\tilde{y}}_{s})` can be:
 
-.. math::
+1. Sum of squared errors
+
+   .. math::
+
+      q_{s}({\theta};{\tilde{x}}_{s}, {\tilde{y}}_{s}) =
+      \sum_{i=1}^{m}\left({\tilde{y}}_{s,i} - g_{i}({\tilde{x}}_{s};{\theta})\right)^{2}
+
+2. Weighted sum of squared errors
+
+   .. math::
 
-   q_{s}({\theta};{\tilde{x}}_{s}, {\tilde{y}}_{s}) = \sum_{i=1}^{m}w_{i}\left[{\tilde{y}}_{si} - g_{i}({\tilde{x}}_{s};{\theta})\right]^{2},
+      q_{s}({\theta};{\tilde{x}}_{s}, {\tilde{y}}_{s}) =
+      \sum_{i=1}^{m}\left(\frac{{\tilde{y}}_{s,i} - g_{i}({\tilde{x}}_{s};{\theta})}{w_i}\right)^{2}
 
 i.e., the contribution of sample :math:`s` to :math:`Q`, where :math:`w
-\in \Re^{m}` is a vector of weights for the responses. For
-multi-dimensional :math:`y`, this is the squared weighted :math:`L_{2}`
-norm and for univariate :math:`y` the weighted squared deviation.
-Custom objectives can also be defined for parameter estimation.
+\in \Re^{m}` is a vector containing the standard deviations of the measurement
+errors of :math:`y`. Custom objectives can also be defined for parameter estimation.
 
 In the applications of interest to us, the function :math:`g(\cdot)` is
 usually defined as an optimization problem with a large number of
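The two objective contributions above differ only in how residuals are scaled. A small, self-contained sketch (with made-up observations, predictions, and measurement-error standard deviations) illustrates the convention that :math:`w_i` divides the residual in the weighted form:

```python
# One sample s: observations, model predictions g_i(x_s; theta), and
# measurement-error standard deviations w_i (all values made up)
y_tilde = [2.1, 3.9, 6.2]
g = [2.0, 4.0, 6.0]
w = [0.1, 0.2, 0.4]

# 1. Sum of squared errors
q_sse = sum((yi - gi) ** 2 for yi, gi in zip(y_tilde, g))

# 2. Weighted sum of squared errors: each residual is divided by its
#    standard deviation, so noisier responses are down-weighted
q_wsse = sum(((yi - gi) / wi) ** 2 for yi, gi, wi in zip(y_tilde, g, w))
```

Note that a response with a small residual but a small :math:`w_i` can still dominate the weighted objective, which is the intended behavior for precise measurements.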

pyomo/_archive/__init__.py

Lines changed: 1 addition & 1 deletion
@@ -13,6 +13,6 @@
 official Pyomo API.
 
 These modules are still importable through their old names via
-:func:`pyomo.common.moved_module()`
+:func:`pyomo.common.deprecation.moved_module()`
 
 """

pyomo/common/collections/__init__.py

Lines changed: 4 additions & 4 deletions
@@ -10,10 +10,10 @@
 # ___________________________________________________________________________
 
 
-from collections.abc import MutableMapping, MutableSet, Mapping, Set, Sequence
-from collections import UserDict
+from collections import OrderedDict, UserDict
+from collections.abc import Mapping, MutableMapping, MutableSet, Sequence, Set
 
-from .orderedset import OrderedDict, OrderedSet
+from .bunch import Bunch
 from .component_map import ComponentMap, DefaultComponentMap
 from .component_set import ComponentSet
-from .bunch import Bunch
+from .orderedset import OrderedSet
Lines changed: 72 additions & 0 deletions
@@ -0,0 +1,72 @@
+# ___________________________________________________________________________
+#
+# Pyomo: Python Optimization Modeling Objects
+# Copyright (c) 2008-2025
+# National Technology and Engineering Solutions of Sandia, LLC
+# Under the terms of Contract DE-NA0003525 with National Technology and
+# Engineering Solutions of Sandia, LLC, the U.S. Government retains certain
+# rights in this software.
+# This software is distributed under the 3-clause BSD License.
+# ___________________________________________________________________________
+
+from collections import defaultdict
+
+
+class HashDispatcher(defaultdict):
+    """Dispatch table for generating "universal" hashing of all Python objects.
+
+    This class manages a dispatch table for providing hash functions for all Python
+    types.  When an object is passed to the Hasher, it determines the appropriate
+    hashing strategy based on the object's type:
+
+    - If a custom hashing function is registered for the type, it is used.
+    - If the object is natively hashable, the default hash is used.
+    - If the object is unhashable, the object's :func:`id()` is used as a fallback.
+
+    The Hasher also includes special handling for tuples by recursively applying the
+    appropriate hashing strategy to each element within the tuple.
+    """
+
+    def __init__(self, *args, **kwargs):
+        super().__init__(lambda: self._missing_impl, *args, **kwargs)
+        self[tuple] = self._tuple
+
+    def _missing_impl(self, val):
+        try:
+            hash(val)
+            self[val.__class__] = self._hashable
+        except:
+            self[val.__class__] = self._unhashable
+        return self[val.__class__](val)
+
+    @staticmethod
+    def _hashable(val):
+        return val
+
+    @staticmethod
+    def _unhashable(val):
+        return id(val)
+
+    def _tuple(self, val):
+        return tuple(self[i.__class__](i) for i in val)
+
+    def hashable(self, obj, hashable=None):
+        if isinstance(obj, type):
+            cls = obj
+        else:
+            cls = type(obj)
+        if hashable is None:
+            fcn = self.get(cls, None)
+            if fcn is None:
+                raise KeyError(obj)
+            return fcn is self._hashable
+        self[cls] = self._hashable if hashable else self._unhashable
+
+
+#: The global 'hasher' instance for managing "universal" hashing.
+#:
+#: This instance of the :class:`HashDispatcher` is used by
+#: :class:`~pyomo.common.collections.component_map.ComponentMap` and
+#: :class:`~pyomo.common.collections.component_set.ComponentSet` for
+#: generating hashes for all Python and Pyomo types.
+hasher = HashDispatcher()
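The dispatch behavior of the class above can be exercised on its own. The snippet below is a condensed copy of ``HashDispatcher`` (keeping the committed logic, including the bare ``except``) followed by a short usage demonstration:

```python
from collections import defaultdict

class HashDispatcher(defaultdict):
    # Condensed copy of the class committed above, for demonstration
    def __init__(self, *args, **kwargs):
        super().__init__(lambda: self._missing_impl, *args, **kwargs)
        self[tuple] = self._tuple

    def _missing_impl(self, val):
        # First sighting of a type: probe hashability and cache the strategy
        try:
            hash(val)
            self[val.__class__] = self._hashable
        except:
            self[val.__class__] = self._unhashable
        return self[val.__class__](val)

    @staticmethod
    def _hashable(val):
        return val

    @staticmethod
    def _unhashable(val):
        return id(val)

    def _tuple(self, val):
        # Recursively map each element to its hashable proxy
        return tuple(self[i.__class__](i) for i in val)

    def hashable(self, obj, hashable=None):
        cls = obj if isinstance(obj, type) else type(obj)
        if hashable is None:
            fcn = self.get(cls, None)
            if fcn is None:
                raise KeyError(obj)
            return fcn is self._hashable
        self[cls] = self._hashable if hashable else self._unhashable

hasher = HashDispatcher()

# Natively hashable objects pass through unchanged
assert hasher[int](42) == 42

# Unhashable objects fall back to their id()
lst = [1, 2, 3]
assert hasher[list](lst) == id(lst)

# Tuples are handled element-wise, so a tuple containing an unhashable
# element still produces a usable, hashable key
key = hasher[tuple]((1, lst))
assert key == (1, id(lst))
hash(key)  # does not raise

# The cached strategies can be queried after the fact
assert hasher.hashable(int) is True
assert hasher.hashable(list) is False
```

This id-based fallback is what lets ``ComponentMap`` and ``ComponentSet`` key on otherwise unhashable objects, at the cost of identity (not equality) semantics for those entries.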
