# Deformation (congruence) analysis - statistical tests

Hi Micha,

I have two concerns about the congruence analysis package.

1. It seems to me that the global congruence test (detection test) is not performed for datum points in JAG3D; instead, single-datum-point displacements are immediately tested individually (local tests) to identify possibly unstable datum point(s). Am I right?

2. The a posteriori test statistics for single datum points, T_post,j, are tested using an F-test, where the degrees of freedom are: d1 = m (the degrees of freedom of the quadratic form in the numerator; in fact, the rank of the cofactor matrix in the numerator) and d2 = f (the redundancy of the congruence model) - m. In other words, we obtain d2 = the redundancy of the network model from epoch 0 + the redundancy of the network model from epoch 1. This does not raise my concerns.

2.1. However, this is not the case for the a posteriori test statistics for single object points, T_post,k: here d2 = f. It seems to me that it should be, as before, d2 = f - m or, equivalently, d2 = the redundancy of the network model from epoch 0 + the redundancy of the network model from epoch 1. This is because the estimate of the variance factor (in the denominator of T_post,k) should be based on the unconstrained model (i.e., the model of the combined adjustment as presented, e.g., in Caspary's (2000) monograph on pp. 121-123), not on the congruence model. Furthermore, Lehmann and Lösler (2017), 'Congruence analysis of geodetic networks – hypothesis tests versus model selection by information criteria', write on p. 274: '... σ̂² is the estimate of σ². Note that they are external estimates, i.e., they must be computed in the unconstrained model, in the sense of least squares.' Caspary's experiments also seem to confirm this statement, e.g., d2 = 58 (the redundancy of the network model from epoch 0 (f0 = 29) + the redundancy of the network model from epoch 1 (f1 = 29)) in the second equation on p. 153. What do you think about this?
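To illustrate the practical effect of the choice of d2, here is a small sketch with SciPy. The redundancies f0 = f1 = 29 follow Caspary's example cited above; the point dimension m = 2 is my assumption for the sake of the example:

```python
from scipy.stats import f as f_dist

m = 2                        # dimension of the tested displacement (assumed 2-D point)
f0 = f1 = 29                 # single-epoch redundancies from Caspary's example
f_congruence = f0 + f1 + m   # redundancy of the constrained (congruence) model
alpha = 0.05

# Critical values of the F-test for the two candidate denominator dof
k_constrained   = f_dist.ppf(1 - alpha, m, f_congruence)      # d2 = f
k_unconstrained = f_dist.ppf(1 - alpha, m, f_congruence - m)  # d2 = f - m = f0 + f1
print(k_constrained, k_unconstrained)
```

For fixed d1 and alpha, the F quantile decreases as d2 grows, so the two choices lead to slightly different critical values; with well-redundant networks the difference is small, but it is conceptually relevant.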

Best regards,
Krzysztof

## Deformation (congruence) analysis - statistical tests

Hello Krzysztof,

> 1. It seems to me that the global congruence test (detection test) is not performed for datum points in JAG3D; instead, single-datum-point displacements are immediately tested individually (local tests) to identify possibly unstable datum point(s). Am I right?

The congruence analysis in JAG3D is based on the joint adjustment of two epochs on the observation level. For that reason, there is no difference between a normal adjustment and a congruence analysis. This separation is an interpretation made by the user, not by the software. In general, there is no essential difference between outlier detection and the detection of deformed points.

JAG3D performs a global test. The test result is given in the table of the variance component estimation; see the table here, which shows a rejected global test.

> 2. The a posteriori test statistics for single datum points, T_post,j, are tested using an F-test, where the degrees of freedom are: d1 = m (the degrees of freedom of the quadratic form in the numerator; in fact, the rank of the cofactor matrix in the numerator) and d2 = f (the redundancy of the congruence model) - m. In other words, we obtain d2 = the redundancy of the network model from epoch 0 + the redundancy of the network model from epoch 1. This does not raise my concerns.

All quantiles used within the project are given in the table of test statistics; see the table here.

And yes, you are right: this is the constrained model, but we need the quantile of the unconstrained model...

> 2.1. However, this is not the case for the a posteriori test statistics for single object points, T_post,k.

In contrast to datum points, object points are separated in JAG3D - one per epoch. Estimating two different positions (one per epoch) for an object point $P_{i}$, i.e., $P_{i,e1}$ and $P_{i,e2}$, is nothing else than estimating a single position $P_{i}$ and a shift vector, say $\nabla_{i}$, where $\nabla_{i} = P_{i,e2} - P_{i,e1}$. Since two different positions are estimated, the object point $P_{i}$ does not force the network (it is the unconstrained model). The vector $\nabla_{i}$ is already part of the parameters to be estimated.
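The equivalence of the two parametrizations can be checked numerically. The following sketch (with made-up 1-D observations) fits both the two-position model and the position-plus-shift model and compares the results:

```python
import numpy as np

# Two epochs, three direct observations of a 1-D object point per epoch (made-up data)
l = np.array([10.02, 10.01, 10.03, 10.11, 10.12, 10.10])

# Parametrization 1: one position per epoch, unknowns [P_e1, P_e2]
A1 = np.array([[1, 0]] * 3 + [[0, 1]] * 3, dtype=float)

# Parametrization 2: single position plus shift, unknowns [P, nabla],
# obtained from the first via the regular transformation T (P_e2 = P + nabla)
T = np.array([[1.0, 0.0], [1.0, 1.0]])
A2 = A1 @ T

x1, *_ = np.linalg.lstsq(A1, l, rcond=None)
x2, *_ = np.linalg.lstsq(A2, l, rcond=None)

v1 = A1 @ x1 - l
v2 = A2 @ x2 - l

# Identical residuals: the object point does not constrain the network either way
print(np.allclose(v1, v2))                 # True
print(np.allclose(x2[1], x1[1] - x1[0]))   # nabla equals P_e2 - P_e1: True
```

Since the two design matrices span the same column space, the adjustment is identical; only the interpretation of the unknowns changes.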

The datum points are assumed to be stable in both epochs during the adjustment (only a single position is estimated - the constrained model). The redundancy of the network does not contain the unknown $m$-vector $\nabla$ of extra parameters related to the tested datum point. If you estimate $\nabla$ within the adjustment, the number of parameters to be estimated increases by $m$ and, thus, the redundancy is reduced, i.e., $f = n_0 - u_0 - m + d$. Here $n_0$ is the number of observations (not changed), $u_0$ is the number of unknowns in the constrained model, and $m$ denotes the number of additional parameters in $\nabla$. Thus, $u_0 + m$ is the number of unknowns in the unconstrained model.
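As a simple bookkeeping sketch (all numbers are made up), the reduction of the redundancy by the $m$ extra shift parameters looks like this:

```python
# Hypothetical network: n0 observations, u0 unknowns in the constrained model,
# d datum defect, m extra shift parameters for the tested datum point
n0, u0, d, m = 120, 60, 3, 2

f_constrained   = n0 - u0 + d        # redundancy without the shift vector
f_unconstrained = n0 - u0 - m + d    # redundancy when nabla is estimated too

print(f_constrained, f_unconstrained)  # 63 61
```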

(The equations used in JAG3D are also given in the textbook by Jäger et al. (2005), p. 277, e.g., Eq. 6.109.)

Kind regards
Micha

--
applied-geodesy.org - OpenSource Least-Squares Adjustment Software for Geodetic Sciences

## Deformation (congruence) analysis - statistical tests

Hi Micha,

thank you very much for your fast and detailed response.

1. If I understand correctly, the global congruence test for the detection of datum point displacement(s) in JAG3D is based on the following quadratic form: $\Omega = \mathbf{v^T P v}$, where $\mathbf{v}$ is the residual vector from the congruence/constrained model presented by Eq. (1), i.e., $\mathbf{v} = [\mathbf{v}_i^T\ \mathbf{v}_j^T]^T$:

Eq. (1)

While, in my opinion, the global congruence test for the detection of datum point displacement(s) should be based on the following quadratic form: $\mathbf{v^T P v} - (\mathbf{v}_i^T \mathbf{P}_i \mathbf{v}_i + \mathbf{v}_j^T \mathbf{P}_j \mathbf{v}_j)$, where $\mathbf{v}_i$, $\mathbf{v}_j$ are the residual vectors from the unconstrained (single-epoch) models presented by Eq. (2).

Eq. (2)

The part $\mathbf{v^T P v}$ from Eq. (1) is related to the null hypothesis, and the part $\mathbf{v}_i^T \mathbf{P}_i \mathbf{v}_i + \mathbf{v}_j^T \mathbf{P}_j \mathbf{v}_j$ from Eq. (2) is related to the alternative hypothesis, according to the theory of generalized likelihood ratio testing.
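The non-negativity of this difference can be illustrated with a toy example: one datum point observed directly in both epochs, with unit weights (an assumption for simplicity) and a made-up shift between the epochs:

```python
import numpy as np

rng = np.random.default_rng(42)

# Made-up example: one datum point observed directly 4 times per epoch,
# with a true shift of 5 mm between the epochs; unit weights (P = I)
l_i = 100.0 + rng.normal(0.0, 1.0, 4)   # epoch 0 observations [mm]
l_j = 105.0 + rng.normal(0.0, 1.0, 4)   # epoch 1 observations [mm]

# Unconstrained (single-epoch) models: one position per epoch
v_i = l_i - l_i.mean()
v_j = l_j - l_j.mean()

# Constrained (congruence) model: a single common position for both epochs
l = np.concatenate([l_i, l_j])
v = l - l.mean()

omega_constrained = v @ v
omega_unconstrained = v_i @ v_i + v_j @ v_j

# The difference is the quadratic form that carries the displacement signal
print(omega_constrained - omega_unconstrained >= 0.0)   # True
```

The difference isolates the between-epoch discrepancy, while the single-epoch residuals only reflect the measurement noise.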

My point is that the global test for the detection of datum point displacement(s) based on $\Omega = \mathbf{v^T P v}$ from Eq. (1) is less powerful than the one based on $\mathbf{v^T P v} - (\mathbf{v}_i^T \mathbf{P}_i \mathbf{v}_i + \mathbf{v}_j^T \mathbf{P}_j \mathbf{v}_j)$; see, e.g., Caspary (2000, Eqs. (10.13), (10.14)), Niemeier (2008, Eq. (13.3.21)), Heunecke et al. (2013, Eqs. (11.11)-(11.13)), or Hekimoglu et al. (2010, Eq. (16) from the article 'Increasing the Efficacy of the Conventional Deformation Analysis Methods: Alternative Strategy').

2.1. Allow me to present my second concern in a slightly different way. Object point displacements are estimated from the congruence model in JAG3D, i.e., the model presented in Eq. (1). This is clear and understandable. My concern was related to the variance factor estimate (and, at the same time, to its denominator), which is used in the statistical significance testing of the estimated object point displacements. If I understand your explanation and the user manual correctly, the statistical significance testing of the estimated object point displacements uses the variance factor estimated from the congruence model, i.e., the same model that is used for the estimation of the object point displacements. And this raises my concerns. Admittedly, the object points do not force the network. Nevertheless, the datum points do. In other words, the observations are not completely free here, as they are in the model from Eq. (2). In consequence, if the group of congruence/datum points includes unstable (non-detected) point(s), the variance factor - which is estimated from the congruence model - will be biased.
So, in my opinion, this variance factor should be estimated from a completely free model, as presented in Eq. (2). Such a solution is also presented in the literature known to me.

Best regards,
Krzysztof

## Deformation (congruence) analysis - statistical tests

Hello Krzysztof,

> 1. If I understand correctly, the global congruence test for the detection of datum point displacement(s) in JAG3D is based on the following quadratic form: $\Omega = \mathbf{v^T P v}$

Yes, you are right. The functional model used in JAG3D is given in your Eq. (1). It is a normal network adjustment. The residuals of the (joint) model are used for the global test. This global test checks the compatibility of the functional (and the stochastic) model - as usual in a normal network adjustment.

If datum points are unstable, the functional model is misspecified and - depending on the critical value - the test is rejected. The reason for the rejection cannot be identified at this point; this step is called detection.

Let us briefly review the "normal" outlier test statistic, i.e.,

$T_{prio} = \frac{\mathbf{\nabla^T Q_{\nabla\nabla}^{-} \nabla}}{m \sigma_0^2}$

where $\mathbf{Q_{\nabla\nabla}} = ( \mathbf{B^T P Q_{vv} P B} )^{-}$ and $\nabla = -\mathbf{Q_{\nabla\nabla} B^T P v}$. Matrix $\mathbf{B}$ is the design matrix of the extended model, and $m = \mathrm{rank}(\mathbf{Q_{\nabla\nabla}})$. We agree that this test statistic is based on the likelihood ratio test, and that Baarda's w-test is a special case of $T_{prio}$.

If we set $\mathbf{B = I}$, we obtain

$\mathbf{B^T P Q_{vv} P B} = \mathbf{P Q_{vv} P}$,

and $\mathbf{Q_{\nabla\nabla}} = ( \mathbf{P Q_{vv} P} )^{-}$ as well as $\nabla = -( \mathbf{P Q_{vv} P} )^{-} \mathbf{Pv}$.

Inserting these expressions yields the test statistic

$T_{prio} = \frac{\mathbf{v^T P} ( \mathbf{P Q_{vv} P} )^{-} ( \mathbf{P Q_{vv} P} ) ( \mathbf{P Q_{vv} P} )^{-} \mathbf{Pv} }{m \sigma_0^2} = \frac{\mathbf{v^T P} ( \mathbf{P Q_{vv} P} )^{-} \mathbf{Pv} }{m \sigma_0^2}$

Since $\mathbf{v} = -\mathbf{Q_{vv}Pl}$, where $\mathbf{l}$ is the observation vector, the test statistic reads

$T_{prio}= \frac{\mathbf{l^T P Q_{vv} P l}}{m \sigma_0^2} = \frac{\mathbf{v^T P v}}{m \sigma_0^2}$

because $\mathbf{Q_{vv}P Q_{vv}P = Q_{vv}P}$ and $m = f = n-u+d$
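The identities used in this derivation can be verified numerically for a small made-up full-rank example (no datum defect, so $d = 0$):

```python
import numpy as np

rng = np.random.default_rng(1)

# Small made-up adjustment: n = 6 observations, u = 2 unknowns, full rank (d = 0)
A = rng.normal(size=(6, 2))                     # design matrix
P = np.diag(rng.uniform(0.5, 2.0, 6))           # weight matrix
l = rng.normal(size=6)                          # observation vector

N_inv = np.linalg.inv(A.T @ P @ A)
Qvv = np.linalg.inv(P) - A @ N_inv @ A.T        # cofactor matrix of the residuals

v = -Qvv @ P @ l                                # v = -Qvv P l
x = N_inv @ A.T @ P @ l
print(np.allclose(v, A @ x - l))                # True

# Idempotency: Qvv P Qvv P = Qvv P
print(np.allclose(Qvv @ P @ Qvv @ P, Qvv @ P))  # True

# Hence l^T P Qvv P l = v^T P v
print(np.isclose(l @ P @ Qvv @ P @ l, v @ P @ v))  # True
```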

$T_{prio}$ is the global test (based on the likelihood ratio test). Your suggested test is also based on the likelihood ratio test, that is right, but so is the global test used in the network adjustment.

> In other words, the observations are not completely free here, as they are in the model from Eq. (2). In consequence, if the group of congruence/datum points includes unstable (non-detected) point(s), the variance factor - which is estimated from the congruence model - will be biased.

Yes, you are right. But: do you really test/check the object points without having a stable reference network (datum points)? I don't think so. Usually, one first checks for a stable reference (sub-)network (a stable datum). If such a stable network is identified(!), object points or other deformation parameters are evaluated afterwards. In this case, the variance factor is (almost) unbiased.

> Such a solution is also presented in the literature known to me.

Yes, I know this literature; but the model implemented in JAG3D is also known in the geodetic literature, cf. Jäger et al. (2005). By the way, your suggested/preferred model is also given in that textbook, pp. 270ff.

Kind regards
Micha


## Deformation (congruence) analysis - statistical tests

Dear Micha,

But please let me put forward a few further thoughts.

- In my opinion - please correct me if I am wrong - we may not test observations against possible blunders and points against possible displacements at the same stage. We should first test the observations against blunders, based on the unconstrained model given in my Eq. (2). Only then may we test the datum points against displacements, based, e.g., on the congruence/constrained model (the implicit hypothesis method). In other words, we should perform these two testing procedures separately. Otherwise, possible blunder(s) and displacement(s) will mix, and it will be difficult to identify and adapt them.

- Please correct me if I am wrong: the global congruence test for the detection of datum point displacement(s) based on $\mathbf{v^T P v} - (\mathbf{v}_i^T \mathbf{P}_i \mathbf{v}_i + \mathbf{v}_j^T \mathbf{P}_j \mathbf{v}_j)$ is more powerful, i.e., a given displacement will be detectable with a higher probability than by the test based on $\mathbf{v^T P v}$. This is due to the number of degrees of freedom of the global test statistic: the test statistic based on $\mathbf{v^T P v} - (\mathbf{v}_i^T \mathbf{P}_i \mathbf{v}_i + \mathbf{v}_j^T \mathbf{P}_j \mathbf{v}_j)$ has fewer degrees of freedom, and the test is therefore more powerful.

- You have written: "But: Do you really test/check the object points without having a stable reference network (datum points)? I don't think so. Usually, one checks for a stable reference (sub-)network (a stable datum) first. If such a stable network is identified(!), object points or other deformation parameters are evaluated afterwards."
Yes, we do have unstable datum points in practice, and we accept this fact. A test (including the detection test) does not have a success rate of 100% (that is not possible). The rate is always lower, and it is called the test power. Please note that the probability of a Type II error (beta) is, in fact, the probability that some point displacement(s) will not be detected by the test. In other words, for a given true value of the displacement, we can calculate beta or, conversely, the power of the test (1 - beta). Of course, the larger the true displacement is, the smaller beta is.
For example, if beta is 20% for some displacement value, it means that 20 times out of 100 we will not detect this displacement by the test. In consequence, this displacement will end up in the congruent/datum part of the model.
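The degrees-of-freedom argument above can be illustrated with noncentral distributions. Assuming a chi-square test with a fixed noncentrality (i.e., a fixed displacement signal; the value 10 is arbitrary), the sketch below shows that fewer degrees of freedom give higher power:

```python
from scipy.stats import chi2, ncx2

alpha = 0.05
nc = 10.0   # assumed noncentrality produced by the displacement

def power(df, nc=nc, alpha=alpha):
    """Power of a chi-square test with df degrees of freedom."""
    k = chi2.ppf(1 - alpha, df)       # critical value under H0
    return 1.0 - ncx2.cdf(k, df, nc)  # P(reject | H1)

# Fewer degrees of freedom -> higher power for the same displacement signal
print(power(2), power(58))
```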

Best regards,
Krzysztof

PS. Unfortunately, I do not have access to the work of Jäger et al. (2005) and, hence, cannot refer to this book.

## Deformation (congruence) analysis - statistical tests

Hello Krzysztof,

> - In my opinion - please correct me if I am wrong - we may not test observations against possible blunders and points against possible displacements at the same stage.

Yes, I fully agree!

> We should first test the observations against blunders, based on the unconstrained model given in my Eq. (2).

Yes, I agree. Estimating both epochs individually to check the data and to adapt the stochastic model is recommended.

> Only then may we test the datum points against displacements, based, e.g., on the congruence/constrained model (the implicit hypothesis method).

Yes, I agree. At this stage, we should test the datum points using explicit or implicit hypothesis tests or a similar technique.

> In other words, we should perform these two testing procedures separately. Otherwise, possible blunder(s) and displacement(s) will mix, and it will be difficult to identify and adapt them.

Yes, I fully agree. However, the test statistic

$T_{prio} = \frac{\mathbf{\nabla^T Q_{\nabla\nabla}^{-} \nabla}}{m \sigma_0^2}$

used for detecting/identifying blunders is equivalent to the test statistic used for the point test. The only difference is the matrix $\mathbf{B}$, i.e., the (assumed) misspecification of the functional model of the null model. Misspecification includes blunders, point shifts, missing parameters (e.g. the zero-point parameter of the EDM), a wrong functional equation, etc. In my opinion, we only test against misspecification - nothing more. Since powerful local tests are always performed in JAG3D, I usually evaluate only these test statistics instead of the (weak) global test - in outlier detection as well as in deformation analysis.

> - Please correct me if I am wrong: the global congruence test for the detection of datum point displacement(s) based on $\mathbf{v^T P v} - (\mathbf{v}_i^T \mathbf{P}_i \mathbf{v}_i + \mathbf{v}_j^T \mathbf{P}_j \mathbf{v}_j)$ is more powerful, i.e., a given displacement will be detectable with a higher probability than by the test based on $\mathbf{v^T P v}$.

Yes, this should be correct. However, your suggested implicit test is not implemented in JAG3D; instead, powerful local tests are provided, which are specified as explicit hypothesis tests. The golden rule is: if a global test is rejected, we can conclude that something is wrong, and we usually start with the localisation of the problem at this stage. On the other hand, if the global test is not rejected, we cannot draw the conclusion that everything is fine; and again, we start with the (local) analysis.

Kind regards
Micha


## Deformation (congruence) analysis - statistical tests

Hi Micha,

All is clear and reasonable.
Thank you very much for your time and help.

Best regards,
Krzysztof

## Deformation (congruence) analysis - statistical tests

Hello Krzysztof,

> Thank you very much for your time and help.

You are welcome. Let me know if you would like to compare results or something similar, and how I can support your investigations.

Have a nice weekend
Micha
