The error “value does not satisfy the subset constraints of 'T'” for some type name `T` arises when a value is being converted to a subset type `T` and the value cannot be proved to satisfy the predicate that defines the subset type. This is clear enough when one is trying to assign, say, an `int` to a `nat`, but it is more complex when generic types are involved.
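For instance, here is a minimal sketch of the `int`-to-`nat` case (the methods `Bad` and `Good` are illustrative only, not part of the example below):

```dafny
method Bad(i: int) {
  var n: nat := i;  // error: i cannot be proved non-negative,
                    // so it may not satisfy nat's constraint
}

method Good(i: int)
  requires i >= 0
{
  var n: nat := i;  // verifies: the precondition proves i is non-negative
}
```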
This more involved, generic example:
```dafny
type neg = r: real | r <= 0.0

datatype formula<T> =
  | Var(z: T)
  | Plus(x: formula<T>, y: formula<T>)
  | Minus(x: formula<T>, y: formula<T>)
  | Mult(x: formula<T>, y: formula<T>)
  | Divide(x: formula<T>, y: formula<T>)

function method toReal(x: formula<real>): real
{
  match x
  case Var(z) => z
  case Plus(y, z) => toReal(y) + toReal(z)
  case Minus(y, z) => toReal(y) - toReal(z)
  case Mult(y, z) => toReal(y) * toReal(z)
  case Divide(y, z) =>
    if toReal(z) == 0.0 then 42.0 else 43.0
}

datatype ChildLineItem = ChildLineItem(negChargeAmount: formula<neg>)

predicate isValidChildLineItem(l: ChildLineItem)
{ toReal(l.negChargeAmount) <= 0.0 }
```
produces:

```text
ERROR_Covariance.dfy(24,11): Error: value does not satisfy the subset constraints of 'formula<real>'
Dafny program verifier finished with 2 verified, 1 error
```
The problem is that the code is trying to convert a `formula<neg>` to a `formula<real>`. While a `neg` is a subtype of `real`, that does not imply a subtype relationship between `formula<neg>` and `formula<real>`. That relationship must be declared in the definition of `formula`.
By default, the definition of a generic type is non-variant, meaning there is no subtype relationship between `formula<T>` and `formula<U>` when `T` is a subtype of `U`.
What is wanted here is for `formula` to be covariant, so that `formula<T>` is a subtype of `formula<U>` when `T` is a subtype of `U`. For that, use the declaration `formula<+T>`.
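With that change, `formula<neg>` becomes a subtype of `formula<real>` and the example verifies. A minimal sketch of the co-variant declaration (the method `UseCovariance` is illustrative, not part of the original example):

```dafny
type neg = r: real | r <= 0.0

// The '+' makes formula co-variant in T: formula<neg> is a subtype of
// formula<real>, because neg is a subtype of real.
datatype formula<+T> =
  | Var(z: T)
  | Plus(x: formula<T>, y: formula<T>)
  | Minus(x: formula<T>, y: formula<T>)
  | Mult(x: formula<T>, y: formula<T>)
  | Divide(x: formula<T>, y: formula<T>)

method UseCovariance(f: formula<neg>) {
  var g: formula<real> := f;  // accepted under the co-variant declaration
}
```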
To declare `formula` as contravariant, use `formula<-T>`. Then `formula<U>` is a subtype of `formula<T>` when `T` is a subtype of `U`.
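Contravariance suits types whose parameter occurs only in "input" positions, such as the domain of a function. A hypothetical sketch (the `Pred` datatype and `UseContravariance` method are assumptions for illustration, not part of the original example):

```dafny
type neg = r: real | r <= 0.0

// T occurs only to the left of an arrow (an input position),
// so the contra-variant declaration '-T' is permitted.
datatype Pred<-T> = Pred(p: T -> bool)

method UseContravariance(q: Pred<real>) {
  // Pred<real> is a subtype of Pred<neg>, because neg is a subtype of real
  // and Pred is contra-variant.
  var r: Pred<neg> := q;
}
```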
Type parameter characteristics are discussed in the Dafny reference manual.