Conclusion

Starting from Jonathan Lewis' test case "Discrete Dangers" on page 126 of his book "Cost Based Oracle",
we have investigated (and illustrated with graphs) the CBO cardinality estimation algorithm for range-based predicates:
  select x from t where x >  low_x and x <  high_x  (open  , open  )
  select x from t where x >= low_x and x <= high_x  (closed, closed)
  select x from t where x >  low_x and x <= high_x  (open,   closed)
  select x from t where x >= low_x and x <  high_x  (closed, open)

  template:
  select x from t where x #low low_x and x #high high_x
  #low  in (">", ">=")
  #high in ("<", "<=")
when column x has no associated histogram.

Letting
  min_x = min (x) over all rows
  max_x = max (x) over all rows
  B = (max_x - min_x) / num_distinct   (the band width)
and defining
  left band      : min_x     < x < min_x + B
  central region : min_x + B < x < max_x - B
  right band     : max_x - B < x < max_x
we have seen how the estimated cardinality behaves differently in the left and right bands and in the central region, depending on whether each range bound is open or closed. We have also graphically illustrated, hopefully for a better intuitive understanding, the case of selection over both an infinitesimal and a finite range for the (open, open) and (closed, closed) cases.
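The band boundaries defined above can be sketched in a few lines of illustrative Python; the helper name `bands` is hypothetical and not part of any Oracle API, but the arithmetic follows the definitions given in this section:

```python
# Sketch (hypothetical helper): compute the left band, central region
# and right band for a column from its min, max and number of
# distinct values, as defined above.
def bands(min_x, max_x, num_distinct):
    B = (max_x - min_x) / num_distinct  # band width
    return {
        "left":    (min_x,     min_x + B),  # min_x     < x < min_x + B
        "central": (min_x + B, max_x - B),  # min_x + B < x < max_x - B
        "right":   (max_x - B, max_x),      # max_x - B < x < max_x
    }

# Example: a column spanning 0..100 with 10 distinct values
# has a band width of 10.
print(bands(0, 100, 10))
```

For instance, with min_x = 0, max_x = 100 and num_distinct = 10, the band width is 10, so the left band is (0, 10), the central region (10, 90), and the right band (90, 100).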

The discussion starts here.

I have also offered my speculation about the rationale behind this strange (but, in hindsight, quite sound) behaviour.

For corrections / feedback:
[email protected]