Abstract: Among the major difficulties that one may encounter when estimating parameters in a nonlinear regression model are the non-uniqueness of the estimator, its instability with respect to small perturbations of the observations, and the presence of local optimizers of the estimation criterion. We show that these estimability issues can be taken into account at the design stage, through the definition of suitable design criteria. Extensions of the $E$-, $c$- and $G$-optimality criteria are considered which, when evaluated at a given $\theta^0$ (local optimal design), account for the behavior of the model response $\eta(\theta)$ for $\theta$ far from $\theta^0$. In particular, they ensure some protection against close-to-overlapping situations, where $\|\eta(\theta)-\eta(\theta^0)\|$ is small for some $\theta$ far from $\theta^0$. These extended criteria are concave, and necessary and sufficient conditions for optimality (Equivalence Theorems) can be formulated. They are not differentiable, but a maximum-entropy regularization is proposed to obtain concave and differentiable alternatives. When the design space is finite and the set $\Theta$ of admissible $\theta$ is discretized, the determination of an optimal design reduces to a linear programming problem, which can be solved directly, or via relaxation when $\Theta$ is merely compact. Several examples are presented.
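The linear-programming formulation mentioned above can be illustrated with a minimal sketch: on a finite design space, with $\Theta$ discretized, a maximin design maximizing $\min_{\theta\in\Theta}\sum_i w_i\,[\eta(x_i,\theta)-\eta(x_i,\theta^0)]^2/\|\theta-\theta^0\|^2$ over design weights $w$ is a standard LP in $(w, t)$. The model (a one-parameter exponential decay), the grids, and the particular extended-$E$-type criterion below are illustrative assumptions, not the paper's exact setting.

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative model (assumption): eta(x, theta) = exp(-theta * x).
theta0 = 1.0
X = np.linspace(0.1, 3.0, 15)        # finite design space (assumed grid)
Theta = np.linspace(0.2, 3.0, 40)    # discretized admissible set
Theta = Theta[np.abs(Theta - theta0) > 1e-6]   # exclude theta0 itself

def eta(x, th):
    return np.exp(-th * x)

# G[k, i] = (eta(x_i, theta_k) - eta(x_i, theta0))^2 / (theta_k - theta0)^2,
# an extended-E-type per-point quantity; the design criterion is
# min_k sum_i w_i * G[k, i], to be maximized over the weights w.
G = np.array([[(eta(x, th) - eta(x, theta0))**2 / (th - theta0)**2
               for x in X] for th in Theta])

# LP in variables (w_1..w_n, t):
#   maximize t  s.t.  sum_i w_i G[k,i] >= t for all k,  sum_i w_i = 1,  w >= 0.
n = len(X)
c = np.zeros(n + 1)
c[-1] = -1.0                                      # linprog minimizes, so -t
A_ub = np.hstack([-G, np.ones((len(Theta), 1))])  # t - G w <= 0
b_ub = np.zeros(len(Theta))
A_eq = np.hstack([np.ones((1, n)), np.zeros((1, 1))])
b_eq = np.array([1.0])
bounds = [(0, None)] * n + [(None, None)]         # w >= 0, t free
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)

w = res.x[:n]
print("optimal criterion value t* =", -res.fun)
print("support points:", X[w > 1e-6])
print("weights:", w[w > 1e-6])
```

The solution is typically supported on a few design points only, in line with the usual structure of optimal designs; replacing the discretization of $\Theta$ by a compact set would call for the relaxation approach the abstract refers to.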