This paper proposes and develops new Newton-type methods for solving structured nonconvex and nonsmooth optimization problems, justifying their fast local and global convergence by means of advanced tools of variational analysis and generalized differentiation. The objective functions belong to a broad class of prox-regular functions, with a specification to constrained optimization of nonconvex structured sums. We also develop a novel line-search method that extends the proximal gradient algorithm and allows us to globalize the proposed coderivative-based Newton methods by incorporating the machinery of forward-backward envelopes. Further applications and numerical experiments are conducted for the $\ell_0$-$\ell_2$ regularized least-squares model appearing in statistics and machine learning.
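For orientation, we recall the standard notion of the forward-backward envelope from the literature; the generic notation $f$, $g$, and stepsize $\lambda>0$ used below is illustrative and not taken from this paper. For a composite objective $\varphi = f + g$ with $f$ smooth, the forward-backward envelope is defined by
\[
\varphi_\lambda(x) := \min_{y}\Big\{ f(x) + \langle \nabla f(x),\, y - x\rangle + g(y) + \tfrac{1}{2\lambda}\|y - x\|^2 \Big\},
\]
a real-valued function whose minimizing points $y$ are exactly the proximal gradient steps from $x$, which is what makes it a natural merit function for globalizing Newton-type schemes.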
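As a concrete illustration of the last application, a typical form of the $\ell_0$-$\ell_2$ regularized least-squares model is sketched below; the particular weights $\mu$ and $\rho$ and the exact formulation are assumptions for exposition, not the precise model studied in the paper:
\[
\min_{x \in \mathbb{R}^n}\ \tfrac{1}{2}\|Ax - b\|_2^2 + \mu\,\|x\|_0 + \tfrac{\rho}{2}\|x\|_2^2,
\]
where $A \in \mathbb{R}^{m \times n}$ and $b \in \mathbb{R}^m$ are given data, $\|x\|_0$ counts the nonzero entries of $x$, and $\mu, \rho > 0$ are regularization parameters. The $\ell_0$ term makes the problem nonconvex and nonsmooth, placing it within the class of structured sums treated by the proposed methods.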