This work focuses on mixed precision algorithms that integrate classical numerical linear algebra methods with nonlinear neural network-based preconditioners to accelerate the solution of parametric partial differential equations (PDEs). Specifically, we consider Krylov subspace methods such as Flexible GMRES (FGMRES) and Flexible FOM (FFOM), combined with nonlinear (flexible) preconditioners derived from trained neural operator models. The parametric PDEs addressed include the Helmholtz, Poisson, Darcy flow, and diffusion-advection equations, spanning both academic benchmarks and practical datasets. Compared with classical numerical preconditioners, the trained neural operator-based preconditioners generalize well across a wide range of numerical and parametric variations, including varying source terms, sound speeds, diffusivity coefficients, advection velocity fields, and boundary conditions. In addition, we provide both CPU and GPU implementations of the proposed mixed precision algorithms. Overall, this work demonstrates the efficiency and flexibility of combining modern neural network-based solvers with classical Krylov methods, leveraging the strengths of both to achieve higher attainable accuracy and broader applicability across diverse problem settings.
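To make the role of a flexible preconditioner concrete, the sketch below shows a minimal FGMRES iteration in which the preconditioner is an arbitrary callable applied to each basis vector; a trained neural operator would slot into that callable. This is an illustrative implementation, not the paper's code: the function name `fgmres`, the Jacobi stand-in preconditioner in the usage example, and the use of uniform float64 arithmetic (rather than the mixed precision scheme the work studies) are all simplifying assumptions made here.

```python
import numpy as np

def fgmres(A, b, M, x0=None, maxiter=50, tol=1e-8):
    """Flexible GMRES sketch. Because the preconditioner M may be nonlinear
    or change between iterations (e.g. a neural operator), the preconditioned
    vectors Z are stored explicitly alongside the Krylov basis V."""
    n = b.size
    x0 = np.zeros(n) if x0 is None else x0
    r0 = b - A @ x0
    beta = np.linalg.norm(r0)
    if beta < tol:
        return x0
    V = np.zeros((n, maxiter + 1))   # orthonormal Krylov basis
    Z = np.zeros((n, maxiter))       # preconditioned directions
    H = np.zeros((maxiter + 1, maxiter))
    V[:, 0] = r0 / beta
    x = x0
    for j in range(maxiter):
        Z[:, j] = M(V[:, j])         # flexible preconditioning step
        w = A @ Z[:, j]
        for i in range(j + 1):       # modified Gram-Schmidt
            H[i, j] = V[:, i] @ w
            w = w - H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] > 1e-14:
            V[:, j + 1] = w / H[j + 1, j]
        # least-squares solve of the small Hessenberg system
        e1 = np.zeros(j + 2)
        e1[0] = beta
        y, *_ = np.linalg.lstsq(H[:j + 2, :j + 1], e1, rcond=None)
        x = x0 + Z[:, :j + 1] @ y
        if np.linalg.norm(b - A @ x) < tol * beta or H[j + 1, j] <= 1e-14:
            break
    return x
```

Note that the correction `x` is assembled from the stored columns of `Z`, not from `V` as in standard GMRES; this is precisely what allows the preconditioner to vary from one iteration to the next.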