Title: A Differentially Private Weighted Empirical Risk Minimization Procedure and its Application to Outcome Weighted Learning
Abstract: Data used for model training via empirical risk minimization (ERM) may contain sensitive personal information. While the resulting models can be beneficial, releasing or deploying them poses privacy risks. Differential privacy (DP) offers a mathematical framework for bounding the privacy loss incurred when releasing information derived from sensitive data. Building on prior work applying DP to unweighted ERM, this talk presents the first general application of DP to weighted ERM, where each individual's contribution to the objective function can be assigned a different weight. This extension paves the way for privacy-preserving learning of individualized treatment rules. We evaluated our approach in a simulation study and on data from a real clinical trial; the empirical results demonstrate that individualized treatment rules can be learned effectively through outcome weighted learning under rigorous privacy guarantees in real-world settings involving sensitive data.
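To make the weighted-ERM setting concrete, the sketch below shows one standard way such a procedure could be privatized: fit a weighted, L2-regularized logistic-regression objective and release the coefficients with output perturbation (Laplace noise added to the fitted parameters). This is an illustrative construction only, not the method presented in the talk; in particular, the sensitivity calibration (scaling noise by the largest per-individual weight) is a simplifying assumption, and the function name `weighted_erm_output_perturbation` is hypothetical.

```python
import numpy as np


def weighted_erm_output_perturbation(X, y, w, lam=1.0, eps=1.0,
                                     steps=500, lr=0.1, seed=None):
    """Illustrative DP weighted ERM via output perturbation.

    Minimizes (1/n) * sum_i w_i * logistic_loss(theta; x_i, y_i)
    + (lam/2) * ||theta||^2 by gradient descent, then adds Laplace
    noise to the fitted coefficients before release.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    theta = np.zeros(d)
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(X @ theta)))          # predicted probabilities
        grad = X.T @ (w * (p - y)) / n + lam * theta    # weighted loss + L2 gradient
        theta -= lr * grad
    # Assumed (illustrative) sensitivity bound: one individual's influence
    # on the regularized minimizer scales with the largest weight.
    sensitivity = 2.0 * np.max(w) / (n * lam)
    noise = rng.laplace(scale=sensitivity / eps, size=d)
    return theta + noise
```

In outcome weighted learning, the weights `w` would typically be derived from observed clinical outcomes, which is what motivates extending DP guarantees from the unweighted to the weighted setting.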