Application of machine learning techniques to the analysis and prediction of drug pharmacokinetics

Model-driven analysis has a long history in the pharmacokinetic/pharmacodynamic (PK/PD) field [1]. Mathematical models consist of differential equations that describe time evolution and can relate blood concentrations in the body and/or pharmacological effects to dosing regimens and patient characteristics. As data accumulation and analysis methods have progressed, model-based prediction has come to be seen as feasible; however, owing to the complexity of physiological events, it remains difficult in practice, and artificial intelligence or machine learning is expected to fill the gap. Machine learning [2] is a prediction platform that can handle complex phenomena through data-driven analysis and has been successfully adopted in many fields, such as image recognition [3,4], time-series analysis [5,6], and language processing [7,8]. This review discusses the usefulness of machine learning in the PK/PD field, with PK/PD analysis framed as a time-series problem.
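
To make the differential-equation formulation concrete, the sketch below integrates a textbook one-compartment model with first-order absorption using SciPy. The rate constants, volume of distribution, and dose are illustrative assumptions, not values from any particular drug or study.

```python
import numpy as np
from scipy.integrate import solve_ivp

# One-compartment model with first-order absorption (illustrative values):
#   dA_gut/dt = -ka * A_gut
#   dC/dt     =  ka * A_gut / V - ke * C
ka = 1.2         # absorption rate constant (1/h), assumed
ke = 0.15        # elimination rate constant (1/h), assumed
V = 35.0         # volume of distribution (L), assumed
dose_mg = 100.0  # oral dose placed in the gut compartment at t = 0

def one_compartment(t, y):
    a_gut, conc = y
    return [-ka * a_gut, ka * a_gut / V - ke * conc]

sol = solve_ivp(one_compartment, (0.0, 24.0), [dose_mg, 0.0],
                t_eval=np.linspace(0.0, 24.0, 9))

# Predicted concentration-time profile over one dosing interval.
for t, c in zip(sol.t, sol.y[1]):
    print(f"t = {t:4.1f} h  C = {c:6.3f} mg/L")
```

Given a set of such equations, fitting the rate constants to observed concentrations is what ties the model to dosing regimens and patient characteristics in practice.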

Machine learning is a method for uncovering trends and patterns in large amounts of data. It is broadly classified into supervised learning [9] and unsupervised learning [10]. Supervised learning is further divided into subcategories, such as classification and regression problems, so the field ultimately forms a broad hierarchy of methods. The main issue in the PK/PD field is accurate, quantitative prediction of events. Here, examples of the application of machine learning techniques to PK/PD prediction challenges are presented. Rather than relying on a single machine learning method, ensemble methods [11] that combine multiple machine learning models may be used to improve prediction performance (a minimal illustration follows this paragraph). Although they have yet to be applied in the PK/PD field, new frameworks have also been developed in recent years, such as transfer learning [12], which adapts a model built for another task to build a prediction model for the target task, and generative adversarial networks (GANs) [13], which can generate pseudo-data. In this article, we also discuss the future potential of such newer techniques.
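
As a minimal sketch of the ensemble idea, the example below averages three different regressors on a fully synthetic stand-in for a concentration dataset, using scikit-learn's VotingRegressor. The covariates, the toy concentration formula, and every parameter value are invented for illustration and are not taken from the works cited above.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor, VotingRegressor
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(0)

# Synthetic stand-in for a PK dataset: covariates (age, weight, dose,
# time since dose) versus an observed concentration. Entirely simulated.
X = rng.uniform([20, 40, 50, 0], [80, 100, 200, 24], size=(500, 4))
ke = 0.1 + 0.001 * X[:, 0] / X[:, 1]           # toy elimination constant
y = X[:, 2] / X[:, 1] * np.exp(-ke * X[:, 3])  # toy concentration
y += rng.normal(scale=0.05, size=y.shape)      # observation noise

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Ensemble that averages the predictions of three heterogeneous learners.
ensemble = VotingRegressor([
    ("rf", RandomForestRegressor(n_estimators=200, random_state=0)),
    ("ridge", Ridge(alpha=1.0)),
    ("knn", KNeighborsRegressor(n_neighbors=10)),
])
ensemble.fit(X_tr, y_tr)
print("MAE:", mean_absolute_error(y_te, ensemble.predict(X_te)))
```

Averaging heterogeneous learners in this way tends to reduce variance relative to any single model, which is the basic motivation for ensembling.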
