Talk abstract: Data Quality with LV=’s Input-Checker. When automated Machine Learning models are in production, the quality of the incoming data is paramount for precise and trustworthy predictions. Left to the mercy of the system they are deployed to, a model can be fed no end of erroneous data points. How can our model sniff out these issues? How can we be sure that the predictions we return are sensible? LV=’s Input-Checker tool is the answer to these questions: it provides rigid data checks at the point of inference, coupled with lightweight calculations for maximum call speed, making it well suited to data quality checking in a live environment. This talk gives a brief overview of Input-Checker’s key capabilities and highlights the strengths of the package, hopefully providing you with another tool in your arsenal for point-of-inference data quality checking.
Bio: Ned Webster is a Senior Data Scientist at LV=GI, working primarily on the delivery of ML use cases and the development of Data Science-focused Python tools. At LV= he has experience building and implementing predictive models across a wide array of business areas.