To ensure quality, models are updated by the Othot Data Curation and Othot Data Science teams only. The Othot team uses the Model page to make the models that were tested and reviewed on internal tools accessible to you in the Othot Platform.
Generally, your model will be rebuilt at least once a year after the academic year's enrollment cycle is complete. In this process (called a Rollover), the newly completed year of data is added to the historic/train data and used to make predictions for the upcoming year.
While we keep model updates to a minimum to avoid disruption and changes to predictions, your model may also be updated at other points in the year to make changes or to include additional variables that are necessary for making decisions and building your strategy. These model updates will be coordinated through your Strategic Partnership Manager.
The Reference section of the Insights page is the most comprehensive source for the details of your model.
Othot measures the performance of our predictive models by taking the average of three metrics:
Since Othot builds detailed models for each phase, and each phase includes a different set of variables, both overall performance and individual predictions are better than those of models that do not use a Lifecycle-based approach. Click Show All in any Lifecycle phase to see all of the variables impacting its model, sorted by Importance.
To optimize performance, we test a range of model parameters and then select the model that maximizes the Model Performance Score. This is accomplished through 10-fold stratified cross-validation.
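Othot's internal tooling is not public, but the selection process described above can be sketched with scikit-learn. The estimator, parameter grid, and scoring metric below are illustrative assumptions, not Othot's actual configuration:

```python
# Illustrative sketch of parameter selection via 10-fold stratified
# cross-validation. The estimator, parameter grid, and scoring metric
# are assumptions for demonstration only.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GridSearchCV, StratifiedKFold

# Synthetic stand-in for a historic/train dataset.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

# Candidate parameter values to test.
param_grid = {"n_estimators": [50, 100], "max_depth": [2, 3]}

# Stratified folds preserve the class balance (e.g. enrolled vs. not)
# in every train/validation split.
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)

search = GridSearchCV(
    GradientBoostingClassifier(random_state=0),
    param_grid,
    scoring="roc_auc",  # stand-in for the Model Performance Score
    cv=cv,
)
search.fit(X, y)
print(search.best_params_)  # the parameter combination that maximized the score
```

The key idea is that every candidate parameter combination is scored on the same ten held-out folds, so the winning model is the one that generalizes best rather than the one that merely fits the training data.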
As an output of Othot's Lifecycle-based approach to predictive modeling, users can see which variables have more or less impact on predictions in each phase. Variables with a higher percentage are those whose values have been observed to drive likelihoods up or down more strongly than other variables.
In an enrollment example, applicants who have visited campus or live within a certain distance may be known to matriculate at higher rates than those who haven't visited or live farther away. Likewise, a model for Admitted students may show financial aid having more impact than a campus visit, or vice versa, depending on the institution.
These tables expand upon the variable-level information found in the Top Importances section by showing the counts, average likelihood score, and average impact each value has in the model, by Phase. This information can be especially useful for validating data, as well as for identifying which student characteristics tend to contribute most significantly to their predictions.
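The value-level summary described above amounts to a grouped aggregation. A minimal sketch in pandas, where the column names and sample values are hypothetical rather than Othot's actual schema:

```python
# Sketch of a value-level summary table: for each variable value within
# a phase, count the records and average the likelihood and impact
# scores. Column names and values are hypothetical examples.
import pandas as pd

records = pd.DataFrame({
    "phase": ["Applicant", "Applicant", "Applicant", "Admitted"],
    "campus_visit": ["Yes", "No", "Yes", "No"],
    "likelihood": [0.82, 0.35, 0.74, 0.41],
    "impact": [0.12, -0.08, 0.10, -0.05],
})

summary = (
    records.groupby(["phase", "campus_visit"])
    .agg(count=("likelihood", "size"),
         avg_likelihood=("likelihood", "mean"),
         avg_impact=("impact", "mean"))
    .reset_index()
)
print(summary)
```

Scanning a table like this quickly surfaces both data problems (a value with an implausible count) and strategy signals (a value whose average impact is unusually large).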
Quickly discover and be notified of unusual changes to your data, which may lead to invalid predictions. Examples of unusual data changes may include individuals moving backward in lifecycle phases, financial aid awards that have been significantly reduced, or event attendance that has disappeared.
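The kinds of checks described above can be illustrated by comparing a previous data snapshot to the current one. This is a hypothetical sketch, not Othot's detection logic; the field names, phase ordering, and threshold are assumptions:

```python
# Hypothetical sketch of unusual-data-change checks, comparing a
# previous snapshot of a student's record to the current one. Field
# names, phase ordering, and the aid threshold are assumptions.
PHASE_ORDER = {"Prospect": 0, "Applicant": 1, "Admitted": 2, "Enrolled": 3}

def unusual_changes(previous, current, aid_drop_threshold=0.5):
    """Return a list of warnings for one student's record pair."""
    warnings = []
    # Individuals should not move backward through lifecycle phases.
    if PHASE_ORDER[current["phase"]] < PHASE_ORDER[previous["phase"]]:
        warnings.append("moved backward in lifecycle phase")
    # A large reduction in financial aid may indicate a data problem.
    if previous["aid"] > 0 and current["aid"] < previous["aid"] * aid_drop_threshold:
        warnings.append("financial aid significantly reduced")
    # Previously recorded event attendance should not vanish entirely.
    if previous["events_attended"] > 0 and current["events_attended"] == 0:
        warnings.append("event attendance disappeared")
    return warnings

prev = {"phase": "Admitted", "aid": 10000, "events_attended": 3}
curr = {"phase": "Applicant", "aid": 4000, "events_attended": 0}
print(unusual_changes(prev, curr))
```

Each warning corresponds to one of the examples above; in practice the platform surfaces these automatically so you do not have to write such checks yourself.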
There are two ways to learn if our system detected any unusual data changes.