Scoring Rubric for Baylis/Burwell VMI. Image courtesy of Presenter Media
This post is a continuation of my previous post [KPI – Part IV: Guiding Principles]. It describes the scoring rubric we selected to assign points to institutions on the Guiding Principles Factor of the Baylis/Burwell Vitality/Morbidity Model.
Two different approaches to building our scoring rubric. Image courtesy of Presenter Media
There are two different directions we could have taken to develop our scoring rubric. The first was an ultra-quantitative, spreadsheet approach that attempts to measure the quality of the institution's statements of Mission, Vision, and Core Values, and the institution's efforts to live out those statements in its actions.
This type of approach is typically called the objective approach. However, if by objective you mean “not influenced by personal feelings or opinions in considering and representing facts,” this direction is far from truly objective. There are many points of subjectivity in the quantitative scoring of the various components and in the weighting factors used to combine component scores into a final score; this is where the raters' biases and opinions enter the equation.
A panel of higher education experts weighs the evidence and makes judgments on each aspect of a particular factor. Image courtesy of Presenter Media.
The second approach is more holistic and is typically labeled the subjective approach. It relies on higher education experts, who have years of training and experience in the field, to evaluate the institution in a number of ways.
The first thing these experts are asked to do is to read the institution's published documents and judge whether they believe the institution has selected values and behaviors that represent those of a quality institution of higher education. Each institution is scored on the following three-point scale:
Is the institution a stellar citizen of the higher education community or a devil in disguise? Image courtesy of Presenter Media
-1 Totally inadequate for a quality institution of higher education
0 Barely adequate for a quality institution of higher education
+1 Describes a high performing institution of higher education
The higher education experts are then asked to judge whether the behavior of a given institution matches its stated beliefs using the following scale:
In the opinion of the higher education experts, does the institution's behavior match its stated values? They will weigh the evidence and base their decision on their training and experience in higher education. Image courtesy of Presenter Media
-1 Behavior doesn't come close to its stated values. The institution fails to meet its own stated standards.
0 Behavior barely meets its stated values or standards.
+1 Behavior exceeds the expectations set by its stated values.
A quality institution of higher education should be beyond reproach. In light of this, the panel of higher education experts is asked to make two more judgments. The first judgment involves the institution’s track record with those entities and agencies to which the institution is responsible. Does the institution meet all of its required reporting deadlines and fulfill all obligations to federal and accrediting agencies? Institutions will be scored on the following scale:
The institution has done everything it could to move to the top of its class. Image courtesy of Presenter Media
-1 The institution has failed to meet more than one reporting obligation or legal requirement.
0 The institution has met all requirements and obligations but has occasionally been late or hesitant in making results public.
+1 The institution has gone out of its way in meeting requirements and obligations. It has been completely transparent in all of its operations.
The final area of concern for the panel of experts deals with the reputation of the institution. The panel will judge whether the institution is held in high esteem by various entities such as higher education as a whole, the general public, students and prospective students, and employers of the institution’s graduates.
Is the institution a stellar citizen of the higher education community or a devil in disguise? Image courtesy of Presenter Media
The scoring scale for this area of concern is as follows:
-1 The institution's reputation is tarnished in several areas and with several groups.
0 The reputation of the institution is considered “run-of-the-mill.” It is not outstanding in any area.
+1 The reputation of the institution is stellar with all groups with which it deals.
To determine a factor score for Guiding Principles, the four sub-factor scores are summed. Factor scores are assigned as follows (a short sketch of this calculation appears after the list):
If the total sub-factor score is -3 or less, the assigned factor score is -1. Any institution in this range should be considered in trouble and possibly dying.
If the total sub-factor score is -2 to +2, the assigned factor score is 0. An institution with a score in this range is just hanging on and should be considered surviving.
An institution in this category is considered a top-tier or elite institution. It is truly thriving. Image courtesy of Presenter Media.
If the total sub-factor score is +3 or more, the assigned factor score is +1. An institution with a score in this area is doing well and should be considered to be thriving.
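For readers who like to see the arithmetic spelled out, here is a minimal sketch of the calculation above, written in Python. The sub-factor names and the function name are my own illustrative labels, not official terminology from the Baylis/Burwell model.

```python
# A minimal sketch of the Guiding Principles factor scoring described above.
# The sub-factor keys are illustrative labels, not official model terminology.

def guiding_principles_factor_score(sub_scores):
    """Map four sub-factor scores (each -1, 0, or +1) to a single factor score."""
    total = sum(sub_scores.values())
    if total <= -3:   # in trouble and possibly dying
        return -1
    if total >= 3:    # doing well and thriving
        return 1
    return 0          # just hanging on; surviving

# Example: adequate statements, behavior that barely meets them,
# a spotless compliance record, and a stellar reputation.
example = {
    "quality_of_statements": 0,
    "behavior_matches_values": 0,
    "compliance_record": 1,
    "reputation": 1,
}
print(guiding_principles_factor_score(example))  # prints 0 (surviving)
```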
Among the institutions we have examined, we have found a predisposition away from the thriving side of the scale. This should not be surprising. Most observers will readily say that the overwhelming majority of colleges and universities are either in trouble or just surviving. There are few elite, or top-tier, institutions that are truly thriving.
Next Tuesday, March 19, I will take a break from this series of posts on Key Performance Indicators and publish a special post inspired by the scores of birthday wishes that I received this past week. It may be unusual to throw a big celebration for someone's 73rd birthday. However, after a series of traumatic brain incidents more than a decade ago, scores of doctors wouldn't have given a plugged nickel for my chances of making it to my 73rd birthday. Thus, I will publish a post celebrating an unexpected decade of extra life. What would you do with an extra decade of life?
Inactivity, inattentiveness, and other bad business practices lead to the failure of any organization. Image courtesy of Presenter Media.
As noted in the previous post, Key Performance Indicators – Part II: Definition, the theme of this post was going to be A University Should Be Managed as If It Were a Business. In all of my previous roles, whether as a university administrator or as the creator of this blog, I have made no effort to hide my sentiments concerning this proposition. It has always been one of my operating premises.
When the wheels fall off an organization, it will fail to run. Image courtesy of Presenter Media.
In 2016, I started a series of posts on the theme The Business Model of All Higher Education Is Broken. Even in the title of the series, I attempted to make the point that institutions of higher education (IHEs) must view themselves as business enterprises. As an academician, I believe that institutions of higher education must be more than businesses. However, if they don't operate using the best business and management techniques, then they will surely fail, which is what we have seen with the 2,000 American IHEs that have closed since 1950.
Too many universities live in a fantasy world chasing rainbows, leprechauns, and illusory pots of gold. Image courtesy of Presenter Media.
Milton Greenberg, in his seminal essay “The University Is Not a Business (and Other Fantasies),” published in EDUCAUSE Review, vol. 39, no. 2 (March/April 2004), argues forcefully that a university should be managed as if it were a business.
Very early in his essay, Greenberg proclaims, “Presumably, a ‘business’ involves the hierarchical and orderly management of people, property, productivity, and finance for profit.” The primary counterarguments of academicians to Greenberg's position hinge on three concepts he introduces in this sentence: hierarchical management, productivity, and profit. In three future posts in this series, I will separately tackle each of these counterarguments.
Eight Factor Model of Institutional Vitality developed by By Baylis and Ron Burwell. Image copyright by Higher Ed By Baylis, LLC. Image courtesy of By Baylis and Ron Burwell. Constructed using ClickCharts Software
But first, I return to my argument for why universities should be run more like businesses. In studying the 2,000 deceased IHEs, Ron Burwell and I noticed eight factors that we believe undermined their vitality and contributed to their eventual morbidity. The eight factors are shown in the diagram to the left.
Although the eight factors are obviously not completely independent of each other, they are sufficiently different to warrant separate consideration. Additionally, that consideration would take up too much space for one blog post. Thus, I will address each of the factors in upcoming posts.
Mind Map of the Guiding Principles Factor. Image courtesy of authors By Baylis and Ron Burwell. Constructed using ClickCharts Software.
To give you a taste of how I will be introducing and treating these factors, I present a Mind Map Diagram on the right illustrating the three components which define the Guiding Principles Factor.
Under each of the three components, the diagram presents the major ingredients that go into measuring the success of the organization in that component.
A reasonably informed person weighing the evidence should be able to make an informed judgment. Image courtesy of Presenter Media.
In order to simplify our study, without losing the crux of discovering why institutions failed, we have chosen to use the straightforward three-point scale of Thriving, Surviving, and Dying. Instead of attempting to construct complicated, quantitative scales to measure each subfactor of our eight factors, we are going to use a subjective approach similar to Supreme Court Justice Potter Stewart's take on defining pornography: “I may not be able to define it. But I know it when I see it.”
Vitality/Morbidity Index (VMI) Gauge indicating an institution is greatly struggling in the Guiding Principles Factor. Image courtesy of Presenter Media
For each component within our factors, most reasonable observers can easily determine whether: 1) the organization is extremely successful and thriving in terms of that component, 2) it is just barely getting by and only surviving, or 3) it is failing badly, falling far short of success, or flat-out dying. This approach permits us to use a simple gauge to illustrate the vitality/morbidity level of an institution.
We also associated a three-point numerical scale with our three categories: Thriving (+1), Surviving (0), and Dying (-1). We then added the scores across all eight factors. Repeating this process for each institution in our database of closed colleges and universities, we were not at all surprised to find that the total score of each closed institution was negative. No closed college had a positive total score. Some individual factor scores were positive, but they were outweighed by a much larger share of factors with negative scores.
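To make that summation concrete, here is a minimal sketch in Python. Only Guiding Principles is an actual factor name from our model; the other keys are placeholders for the remaining seven factors, and the scores shown are invented purely for illustration.

```python
# A minimal sketch of a total VMI computed from eight factor scores,
# each scored Dying (-1), Surviving (0), or Thriving (+1).
# "guiding_principles" is a real factor name; the other keys are placeholders,
# and the scores are hypothetical.

factor_scores = {
    "guiding_principles": -1,
    "factor_two": 0,
    "factor_three": -1,
    "factor_four": 1,
    "factor_five": -1,
    "factor_six": 0,
    "factor_seven": -1,
    "factor_eight": 0,
}

total_vmi = sum(factor_scores.values())
print(total_vmi)  # prints -3 for this hypothetical institution
```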
If this model is to have predictive capabilities, it must also work with all types of institutions. We have tried our model out on a number of institutions that we identified as thriving, surviving, or outright struggling.
Is the wrecking ball set to knock down your institution? Image courtesy of Presenter Media.
In this process, we did find a number of colleges that were still operating which had negative VMIs. In each of these cases, the colleges involved could easily be classified as struggling or just barely surviving. They were definitely not thriving.
Although I believe that it is difficult to “kill a college,” it is not impossible. Just ask the constituencies of Newbury College (MA), College of New Rochelle (NY), Green Mountain College (VT), and Hampshire College (MA).
For the institutions that we identified as thriving, just as we expected, each had a positive total VMI. What about the struggling institutions with positive VMIs? We believe that these institutions must address the factors that are negative or zero, or they could be heading for more serious trouble.
In the post above, I outlined several different directions that I could take with my next post. At this point, I am working on a post that delves more deeply into the VMI Factor of Guiding Principles, which I introduced in this post. Watch for it next Tuesday, March 12, 2019.