
Target Practice

Many graphics on the pursuit of results have the look of a target with a small bullseye at the center. Indeed, my colleagues and I called our book Outcome Funding…a new approach to targeted grantmaking and, sure enough, put the traditional target image on the cover. I am now questioning this way of looking at aiming points. In the vital area of inclusion, diversity, and equity, for example, a much broader canvas seems needed. Perhaps hits on the periphery should score as many points as hitting the center. The point is to recognize complexity and different perspectives as what must first be achieved, along with the sense that we need to loosen up to see more possibilities rather than narrowly focus on a pre-set victory shot. I will henceforth keep the bullseye for very specific gains and limit its use when vision must broaden rather than narrow.


Never the Twain

It’s tough to learn from mistakes you never made.

Mark Twain

While I spend hours tapping out prose to make and amplify thoughts, I keep finding that short zingers like this one get to the point far more quickly. Further, the best are actionable without further ado. Suggestion: think of or discover a quote that you like. Ask yourself what you could do next week to express its core meaning. If you can’t think of anything, try a new quote. Email me if you want a good batch that I have found not just wise but useful. My way of expressing this one is to require myself to identify one screw-up a week and ask what I will do differently next time. They are all different. Never the Twain shall meet.


Human-Centered Design: Part 2

Last week, I reprinted the first three principles my client Dave Haney described for Human-Centered Design. Continuing from his newly published article in Inside Higher Ed, “In times of crisis, colleges should ask different questions than they do in a traditional strategic planning process”:


No. 4: Engage in prototyping. Too often institutions spend months planning a major initiative and then roll it out with great fanfare, not knowing whether it will produce the intended results. When possible, it’s better to prototype by implementing a small-scale, low-risk version of an initiative that can test the critical concepts involved and allow you to readjust according to what works and what doesn’t. For example, instead of launching a new degree program, start with a badge or certificate and carefully examine how it plays with students. A prototype can also be a simulation: before creating a new enrollment office, build a mock-up (physical or virtual) and run students and staff through a simulated set of enrollment interactions. This approach can help create a culture of continuous improvement in which new ideas are constantly tested, evaluated and revised.

No. 5: Resource the early adopters, and let consensus follow later. The downfall of many strategic plans is that everybody agrees with them at the outset. If that’s the case, then it is probably too general and probably looks like everyone else’s plan because it represents the lowest common denominator.

In results-based strategic design, institutions instead provide resources, often minimal, to individuals and groups so that they can try things (prototyping), and then consensus is built around successful or promising results, not prior agreement. (From a slightly different perspective, the higher ed consulting firm CREDO also advocates abandoning consensus as a goal for the “new university.”) The Rensselaerville Institute refers to community members who are energetic early adopters as “community spark plugs.” You know who they are on your campus, and they may be administrators, faculty members, staff members or even students — where they are in the organizational chart is often less important than the energy, creativity and attitude they bring to the table. When other people see that the spark plugs are getting the resources, producing results and having more fun, the number of early adopters will grow.

No. 6: Don’t try to do everything. Too many strategic plans try to cover everything an institution does and therefore sink under their own weight. I prefer Hal Williams’s definition of strategy: something is truly strategic only if it requires a behavior change when business as usual won’t accomplish the desired results. For example, one institution included as a strategy within its plan a review of the food service and facilities contracts with external vendors. Do you really need a strategic plan to tell you to do that? If such reviews are not part of business as usual, then you are looking at problems that are not going to be solved by a strategic plan.

Instead, focus on the things that require major behavioral changes. For example, one institution increased both efficiency and organizational health by changing siloed behavior in administrative offices. Cooperating with other offices to ensure student success became a specific job requirement at every level of the institution — a result that would be evaluated in performance reviews and lauded when it succeeded. That was truly strategic, because business as usual required a sharp behavioral change. Rather than spending the five years of a strategic plan checking off boxes toward the plan’s completion, it is more effective to adopt a strategic design with recursive cycles of prototyping, learning and improvement.

When I led a strategic design process in 2017 as a college president, and the steering committee had completed its preliminary design for the institution’s future, an initially skeptical faculty member gave the process an appropriate endorsement: “This process was messy as hell, but the result is good.” The times are even messier now, which makes it even more imperative that we design the future of higher education rather than simply try to plan it.

David P. Haney is the former president of Centenary University. He and Jeremy Houska, director of educational effectiveness at the University of La Verne, will present on results-based strategic design at the SCUP 2020 Virtual Annual Conference, sponsored by the Society for College and University Planning. For more resources on results-based strategic design, see davidphaney.com.

I’ll comment on how I use these six principles with remarkably easy steps in my next blog entry. Stay safe.

Hal


Human-Centered Design: Part 1

In times of crisis, colleges should ask different questions than they do in a traditional strategic planning process. My client and friend Dave Haney wrote an article just published in Inside Higher Ed. Here is a long excerpt that speaks to our work together and covers the first three of his six principles of results-based strategic design.

Next Tuesday, I will post the other three principles defined in the article. Please email me if you want to see the full article, including the introductory framing paragraphs and the links included in the text.


In times of crisis, colleges should ask different questions than they do in a traditional strategic planning process

By David P. Haney

…“Human-centered design” is now used worldwide for designing everything from organizational pivots in corporations to microloan programs in developing countries, often through the influential work of IDEO, whose chairman Tim Brown wrote Change by Design: How Design Thinking Transforms Organizations and Inspires Innovation over a decade ago. When I used this approach to strategic planning as a college president, I added to the mix a sharp focus on outcomes rather than activities, based on the work of Hal Williams, former CEO of the Rensselaerville Institute. I’ve been fortunate to work with him in higher education administration, and he has helped me see how, despite the recent emphasis on outcomes assessment, higher ed is still burdened with a focus on activities that should be changed to a focus on results.

For example, why do we count student community service hours when we could be documenting the results of students’ community service work? Why do syllabi still list activities to be undertaken instead of results for students to achieve? Why do we have meeting agendas that list the topics to be covered instead of the outcomes we want to see? Why do job descriptions list expected activities (slavishly described as “duties”) instead of what employees should be expected to accomplish? (In fact, if working remotely, where activities are relatively invisible to colleagues, continues in popularity, outcomes may provide the best and perhaps only way to measure employee performance.)

The combination of human-centered design and Hal Williams’s outcomes focus produces what I call “results-based strategic design.” Here are six of the basic principles of this approach and how they can apply to higher education. Much of this involves asking questions that are different from the ones asked in a traditional strategic planning process.

No. 1: Recognize that planners plan and designers solve problems. Instead of asking, “Where do we want to be in five years?” it’s better to ask, “What problems do we need to solve?” That helps shift the focus from what we by definition can’t know (the future) to what we can do (solve problems and produce results).

For two reasons, it’s not always easy to identify the problems. First, we often jump to potential solutions before defining the key issue. For example, “Our enrollment is too low” does not state a problem. Increasing enrollment is a solution to different potential problems, such as unused capacity or, most commonly, an operational deficit. Increasing enrollment may be a solution to a deficit, but it may also drive up the discount rate and create additional expenses, so it may not be the appropriate solution, or it may need to be considered in concert with other solutions. As long ago as 2015, some colleges decided to address financial problems by shrinking rather than “chasing volume.”

Second, we worry too much about what designers call “gravity problems”: issues that are not really problems because, like gravity, they are going to be there no matter what. For example, current demographic trends that reduce applicant pools are not problems but rather inevitable facts. A low yield — too few accepted students who enroll — can be fixed, and the pool can be increased by looking in new places. (For example, people who would benefit from what you offer but don’t know it yet.) But the demographics are facts to be dealt with, not problems to be solved. Balance is key: some leaders resort to firefighting mode and jump to solutions too quickly, while others demand to understand all the variables before acting, and their response is too slow.

The difference between a designer and an engineer is that an engineer has a problem with a single solution: you need to get people across a river, so you build a bridge. Designers solve “wicked” problems: multiple and sometimes ill-defined problems that have multiple solutions. Higher ed is clearly rife with wicked problems. The problem-solving mentality can filter through the entire process. For example, instead of a strategic planning committee focusing on curriculum, create a design team to identify specific problems in the curriculum and create solutions.

No. 2: Use constraints to encourage creativity. Designers have learned that truly innovative and useful results come not from “blue sky thinking” but from working within a particular set of constraints. A smartphone can only be so big and cost so much, or it won’t sell. The familiar and new constraints in higher ed — changing demographics, increased competition, public skepticism and now the disruption of an as-yet unknown number of semesters by COVID-19 and the resulting human and economic consequences — need to be seen not as obstacles to planning but as catalysts for creativity and innovation. The three general constraints on new initiatives that design thinking identifies, and that can spur creativity, are: 1) viability (can it be sustained long-term?), 2) feasibility (do we have the capacity, tools and know-how to do it?) and 3) desirability (does it fit our mission and can we embrace it as an institution?).

For example, an enrollment-related result for one tuition-driven small college in the Northeast, with that region’s declining college-age population, was to attract, retain and serve students who didn’t know they would benefit from attending college in general or this institution in particular. This is potentially viable because it recognizes the decline in population while building on the fact that more students in that smaller pool need what this institution has to offer. It is feasible because there are many ways for an experienced admissions staff to reach out to new areas and kinds of schools. (For example, this college started working with technical high schools and inner-city college-readiness programs.) And it is desirable because it will increase revenue as the institution continues to do what it does best — as opposed to simply lowering standards, trying to increase geographic reach or pursuing other enrollment-enhancing techniques. Keeping these constraints in mind also makes it much easier to link strategic design to resource allocation, since both viability and feasibility depend on resources.

No. 3: Determine constituents’ needs, which may not be what they say they need. This is what is called the empathy stage in design thinking, in which you observe people’s behavior in order to find the best solutions. It’s not enough to ask them what they need; as Henry Ford probably did not say, but is often quoted as saying, “If I had asked people what they needed, they would have said ‘faster horses.’” When your students complain that they face a byzantine bureaucracy, follow some of them around as they leap through the registrar’s and the financial aid offices’ hoops. Then simulate potential solutions. Or if your value proposition is not getting out through admissions and marketing, observe students’ and potential students’ responses to current and potential new messages.

I once embedded myself with a summer leadership camp for entering students and discovered that many of the reasons for their choice to attend our institution had nothing to do with what we said in our expensive marketing materials. This is not treating students as customers within a corporate model but simply respecting them as users of the services we offer. (An entire subdiscipline called user experience or UX has occasionally been recommended for higher ed planning.) Especially now that our students are changing from a traditional 18- to 22-year-old cohort to a constituency of all ages with varying and complex life situations — and will be emerging from the trauma of the pandemic with a host of new and different concerns and needs — we should carefully observe the quality of their experience. We all pay lip service to the needs of the students we serve, but strategic plans still tend to focus on the self-preservation and growth of our institutions.

David P. Haney is the former president of Centenary University. He and Jeremy Houska, director of educational effectiveness at the University of La Verne, will present on results-based strategic design at the SCUP 2020 Virtual Annual Conference, sponsored by the Society for College and University Planning. For more resources on results-based strategic design, see davidphaney.com.


Come back on Tuesday for the other three points Dave makes and how I use them with many organizations.


Leading By Example

Suzanne McLeod was Superintendent of Union-Endicott Central Schools in New York State. She participated in a program of School Turnaround called Get to Great. A key to this approach, which helps good schools get great results, is a set of prototype projects. While others may study and plan (ready…aim…aim some more…aim some more), Suzanne fired and then aimed, in a way that prompted over 15 students who would not have passed courses that year to do so. That is enough to move the needle in the population of those most at risk of dropping out. I urge you to read the short case on “No Tiger Left Behind” below. It is a surprisingly low-tech approach that shows how the human touch can make all the difference when concentrated on a specific group of kids at high risk. The project took half an hour of the superintendent’s time per week, less than the time spent watching just half of a sporting event or concert. And the difference for achievement in the school was profound.


A write-up by Hal Williams, Outcome Guide

This is the prototype project of Suzanne McLeod, Superintendent of Union-Endicott Central Schools. Like everyone in Get to Great, Sue did not lead change by edict or exhortation. She led by example.

The Target

A goal of this district is to drop the dropout rate—dramatically. And no group drops out at a faster rate than those students who are returning from involuntary absence. These are youth who had major discipline and behavior issues resulting in a Superintendent’s Hearing. At this district as well as others, the eventual dropout rate of students in this category is very high—often well above 90%.

Sue asked what first predicted that the re-introduced students would not make it. The answer was course failures, so she set as her target that all of the 20 students in this category would pass all courses in the 2010-11 school year. No Tiger would be left behind.

The Data

Sue first needed the list of students in the district who had been sent out involuntarily and a way to update it as students left the district after reentering. In Get to Great, any target goes from percents to numbers, and from numbers to names. She then concluded that she needed this data to be able to help students pass all their courses:

–continuing information on their levels of achievement, including grades. She could not wait for quarterly assessments.

–insights from teachers, who can often see early signs (e.g., disengagement) long before tests are reviewed.

–attendance and any disciplinary, medical or other challenges appropriately noted.

–these students’ attitude and motivation to pass courses, which she wanted to see and hear directly from them.

Lacking that elusive and comprehensive “database,” Sue simply moved forward with all she needed to track the progress of these 20 students: a binder. All of the above data went into it continually, and it became the source of needed information in portable, sharable, flexible form.

The Action

Sue began with the assumption that while classroom teachers could pay attention to these high-risk students, they could not reasonably shift limited time from other students to provide the help they would need. At least for the prototype period, her conclusion was that the intervention would not be a new approach, a new program, or a new policy. The intervention would be her.

As she met with the students, it was clear that they had multiple challenges, beginning with attitude, skills and knowledge shortfalls, and, often, a lack of parental engagement. Her view was that she needed a way to help these students begin to achieve in real time, not after a remedial period which would further weaken morale and any sense of possibilities. She literally did “whatever it takes,” providing as much strength and force to the relationship as was needed. She saw kids in the hall, after and before school, and at all times by email or phone. She learned to keep a low profile as far as other students’ view of these relationships was concerned. She also developed close collaborations with other administrators, guidance counselors, and teachers who often intervened before Sue had to!

Sue notes that she is well aware that her position as superintendent made it difficult for anyone to say no to her. That she simply puts in the plus column. If she has that power, she should use it for student achievement, not for showing up at school events.

In a few cases, this was resisted, but in a positive way. “What do I have to do so that you don’t keep bugging me?” Sue told them and found this a perfectly acceptable motivation to pass courses!

Three students (two 9th graders and an 11th grader) spoke with me to give their insight. All said that Sue’s direct connection to them during the year made a clear difference in their passing classes. The common threads:

–They really took notice because this was the leader—who, as one student put it, “has so many other things to be doing beyond talking to me.” Two said it would not have made such an impression if it was a teacher or someone else in the building.

–They were surprised at the first visit (which began with a call to go to the principal’s office where Sue spoke with them). They found this memorable in part because it was scary. “What did I do wrong?” This quickly turned to other emotions (which I sense they could not readily characterize) when they saw that her only point was to help them pass all their courses.

–Beyond the specifics of questions she asked was the theme that Sue cared a lot about them. She knew their grades and their status on “stuff” due for classes before speaking with them. While the visits came only 3-4 times per year (that they remember), they were clearly enough to maintain the sense that she was watching them.

–They told none or very few of their friends about the superintendent’s interest in them but did not seem to find the interactions embarrassing in any way. They understood and agreed with the purpose.

–All three believe they will pass their courses this year and that they would clearly not have done so without Sue’s involvement with them.

Two of the students are concerned that if they do not continue to get some kind of watching next year, they might relapse into problems. This raises a good question from the prototyping: how much intensity and duration of such an intervention is needed to ensure that a positive change is sustained? As principal Steven DiStefano puts it, these kids are just over the divide where they tottered between academic success and failure. It is of great importance to keep them there.

The Results

Sue now projects that at least 90% of her students will pass all their courses and notes that data from previous years suggests no more than 10% would have done so. With 20 students, that’s a “save” of 16 students. Given that course failures are the leading predictor of dropping out, it is reasonable to see this as at least 14 dropouts prevented. The cost of just one dropout is over $1 million in lost earnings for the person and an even higher sum in added social service and health costs.
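To make the arithmetic explicit, here is a minimal sketch of the projection. The 90% and 10% pass rates and the 20-student cohort come from the case; the step from 16 additional passers to “at least 14 dropouts prevented” is the write-up’s own conservative judgment, not a formula.

```python
# Minimal sketch of the projection arithmetic in the case above.
# Pass rates are from the write-up; the dropout-prevention figure
# is a conservative judgment in the case, not a computed value.
students = 20
projected_pass_rate = 0.90   # Sue's projection for this cohort
baseline_pass_rate = 0.10    # historical expectation for returning students

projected_passing = round(projected_pass_rate * students)  # 18 students
baseline_passing = round(baseline_pass_rate * students)    # 2 students
saves = projected_passing - baseline_passing               # 16 additional passers

print(f"Additional students passing all courses: {saves}")
```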

Sue, like many prototypers, finds examples more compelling than statistics. Here are a few of the students whose progress overwhelmed her:

  • One young man – a 7th grader – is not only passing all of his classes, but recently brought his father to a Research Symposium sponsored by our select High School Science Research program, where our top science students work with university research professors and compete in competitions such as the Intel Prize. This young man’s goal: to participate in this program in high school.
  • One young woman, who has returned after two long-term out-of-district suspensions, is now passing all of her classes and aspires to become a physician. Sue and her team believe this is a realistic and achievable goal for her.
  • Three other students, all passing their courses, set their own prototype goals for the 4th quarter: a 90% average or better.

Gains are always relative to costs. In this case, Sue projects that this project took an average of 30 minutes a week of her time. Counting start-up, call it 3 hours per month on average. That is no more time than is spent attending one sports event or community meeting per month. Sue is clear about which is more helpful to increased academic achievement.


What happened to results in considering performance?

A recent Washington Post article (6/10/20) looked at how and why funds meant to use farm surplus to feed hungry Americans led to awards to a San Antonio event planner and a health-and-wellness airport kiosk company. I know of one key reason: results, and the likelihood of their being achieved, were not of the highest priority in selecting groups. When asked by the Post, USDA furnished the selection criteria for awards: “Proposals for the Farmers to Families Food Box Program were evaluated by, in descending order of importance, the technical information contained, the prices offered, past performance of the offeror and the offeror’s ability to perform.” Wow. Hiring a consultant to provide technical data and projecting a low price is more important than what the group has done in the past and its capability to perform now. Something sure is backwards here.

And this is not a solitary example. In government procurements, the needs statement and program description supplied by a grant writer can add up to more points than the number of people who get a defined gain. And with some philanthropies, the program’s fit to mission and the number of partners listed can outweigh the points allocated to results.

Two additional problems jump out from this observation. The first is that in some assessment frameworks there literally is no category of results. A recent newspaper column reported on the grade given to a superintendent. The assessment had four areas: leadership, high-quality instruction, continuous improvement, and communication. None includes how many students reached grade level in reading or math.

The second problem is the confusion between form and substance. I was just asked to review a foundation application whose criteria treated the following question as the best way to include results: How clear and complete is the proposal on program outcomes? Not “are the results high enough to be worth the grant?” Just “are they clear?” I also see this issue with evaluation factors that give points for the clarity of the evaluation plan. I learn that the group has hired an evaluator and will use a survey, but I do not know whether this captures opinions or actual changes in behavior. Again, the process focuses on the form, not the content.


Riding for your Brand

Marc Chardon, the bright and inquisitive former leader of Blackbaud Inc., and I wrote several pieces together that will be featured on this site. One theme we call “Riding for the brand.” Brands, we note, do not establish a class or cohort of groups. They exist to define and protect the distinctiveness of an organization, whether a cattle ranch in Wyoming or a drug treatment organization in New York. Where’s our beef? It’s underneath our brand!

As foundations move from funding programs to investing in results, they shift from sprinkling money among many organizations to investing more in those organizations that achieve the strongest human gain for the dollars available. Yet when I ask many nonprofits where they think they stand on achievement relative to other groups in their field and geography, most do not know. And a few shrink from the question, suggesting that answering it means having to at least implicitly call attention to differences with colleague groups. My advice is to get over that. You need a way to move from blending in to standing out.

There is no need to badmouth other groups. You simply show your results relative to those of other organizations taken as a whole. You let the viewer draw his or her own conclusions. And if you are interactive with other groups to the point that shared action determines your success, say so. Be clear about how collaborations with other high-achieving groups let you together create results well beyond the sum of your separate activities. Then supporting each party to the collaboration makes good sense to investors.


Patient Activation

A cutting-edge movement in medicine—and especially in Community Health Centers—focuses on what patients can do to manage and improve their own health. It starts with the assumption that the choices people make in such areas as exercise, diet, smoking, and alcohol are as consequential to health as all of medical technology. This moves medicine into the change management field and toward understanding how to help patients move from passively receiving services to actively managing their health. In many cases, the inability of people to create success for themselves is seen as associated with low self-confidence and self-efficacy. People have to know that they can change their behavior and that the change will lead to better health before they will take great initiative.

In many social and human service areas, activation of clients, consumers, or customers (by whatever name) would seem to fully apply. Do participants see themselves as receiving a service or as getting help to make intentional change? I am impressed with one promising step that takes the idea of setting targets from organizations and programs to participants. The national program called School Turnaround, a division of The Rensselaerville Institute (TRI), uses an intervention approach to reverse decline in failing schools. One of its tactics in the strategy around compelling outcome clarity is student target-setting. Each student in the school knows what he or she is trying to achieve. I want to learn 5,000 new words…I want to learn how to spot the main idea and separate it from the theme of a book I read…I want to know how to make inferences—not just what is written but what it means. And—yes—I will move from a score of 63% to 78% on this test, and here are the October and December benchmarks I will reach.

When you go into a traditional classroom and ask students what they are doing, you hear that they are reading a story. When you go into a School Turnaround classroom, you hear that students are trying to learn how to spot the main idea on each page. What a difference intentionality makes! How purposeful are the participants in your programs? Are they sitting through the session or are they looking to achieve something in each and every hour with you? Perhaps “patient activation” can help.


Attributes and Achievement

I keep seeing writings that speak to the power of attributes in forecasting achievement in most jobs.  The latest was sent by a colleague at the Humana Foundation who had read a piece by Jeff Nally, head of the Nally Group, which focused on learnings from Dr. Paul Brown, a professor at the Monarch Business School in Switzerland.   He summarizes Dr. Brown: “We’re hiring for the wrong things when we focus primarily on knowledge, skills, and experience. We are really hiring energy, not individuals. We are hiring the unique ways a person uses information, energy, and relationships to achieve outcomes.”

Over and over I see the value and primacy of energy and other attributes over education and experience as predictors of high achievement. Another example is a re-read of Grit by Angela Duckworth. The many studies she cites show the relationship between grit scores and achievement in colleges and many other places. It’s $9.88 on Amazon and is a great investment in a very readable book. Which is more important to you in sizing up a person: their master’s degree or their grit? This book will help you answer that!


The Achievement Lock of Education

Marissa Dobbert is a math teacher at a charter middle school in Sarasota, Florida. She was selected as the 2020 Middle School Teacher of the Year in Sarasota County. I used her as an example of tracking milestones in a January workshop for nonprofits in Manatee County.

I read about Marissa in an article in the Sarasota Herald-Tribune and called her. Being in the Results First business, I began by asking her how her students achieved academically compared with other students. In Sarasota County, 55% of students in the lowest-performing 25% made gains on the Florida State Assessment (FSA) in the last reported year. In comparison, 67% of her students made progress.

Marissa reflects the characteristics of sparkplug leaders to a T. First, she is drawn to challenges. She taught advanced courses and could not wait to get back to working with struggling students. Second, she is highly energetic, noting that she stands on chairs and moves quickly—bouncing around the classroom to create and sustain attention “like a crazy person.” Third, she is very focused on achievement and has two highly specific steps she considers essential to achieving it.

First, Marissa focuses on creating a connection.  Her sense is that calming fears of students who have never succeeded at math and creating a connection between her and the student is an essential starting point.  In my parlance, engagement is her first milestone. She is clear that no amount of teaching will make much difference until she can see and hear some level of connection between her and each student.

Second, Marissa forgoes dutifully teaching the full-to-the-brim math curriculum. While she is held to the same standards of academic achievement on the FSA, she teaches at a charter school that gives her some flexibility in approach. She uses that to concentrate on what she sees as the most important skills. Her assumption is that knowing all math content is less helpful than knowing the small number of essential tools that let students handle problems in most, if not all, content areas.

Marissa also uses problems to which the kids can relate. Rather than presenting textbook questions as written, she translates them into the kinds of situations she and her students face. This gives not just context but motivation.

There you have it. The procedural lock in social and educational programs is to get through all prescribed content. The achievement lock is to change the process and hold results as the constant.