Introducing CASE Insights™ Framework for Brand and ...
Recording
Video Transcription
Good day, everyone. How are you? We're so glad to see all of you here. I'm Terry Flannery, and I'm CASE's Executive Director. Excuse me. I changed my title. I'm CASE's Executive Vice President and Chief Operating Officer, and I am one of two presenters who will be sharing some high-level ideas about this new framework for measurement in brand and reputation today. We're glad that all of you are here. I'm joined by my colleague Jane Shaffer. Jane, would you like to introduce yourself? Hello. I'm the Vice President for Membership, Marketing and Communications here at CASE, and my role is global, although I'm doing this session from our UK base in Europe. Thanks, Jane. I want to thank some other participants for the webinar today. First, I want to thank our hosts for this webinar, who are the Executive Directors for the U.S. and Canada region of CASE, and they include Chase Moore and Grant Cullet. Thank you both for being our hosts today for this webinar. Chase, would you like to say a word? Absolutely. It is so exciting to see so many of you joining us today from across the U.S. and Canada, colleagues, volunteers. This is such an exciting time for CASE, and we are thrilled that you carved some time out of your day today to be with us for this very important conversation. Terry? Thank you, Chase. Finally, we want to welcome Christy Moss, who's one of the 10 brilliant thought leaders who helped to co-create this framework for measurement for folks who work in marketing, communications, and education. Christy is the Vice President for Membership and Marketing at the University of Illinois Alumni Association. Christy, want to say hello? I would be delighted to. I am so thrilled to have participated in this with the other thought leaders, and I am so glad that you could join us today to hear more about this truly inspiring work. Thank you. Thanks, Christy. You'll hear more from her in a few minutes. Let's start with the purpose of this session.
Jane, you want to take this one? Yeah, that'd be great, Terry. Thank you very much. Thank you again for coming along and joining us today. It's going to be an exciting and informative session. What we're going to do is introduce the framework to you in a level of detail that we think is helpful. We've done this presentation now live to quite a few audiences around the world, and people really appreciate understanding a little bit more about what's in the framework and kind of lifting it off the page. We'd also like to be able to receive some feedback from you live, if we can, and get your reactions to what you've enjoyed about the framework and any challenges you may have, and we'll do that live in the session, and also invite examples of measures from within the framework from you. We're going to do some of that live, and we're also going to give you some tools afterwards to be able to feed directly into what we're calling crowdsourcing some of the measures that we're going to be looking at, and then tell you about what our next steps are, so you can keep yourself informed, maybe make some dates for the diary, and find out how to get involved. Thank you, Jane. Let's kind of do an overview of what we're trying to achieve with the introduction of this framework. Really, we hope to eventually have this lead to two really important outcomes. One is to enable and standardize measurement of performance in our work. To our knowledge, there are no existing standard measures of performance in marketing and communication in the education sector. There obviously are in lots of other sectors, but because our work is organized in such unique ways, and we work in a culture that's so distributed in terms of autonomy and responsibility for things like marketing and communication, it tends to be much harder to do. This is a first-of-its-kind framework that attempts to scale that mountain, which we're very excited about.
When we collect and standardize a set of measures, we will begin to be able to provide benchmarks, so that all of you, if you submit your information from your institution, will be able to compare how you're doing to other institutions that look like yours. Those are really the long-term outcomes of this project. We think the reason that it's such an important thing for CASE to do is related to our strategic intent. CASE really aims to define and disseminate the competencies and the standards for our profession, which includes a variety of integrated advancement functions. You all know that CASE serves the professional needs of people that are doing work in fundraising, in alumni engagement, in advancement services and operations, and in marketing and communications. We really consider all of those disciplines part of integrated advancement. Many of you may know that we already have a long history of establishment of standards in the philanthropy sector. In the last six years, our standards and measures for alumni engagement, the Alumni Engagement Metrics, have grown to collect data from several hundred institutions. I think we're up to over 800 institutions that submit data to us through those measures, but there hasn't been something in the marketing communication space. So using our experience and our thought leadership and the convening power to bring really smart people together to figure out how to do this, we're going to finally cross this barrier and come up with some new measures. So the full report, Jane, would you just talk a little bit more about how people can get the full deal? Yeah, so obviously there's a lot of information behind what we're going to talk about today. So there is an opportunity for you, if you're a member, to log on to your CASE account. You can use the QR code to link you through to that. And if you have problems with your login, you can email us at membership at case.org.
If you're a non-member on this call, you're very welcome and we're pleased to have you here with us. Access to the report is by joining as a CASE member, and then you can download the report. And there's a link to that today. There are versions on there in US English, British English, and there will also be one in Spanish as well. So if you need something in those languages, then they will be there. Thank you, Jane. And I want to thank our Director of Online Learning, Christy Grimm, who's supporting us today. Thank you, Christy. We'll include a link to the report for you in the chat. Okay, so now that you know where to find the report, let's talk about kind of the basics. So we have organized this framework of measurement into six broad categories of functions and three levels of measurement. And I want to talk a bit about each, but I want to talk about the group that created this first, because I think it's their expertise from such a wide group of experiences that allowed us to come up with this very relevant framework. So we convened a group of 10 thought leaders, Christy's one of them, who represent expertise in both marketing and in strategic communications. Some of those experts obviously span expertise in both areas, but really deep experience in both strands of the work. We have members from this thought leadership group who work in schools, independent schools, as well as colleges and universities. We have members of the group who work as leaders in agencies that serve educational institutions, as well as practitioners. And we have folks who've contributed from three of the four CASE regions. So three of our members are from institutions in the UK and in Australia. So we're really excited that we had that kind of wide view to help create this framework. And the group came up with what they thought really is the best place to start. So to begin with, we've decided that we can't comprehensively establish measures in everything we do.
We need to take this a few bites at a time. And so we started with the six broad categories of work activity that we do in marketing and communications in education as the place to start. These are the most common elements of our work across all of those sectors that I just talked about. And so we've got categories of work where there are outcomes to measure in brand development, in recruitment and retention, in strategic communications, in alumni engagement, in philanthropy, and in external engagement and public affairs. And you'll recognize two of those that have direct overlap with our existing measures in the alumni engagement and philanthropy areas. But in the others, we're creating and crowdsourcing measures that fit within this framework that don't exist in the education sector right now. Together, they represent the beginning of measurement of brand and reputation performance in education. The second thing I want to talk to you about is the levels of measurement. So one of the things that's really been challenging for us to think about is how to create something that will allow all of our members, no matter where they are in their schools or institutions, thinking about the purpose of this work, whether it's the most basic sort of view of marketing and communication as promotion, or its most sophisticated, where your institution and its leaders recognize the strategic value of marketing and communications as a function that builds value, revenue, reputation, relationships, right? If you are at either end of the spectrum on those things, you might have very different tools and very different investments that represent your institutional context. And so where you start to measure probably matters a lot based on those differences. We've tried to create something that will allow you to start no matter where you are and no matter what your institutional context.
And so if you think about the levels of measurement on a continuum from reporting to analysis to insights, you'll get kind of the basics of what each measure represents. So reporting is really a distillation of data into a format that's relatively easy to understand, but it's probably less actionable in informing marketing and communication strategy. Measures like this basically report what's happened, but maybe not the why behind it. So that's a set of measures in these six categories of functions that will be quite basic, but begin to describe what happens. The second level is analysis, and these measures build on reporting and add interpretation of data in ways that are useful to evaluating a specific approach. So well-interpreted data explain behaviors, perceptions, and relationships between data that inform your next steps. In a lot of cases, you begin to make connections between behaviors that are proxies for real underlying change in the perceptions that drive brand and reputation, right? So those are kind of the measures in that second category of analysis. And the third level of measurement is insight, and these build on reporting and analysis, and these measures derive insights. They begin to predict what will happen, predict behavior, and they inform strategy. So really the most sophisticated measures. So there you have kind of a description of the elements of the framework, and if you think about it, it shapes basically a matrix of six categories and three levels of measurement. And Jane's going to walk you through a little more detail on a couple of examples of measures in the framework. Yeah, thank you, Terry. And I think one of the thoughts I'm just going to put in here before we go into a deep dive into a bit more detail is that it's fantastic to be able to think that we could look across not only the three levels and the six categories, but potentially compare and benchmark across the world.
So I think this is one of the opportunities of CASE also looking at these measures in this way and keeping it so that people can access it at those different reporting, analysis, and insight levels. So if we just look at the first one here, and this is an extract of a page out of the report. It's not the whole thing, but you can see the level of detail that we want to go to and look at under brand development. So brand development is one of those six sections. And the reporting way that we're looking at it is through brand awareness. Analysis would be brand health, and insight would be brand equity. And then we've broken that down even further to say, OK, under brand development, what could be the component parts? So thinking there about enhancing brand loyalty, improving perceived quality of your brand, and then strengthening brand associations. So unpacking that even further, we then look across at what's under reporting, analysis, and insight. So just taking something like enhanced brand loyalty, you might report something as simple as a likelihood to recommend to a prospective student. So how often we collect those net promoter scores and what those might say about brand awareness. Then if we took that brand loyalty area and looked across under brand health, we might think of a more aggregated net promoter score across a number of different audiences to really see how strong our brand is, how well received it is, and kind of take that temperature check across a timeline. And then thinking about the insight aspect of that, looking really to advocates and how much people are really willing to advocate for your institution and then stand up to tell other people about how great you are as well. And we can look at that by a variety of your audiences. So hopefully you can see by looking deeper into some of these examples how we can really take something quite simple with one audience and then extend it across the different levels.
And looking at the report, and afterwards if you get a chance to look into the presentation, you can see that we've done that for each of the sections to give a flavour of what might go into the benchmark and into the analysis at that level. Thank you, Jane. I think it's really important to reinforce that we didn't try to be exhaustive about examples of measures in each category and level of measurement. It's really just a couple of examples to give a flavour of what kind of measure would fit in that area of the framework versus others. We know there's lots more to do and we are not endorsing any particular measure at this point, but I want to tell you a little more about next steps and how we'll get there before we open it up for feedback and questions. So what will we do next now that we've identified this framework? Well, we're doing the next step right now and we'll continue to do that for another few months, where we will ask professionals in the field to give us feedback on how they think this will work, how they imagine using it, and really specifically, as a result of the conversation today, we're going to ask you to submit an example of a measurement you use and tell us where you think it fits in the framework, because it'll help us to crowdsource a set of measures from which we will choose the first that we'll launch in an actual data collection. Our intention next is to identify the platform for data collection, and CASE does this with a lot of other outcomes measurement in our work, so we'll use a similar platform, and then we'll establish and choose the very first measures within the framework that we intend to define in a standard way so that everyone will be able to think about their institution's data in this area and collect and report it in the same way. And then we will identify a set of institutions who agree to participate with us in a pilot to submit the data, and this is the messy part, honestly.
When we do this process and we're starting with a new data collection like this, it's not easy the first time around, and we'll work out the kinks and see where we have issues with definitions or with the method of data collection, and that will help us refine an actual survey that will become over years more robust, more reliable, and better as a representation of not only your performance, but then begins to enable those benchmarks so you can compare to institutions who've also submitted that data. And as you can tell from all of this, that's gonna take a few years to do, right? So we'll be intending to choose the first measures by the end of this academic year, go into the summer starting to work out the kinks on that, and then we'll begin to think about how we standardize the definitions across those measures before we begin the first pilot survey the following year. So that's kind of our timeline in building this process. I wanna say one more thing about it, and that's that we're following a well-trodden path. So our alumni engagement metrics followed a process exactly like this that really helped us to build a set of measures that represent the best thinking among practitioners who are CASE members, and allows us to gather input and refine as we go along. So now we get to the fun part where we wanna hear from you about your feedback. So we're gonna ask you, when you have an opportunity, to tell us what you're most excited about and what you think might be the most challenging. But before I do that, I'm gonna turn to Christy Moss, and I'm gonna ask her to talk about what she thinks is really the thing she's most excited about or proud of as one of the leaders who helped develop this, and to share a little more about what was one of the most challenging things that the group had to work through to address the development of this framework. Christy. Thank you, Terry.
So one of the things I'm most proud of or most excited about regarding this framework: I have had the opportunity to also be a part of the alumni engagement reporting that CASE started about five years ago now, and the University of Illinois has been a participant in that process since the beginning with the pilot phase. And I will tell you, I have seen, because we brought this system of measurement on board, I have seen this part of our industry start to be transformed. Folks who lead alumni engagement efforts now have a seat at the table as leaders are starting to plan their next capital campaigns, and alumni engagement goals are a part of those capital campaigns. And I've seen the way that the industry has shifted around this perceived increase in expertise. And we know the expertise was always there, but the perception of that is different. And I'm very excited about what I think that means for the future of our profession in marketing and communications. And I'm also excited about the idea that we're going to get to come together as a whole university to talk about the marketing that we're doing. So I'm very excited about the future of this. And I look forward to a couple more years down the road when we have our first cohort of data collection. It has also allowed us in the alumni engagement space to create these communities of practice where we can kind of look at our results compared to each other and share best practices about how we are improving in areas. It's given us a vocabulary and a network of individuals that's bonded a little bit more tightly than just saying we all have similar titles. We're able to talk about the challenges we have related to some of these things. So I look forward to the future on that. And when we talk about what was most challenging: we had to narrow this down. We kept wanting to include more and more areas. So we narrowed it down to six. That was very difficult. I think Terry can attest to that as well.
And I think what I saw in that room were a variety of these different sub-disciplines very well represented, very passionately represented, making a case for why we might need to pull this out or consolidate that. So what I'd love for all of you to know is that a lot of care has been taken in the creation of this, with a lot of passionate people thinking very deeply about what each specific word means. And so I think that this group as a whole has been very well represented in this process. Thanks, Christy. And you can actually see all of the thought leaders who are involved in this project in the report itself and on the website whose link we've shared with you, that landing page. So thank you, Christy. All right, so now we get a chance to hear from you a little bit. And I've turned off the screen share so that we can see more of you, because there's lots of you. We can't see you all on one screen. I'm happy to go back to the slides if you need them for reference in our discussion at any point. But what I'd love to do is start by asking the question, what are you most excited about? You're all capable of speaking. You need to unmute, but what I'll ask you to do is, if you'd like to join in, could you turn your camera on so that we can see you and raise your virtual hand with a reaction so that you can contribute to the conversation? And we'll try and call on you. And I'm gonna ask Chase and Grant to help spot people who would like to contribute. Terry, someone's asked if we could share the chart again, and I'm thinking that that might be the diagram, the flower diagram, is that right? Yes, let me go back to that. Very good. All right, I see Tara has her hand up. So Tara, what are you most excited about? Hi, yeah, thank you. I am, I mean, this is probably not very helpful, but I am just excited that something like this exists, because I've been in this line of work for a really long time and have not ever been able to find a framework like this.
And it's been very frustrating because, you know, we're constantly having to prove sort of our value. And I think this is going to be incredibly helpful in doing that. One thing that I, I think a challenge that I have been thinking about with regards to this is, like, what are your thoughts on the fact that so much of, like, marketing and communications and brand awareness and all of that is really tied to so many other components? You know, obviously our work does not exist in a vacuum. It's very inextricably linked to, you know, other departments, other stakeholders. So how, you know, how does that kind of fit into this? Because just in looking at the slide, you know, our rating among our peer set for high academic quality, well, that's not solely on a marketing and communications department, right? In fact, I would say other departments have much more of kind of an impact in that area. So how do you kind of reconcile that? Yeah, it's a good question. Tara, where are you from? What school or university? So my name is actually Tara. I'm at Wake Forest University School of Law in North Carolina. Thank you very much, Tara. It's a good question. So it's really, you're talking about the issue of attribution. What proportion of this performance can be attributed to the investment and the way we implemented marketing communication tools versus anything else? It's a tough question. I'll answer it in two ways. One, particularly in some of the more digital forms of marketing and communications work, the possibility of attribution is getting refined and getting better. So there will be some measures where you can sort that piece out, and not just last-engagement attribution, but kind of all the way through a funnel, if you will.
But I would also say, how many of your colleagues in philanthropy worry about how much of their outcomes are attributable to the investment that was made in philanthropy versus other things that influenced the decision of a donor to invest, right? I think we are put in the position of having to prove that piece in a way that's probably not entirely fair. And if you can measure change compared to where you were, or performance relative to others on the same measure, the issue of how much the specific investment made the difference becomes a little less important. But it is a good question. Jane, would you add anything to that? Yeah, I think the benefit of it, I can see the challenge of it for sure, but I think the benefit of it is that it suddenly makes brand and reputation everybody's business. And it takes us to one of those P's of marketing, maybe the product. And sometimes that's one of the areas where, in our work, we don't necessarily get invited in to help talk about how something is landing from a product perspective. So I think it really positions the marketing and brand and reputation work as something that's potentially more strategic. But sorry, one last thought about this. It is kind of a double-edged sword, though, because if recruitment is down, it's not necessarily because we're not doing our job with brand awareness. It could be because the football team isn't doing well or what have you. So I can see that it sort of goes both ways. It cuts both ways. And those same issues are true in every other sector that uses marketing communication tools, right? But we will try to hone measures that zero in on things like return on investment in ways that let you sort some of that out. That'll be part of our work. Thank you, Tara. Patrick, where are you from? And what do you find exciting and/or challenging? Yeah, Patrick Pratt, Denison University. Thanks for all the hard work that went into this and the time today.
That conversation right there was really helpful, and I agree with all of it. Something I'm really excited about is just the kind of nuts and bolts, the hard information that this will allow us to bring to the conversation to kind of support some of the perspectives that we're bringing to bigger conversations and decisions. A potential challenge that I see with this, and I think it's time-worn, right, is the old adage that what gets measured is what gets done. And so when we are starting to put numbers to all of this and more hard data, then kind of what are we missing that also goes into the work? This is the science half of it, and then what's the art half of it? And are we neglecting that? I don't think we will. I think we're all thoughtful. And you guys, like I said, you've all put tremendous work into this, but just food for thought. I think Patrick makes a really good point. And one of the things that we've recognized, and the Thought Leadership Group spotted this while they were doing their work, is that this actually will probably drive some decisions at institutions about what technology to invest in, right? And where to invest implementation of marketing communication resources, because it will allow you to collect and report data and compare in those areas. So working toward what the measures focus on may necessarily pull some attention away from other measures. And we'll get more comprehensive as we go along. We're gonna have to start somewhere, but we'll need to be mindful about that, especially when we're making the choice of those first measures. And Patrick, I would say to the art piece of that, on the alumni engagement metric side of things, I have actually seen these metrics pull together and allow those communities of practice to actually talk in very real terms, not just about the data behind it, but the strategies that grow out from it and, kind of getting to this column of insight, how it is that we can influence these things positively.
Thanks, Christy. Other reactions, what you're excited about or what you think is gonna be challenging? I'm looking for hands. Anne's got a hand up. Hi, Anne, where are you from? Okay, hi, Anne O'Connell. I'm from the College of Lake County. I was thinking exactly what Patrick asked. I really appreciate this work. It's very exciting. I'm very interested. I'm really heavily involved with PRSA. I'm just getting acclimated to CASE, which I think is great, but I'm very interested in measurement. And I tell my team all the time, I don't care what we did. I wanna know: what did it do? Who did it affect? Was the timeline met, and how do we measure it? That is just critical. So hats off to you all for working on this. I think it's great. Thank you very much. Other thoughts? While you're thinking and while we're watching for hands, I'll share some of the feedback we've heard at presentations where we've done this at conferences. Some folks said, this is a game changer. This is gonna provide me with credibility and legitimacy on my campus. It's gonna help leaders think about where we should be focusing our work. So there is, in that enthusiasm, a lot of sense that it will help us with more professional approaches. Not that our work isn't professional already, but it really enhances the level of credibility and professionalism of our work to have a set of measures that are standardized and shared by a wide group of institutions. Any other thoughts there? Okay, so here's your call to action, everyone. Every good marketer knows about calls to action, right? We are depending on you to help us crowdsource a measure. So what we'd love for you to do is use the QR code or the link that Christy gave you in the chat to go to the landing page for this new framework, and think about a measure that you use and submit it to us as an example.
And it'll give you the opportunity to tell us which category of function you think it fits in, and which level of measurement you think it fits in. There's no right or wrong answer, but we're just starting to gather ideas about which measures you already use that fit in the framework, so that we can build the pool from which we'll choose the first measures to standardize. And if you're willing to do this, on the form that is provided, give us your name, your institution, and an email address so we can follow up if we have questions. That'll be a really important contribution by all of you to this work if you're able to do that. And I think the more of those measures we can crowdsource, the more the thought leadership group will be able to really see what fits across those six categories and three levels, and what gaps we have. So it's gonna be a really important piece of this next phase to see that. But it's been great today to see everybody on the call. I want to particularly thank Christy for your time on the project, but also for sharing your experience with us today, and also Grant and Chase for joining us as well. Also a big shout out to Christy Grimm, who's been running the webinar for us. She's been putting amazing assets into the chat. She's put in the feedback form, she's put in the slides. So please go and have a look there and get hold of those. And like I said, have a look at the framework, download the framework as well. And if you've got any questions about how you do that, just talk to us at membership at case.org. So thank you everybody. And thank you, Terry. Oh, one more for Terry. Thank you. I think Grant has another question or something to add, Grant. Okay. Well, I did see a comment in the chat. Sophia asked, when she said this was all super exciting, she did also ask if there might be a plan to share some of the case study work once we're beginning to gather that. So just if we had an opportunity to respond.
I love that, Sophia. You're already jumping ahead to the data being collected and case studies being done. And somebody else was thinking about how we can use AI to crawl this data. I'm sure those are going to be in our practice. When you look at how we deliver reports on other data that CASE collects, we provide a basic report to everyone who participates. There's usually a trend report that goes to every CASE member. And then you can go deeper into the data. But we also provide opportunities to kind of look at institutions that are using the measures well as a model so that we can hold that up as good practice. And there's often the opportunity to join as part of a cohort to understand your data more deeply with another group of institutions, maybe ones that share some characteristic of your school or institution. So there are all kinds of CASE services that will eventually roll out of this once we have the data collected. That's a few years down the road, but Sophia, we're going to be calling on you. Anyone else? Well, then let's let you all have the balance of your hour back. Thank you all again very much. And please go to that website and share a measure. We're looking forward to hearing from you. Thank you all very much. Thank you. Bye-bye.
Video Summary
In a CASE-organized webinar, Terry Flannery and Jane Shaffer introduced a new framework for measuring brand and reputation within educational institutions. With participation from key figures, such as Christy Moss and regional CASE leaders, the session outlined the importance of this framework in standardizing performance metrics, which currently do not exist in the education sector. The framework categorizes measures into six areas—brand development, recruitment and retention, strategic communications, alumni engagement, philanthropy, and external engagement and public affairs—and across three levels: reporting, analysis, and insight. The initiative seeks to enhance institutional benchmarking by gathering standardized data, leading to improved professionalism and strategic value for marketing and communication efforts. Attendees were encouraged to contribute by submitting measures they use, aiding CASE's goal to refine and pilot these metrics. The session addressed the potential challenges of attribution and of ensuring comprehensive measurement. Participants expressed excitement for its potential impact, akin to the transformation seen in alumni engagement metrics, while raising concerns about art versus science in performance assessment. The project will proceed with pilot testing and eventual data collection over the coming years.
Keywords
brand measurement
educational institutions
standardized metrics
alumni engagement
strategic communications
institutional benchmarking
philanthropy