Data Privacy Webinar Series
Session 2: Strategies to Meet the Data Privacy Challenge
Video Transcription
Hello, let's just give a moment for everyone to get logged in, but we're so happy to see so many of you today, and it's also very timely. As you are coming in, I think we did this last time, but it's always just nice to know who we have here. We're going to use the chat today for interacting with one another, so if you could take a moment and put in the chat who you are and where you're joining us from as you log in, that would be fantastic. And I'm seeing the numbers start to slow down, so I'm going to go ahead and get us kicked off.

Welcome, everyone. My name is Jenny Cook-Smith, and I am representing our Case Insights team. I will have my lovely panelists introduce themselves in just a moment, but I'd like to get us kicked off for part two in our data privacy webinar series, Strategies to Meet the Data Privacy Challenge. If you weren't able to join us or listen yet to part one, you're still okay; you don't have to leave and come back another time. But I would suggest taking a little bit of time to look through some of the foundations that we set out in part one, because we're really building off of that in part two as we think about defining what we mean when we talk about data privacy and some of those concerns. And we weren't going to leave you hanging: now we're going to really start to discuss some strategies. As a reminder, we hope you'll join us in a couple of weeks for part three, which is the thing that I think we all want to do to begin with, right, which is get in the weeds, and we'll be bringing Mark Koenig back as well for part three.

But first, I just wanted to mention again that this webinar is part of our Case Insights Services, and it is free to CASE members. When we began conversations, gosh, many months ago now, Elise, about how we best serve our members in the areas of data standards and research, the big concept of data privacy, what it means, what we do, and how we ensure that we're doing right by our constituents and the universities, colleges, and schools we serve really became a high priority, and that's what's behind this webinar today. As a very brief reminder, if you're not familiar with Case Insights, this is the division of CASE whose goal is to provide you, as members, the benchmarks and data needed to think about all of those pieces across an integrated advancement solution and how they come together. And again, I think data privacy is a really good example of something that touches all of the different aspects within advancement. So as I said, I am really thrilled to be joined by two experts in this field today, and I'm going to have Elise Walna introduce herself first.

Thanks, Jenny, and I'm the one running the slides, so if I go too fast, it's my fault. I'm Elise Walna. I run Agility Lab Consulting, and I came from working on the in-house side for about 16 years. I got involved in consulting because I was on the fundraising side of things, like many of you are, and I was noticing changes in our abilities to reach out to our audience base and to find new donor prospects.
So I help organizations organize themselves and come to the table with all their different key stakeholders represented, so that we can create smart strategies for making sure we're respecting legislation as it changes, but also making sure that we're strategic and managing up about financial expectations and how we stay ahead of those things.

Thanks, Elise. And we felt like this was a conversation we couldn't have without the legal perspective as well, so we're really honored to be joined by both an expert in the field and, I think, one of the most fun legal counsel colleagues I've met, Jen Sarasa. So, Jen, would you like to introduce yourself?

Thank you. That might be the nicest introduction I've ever received; it's not often that fun and lawyer go in the same sentence. I'm Jen Sarasa. I'm the Vice President and General Counsel for the University of Illinois Foundation, and I also serve as the Secretary to the Foundation's Board of Directors. I've been with the University of Illinois Foundation for about two years, and prior to that, I was in a similar role at the University of Central Florida in Orlando, where I was over the foundation and also served as Associate General Counsel for the University itself. So, pleasure to be here. Thank you for having me.

Absolutely. And the way it's going to work today is we're going to have a couple of polls in a moment, because we want to keep things interactive and we want to hear from you. As I mentioned, the chat is available if you want to interact with one another or comment on things you're hearing. Please do use the Q&A so we can make sure we see questions that you have. We will have time at the end where we'll address questions, and if there's anything we need to address offline, we're happy to do that as well. The last piece I wanted to mention is that we are providing you the slides, or a link where you can download them.

So I thought it might be helpful, before you hear a little bit from Elise and Jen, to do a couple of things. One is, if you were part of webinar one, we asked an open-ended poll question around what keeps you up at night regarding data privacy, and we simply took those results and put them into the word cloud you see here. And it's fascinating, given that today's topic really highlights some of the words that come out strongest: things like being hacked, data breaches, lawsuits, et cetera. So I think it's really timely that as we move from these challenges and think about strategies, we can address this from sort of a business risk and a legal perspective. First things first, though, it's always helpful just to know in general how you're representing your institution. So we have a couple of poll questions that are specific to your role and the type of institution you're representing today. That should be popping up, and make sure you go down to both question one and question two. I would say, if you're filling those out, I'm always curious what we missed; if you find yourself in the other category, please go ahead and throw that in the chat as well. Love to know who doesn't fit those molds. Data governance, IT, excellent. Great representation; glad to have you both, Brenda and Carl. So I think we can go ahead and post those, Christy.
And Cindy, I will say the president/CEO, we sort of thought, is that leader who oversees functional areas, so they fit into that category too. Jenny, are you able to see the results? I'm sharing them now. No. Okay, it looks like about 66% are advancement services and operations, and then the next biggest group is alumni relations and annual giving. Quite a few in other, actually, and then some sprinkling of the rest. And then for industry, we definitely have the most in the four-year college or university at 64%, and then a good sprinkling, about 10 to 16%, for the rest of those as well. Excellent, thanks, Christy. All right, so let's go ahead and I am going to pass it off to Elise to set the stage a bit for today, and then, as I mentioned, we will be coming back for a panel discussion as well.

Perfect. Okay, so this slide is titled Who's on First? And that is because when it comes to data privacy, in our data-driven world and in our day-to-day roles, there is often a tension that I observe within many different organizations, which is: who owns responsibility for data privacy? Oftentimes it looks like fundraising feels the business impact first. Like I was saying about my in-house role, you might be seeing it in your KPIs, where your Facebook ads aren't working anymore, which a lot of us experienced, or something to that effect. You're thinking about how you're collecting third-party data inputs, but you're not sure what fields are still legal and which ones are changing. So you feel that responsibility. And legal teams might be hearing about the legal risks but not be sure of the full breadth and depth of how they need to dig in, because that depends on understanding what the organization is involved in from a comms and fundraising perspective. So there's this kind of mix of who owns what, and where we feel the changes as fundraisers looks different from where legal teams feel them.

These are a few examples that I came up with based on what I see from my client set. Fundraisers often notice things like unreliable email performance data now, and that largely has to do with the Apple iOS update back in 2021, which means people can't be tracked online the way they once were. For the same reason, we see decreasing reach via advertising. We see changes in our digital analytics reporting when we do things like implementing our cookie opt-in banners; if you're still using Google Analytics, it looks like you might have lost a lot of traffic, but really it's because people are opting out of cookie tracking. And then last, sometimes donors are asking more questions about how their data is being used. All that to say, we feel not just the legislative impact, but the big data impact from big tech. Again, if you missed webinar one, we went through a lot of these terms, so like Jenny said, it would be great to take a look at those things if you're missing the context. And then on the legal side, we might see questions related to whether we need to go for an opt-in model and be talking to teams about how they need to ask audiences to consent once or twice to joining our email lists. We might need to get deep in the weeds of how third-party partners are using organizational data, how we're exchanging it, and the like.
And then the last thing I mentioned here was questions about how or when we should adopt cookie consent banners, and whether we need to revisit the privacy policy based on all the things that are going on under our purview. So as you can see, the list of questions from both teams could be long. What we're thinking about is how we bring everybody to the table to make sure everyone's aware of each other's responsibilities to the organization, not just KPIs, for the health of our audiences and for the mission at large. So when we think about moving toward a unified approach, we really need all teams to come to the table to assess the impact and develop our approach.

In thinking about this, there are a couple of key questions I'd recommend you think about, and Jen is going to tell us a little bit about her approach to how she built this in-house as well. But these are some thought starters that I think are really useful. The big thing that I talk to organizations about first and foremost is knowing your risk tolerance when it comes to data privacy, and having a really foundational conversation between legal, fundraising, marketing, and all of those key players to define that. When I mentioned moving toward an opt-in model earlier, that looks like making sure that if a person fills out an online form, for example, the box at the end isn't auto-checked to add them to your email list; they have to go through the extra effort to check that box. But in moving toward that type of model, your email list growth might suffer. So it's important for legal to understand the trickle-down strategic implications of choices that they make. And on the same side, it's also important for business drivers to understand the true legal risks that everyone is facing, and to have that kind of framework to drive decisions. Really, the name of the game is making sure everybody has that understanding of what the risk tolerance is; once you know that, it's a lot easier to drive the rest of these questions.

The next is looking at gaps in your technology and infrastructure to say, hey, if a person says they don't want to opt in to cookie tracking, do we have the functional processes in place to make sure we recognize that opt-out, and does it carry through our whole technology stack? Next, tied to that, is staff training protocols. Does staff know what's happening when they do things like upload an email list for advertising or something of that nature? It seems pretty easy, they've done it a million times before, but as privacy laws change, the diligence on that side might need to change as well. So staff training protocols, and really making sure those operational lines are checked in on, are really important. The next piece is where we can gain efficiencies in data entry and campaign sourcing, so that as we lose measurement abilities with things like third-party cookie loss, we're still able to track our marketing impact. And then last is really, how do we plan to address emerging questions as they come into play? While it might be easy to create a one-and-done data privacy strategy, the law is changing on this all the time. There are, last time I checked, 30 pending state laws just in the US related to data privacy, but that number shifts frequently.
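To make the opt-in and opt-out mechanics described above a bit more concrete, here is a minimal sketch of how a consent choice captured on a form might be stored and then checked by downstream tools so an opt-out carries through the stack. The field names, record shape, and gating functions are hypothetical placeholders, not any particular CRM's or email platform's API.

```typescript
// Hypothetical sketch: field names, storage, and tool hooks are placeholders,
// not a reference to any specific CRM or email platform.

type ConsentRecord = {
  constituentId: string;
  emailMarketing: boolean; // opt-in: false unless the box was actively checked
  cookieTracking: boolean; // mirrors the cookie-banner choice
  capturedAt: string;      // ISO timestamp, useful if you ever need to show when consent was given
};

// The submitted payload only contains the checkbox key when the user checked it,
// so the default is "not opted in" rather than a pre-checked box.
function consentFromForm(
  form: Record<string, string | undefined>,
  constituentId: string
): ConsentRecord {
  return {
    constituentId,
    emailMarketing: form["email_opt_in"] === "on",
    cookieTracking: form["cookie_opt_in"] === "on",
    capturedAt: new Date().toISOString(),
  };
}

// Every downstream action checks the stored record, so an opt-out made on the
// website carries through the rest of the stack instead of stopping at the banner.
function canAddToEmailList(record: ConsentRecord): boolean {
  return record.emailMarketing;
}

function canLoadAnalytics(record: ConsentRecord): boolean {
  return record.cookieTracking;
}
```

The same record could gate other steps mentioned in the discussion, such as uploading an email list for advertising, so that a single stored choice governs every tool that touches the constituent's data.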
So being able to stay in touch across your teams and have a plan for, hey, we revisit this information quarterly, biannually, monthly, whatever the case might be; you just need to make sure that you have a plan in place for that.

So this is our third poll question; I can read it here. Sorry, I've got boxes in my way. This poll question is really, when you're thinking about your organization and data privacy standards as they exist right now, we're curious to know what your institution's approach might be. The reason for asking this question is really to get some insight on whether your in-house legal counsel is equipping themselves to develop data privacy expertise, whether they are having third parties address it, whether there isn't in-house legal counsel so everything is handled via third parties, or the last is really an I don't know. And we'll get to why this is important in just a moment here.

Elise, I don't know if you can see it, but it looks like we have a pretty even split between I don't know, in-house legal counsel with in-house expertise on data privacy, and no in-house legal counsel, and then a few with in-house legal counsel but outsourcing the data privacy.

Gotcha, okay. So you can see, as is represented just from this poll, there are a lot of different ways that people are treating this question. And again, Jen's gonna talk about what her approach has been. But having a clear handle among teams about how legal is going to approach this question is gonna be really fundamental to your ability to operate. For those of you who answered that you don't know, I think a really great first question to start with in building this collaboration, and managing up and across with respect to data privacy, is: who are my resources when I have a question, who's taking care of this, and how can we work together to make sure that's answered? So these are the structures we just walked through. And like I mentioned, the ideal state is having a firm handle first on risk tolerance to steer relationships with those legal partners, because that's what informs how they might advise. In a lot of instances where there's out-of-house legal counsel addressing data privacy, they might make recommendations to redline your privacy policy, for example, in a really conservative way; the whole document comes back marked up, and it's really coaching you to be as risk-averse as possible, but that might not be what your internal appetite for risk is. So there might be areas where you're okay with accepting, hey, we don't need to make this edit or this change to our policy, because we feel well equipped that this isn't a big issue for us. So really you want to have that guidance of who's on first, how we should be treating this, who's answering questions, and really where we come at the risk from. So I'm going to pass it over to Jen, who is a lawyer and can talk through how they went about approaching this at the University of Illinois Foundation.

Thanks, Elise. I saw in the slides that there is one other lawyer in the webinar, so feel free to identify yourself if you'd like to do so; I'm curious. I'll go ahead and start by giving a little background on the structure of the Illinois Foundation, because I think that's important to understand why we take the approach we do.
So we are a separate foundation that has its own employees, and we are responsible for the University of Illinois system, so we have three universities that we service: Urbana-Champaign, Springfield, and Chicago. Each is a separate university within the University of Illinois system, and the foundation provides support to all three. So this approach is tailored based on our structure, but also on risk tolerance. When Elise touched on risk tolerance and who's on first, a lot of this will depend on the risk tolerance of your leadership. I so often have people come up to me and say, well, is this legal or is it not? And I know everyone hates to hear this, but for the most part, there's a wide gray area; it's rarely yes or no. Trust me, my job would be a lot easier if I could just say yes or no without getting into it, but I really view myself and my role as a business partner, where we sit down and we analyze the risk, and I can tell you, here are the confines that we have to act within, but there's a wide range in between there as to what you can do.

So we take a team approach at the Illinois Foundation, which I really like. We have what's called a data privacy steering team. Well, let me back up first and say that data privacy was identified as a risk during the enterprise risk management analysis that we perform. That process assigns the overall risk itself to our chief information officer, which I think is an appropriate place for it to live, but it also assigns additional business units with responsible elements. So legal is certainly in there, but we don't own the entire risk, nor are we the final say in what happens, and I think that is the correct approach. So to help facilitate conversations, to avoid confusion, and to really stay on top of these emerging changes in the laws and regulations, because Elise is correct, it is a full-time job to keep up with the new laws as they're being proposed, what goes in for each state and what's coming out, we established a data privacy steering team, and it's made up of the folks set forth on the screen.

We have an external consultant slash expert who really serves as the data privacy officer; that was put in place from the beginning, before I arrived and before we built up a team of expertise, but it's been a really good approach, because having somebody who is an expert in this field, is aware of the changes as they're happening, and can bring that information back to us is valuable. We have our chief information officer; our chief data officer, which is a new position that we have; our chief operating officer; and our senior vice president for development services. So again, we have an advancement voice in the room, which is really important, and I try to keep in mind in my job that my job impacts fundraising, and that's why we're here, right? It's important that you all can go out and raise the funds in an efficient manner and that we're not slowing you down unnecessarily. And then the general counsel is part of the team. So as you can see, it's a pretty high-level team, but it works well because it gets all the correct people, in my opinion, in the room who can make decisions. We meet monthly to discuss ongoing projects, updates, and specific concerns. This is also the response team, should we ever have an incident or receive notification of a breach or an incident of some kind.
That's the team that would come together, meet, and discuss next steps. The other thing we do is talk about trends and what's happening. And I view this really as a risk tolerance team, right? So this is the place where we discuss the opt-in versus opt-out and the express versus implied consent. Is it legally required? Do we think it's going to be legally required in the future, or is it really just a best practice? We go through those exercises together to come up with a game plan approach. And I will say it's shifted over the two years that I've been here; we've shifted some of our risk tolerance. Sometimes you accept a little bit more risk, because once you see the trends, particularly trends in higher education foundations, you know where your pain points are and really where it's maybe not such a big deal. That's not to say we don't take it seriously, but sometimes we find we may have been a little conservative at the outset.

The other thing I wanted to touch on, for the next slide, is that we put into place a data share agreement. This is, I'll call it, an emerging trend or best practice. The foundation itself has several agreements with our, we'll call them, constituents. With the university, we have a memorandum of understanding; I bet most of you at foundations have one in place with your universities, and that kind of governs the big-picture items. Our MOU did specifically touch on allowing access to donor data and records, but it was just a brief mention, maybe in a whereas clause. We have what's called an annual services and management agreement at the foundation; that's essentially where we talk about all the services we provide to the university, and then whatever remuneration or compensation we receive for those services every year as part of the budget-setting process, and that's probably a more granular agreement. In there, it also touches on data a little bit: it says we're going to provide management and supervisory services at the foundation and we're going to maintain this access to records. And then we have other legal agreements. We have a business associate agreement that governs healthcare so that we're HIPAA compliant. And then there were some one-off understandings, which I bet a lot of you have, where maybe the university provides us with some graduate data at the foundation. So all of these things are important, and we had pieces of authorization in a variety of places, but nothing that really governed it in depth.

So what we did is we came up with a data sharing agreement, and it's a tri-party agreement that we are in the process of putting in place; I should say there's not ink on the paper yet, but it will be soon. It's between the foundation, the university, and the alumni association, because those are really the entities for us that are sharing this data. This comprehensive agreement addresses things like who the authorized users are and who owns the data. I think this is a conversation that I hear come up, at least with my colleagues: is it my data? Is it the alumni's data? Who owns the data? And that's an important question, because that's kind of who's responsible for the data, right? So the agreement addresses who can use it and who owns it. It permits data transfer.
So that's the situation where we talk about, hey, you can have access to these alumni records, or hey, we can provide you with this information for stewardship purposes, but if we provide you with this information, we want you to use an opt-in approach, or you can use an opt-out approach. All of these elements were addressed within the data share agreement. And I will say, during this process of putting the agreement in place, we had an enormous steering team that was providing input, and then we would go to different groups across campus, particularly our vice chancellors who are charged with the fundraising. Some of the feedback we received was eye-opening. They said, you know, Jen, if you put this particular opt-in into place, you're going to hamstring our program. And I think that's important feedback, because sometimes you take a step back and you go, all right, well, maybe this might be considered best practice, but it's not legally required, and how many other people are doing it? And we can stop and reevaluate so that you can have a successful fundraising initiative while we are in compliance and appropriately protecting individuals' data. So I will say this process took, we're over a year now, but I think we're finally to signature. It wasn't a fast process, but it was comprehensive and it was good; it really spurred some good discussions.

The agreement also outlines responsibilities, and it addresses things like the questions we're talking about today. Who's on first? Who's responsible? What happens if there is a security breach? How are incident notifications handled? And what are the penalties if somebody is using the data in the wrong way, creating shadow databases and these kinds of things? It doesn't really spell out the processes, but it touches on them, because this is meant to be a high-level agreement that can stay in place and not get reviewed every year, but perhaps every five years. It does have some exhibits that get more into the weeds, I would say, and call out the very specific data, but those are able to be swapped out annually without going through the whole process of redoing the agreement. So that's the approach we've taken. I don't claim perfection, but I think it's a really good process and one that we are always working to perfect as we move forward.

Great, thank you so much, Elise and Jen. And what we're going to do is switch to, and sorry, my video is being a touch slow, hopefully, there we go, switch to our panel discussion. This is a reminder to all of you to submit any questions you have in the Q&A, and we'll make sure, while we've got our great experts here, to have an opportunity to address those. So maybe we'll stop sharing, and I have a few questions for you both. Maybe I'll start with you, Jen, because I think this is sort of the theme of what you just shared with us, and that is, at the end of the day, as much as we want it to be black and white, it's in fact not, but that middle ground is hard. So I'd just love for you to speak a little bit about that middle ground between fundraising goals and legal counsel goals, and any advice you'd give on that front.

So, like I said before, my job would be so much easier if I could say, yes, it's required here. And in some circumstances I can say that, and that makes my job easier. It may not make my clients happy, but it certainly makes my job easier.
And in the other areas where it's gray, we really take a risk analysis approach. And I don't think, in my opinion, that it's legal's ultimate determination; it's really your business leaders at the institution. So I sit down with my CIO, my COO, and my CEO, and sometimes we'll even talk about it with our board, about where that level of risk is appropriate and what actions we've seen. If we're in a gray area, what are the potential penalties we could receive if we find out that we weren't conservative enough in our approach? But then we talk to our advancement partners, and we look at things like, when we do an opt-in, what's the response rate? How many people are opting in? I think that was really important when GDPR came into place, right? How many constituents did we get express consent from in European countries? And we know that, at least in the two institutions I've been in, it's difficult. So those are the kinds of things we look at, and then we look at, all right, well, what happens if we're not in compliance? What are the penalties? And that's really where having an expert in this field is so important, because they can come in and say, well, here's what I've seen, here are the trends I'm seeing.

Elise, I'd love to ask the same question of you, actually.

Yeah, I think Jen is a really great example of fostering that balanced approach. And I think the number one thing that we can do across teams is to make sure that we are feeding data inputs across all of the C-suite that Jen just named, and across senior leadership in general, because the thing that I think sometimes gets missed at certain organizations is that legal isn't always aware of how much it's impacting teams, and I'm sure that's true across the C-suite. So the more we can get those data inputs, level that up, and make sure it's being prioritized, the better, so that we can communicate, hey, we don't have guidance in this area, so we need to get some, whether that's coming in-house or external. And I think that'll help all the powers that be assess whether this is an in-house role for counsel or whether it's gonna continue being outsourced, or whatever the case might be. And I think that's really the answer to finding a middle ground: if decisions are made in a vacuum, you might hurt business opportunity, and vice versa. So I think that's a big part of it.

I think we could apply that to a lot of areas in life. So you actually both touched a little bit on managing up and reporting to the board. I'll ask you, Elise, and then the same question, Jen: any advice you'd give, or anything you've found works best, in terms of the frequency that you are sharing updates, either upward or specifically with a board?

Yeah, so in the client settings that I work in, where I've seen the most success in adopting data privacy as a whole is really when it becomes cultural and part of the institution's mandate that everybody knows, hey, it's part of who we are that we respect our audience's right to choose how they wanna be communicated with and what data of theirs they want to be shared. So I think that starting with leadership and the board is a really great first step to create those agreements. And we'll get into it in our next session with Mark at OSUF. But really having that structured idea of how everybody can operate will help guide everyone beneath.
But I think it starts at the leadership and board level. And then in terms of frequency, I think it's important to have all-staff training, so everybody knows how to create those kinds of working agreements with every layer of the organization. And then I'd recommend at least quarterly updates in the first year of adopting these pieces; Jen, I'd love to hear your perspective on that as well. It might be able to decrease in frequency in the future; I just think data privacy is changing so frequently right now that the more you can ensure that leadership feels confident that you have your eye on the ball, the better.

So I don't know that we have a magic number, but I will say I feel like it is constantly communicated. When we went through the exercise for the data share agreement, while it didn't require board approval in my specific instance, I did keep my board chair apprised of what was happening. And then we did a presentation for the board prior to really finalizing the document, and we said, hey, we would call this a management decision, but we want you to be aware these are the actions we're taking, this is the way we're protecting, and this is how we're doing it. And it was really well received; there were a lot of good conversations. We report out on data privacy through our audit committee for the most part; that's the committee that's charged with overseeing these things. And I will say, I think some element of data privacy has been on every agenda for the past year and a half, and AI is obviously a big topic too, so that's one that they're very interested in. So I don't see us decreasing in the future, because they're interested in the information and they wanna make sure. And sometimes, as when we presented on the data share agreement at the board level, I think it's interesting because it gives you a lot of perspective that we might otherwise miss. Those of us who have been in this field for a while are very aware of why we collect data, what we do with the data, and how we use it, and it seems perfectly appropriate to us. But I will find, particularly with my older board members, sometimes they stop and they go, wait a minute, you're collecting my information to do what? And we're like, well, yeah, you're on the board. Sometimes I'm a little like, what just happened here? There was a disconnect. It's a good reminder, right, that not everybody lives in this space, and so just to take a step back and rethink your approach and how you explain it to people. I thought that was helpful too.

Don't worry, I'm going to come back to AI, because I think we can't have this discussion without bringing it up. But thinking about our poll, where I believe close to a third of the attendees were in the I don't know phase when it comes to the who's-on-first element and what that structure looks like, I think it's probably helpful just to lean in a little bit on this idea of getting started with the data privacy steering committee. So maybe I'll flip it to you, Elise, for the big picture, and then you can expand a little bit more on your piece, Jen. What do we do if we're not sure where to start? How do we get started? Who do we include, et cetera?

Yeah, so I've seen it work a couple of different ways. I think Jen's example is a great one of starting with the C-suite.
Ultimately, I think the C-suite needs to be bought in, because I don't think you make a lot of change without that cultural adoption of how we view privacy in terms of risk, like we've talked about, and not just the direct legal risk or the fines risk, but really the brand risk: if something happens, that can be really bad for fundraising and marketing and all the things that trickle down from that. So I think the C-suite is where you need to ultimately end up in terms of having that buy-in. And that might look like starting with people who are not necessarily at the practitioner level, but who are closer to it, to know, hey, here's the data that we're regularly collecting and that we need to be concerned with. That might look like somebody from the C-suite going and collecting those inputs, but really it needs to have that holistic approach coming from whoever represents the steering committee. So I recommend having at least one person from legal; one person from IT slash tech, however it's named within your organization; someone from the digital team, so largely website ownership; and then definitely somebody from fundraising and/or marketing, so that all those different KPIs and purviews can be represented there.

And you know, I thought it made a lot of sense, depending on your structure, if you've got an independent alumni association, as Jen mentioned. Yes. That's another group where we're seeing lots of data and continuing to grow as well. Oh, and athletics, that's the other big one. Yes, a very large portion for some of us, yes. Jen, anything else to add on that idea of the getting-started piece?

Yeah, I think Elise really explained it well. I will just say, having that top-down approach, and knowing particularly at that president/CEO level that this is a really important piece, is so helpful, and I'm fortunate that we have it. We have a relatively new chief data officer, and that is an important position that was sort of elevated because we see the importance of data privacy. So having that buy-in from the top level down is important, and having your board come in and say, this is really important, how are we protecting the brand, that helps a lot. But yeah, you need a champion at the top to start the process; otherwise it can be a little haphazard, and I'm sure Elise sees that when she comes in, where it's sort of like, well, I called legal, or well, I called IT and told them about this, and then it becomes someone's problem to deal with without a team. And then we're moving away from that collaborative approach, right, where we're really thinking about how we meet the goals. And we're back in a vacuum, which is not a good place to be.

So let's do it, let's talk about AI for a moment, because again, I don't think we can have this conversation without it. And I'm thinking about that question of updates: we just came from our CASE Summit for Advancement Leaders conference, where we did a session on AI, and we were talking about how our conversation in nine months is probably going to be a totally different conversation from what we're having now, because it's evolving so quickly. So I won't make you predict the future, but what I would ask you a little bit about is, when it comes to AI and data privacy, what are you prioritizing?
Can we just talk a little bit about that from a legal perspective? Maybe, Jen, let's start with you and what you're prioritizing in your role, and then, Elise, let's look at what you're seeing as well.

So we have issued some guidance; I won't go as far as to call it a policy, but just some considerations, because we see it pop up, and then you also hear the horror stories, particularly in the legal field. You hear about the practitioner who tried to use AI to write a brief, which generated made-up cases, which involved some licensure and bar suspension, I believe, which is a terrible place to end up. And lawyers aren't particularly known for being super tech-savvy early adopters, right, so we're already dinosaurs, and then you hear this one and kind of everybody pulls back and goes, oh my goodness. So our guidance was really more around the thoughtful concerns surrounding AI and privacy: once you put this information in, you have to be very cognizant that it can be in the public domain. So we focused on donor privacy and considerations: your AI prompts should be generic, and then you can drill down into the information later personally; use it as a true prompt, and don't dump all the information in. That was kind of the guidance we gave. We really want to be careful with our donor information; that was my main concern, so our guidance centered around that.

And I'll just add a quick plug for the CASE Global Reporting Standards, which include, right up front, a lot of great ethical components, and within that is the Donor Bill of Rights, which is not a CASE statement but actually an agreed-upon statement from multiple organizations that support the full nonprofit community. One of the things we look to, as we've seen programs that are starting to build these AI policies, is, just to your point, Jen, that having the Donor Bill of Rights front and center can be a really important guiding light. I'll also just add, before I go to you, Elise, that CASE did a Pulse survey with GiveCampus several months ago, so again we have to assume the data is a little different now, but at that time 72% of the respondents told us they didn't have an advancement policy around AI. I can tell you that this is something that CASE is committed to helping with, pulling together some best practices and advice for our full community as we look toward this. So I just wanted to make that quick plug, since you set me up so nicely. Elise, back to you.
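To make the "keep prompts generic" guidance above a little more tangible, here is a small, hypothetical sketch of separating a generic AI prompt from the donor-specific details that stay on your own systems. The function names, placeholder tokens, and the simple email regex are illustrative only, not a complete PII filter or anyone's actual policy.

```typescript
// Hypothetical sketch of "keep prompts generic": describe the task without
// donor identifiers, scrub obvious PII from any pasted context, and merge the
// specific details back in locally rather than inside the external tool.

const EMAIL_PATTERN = /[\w.+-]+@[\w-]+\.[\w.]+/g;

function buildGenericPrompt(topic: string): string {
  // The prompt asks for placeholders instead of naming the donor or gift.
  return `Draft a warm, two-paragraph thank-you note to a longtime donor who recently supported our ${topic} initiative. Use the placeholders [DONOR_NAME] and [GIFT_DETAIL] where specifics belong.`;
}

function scrubFreeText(text: string): string {
  // Removes email addresses; a fuller version would also handle phone numbers,
  // street addresses, and gift amounts before anything leaves your environment.
  return text.replace(EMAIL_PATTERN, "[email removed]");
}

function personalizeLocally(draft: string, donorName: string, giftDetail: string): string {
  // Donor-specific data is merged into the AI-generated draft on your own systems.
  return draft.replace("[DONOR_NAME]", donorName).replace("[GIFT_DETAIL]", giftDetail);
}
```

The design point matches the guidance in the discussion: the external service only ever sees a generic request, and the drill-down into real donor information happens after the draft comes back.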
Sure. So I think when it comes to AI, my biggest piece of advice is to really understand what those words mean to you at your organization, because I think the state that we're in right now is that it's being used as kind of a catch-all. There's the generative AI, which is what Jen's talking about with using it for copywriting, but there's other AI that has been in our DNA as organizations for years, which is the machine learning that comes from wealth scoring donors and understanding who our ideal segments are, and things of that nature that are very data-bound. So where I start first is really defining what it means to you when you say the words AI, and a couple of typical use cases of how you see that transpiring, because I think that'll guide things from there. And I think the biggest thing that comes up with respect to data privacy and AI in my world is third-party vendors, and making sure that they are being forthright about how they're using your audience's data. We see companies like Slack using literal chat threads to inform their own AI models, and that's not necessarily something that you would have known outright. So making sure that you have a cadence for reviewing your third-party partners' terms, and have that all disclosed on your end, I think is a really important part of the AI conversation, because it's shifting really rapidly. Right now it represents a lot of opportunity, but it also represents a lot of risk of bad actors and of us not really knowing exactly where it'll land. So I always like to say that it's useful to proceed with an 80-20 rule, where you're focused 80% on things that you know work and 20% on innovation and more risky investments. That's where I think we're at with AI right now.

It's funny, I'm thinking about the vast amount of food we always have at CASE because of the volunteer member meetings we host; if we were mining the chat with AI, it would conclude that these people like to eat. Regardless, we are going to switch to questions that any of you have, so please, again, if you've got some questions for the panel, we'd love to take those. There is a question that has come in from Alex, and they're asking: is there any advice or a practical approach to assess the risk acceptance level with the leadership of an institution? And they noted that the student side, you know, could be different from the fundraising side. I don't know who wants to take that one.
That seems to be so individual-dependent in my experience, and it comes across various departments. We do find in some situations, like student services, a different approach than perhaps a foundation leadership approach. The way I assess it personally is through a series of conversations and questions, right, where we talk about what your concerns really are. Sometimes there's a little bit of a misconception, well, legally I can't do this, and I'm like, well, let's sit down and think about why you legally think you can't do that. Is it your concern about the use of information, the transfer of information, what could happen down the road? So having those conversations with each department, at the senior level, to determine really what their risk tolerance is, is sort of my approach. It's not quick; like I said, that agreement took over a year to put in place, and we were pretty far down the road when we had a conversation with the vice chancellors and their teams who had been part of the meeting. I had one vice chancellor raise their hand and say, you know, I think this is going to be problematic, Jen, for the parent fundraising initiatives we'd like to launch, and let me tell you why I think it's going to be problematic. And I was like, oh, these are valid concerns; let's stop, see what my peers are doing, and have the conversation again. There was a little bit of a risk tolerance difference between some departments on that, but we were able to sit down and figure out a good approach, one that was still a good practice but wasn't going to hamstring the fundraising side of things.

Anything to add on that, Elise?

I think that's well answered; I think it's very individual. The only thing I'd add is that on the mass-market level, we're often thinking about, hey, what's the risk that someone's going to bring legal action or sue us, and that risk, in most cases, can be low. But that doesn't mean that if there was exposure of a breach, or if something went wrong, it wouldn't also hurt the organization. So I would just make sure that you're examining it from multiple lenses, from the PR side as well. And then the thing I always tell people, from the major donor perspective, when we're thinking about data scoring as it relates to AI and the information that we're releasing about people, is that there are always inferred data sets that get created as a result of the data that we're providing. So if you wouldn't sit across from a major donor and tell them what made them a good donor prospect, that's a good risk flag of, hey, maybe we shouldn't be entering into these waters; that might not be a good practice. So I would say, if you can think of some use cases for what that looks like, that's generally a good path.

I love that gut check, yeah. I had another question, and I will say, having been on the vendor side before coming to CASE, I really love this question because I dealt with this a lot as well. With regard to data security, do you have any resources or best practices regarding when to discard data? And they noted that we have requirements for vendors, but we often like to hold on to our own data files outside of the CRM, and of course that is going to increase risk. Great question, Angelina. Do you want me to start, Jen?
I would love for you to start.

Well, I have a little template I use that walks through what qualifies as PII, the personal information that makes data sensitive, and then a rubric. I think you need to walk through where you're collecting all those inputs, what the business case is behind those inputs, and who has access to them, internal and external, and it proceeds from there. I think that's really foundational to understanding everything that's under your hood, so that you can say, hey, people send out cultivation surveys in their email welcome series that nobody reads; if we have this survey input that we're not using, we can probably release that data. But if there are business inputs that are really helpful for us, that needs to be taken into consideration. The only thing I would add, just working through that, is that there are really sensitive data fields that I think people are starting to collect, largely because of DEI initiatives and wanting, in the best-case scenario, to diversify their donor base, or whatever that looks like. But you really need to assess whether your organization can protect that data someone's giving you about what their life experiences are and who they are, and if you can't soundly answer that you can protect it, you probably shouldn't collect it. So that's really the piece: making sure you have that true sense of the what and the why.

I will say I feel like I often hear, well, we should keep it because it's nice to have; that's often a reason I hear from departments, or, what if we need this down the road? And maybe not even just data, but as you look at records retention, and you should have a policy in place on how long you keep records, and I'd include data in there, I always say, well, it's nice to have until it's not nice to have. At some point it becomes unwieldy. I think those of us who have been in the game long enough know about the off-site storage boxes and the physical file records; we used to lease whole warehouses, and hopefully we're not all still doing that. And now people think, well, it's in the cloud, and I use this term and they're like, it's fine. Well, it's not; we don't just collect and keep and sit on it, because there is really sensitive information in there, and there's probably also information that, 20 years down the road, you don't want to have, for a variety of reasons. People are always like, well, you're a lawyer, don't you want to keep things? No, I actually want a schedule, and I want to discard, because at some point, if I'm keeping everything and it does end up in litigation, that's a terrible place to be in from a records-production standpoint. So I wish I had a perfect solution; as a matter of fact, if somebody does, they could type it in the chat for us and I would really appreciate it.

Great, well, we're right at time, and I wanted to remind everyone that the third part of our series will be August 13th at noon Eastern time. As a reminder, if you signed up for this session, you're actually signed up for the whole series, so you'll get information about that. And just special thanks to Jen and to Elise for providing your time, providing your expertise, and really helping us start to think about those strategies. And as a reminder, next time
we're going to dig into the weeds, and given the number of folks who said they were on the operations side, I feel like you're going to be really happy. So thank you all, and please do let us know if you have any questions at all; happy to help.
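To tie together the PII rubric and records-retention ideas from that last exchange, here is a minimal, hypothetical sketch of what a data inventory entry and a periodic discard review might look like. The field names, categories, and retention periods are placeholders to adapt, not recommendations for any particular institution.

```typescript
// Hypothetical sketch of an inventory-and-retention rubric: each collected field
// records where it comes from, why it is kept, who can see it, and how long it lives.

type DataInventoryEntry = {
  field: string;            // e.g. "survey_life_experience"
  source: string;           // where it is collected (form, import, vendor feed)
  businessCase: string;     // why we keep it; if this is empty, question keeping it
  sensitive: boolean;       // PII or otherwise sensitive (e.g. DEI-related responses)
  internalAccess: string[]; // teams with access
  externalAccess: string[]; // vendors with access
  retentionYears: number;   // how long before scheduled discard
  collectedAt: Date;
};

// True when the entry has passed its retention window.
function isPastRetention(entry: DataInventoryEntry, today: Date = new Date()): boolean {
  const cutoff = new Date(entry.collectedAt);
  cutoff.setFullYear(cutoff.getFullYear() + entry.retentionYears);
  return today > cutoff;
}

// Flags data with no stated business case, or sensitive data past its retention
// window, as candidates for discard during a periodic review.
function flagForReview(inventory: DataInventoryEntry[]): DataInventoryEntry[] {
  return inventory.filter(
    (entry) => entry.businessCase.trim() === "" || (entry.sensitive && isPastRetention(entry))
  );
}
```

Run against a full inventory on whatever cadence the retention policy sets (quarterly, annually), a review like this turns "nice to have" data into an explicit keep-or-discard decision rather than something that accumulates by default.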
Video Summary
In the video transcript, Jenny Cook-Smith and her panelists discuss the importance of establishing a data privacy steering team to address challenges and strategies related to data privacy in organizations. They emphasize the need for collaboration across different departments such as legal, IT, fundraising, and marketing to address data privacy concerns effectively. The panelists also touch upon topics such as assessing risk tolerance, managing AI and data privacy, and the importance of regular communication with leadership and the board to ensure data security. Additionally, they suggest setting up a data share agreement to govern the sharing of data between entities within an organization. The discussion highlights the need for continuous dialogue, periodic updates, and a clear understanding of how data is collected, stored, and used in compliance with privacy regulations and best practices.
Keywords
data privacy
steering team
collaboration
risk tolerance
AI management
communication
data share agreement
compliance
best practices