Jessica: Welcome to the Librarian’s Guide to Teaching! Welcome to our second episode. Today we have Cara Berg as our guest and we're going to talk about student learning assessment and outcomes and whether information literacy can truly be assessed. So that should be an interesting conversation. But before we get started with our interview how are you doing? Do you have anything interesting going on this week?
Amanda: Yea! So two things. Work thing: we are testing out these new drop-in research hours with Zoom [Meetings], where a student can come up to the library desk, click a button, and be automatically connected in a video room with a librarian. So it's a new model that we're trying and we're still working out the kinks, but I'm excited to see if students will actually use it. Personally, I just got a new cat about 2-3 weeks ago and she is finally (pause) working out well with our other cat that we already had. They're playing well together. Jessica: Awww Amanda: No hissing. They're actually chasing each other around. So I'm excited that that is going well. I was nervous. Jessica: That's so cute! Amanda: Yea, so what about you? What's going on? Jessica: So, as I mentioned in the last episode, I've been in this new position for about a month now and I'm diving into different assessments that I haven't done before. So I'm really trying to assess my instruction in this new role. When I'm giving these surveys at the end of the session and students say, "This is what I think went well. This is what could have been improved," I'm trying to track that differently. And as I'm tracking their responses to the questions that are supposed to meet my objectives, instead of just looking at the data and moving on, I'm trying to think of a way to track it more long-term, since I'm doing the same classes over and over again. There are these English classes that we do for all these different sections over and over again, which I kind of did at Berkeley but not in the same way. Not as much. So that's what I'm trying to do: Googling around for something like a rubric for myself or a tracking model. Do I want to just use an Excel sheet? I don't know. I have all these ideas but I'm trying to figure that out right now. And I'm also trying to up my Twitter game. 
Like we talked about last episode, it can be difficult to engage in those conversations, especially with imposter syndrome. But I feel like I did really well this week with following people, adding people, commenting on things, and actually posting. That's something that you kind of have to do, so I think I'm doing good! Amanda: I definitely noticed you upped your Twitter game! You were all up in my feed. Jessica: (Laughs) Yay! Amanda: Good for you! I need to take my own advice and up my Twitter game. Jessica: Yes! I'll tag you in things. Amanda: Alright, before we dive in, we are going to share a brief bio of Cara. So Cara is the Coordinator of User Education and Reference Librarian at the David and Lorraine Cheng Library at William Paterson University. She has worked as an information literacy librarian for 8 years. Her research interests include active learning, assessment, and general information literacy. Her most recent publication, "Updating Learning Outcomes and Engaging Library Faculty with the ACRL Framework," will be published in the Journal of Academic Librarianship. I also want to note that she is this year's ACRL [Association of College & Research Libraries] New Jersey / NJLA [New Jersey Library Association] CUS [College University Section] President and a dear friend of mine. Cara and I have worked together through the New Jersey Library Association for many years now. We were once the co-chairs of the User Education Committee, so I am super excited to welcome her to the podcast! Welcome, Cara! Thank you for coming to talk to us today. Cara: Thank you so much for having me! Amanda: So let's dive right in and talk about assessment. We're going to take a few minutes to give some background on our experiences with assessment. I'll go first. So as the Director of Research and Instructional Services, my job is all about assessment these days. I have experience with classroom assessment but now I'm mainly focused on that bigger-picture assessment. 
But to take it back a step: in the classroom I really try to find a balance between practical and engaging to confirm student learning. For the bigger picture, my team and I collect data on a variety of touchpoints, which allows us to have meaningful conversations with administrators about how to authentically integrate information literacy into curriculums. We mostly use Springshare's LibInsight to gather our data. We also use Springshare's LibWizard to collect classroom data. I have confidence that our data is creating opportunities for us to meet with Deans and Chairs, but I do have concerns about our classroom assessment. Jessica, you want to share a little bit about your background and experience? Jessica: Sure! So as an instruction librarian I've mostly focused on in-class assessments and just ensuring that students meet the objectives that I've set for that lesson. And since so many of my lessons are one-shots, I rarely feel like I get to see my students develop real information literacy, because I don't see them past that session. In isolated cases when I was at Berkeley, I would maybe see students in an Intro Sociology class, then an English class, and then maybe in a major-level class, but that really wasn't the norm. And you know, you [Amanda] were always great about keeping all the instruction librarians engaged in the more department-level assessment and showing our value. But there are so many moving parts and different institutional needs and barriers, so I wasn't really as engaged in it as you were. I'm still learning and understanding more about how librarians can really prove that their students are information literate on a more campus-wide scale, so I'm really excited to talk to Cara today and learn more! Amanda: So let's take a minute to talk about and provide some quick descriptions of different types of assessment. Interestingly enough, an article was just published this week, right, Jessica? 
Jessica: At least I found it on Twitter this week. I think the post date was October 4th. Yea! Amanda: So this article came out - "Asking the right questions: Bridging gaps between information literacy assessment approaches." This article was written by Alison Head, Alaina Bull and Margy MacMillan, and it has a really great description of the different types of assessment - so, different levels of assessment. I'm just going to briefly go over them so we're all on the same page. So first you have your micro, which is your short-term impact in the classroom. So you teach a lesson and then you assess: "Did those students gain those information literacy skills?" Then you have your meso, which is validating the need for information literacy instruction. So that's kind of a lot of what I do. I'm taking the data that we input into Springshare and I'm creating reports and presentations, showing Deans the gaps: where we're providing instruction, where we're not providing instruction, missed opportunities, etcetera. Then there's the macro and the mega. Those are both the benchmarking opportunities at a national level, where we can see how we're doing against multiple institutions or trends in information literacy. So I definitely encourage everyone to read that article. I'm going to talk a little bit about it later when we get into the questions, but I think they have some really nice, easy descriptions for people to think about their assessment practices. Jessica: Yeah, I agree. And I liked some of the questioning things that they got into at the end, like how to ask the right questions, because the data that you get back is really what you're analyzing to show value and learning. So if you don't ask the right questions, you're not going to get good information. So it's a great article and we'll put it in the show notes. Amanda: Absolutely. So just to touch on that point for a second. I always say to people, "What are we doing with that data?" You know? 
Are we asking the right questions to use this data effectively? If we're not using it, then why are we asking it? Why are we collecting it? Jessica: Exactly. So let's talk to Cara! So we know that you've worked with ACRL's Project Outcome, so could you describe using that as an assessment and any other related projects you've done with it? Cara: Sure! So we were actually one of the institutions that were part of the field test for Project Outcome - when it first came out they were asking who wanted to be a part of this. I actually wasn't the person to volunteer - that was my colleague Nancy. So she signed up for us, and we set it up during the Fall 2018 semester. They have several different surveys, so we chose to use the undergrad instruction one. If you actually look on their website they have a lot. I recommend everyone listening go take a look at the information and the data gathered. For us, we implemented it at the end of all instruction sessions that weren't that basic first-year "Welcome to the library" type class. So we did it for the sophomores and above - everything besides the welcome-to-the-library introductory classes. We actually had about 500+ responses, and we only had the survey open September to October. So prior to the Project Outcome field test, I developed the general assessment survey that we sent out at the end of all classes - it's just a very quick, "How confident do you feel in the library's resources?" type survey. I wanted to keep it really general because I wanted to make sure all librarians, maybe those who aren't doing their own assessment, could use it and feel comfortable just sending it out. 
So first I was kind of excited, because the field test had very similar questions to what I had been asking, so I was just like, "Ooh, that's exciting to me!" But also we already had people in the practice of sending out that survey, so I think we were able to get these responses because people were already used to distributing the information. Um, so again, the survey itself was general enough, and I was really happy to see the field test worked, and ACRL now has this survey available as part of the Project Outcome suite. So the questions asked in the instruction survey were: "I learned something new that will help me succeed in my classes." "I feel more confident about completing my assignment." "I intend to apply what I just learned." Honestly, that question there I think is really, really essential, because a lot of times when we do instruction, we worry, "Well, are they actually going to use what we showed them?" So to me, that is a really key question, and I'm really happy to see it there. "I am more aware of the library's resources and services." I mean, we [unintelligible] want to hear yes after that. And we asked - I mean, they asked - "What did you like most about the session?" and "What else could the library do to help you succeed in your classes?" So that survey shouldn't take that long for the students to fill out, so you don't have to eat too much into class time, but you still get really, really great data. Now, similar to the surveys we did in years past, the responses were honestly mostly positive. You know, students had the session and they really see the value in instruction. When we looked at our data, again, a lot of it was: they said they were going to use the resources again, they mentioned concrete things they liked about the session. 
One thing that's nice to see is someone saying not just "I learned how to find a book" or "I learned how to find an article," but mentioning specific resources or specific skills. I think that's really important. And then if you see a lot of students mention the same thing, you know you either did really well with that or that might be something you need to go fix. But anyway, we did Project Outcome again in Fall 2018. So when I relaunched my survey, I took some of those questions and incorporated them into my survey. So my survey is still my old questions plus some of the Project Outcome ones, and I like it. I think it works. I think the librarians are still sending out the survey. I still have to remind people to send it out. But overall, I think people are doing it and we're getting this good assessment data. And that was more the Project Outcome one. But we have done other assessment activities. The same colleague I mentioned before also surveyed, every other year, the faculty who brought classes in for instruction. And we did that for almost 12 years, and then a couple of years ago I surveyed the faculty who didn't bring their classes in for instruction, because you're trying to find out, "Well, why haven't you?" - especially if you have a research-based assignment. And we found out such interesting information from that. Notably, that 33% of the respondents didn't even know that this was a service that the library provided. Which to me is mind-boggling! But we were able to tailor our outreach to that population and knew that, ok, people don't know that we do this. So those are the two really big things that I work a lot on. So I think, Amanda, that's more of the micro or the meso that you were talking about. I feel like that's where that is involved. We do LibQUAL surveys, and I think that's more on the nationwide data side, but I think we only do that every three or four years. Amanda: Interesting. 
Jessica: So have you used the Project Outcome data to benchmark in that way through Project Outcome? Cara: Honestly, we just gathered the data, we shared it with the librarians and everything, and then it'll be in our assessment report for the end of the year. Jessica: Ok. Great. Amanda: Yeah, that is interesting. How often are you guys reviewing the data, and who are you sharing the data with? Cara: Honestly, that's a really good question. With the Project Outcome data, we showed everybody at the faculty meeting. With the different faculty surveys we did, we would share those at the faculty meetings, too. The individual instructor surveys are a little harder. I have access to them and it's up to me to distribute them to faculty, and it doesn't happen as... I need to get a little better at being more rapid with disseminating the results, 'cause sometimes when I get it out, it'll be a while after the class, and that's not as useful. So that's a practice that I'm trying to get more in the habit of: sending them out right away when we get that survey. We use Google Forms for that, so it's a little tough to get in and then get the results out, and not everybody has the password. Amanda: Right, right. And then I guess one other follow-up question is: how are you administering your Project Outcome surveys? Are you doing it electronically or paper-based, and what's the completion rate like? Cara: So right now the survey we do isn't the Project Outcome one. It's just the general survey into which we incorporated some of the questions. We send it out electronically at the end of the class. When I first developed the survey instrument, before Project Outcome, sometimes we would email it out, and the response rate for emailing students the survey was not good. You'd get maybe one or two students out of a class of 20 who would go back to their email and respond. 
In the class, it's a lot easier. Most of the students will fill it out; the response rate's pretty high per class. It depends also on whether we're in a lab. I mean, I know it sounds kind of obvious, but in some cases we won't have our instruction in labs - we might go to the professor's classroom - and whether the students have their own devices determines whether they fill it out. So the response rate is much higher when we have actual software we can push it out with, but with emailing students the survey after the class, the response rate's very low. So what we say to do is try to get it out in those last 5 minutes. Take the time to send it out. Tell the students. And in some instances we have software that can push the link out to the individual computers, so the students don't even have to type in a TinyURL. I really like TinyURLs - I make them a lot. I think they're good and easy to use. But even so, if a student gets the survey on their computer, they fill it out quicker than if they have to type in a TinyURL. Jessica: Yeah, that's definitely true...Yeah. Amanda: Yea...I looked into Project Outcome. I was really excited about it, but I think what we're doing is kind of beyond that at this point. But what I really, really liked about their website in general is the tools that they provide for people who are just starting out to do assessment, and that explanation. 'Cause I think - and we'll talk about this in a minute - the biggest challenge of assessment is that librarians don't feel like they have the knowledge and confidence to create assessment tools, and I think this is a nice stepping stone for librarians who are interested in doing assessment. Cara: No, I agree, I agree. I think assessment is scary, and I don't think it's because people don't want to know how they're doing, 'cause I think people want to know - especially in instruction, you want to know if you're being helpful. 
I think it's just kind of intimidating when you're working in a university setting and you have all these other departments doing this, you know, high-tech, really intense assessment, and you're like, "What do I do? How do I do this?" And I like Project Outcome. I thought it made assessment a little more user-friendly, and it wasn't just instruction, too - they have other surveys. You could use it for programming, which I thought was really helpful. Amanda: Yeah, yeah, I think so too. All right, so let's get into our next question, which is: what challenges have you faced the most when trying to show the library's value with assessment data? Cara, you want to start? Cara: Sure! So I think my biggest challenge is what Jessica was saying. We don't often have direct assessment, because we don't have the student papers. We have surveys post-class, and I still think that's a great measure, but we're not getting that direct student work, and that's really hard to get. And we lose touch with the students once they have the class, so we don't know how they do. I still think it's a really valuable measure to do these surveys, and I'll talk about that in the next part of the podcast. But I think that is kind of challenging, because when you're talking to people about information literacy and the skills, and you're saying, "Well, the students all felt confident," then people will say, "Well, how do you know?" At some point, if you don't have the actual work, you have an idea, but you don't know for sure. But what also is interesting, and kind of a little funny: when I show people or I talk about the data, I'm like, "All these students felt really comfortable about this"... I remember I had my results up - this was the first survey I did - showing most of the students rated us a 4 or a 5. 
It was a scale for "How confident do you feel using the library's resources?" And [unintelligible] a former faculty member looks at it and he's like, "Wow, that's really interesting to me." Yea, of course - we're really good at what we do! This makes sense! But I think people are kind of surprised that students felt so good after having an instruction session. So that was really interesting to me. Amanda: Yeah, yeah. I definitely understand where you're coming from. I would say I face a lot of challenges when it comes to assessment and using assessment data. Even just to take a step back, my biggest challenge is getting librarians on board to administer assessment and develop assessment tools. One of the issues is, like I said, they don't have the confidence, or they're nervous that they're not creating a legitimate tool. And then time. I mean, when you want to do meaningful assessment at the micro level, a lot of it is reading papers and open-ended responses, and assessing that data takes a lot of time. And I don't think librarians necessarily have the time to do that. I think it's important, but I think librarians find it too time-consuming. And there's just no way around it. I mean, in the article that I mentioned earlier about bridging the gaps and asking the right questions - when you want to prove your impact on learning, you have to take the time to really thoroughly assess a paper or an annotated bibliography or a discussion board with a rubric, and it takes time. I guess from my point of view outside the classroom, another challenge that I face is human error. The way we have it set up with our instruction is that every time a librarian conducts instruction, they submit a form that we created, but it's a lot of manual input. 
I've tried to automate it with pre-filled forms as much as possible, but there are some things that librarians don't interpret correctly, and it skews our data, so I think that's one of my challenges. And obviously some of these things are correctable, but over time it's hard to go back and retroactively make those corrections. And if a librarian doesn't submit that form, then to me that instruction didn't happen and I can't report on that data. So human error and, you know, getting people to participate in assessment are my biggest challenges. Jessica, you want to share your challenge? Jessica: Yeah, sure! So, as I mentioned, most of my assessment is on that micro level in the classroom, so it's designing inclusive, objective-based assessments for a one-shot session that really show student learning. Like, LibWizard forms and tutorials are great, and group activities and questions are great, but it is a challenge in a one-shot session to come in not knowing the prior knowledge of the students, have to present them new ideas, and then show that they're learning and keep them engaged at the same time. So it's all like a package deal. [Everyone laughs] And I do get bogged down sometimes with assessing certain things that the professors want them to know, and teaching those narrow library skills that get them to complete the assignment, as opposed to learning skills that might be more widely used across their classes. So that does happen sometimes too. And this is by no means a new problem for instruction librarians, but for me it's been the biggest one. Do either of you have go-to in-class assessments that are a really holistic representation of student learning? Cara: I mean, the survey we've been using is really general. It's just "What did you learn?" and "Do you think you would be able to use these resources again?" Jessica: Right. 
Cara: But you know, like Amanda was saying, I've had librarians come up to me like, "Oh, I forgot to send out your survey," and I'm like, "It's not my survey, it's our survey." And what I do is... we developed the questions based on the learning outcomes - and again, we have very generalized learning outcomes - but I really agree with what you say about the skill set versus the "well, help them find an article." And then if the professor has an assignment and they're like, "Well, they need this from this source," you miss the overall skills part of that. So I definitely get it. Jessica: Right, it becomes "Here's how to find an article for a literature review," and not "Here's how to find articles"... you know, that you can also use these skills in your everyday life. So it gets to be a little narrow sometimes, and - the same thing that you were saying before - we can't always see their longer-term assessments, their papers, afterwards, so we're not able to get that either. Some faculty are good about it, but others are not, and I understand we have to support that, but that's my challenge. That's my biggest assessing challenge lately. Amanda: Yea, lately I am all about the computer lab. Or checking out laptops and using them in the classroom. I want students to walk away with something. I want them to find it meaningful to them. So I only go in after they've been assigned topics, and I'll only go in if I can do a full workshop where they're submitting and searching, and then I can have something tangible to walk away with that's not a quiz. You know, "What is a database?" and then having them pick from a multiple choice - that's not helping me. That's some type of indication, but that doesn't help them in the long run, so I stay away from that. So my assessment is complex: I have to take the time to go over the results and confirm learning. So that's my go-to. 
It's not quick and dirty, but it is useful and it's authentic, and it gives me something to go back to the professor with and say, "Your students struggled with finding that industry report. Only one student was able to find it. They need this skill for their research paper. Let me provide you with an additional resource." The only way I'm able to do that is if I ask them in my worksheet to find an industry report. Jessica: Right. Cara: We have, um, this professor who I work with in business statistics, and he actually had me help him develop the rubric for the assignment. And that's amazing, because that doesn't happen that often, if at all, but it was wonderful because I go in and teach the students what they need, and he has it so that, you know, for them to get an "A," they have to do a complex library thing. They have to do an interlibrary loan; they have to analyze a scholarly, peer-reviewed journal article. And then there was one semester I was able to go in and see their final assignments - I sat during their presentations and I got to see it. And that was amazing, because I saw what I did in action, and I saw their final work, and I saw them using it. But that's not all the instructors. So I get it. That's another thing: I wish we could do that for every class. I don't have time to physically go to every class and read every single final paper. I wish we did. Jessica: Right. Amanda: I think that's a great segue into our next question, which is: on a large scale, can we ever prove that our students are information literate when they graduate? And, you know, maybe think about this idea of: are we going in circles when it comes to our assessment efforts? Are we ever able to prove students are information literate? Cara? Cara: So I think the surveys that we do - the micro and the meso - are still essential and necessary, even if it's not direct assessment, because if it wasn't working we would see it, you know? 
The students would be saying, "I didn't understand this" or "No, I'm not comfortable," and I think they're still necessary even if it's not that direct assessment. With that said, I think there are a lot of really interesting research projects that could maybe be done, so one thing would be to study a group of students long-term as they go through college and see how their information literacy skills improve. But the other thing is - and Jessica, you brought this up - information literacy isn't just "I know how to research"; it's life-long learning. So I would love to see some sort of research study of students' information-seeking behavior after they leave college, but not just research-based. Like everyday life. Are they still sharing bad articles on Facebook? Are they actually reading things before they send out that information? Are they processing their news well? I don't know how to actually do that, but I think that might be really interesting, because again, we're talking about information literacy as a skill they have for life, not just "okay, I can write a research paper." And you know, we know this: most of our students are never going to write a research paper after they leave college, because they're not going to go into the field that we're in. And that's ok! But we still want to make sure they have these skills. So I know it's a very long answer, but I don't know. I think there's a way. I just wish there was a way that we could look at them after they leave and see how they're processing information. Amanda: Yeah, I agree. I think Rutgers was doing a study like that with their Masters students - with their graduate students. I think - I could be wrong. But I think they were piloting something where students agreed to participate in a study and they were tracked along the way. Cara: Oh my gosh, that sounds so good. Amanda: I think so. I'm not 100%. I think I read something about that. So here's my take. 
I think the first thing that needs to be done - and this is not easy; this is still my goal - is to get information literacy integrated into curriculums. I think the only way to authentically assess information literacy is to have faculty talking about it, and not just talking about it but recognizing that it needs to be integrated and systematic. Like, one-shots serve their purpose, but they need to be part of a bigger piece of the puzzle. It needs to be a part of, "okay, this is the class where we're going to assess if they know how to access information," and "this is the class two semesters later where they're going to show us that they know how to use information." And it needs to be system-wide, all sections of that course, so that we can benchmark the students. I also think that there needs to be a post-test exit survey of some sort before they graduate, where we can say, "Ok, every student in the last semester of their time at this institution will take this information literacy quiz, survey, questionnaire, whatever," and we can use that data to be confident that they are somewhat information literate. One way that would be interesting - I don't know how valid it would be - would be to track students through a program, like you were kind of talking about. Maybe using their student IDs, so every time they participate in library instruction, students have to provide their ID. I don't know what the FERPA laws are around that. I think it's something where you can say we tracked a cohort of students and we've seen their progression, or that nothing has changed. But that's my hot take. Jessica? Jessica: I love that take! But I agree, I think the idea of it being more integrated and systematic is really where it needs to go in the end. 
And I mean, maybe this current information climate will help that happen, 'cause there are all these calls of, like, "We need our students to be more information literate!" And I think they're kind of talking on an elementary, middle school, high school level, but I mean, if we can get it integrated there, then college is just as important, you know? And like we said, information literacy is such a lifelong process from when they're little kids; everybody's going to have different experiences every single year with information literacy, and we're really just getting them in a snapshot of their lives. They're all starting at different entry points, so that also makes it a little difficult to measure, with all these confounding factors of where everybody's starting and "What do they know?" - when their personal life experiences have such an impact on information literacy skills, too. So I mean, at the very least, I hope to also just inspire curiosity about information in my students, so at least there's a chance that they continue to learn those skills. Like, if I can get them interested in disinformation and that becomes something that they become interested in - like you said, Cara, not sharing bad Facebook articles and stuff like that. So if I can find the right thing to hook individual students on learning about learning and learning about information, then at least that's one good thing we did. Amanda: Yeah, I could talk about assessment for hours. Jessica: Yea [laughs] Amanda: It's such a complex topic and it's so interesting. Sometimes I get exhausted by it. Sometimes I feel like we're over-assessing; sometimes I feel like we're not assessing enough. But I still find value in what we're doing. I still think we are on the right path to proving students are information literate. I think it needs to come from systemic change, so... Thank you so much for sharing your perspectives! This was a really interesting conversation! 
We're now going to move into our Work Triumph/Work Fail segment. Jessica, do you want to share your triumph or fail, or triumph and fail? Jessica: Yea, and mine are kind of interconnected, so that's good. I'm at a new institution, so for our standardized instruction I'm doing all of these new lessons that I've never done before. They're all English classes, and the assessments are paper worksheets. But since I'm so used to LibWizard from my time at Berkeley, the first thing I did was convert the handouts into LibWizard forms, and I started using them in class. Which has been great! I love being able to run the Excel reports after class and see everything all in one place. In my first week I was already doing a LibWizard evaluation, just to ask students how I was doing, and now I've been able to combine the class assessment of their learning with the assessment of my session, so I get all of my data together. That was my triumph: using technology to get all of my data. My fail was that students in the first two classes I used it with were really confused by the form. I guess they hadn't been assessed, or had data gathered from them, that way before, so they weren't entering things in the right places, and when I checked the data afterward, they weren't really putting in the articles they were finding. I thought I'd explained it properly, but it wasn't working out. So in the next class I had to go through the whole form, start to finish, in the first five minutes. Maybe someone else would have known to do that, but I had just figured I could show them as we started each section: "Work on this section... then work on that section." Now I've fixed my fail and I'm on the right track. But that was my two-class fail of not getting good data because I didn't present the form correctly. Amanda: Better luck next time! Don't be so hard on yourself. Cara? Jessica: You know me, that's what I do.
Amanda: I know! Cara: So, I'll do the fail first and then the triumph, because the triumph has to do with assessment and the fail doesn't. This actually happened this week. It was a very introductory English class. The professor said the students were doing a research paper and this would probably be their first time in the library. I let them pick a topic, anything they wanted. I picked all of these introductory databases like Points of View, figuring they would all do a social issue, something like that, and they all came in with biographies. I didn't know that ahead of time, so I had to completely rethink: now I have to show biography resources, and I don't do that often! I'm a business librarian; that's really more of what I'm dealing with. I had to change my instruction on the fly. I've been doing this for a while, so it was okay, but I wish I had known their topics ahead of time, because all of a sudden they're going around the room and it's all biographies, and you can't use Points of View for a biography. You have to search the catalog differently for biographies, and these students aren't used to what we have in an academic library; they're used to public-library-type books, but we don't have those. It was challenging. But I wish I had known ahead of time so I could prepare differently. That's my fail, even though I think I worked through it okay. As for my triumph: I know this has been around for a while, but I just tried a Kahoot for the first time. It actually has its own assessment data, which is really exciting. I used it in another first-year English class where I explain the difference between scholarly and popular articles. After my 20-minute lecture I said we were going to review the concepts with a Kahoot, and first of all, the students were super excited because they had all used it in high school.
So much so that when their classroom professor was adjusting the volume because she thought the music was too loud, they said, "No, no, we need the music! The music makes the Kahoot!" [Jessica laughs] It was all so great, because I saw the answers come in and I could say, "OK, half of you got this wrong. Let's review this again." And now I have that data, too. It really made the class more interesting, and the students were really, really engaged. It's just funny how they lit up over this little game they played in high school. That made me really happy. I wish I had started using Kahoots sooner, but better late than never. Jessica: Seriously. Those are the best! I love those. We've been doing these half-hour Intro to the University sessions, and some students just think it's dry, so they zone out. But when I turn that on, the room lights up! It's crazy. Cara: I honestly did not expect the reaction I got, because I've done Poll Everywhere and they're like, "Yea, okay, yay..." but the Kahoot? I have no idea. It was awesome! Amanda: I'll have to try a Kahoot. I haven't tried it. Cara: No?? Amanda: I always get nervous with that kind of stuff in the classroom, but I think I might explore it, considering how you two are praising it. So I'll definitely try it. My triumph and my fail are connected, too. My triumph: I couldn't get into a class because of scheduling conflicts. The professor wanted me to come back and work with her students one-on-one, like a workshop where they'd have computers. I was so excited, but I couldn't do it. So my triumph is that I created a little LibGuide: I put three videos together and embedded a LibWizard in the guide as a concrete assessment tool. And I said to the professor, "Look, I can't come in, but you could still use your class time to let your students work on the project.
Here's an assessment tool that's attached to the research guide. Or you could assign the assessment as a homework assignment, and I can give you the data to show you how your students are doing." My fail is that the professor never responded to my email, and I feel weird about following up. I'm really bad at the following-up part. So I think that's my fail: I created this really great resource, I had a good connection with a professor who was excited about working with a librarian, I couldn't meet her specific need to be in the classroom, and now she's not responding to me. Jessica: Oh no! Amanda: Yeah, so that was my fail... Jessica: That's an awesome idea, though! Cara: Yea! Amanda: I was just trying to be creative. We can only be in so many places at once, and I just couldn't get there. I couldn't shuffle things around, and I thought this was a good alternative. But I guess everyone is different. I'm disappointed on several levels. One, I created the resource for her class and now it's not getting used. Two, I'm hoping she's not turned off by the fact that I couldn't come. Or three, maybe she's put off by my suggestion of giving students a homework assignment. I don't know. Maybe I'm completely overthinking it and she just didn't read my email. Either way it's a fail, and maybe a missed connection at this point. Thank you so much, Cara, for joining us tonight! Cara: Thank you both so much for having me! Library assessment is really interesting and really dynamic, and it's good that we're talking about it. Who knows, maybe more and more people will start to do it and we can have fuller conversations. You never know! Thank you guys so much for having me! I'm so excited to be on this. Amanda: And that was our interview with Cara Berg! We hope you enjoyed our conversation about assessment.
Jessica, do you want to tell everybody where they can find us? Jessica: Sure! I am on Twitter @LibraryGeek611, and Amanda is on Twitter @HistoryBuff820. You can also send us an email at [email protected]. Send us an email or a tweet to share your questions, ideas for potential discussions, and triumphs and fails, and you can tag your tweet with #LibrariansGuideToTeaching. Amanda: Great! I also wanted to quickly note that we are now available on iTunes. Please be sure to find us there and subscribe to our podcast. We want your feedback and questions, and we encourage you to share your triumphs and fails to be read on an upcoming episode. We look forward to sharing another episode with you and we'll talk to you soon. Take care! (Upbeat music)
About the podcast: The LGT podcast is hosted by two instruction librarians interested in sharing their experiences teaching information literacy, discussing current trends, and having meaningful conversations about librarianship.