Annual Conference on Teaching and Learning Assessment CONFERENCE PROGRAM SEPTEMBER 13–15, 2017 PHILADELPHIA, PA DREXEL.EDU/ACONF
MESSAGE FROM JOHN FRY
PRESIDENT, DREXEL UNIVERSITY

I hope you will join us at Drexel for Facilitating Conversations that Matter.

I commend our Provost, Brian Blake, and his team for spearheading this event. It's important that we share best practices across higher education. Colleges and universities face great challenges, and we must work together as colleagues to find solutions. Effective assessment will be critical to that process.

If you're from out of town, we look forward to hosting you in Philadelphia. I believe Greater Philadelphia is the hub for higher ed in the mid-Atlantic region, based on a high concentration of exceptional institutions and a long tradition of educational leadership. Philadelphia is also a great place to be inspired by our nation's history, and to enjoy yourself at our amazing cultural destinations and great restaurants.

I am pleased that Drexel's Conference on Teaching and Learning Assessment has become an annual national and international event, and I look forward to seeing you here.

MESSAGE FROM BRIAN BLAKE
PROVOST, DREXEL UNIVERSITY

The expectations placed on higher education to foster and document students' active and deep learning have never been higher. We live in a time of economic uncertainty, global interdependence, and urgent challenges. If our students are to be equipped with the skills to succeed in such a future, we must reject any claims of quality learning that do not include as their focus students' active learning and understanding and our ability to assess such claims.

At Drexel, our assessment activities are based on institutional values that aim to produce relevant and functional data for aligning curricular design, course content, and pedagogical approaches with Drexel's mission and values. In all assessment activities, the faculty and staff endeavor to take full consideration of the different educational and cultural backgrounds of our increasingly diverse student population. The primary objective of our assessment program is to establish a practice of action research that informs planning and results in tangible improvements for our students.

In attending Facilitating Conversations that Matter, you will enjoy three days of thought-provoking speakers, workshops, and invaluable networking on Drexel's beautiful campus, just minutes from the heart of historic Philadelphia and the birthplace of our nation. Come join us as we work together to ensure that all students have continuous opportunities to apply their learning to the significant, real-world challenges which, no doubt, lie ahead for them.
CONNECT WITH US

View the online version of the conference schedule at drexel.edu/aconf/program/schedule. Here you will find all of the conference materials and session descriptions you may need.

WIFI for the conference is sponsored by
username » aconf2017
password » drexel17

WIFI Instructions:
1. Choose the Drexel Guest network from the available wireless networks.
2. Open a browser and attempt to access a website; you should be directed to the Drexel Guest login page.
3. Click on "Sponsored User" instead of "Visitor."
4. Enter the username and password.
CONFERENCE LOCATIONS

Gerri C. LeBow Hall, 3220 Market Street
Pearlstein Business Learning Center, 3218 Market Street
Main Building, 3141 Chestnut Street
Papadakis Integrated Sciences Building, 3245 Chestnut Street
Creese Student Center (Behrakis Grand Hall), 3200 Chestnut Street

LEONARD PEARLSTEIN BUSINESS LEARNING CENTER
The Pearlstein Business Learning Center is a four-story, 40,000 square-foot facility containing numerous executive classrooms and technology such as video blackboards and document cameras for video conferencing with students, corporate executives, and instructors at remote locations.

GERRI C. LEBOW HALL (LEBOW HALL)
The 12-story, 177,500 square-foot home for Drexel University's Bennett S. LeBow College of Business features an innovative array of classrooms and collaborative academic spaces as well as an environmentally friendly design underscored by a dramatic five-story central atrium.

CONSTANTINE N. PAPADAKIS INTEGRATED SCIENCES BUILDING (PISB)
The 150,000 square-foot building houses 44 research and teaching laboratories for biology, chemistry and biomedical engineering and a six-story atrium containing a 22-foot wide, 80-foot tall biowall, North America's largest living biofilter and the only such structure installed at a U.S. university.

JAMES CREESE STUDENT CENTER (BEHRAKIS GRAND HALL, NORTH & SOUTH)
Behrakis Grand Hall is the Creese Student Center's ballroom, located adjacent to the Main Lounge and left of the lobby of Mandell Theater. Behrakis Grand Hall is frequently utilized for banquets, lectures, meetings and conferences, as it can accommodate up to 1,200 people.
SCHEDULE AT-A-GLANCE

WEDNESDAY, SEPTEMBER 13

9:00 – 12:00  PRE-CONFERENCE WORKSHOPS
An Administrator's Guide to Fostering a Faculty-Led Assessment Process (PEARL 302)
Jacob Amidon & Debora Ortloff, Finger Lakes Community College
Creating & Assessing Campus Climates that Encourage Civic Learning & Engagement (PEARL 303)
Robert D. Reason, Iowa State University
Ready, Set, Go: The New Middle States Standards and Your Assessment Practice (PEARL 307)
Jodi Levine Laufgraben, Temple University
Assessment Toolbox: Supercharge the Direct Assessment of Student Services (PEARL 101)
Michael C. Sachs, John Jay College
Leading Change: Tackling Institution, Program, and Individual Challenges that Derail Assessment Initiatives (PEARL 102)
Catherine Datte, Gannon University; Ruth Newberry, Blackboard Inc.

1:00 – 2:00  WELCOME & OPENING PLENARY (MANDELL THEATER)
Welcome: M. Brian Blake, Provost, Drexel University
Opening Message: Creating a College Culture Where Assessment is a Pathway to Student Success
Sylvia Jenkins, President, Moraine Valley Community College

2:00 – 2:15  BREAK

2:15 – 3:15  CONCURRENT SESSION 1
Building Faculty Support for a Quantitative Reasoning Requirement: Holistic Assessment of Curriculum and Learning (PISB 104)
J Bret Bennington, Frank Gaughan, Terri Shapiro and S. Stavros Valenti, Hofstra University
From First to Final Draft: Developing a Faculty-Centered Ethical Reasoning Rubric (PISB 106)
Genevieve Amaral, Dana Dawson and John Dern, Temple University
Students Leading the Way: Student-Driven Assessment (PISB 108)
Timothy Burrows, Virginia Military Institute
Closing the Loop on Data Collection and Program Improvement (PEARL 101)
Chadia Abras and Janet Simon Schreck, Johns Hopkins University
Criterion Met. Now Time to Reflect (PEARL 102)
Kathryn Strang, Rowan College at Burlington County
Implementing Assessment in Student Conduct: Understanding a Balancing Act of Challenge, Support, Accountability, and Growth (LBOW 109)
Jeff Kegolis, The University of Scranton
Faculty as Networked Improvement Community: Alignment of EdD Program Learning Objectives, Standards, and Measurable Outcomes (LBOW 209)
Joy Phillips, Kathy Geller and Ken Mawritz, Drexel University
Building a Culture of Assessment and Embracing Technology: A Communication Studies Program Success (LBOW 108)
Patricia Sokolski, Jaimie Riccio and Poppy Slocum, LaGuardia Community College

3:15 – 3:30  BREAK

3:30 – 4:30  CONCURRENT SESSION 2
Self-Esteem is Doomed: A Paradigm Shift to Self-Compassion Allows Everyone to Thrive in Higher Education (PISB 104)
Laura Vearrier, Drexel University
Snapshot Sessions (5-minute mini-sessions) (PISB 106)
• Does Class Size Matter in the University Setting?
  Ethan Ake and Dana Dawson, Temple University
• I See What You Mean: Using Infographics and Data Visualizations to Communicate Your Assessment Story
  Tracey Amey, Pennsylvania College of Technology
• The Impact of the 3R2V Strategy on Assessment Questions in the Science Classroom
  Deshanna Brown, Barry University and Broward County Public Schools
• Assessing and Addressing the Digital Literacy Skills of First-Generation College Students
  Nicole Buzzetto-Hollywood and Magdi Elobeid, University of Maryland Eastern Shore
• Utilization of External Reviewers for Student Learning Assessment
  Anthony DelConte, Saint Joseph's University
• Core Curriculum Outcomes: Reflections, Reactions, Results, and Other Assessment Tales
  Seth Matthew Fishman, Villanova University
• Developing an Exceptional Academic Advising Program Using Student Satisfaction Survey Data
  Debra Frank, Drexel University
• Faculty Centered Assessment: Getting the Right People to the Right Place at the Right Time
  Brooke Kruemmling, Salus University
• Showing Educators How to Teach Traumatized Students
  Jonathan Wisneski and Anne Hensel, Upper Darby School District and Drexel University
Assessing Critical Reflection: Learning in Faculty-led Short Term Study Abroad Programs: Students in Developed Countries (PISB 108)
Akosa Wambalaba, United States International University
Both a Science and an Art: Designing, Developing, and Implementing Academic Program Evaluations that Work (PEARL 101)
Erica Barone Pricci and Alicia Burns, Lackawanna College
Lost with ILO Assessment? No Worries, We Bet You are Heading in the Right Direction (PEARL 102)
Jacqueline Snyder, SUNY Fulton Montgomery Community College; Mary Ann Carroll, SUNY Herkimer County Community College
Our QuEST for Improving Learning: Year Two Analysis of Wellness Course Revisions (LBOW 109)
Mindy Smith and Susan Donat, Messiah College
Assessors of the Galaxy: Using Technology Integration to Shift a Culture (LBOW 209)
Ryan Clancy, Mark Green and Nina Multak, Drexel University
30-Minute Split Sessions (LBOW 108)
• Assessing Student Learning in Student Affairs: There's Just Not Enough Time!
  Debbie Kell, Deborah E. H. Kell, LLC
• Assessment Tools for Experiential Learning and Other Highly Impactful Practices
  Melissa Krieger, Bergen Community College

4:45 – 5:30  ICE CREAM SOCIAL (PISB ATRIUM)

6:00 – 10:00  PHILLIES GAME (CITIZENS BANK PARK)

THURSDAY, SEPTEMBER 14

7:30 – 8:30  CONTINENTAL BREAKFAST

8:45 – 9:45  MORNING PLENARY (MANDELL THEATER)
Reclaiming Assessment: Unpacking the Dialogues of Our Work
Natasha Jankowski, National Institute for Learning Outcomes Assessment (NILOA)

10:00 – 11:00  CONCURRENT SESSION 3
Background, Methods, and Results of a 7-year Longitudinal Assessment of Undergraduate Business Writing (PISB 104)
Scott Warnock, Drexel University
The Wizards of Assessment: Peel Back the Curtain and Experience the Art and Science of the Assessor (PISB 106)
Mark Green and Ray Lum, Drexel University
Learner-Focused Assessment for the Creative Mind: Cultivating Growth for All Learners (PISB 108)
Amanda Newman-Godfrey and Lynn Palewicz, Moore College of Arts and Design
Working Hand-in-Hand: Programmatic Assessments and Institutional Outcomes (PEARL 101)
Frederick Burrack and Chris Urban, Kansas State University
Assessing Engagement in Active Learning Classrooms (PEARL 102)
Dawn Sinnot, Susan Hauck and Courtney Raeford, Community College of Philadelphia
Collecting Meaningful Assessment Data: An Accreditation Strategy (LBOW 109)
Jane Marie Souza, University of Rochester
Peer-to-Peer Blueprints: Leveraging Hierarchical Learning Outcomes and Peer Consultants to Foster Faculty Discussions of Assessment (LBOW 209)
Michael Wick and Anne Marie Brady, St. Mary's College of Maryland
We Need More: Novel Metrics for Classroom Assessment and Proposed Standards in Nonformal Learning (LBOW 108)
Caitlin Augustin, John Harnisher and Kristen Murner, Kaplan Test Prep

11:00 – 11:15  BREAK

11:15 – 12:15  CONCURRENT SESSION 4
All About that 'Base: Database Design as Part of Your Assessment Toolkit (PISB 104)
Krishna Dunston, Delaware County Community College
Task-based Assessment: A Step-by-Step Guideline (PISB 106)
Ramy Shabara, The American University in Cairo, Egypt
Cracking the Code of Creative "Capital": Assessing Student Creativity in Science, Engineering and Technology Courses (PISB 108)
Jen Katz-Buonincontro, Drexel University
Comparative Program Assessment to Increase Student Access, Retention, and Completion (PEARL 101)
Catherine Carsley and Lianne Hartmann, Montgomery County Community College
Acting on Data: Lessons about the Use of Student Engagement Results to Improve Student Learning (PEARL 102)
Jillian Kinzie, Indiana University
Critical Thinking: It's Not What You Think! (LBOW 109)
Janet Thiel, Georgian Court University
Text Analysis as Assessment for Ethical and Diagnostic Purposes (LBOW 209)
Fredrik deBoer, Brooklyn College
Assessment as Research: Using Compelling Questions to Inspire Thoughtful Assessment Practices (LBOW 108)
Javarro Russell, Educational Testing Service (ETS)

12:30 – 1:45  LUNCHEON PLENARY
An Accreditation Roundtable Discussion
Elizabeth Sibolski, President, Middle States Commission on Higher Education (MSCHE)
Belle Wheelan, President, Southern Association of Colleges and Schools Commission on Colleges (SACSCOC)
Patricia O'Brien, Senior Vice President, Commission on Institutions of Higher Education of the New England Association of Schools and Colleges (NEASC)

2:00 – 3:00  CONCURRENT SESSION 5
Knowing More About Our Students in Foundational Math and Writing Reform: Building Multi-Faceted Assessment on the Front End (PISB 104)
Fiona Glade, University of Baltimore
Snapshot Sessions (PISB 106)
• Assessing Our Assessment: A Process for Reviewing Annual Assessment Reports
  Gina Calzaferri, Temple University
• Turning 120 Annual Reports Into a Searchable Online Planning/Reporting Database Linked to Strategic Plans
  Wenjun Chi, Saint Joseph's University
• Rubrics: Facets That Matter
  Diane DePew, Drexel University
• The Impact of Co-Curricular Activities as an Assessment Tool on the University Students
  Muhammad Farooq and Gehan El Enain, Abu Dhabi University
• Learning from the Assessment Process: HBCU Faculty Perspectives on Classroom and Program Review
  Pamela Felder and Michael Reed, University of Maryland Eastern Shore
• Enhancing Online Course Design by Developing Faculty-Administration Collaborations and Using Quality Matters Rubrics
  Moe Folk and Doug Scott, Kutztown University
• Collaboratively Assessing Collaboration: Self, Peer and Program-level Assessment of Collaborative Skills
  Janet McNellis, Holy Family University
• Developing an On-Line Simulation Activity: Assessing the Need and Implementing Action!
  Margaret Rateau, Robert Morris University
• Assessment in Science Using the I-LEARN Model
  Hamideh Talafian, Drexel University
Implementing a Student Assessment Scholar Program: Students Engaging in Continuous Improvement (PISB 108)
Nicholas Truncale, Elizabeth Chalk, Jesse Kemmerling and Caitlin Pelligrino, University of Scranton
Organizing Program Assessment as Collaborative Problem Solving (PEARL 101)
Barbara Masi, Penn State University
Educational Development and Assessment: Simultaneously Promoting Conversations that Matter (PEARL 102)
Phyllis Blumberg, University of the Sciences
Listening for Learning: Using Focus Groups to Assess Students' Knowledge (LBOW 109)
Corinne Dalelio and Christina Anderson, Coastal Carolina University;
Gina Baker, Liberty University
Rebooting Work Based Assessment for the 21st Century: Shifting to Digital Technologies for Student Nurses (LBOW 209)
Sian Shaw and Anne Devlin, Anglia Ruskin University (UK)
Drexel Outcomes Transcript & Competency Portfolio: Empowering Students & Faculty with Evidence of Learning Using Effective Assessment (LBOW 108)
Mustafa Sualp, AEFIS; Stephen DiPietro and Donald McEachron, Drexel University

3:00 – 3:15  BREAK

3:15 – 4:15  CONCURRENT SESSION 6
Assessing Our Assessment: Findings and Lessons Learned Three Years Later (PISB 104)
Victoria Ferrara, Mercy College
Everything I Ever Wanted to Know About Assessment I Learned from Reality Cooking Shows (LBOW 108)
Krishna Dunston, Delaware County Community College
Grit in the Classroom (PISB 108)
Rebecca Friedman, Johns Hopkins University
Decoupling and Recoupling: The Important Distinctions between Program Assessment and Course Assessment (PEARL 101)
Nazia Naeem, Lesley Emtage, Debbie Rowe and Xiodan Zhang, York College
Encouraging Meaningful Assessment By Celebrating It! (PEARL 102)
Letitia Basford, Hamline University
Application of the Collaborative Active Learning Model (CALM) Simulation: An Experiential Service Learning Approach (LBOW 109)
Francis Wambalaba and Peter Kiriri, United States International University
Process-Based Assessment and Concept Exploration for Personalized Feedback and Course Analytics in Freshman Calculus (LBOW 209)
Mansoor Siddiqui, Project One; Kristen Betts, Drexel University

5:30 – 7:30  RECEPTION: THE PYRAMID CLUB & AQUA STRING BAND (THE PYRAMID CLUB)
FRIDAY, SEPTEMBER 15

7:30 – 8:30  CONTINENTAL BREAKFAST

8:45 – 9:45  CONCURRENT SESSION 7
It's Prime Time to Shift IE to EE Planning and Assessment – but How? (PISB 104)
Mary Ann Carroll, SUNY Herkimer County Community College; Jacqueline Snyder, SUNY Fulton-Montgomery
Curriculum Maps: Who Starts a Trip Without a Map? (PISB 106)
Alaina Walton and Anita Rudman, Rowan College at Burlington County
Using Course Evaluations to Better Understand What Your Academic Program is Messaging to Your Students (PISB 108)
Beverly Schneller and Larry Wacholtz, Belmont University
Snapshots: Mission Impossible and Other Assessment Tales (PEARL 101)
Joanna Campbell, Maureen Ellis-Davis, Gail Fernandez, Ilene Kleinman, Melissa Krieger, Amarjit Kaur and Jill Rivera, Bergen Community College
Developing Sustainable General Education Assessment: The Example of Oral Communication Assessment at St. Lawrence (PEARL 102)
Valerie Lehr, Christine Zimmerman and Kirk Fuoss, St. Lawrence University
Beyond the Classroom: A Collaborative Pilot to Unify Learning Assessment Across Six Academic Support Units (LBOW 109)
Jocelyn Manigo and Janet Long, Widener University
Many Questions, Multiple Methods: Assessing Technological Literacy and Course Design in a Modular Team-Taught Course (LBOW 209)
Dana Dawson, Temple University
Best Practices in Assessment: A Story of Online Course Design and Evaluation (LBOW 108)
Gulbin Ozcan-Deniz, Philadelphia University

9:45 – 10:00  BREAK

10:00 – 11:00  CONCURRENT SESSION 8
Expanding and Developing Assessment Practices to Include Administrative, Educational, and Student Support (AES) Units (PISB 104)
Christopher Shults, Erika Carlson and Marjorie Dorime-Williams, Borough of Manhattan Community College
Creating a General Education Capstone: Assessing Institutional Outcomes through General Education (PISB 106)
Jenai Grigg and Gina MacKenzie, Holy Family University
Assessing Information Literacy for Community College Students: Faculty and Librarian Collaboration Leads to Student Improvement (PISB 108)
Janis Wilson Seeley and Graceann Platukus, Luzerne County Community College
Community-building through Assessment Design: Reframing Disciplinary Student Outcomes as Inquiry-based (PEARL 101)
Brad Knight, American University
Data-driven Conversations to Make a Difference in Campus-wide General Education (PEARL 102)
Mindi Miller, Molly Hupcey Marnella and Bob Heckrote, Bloomsburg University of Pennsylvania
Should You Take Student Surveys Seriously? (LBOW 109)
Zvi Goldman, Jeremi Bauer, Susan Lapine and Chris Szpryngel, Post University
Skipping Stones or Making Splashes: Embedding Effective Assessment Practice into Faculty Repertoire (LBOW 209)
Dana Scott, Philadelphia University

11:15 – 12:00  CLOSING REMARKS (PISB 120)
BUILDING FLOOR PLANS

CONSTANTINE PAPADAKIS INTEGRATED SCIENCES BUILDING (PISB)
[First-floor plan: entrances, atrium with registration, and rooms 104, 106, 108, and 120.]
Please pardon our campus construction this year. Thank you.
LEONARD PEARLSTEIN BUSINESS LEARNING CENTER
[First- and third-floor plans: Market Street entrance, connection to LeBow Hall and PISB, rooms 101 and 102 on the first floor and rooms 301, 302, 303, 307, and 308 on the third floor.]
GERRI C. LEBOW HALL
[First- and second-floor plans: Market Street entrance, room 109 on the first floor and room 209 on the second floor.]
Please pardon our campus construction this year. Thank you.
JAMES CREESE STUDENT CENTER
[Floor plan: Behrakis Grand Hall, Mandell Theater, and the Handschumacher Dining Center, with entrances on Chestnut Street.]
CONFERENCE SCHEDULE » WEDNESDAY

WORKSHOP #1: PEARLSTEIN 302
An Administrator's Guide to Fostering a Faculty-Led Assessment Process
Jacob Amidon & Debora Ortloff, Finger Lakes Community College
The conundrum for those of us who are tasked with overseeing an assessment process at a college is that in order for the process to be effective, sustainable and meaningful it must be faculty led, but faculty will not, on their own, embrace the assessment process. In this workshop we will explore several techniques and tools that can be used to foster a faculty-led assessment environment. These include how to reframe the act of assessment, building the capacity of the faculty to engage in assessment, creating efficient processes around assessment and managing up to resource and protect the faculty-led process. Participants will work through several hands-on exercises around these core concepts so they can begin to create their own guide to apply within their own campus context.
At the conclusion of this workshop participants will be able to:
• Develop ideas for framing assessment on their campus.
• Create an initial targeted professional development plan to support faculty leadership in assessment.
• Map out efficiency improvement ideas to support high quality assessment.

WORKSHOP #2: PEARLSTEIN 303
Creating & Assessing Campus Climates that Encourage Civic Learning & Engagement
Robert D. Reason, Iowa State University
After a brief discussion about the connections between campus climates and students' civic learning and engagement, this session will focus on specific ways institutional leaders can create and assess those campus climates that encourage civic learning and engagement. Although the emphasis of the workshop will be on participants' campus contexts, we will use data from the Personal and Social Responsibility Inventory (PSRI), an ongoing climate assessment project at over 40 institutions, to examine what we know about these relationships broadly.
At the conclusion of this workshop participants will be able to:
• Articulate an understanding of how climate shapes learning on college campuses.
• Draw connections between current (and future) campus programs and climates that encourage civic learning and engagement.
• Develop a plan that incorporates campus climate, institutional policies and programs, and student engagement activities to comprehensively assess the development of civic learning outcomes.

WORKSHOP #3: PEARLSTEIN 307
Ready, Set, Go: The New Middle States Standards and Your Assessment Practice
Jodi Levine Laufgraben, Temple University
Implementation of the new Middle States standards provides an ideal opportunity to reengage your campus in conversations about assessment. How do your current practices align with the new standards? Where might you improve? In this workshop we will discuss strategies for using the new standards to renew faculty commitment to the assessment of student learning and reenergize the campus commitment to assessing institutional effectiveness.
At the conclusion of this workshop participants will be able to:
• Outline how their campus's strengths and weaknesses align with the new standards.
• Plan one or more ways to use the new standards to renew campus commitment to assessment.

WORKSHOP #4: PEARLSTEIN 101
Assessment Toolbox: Supercharge the Direct Assessment of Student Services
Michael C. Sachs, John Jay College
The Middle States Commission on Higher Education's publication Student Learning Assessment: Options and Resources, Second Edition states "the characteristics of good evidence of student learning include considerations of direct and indirect methods for gathering evidence of student learning." Creating direct student learning assessment tools within student support services can be challenging for student service professionals. Often many student service programs rely solely on indirect assessment techniques such as focus groups, evaluations, satisfaction surveys, NSSE results, etc.
This workshop will explore the direct student learning assessment tools available to Offices of Student Affairs and other services offices on campus. These techniques and tools are both qualitative and quantitative in intention and design. This workshop will also enable participants to develop program goals, rubrics, and direct student learning outcomes for their student service areas – linked, of course, to their college's mission and/or strategic plan. Participants should bring copies of their institutional strategic goals and mission.
At the conclusion of this workshop participants will be able to:
• Explain the importance of direct assessment for planning, resource allocation and student learning.
• Recognize and understand the differences between direct and indirect assessment in student services.
• Create direct assessment of student learning outcomes for their individual areas/programs that can be incorporated into assessment plans.
1:00 – 2:00 P.M. WELCOME & OPENING PLENARY sponsored by BRIAN BLAKE, PROVOST (Mandell Theater) Greetings and welcoming remarks will be issued by Dr. Brian Blake, Provost and Executive Vice President for Academic Affairs. 12:30 – 1:45 MANDELL THEATER Learning/Director of the Library; and public services librarian Presidents’ Council. In 2016, she was elected to the American WEDNESDAY at Moraine Valley Community College in Palos Hills, Illinois, Association of Community College’s (AACC) Board of Directors Creating a College Culture and director, Library Services and public services librarian at and also serves on the AACC’s Commission on Global Education. Where Assessment is a Virginia Union University, Richmond, Virginia. Her educational In 2016, she was elected to serve a two-year term as an Pathway to Student Success background includes a Ph.D. in Education and Human Resource Associate Member Regional Representative of the Hispanic Studies w/specialization in Community College Leadership Association of Colleges and Universities (HACU). She is a Sylvia Jenkins from Colorado State University, an MLS/Master of Library member of the Illinois Green Economy Network (IGEN) Dr. Sylvia Jenkins was appointed the fifth Science from State University of New York at Albany, and a President’s Steering Committee. She serves on the Cook County president of Moraine Valley Community B.S. in English Education from Grambling State University. Workforce Investment Board and the Forest Preserve District of College on July 1, 2012. Moraine Valley Cook County’s Conservation and Policy Council. She previously She serves on several boards, including the League for Community College is the second largest served on the Forest Preserve District of Cook County’s Next Innovation in the Community College, Community Colleges community college in Illinois. Dr. Jenkins previously served as Century Conservation Plan as well as a state-wide Northeastern for International Development (CCID), Moraine Area Career vice president, Academic Affairs; dean, Academic Development Illinois Public Transit Task Force. Systems CEO Council, Chicago Regional College Program, and Learning Resources; assistant dean, Center for Teaching & and the South Metropolitan Higher Education Consortium WORKSHOP #5: PEARLSTEIN 102 2:00 – 2:15 P.M. Leading Change: Tackling Institution, Program, and BREAK Individual Challenges that Derail Assessment Initiatives Refreshments Available Catherine Datte, Gannon University Ruth Newberry, Blackboard Inc. 2:15 – 3:15 P.M. In keeping with the theme Facilitating Conversations that Matter, this interactive workshop engages participants in conversations focused on successful change initiatives related to assessment. Participants CONCURRENT SESSION 1 will learn to implement the Kotter change model, prioritize initiatives, solicit support, and develop an implementation plan to move a change 2:15 – 3:15 PISB 104 initiative toward success. Success involves a thoughtful, realistic project Building Faculty Support for a Quantitative Reasoning plan, driven by a coalition and supported by a “volunteer army” that can serve as spokes-persons, role models, and leaders to move the Requirement: Holistic Assessment of Curriculum and Learning effort forward. Participants will also learn from one another successful J Bret Bennington, S. Stavros Valenti, Frank Gaughan and Terri Shapiro, Hofstra University strategies to overcome barriers and resistance that limit forward movement. 
Attendees will document their SWOCh, gaps, and vision We will present a holistic model of outcomes assessment that addresses the with the assistance of the co-presenters Catherine Datte and Ruth ‘fit’ between learning goals and learning opportunities in the curriculum Newberry using the Change Leadership Workbook. In a combined while also collecting data on student learning. To illustrate our model, we approach of information gathering and self-appraisal, attendees will present data and analyses from a recent assessment of Quantitative will begin to develop their unique implementation plans and receive Reasoning. If done well, analyses of goal‐curriculum fit can be powerful guidance regarding specific nuances and challenges related to their motivators for faculty and administration to cooperate on curricular institution. Throughout the workshop, Catherine and Ruth will award innovation. This approach led to a broadly supported improvement in books related to the specific challenges that are often associated with the general education curriculum at Hofstra–a quantitative reasoning assessment planning, change leadership, and team building. requirement–that was adopted less than two years after first being proposed. This session will provide attendees with a blueprint for holistic assessment At the conclusion of this workshop participants will be able to: – combining curriculum analysis with student learning assessment – as well • Identify and prioritize critical actions associated with best as a sustainable method for collecting data using online survey tools that practices in program or institution assessment along with could be scaled up to large numbers of participants with little added effort. documenting practical action steps. • Learn strategies from peers and share challenges and successes. LEARNING O UTCO MES : • Create individualized action steps that drive their assessment 1. Participants will learn how to collect and analyze data on learning process. opportunities and engagement within the curriculum (i.e., goal‐ curriculum fit). 2. Participants will learn a sustainable / scalable method for measuring student learning outcomes Audience: Intermediate 14 BUILDING ACADEMIC INNOVATION & RENEWAL
2:15 – 3:15 PISB 106 driven decision making is key to maintaining successful and effective programs. Data analysis is key to assess effectiveness of student learning From First to Final Draft: and curricular relevance. Closing the loop on data collection is key in Developing a Faculty-Centered Ethical Reasoning Rubric making smart decisions in program design, improvement, and delivery Genevieve Amaral, John Dern and Dana Dawson LEARNING O UTCO MES : Temple University 1. Participants will be guided in using effective strategies on creating In this session, we will address how faculty and administrators implemented descriptive assessment rubrics. a faculty-centered rubric development process. Over the course of one 2. Participants will be exposed to strategies on how to analyze data academic year, the team developed, refined and deployed a rubric for the for course, program, and unit level improvements. They will WEDNESDAY assessment of ethical reasoning in a core text program at a large, urban, understand how to triangulate multiple measures in order to drive state-related institution. What began as an institutionally mandated process decisions for curriculum effectiveness. ultimately shed light on unstated, but inherent program goals, and created Audience: Intermediate opportunities to raise awareness of how ethical reasoning informs text selection and learning activities. Presenters will review the program’s history, 2:15 – 3-15 PEARLSTEIN 102 challenges and lessons learned during rubric development, and plans for implementation. Participants will gain insight into the creation of organic Criterion Met. Now time to Reflect assessment tools that contribute meaningfully to day‐to‐day teaching and Kathryn Strang Rowan College at Burlington County curriculum development, and the process of building rubrics to address Rowan College at Burlington County’s assessment process serves as skills such as ethical reasoning which can be ambiguous and value‐laden. a systematic mechanism to measure the strengths and weaknesses L E A RN I N G OUT COM E S: of the college’s academic offerings on a continuous basis. Through 1. Participants will better understand the stages of crafting an assessment the implementation of self‐reflection summaries the Assessment rubric, and strategies for involving faculty in all aspects of the process. Chairs use this tool to highlight what they have learned by conducting 2. Participants will better understand how to validate a rubric and the assessments whether the criterion was met or not. Often these employ it to carry out a direct assessment of student learning. summaries involve very detailed and specific adjustments to the Audience: Beginner curriculum and instructional delivery. Kathryn will lead a PowerPoint presentation followed by a learning activity and a Q&A session designed for professionals with experience in assessment/teaching/ 2:15 – 3:15 PISB 108 learning who seek to develop strategies for a continuous improvement Student’s leading the way: Student Driven Assessment process of assessments. In this session she will outline RCBC’s academic Timothy Burrows Virginia Military Institute assessment process and the tools and strategies used to establish a This session details the development of a student driven assessment strong continuous improvement cycle. 
Kathryn will take participants of leadership outcomes that support the Virginia Military Institute’s through the process of generating assessment results, interpreting these mission of developing confidence “in the functions and attitudes of results, and analyzing their implications through the use of reflection leadership.” Often students are not familiar with the role of assessment summary instrument. At the end of the session, participants will be in higher education and lack a general understanding of processes able to understand how outcomes can be used to create an environment in place to help an institution improve. Including students helped to of continuous improvement. create a high‐level of buy‐in and a sense of ownership (Kuh, Ikenberry, LEARNING O UTCO MES : Jankowski, Cain, Etwell, Hutchings, & Kinzie; 2015). This session 1. Participants will be able to employ a culture of continuous is relevant because it provides a positive example of participant improvement by learning how to: • implement change based upon evaluation and assessment (Ftizpatrick, Sanders, & Worthen; assessment outcomes from various well‐defined performance 2011) in a holistic and natural setting. This process highlights how the indicators relationship between several academic‐support units and students can 2. Participants will be able to employ a culture of continuous foster stakeholder buy‐in and ownership. improvement by learning how to: • design a reflective summary L E A RN I N G OUT COM E S: tool to use at their college 1. Participants will be able to develop new possibilities for student Audience: Intermediate driven assessment practices at their home institution. 2. Participants will be able to debate the benefits, pitfalls, and challenges 2:15 – 3-15 GERRI C. LEBOW HALL, 109 facing the implementation and use of student driven assessments. Implementing Assessment in Student Conduct: Understanding a Audience: Intermediate Balancing Act of Challenge, Support, Accountability, and Growth 2:15 – 3-15 PEARLSTEIN 101 Jeff Kegolis The University of Scranton When considering assessment within a Division of Student Affairs, Closing the Loop on Data Collection and Program Improvement historically Student Conduct is a particular functional area that creates Chadia Abras and Janet Simon Schreck Johns Hopkins University challenges for administrators and educators. Although learning may This session aims to present how to collect effective data from course take place over the course of time in one’s student experience with the assessments using descriptive rubrics. The session will also present conduct process, it may be difficult to understand how a student believes how data collected can be analyzed and utilized to close the loop they are growing through their circumstances and/or the competencies on course and program improvements. Creative and effective ways they have improved upon through reflection and processing of their to derive meaningful inferences from assessment data sets will be situation. Ultimately, this session focuses upon how assessment of explored. In light of an assessment driven culture at most institutions the conduct process was implemented, specifically related to conduct of higher education and compliance with accrediting agencies, data‐ meetings and the results students identified related to their experience. drexel.edu/aconf 15
Depending on the size of one’s institution or one’s Student Affairs implementing assessment mechanisms while wrestling with technical division, attendees who attend this session will be able to engage limitations resulted in a stronger and better articulated program. in dialogue related to best implementing assessment to understand Building a culture of assessment is the best answer to the current competency measurement, the importance of connecting assessment to skepticism over the value of a college education. The systematic one’s office mission statement/division’s priorities/university’s strategic inquiry into teaching and student learning provides a way for higher plan. Additionally, Student Conduct is a programmatic area that may education institutions to demonstrate accountability. Our presentation be difficult to assess due to the nature of the process and a student’s will show the applicability of our college’s assessment model. / We will lack of willingness to be held accountable. Therefore, through this explain how we gained faculty participation, how we implemented a session, attendees will engage in conversation around their individual loop system of assessment, how we overcame technological limitations, WEDNESDAY department’s implementation of assessment and complete a SWOT and what we learned in the process. This should help attendees who analysis of how their assessment is implemented. have to revise curriculum and develop relevant methods of assessment L E A R N I N G OUT C OM E S: especially for a digital ability. 1. Participants will discuss best practices related to assessment within LEARNING O UTCO MES : one’s functional area, and future direction of their assessment based 1. Participants will formulate a proposal for programmatic on lessons learned from previous assessment utilized. assessment. 2. Participants will acquire skills to help develop assessment of 2. Participants will design exercises that assess digital ability specific competencies in relation to their programmatic areas. effectively. Audience: Intermediate Audience: Beginner 2:15 – 3-15 GERRI C. LEBOW HALL, 209 3:15 – 3:30 P.M. Faculty as Networked Improvement Community: Alignment of EdD BREAK Program Learning Objectives, Standards, and Measurable Outcomes Refreshments Available Joy Phillips, Kathy Geller and Ken Mawritz, Drexel University This manuscript describes how Drexel University School of Education 3:30 – 4:30 P.M. faculty have aligned EdD program principles with Carnegie Project for the Educational Doctorate (CPED) design principles, national Council for the Accreditation of Educator Preparation (CAEP) Advanced Program CONCURRENT SESSION 2 Standards, Drexel Student Learning Priorities (DSLPs), and the Drexel School of Education Program Themes / Proposal provides example of 3:30 – 4:30 PISB 104 a participatory, bottom‐up process to align outcomes and assessment Self-Esteem is Doomed: A Paradigm Shift to Self-Compassion activities that includes data/evidence with program‐level learning priorities. Faculty in EdD program aligned Program Learning Outcomes Allow Everyone to Thrive in Higher Education Laura Vearrier, Drexel University with national and institutional standards. Participants can use this process as a model for developing a cycle of continuous program improvement. 
The goal of this session is to teach educators about the elements of self‐ Faculty will share a multi‐step process of identifying program learning compassion—self‐kindness, shared humanity, and mindfulness— and how outcomes (PLOs) beginning with individual course learning outcomes. this construct is more productive than self‐esteem. Self‐ esteem involves This interactive session provides participants with templates (see attached) the need to feel above average and special in comparison to others and will as examples and as working documents to enable participants to engage in inevitably wane in higher education settings. / In a society where being such assessment work at their own institutions. average is unacceptable but the norm, most assessments will be perceived as failures and be unpleasant for the educator and the learner. Self‐ L E A R N I N G OUT C OM E S: compassion involves transitioning from self‐judgement to self‐kindness, 1. Reflections from discussion of faculty working as a network isolation to common humanity, and disconnection to mindfulness. This improvement community to align EdD program learning construct allows for a more positive experience. / Self‐ compassion is for objectives with national standards and measurable student the attendee to learn about for personal well‐being. They can then guide learning outcomes. assessments with the principles of self‐compassion for a more fulfilling, 2. Examples in the form of templates for conducting program‐level productive process for themselves as well the learner. alignment of program learning objectives, standards, and student learning outcomes. LEARNING O UTCO MES : Audience: Intermediate 1. Participants will be able to understand the components of self‐ compassion and how it differs from self‐esteem 2. Participants will be able to apply self‐compassion for oneself 2:15 – 3-15 GERRIE C LEBOW HALL 108 and then use the construct to guide productive self‐ refection in Building a Culture of Assessment and Embracing Technology: learners A Communication Studies Program Success Audience: Advanced Patricia Sokolski, Jaimie Riccio and Poppy Slocum. LaGuardia Community College This session will tell the successful story of a community college communication studies program faced with the challenge of implementing and assessing new general education competencies. Revising objectives and learning outcomes, creating new assignments, 16 BUILDING ACADEMIC INNOVATION & RENEWAL
3:30 – 4:30 PISB 106 reading strategy was developed to assist students to decode test items and unlock the meaning of test questions. Good readers interact Snapshot Sessions (A Collection of Mini Sessions) with the text and decode, read fluently, activate their vocabulary SS1: Does Class Size Matter in the University Setting? knowledge, and use multiple text comprehension strategies. Science Ethan Ake and Dana Dawson, Temple University assessments contains an assortment of information such as graphic organizers, graphs, photos, drawings, and other graphic features. The This shapshot session presents the findings of a multilevel model 3R2V strategy assists struggling readers to strive towards such tasks. examining the relationship between class size and Temple University Attendees will explore the potentials of the struggling reader and how General Education course grades for 172,928 grades nested in 7,704 the 3R2V strategy could be the bridge to their achievement in science. sections across ten semesters in a five year period (Fall 2011‐Spring WEDNESDAY 2016). The study is currently under review for journal publication. The LEARNING O UTCO MES : findings, which indicate that class size is NOT a statistically significant 1. Participants will explore the components of the 3R2V Strategy variable in predicting General Education course grades, have and its potential benefit to struggling readers. implications in terms of teaching and learning, classroom dynamics 2. Participants will examine the implementation process of the 3R2V and policy changes. Class size is a particularly important issue today Strategy in the classroom. given budgetary pressures IHEs face (e.g. declining state aid, RCM). As Audience: Beginner per the conference theme, Facilitating Conversations that Matter, this session aims to stimulate conversation among stakeholders about the SS4: Assessing and Addressing the Digital Literacy Skills of role of class size. The findings also encourage stakeholders to weigh the complex relationship between class size and student achievement and First-Generation College Students consider class size in relation to pedagogy and assessment. Nicole Buzzetto-Hollywood and Magdi Elobeid, University of Maryland Eastern Shore L E A RN I N G OUT COM E S: 1. Participants will be able to understand the relationship between The assessment of the digital literacy skills of first‐generation students class size and student achievement (1) for all students in all course attending a historically Black university through the use of the IC³ types, (2) in the ten General Education program domains and (3) certification program with incoming freshmen will be discussed as for the five student racial groups. well as an evaluation of a core course offered as part of the institutions 2. Participants will be able to critically consider how a statistically general education curriculum. There is a common, and growing, non‐significant result is influenced by pedagogy, assessment and misconception that students enter higher education with the digital student‐instructor interaction. literacy competencies necessary for success; however, the research Audience: Intermediate shows major skill deficiencies among students. This topic is relevant as institutions strive to meet the needs of students with varying levels of technological readiness. 
Attendees will be provided with relevant SS2: I See What you Mean: Using Infographics and Data questions to help them evaluate the effectiveness of their current plan Visualizations to Communicate your Assessment Story for addressing the digital literacy competencies of students. Tracey Amey, Pennsylvania College of Technology LEARNING O UTCO MES : Effective Data visualizations and infographics are highly useful tools 1. Participants will be able to critically discuss the relevance of for conveying messages and complex information. This session will assessing and addressing the digital literacy skills of students with demonstrate easy‐to‐use and readily‐accessible tools, from Powerpoint a particular emphasis on students who may be from underserved to free online programs, that can be used to communicate the story populations. your data tells in an engaging and approachable manner. Representing 2. Participants will be to use relevant questions to evaluate the assessment data visually can be an effective and approachable way to effectiveness of their current plan for assessing and addressing the gain a quick, yet profound understanding of complicated data. This digital literacy competencies of students. session will provide simple, yet effective tools for the user to create data Audience: Advanced visualizations. Assessment data can be overwhelming, not only to those working with it, but also to those who are try to understand it. Data SS5: Utilization of External Reviewers for Student Learning visualization provides an alternative way for the user and the audience to approach data and can be done with common tools, like Powerpoint. Assessment Anthony DelConte, Saint Joseph’s University L E A RN I N G OUT COM E S: 1. Participants will gain a new awareness of the effectiveness of data The session will discuss use of external assessors of interpersonal and visualization for complicated data sets. oral communication skills in real life role‐play scenarios. Students 2. Participants will learn about common and easy‐to‐use tools that must convince a reviewer to part with a resource using the skills and can create data visualizations techniques practiced in the course involving selling skills. Many of our Audience: Beginner interactions involve a series of communications where we persuade or influence thinking or behavior. The ability to communicate that we have something of value to offer in exchange for something else that we SS3: The Impact of the 3R2V Strategy on Assessment Questions value is essential for those in the sciences, humanities, and in business. in the Science Classroom. The ability to “sell” is critical to success in the world. According to Pink, only one out of nine workers have a job title that includes sales. As Deshanna Brown, Barry University and Broward County Public Schools educators, we are selling nearly every day. Evaluating this skill set can High‐stakes testing are assessments with consequences positive or be accomplished in role play scenarios using external reviewers. negative, such as student retention or promotion (Vacca and Vacca, 2009 p. 164). To help my students succeed on science assessments, a drexel.edu/aconf 17
L E A R N I N G OUT C OM E S: SS8: Faculty Centered Assessment: Getting the Right People to 1. Participants will be able to determine innovative ways to evaluate interpersonal and oral communication skills. the Right Place at the Right Time 2. Participants will be able to utilize external assessors to validate Brooke Kruemmling, Salus University student learning outcomes This session will discuss the importance of selecting members for a Audience: Beginner faculty university‐wide assessment committee who are positioned most appropriately to contribute to both programmatic and institutional assessment goals. This presentation will identify the challenges with SS6: Core Curriculum Outcomes: Reflections, Reactions, Results, selecting such individual and the strategies used to develop a cohesive and and Other Assessment Tales WEDNESDAY functional committee. In the current landscape of educational outcomes Seth Matthew Fishman, Villanova University assessment, accrediting bodies are focusing on engagement at all levels The presenters will candidly convey the challenges faced with their current of the program. Creating an effective assessment committee builds upon strategy from our core curriculum’s first large‐scale assessment project institutional tools to promote sound assessment practices. Given the using an ePortfolio. We focus on our Foundation courses, the shared standards relative to assessment of most accrediting bodies, it is critical to intellectual experience of interrelated courses all undergraduate Arts establish college or program level processes to ensure that faculty have the & Sciences students take. We will discuss the lessons learned from our ability to remain engaged in their own assessment practices. pilot experience in 2016 and current process which reviews a sample of LEARNING O UTCO MES : ePortfolios, which involved 15 evaluators from four departments. Attendees 1. Participants will recognize the importance of engaging the will benefit from learning about the process of utilizing ePortfolio and appropriate faculty in the key assessment committee roles. working with multiple departments and academic disciplines, along with 2. Participants will understand the logistical challenges with using evaluator recruitment, training, and tech issues and costs. time effectively, and selecting the right individuals to participate L E A R N I N G OUT C OM E S: in a university assessment committee. 1. Participants will gain at least one strategy to assess a general Audience: Intermediate education/core curriculum. 2. Participants will identify challenges faced when utilizing SS9: Showing Educators How to Teach Traumatized Students technology coupled with evaluative review teams. Jonathan Wisneski and Anne Hensel, Upper Darby School Audience: Intermediate District and Drexel University Increasingly, schools are working to educate and test children who are SS7: Developing an Exceptional Academic Advising Program living with trauma. It is important that educators understand how trauma impacts their students so that they can respond appropriately to Using Student Satisfaction Survey Data effectively educated these students. This content matters because students Debra Frank, Drexel University who are impacted by traumatic events learn and behave differently. 
This In this session, I will show how we used student feedback on their presentation will summarize current research about trauma and its effects experiences with their academic advisors to improve advisor performance on brain development. We will discuss best practices for working with and achieve an overall satisfaction rate of over 92% to 97% on advisor children living with trauma and share practices that we have implemented efficacy, advisor characteristics, and advising satisfaction from baselines of in our building to build a trauma‐ informed staff and school environment. 72% to 84%. I will share the Qualtrics survey that was developed in‐house LEARNING O UTCO MES : and reviewed by NACADA academic advising experts. There is a direct 1. Participants will learn skills and strategies to support students connection between student retention and effective academic advising. dealing with trauma. Without clear student feedback systematically collected from advisees, 2. Participants will learn how to teach other adults the action steps advisors have tendencies to overrate their own efficacy and consider any to deal with students who have suffered from trauma. negative student feedback as anomalous. By providing clear feedback Audience: Beginner and benchmarking performance against the average of high performing advisors, advisors motivation to participate in professional development activities and to accept mentoring was increased. As a result, lower 3:30 – 4:30 PISB 108 performing advisors worked to improve their performance to meet the Assessing Critical Reflection: Learning in Faculty-led Short Term standards set by the advising manager and the high performing advisors. Advisors who are able to establish positive relationships with their advisees Study Abroad Programs: Students in Developed Countries and then get positive feedback about their work feel appreciated and this Akosa Wambalaba, United States International University increases job satisfaction and productivity. Critical reflection is a transformative learning outcome of the L E A R N I N G OUT C OM E S: embedded faculty led short term study abroad program (Gaia, 2015; 1. Participants will be able to explain the connection between clear Russell and Reina,2014) Windows to the World-France at United feedback and staff performance. States International University. We look at short term study abroad 2. Participants will be able to identify the elements of an effective assessment activities and assess evidence of critical reflection skill student satisfaction survey. learning by developing country students in a developed country, with Audience: Intermediate implications on assessment objectives and methods. The focus is on assessing process oriented transformative learning (Mezirow,1998) through multiple channels: engaging field discussions, informal interviews (novel), videos, home campus presentations and thematic research. Faculty in short term study abroad programs must continuously provide evidence of the competencies students acquire, 18 BUILDING ACADEMIC INNOVATION & RENEWAL
You can also read