The University of North Carolina at Greensboro

School of Education - Educational Research Methodology

ERM Students and Faculty Present Research at NCME & AERA

April 7th, 2014

ERM students and faculty had another highly successful showing at this year’s annual conferences of the National Council on Measurement in Education (NCME) and the American Educational Research Association (AERA), held in Philadelphia this past week. In total, ERM students and faculty were involved in 12 papers and posters delivered with numerous collaborators, reflecting the high-quality research being conducted by ERM in the areas of measurement and research methodology.


OAERS Receives Contract Evaluating US Lacrosse Coaching Education Program

March 23rd, 2014

In March of 2014, US Lacrosse finalized its plans to contract with ERM’s Office of Assessment, Evaluation, and Research Services (OAERS) to conduct an evaluation of its Coaching Education Program associated with the Level-1 Certification for the Women’s Lacrosse Game. US Lacrosse is the national governing body of men’s, women’s, and youth lacrosse, with a membership of nearly 500,000 players, coaches, administrators, parents, and fans. As the home of the nation’s fastest-growing sport, a primary goal of US Lacrosse is educating coaches about the technical aspects of the sport, coaching strategies, and player safety.


ERM Students and Faculty Shine at 2014 NCARE Conference

March 11th, 2014

With the 2014 annual meeting of the North Carolina Association for Research in Education (NCARE) held in Greensboro this past February, it was the ideal time for ERM to have its greatest showing ever at this event. A total of 15 ERM faculty, students, and alumni authored research and training sessions presented at the conference. ERM’s contributions to the conference began with ERM Assistant Professor Dr. Holly Downs and ERM students Jonathan Rollins, Emma Sunnassee, and Lindsey Varner presenting the preconference training session “Introducing Powerful Tools for Qualitative Research”. A copy of the slides used for this presentation is located on the OAERS website under Workshops & Trainings.


OAERS Shows Strong Growth and Positive Impact on ERM Student Learning

February 23rd, 2014

It was only a year ago that ERM’s Office of Assessment, Evaluation, and Research Services (OAERS) was formed. The intended purpose of OAERS was simple – to provide the infrastructure needed to support hands-on learning experiences for ERM students. OAERS meets this purpose by supporting and housing analytic and evaluation projects supervised by ERM faculty, on which ERM students work to gain real-life, practical experience. In addition to bringing projects in-house, OAERS facilitates internship and practicum opportunities for ERM students with organizations in need of methodological support. In this manner, OAERS is quickly becoming the hub of out-of-classroom, practical learning experiences for ERM students. An added benefit of OAERS is its ability to package the rich ERM expertise in measurement, data analysis, and program evaluation as a marketable service to organizations.


OAERS Secures Data Analysis Internship with Chatham County Schools

January 29th, 2014

ERM’s Office of Assessment, Evaluation, and Research Services (OAERS) not only secures data analytic and evaluation contracts on which ERM students gain valuable experience, it also facilitates internship and applied field experience opportunities for ERM students. A prime example of this is the recent placement of ERM student Josh MacInnes in a semester-long internship with Chatham County Schools this spring. In this role, Josh will work closely with principals and district personnel to analyze school-level and student-level data for eight public schools within the county. The internship will focus on school growth, student subgroup performance, and the accuracy of formative assessments in predicting student success on summative assessments.


ERM Establishes New Library for Measurement, Evaluation, Psychometrics, and Methodology

December 25th, 2013

The new ERM Library is now open! The ERM Library holds a wide range of books and resources related to research methodology, statistics, item response theory, program evaluation, structural equation modeling, hierarchical linear modeling, educational measurement, psychometrics, and assessment. Access to the holdings of the ERM Library is restricted to ERM students to ensure that these important resources will always be available to the current students of ERM. The establishment of the ERM Library adds another layer of instructional support for ERM students that complements the deep ERM curriculum in research methodology, educational measurement, psychometrics, and program evaluation (see the ERM Academic Programs webpage), as well as the hands-on, practical learning experiences offered by ERM’s Office of Assessment, Evaluation, and Research Services (OAERS). Information on the holdings of the ERM Library and instructions for how to check out a book from the ERM Library can be found on the Resources page of the ERM website, as well as the Current Students page of the ERM website.


ERM Professor Receives Distinguished Paper Award

December 1st, 2013

Randy Penfield, Professor and Chair of ERM, recently co-authored a paper with Dr. Corinne Huggins of the University of Florida that received the Distinguished Paper Award from the Florida Educational Research Association (FERA). The paper, entitled “The Relationship between DIF Location and Conditional Equating Dependence Issues”, explores the manner in which differential item functioning (DIF) effects of specific items contained in anchor tests lead to violations of equating invariance. The results of the paper shed light on the potential impact that DIF in anchor test items can have on the validity of equated scores when a single equating is applied equally to different demographic groups. Because the presence of DIF often reflects the presence of a biasing factor in the test, the paper addresses an important topic related to equity in testing.


ERM Professor Develops Computer Interface for Measurement and IRT Models

November 23rd, 2013

Dr. John Willse, Associate Professor in ERM, has developed the computer program MGR (“Measurement GUI for R”). MGR is a point-and-click interface (programmed in Java) that allows the user to run measurement and item response theory (IRT) analyses provided by R packages without having to engage the R language. Thus, rather than learning the R language to make use of the IRT-related analyses possible with R, users employ the user-friendly MGR software package to run analyses. When the user selects a particular analysis from MGR’s point-and-click menu, MGR composes the corresponding R code, has R execute it, and then reformats R’s output into user-friendly tables. Over time, MGR will have the potential to run any measurement analysis contained in R’s ever-growing library of analyses.
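The compose-and-call pattern described above can be sketched in Python. This is a hypothetical illustration only, not MGR’s actual internals: the function and template names are invented for the example, though `mirt` is a real R package for IRT that such a tool could plausibly call.

```python
# Hypothetical sketch of a GUI-to-R bridge: a menu selection is translated
# into an R script, which would then be handed to an R interpreter and its
# printed output parsed back into a table. Names here are illustrative only.

R_TEMPLATES = {
    # A Rasch model fit via the real R package 'mirt' (one possible backend).
    "rasch": (
        "library(mirt)\n"
        'dat <- read.csv("{data}")\n'
        'fit <- mirt(dat, 1, itemtype = "Rasch")\n'
        "print(coef(fit, simplify = TRUE))"
    ),
}

def compose_r_script(analysis: str, data_file: str) -> str:
    """Build the R code that the point-and-click front end would pass to R."""
    try:
        template = R_TEMPLATES[analysis]
    except KeyError:
        raise ValueError(f"unknown analysis: {analysis}")
    return template.format(data=data_file)

script = compose_r_script("rasch", "responses.csv")
print(script)
```

In a full implementation, the generated script would be run (for example, via R’s command-line interface) and its output reformatted into tables; the sketch stops at code composition so it stands alone.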


ERM Students and Faculty Participate in National Conference on Program Evaluation

November 16th, 2013

ERM faculty and students recently participated in the annual meeting of the American Evaluation Association (AEA), held in Washington D.C. ERM participants included ERM faculty member Holly Downs and ERM students Katherine Ciccarelli, Christine Meyer, Keshia Martin, Emma Sunnassee, Jonathan Rollins, Kshawna Askew, and Shureka Hargrove. Participation in AEA included delivering research presentations, posters, and roundtables pertaining to culturally responsive techniques in program evaluation, the processes of STEM pilot programs, contextual frameworks for planning multi-site projects, and the use of R (available from the Comprehensive R Archive Network) for data management, data analysis, and data graphing needs in conducting program evaluations. This involvement in AEA provides ERM students with important experiences and connections that complement the training provided by the ERM Academic Programs.


ERM Alumnus Josh Goodman Receives Early Career Award

November 1st, 2013

ERM alumnus Dr. Josh Goodman received the School of Education’s Early Career Award at an awards ceremony held during UNCG’s Homecoming. Dr. Goodman received his Ph.D. from ERM in 2008, specializing in educational measurement and psychometrics. Since graduating from ERM, Dr. Goodman has become a nationally recognized expert in testing and measurement, currently holding the position of Senior Psychometrician with Pacific Metrics. At Pacific Metrics, he provides psychometric expertise to large-scale assessment development, delivery, and scoring. Prior to joining Pacific Metrics, Dr. Goodman was an Assistant Professor of Psychometrics in the Department of Graduate Psychology at James Madison University (2008-2010) and a Research Scientist at Pearson (2010-2012). In addition to his highly accomplished work with large-scale testing organizations, Dr. Goodman has also been an active researcher, with several published articles in Applied Psychological Measurement, Educational and Psychological Measurement, and Applied Measurement in Education, as well as numerous research presentations at the annual conferences of the National Council on Measurement in Education (NCME) and the American Educational Research Association (AERA). Congratulations, Josh!
