
The Value of Data, Metrics, and Impact for Higher Education Makerspaces


Published on Apr 01, 2021

Malcolm N. Cooke1 and Ian C. Charnas2

1Malcolm N. Cooke; Sears think[box], School of Engineering, Case Western Reserve University; [email protected]

2Ian C. Charnas; Sears think[box], Case Western Reserve University; [email protected]

Abstract

A variety of data collection methods available to higher education makerspaces are presented, along with the use and potential impact of those metrics. Pros and cons are evaluated and presented for both passive methods such as gate counters and active methods such as sign-in systems.

Introduction

The journey of designing, constructing, operating, and managing a higher education makerspace is far from trivial and fraught with many challenges that require creative, team-based solutions. But by far the most important management tool in the team’s makerspace toolbox is “data.” Data is a very powerful ally, and difficult to ignore when used to craft and promote one’s makerspace story, support strategic decisions, measure and validate metrics, and gauge impact.

Data Collection

A question that comes up frequently when operating a makerspace is: What data should be collected? The answer is a resounding "as much as you can!" One can always be selective in which data set to use to tell a particular story or assess outcomes, but if a data set has not been collected, one is left to tell anecdotes, which without supporting evidence are easily dismissed. There are many different techniques and systems that can be used to collect data, and these decisions will be very much a function of how a particular institution manages and operates its makerspace.

ID Card Readers: The majority of institutions issue some form of campus ID card to students, staff, and faculty. These cards use a magnetic stripe, RFID technology, or both to store data that identifies the holder. They can easily serve as one of the major inputs to a data collection system in which the holder swipes their card when entering the makerspace. Figure 1 shows a card reader unit located at the think[box] Welcome Desk. The resulting data can then be mined to produce a wide array of reports and statistics. For users who are not affiliated with the institution and therefore lack a valid ID card, a government-issued photo ID (e.g., a driver's license) can be scanned instead, as described below.

Fig. 1. Card Reader Unit
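
Many campus card readers emulate a USB keyboard, "typing" the stripe's track data as a line of text followed by Enter. The sketch below is a minimal illustration of capturing one swipe, assuming a reader that emits standard Track 2 data; the exact field layout varies by institution and reader configuration, so the pattern and ID extraction here are assumptions, not a universal format.

```python
import re

# Track 2 magnetic stripe data has the general form ";PAN=DISCRETIONARY?".
# Field meanings vary by institution and reader configuration, so this
# layout is a hypothetical example rather than a universal standard.
TRACK2_PATTERN = re.compile(r"^;(?P<pan>\d{1,19})=(?P<extra>\d*)\?$")

def parse_swipe(raw: str):
    """Return the campus ID (primary account number) from a swipe, or None."""
    match = TRACK2_PATTERN.match(raw.strip())
    return match.group("pan") if match else None

if __name__ == "__main__":
    # A keyboard-emulating reader "types" the track data followed by Enter,
    # so a plain input() call is enough to capture one swipe.
    campus_id = parse_swipe(input("Swipe card: "))
    print(f"Recorded visit for ID {campus_id}" if campus_id
          else "Unreadable swipe; please try again")
```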

Photo ID Scanner: For users without campus ID cards, data can be collected by photographing a government issued photo ID card. Figure 2 shows an ID scanner located at the think[box] Welcome Desk.

Fig. 2. Photo ID Scanner

However, this captures only an image of the ID; OCR software or human intervention is needed to convert the image into database records suitable for mining and analysis.
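
As a minimal sketch of the OCR step, assuming the scanner saves each ID as an image file, the open-source Tesseract engine (via the pytesseract wrapper) can pull the printed text into a string for later cleanup; many driver's licenses also carry a PDF417 barcode that dedicated decoders read more reliably, so treat this as illustrative only.

```python
# Minimal OCR sketch using the Tesseract engine via pytesseract.
# Assumes: pip install pytesseract pillow, plus a system Tesseract install.
# The file path here is a hypothetical example.
from PIL import Image
import pytesseract

def extract_id_text(image_path: str) -> str:
    """Run OCR on a scanned photo ID and return the raw recognized text."""
    image = Image.open(image_path)
    return pytesseract.image_to_string(image)

if __name__ == "__main__":
    raw_text = extract_id_text("scans/visitor_id.png")
    # The raw text still needs human review or rule-based cleanup
    # before it is trustworthy enough to enter the visits database.
    print(raw_text)
```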

Sign-in Apps: An alternative or complement to ID card data collection is to develop or purchase "sign-in" application software that can be set up on an iPad or tablet. This solution can capture ancillary data of any type needed by the makerspace or required by funding sources. It could also be implemented as a paper form at institutions without the needed support from IT services. A simple and quick solution is to create a customized Google Form that automatically populates a Google Sheet with the sign-in data. The form is designed to capture whatever data is required to track makerspace users. Figure 3 shows the iPad sign-in Google Form that is used in think[box]. Users are prevented from exiting the form by running it in a kiosk app with "Guided Access" (an iOS feature) enabled. This Google Form captures the following data:

  • Email

  • Reason for the visit:

    • Tour

    • Course (with course ID)

    • Research

    • Entrepreneurship

    • Design Competition

    • Personal Project

    • Business/Corporate User

    • Other

The Google Sheet can then be mined for the data of interest, such as the number of courses utilizing a makerspace, or the percentage of users visiting to support their research projects.

Fig. 3. iPad Sign-in App
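
As one illustrative way to mine the sheet, it can be exported (or read directly) as CSV and summarized with a few lines of pandas; the file name and column names below mirror the form fields above but are assumptions about the export, not a fixed schema.

```python
# Sketch: summarize sign-in data exported from the Google Sheet as CSV.
# Column names are assumed to match the form fields shown above.
import pandas as pd

visits = pd.read_csv("signins.csv")  # hypothetical export path

# Percentage of visits by reason (tour, course, research, ...).
reason_pct = (
    visits["Reason for the visit"]
    .value_counts(normalize=True)
    .mul(100)
    .round(1)
)
print(reason_pct)

# Number of unique users, and unique courses utilizing the makerspace.
print("Unique users:", visits["Email"].nunique())
courses = visits.loc[visits["Reason for the visit"] == "Course", "Course ID"]
print("Courses served:", courses.nunique())
```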

Gate Counter: Large groups, tours, VIPs, and certain other visitors often bypass the ID scanner and sign-in stations at makerspaces. In order to estimate these visits, some makerspaces employ a gate counter. The sensor can take many forms, from the traditional turnstile (Fig. 4) to a break-beam sensor (Fig. 5) or a thermal camera with image processing (Fig. 6). These sensors provide only an electrical signal, such as a relay that closes and opens again as a visitor walks by. To maintain a running count, this signal can be measured with a hobbyist board such as the commonly available Arduino or Raspberry Pi, or with an industrial solution such as a totalizing counter. The count can be displayed locally or sent to a server for addition to a spreadsheet or an Airtable database, where back-end software can interrogate it and compile custom infographics that display Key Performance Indicators on a graphical dashboard (Fig. 10).

Fig. 4. Turnstile

Fig. 5. Retroreflective Break Beam Sensor

Fig. 6. Irisys Gazelle Thermal People Counter, and Interface

Fig. 7. think[box] celebrates 100,000th visitor in 2015
Red Lion EPAX0600 Display containing MPAXC020 Totalizing Counter
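
As a minimal sketch of the counting side, the following assumes a break-beam sensor wired to a Raspberry Pi GPIO pin and uses the gpiozero library; the pin number, debounce interval, reporting period, and server URL are all illustrative assumptions.

```python
# Gate counter sketch for a Raspberry Pi with a break-beam sensor.
# Assumes: pip install gpiozero requests; sensor output wired to GPIO 17.
# The pin, debounce time, and server URL are hypothetical choices.
from gpiozero import Button
from signal import pause
import threading
import requests

COUNT_URL = "https://example.edu/makerspace/gate-count"  # hypothetical
count = 0
lock = threading.Lock()

def visitor_passed():
    """Increment the running total each time the beam is broken."""
    global count
    with lock:
        count += 1
        print(f"Visitors so far: {count}")

def report():
    """Periodically push the running total to a server for dashboards."""
    with lock:
        total = count
    try:
        requests.post(COUNT_URL, json={"count": total}, timeout=5)
    except requests.RequestException:
        pass  # never let a network hiccup stop the counter
    threading.Timer(300, report).start()  # report again in 5 minutes

# gpiozero treats the sensor like a button; bounce_time filters chatter.
beam = Button(17, bounce_time=0.2)
beam.when_pressed = visitor_passed

report()
pause()  # keep the script alive, counting in the background
```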

As an example of combining gate counter and sign-in data, Table 1 shows the total number of unique users as well as total visits to Sears think[box] over a five-year period.

Table 1 Total Users & Visitors

Fiscal Year | Unique Users | Total Visits
2014 | 903 | 17,982
2015 | 2,906 | 57,870
2016 | 4,150 | 66,235
2017 | 5,102 | 81,429
2018 | 6,014 | 95,984

The numbers in Table 1 were mined from the iPad sign-in station (Figure 3) located at the think[box] Welcome Desk, as well as from the gate counter.

Registrar Data: Data from sign-in systems at higher education institutions is often merged with registrar data to allow deeper inspection of the types of users accessing a particular resource. The merged data can be used to determine the breakdown of makerspace users by percentage of undergraduates, graduate students, staff, and faculty, or by department and major (Fig. 8). Registrars at universities that receive at least some federal funding must also maintain gender and ethnicity information for each student, using nationally accepted definitions from the IPEDS program. With this information, a picture of makerspace diversity can be obtained in terms of gender (Table 2) or ethnicity (Table 3).

Fig. 8. User Breakdown by Background

Table 2 Gender of think[box] Users

Female | Male | Transgender | Non-Binary
36% | 63% | 2% | –

Table 3 Ethnicity of think[box] Users

Asian, Asian American | African, African American | Latino, Hispanic | White
25% | 4% | 6% | 61%
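
A sketch of the registrar merge, assuming both the sign-in data and a registrar export are available as CSV files keyed by a shared identifier such as email; the file names and column names ("status", "gender", "ethnicity") are assumptions for illustration, not a prescribed schema.

```python
# Sketch: merge sign-in data with a registrar export to profile users.
# File names and column names are assumed for illustration.
import pandas as pd

visits = pd.read_csv("signins.csv")        # one row per visit
registrar = pd.read_csv("registrar.csv")   # one row per person

# Deduplicate to unique users before merging, so percentages describe
# people rather than visits.
users = visits.drop_duplicates(subset="email")
profiled = users.merge(registrar, on="email", how="left")

# Breakdown by status (undergraduate, graduate, staff, faculty)...
print(profiled["status"].value_counts(normalize=True).mul(100).round(1))

# ...and by IPEDS gender and ethnicity categories, as in Tables 2 and 3.
print(profiled["gender"].value_counts(normalize=True).mul(100).round(1))
print(profiled["ethnicity"].value_counts(normalize=True).mul(100).round(1))
```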

User Surveys: Surveys are very useful for focused data collection, and a number of online survey/questionnaire products (e.g., SurveyMonkey and Qualtrics) can be easily customized to produce professional surveys and questionnaires. At Case Western Reserve University (CWRU), we employ an annual student user survey for many purposes, such as to see which equipment and services are most popular and to ascertain the quality of the user experience with staff and student workers. Through this instrument we were able to determine, for example, the percentage of student users who reported that think[box] was a significant factor in their decision to select CWRU as their destination university (Fig. 9).

Fig. 9. Influence of think[box] on Student Recruitment

Fig. 10. Dashboard displaying Key Performance Indicators (KPIs)

Noteworthy Projects: A great source of job satisfaction for university makerspace leaders is the quality of the projects that students and other users create in these facilities. These stories can be used to promote the individual makers as well as the makerspaces that were part of the story. Some makerspaces employ project intake forms to catalogue the projects coming through the facility. Though logical, such forms and paperwork are seen by some makerspace leaders as barriers to access. At think[box], "cool projects" are instead identified at poster shows, start-up weekends, thesis defenses, routine lab walkabouts, and at every other opportunity. When staff and student workers identify projects that are technically challenging, novel, being commercialized, or otherwise noteworthy, that information is fed into two places. First, a photo is requested or taken and, along with contact information and a title and description of the project, is uploaded to the think[box] website (with the user's permission) for public consumption; projects are also promoted in a monthly newsletter, and selected exceptional projects are displayed in a rotating slideshow on large LCD screens on each floor of the building. Second, entrepreneurship-focused projects (possible commercial opportunities) are added to a spreadsheet, and those users are queried twice a year for totals on patents, jobs created, and funding obtained.

Impact

Decision Making: Most importantly, good data collection should support good decision making. Through user surveys and other metrics, makerspace leadership can determine which machines are over-utilized, guiding the purchase of additional units. With numbers on gender, ethnic background, and area of study, goals can be set to increase diversity, and data-driven decisions can be made about which outreach programs are proving most successful.
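
As one hypothetical illustration of the utilization question, if a makerspace logs equipment sessions (machine, start, end), average busy hours per week can be compared against open hours to flag machines that might justify an additional unit; the log schema, open hours, and 80% threshold below are all assumptions.

```python
# Sketch: flag heavily utilized machines from a session log.
# The log schema, open hours, and threshold are assumed for illustration.
import pandas as pd

OPEN_HOURS_PER_WEEK = 60      # hypothetical operating schedule
UTILIZATION_THRESHOLD = 0.8   # flag machines busier than 80% of open time

log = pd.read_csv("equipment_log.csv", parse_dates=["start", "end"])
log["hours"] = (log["end"] - log["start"]).dt.total_seconds() / 3600
log["week"] = log["start"].dt.isocalendar().week

# Average busy hours per week for each machine.
weekly = log.groupby(["machine", "week"])["hours"].sum()
avg_busy = weekly.groupby("machine").mean()

utilization = (avg_busy / OPEN_HOURS_PER_WEEK).sort_values(ascending=False)
print(utilization.round(2))
print("Consider additional units:",
      list(utilization[utilization > UTILIZATION_THRESHOLD].index))
```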

Talking Points: Each institution and makerspace must generate its own "selling points" for use in student recruitment, alumni relations, donor engagement, and other purposes. Data can come from unexpected sources, and makerspace directors and managers are constantly searching for data that demonstrates the impact of their spaces, even indirectly. For example, staff at think[box] attended a research showcase presenting posters for 497 research projects at CWRU and worked with research administration staff to produce a data set of which posters used which "core facilities" on the campus. From this, a talking point was generated when staff discovered that think[box] was the most cited "core facility" at the university. Another talking point was generated when staff obtained sign-in data from the provost's office for the gymnasium, library, health services clinic, and dozens of other student-centric facilities on campus, and discovered that think[box] was the third most popular student facility on campus after the gym and the library. Data may come from anywhere, and seemingly unrelated reports can be mined for valuable insights.

Grant Reporting: Foundations, government grant programs, individual donors, and other sources of funding commonly used by makerspaces may stipulate data collection and reporting in their terms of acceptance. Leadership at makerspaces can thus easily find themselves in the position of having to collect data for this purpose. Certain types of data can be more difficult to collect than others, and well-meaning foundations can sometimes ask for data which is overly difficult to obtain; however, terms of acceptance can often be modified, especially before the agreements are signed. By collecting data early and understanding this requirement, makerspace leadership can often work with foundations staff to ensure reporting requirements specify only data that is obtainable – ideally and most easily, data that is already being collected.

Conclusion

The collection, organization, and analysis of data are an integral and critical part of developing, operating, and validating an academic makerspace. As we have shown, data can be generated and/or collected in many ways, using an array of different systems and techniques. The power of disseminating information derived from these data sources in as many ways as possible cannot be overstressed. However, accurate and honest data collection is imperative. Most, Craddick, Crawford, Redican, Rhodes, Rukenbrod, and Laws (2003) [1] describe "quality assurance" and "quality control" as two approaches that can preserve data integrity and ensure the validity of results. In all cases, quality assurance (standardization of data collection protocols) and quality control (consistent, well-documented monitoring, processing, and dissemination procedures) must be maintained before, during, and after data collection.

As higher education makerspaces become increasingly integrated into mainstream academic programs, the opportunity for faculty to engage in pedagogical research using these spaces as their “research lab” increases. In particular, NSF has issued an RFP in the form of an open “Dear Colleague Letter: Enabling the Future of Making to Catalyze New Approaches in STEM Learning and Innovation” [2].

Additionally, the VentureWell organization [3] supports faculty in developing programs that cultivate student innovators and promote institutional change through grants, workshops, training, and conferences. These programs provide grant support for faculty and students in the areas of pedagogical innovation and STEM entrepreneurship, and can integrate closely with the activities of an academic makerspace. The ability to collect data to validate outcomes from these programs is critical.

Finally, data is the second most important asset associated with running an academic makerspace (people being the first!); collect, process, and disseminate it wisely and it will pay huge dividends.

References

  1. Most, M.M., Craddick, S., Crawford, S., Redican, S., Rhodes, D., Rukenbrod, F., and Laws, R., "Dietary quality assurance processes of the DASH-Sodium controlled diet study," Journal of the American Dietetic Association, 103(10), 2003, pp. 1339-1346.

  2. National Science Foundation, "Dear Colleague Letter: Enabling the Future of Making to Catalyze New Approaches in STEM Learning and Innovation." https://www.nsf.gov/pubs/2015/nsf15086/nsf15086.jsp

  3. VentureWell. https://venturewell.org/
