The NICE Challenges are designed to give students real-world experience while providing educators with valuable data on the methodology students use to arrive at their solutions. Our holistic approach produces this data in two ways: part of it is collected automatically by our scoring system, and part of it is produced by the students themselves in the form of written documentation.
Automatically Scored Data - When a NICE Challenge attempt is submitted, all of the data collected from technical challenge objectives, also known as checks, is formatted into time-bar graphs. These graphs give educators a visual representation of the tracked technical objectives throughout the entire challenge attempt. With them, an educator can see a wealth of data, including how long it took to complete a technical objective, whether an attempted hack was successful, or whether a critical business resource experienced downtime.
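To make the idea concrete, here is a minimal sketch of how periodically sampled check results could be summarized into the kinds of figures a time-bar graph conveys. The check names, sampling interval, and data below are purely illustrative assumptions, not the actual NICE Challenge scoring system or its API.

```python
from datetime import timedelta

# Hypothetical pass/fail samples for two checks, taken once per minute
# during an attempt. True means the tracked objective passed at that moment.
samples = [
    ("web_server_up", [True, True, False, False, True, True]),
    ("firewall_rule", [False, False, False, True, True, True]),
]

INTERVAL = timedelta(minutes=1)  # assumed sampling interval

def summarize(name, results):
    """Summarize one check's time series: time until it first passed,
    and total time spent failing (e.g. downtime of a business resource)."""
    first_pass = next((i for i, ok in enumerate(results) if ok), None)
    downtime = sum(1 for ok in results if not ok) * INTERVAL
    return {
        "check": name,
        "time_to_first_pass": None if first_pass is None else first_pass * INTERVAL,
        "downtime": downtime,
    }

summaries = [summarize(name, results) for name, results in samples]
for s in summaries:
    print(s)
```

Running this prints one summary per check; the same per-interval series could just as easily be drawn as colored bars along a timeline, which is the visual form described above.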
Student Created Data - While attempting a NICE Challenge, students are asked to document their solutions to issues and the reasoning that led them to those solutions. This information gives educators the insight needed to award higher scores to creative and secure solutions, while also providing critical feedback opportunities when students mistakenly subvert security to solve an issue.
Using both forms of data together, educators can make informed decisions about whether a student was successful. The NICE Challenges, just as in the real world, have many possible solutions, and we leave it to the educator to define which solutions are acceptable.