Over the last few days, I spent a little time looking over the inBloom Data Store Logical Model. Based on what I have seen there, I have some additional questions and observations about the data that is stored within the system. The questions included here are not comprehensive by any means. Rather, this is a short list compiled after spending around an hour reviewing the data model.
A. inBloom Could Be Used to Screen Immigration Status
inBloom can store information about how a person verifies their identity. The values used here could be used as a screen to check immigration status. Given some of the laws passed at the state level, I would hope that schools would not be passing on this information. What educational goals are supported by collecting this data?
I also like how "Family Bible" is included as a proof of ID.
B. Getting Ready For People to Opt Out
inBloom appears to be anticipating that people will opt out of tests. The Reason Not Tested list includes both parental waivers and parents opting out.
Using this data, people or organizations with access to inBloom could assemble a rolodex of parents who are opting out of testing - complete with home addresses, email addresses, and phone numbers.
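To make the risk concrete, here is a minimal sketch of how such a rolodex could be assembled. The field names, reason values, and records below are my own assumptions for illustration, not inBloom's actual schema:

```python
# Hypothetical sketch: joining a "reason not tested" field against
# contact records to build an opt-out contact list.
# All field names and values here are assumed, not inBloom's real schema.

ASSESSMENT_RECORDS = [
    {"student_id": 1, "reason_not_tested": "Parental waiver"},
    {"student_id": 2, "reason_not_tested": "Absent"},
    {"student_id": 3, "reason_not_tested": "Refusal by parent"},
]

CONTACTS = {
    1: {"parent": "Parent A", "email": "a@example.com"},
    2: {"parent": "Parent B", "email": "b@example.com"},
    3: {"parent": "Parent C", "email": "c@example.com"},
}

# Reason values that indicate a parental opt-out (assumed labels).
OPT_OUT_REASONS = {"Parental waiver", "Refusal by parent"}

def opt_out_rolodex(records, contacts):
    """Return contact info for every student whose non-testing
    reason indicates a parental opt-out."""
    return [
        contacts[r["student_id"]]
        for r in records
        if r["reason_not_tested"] in OPT_OUT_REASONS
    ]

rolodex = opt_out_rolodex(ASSESSMENT_RECORDS, CONTACTS)
# rolodex now holds the contact entries for Parent A and Parent C
```

The point is not that this query is sophisticated; it is that it is trivial. Anyone with read access to both fields can produce this list in a few lines.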
C. Getting Ready To Restrain
inBloom has the capacity to track when students are restrained.
However, inBloom makes no similar accommodations for tracking students who are subject to corporal punishment. According to congressional testimony (quoted below), there are between 2 and 3 million occurrences of students being hit in school each year, and corporal punishment is legal in 30 states. Given how inBloom supports tracking other disciplinary actions, this seems like an odd and unnecessary omission.
The prevalence of corporal punishment of children in schools remains high in the United States. In spite of many education and other national groups calling for corporal punishment in schools to be banned, the United States remains one of the few industrialized countries allowing corporal punishment in 30 states. [2,21] According to the Office of Civil Rights (2007), school officials, including teachers, administered corporal punishment to 223,190 school children across the nation during the 2006-2007 school year. [8,12] Experts note that there are about 1.5 million reported cases of physical punishment in school each year, but calculate the actual number to be at least 2-3 million; as a result of such punishment, 10,000-20,000 students request subsequent medical treatment each year. [8,9,12] During this same period, the top ten states for students being hit were, in order of highest to lowest frequency: Mississippi, Arkansas, Alabama, Oklahoma, Louisiana, Tennessee, Texas, Georgia, Missouri, and Florida.
D. What Student Characteristics Really Matter?
inBloom gives schools the ability to track Student Characteristics.
Apparently, "Immigrant" and "Single mother" are "conditions" that get recorded. See point A, above, about how inBloom could be used to target families based on immigration status.
E. Collecting Social Security Numbers
According to the enumerations, Social Security Numbers are among the IDs stored by inBloom for both Staff and Students.
Additionally, inBloom's FAQ states that social security numbers will be stored if everyone agrees that they should:
inBloom discourages storing social security numbers in its data service, but legally school districts and state may record student social security numbers. inBloom prohibits storage of social security numbers in the data store unless agreed to by both inBloom and the state/district on a case-by-case basis.
However, less than a month ago, Iwan Streichenberger, the CEO of inBloom, appeared to say (via Twitter) that inBloom does not store Social Security numbers. As I noted when I asked about this a couple of days ago, however, it looks like inBloom defers to states and/or districts, and will store whatever it is provided.
In many ways, inBloom is helping to bring visibility to the issue of data collection, data storage, and data sharing. inBloom is a data store, collecting data from many sources into one location. inBloom differs from past efforts in its scale and its partnerships. It would be great to see inBloom and the various agencies collecting data be more proactive about how data is collected, when the collected data can be reviewed by students, teachers, and parents, and how inaccurate data in the system can be reviewed or deleted. Right now, inBloom appears pretty silent on most of these questions, which does nothing to dispel concerns about how - and by whom - the data will be used.
When data is collected at scale, on a large number of people, over time, the role of for-profit companies in the ecosystem needs to be blatantly, obviously clear. When a data set is large enough, even a small number of data points from within that data set can be used to target and identify individuals within that data set. Given the value of student data, and the lack of transparency around how that data is used once it has been handed over, both inBloom and any schools, districts, and states collecting data need to clarify the rules, and how people can be certain these rules are being followed. In the absence of guarantees, students and parents need to be given access to their data so they can review and correct it as needed.
As we have seen, sometimes data is completely worthless. Moreover, if a student is at a school where corporal punishment is practiced, how much can we trust a discipline report from the same person who hits kids in the name of education? There are a lot of open questions here, and these open questions undermine the value of any data that would be collected at scale.
Most importantly, kids aren't going to school to provide researchers with data points. The purpose of school isn't to get people comfortable with life under constant observation. The endless efforts at data collection to capture what "works" with learning have the potential to disrupt the learning they are trying to capture. Learning requires trust; treating students like subjects - rather than people - is a surefire way to erode trust before it has a chance to get started. Without clear, obvious, and fully transparent rules around data collection and how that data is managed, we run the risk of observing our public education system into irrelevance.