FACIAL RECOGNITION CAN BE USED FOR GOOD TOO


Not all facial recognition (FR) solutions are the scary, dystopian, nightmarish creations profiled in the media of late. FR can be used for good too. After all, FR is merely a tool; it is where, why and how it is used that makes all the difference.

My company, pixevety, is uniquely concerned with ensuring that our kids' digital footprints are carefully managed by those who are custodians of their data, and that not a single student photo is published by schools in brochures, on posters, in the news media or on Facebook/Insta/Twitter without proper consent. In practice, this means we need to offer schools and other organisations a tool that makes managing those digital footprints realistic, replicable and meaningful.

The journey began…

The pixevety journey began when my daughter started school and we were told she could not participate in normal school activities unless her parents signed a blanket photo consent form. That was the first time I asked myself, "How did we allow social media to influence the classroom?" and "How did schools allow social media to affect student experiences?" Whether you like social media or not, it has caused a significant headache for schools when it comes to sharing images and consent. Before we go on, may I point out that sharing on social media is publishing; when I mention this to schools, their awareness typically grows.

Our journey then continued when our daughter's image was used inappropriately by the school several times without our consent (to promote the school on the side of a bus, and as the poster child for its remedial learning program). This caused us a lot of heartache and still has implications today. We tried to find a realistic solution with the school using the systems it had in place at the time, but to no avail; it simply didn't have the right toolkit to manage consent properly. It became very clear to me then that technology had caused a significant problem that only technology could fix!

This is just one family's story among the many that I am sure are out there but that people are simply too embarrassed to share.

Putting aside commentary on how *anyone* could harm or bully a child for *any* reason, the above incident raises an important question about photo use and publication by schools (for marketing purposes, or anything else)… and the extent to which schools really understand their obligations to observe privacy laws and the consent obligations contained therein.

Traditional consent processes used by schools are unwieldy, unreliable and unlikely to meet the VICS test (i.e., that consent is truly Voluntary, Informed, Current and Specific). In the context of student photos alone, imagine the challenge for a school to manually map a parent's paper-based consent for use and publication of their child's photos (taken on campus, on field trips, at sporting events over a period of days, weeks, months or years) back to each and every photo of that child, wherever the school may have stored it, including on staff personal electronic devices and whatever cloud network those devices connect to. Managing just one child's photos, if done properly, could be a full-time job!

Enter: privacy by design (PbD). PbD is about ensuring that privacy is built into the design of a project, system or initiative up front, as part of the foundational specifications, rather than retrofitted or bolted on at some later point. To address the issue of photo use consent management, two things were key: 1) that any consent would meet the VICS test, and 2) that the consent solution would be easy for schools to implement, and equally easy for parents (and, depending on their age, kids) to understand.
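To make the VICS test concrete, here is a minimal sketch of what a per-purpose consent record and check might look like. The names (`PhotoConsent`, `meets_vics`) and fields are my own illustration, not pixevety's actual data model:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class PhotoConsent:
    """One parent's consent record for a specific use of their child's photos."""
    freely_given: bool   # Voluntary: no "sign or your child misses out" coercion
    explained_use: str   # Informed: plain-language description of the use
    expiry: date         # Current: consent lapses and must be renewed
    purpose: str         # Specific: e.g. "yearbook", not "any school use"

def meets_vics(consent: PhotoConsent, intended_purpose: str, today: date) -> bool:
    """A consent only authorises a use if all four VICS criteria hold."""
    return (
        consent.freely_given
        and bool(consent.explained_use)
        and today <= consent.expiry
        and consent.purpose == intended_purpose
    )

# A blanket "any use" form fails the Specific test for a concrete purpose:
blanket = PhotoConsent(True, "school may use photos", date(2099, 1, 1), "any")
print(meets_vics(blanket, "bus advertisement", date(2024, 5, 1)))  # False
```

The point of the sketch is that consent becomes data the school can query per photo and per purpose, instead of a signature filed in a drawer.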

While it may seem counter-intuitive to some, we determined that FR technology could, if specifically deployed, allow a school to identify the images of one student in seconds and ensure that the privacy wishes of parents (and the students themselves) in relation to those images can be acted upon quickly and reliably by the school.

pixevety isn’t your typical case study in FR. There isn’t a surveillance element. There isn’t a law enforcement element. There are no vast face data banks (but rather, limited and clearly defined student identification images – which the school already holds – locked down and accessible only to that school). There is no profiling. There is no AI running in the background. pixevety uses FR for one purpose only: to ensure that the consent for photo use and publication provided by a student (or their parents) is accurately mapped to each and every image of that student held by a school… in real time… and that consent is able to be changed to suit a student’s particular circumstances without any administrative drama.

So how does it work?

  • A school seeks consent from students (or their parents) to use or share student photos in a particular way (which, by the way, has traditionally been managed ineffectually by schools due to limitations of process and general compliance misunderstanding – very rarely is this deliberate, more a result of modern demands and circumstance… just think about the catch-all “you really don’t have a choice” photo use consent forms you’ve been asked to sign for your kids in the past!)… and then
  • pixevety ensures that, based on one known image of the student (their school ID photo), the consent provided maps across to the potentially thousands of images of that student in the school’s gallery (the gallery is a centralised version of the school’s filing cabinet or desktop collection used in the past).
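The mapping step above can be sketched as a nearest-neighbour match: the one known ID photo yields a face embedding, and every gallery photo whose embedding is close enough inherits that student's consent settings. This is an illustrative sketch only; pixevety's actual matching pipeline, thresholds and models are not described here, and the toy three-number "embeddings" stand in for the high-dimensional vectors a real FR model would produce:

```python
import math

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def photos_of_student(id_embedding, gallery, threshold=0.8):
    """Return IDs of gallery photos whose face embedding matches the student's
    known ID photo, so consent rules can be applied to all of them at once."""
    return [
        photo_id
        for photo_id, embedding in gallery.items()
        if cosine(id_embedding, embedding) >= threshold
    ]

# Toy embeddings (a real system would compute these with an FR model):
gallery = {
    "sports_day_017": [0.9, 0.1, 0.4],
    "assembly_003":   [0.1, 0.9, 0.2],
}
student_id_photo = [0.85, 0.15, 0.35]
print(photos_of_student(student_id_photo, gallery))  # ['sports_day_017']
```

Once the matching set is known, changing a student's consent is a single update applied to every matched photo, which is what makes the "real time" claim plausible at gallery scale.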

With all the media attention on FR, and on the scary dystopian uses of the technology (like ClearviewAI), I want to ensure the community is aware of the good, privacy-forward uses of FR that are out there. Not all uses of FR are antithetical to privacy. Not all FR is about surveillance. If the purpose for deploying FR is privacy-minded, so too can be the final product.

My name is Colin Anson and I am the CEO of pixevety. I hope you enjoyed reading this article and have a better understanding of how FR, when used appropriately and ethically, can be good for society and used to protect our children.
