
The 2013 NASIG Conference was held in Buffalo, New York, from June 6th to 9th. I flew in two days early so I could attend an all-day Executive Board meeting on the 5th, in my role as incoming Vice President. It was nice to be back on the Board and get into the issues facing NASIG, although I can’t really talk about what we discussed (confidentiality and all that).

As for the conference content, the opening and closing Vision Sessions were particularly interesting and formed neat bookends (Derrik did a great job describing Megan Oakleaf’s Vision Session on the second day). First up was Bryan Alexander, of the National Institute for Technology in Liberal Education (NITLE). Alexander described how computer interfaces have changed dramatically and grown in ubiquity. He talked about how the use of computer technology to reach out to the public has grown so much that even the government is communicating in unprecedented ways. (In a funny coincidence, just after he said this, I fidgeted with my phone and checked my email, and received a message from the North Carolina Wildlife Commission reminding me that my fishing license was due to expire and offering me the chance to renew online. From a meeting room in Buffalo, NY.)

Alexander was very matter-of-fact about how pervasive computer technology is throughout our lives. He described a project, or possibly a new app, in Denmark that uses facial recognition technology to identify people in a photograph and then take you directly to their Facebook pages and social media presence. I was shocked by this, because it sounds like a stalker’s delight, but Alexander did not seem disturbed by the development. Perhaps he is concerned about the privacy implications of such technology, but it wasn’t apparent during his speech.

Alexander went on to describe three possible futures that he sees developing from the proliferation of information technology:

1) The Phantom Learning World – In this world, schools and libraries are rare, because information is available on demand anywhere. Institutions supplement content, not vice versa, and MOOCs are everywhere.

2) The Open World – A future where open source, open access and open content have won. Global conversations increase exponentially in this world, but industries such as publishing collapse, and it is generally chaotic (malware thrives, privacy is gone).

3) The Silo World – In this world, closed architecture has triumphed and there are lots of stacks and silos. Campuses have to contend with increasingly difficult IP issues.

Alexander acknowledged that the three variations were all pretty extreme and that what eventually develops will probably have features of all three. But he emphasized that, as information professionals, we have to participate in shaping our information future.

While Alexander’s speech seemed to accept that the horse was already out of the barn when it comes to our privacy in the information technology realm, Siva Vaidhyanathan’s Vision Session speech was very much focused on privacy issues. Vaidhyanathan is from the University of Virginia, and he wrote the book “The Googlization of Everything (And Why We Should Worry).” He discussed how Google tries to read our minds and anticipate our behavior, based on our previous online behavior. He argued that this desire to read our minds is actually the reason behind the Google Books project, which won’t make money for them. So, why do they do it? Vaidhyanathan argued that Google is trying to reverse engineer the sentence. They want to create an enormous reservoir of millions and millions of sentences, so they can sift through them to find patterns and simulate artificial intelligence. This would give a huge boost to Google’s predictive abilities. Furthermore, he argued that Google is in a very close relationship with the government, which should be worrying (particularly in light of the Edward Snowden case, which broke just days before his speech). Considering the sheer amount of data at Google’s disposal, this could have enormous consequences. Vaidhyanathan argued that there is currently no incentive, from the point of view of government, business or even academia, to curb Big Data. Why go small when there’s so much data to trawl through? Nobody’s trying to stop it, even if they should be.

Vaidhyanathan went on to discuss Jeremy Bentham’s idea of the Panopticon, a prison with a circular design in which the cells are placed in a ring around a central guard tower. The guard tower would have mirrored windows, so a prisoner could never know whether he was being watched at any particular time, which was presumed to keep him on his best behavior. Vaidhyanathan argued that we now live in a Cryptopticon, where we don’t know who is watching us and when (here he gave the example of store loyalty cards, which are used to create a profile of your purchases that is cross-referenced with your credit card and shared with other commercial entities). Unlike the Panopticon, which had the goal of keeping you on your best behavior, the Cryptopticon has the goal of catching you at your worst behavior. And while the Panopticon was visible, the state wants its systems of surveillance to be invisible (hence the Cryptopticon). The state wants you to do what comes naturally, so it can catch you if you do something wrong.

Vaidhyanathan argued that hidden instruments of surveillance are particularly worrying. For example, he discussed the No Fly List and the Terrorist Watch List. We don’t know what it takes to get on or off one of those lists. In essence, we’re not allowed to know what laws are governing us, and that’s wrong. And these lists are very fallible. There are a lot of false positives (people who are on the lists but don’t belong there, such as the late Sen. Edward Kennedy), and there are also a lot of false negatives (people who aren’t on the lists but should be, such as the Boston Marathon bombers). The No Fly and Terrorist Watch Lists could be useful, but they are poorly executed, and Vaidhyanathan argued that they might function better with more transparency.

In conclusion, Vaidhyanathan discussed how, thanks to the proliferation of data about our lives on the web, we are creating a system where it’s hard to get a second chance.
Youthful indiscretions and stupid mistakes will be with you for good. It made me think that the classic Vice Principal threat, “This will go down on your permanent record,” is now true. Vaidhyanathan argued that while savvy technology users may be able to take measures to protect their privacy on the web, we should be worried about protecting everyone’s privacy, not just our own.

Of course, I attended a number of other sessions as well, but I think I’ve already written enough and hopefully provided some food for thought.