Tuesday, March 27, 2018

Terms and conditions apply - what went wrong?

Terms and Conditions by Nick Youngson CC BY-SA 3.0 Alpha Stock Images
"If it's free then you are the product". Yes, we've heard that many times over the years but somehow chose not to take it so seriously. We merrily approved all the terms and conditions that popped up when we signed up for all of our social media networks and tools and kept on clicking. We basically gave Google, Facebook, Twitter and the rest of them the freedom to gather enormous amounts of personal data and sell it to anyone willing to pay for it, whatever their motives. Now with Facebook in the eye of a storm of outrage and Google in similar trouble, we can see what the cost of "free" actually is. Basically most commercial online media that are "free" are also in the business of tracking and selling data to advertisers (read more in Doc Searls Weblog, Facebook’s Cambridge Analytica problems are nothing compared to what’s coming for all of online publishing). At the same time we are so hooked on "free" that it's hard to break away. Try to imagine your digital life without all these commercial giants, especially Google. I'm trying to limit the damage at present by switching platforms (e.g. moving from Chrome to Vivaldi and searching with DuckDuckGo), going through all the security settings and so on, but it feels like I'm hopelessly enmeshed.

So what about education in an age where "free" has been compromised? So many communities of educators and students are built around Facebook, Google, Twitter etc. Do we close them down and move elsewhere, and if so, where do we go? Some institutions offer safe platforms for staff and student blogs and wikis, as Tony Bates describes in his post Our responsibility in protecting institutional, student and personal data in online learning. There are also still plenty of open source wiki sites and other non-profit services, but they lack the glitter and stickiness of the commercial solutions. Many users will no doubt set up new alternative networks and platforms, but these require considerable administration and development and will cost time and resources. Others will try to limit the damage and continue to use the old favourites while being more aware of their limitations (e.g. Siva Vaidhyanathan's article in the New York Times, Don’t Delete Facebook. Do Something About It). Whatever happens, we need to revise our practices and attitudes.

One interesting aspect of this mess is raised in an excellent post by Autumn Caines, Platform Literacy in a Time of Mass Gaslighting – Or – That Time I Asked Cambridge Analytica for My Data. She proposes platform literacy as a key skill for the future: an awareness of the power that platforms have and the ability to limit the amount of data they can acquire from you. Personalisation, it seems, has been the pied piper leading the children to their doom.

Personalization in learning and advertising is enabled by platforms. Just as there are deep problems with personalization of advertising, we will find it is multiplied by tens of thousands when we apply it to learning. Utopian views that ignore the problems of platforms and personalization are only going to end up looking like what we are seeing now with Facebook and CA. The thing that I can’t shake is this feeling that the platform itself is the thing that we need more people to understand.

Platforms gather data, and data is the new oil. That crude data can now be distilled, and some of the applications are proving to be deadly, threatening democracy itself. Maybe we are now beginning to realise what that often misused term "disruption" really means. Even our learning management systems are powerful platforms that gather data on students' interactions, access to material and performance. This can be used to enhance learning, as many experts in learning analytics have demonstrated, but what if that data falls into the wrong hands? We need to become more aware of the power of platforms and of what we can and cannot share on them.
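To make this a little more concrete, here is a hypothetical sketch in Python of the kind of interaction record a learning platform might log every time a student opens a resource. The field names and values are invented for illustration and not taken from any particular system.

    # A hypothetical sketch of the kind of interaction record a learning
    # platform might log each time a student opens a resource. The field
    # names are invented for illustration, not taken from any real system.
    from dataclasses import dataclass, asdict
    from datetime import datetime, timezone

    @dataclass
    class InteractionEvent:
        student_id: str   # pseudonymous in a well-designed system
        course_id: str
        resource: str     # e.g. a page, video or quiz
        action: str       # "viewed", "submitted", "posted", ...
        timestamp: str

    event = InteractionEvent(
        student_id="stu-4711",
        course_id="edtech-101",
        resource="week3/lecture-video",
        action="viewed",
        timestamp=datetime.now(timezone.utc).isoformat(),
    )

    # Multiplied across every student and every click, records like this
    # are the raw material of learning analytics - and what is at stake
    # if the data falls into the wrong hands.
    print(asdict(event))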

What if we were really transparent with the data that learning systems have about students and focused on making the student aware of the existence of their data and emphasised their ownership over their data? What if we taught data literacy to the student with their own data?


Tuesday, March 20, 2018

Learning to live with edtech skepticism


We all have a tendency to read articles and research that support our own preferences and ideas. We may try to achieve a healthy balance, but somehow any research that criticises our own standpoint is just a little harder to accept. Cognitive bias is always a factor and pure objectivity is extremely hard, if not impossible.

Those of us who believe that educational technology can play a vital role in making teaching and learning more collaborative, empowering and inclusive are often frustrated by other teachers who simply choose to continue teaching the way they always have done. How can they ignore all the seemingly convincing articles and research findings that we recommend? Maybe we need to realise that all those articles are not going to win them over and that other tactics are needed. This problem is well illustrated in an EdSurge interview with Lauren Herckis, an anthropologist at Carnegie Mellon University, Why Professors Doubt Education Research (listen to the audio file of the interview below). She discusses why many educators show little interest in the findings of research into the use of technology in education. It may be a frustrating standpoint for those of us who believe in the benefits of educational technology, but perhaps we should see it simply as a pragmatic approach to teaching.

Then there are people who will say, "I've been teaching since I was a graduate student. My students are very happy with the teaching. I feel pretty good about my teaching. I understand that you have a PhD in curriculum design, but I don't really need that."

There's also the suspicion that modern research is too much theory and too little practical experience. If you've been teaching for 20-30 years and feel good about it, why complicate things? You can certainly be an excellent teacher without embracing technology.

For faculty who believe that teaching is an art, that it is just something that you develop with experience and time, that you can't learn from a book, you need to learn by doing more or learn from your students, no amount of exposure to learning science research is going to disrupt their sense that this is something they learn by doing, or that they need to follow their gut on.

Given the lack of time many teachers have for course development, the prospect of completely overhauling a perfectly good course is not particularly attractive, no matter how well grounded the changes may be in current research. Good teaching does not need technology, and we need to remember this. Instead of trying to win such teachers over, we should try to find out whether there are any elements of their course that they feel could be improved, or any time-consuming tasks that they would like to cut down on. Maybe there's a digital tool that could help somewhere? Take it from there.

Here's the audio interview with Lauren Herckis.



Friday, March 16, 2018

Open, free and safe - a tough combination

CC0 Public domain by Jerome Dominici on Pexels
Once upon a time there was an optimistic view that many of us subscribed to. After the end of the Cold War we thought that the world would become a safer place and that democracy and international cooperation would flourish. Then came the internet, offering us global networking, the free exchange of ideas and a multicultural meeting place where we could collaborate and learn from each other. Platforms and tools were developed to facilitate free and open networking, and we developed exciting concepts like social networking, the wisdom of the crowd, crowdsourcing, open education, MOOCs and so on. What could possibly go wrong?

Now we see international cooperation and understanding being replaced by suspicion and fear, and as a result countries are turning inwards and reinforcing borders. The companies that offer platforms and tools for global communication have grown so gigantic and powerful that the original objectives have drowned under the weight of commercialisation. Our privacy and integrity have been undermined as vast quantities of data about each one of us are shared with advertisers. The net itself has developed a dark and menacing flip side, being used to spread hate, fear, lies and provocation; something few would have predicted 20 years ago. The openness and freedom we thought the internet would foster have developed into something more sinister.

One result of this is that global corporations are being challenged and even taken to court over shortcomings in their use of personal data (e.g. a German court ruling against Facebook). They are being forced to answer questions about their level of responsibility for what is disseminated on their platforms, and they are slowly beginning to admit a degree of responsibility. Tougher legislation is being passed to prevent the misuse of private data and to guarantee the right to be forgotten. We have learnt that "free" and "open" can be interpreted in many ways and are seldom combined with security and privacy (see also the Mashable article, Stop letting Facebook get away with all of this).

In Europe we have the new GDPR (General Data Protection Regulation), legislation that comes into force this summer and aims to protect and empower all EU citizens' data privacy and to reshape the way organisations across the region approach it. This is welcome protection, but the question is whether the corporations will be able to comply with the new legislation. It is necessary because we have surrendered our privacy and integrity by blindly accepting the complex and lengthy terms and conditions that flash in front of us every time we sign up for a new service. We have all been rather naive and assumed that companies would respect our privacy and our right to our own data. Control is needed, but could it come at the expense of the dreams of open interaction?

I use social media both professionally and privately and have spent the last 10 years looking at how they can be used in education. It has been immensely enriching, and I have been lucky not to encounter any significant negative effects. Will the new legislation mean that we turn away from commercial social networks and revert to more restricted but safer home-grown alternatives? Will universities and schools that use social media as an integrated part of their teaching (blogs, discussion groups, video forums, collaborative writing tools, etc.) have to rethink their strategy? How do we build safe social networks that allow open collaboration but where privacy is fully respected? Is the open internet being divided up into smaller networks, some safe and some not? Is openness a tainted concept?

Many questions, and if anyone out there can supply some answers, please feel free to comment.

Sunday, March 11, 2018

Accessibility as default


Over the last couple of years I have become increasingly aware of accessibility issues in online education. I admit that I previously gave little thought to how people whose hearing, sight, physical mobility or cognitive abilities differ from my own interact with digital media. However, through meeting and talking with people involved in this area, I have gained a few insights into accessibility questions, and a whole new world has opened up to me.

An article on the Webinar blog, Webinar Slides And Text-To-Speech, highlights the issue of writing text that works well with text-to-speech applications. If we follow some simple guidelines, we can make sure that all our digital resources are more accessible. Instead of producing separate accessible versions, why not make accessibility the default?

In a perfect world, we would make a version of presentation materials that are optimally designed for a sighted audience listening to a narrator, along with a second set of hardcopy materials that can be referenced by a larger and more physically diverse audience. But there are practical considerations for how much time and effort presenters can dedicate to their materials. If you can only make one version, why not make it accessible to everyone?

The article refers to an excellent guide to writing for accessibility: the British Dyslexia Association's Writing for text to speech. For example, when writing presentation slides, a little extra attention to punctuation can make an enormous difference for those who need to listen to the text. If you write bullet points without a full stop or semi-colon at the end of each point, the text-to-speech app will simply read all the points as one long sentence. With punctuation, however, the text will be read as a list with pauses between points. Some other simple tips:

  • In numbered lists, write the numbers manually, since automatic numbering is not picked up by text-to-speech apps.
  • Write dates using the name of the month rather than combinations of digits (11 March 2018).
  • Use a colon rather than a full stop to separate hours and minutes in times (10:30).
  • Put stops in acronyms, otherwise the app may read them as words (U.S.A. and e.g. rather than USA and eg).
  • Use heading styles for headings and sub-headings rather than bold normal text.
The list goes on. The point is that by following some simple rules you can easily make your texts more accessible. Furthermore, the same rules make our resources clearer and more consistent for everyone. Text-to-speech is used by many people with perfect sight, for example to listen to text on a mobile. Shouldn't we make sure we teach accessibility as the default and not as an optional extra?
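To make the punctuation point concrete, here is a minimal sketch in Python using the pyttsx3 text-to-speech library (my own choice for illustration; neither the article nor the BDA guide names a particular tool). Listening to the two versions makes the difference immediately audible.

    # A minimal sketch: hear how end punctuation changes the way a
    # text-to-speech engine reads a set of bullet points. Assumes the
    # pyttsx3 library (pip install pyttsx3); any tool that reads plain
    # text aloud would demonstrate the same point.
    import pyttsx3

    bullets = ["Write numbers manually in numbered lists",
               "Write dates using the name of the month",
               "Use a colon to separate hours and minutes"]

    engine = pyttsx3.init()

    # Without end punctuation the points run together into one long sentence.
    engine.say(" ".join(bullets))

    # With a full stop after each point the engine pauses between them,
    # so the listener hears a list rather than a run-on sentence.
    engine.say(" ".join(point + "." for point in bullets))

    engine.runAndWait()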

We all have so much to learn!