Puppets on a string

CC0 License: https://www.pexels.com/photo/bag-electronics-girl-hands-359757/

This post was inspired by an L2talk I gave at the Learning2 Europe conference in Warsaw.

“Every storyteller has a bias – and so does every platform.” – Andrew Postman, “My Dad Predicted Trump in 1985 – It’s Not Orwell, He Warned, It’s Brave New World.” The Guardian, Guardian News and Media, 2 Feb. 2017.

I am an addict. Are you one too? Don’t you hate it when you can’t find your phone and a friend has to call it? Maybe the first thing you did this morning was check your phone, and the last thing you did tonight was check it again. Think about it: we walk and text, and we even drive and text. Has this ever happened to you: you are in a social situation and you slip off to the bathroom to check an update? You are standing on a street corner and suddenly realize you are swiping at your phone, unconsciously. Then there is the feeling you get when you post a picture to a social media feed. The “likes” start coming in. It feels good, really good, and so you check back, and back again. Then you post an update and there are no “likes”. You start wondering what is going on.

(CC BY 2.0) Photo taken by Angus MacAskill “Rat” https://www.flickr.com/photos/19951543@N00/3908678004/

I am sure you’ve heard about B.F. Skinner’s rat experiments. The first rat had a lever in its cage, and every time it hit the lever food came out. The second rat, in the same set-up, hit the lever and nothing came out, no food. The third rat, same set-up: when it hit the lever a little food came out, then nothing, then a lot, and then nothing again. The third rat developed an addiction. It quickly learned that as long as it kept hitting the lever it had a chance of getting some food. This is called the principle of variable rewards: that feel-good feeling, the dopamine rush. Behavior design, as explained in the article “The scientists who make apps addictive” (Ian Leslie, 1843/The Economist, October/November 2016), is a critical part of app development. Tech companies employ behavioral economists, psychologists, and psychiatrists in the creation, design and curation of our app ecosystems to ensure we keep coming back.
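To make the principle concrete, here is a toy Python sketch, my own illustration rather than anything from Leslie’s article, of why an intermittent schedule is so much stickier than no reward at all. A subject that gives up after ten unrewarded presses in a row keeps pressing far longer when the lever pays off only occasionally and unpredictably.

```python
import random

def presses_before_giving_up(reward_prob, patience=10, trials=10_000, cap=1_000):
    """Average lever presses before the subject gives up, where "giving up"
    means `patience` consecutive unrewarded presses. A variable schedule
    (0 < reward_prob < 1) resets that counter just often enough to keep
    the subject pressing far longer than a never-rewarded one."""
    rng = random.Random(0)
    total = 0
    for _ in range(trials):
        presses = dry = 0
        while dry < patience and presses < cap:
            presses += 1
            if rng.random() < reward_prob:
                dry = 0      # occasional payoff: hope restored
            else:
                dry += 1     # nothing: patience drains
        total += presses
    return total / trials

print(presses_before_giving_up(0.0))  # never rewarded: exactly `patience` presses
print(presses_before_giving_up(0.3))  # variable reward: presses many times longer
```

The never-rewarded rat walks away almost immediately; the variably-rewarded one keeps pulling, which is exactly the loop our feeds are built around.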

So many of our interactions with devices are subconscious. In Eric Pickersgill’s thought-provoking photo series “Removed” (do spend some time on the link), he highlights the idea of being alone together, as Sherry Turkle so aptly describes in her book Alone Together. We are often physically together with another person in a space, sometimes even intimately, but our minds are buried in a phone.

As adults, we are quick to point the finger at kids for not being able to manage their screen time. But think of this: the first time an infant interacts with a digital device is watching a parent use one. What does it feel like for a child in a pram looking up at a parent, only to see a blank expression immersed in a smartphone? The dinner table conversation interrupted by parents checking work emails. Mary Aiken, in her book The Cyber Effect, states that we are asking the wrong question: “We should not be asking at what age is it appropriate to give a digital device to an infant, but be asking the question when is it appropriate for an adult to interact with a digital device in front of an infant.”

A good example of behavior design is Snapchat’s “streaks” feature. The idea: if you and a friend exchange snaps within every 24-hour window, day after day, a flame emoji shows up, and in tandem a number keeps a tally of your interactions. Should one of you stop posting, an hourglass appears, a heads-up that the streak will disappear if you do not stay on. For adolescents, social media relationships can be a gauge of their social capital, and streaks add a layer of complexity to those interactions.
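For the curious, the streak mechanic can be sketched as a tiny state machine. The rules below are my simplified reading of Snapchat’s (undocumented) behavior; the 20-hour warning threshold in particular is an assumption, since Snapchat does not publish exactly when the hourglass appears.

```python
from datetime import datetime, timedelta

WINDOW = timedelta(hours=24)    # both friends must snap within this window
WARNING = timedelta(hours=20)   # assumed cut-off for the hourglass warning

def streak_status(count, last_a, last_b, now):
    """Return (emoji, count) for a streak between two friends, given each
    side's last snap time. Simplified sketch of Snapchat's rules: both must
    have snapped within 24 hours; near the deadline an hourglass warns;
    miss the deadline and the streak resets to zero."""
    oldest = min(last_a, last_b)   # the side closest to breaking the streak
    age = now - oldest
    if age >= WINDOW:
        return ("", 0)             # streak broken, counter resets
    if age >= WARNING:
        return ("⌛", count)       # hourglass: act soon or lose it
    return ("🔥", count)           # flame plus the running tally

now = datetime(2017, 5, 1, 12, 0)
# One friend last snapped 21 hours ago: the hourglass is showing.
print(streak_status(42, now - timedelta(hours=2), now - timedelta(hours=21), now))
```

Notice how the design deliberately threatens loss (the hourglass) rather than only promising gain; that loss-aversion nudge is a large part of why streaks feel so hard to abandon.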

I am not against digital devices. I have been working in Education Technology as a coach, coordinator, IT Director and Director of eLearning for over 20 years. I love the seamless and frictionless experience of our digital environments.

By Jim McDougall from Glasgow, Scotland (Puppets on a String Uploaded by Snowmanradio) [CC BY 2.0], via Wikimedia Commons
It is a fact that our online data (health apps, social media, travel, online games, GPS, shopping, search, etc.) is collected, analyzed, and then sold to third parties, or curated to give us a personalized online experience with the clear goal of manipulating our behavior. We as educators have an ethical responsibility to be skeptical of behavior design’s narrative. Let us challenge our learning communities to question the complexity and consequences of behavior design in our lives. Squeezing a 15-minute digital citizenship lesson into a Friday morning advisory is not enough. We need to make this narrative an integral part of the living curriculum.

Do we want to end up being puppets pulled by the strings of choreographed digital ecosystems which we do not control?

I think it is important to understand that schools are most likely the last place where children interact with digital devices with balance and pedagogic purpose. We cannot take this for granted.

If we ignore behavior design, we will lose something: free will. Neither you nor I want to lose it.

John @ http://beyonddigital.org


Hal is in the house.

Fall Harvest. Photo: jmikton

A colleague of mine and her Kindergartners were busy exploring where an egg comes from. “Was it born like a baby? Does it grow on its own? Where do they come from?” Different perspectives and ideas were shared enthusiastically. The children discussed and challenged each other with their theories. At the end of the activity, one child turned to her partner and said, “when I get home, I’ll ask Siri for the answer.” A routine response in our classrooms? Or an important moment to understand that artificial intelligence (AI) has embedded itself in our day to day lives? For a generation of children who have been raised on iPads and Siri, AI – with a name and voice like a human – is as ubiquitous as any other technology.

AI is a tool that learns, anticipates and predicts. It provides us with instantaneous information or completes routine tasks remotely. The Amazon Echo and Google Home, two new devices that have recently gained traction, have begun to enter the home as personal assistants. The Echo and Home are two of many voice-activated AI assistants that tap into vast artificial intelligence networks. They aggregate information based on our digital footprints and predict our habits based on a learning algorithm that engages continuously with the data we share on our digital devices.

A shift has occurred in our relationship with AI, and the impact is profound. It is the seamless adaptation of AI into our lives, a frictionless experience that is slowly making us dependent on this predictive technology. This new relationship meets our unique tastes and needs, and only gets better the more it knows about us. Over time, this is changing the way our brain functions when interacting in the digital world. This short video by AcademicEarth.org, “Cognitive Offloading,” is a reminder of the neurological changes AI is having on our learning. We collectively feel more and more comfortable subcontracting tasks out to AI. The phrase “let me Google this” is an example.

For educators, this shift is showing up in our classrooms informally, and in some instances invisibly. Artificial intelligences are important elements of the devices in our school tool kits: mobile devices, apps, browsers, search engines, smartwatches, and more. Writer and professor Jason Ohler asks an important question in his article “Bio-Hacked Students On the Outer Edge of Digital Citizenship”: how should we, as educators, shift the curation of a scholastic experience when students come to the classroom with embedded or wearable artificial intelligences? This alters the value of the commodity of knowledge in the classroom and highlights a potentially new hierarchy where AI supplements a user’s expertise. Suddenly, we have 24/7 access to predictive and anticipatory information, which has the potential to disrupt the independent learning experience of a typical classroom. In his article “Artificial intelligence is the next giant leap in education”, Alex Wood reflects on the role AI could play in education.

source: https://upload.wikimedia.org/wikipedia/commons/thumb/f/f6/HAL9000.svg/2000px-HAL9000.svg.png

Coming to terms with these exponential changes takes time. As educators, we need to understand that engagement and critical thinking are vital components of education, especially as AI shifts the classroom narrative. The ethical issues surrounding these changes are here now. Schools’ complacency in the discourse about what it means to live in a world dominated by AI is a tension we cannot ignore.

What will a world look like when companies can remotely delete pictures and videos that do not fit a predefined perspective fueled by an AI? Danny Yadron questions this in his article “Apple gets patent for remotely disabling iPhone cameras.” What will a world look like when you can scan a person’s image on the street and instantly receive their aggregated digital profile? In Shaun Walker’s “Face recognition app taking Russia by storm may bring end to public anonymity”, he shares the dynamics of the “FindFace” application, reminding us of the reality at our doorstep.

As educators, we have a unique opportunity to design curriculums around the narrative of artificial intelligence. We need to encourage our students to be not only good digital citizens but proactive digital leaders who understand the complexity of a world fueled by AI. Schools should promote the skills and inquiry mindsets that give students the capacity to harness the power and opportunities of AI rather than become complacent with the technology. Ultimately, we want our students to be active leaders and architects of AI’s continued growth. As educators, we have a responsibility to ensure our students have a working understanding of how to navigate a complex and changing world fueled by artificial intelligence, for the good of future generations.

John@beyonddigital.org


Our Connected Data

unlearning learning

Over the last few weeks, an outstanding series produced by ARTE called “Do Not Track” has become available and is getting quite a bit of social media attention.

The different episodes explore and highlight the technologies, algorithms, data mining and aggregation of our online information through digital devices, tools, ecosystems and environments. Each of the episodes breaks down the architecture and rationales of companies, organizations and governments tracking in an interactive manner and examines how this impacts our privacy and digital footprint.

For many of the pre-internet generation, privacy both online and offline is an important right. Having full control of our privacy and being able to independently curate and choreograph where our information is and goes is an expectation. Naturally, there has always been an aspect of our information which we have not controlled, but in general, the collective expectation in most democratic countries has contained a certain level of privacy.

A paradigm shift has occurred in the last decade, initiated by many companies realizing that instead of generating income from only paid advertising (remember pop-ups?), they can also track and personalize their users’ internet experiences to generate far more valuable information, which can then be leveraged into income. The incomes generated from a personalized web (where you get search results, ads, and information tailored to your tastes, based on a saved history of where you have been) are impressive. This interactive live stream shares the incomes generated by internet companies in real time.

We are each generating information that contributes to Big Data: large sets of aggregated information we create as we interact, live and work in the Internet’s ecosystem. Every click, “like,” scroll, connection, purchase, browse, download and action generates a footprint. This tracking and aggregation of our data happens on all of our devices connected to the internet and/or cellular networks. (A video that unpacks the technology behind tracking information.)

There is no doubt that some of the tracking taking place is positive and provides us with certain efficiencies including a more tailored online experience. On the other hand, there is also a large amount of information that is collected without our knowledge which does not add to our browsing experience.

For many of us the convenience of a frictionless experience with our digital devices, tools, and environments is a huge plus. For this frictionless experience, many of us are willing to give up a level of our privacy to third parties. After all, a convenient and seamless experience is the key for users. Nowadays many of the actions, processes and uses we engage with on digital devices, tools, and environments are almost subconscious actions. Our usage is so embedded in our daily routine, both social and professional, it becomes a non-negotiable part of our life.

With this precedent, we have entered a world where personal information aggregated over time is combined, analyzed and then generated into a longitudinal profile of us. This rich set of information is then sold, traded, and curated by organizations, governments and companies. It is from these information landscapes that the services and products we might need can be accommodated or altered based on our profile.
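As a toy illustration of what “aggregated over time” means in practice, here is a sketch that folds a stream of clicks, purchases and searches into a small longitudinal profile. The event format and field names are my own invention, not any data broker’s real schema, but the shape of the idea is the same: many small, innocuous events compound into a surprisingly telling picture.

```python
from collections import Counter

def build_profile(events):
    """Fold a stream of (timestamp, source, item) events -- clicks, likes,
    purchases, searches -- into a simple longitudinal profile: per-item
    interest counts, per-source activity counts, and first/last seen times.
    A toy stand-in for the aggregation data brokers perform at scale."""
    profile = {"interests": Counter(), "sources": Counter(),
               "first_seen": None, "last_seen": None}
    for ts, source, item in sorted(events):
        profile["interests"][item] += 1
        profile["sources"][source] += 1
        profile["first_seen"] = profile["first_seen"] or ts
        profile["last_seen"] = ts
    return profile

events = [
    ("2017-03-01", "search", "running shoes"),
    ("2017-03-02", "shop",   "running shoes"),
    ("2017-03-05", "social", "marathon group"),
    ("2017-03-08", "search", "running shoes"),
]
p = build_profile(events)
print(p["interests"].most_common(1))   # [('running shoes', 3)]
```

Four scattered events already suggest a runner preparing for a race, a guess an advertiser will happily pay for; scale that to years of data across every device and the "longitudinal profile" above stops being a toy.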

The question, of course, is what our world will look like as every single digital device, tool and environment is consolidated, monitored, aggregated, and analyzed over time. Yes, maybe you could try to opt out, but it is unfortunately becoming harder and harder to do so as the internet becomes more integrated with our culture. Commerce, social media, communication, socialization and work have all moved to an online environment, 24/7, in most parts of the world.

Does it matter? Are our online profiles and habits a true reflection of who we are? Does this aggregated information sold, traded, and curated by companies and organizations provide us with services and experiences that supersede the erosion of privacy? Either way, the discourse is clearly an integral component of our connected data experience.

“Most human beings have an almost infinite capacity for taking things for granted.”

― Aldous Huxley, Brave New World

John@beyonddigital

 

There is a war going on in my internet!

Somehow, late in the game, I have suddenly realized that in the background of my life a war was going on in my Internet, and most likely still is. It took place without much fuss or noise, and I suspect it is still going on. It is one of these new dimensions of war: it happens undercover, behind the scenes, with little fanfare and without much human interaction, so not being aware of it is somewhat understandable. According to some pundits this has been going on for a few years, but recently the security company Kaspersky (http://www.kaspersky.com/) noticed a virus called “Flame” and has since been trying to better understand its workings and complexity: see “Flame virus explained: How it works and who’s behind it” and “Flame and Stuxnet cyber-attacks”. What is interesting to me is that nation states can now engage in a destructive war within the internet: damage, capture, manipulate and steal information, and in return shut down, wipe out or paralyze computer systems deemed a threat, or the country with the machines deemed a threat. This is now fact, and in recent years different groups have been busy at work using this powerful technology. Is this in the headlines, part of our daily discourse, something late-night news talk shows are spending time on? No, not at all. It happens undercover, behind the scenes, with little information, or we hear of it after the fact.

With the global information glut and overload we consume, engage with and live off, we just cannot keep up or stay in tune with the various events, stories, and key pieces of information that might frame a better understanding of everything taking place in our world. Then there is the information that does not get shared, or is buried deep away from the mainstream traffic, headlines and captions. As humans we tend to engage with our digital devices, apps, web environments and new technologies in a removed manner, less questioning or critical, at an ethical and moral level, of the role these have on us. The sheer convenience of the digital devices, apps, web environments and new technologies we live, work, play and entertain ourselves with often dilutes any critical engagement in understanding how these tools are impacting us as individuals and as a society. The pace of our day to day business, and the daily bombardment of new devices, apps and technologies pushed out to us, leaves few avenues for us to really stop, think, question and engage with this issue.

We are at the cusp of an age where there are more digital devices than humans on the planet: Mobile devices to outnumber people on planet this year.

As new technologies, digital devices, and web environments become more seamless and integrated, they become part of our day to day fabric as humans. At many levels we are defaulting executive decisions to these devices and environments, independent of our input. I suspect most of us feel that this is a small price to pay for the convenience of having our digital devices, apps and technologies make the mundane decisions involved in work, play, and living in a connected world. These two articles illustrate the wonders of some of these new technologies, but at the same time they test and confront our own morals and ethics.

Running repairs: An experiment on rats brings hope to the paralyzed
http://www.economist.com/node/21556209?frsc=dg%7Ca

A big step toward ‘designer babies’ – and big questions
http://www.csmonitor.com/Commentary/the-monitors-view/2012/0608/A-big-step-toward-designer-babies-and-big-questions

There has to be a cautionary tale if this convenience overrides our ability to stop and reflect, think, probe, discuss, and question the world of digital devices, web environments, and technologies we adopt. If we are happy to subcontract our digital devices and new technologies to work, care, entertain, and support our day to day lives, how far do we give up control for the convenience of it all? The Economist article “Morals and the machine: As robots grow more autonomous, society needs to develop rules to manage them” highlights for me that we need to have this conversation, as educators, organizations and society in general, and be fully engaged with its potential impact on us all.

John @http://beyonddigital.org

while I was waking up…

This post is dedicated in memory of Gil Scott-Heron.

Connectivity, seamless integration, multiple digital devices all connected to my habits and likes, seamless options to integrate my blogs with my social media accounts: all provide wonderful opportunities. They simplify many tasks and interactions I deal with on a day to day basis. At times these can get messy, and I understand that many of the integrations between the digital devices I use and the social media platforms I interact with are still evolving. I believe this convergence of digital devices and communication platforms will only get more seamless and effective, and that is exciting. For users, the potential to leverage these tools and opportunities in our social and professional lives is huge.

and now…

I am noticing something: over the last few months, all this seamless integration of digital devices and social networking media has started generating some caution in a few people. For me the first odd event was when Facebook suddenly decided, without asking me (actually, they never ask me anything, especially when there are changes), that my news feed would only feature the friends I interact with on a regular basis, and not the friends I interact with rarely or periodically. My news feed narrowed in the diversity of people I could see. The good news is that you can change this, and I did. The issue is: who should make these decisions for us?

I am also noticing that my search results (Google/Bing/Yahoo) tend to be a little different when I search the same topic as my wife and kids. The search algorithms seem to learn my likes and dislikes and then provide me with information that falls into my previous search patterns and within my opinion and interest range. Diversity of opinion, and information I do not agree with, seems slowly to be pushed away from me; I am reading only what I want to believe. This seems to be a growing trend, as explained in an excellent TED Talk: Eli Pariser’s “Beware online ‘filter bubbles’”.
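The mechanism behind those diverging results can be caricatured in a few lines. This is a deliberately crude sketch of personalization, not any search engine’s actual algorithm: simply re-rank the same results by each user’s past clicks, and two people immediately see different top stories for the same query.

```python
def personalize(results, click_history):
    """Re-rank search results by how often the user previously clicked
    each topic. `results` is a list of (title, topic); `click_history`
    maps topic -> past click count. A toy filter-bubble generator."""
    def score(result):
        _, topic = result
        return click_history.get(topic, 0)
    return sorted(results, key=score, reverse=True)

results = [("Climate accord signed", "politics"),
           ("New phone released", "gadgets"),
           ("Local team wins", "sports")]

me  = {"gadgets": 40, "sports": 2}    # my (hypothetical) click history
her = {"politics": 25, "sports": 12}  # my wife's

print(personalize(results, me)[0][0])   # New phone released
print(personalize(results, her)[0][0])  # Climate accord signed
```

The unsettling part is the feedback loop: each personalized ranking shapes the next round of clicks, which sharpens the next ranking, and the bubble tightens on its own.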

The Apple developers conference again showed how our devices, operating systems and virtual worlds are taking on more tasks without us having to be involved, as a way to increase our efficiency. You no longer need to save in OS X Lion, and if you have iCloud, content syncs between your different devices automatically without you being involved. Your digital devices in the Apple environment can now be independent of your desktop or laptop, able to do all the necessary tasks right there and then. These are exciting changes, and they definitely relieve the user of tasks we do not need to be involved with. This is pushed even further by an iOS 5 feature: if you keep a shopping list with a geographic locater embedded in it, then when you drive by a grocery store the list pops up, reminding you of it and offering the option of shopping now, since it has located a grocery store.
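That location-triggered reminder boils down to a simple geofence check. Here is a minimal sketch, not Apple’s implementation: the store coordinates and the 200-meter radius are my own assumptions, and the distance test is the standard haversine formula.

```python
import math

def near(lat1, lon1, lat2, lon2, radius_m=200):
    """True if two coordinates are within radius_m meters (haversine)."""
    R = 6_371_000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a)) <= radius_m

# A reminder pinned to a (hypothetical) grocery store in central Warsaw.
reminders = [{"note": "buy milk", "place": (52.2297, 21.0122)}]

def check(lat, lon):
    """Return the notes whose geofence the current position falls inside."""
    return [r["note"] for r in reminders if near(lat, lon, *r["place"])]

print(check(52.2298, 21.0121))  # near the store -> ['buy milk']
print(check(52.3000, 21.1000))  # far away -> []
```

The convenience is real, but note what the feature requires: the device must continuously know where you are, which is exactly the quiet trade-off this post is circling around.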

The list goes on: our technology tools and environments are being equipped with algorithms and automated tasks (spiders and robots) that are becoming more intelligent, and at some level are being given more independence to make executive decisions in reaction to our online behaviors and habits. This is a huge shift, and with it a whole set of philosophical questions and dilemmas arise that delve into privacy: who has ownership, who gets to decide what these tools and algorithms do, and how much independence should they have? I am reading and seeing more evidence of this change, where we are asking our technology tools and systems to think for us and help make decisions. At this stage it all seems useful, helpful and harmless. Who minds having something save everything without having to remember? Who minds having their devices perform updates, syncing, and analysis in the background of our awareness? For now it is useful and a time saver, and to be honest this has been going on for a while with a variety of technology tools. An example is commercial aviation, which has relied on automatic pilot controls for a large percentage of the tasks related to flying. There are definitely huge advantages, and at many levels these make the processes we rely on more efficient and seamless.

As we move forward with our digital evolution, our tools, operating systems and devices are given greater independence to manage our lives. The question I currently struggle with is: at what point do we feel comfortable giving up control and letting these devices have complete autonomy over certain tasks, decisions, and the information we get access to? To what point do we let convenience and efficiency erode our own independence to make decisions within these digital environments? There is no doubt that for the companies behind these tools, devices and operating systems, this control and information is becoming a critical commodity, feeding information databases that give them greater capacity to target products, habits and behaviors effectively at the user, and thereby generating profits.

I am like many: I love the seamless integration, the fact that more mundane tasks are being taken over so I do not have to think about them. But when is it too much? When will we suddenly wake up and realize that so much has been handed over to algorithms and automated tasks (spiders and robots) that we have lost control, and many of our decisions and tasks are now dictated by others with whom we have little input?

As Gil Scott-Heron says so aptly, “the revolution will not be televised.”

John@https://beyonddigital.org

break on through

In my experience there is an odd side to international schools when it comes to teaching the virtues of freedom of expression: we study great leaders, thinkers, philosophers and mathematical theories, and we work to ensure our students get balance in the way we approach learning. We generally believe that an open-door philosophy towards different opinions, perspectives and views is important. I would say we are quite passionate about this and feel it is a critical element of many of our schools’ missions: the goal of exposing students to as many different perspectives as possible, allowing them to construct their own knowledge. That is nice, and of course something most parents and educators would find difficult to argue against. Then comes Web 2.0, social networking and/or some other technology, and the general first reaction is: BLOCK IT! Now, many will argue that blocking social networking or certain technologies is not the equivalent of failing to teach the virtues of freedom of expression. I disagree. At my current school we block Facebook, Twitter and others on our wireless network, a decision by the collective leadership team. In the two computer labs and one library lab we do not block them on the machines connected to our LAN; the rationale is that someone is generally there to supervise. This situation is flawed in my mind. In some ways I am to blame, as IT Director, for not pushing or developing a strong enough argument with my fellow administrators to have the conversation, unpack what we are doing, and explore the pedagogic value of such blocking. It is always easier to reflect in hindsight.

What happens, then, when students sneak through our firewall via proxies, or have their own independent connection to the internet through a USB modem from the local cell phone company plugged into their laptop? The blocking becomes useless. When a teacher catches someone, the issue is brought up, and we in the IT department have to explain again that however much we block, the chances are some student will find a hole. This is the flaw: we are focusing on the blocking, and on the events where students get around it, and not on the more important issue of what, how, why and when they are using these tools. We avoid the opportunity to leverage them to engage both student and teacher in a conversation on the pros and cons, and on what responsible use of such tools in a school setting might look like. This whole dialogue is swept under the carpet.

Most educators, I assume, would argue that Facebook (social networks) as a teachable moment has no place in the classroom, and that blocking it is good, as this allows us to focus on the important task of teaching the lesson at hand. I challenge this perception. There are some 500 million plus people on this planet involved in social networks, in different shapes and forms, on a regular basis. Attached to this is a huge industry developing to take advantage of this new form of communication, and with it come big economic opportunities for individuals, companies, organizations and institutions. This will continue to develop, and the reality is it will become more and more part of our social communication fabric, on both a personal and an institutional level. Some would say it has already happened.

So I think with this shift there is now a critical role for educators in exploring how to integrate social networks into the school curriculum. We have a responsibility to share, educate and develop an understanding of the intricacies and options of using such communication mediums in our day to day lives. If Grade 2-3 students are setting up Facebook profiles, you cannot expect them to clearly understand the privacy setting tools on their own, and you cannot expect them to read the fine print of an agreement. It has come to the point where, instead of blocking these tools and letting students work things out on their own, undercover, we as educational institutions need to develop a robust set of learner outcomes on the dynamics of social networks. This should not be the responsibility of some IT department technician or counselor alone, but part of the day to day fabric of each teacher’s tool kit: sharing, exploring, facilitating, and mentoring our students in how to be responsible users of social networks. We need to let them explore these mediums, unblocked, with a critical mind, as we would expect of them for the French Revolution, plate tectonics, a perspective on Macbeth or the Israeli-Palestinian situation.

The world changed yesterday! Today we need to engage with a clear understanding that social networks, YouTube, chat, texting, virtual worlds, and the current digital landscape are now an integral part of our day to day fabric, both socially and professionally. With this comes a whole host of issues, learnings, understandings and perspectives we need to equip our students with so they can thrive with balance and as critical learners. In today’s world we as educators have an important ethical responsibility to take charge of this, and to engage throughout our day, within our own lessons, with what this all means and how to develop a critical understanding of how best to use these tools: when, where, and appropriately. If we do not, we are basically ignoring the world we all live in. I would find it hard to believe that any of us would want that as part of our own educator’s philosophy.