Cybersecurity for International Schools

Cybersecurity is a multifaceted and complex issue, particularly in today’s world, where a significant portion of our time, information, and lives is tied to digital devices for both personal and professional purposes. It is increasingly crucial to have a basic understanding of how to secure one’s devices and manage digital security. For schools, the stakes are particularly high, and developing a solid cybersecurity plan is a critical part of managing the risks that come with digital devices in this day and age.

Recently, Dan Taylor, the host of the #internationalschoolpodcast, and I, the co-host, had an opportunity to discuss cybersecurity in schools, with a particular emphasis on international schools. Both of us have a keen interest in this subject. Dan has been actively supporting international schools with Google Workspace for Education’s robust security tools and processes; he has a genuine passion for this topic and has been doing a lot of work in this area. Meanwhile, I have facilitated workshops for parents, created videos and educator sessions, and worked with groups that provide one-on-one support and workshops for seniors navigating digital ecosystems and devices.

For our discussion, Dan shared the excellent slide presentation he prepared for the ECIS Leadership Conference, which he has used to facilitate workshops for school leaders. We took the opportunity to share our perspectives and experiences, offering tips and strategies for school leaders to consider. A podcast version is also available here.

Beyond ChatGPT: Automation in Education

Over the last few weeks, ChatGPT and natural language models of artificial intelligence have created a real buzz in the technology industry and the general media. For schools, their arrival has generated important reflection and introspection on the role of AI, chatbots, and natural language models in schools and the classroom. ChatGPT brings an important moment to think about the role of #AI in our classrooms and schools. How does it refocus and challenge educators’ pedagogy, which in schools is often centered on teaching content knowledge, with assessments designed around tests and exams?

As educators and schools, we need to ask ourselves what the value-added proposition of learning in a classroom and school is in the age of #AI and #ChatGPT3. How do schools position themselves for a future with #AI? We all need to create the space and time, and support community voices, to engage with this creative tension. Finding that time and space, and hearing those voices, will leave us better prepared for such cohabitation.

I had the privilege of participating in a conversation on this topic, Beyond ChatGPT: Automation in Education, facilitated by Camillo Montenegro with fellow educators Tim Evans and James Steinhoff. The recording is available for reference.

Further resources to consider

AI in Education, a collaborative site with resources, lesson ideas, guides, and information to support educators:
https://sites.google.com/ecolint.ch/aiineducation?pli=1

A podcast with three international school IT directors discussing the implications of ChatGPT for education:
https://www.theinternationalschoolspodcast.com/e/88-greg-warren-and-wolfgang-with-dan-and-john-look-at-chatgpt3-in-education/

Digital Citizenship to Digital Fluency

Pfannenstiel, Switzerland

Over the last 18 months, our time spent online has increased to levels perhaps not experienced prior to the pandemic. As we continue to juggle the complexity and nuisances of the pandemic, this may also be an opportunity for schools to re-explore their relationship with digital citizenship. The growing erosion of our privacy, as well as our amplified cohabitation with Artificial Intelligence (AI), presents us with new challenges.

We have all become much more aware of being tracked 24/7 by digital ecosystem grids that are seamless and frictionless parts of our daily routines. In The Age of Surveillance Capitalism, Shoshana Zuboff describes the product of this tracking as “behavioral surplus”: the personal data that we leave on our devices and give away daily, based on a mutual agreement (the user agreement) between digital companies and us. These agreements (when was the last time you read a user agreement?) give permission for our online and offline behaviors to be tracked, collected, monitored, and analyzed at will by companies and, in some cases, governments. The purpose of “surveillance capitalism” is to leverage this “behavioral surplus” to reduce the uncertainty of our desires and better predict what we will do. This is then turned into a profitable commodity. The value of our “behavioral surplus” is unprecedented, and the raw material of human data is fueling the engines of innovation, economics, politics, and power.

Over the past few years, and in some ways accelerated even more in the last 18 months, AI has had a growing impact on our lives, more often than we realize. Daily, it seems, we develop a growing dependency on this cohabitation with AI: be it our GPS, home assistant, iRobot vacuum cleaner, health device, dating apps, smartwatch, or smart TV. For our students, this seamless integration of AI into our lives often comes as a frictionless change. TikTok is a great example of this – a social media platform with sophisticated AI and unprecedented tracking algorithms, which in a short time added one billion users. Almost overnight, TikTok became a teen favorite and serious competition for Snapchat and Instagram. Many educators approach new digital consumables with hesitancy, but somehow the convenience is often enticing enough for us to succumb to the charm of “smart” and “wifi-ready” products.

I have worked with groups of educators and students to build a series of lessons around ARTE’s Do Not Track in order to highlight the complexity and intricacies of how we are tracked. The different episodes are thoughtfully constructed, with interactive components breaking down the erosion of privacy. I am surprised how often a percentage of students confidently express their indifference to this erosion of privacy and its implications. In some ways this makes sense: if the current privacy landscape is their sole point of reference, the current state of privacy is interpreted as normal. In comparison, educators interacting with ARTE’s Do Not Track respond with far more anxious discomfort, since for many of them this erosion is measured against experiences in which they felt greater control over their privacy. As we re-explore digital citizenship, we need to take these varying perspectives into consideration.

The fact is that most of our students are highly proficient digital consumers, not digital natives. The same goes for many educators. If we think of our own interactions with digital environments, it is very likely that most of our time is focused on consumption over creation.

We need to consider re-framing how we support educators and students in a school setting away from a sole focus on digital citizenship to a broader focus on digital fluency. This requires us to develop an approach where the focus is on developing purposeful connections to our digital ecosystems with the goal of becoming ethical digital creators of content. 

The concept of digital fluency builds on the work of the DQ Institute, its DQ Framework, and the eight digital intelligences. Digital fluency facilitates an approach in which learning opportunities are constructed around the natural connections between our day-to-day lives and these eight digital intelligences. Importantly, this focus does not exclude other essential learning in the curriculum. To make it meaningful, digital fluency needs clear connection points to personal experience, needs those connection points to be purposeful, and needs to build on the learning already taking place in a school’s curriculum and the different learning pathways of its units of learning.

Grade 6 responses to a survey asking which of the eight digital intelligences they would like to focus on, ranked 1–8

The graph above is one sample from several surveys of Grade 5 and 6 students asking which areas of the DQ Framework they would like to learn about and focus on. Interestingly, there was a clear pattern across several groups: Digital Safety was the highest priority among the DQ competencies.

An important aspect of this is allowing student voice to actively guide the design of these digital fluency connections. Students are identifying valuable needs, and keeping this communication open is key to making the shift meaningful to them.

Here are some examples of what digital fluency could look like, and of what some schools are already actively creating. One example is giving high school students a LinkedIn account and spending time exploring what it means to have a public profile and how to curate a positive digital footprint, as compared to a personal social media footprint. Other schools are creating blended courses for parents on the difference between the pedagogic use of digital devices in school and the challenges of the more open-ended environment of device use at home. Another example is having students develop public service announcements about malware and then coach younger students on how to identify phishing emails and manage an antivirus app. Another is walking through the architecture of effective password creation and developing sustainable strategies to ensure a solid level of security in students’ personal lives, published as a podcast. Or having students coach their parents through the privacy and security settings of their favorite app and create a how-to screencast.
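To make the password activity concrete, here is a minimal sketch in Python of the kind of passphrase generator students might build and then critique. It is a hypothetical illustration (the word list and word count are placeholders I chose), not a script from any of the schools mentioned above.

```python
import secrets

# Tiny illustrative word list; a real lesson would use a much larger list,
# such as the EFF long word list, to increase entropy.
WORDS = [
    "orbit", "velvet", "glacier", "pretzel", "lantern", "cactus",
    "meadow", "quartz", "saffron", "tundra", "walrus", "zephyr",
]

def passphrase(n_words: int = 5, separator: str = "-") -> str:
    """Build a random passphrase using a cryptographically secure RNG."""
    return separator.join(secrets.choice(WORDS) for _ in range(n_words))

if __name__ == "__main__":
    # Each word drawn from a 12-word list adds roughly 3.6 bits of entropy,
    # so the size of the word list becomes a natural discussion point.
    print(passphrase())
```

Comparing the strength of such passphrases with that of short “complex” passwords is exactly the kind of architecture conversation the activity is meant to open up.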

It is through these activities that participants build a set of dispositions, skills, and knowledge that gives them a sense of autonomy in addressing the complexities, challenges, and opportunities of the digital ecosystems we are so intimately connected to.

The new year, 2022, at our doorstep will be even more intrinsically connected to cohabitation with AI and a continued dilution of the autonomy we have over our privacy. Scaffolding digital fluency as an essential part of the learning pathways provides a guide for students to shift their energies away from being passive digital consumers and toward being active digital creators. Digital fluency provides a mindset for better understanding the ethical responsibilities of digital creation and the implications of the digital ecosystems that permeate our lives, both visible and invisible. Ignoring this will only amplify a society of passive digital consumers, while eroding our free will.

John@beyonddigital.org

Sources-References

Zuboff, Shoshana. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. Goodreads, 15 Jan. 2019, https://www.goodreads.com/book/show/26195941-the-age-of-surveillance-capitalism
Asthana, Anushka, et al. “The Strange World of TikTok: Viral Videos and Chinese Censorship – Podcast.” The Guardian, Guardian News and Media, 7 Oct. 2019, https://www.theguardian.com/technology/audio/2019/oct/07/strange-world-tiktok-viral-videos-chinese-censorship
Park, Yuhyun. “8 Digital Skills We Must Teach Our Children.” World Economic Forum, www.weforum.org/agenda/2016/06/8-digital-skills-we-must-teach-our-children/
“What Is the DQ Framework?” DQ Institute, www.dqinstitute.org/dq-framework/
“Coalition for Digital Intelligence.” Coalition for Digital Intelligence, www.coalitionfordigitalintelligence

Beyond Screens: Managing the Screen Time Dilemma

I am sharing here from Digital Life, a series for parents created by colleagues in the IT and Library departments. The concept is managed and hosted by my colleague Nancy and the Communications team. In this session we explore screen time and some of the dilemmas we all juggle as parents, students, and staff. All credit, resources, and inspiration go to https://tacticaltech.org/#/ and https://datadetoxkit.org/en/home, which over the last couple of years have been an outstanding resource and guide for much of the work I get to facilitate with colleagues at school. In this session I share the different types of screen time and some strategies to consider.

Puppets on a string

CC0 License: https://www.pexels.com/photo/bag-electronics-girl-hands-359757/

This post is inspired by an L2Talk I gave at the Learning2 Europe conference in Warsaw.

“Every storyteller has a bias – and so does every platform.” – Andrew Postman, “My Dad Predicted Trump in 1985 – It’s Not Orwell, He Warned, It’s Brave New World.” The Guardian, Guardian News and Media, 2 Feb. 2017

I am an addict. Are you too? Don’t you hate it when you can’t find your phone and a friend has to call it? Maybe the first thing you did this morning was check your phone, and the last thing you did today was check your phone. Think of it: we walk and text, and even drive and text. Has this happened to you: you are in a social situation and you slip off to the bathroom to check an update? You are standing on a street corner and suddenly realize you are swiping at your phone, unconsciously. Then there is the feeling you get when you post a picture on a social media feed. The “likes” start coming in. It feels good, really good, and you check back again and again. Then you post an update and there are no “likes”. You start wondering to yourself what is going on.

(CC BY 2.0) Photo taken by Angus MacAskill “Rat” https://www.flickr.com/photos/19951543@N00/3908678004/

I am sure you’ve heard about B.F. Skinner’s rat experiment. The first rat had a lever in its cage, and every time it hit the lever, food would come out. The second rat, in the same set-up, hit the lever and nothing came out: no food. For the third rat, in the same set-up, hitting the lever sometimes produced a little food, then nothing, then a lot, and then nothing again. The third rat developed an addiction. It quickly learned that as long as it hit the lever, it had a chance of getting some food. This is called the principle of variable rewards: that feel-good feeling, the dopamine rush. Behavior design, as explained in the article “The scientists who make apps addictive” by Ian Leslie (1843, The Economist, October/November 2016), is a critical part of app development. Tech companies employ behavioral economists, psychologists, and psychiatrists in the creation, design, and curation of our app ecosystems to ensure we keep coming back.
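To make the principle concrete, here is a toy Python simulation of the three lever conditions. It is my own illustration (the payout values are arbitrary), not taken from Skinner or from Leslie’s article; the point is simply that the variable lever is the one whose payoff you can never predict.

```python
import random

def pull_lever(schedule: str) -> int:
    """Return the pellets dispensed for one lever press under a given schedule."""
    if schedule == "always":      # rat 1: every press pays out
        return 1
    if schedule == "never":       # rat 2: no press ever pays out
        return 0
    if schedule == "variable":    # rat 3: unpredictable payout, sometimes large
        return random.choice([0, 0, 1, 0, 3, 0])
    raise ValueError(f"unknown schedule: {schedule}")

if __name__ == "__main__":
    random.seed(7)  # reproducible demo
    for schedule in ("always", "never", "variable"):
        rewards = [pull_lever(schedule) for _ in range(10)]
        print(f"{schedule:>8}: {rewards}")
```

Swap the pellets for “likes” and the ten presses for ten feed refreshes, and the parallel with our apps is hard to miss.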

So many of our interactions with devices are subconscious. In Eric Pickersgill’s thought-provoking photo series “Removed” (do spend some time on the link), he highlights the idea of being alone together, as Sherry Turkle so aptly describes in her book Alone Together. We are often physically together with another person in a space, sometimes even intimately, but our minds are burrowed in a phone.

As adults, we are quick to point the finger at kids for not being able to manage their screen time. Think of this: the first time an infant interacts with a digital device is watching a parent use one. What does it feel like for a child in a pram looking up at their parent, only to see a blank expression immersed in a smartphone? The dinner table conversation is interrupted by parents checking work emails. Mary Aiken, in her book The Cyber Effect, states that we are asking the wrong question. She writes: “We should not be asking at what age is it appropriate to give a digital device to an infant, but be asking the question when is it appropriate for an adult to interact with a digital device in front of an infant.”

A good example of behavior design is Snapchat and its “streaks” feature. The idea of streaks: if you have a dialogue with a friend within 24 hours and continue this over consecutive days, a flame emoji shows up. In tandem, a number counting your interactions keeps a tally. Should one of you stop posting, an hourglass shows up, giving you a heads-up that the streak will disappear if you do not keep it going. For adolescents, social media relationships can be a gauge of their social capital, and streaks add a layer of complexity to these interactions.

I am not against digital devices. I have been working in Education Technology as a coach, coordinator, IT Director and Director of eLearning for over 20 years. I love the seamless and frictionless experience of our digital environments.

By Jim McDougall from Glasgow, Scotland (Puppets on a String Uploaded by Snowmanradio) [CC BY 2.0], via Wikimedia Commons
It is a fact that our online data (health apps, social media, travel, online games, GPS, shopping, search, etc.) is collected, analyzed, and then sold to third parties, or curated to give us a personalized online experience with the clear goal of manipulating our behaviors. We as educators have an ethical responsibility to be skeptical of behavior design’s narrative. Let us challenge our learning communities to question the complexity and consequences of behavior design in our lives. Stuffing a digital citizenship lesson into 15 minutes of a Friday morning advisory is not enough. We need to make this narrative an integral part of the living curriculum.

Do we want to end up being puppets pulled by the strings of choreographed digital ecosystems which we do not control?

I think it is important to understand that schools are most likely the last place where children interact with digital devices with balance and pedagogic purpose. We cannot take this for granted.

If we ignore behavior design, we will lose something: free will. You and I do not want to lose this.

John @ http://beyonddigital.org


If we forget to look out of the window.

Photo by John M
A window out

Every year has its moments, and 2016 was no exception. Various significant shifts occurred, including changes in the political landscape in the United States, United Kingdom, and other countries around the globe. And the horrors of war, civil strife, terrorism and an underlying global tension have been constantly fed into our digital lives from the comfort of our screens.

As we consume the aggregated algorithmic social network feeds, each customized to ensure we get what we want to digest, we are choreographed into a more divisive world.

Information is power. This year, the pollsters, news agencies, and pundits got caught out with two big votes, and so many predictions seemed off.

Our landscape of information has entered a level of Orwellian curation, and what is news, fact, or reality seems dictated by emotion and by perspectives constructed from our own curated news feeds. They are rarely factual. “Post-truth” – see “Oxford English Dictionary Names ‘Post-Truth’ Word of the Year” by Jon Blistein – is the word that defines these moments and the shift to a new narrative.

For many of us, this Orwellian curation has us struggling to distinguish fact from fiction. The level of sophistication not only of the algorithms but of how they are manipulated to shift thinking is the new power. In schools, various studies tell us that our students’ capacity for media and information literacy is weak (“Students Have ‘Dismaying’ Inability To Tell Fake News From Real, Study Finds” by Camila Domonoske). When you consider that we as adults struggle with this landscape, it is no surprise that our students struggle too.

In a world of algorithms, where the sophisticated digital curation of social media, news, blogs, and video feeds can be manipulated to match an individual’s perspective, the challenges we face as educators are immense. This manipulation, shared in the sobering article “Google, democracy and the truth about internet search” by Carole Cadwalladr, highlights the complexity of being truly media literate. The prevalence of third-party curation in social media feeds during elections, highlighted in the article “Macedonia’s fake news industry sets sights on Europe” by Andrew Byrne, emphasizes the challenges we all face in understanding what is “real” news.

To be complacent is short-sighted in a school setting. There is a tendency in school professional development not to explicitly address, as an essential part of our professional learning, the digital reality that engulfs our lives. Information and media literacy are what frame our own democratic values: choice, perspective, empathy, resilience, and critical thinking. If we as educators are going to assign students critical thinking tasks and ask them to engage with media and information while juggling screen time in a complex digital landscape, we cannot be passive bystanders.

As school leaders, we need to re-frame our engagement with the role of digital life in professional development. Together, we need to understand the complexity and impact of algorithmic information flows on our devices.

We also need dedicated spaces for this professional learning. We must learn how to mentor information flows, authenticate media, source perspectives, and understand the pedagogic impact of curated news. We must approach this with patience and empathy, allowing everyone to build an understanding of the digital flows we live by, and tapping into the talent of our librarians and digital coaches as guides. We must take advantage of the frameworks available to us (e.g. #1 or #2) and use them ourselves, as a point of reference for a pedagogic consensus on how to mentor our school community.

The paradigm shift asks us to look at Digital Intelligence as a core intelligence, defined by http://www.projectdq.org as “the sum of social, emotional, and cognitive abilities essential to digital life” and shared in the World Economic Forum article “8 digital life skills all children need – and a plan for teaching them”.

Digital Intelligence needs to be woven into the curriculum. We do this on a daily basis with all other aspects of the curriculum. Let us do it with Digital Intelligence. Re-structure the focus and content to explicitly encompass screen-time management, privacy management, cyber security management, digital footprints, and digital identity; use these to make authentic connections based on our experiences. Then, reflect on our digital habits, likes, tensions, questions and understandings to create activities to share. In this process, we should hope to find comfort in being honest with our own vulnerabilities.  We can then use this life-learning to support our students’ understanding of digital intelligence.

Being explicit about implementing Digital Intelligence in faculty professional learning ensures it is an essential part of our educators’ professional growth. Working together, as adult learners, we need to harness the complexity of the choreographed digital world. By ensuring this is in our professional learning landscape, we are then empowered to share our digital intelligence with students. It is the only way to counter an Orwellian curation of information in a “post-truth” world.

A wonderful resource by Joyce Valenza: Truth, truthiness, triangulation: A news literacy toolkit for a “post-truth” world

John @ beyonddigital.org

Living in a “GAFA” world.

Think of which Google, Apple, Facebook, and Amazon services and products you use daily. How much are they a vehicle for communication, work, social life, purchases, and tasks? How often do you connect to them? Count the number. How many? Surprised? Now, out of the four companies, how many do you use, and how many do you avoid? The reality is that you probably use at least one of the four, if not all, very frequently.

Lac Léman, Rolle, Switzerland – photo J. Mikton

Welcome to the “GAFA” (Google, Apple, Facebook, Amazon) world. The “GAFA” world is where most of humanity’s internet users and consumers work, communicate, socialize, learn, entertain themselves, and share, within services provided by one, two, three, or all four of these companies: the “GAFA” grids.

We have become comfortable with “GAFA” being part of our lives in multiple venues, and as a result, schools, educators, students, and parents are investing significant amounts of money in “GAFA”. It is an essential component of our ability to function at school and at home, and the collective convenience and seamless experience of “GAFA” intoxicates us.

In Terry Heick’s (@TeachThought) thoughtful article “How Google Impacts The Way Students Think”,  he highlights how learners working in a Google ecosystem develop an appetite for a black and white information age.  The expectation? Immediate answers, 24/7. The convenience of this immediacy creates an illusion of thinking, but actually disengages the user from deep critical thinking. It does this by simplifying the process of gathering information and giving the impression it is all connected.

In order to have a constant infusion of innovation and creativity, “GAFA” also hungers for start-up companies. By absorbing these companies, they are able to facilitate the pollination of ideas, products, and services and enrich their ability to generate more seamless methods of connectivity. In this way, “GAFA’s” largeness and versatility are ingrained in all aspects of our lives.

This innovation also provides “GAFA” with opportunities to tie our lives closer together across multiple platforms and venues in a frictionless environment. Examples of this reach are Amazon’s cloud service, which hosts large architectures of company websites, services, and databases, including the CIA’s; Google moving into the home with Nest and pursuing the development of artificial intelligence (Dark Blue Labs and Vision Factory); Apple’s acquisition of Emotient, a company that specializes in emotion recognition, and its investments in health apps and services; and Facebook’s expansion into virtual reality and its push to make its services ubiquitous, as with free wifi-with-check-in in hotels and small businesses. Facebook’s purchase of WhatsApp is another example of a “GAFA” company spending billions on an innovative service.

The algorithms provide a treasure trove of information with which to understand our behavior, habits, aspirations, and desires. In Raffi Khatchadourian’s article “We Know How You Feel”, we are reminded that the hunger for data is tied to a hunger for emotional interactions. In Shelley Podolny’s New York Times article, “If an Algorithm Wrote This, How Would You Even Know?”, she highlights the sophistication of writing algorithms that generate news articles and books. In tandem, the growth of the “Internet of Eyes” in the objects we interact with, as part of the “Internet of Things”, brings a new dynamic to data mining. It is a reminder that many of the algorithms being designed within “GAFA” play an almost non-negotiable role in our lives.

Many schools believe that their curricula should allow for authentic connections to the world around them. What about “GAFA”? Should we as learners, guides, mentors, and facilitators highlight “GAFA”? Is this important? Should its presence be considered in our learning outcomes? To ignore “GAFA” is to create a disconnect from the changes reshaping all of our lives; it sidelines a reality that is the future. What does “GAFA” mean to us, our schools, our communities, and our educational institutions? Schools have a responsibility to ensure this is part of the curricular discourse. We need to construct learning moments and scaffold time to pause, reflect, understand, explain, and think critically about what it is to live in a “GAFA” world.

If personal privacy, independent thought, critical thinking, differentiation, balanced perspectives, mindfulness, and our capacity to be unique are in our school’s mission, we need to address what it means to be curated by “GAFA”. Will we not lose an important aspect of our humanity if we continue to ignore “GAFA”?

John@beyonddigital

P.S.: Next time you are at a Starbucks drinking your coffee, remember that the free wifi is a “GAFA” gift!