22 October 2016

1:1 - Why?

1:1 via edtechteacher.org

Every now and then I come across an article that, while seemingly innocuous on the surface, causes me incredible consternation. Articles like this:

"Kids Who Have to Share iPads Learn Better Than Kids Who Have Their Own".

The article is not a new one, but like many articles of its ilk, it has a habit of resurfacing periodically, as it did this week, finally motivating me to put fingers to keys.

There are so many things wrong with the assumptions made by the writer of this article that it's hard to know where to start. So, in the absence of any better course of action, I'll start at the beginning.

Firstly, can we all just assume that sharing is of course a good thing, and so, by implication, is learning to share? But the truth is that it's the sharing that is beneficial, not the device being shared. I see kids sharing and collaborating all the time, even when using their own screens; the extent to which this happens is all to do with the classroom culture carefully crafted by a caring teacher, and nothing to do with the nature of the particular item.

Secondly, what is the evidence basis for the findings of the research? Performance in “a standardized literacy test at the end of the year compared to the beginning”. Oh, that’s okay then; God forbid we should have any other metric in school for judging the efficacy of an initiative than a test. I hate to imagine what the nature of this test was, but something tells me it involved a lot of multiple-choice questions, maybe even a few cloze passages... I loathe the way so many of these studies assume that standardised tests are an acceptable measure of success for everything. They're not acceptable; they're completely unacceptable, not to mention largely irrelevant... Just because something is easy to measure doesn't make it valuable. Plenty of other people have made this case better than I could here, starting with the magnificent Alfie Kohn.

An improvement of 28% vs 24% in a study of 352 students really is not statistically significant, despite what the study's author says, which is another reason why we don't rely on a single source for anything of any real substance. Then, if that wasn’t bad enough, the study extrapolated the results of a literacy test to relate to the students' work with basic geometry?
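For what it's worth, a back-of-the-envelope check bears this out. The figures below are assumptions, not the study's published detail: I treat the percentages as the share of students in each group who improved, and split the 352 students into two equal groups of 176. A simple two-proportion z-test then looks like this:

```python
"""Rough sanity check on the 28% vs 24% claim: a two-proportion z-test.

Assumption (not from the article): the 352 students split evenly into
two groups of 176, and the percentages are the share of each group that
improved. The real study reported score gains, so this is only an
illustration of how small such a difference is at this sample size.
"""
import math

def two_proportion_z(p1: float, n1: int, p2: float, n2: int) -> float:
    """z statistic for the difference between two independent proportions."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

z = two_proportion_z(0.28, 176, 0.24, 176)
# |z| must exceed ~1.96 for significance at the 5% level
print(round(z, 2), abs(z) > 1.96)  # prints: 0.86 False
```

Under these (admittedly invented) group sizes, z comes out around 0.86, nowhere near the 1.96 threshold for p < 0.05, which is consistent with the scepticism above.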

I could possibly accept basing the efficacy of a study on a standardised test if the focus of the study was specifically related to the test (working on improving spelling, for example), but in this case, as in most cases of this kind, they make no effort whatsoever to relate the standardised test to the actual nature of the use of the devices. Which tells you a great deal about the study: they didn't feel it worthwhile to describe what the iPads were actually being used for, which would seem to be as glorified textbooks, and which would explain why they felt a standardised test would be a valid measure. All they are concerned about is measuring the extent to which students have absorbed specific surface content, without any consideration of deep conceptual development, creativity, and all those other soft skills that really do matter much more. You see, in a classroom where all the iPads are used for is glorified textbooks, educational "games" and skill drill, then sharing one iPad between five, or ten, or even twenty really is not a problem. But a classroom where the teacher expects kids to actually create things that are meaningful over time is a classroom that benefits from the lowest possible ratio of students to devices.

What has all this got to do with 1:1?

Whenever I encounter someone who is under the impression that providing students with their own device is a little, well, excessive, I know there is something profoundly dubious about the assumptions they make about the way we encourage students to use these devices. The truth is, you can be sure that any advocate for shared devices never shares their own device 50:50. Can you imagine how far you would get in your daily work if you had to share your laptop 50:50 with a colleague in the office? You can be sure that the same person so gleefully anticipating a social nirvana where all of these students happily share their devices is suffering from a profound case of media bias, or device disorder. I’m sure the same person would never countenance asking the same students to share a pencil, or a paintbrush. How about an exercise book? You start from the front, and I’ll start from the back... These devices are all tools, very few of which were purpose-built for a classroom, but all of which can be very successfully repurposed for an educational context by skilled teachers. Teachers who are blasé about the need for students to have their own devices tell me more about the lack of importance they attach to the device than about the use of it.

Don’t misunderstand me: I am not saying that a 1:1 context is a prerequisite for successful learning; many teachers all over the world do amazing things every day with limited resources. But that doesn’t mean this paucity of resources is something they find preferable! Anyone who thinks so clearly has never attempted to use these devices themselves.

Allow me to illustrate with an analogy.

Cycling is good for you; it’s also much less harmful to the environment than an aeroplane. So next time you want to travel between, say, London and Singapore, don’t fly, cycle!

This logic only makes sense if you have never had to actually travel between London and Singapore yourself (and if you’re not in a hurry). There is something to be said, too, for determination: I have a good friend who shares his laptop with the 24 kids in his class, on a rota basis. Do they benefit? Yes. Is the sharing beneficial for them? Maybe. Is this his preferred arrangement? Of course not.

Back to the bicycle.

Would I ever countenance the idea of cycling from London to Singapore? No ... unless that was the only way I was ever going to visit Asia, and time was no object. Consciousness of the desirability of the goal has a direct bearing on one's determination to persevere despite the obstacles that may be present. Would it be good for me? Yes. So am I going to do it? No. I am not. Well, maybe. For many years, teachers who are profoundly aware of the value of designing experiences that enhance their students' learning with digital tools have persevered despite many obstacles to make this a reality, but would they prefer 1:1? Of course they would. How do I know? I was one, more than once. Scavenging abandoned computers, salvaging parts, and spending hours beyond number to build a rudimentary ‘lab’ for my students was a frequent experience for me in the early days, at the turn of the century, when I was wrestling to enhance my students' learning and ‘TEL’ was yet to become a ‘thing’.

1:1 works better - shall I count the ways?

When my school announced five years ago that we were embarking on a tech enhanced learning (TEL) initiative, it was assumed that the 1:1 ratio applied only to older students, middle school and up. While the ratio of devices in the Primary School was going to be improved, from about 5:1 to more like 2:1, the intention was never to provide 1:1 in the primary school as well. So what changed their minds? I did.

Can we work with shared devices? Yes. Can we work better when we have our own device? Yes. Interestingly, the main pressure to go 1:1 came from our teachers, even after we expanded to a 2:1 ratio: the more effective they became at utilising digital tech, the more ridiculous expecting the kids to share devices became.

The truth is that the benefits of 1:1 have really surprised me; I was somewhat oblivious to how powerful it is, even just from a logistical standpoint. With shared devices it is all too common for students to accidentally delete each other's work, which is quite soul-destroying; and especially with video editing in the junior school, attempting to work on a project over several weeks is impossible on a shared machine. This means that any creating on the device (the most important use) has to be confined to short, simple activities that can be started and completed within one lesson, which really does diminish the power of those tools.

This means that the main reason for going 1:1 is not really about two kids needing to use the device at the same time, although that is a factor; it's about honouring and protecting the importance of the media created by each individual child. The biggest advantage I found in going 1:1 is that the work on the device cannot be accidentally tampered with or deleted by a well-meaning (or maybe not so well-meaning) friend. If all the kids use the device for is shallow tasks like skill-and-drill apps, taking tests, and passively consuming media, then clearly sharing is less of an issue. However, I think this actually highlights a bigger problem! If we are encouraging our kids to do meaningful creative work on these devices, then they will have media saved on the device that they would be upset about losing if it were accidentally deleted by a classmate.

Not to mention the issue of 'ownership'. A child who is responsible for their own device, apart from the obvious personal and social merits of having to take that responsibility, is also a child who feels that the work on there is all theirs. This aspect became apparent very quickly: kids really do benefit from their "ownership" of one device, including in ways we hadn’t anticipated, such as customising it so that it operates the way they want it to, using a picture of their face for the wallpaper, and being able to choose to share content on their iPads with their parents directly; this is the kind of thing that a one-to-one environment makes very straightforward but that a shared environment would make quite difficult. It even extends to the physical device itself: there is something quite empowering about sharing ‘their’ device with their parents at a parent-teacher conference, even at such a young age. This kind of "ownership" also encourages a sense of responsibility that is powerful in terms of 'digital citizenship': the teacher can expect the student, for example, to curate and manage the media in their camera roll responsibly, and there is no way the student can evade that responsibility by blaming other students who also use the iPad, a common issue with shared devices.

So when I encounter people who are under the impression that 1:1 is excessive (the implication of the article above), I know there is an assumption behind these ideas: that the digital tools are used so infrequently and so ineffectively (ie skill drill, and games) that expecting kids to share them is no big deal. But in classrooms where these tools are effectively integrated and used to record, reflect and create, they are actually very difficult to share, not because of a lack of willingness to do so, but because both kids actually need to use the device at the same time, and both really value the content they are curating and collecting on their own device. You can be sure the journalist who wrote the article wasn’t using a machine she was sharing; why?

She uses it to create

15 June 2016

Laptops and Lowest Common Denominators

A colleague shared the outcome of recent research into the efficacy of 1:1 laptop schools; more specifically, it was a meta-analysis of 10 studies that examine the impact of laptop programs on students’ academic achievement.

I read it with mixed feelings. On the one hand, the fact that the researchers concluded that schools where students have their own laptops see "significantly positive average effect sizes in English, writing, mathematics, and science" was encouraging. They felt that "the impact of laptop programs on general teaching and learning processes and perceptions was generally positive".

So that's good then. Right?

Well, no; not really.

I couldn't help but notice that their expectations of the actual use of these devices were, to put it mildly, far from tapping into the true potential of these devices. This is despite their inspirational opening clarion call to change the world,

"We believe that the affordances of computers for learning and knowledge production are radically different from those of radio, television, and film, which explains why computers, unlike those previous technologies, are bound to have a very different educational fate from the one suggested by Cuban (1993a, p 185), who wrote that “computer meets classroom: classroom wins.”"

So exactly what uses do they have in mind? How do they envisage these radically different affordances? By inspiring the creative expression of learning through the exciting synergies between video, image, text, audio and the deft analysis and application of data?

No, they see the main affordances of these devices in terms of use "to write and revise papers, conduct Internet searches, and engage in personalized instruction and assessment using educational software or online tools". (p 2)

What? That's it? These devices hold the potential to radically transform their world, but let's just use them to type up reports (so they're nice and neat), Google stuff, and take online tests.

How depressing.

Then it gets worse. Having applied the law of the lowest possible expectations of these tools, they proceed to use the worst possible measure to determine their efficacy. We find ourselves in familiar territory: faced with the option of assessing the aspects of learning that are the most important (creativity, solutions, innovation etc), but of course the most difficult to quantify, they instead opt to measure the aspects of learning that are easiest to quantify, with, yes, you guessed it, standardised tests:

"quantitative findings of the impact on students’ academic achievement. [...] Measurements of academic achievement were standardized assessments or norm-referenced district- or school-wide tests." (p 5)

So the measure of efficacy all boils down to that which can be measured on a standardised test. How depressing, and how inappropriate for a medium as rich as digital technology. Like judging a ballerina's dancing ability based on her spelling. So we use them ineffectively, then assess the efficacy of their use in ways that are utterly unsuitable. Is this really what we expect when we talk about 'technology enhanced learning' in 1:1 environments?

I sincerely hope not. I tell you what though, it would make my job a lot easier if I did.

To be fair to this study, they do accept that there are problems with the ways they are assessing the efficacy of these devices, "studies on this topic have largely done a poor job of assessing learning outcomes that are not well captured by current iterations of standardized tests. As the United States and other countries move to more sophisticated forms of standardized assessment, these new measures may be better aligned with the learning goals believed to be associated with laptop use." (p 25)

I have to wonder whether the corporate world is as obsessed with trying to validate the influence of digital technologies in the workplace as we are with attempting to defend them in the classroom. Do any of us really believe that the corporate world would be better off without digital technologies? Then why would we believe that classrooms would be better off without them? Do we really believe the corporate world would spend the millions, if not billions, it must cost every year to maintain their IT infrastructures if they did not feel it was essential, important, effective?

Don't Settle

As it is, I'm not prepared to settle for a lowest common denominator approach, where we abandon any attempt at using these devices anywhere near their potential, and instead settle for using them in the ways that are the easiest, and therefore the most common, even if they are far from being the most effective. By easiest/least I mean ways of working that most closely replicate the traditional approaches to learning that were the norm before the advent of the digital revolution: writing becomes typing, and researching in the library becomes Googling, the results of which we present, in writing. That's it.

Vitamin D[igital] Video, Image, Text, Audio, Data. VITAD.

No, these laptops should be used to create in all five domains, not just text, but image, video, audio and data, and all sorts of overlaps between them. These technologies should exploit all of the attributes that digital tools excel at; situated learning/working, access to global communities/experts, multimodal artefacts, mutable work flows, sharing and collaborating on content using social networks.

In the paper, they state,

"Contrary to Cuban’s (2003) argument that computers are “oversold and underused” (p 179) in schools, laptop environments are reshaping many aspects of education in K–12 schools." (p 24)

But the truth is that if all that is being exploited is their use as word processors and web browsers, these machines are definitely underused. I suppose this could be described as giving up on transformation and focusing on amplification instead; if I'm honest, is that a bad thing? It would certainly make my job easier... Maybe having tech oversold and underused is better than having it ignored and unused.

I guess the question I need to wrestle with is whether I need to lower my expectations... Maybe if we just focus on using one domain effectively (text), we'll still see benefits in terms of learning, perhaps more effectively and consistently; less is more? Perhaps, but I doubt it; focusing on just one domain out of five strikes me as making as much sense as buying a car and using it only to keep you dry in the rain, or cool in the sun.

Useful? Yes.

Appropriate? Maybe.

Ideal and Effective? ... 😕🤔😬

Zheng, B., Warschauer, M., Lin, C. H., & Chang, C. (2016). Learning in One-to-One Laptop Environments: A Meta-Analysis and Research Synthesis. Review of Educational Research, 0034654316628645.

09 April 2016

Desktop Zero - 4 compelling reasons to make this an essential habit

Yes, this is mine. No, I did not cheat; well, maybe one folder...

You don't need me to tell you that your environment affects your productivity. Since a great deal of our work is now done on a screen, it stands to reason that your desktop environment can play an important role in your productivity. Seriously, can you really look me in the monitor and tell me that you'd rather work on a desktop that looks like this?

Messy Desktop by RuthOrtiz

Four Reasons to Change

There are plenty of reasons for making the effort to aim for 'desktop zero', I'll attempt to lay out a handful for you here:

It is Irresponsible. 

Desktop etiquette—every teacher is a role model, and as a teacher, every time you share your desktop with your students, you demonstrate to them the kinds of organisational and work habits you expect them to imitate. 

Every time we share a cluttered desktop with a class, or even with parents, we effectively also share our inability to self-manage, our lack of organisation, perseverance, diligence; need I go on? The biggest problem is that all of these behaviours are built on bad habits, and these are bad habits I see teachers (and parents) passing on to their children every day.

It is Insecure.

Ironically, one of the most common reasons I hear for storing files on the desktop is their critical importance: 'those are files I need, and I can't afford to lose them...' Really? Because unless you are in the habit of fastidiously backing up your Mac with Time Machine, like every day (in which case you are probably already at Desktop Zero, or close enough), you run the risk of losing it all; one hard drive failure, and that's it, all gone. Desktop files are, in my experience, the most common place where data is lost. If those files had been placed in a Google Drive folder (or Dropbox) they would have been safe: literally every edit backed up, in real time. But nothing on your desktop (or your students', if they're imitating you) is being backed up to the cloud. Nothing.

Top Tip - on the Mac, you can create an alias (right-click, or command+option drag and drop) of any 'buried' folder or file so that a shortcut to it sits on the desktop. It acts just like the real thing (the parent folder or file), but with the advantage that the original remains safely ensconced within a cloud-backed-up folder.
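For those who prefer to script it, a symbolic link achieves much the same effect. (A symlink is not identical to a Finder alias under the hood, and the paths here are only examples; adjust them to wherever your cloud client actually syncs.)

```python
"""A scripted variant of the alias tip: a symbolic link on the desktop
pointing at a folder that really lives in a cloud-synced directory.
A symlink is not identical to a Finder alias, but the effect is the
same for this purpose. The paths in the example are assumptions.
"""
from pathlib import Path

def desktop_shortcut(real_folder: Path, desktop: Path) -> Path:
    """Create `real_folder` if needed and link to it from the desktop."""
    real_folder.mkdir(parents=True, exist_ok=True)
    link = desktop / real_folder.name
    if not link.exists():
        # Files saved through the link land in the synced folder.
        link.symlink_to(real_folder, target_is_directory=True)
    return link

# Example (adjust to your machine):
# desktop_shortcut(Path.home() / "Google Drive" / "Projects",
#                  Path.home() / "Desktop")
```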

It is Inefficient.

Your computer's desktop is the starting point for your entire computing experience, but, like anything else, if you let it get cluttered your productivity will take a dive and your stress levels will rise; few things are as frustrating as you or your students not being able to find a file exactly when you/they need it, especially if that entails creating it again... and again... Next time you save a file to the desktop, wouldn't it be nice to be able to find it immediately, and not have to engage in an insanity-inducing game of 'Where's Wally'? That's a game I have to play almost every day that I work with a teacher on a desktop like ... that *shudders*

Clean-desk-high-productivity-toblender.com [modified]

It Literally Impedes.

Because of the way OS X's GUI (graphical user interface) works, the icons on your desktop take up far more of your resources than you may realise... Just remember that every single icon on your desktop is actually a small transparent window with graphics (the icon) inside, so if you have, say, 100 icons on your desktop you have 100 windows open, each one stealing memory. And no, dumping them all in one folder doesn't really help much; a folder with 2,764 files in it is still a folder OS X will have trouble handling.

Computer Desktop & Table Desktop

When we work with students on this, we are attempting to inculcate good habits, habits that will last a lifetime. One such habit is to work from desktop zero. An analogy we find helpful is for them to treat their computer desktop the same way they treat their table desktop in the classroom: as busy as it can get in the course of a normal working day, before they go home every day they are expected to return that space to what is effectively desktop zero 'IRL' (in real life). Everything gets put in its right place; whether they have finished with a project or not, it goes in the appropriate folder. The difference with computers is that you can actually work on files while they are in a folder; there is no need to take a file out, and so no need to put it back, which is why desktop zero on a computer is easier than desktop zero IRL. In the same way, when you place a file in the appropriate folder (in Google Drive in the Finder) you can leave it there, and work on it while it is in there.

So, with this in mind, shift your conception of the role of the desktop: it becomes a temporary, easy-to-locate, grab, upload, rename, "I need it in ten minutes or so" dumping ground. I only use my desktop as a temporary holding place for files I'm working with; nothing remains there past the end of the day.
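If you like to automate the habit, here is a minimal sketch of an end-of-day sweep. Everything about it is an assumption to adapt to your own setup: the folder names, and the idea of archiving by date into a cloud-synced folder.

```python
"""End-of-day 'desktop zero' sweep: a minimal sketch.

Moves everything left on the desktop into a dated folder inside an
archive. The paths in the example call are assumptions; point the
archive at whatever folder your Google Drive or Dropbox client syncs.
"""
import datetime
import shutil
from pathlib import Path

def sweep(desktop: Path, archive: Path) -> list[Path]:
    """Move every item on `desktop` into archive/YYYY-MM-DD/ and return the new paths."""
    today = archive / datetime.date.today().isoformat()
    today.mkdir(parents=True, exist_ok=True)
    moved = []
    for item in desktop.iterdir():
        dest = today / item.name
        shutil.move(str(item), str(dest))
        moved.append(dest)
    return moved

# Example (adjust paths to your machine):
# sweep(Path.home() / "Desktop", Path.home() / "Google Drive" / "Archive")
```

Run it (or schedule it) at the end of the day and the desktop starts every morning at zero, while yesterday's working files are still one dated folder away.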

Cluttered desk via abcnews.com (Getty Images)

Upgrade Your Workflow

In actual fact the desktop is a folder; it's just the folder you start from, and while it can function as a storage folder, as so many people have unfortunately proven, that is not its purpose. It was only created as an analogy, so people would have something analog to relate the new digital experience to, just like the trash can in the corner: we don't really keep tiny trash cans on the corner of our table tops, but it functions as an approximate analogy. And like most analogies, it has its limits. One way forward is to start working the way you do when you use an iPad or similar device.

Newer OSes like iOS and Android have thankfully ditched the "file icon sandbox" idea. The only thing you are presented with when you look at your device is a launchpad for apps and services. Your data is invisible and agnostic, available only when you are in a program that knows how to display or use it, and you know what? It works just fine; no clutter.

Become more app oriented and less file oriented

In iOS, if you're working on a file, you start by opening the app, then you locate the file from within that app; the exact same method works on the desktop. Working on a Word document? Don't look for the file first: open Word, then you will easily find any recent file. All you need to do is drop down the File menu two spaces from Open, to Open Recent. There, that's not so hard, is it?

Open Recent, don't just Open.

You will find the same feature in any application you use. Trust me. These conventions are cross-platform, which means you will be able to take advantage of this workflow no matter what computer or platform you ever use. Invest in it now, and you will reap the rewards for the rest of your life.

File less, search and sort more

I've written about this already here: spend less time creating and organising folders (although that is important too) and make sure you name your files with keywords you can search for. Instant search is now everywhere on all your devices, and on your Mac you can search in literally any folder you open, from 'All My Files' to 'Documents'; if it's in there somewhere, search will find it, regardless of the folder it's in. But that's no use if the file is called 'Untitled.doc' or "Screen Shot 2015-03-14 at 5.38.12 am". Rename it, then move it.
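The point about naming can be made concrete with a toy version of filename search. Real instant search (Spotlight and friends) is far more capable, but the principle is identical: it can only match what is actually in the name.

```python
"""Keyword search over filenames: a toy sketch of why naming matters.

This simply walks a folder tree and matches case-insensitively on the
file name, the way any search tool ultimately must.
"""
from pathlib import Path

def find_by_keyword(root: Path, keyword: str) -> list[Path]:
    """Return every file under `root` whose name contains `keyword`."""
    kw = keyword.lower()
    return sorted(p for p in root.rglob("*")
                  if p.is_file() and kw in p.name.lower())
```

However deeply a file is buried, a keyword in its name will find it; a file called 'Untitled.doc' is invisible to every search you will ever run.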

Sort out your Sorting

When you have a bunch of files on display in your Finder, make sure you take advantage of the button that lets you 'change item arrangement'; pick whichever option makes it easiest to move the files you want to the top. I personally find 'Date Modified' the most useful, but there are options there for everyone.

Illustration by Ben Wiseman via nytimes

Don't procrastinate, you can do it today!

The solution is not to just create another folder (which is actually inside the folder that is the desktop) and dump them all in there; that just means you've buried the problem. By all means dump all the files into a (cloud-connected) folder (or three or four), just make sure you've deleted the files you won't need again, and give the ones you do need a name you can search for. Once you've done that you'll probably find there are 'themes' forming that lend themselves to folders, but don't let that be an excuse to procrastinate, as you can always change your mind later; computers are convenient like that...

Clean desk[top] policy via awanbee.com

07 April 2016

Aims, objectives and semantics

One of the first goals we were faced with on our first visit with the T2T Cambodia team was to establish what the fundamentals of a lesson need to be. It is not until you are forced to defend your rationale for the structure of a lesson that some of these issues of semantics really come into focus.

Take the typical traditional lesson structure:
  1. Objectives
  2. Activities
  3. Outcomes

With some seasoning from our recent workshops in formative assessment with Dylan Wiliam, this quickly morphed into something a little more nuanced when combined with the 5 key strategies of formative assessment, the first three of which are more or less synonymous with the traditional lesson structure:
  1. Clarifying learning intentions
  2. Eliciting evidence
  3. Feedback that moves learning forward
  4. Students as learning resources for one another
  5. Students as owners of their own learning

We ended up with something more like:
  1. Learning intentions/objectives
  2. Activities that elicit evidence 
  3. Outcomes as a result of feedback  

And before you know it, with a room full of teachers, it looked like this:
  1. Learning intentions/Aims & objectives
  2. Activities that elicit evidence through active engagement
  3. Outcomes informed by feedback and based on clear success criteria 

Now try explaining all that through a translator to a room full of teachers, in a room without air conditioning, in temperatures in excess of 30°C, with only the most rudimentary of teaching resources...

What I have found is that you end up having to distil everything down to the absolute bare essentials, which for me now look something like this; funnily enough, this has enhanced my own understanding of my own practice and, in turn, hopefully improved my practice as a teacher.

For me it has ended up being as simple as:
  1. Aim or Goal or BIG IDEA
  2. Activity
  3. Feedback

But, and this is essential, it has to be iterative.
[Aim achieved? Great. No? Either change the activity (or maybe even the aim), then try again.]

Getting the Aim right is CRITICAL. If the aim is any good, then in order to achieve it you will have to move through a series of "objectives", which will automatically require the achievement of "learning intentions" and the design of an activity that facilitates those goals, but that ultimately has one outcome: the achievement of the aim...


The last IT lesson I observed had learning intentions of:
  • Create a table in a spreadsheet
  • List occupations
  • Add a new column for images
  • Insert images that match the occupations

But what was the AIM? A well-considered aim would make the individual learning intentions redundant. Of course, the aim has to be worthwhile, authentic, meaningful… in this case, because it was a FOCUS lesson, I was able to intervene and redesign the lesson with the teacher right there, right then. What we did was establish an aim, which in this case was...

Use a table to compare a range of at least 5 career opportunities that interest you. Consider the following aspects of each of the occupations you have chosen:

  • Title
  • Brief description
  • Illustration
  • Positives
  • Negatives
  • Salary
  • Qualifications required

With a well-written aim, the specific articulation of learning intentions naturally follows; agonising over them is no longer actually necessary, as they will have to be identified in order to fulfil the aim of the lesson. Don't they need to be expanded? Articulated in sentences? I don't think so; any teacher worth their salt will put the meat on the bones, and hopefully also provide feedback in relation to those specific learning intentions. Whether or not they actually need to write them on the board is another question.

What was even better was that it quickly became obvious that there were quite a few aspects of the occupations that interested students that none of us were in a position to answer… salary, for example. Instead, we asked the students to estimate what they thought the salary per month would be, and then we did the same for each of the other aspects of the occupations they chose. Then (using students as learning resources for one another) we asked the students to compare their work… this sparked some passionate discussions, as some students had (for example) the police officer as the lowest paid while other students had the police officer as the highest paid… Discuss!

What was fascinating was then getting the students to research online to find out the actual answers, compare their estimates with the reality, and reflect on the disparities or consistencies that they found.

What started out as a rather banal activity in table creation and meaningless data entry became a transformational lesson in career guidance, while also fulfilling the (arguably more mundane) ICT requirements. It's all about the aim.

06 April 2016

21st Century Spelling

Spilling had never bin maw impotent

Spelling has never been more important, as my example above attempts to illustrate. Please note that none of the words in the title is actually a misspelling, but mistakes they are, and a right twazzock you will look if your spelling is overly reliant on proofreading tools as a safety net. In an age dominated by screens, misspelling is tantamount to an admission of idiocy, but the way we teach spelling needs to evolve to take advantage of the unique affordances and challenges of spelling in a screen environment.

These days, interacting with others in a digital environment is an extremely commonplace scenario. More critically, people who misspell in these environments are generally assumed to be less intelligent and less articulate; whatever their actual intelligence or experience, their perspective will be dismissed or demeaned if it is littered with misspellings. It has never been more important to master the ability to spell correctly. Unfortunately most schools, despite the criticality of spelling in the 21st century, still rely on 19th-century strategies to teach it. This really does need to change. So, with that in mind...

14 critical considerations:

  1. Spelling should be managed within the context of writing, not as a separate "subject". For that reason a separate spelling book is discouraged; a better practice is to think of and learn about words, misspellings and sounds within the context of writing. For example, words that are encountered that are challenging to spell should be recorded at the back of a student's writing book, not in a separate spelling book.
  2. Less is more: more frequent opportunities for kids to think about spelling, but for much shorter periods of time (10-15 minutes per day).
  3. Make direct connections between spelling and handwriting. The physicality of forming the letters as they are literally connected reinforces the way the sounds are connected, and builds visual reinforcement. The idea is to combine physical, visual, oral and aural practice to reinforce the feel, the shape and the sound of a word.
  4. Children (and adults) can only spell words they know. It sounds obvious, but so many of the spelling lists used with students contain words they do not know, and so could not possibly spell other than through guesswork, which leads us to...
  5. There is much greater validity to the skill of being able to "guesstimate" in a TELE (technology enhanced learning environment), and 'phonological awareness' is more essential than ever, as computers rely on an accurate phonetic estimation to substitute a correct spelling. A student who cannot phonetically 'attack' a word is unlikely to be able to approximate something a computer can correct. Related to this is the critical importance of being able to spell the first half of a word correctly: most modern computing devices can now auto-complete a word if a student can spell the first half of it correctly. Apple's 'QuickType' in iOS 8 and apps like 'SwiftKey' utilise this approach very effectively, and the power of dictation (speech to text) has never been greater, though it will still struggle with homophones (same sound, different spelling and meaning). An alternative approach in a 'spelling test' context is to award up to 2 marks for each word: one mark for spelling the word phonetically correctly, or for spelling the first half correctly, and 2 marks if the word is perfect.
  6. Stop using whole-class spelling tests with lists of words. This is a nonsensical approach, considering the sheer quantity of words in the typical English dictionary, somewhere in the region of 400,000. The words that children learn should be unique and curated from their own literacy life: related to their own writing, reading, speaking, viewing and listening experiences, or to specific vocabulary that they are using or have used (not will use) in a current unit of study.
  7. Wordlists curated by students should be seen as a source of vocabulary expansion, not just spelling, becoming a personal thesaurus/glossary that they review regularly when writing to enhance the richness of their prose; use it or lose it.
  8. Rely less on "spelling rules", which are very rarely consistent and in many cases lead to a great deal of confusion, like when students are asked to note the position of a certain vowel in a word and its impact upon other vowels or consonants within that word, or to use acrostics like 'big elephants can always understand...' (you get the idea, and of course they only work for one word). Instead focus on building familiarity with the way words look and the way words sound. 'Look, say, cover, write, check' still works well as a useful skill/drill practice, but with fewer words, more often. This is strongly related to the student's reading life as a synergetic enabler of their spelling life. It becomes a context where students are encouraged to see words as 'friends' and to build a large community of 'familiar faces'; the more they see these words, the more likely they are to be able to spell them, or, arguably just as important in the 21st century, to recognise when a word is not spelt properly: 'it just doesn't look right'.
  9. Skill/drill tasks (practice makes permanent) should also be related to an activity that reinforces comprehension of the meaning of the word. Ideally students should invent (not copy) a sentence that uses the word, or even better more than one of the words in the same sentence, clearly demonstrating that they can use the word/s with understanding. For some students, especially reluctant writers, it might be better to make an oral recording of the sentence rather than write it, as the focus is on understanding meaning, and oral recall can be just as effective for building meaning; this is especially important with homophones.
  10. More recognition of the kinds of spellings that are particularly relevant to a screen-centred writing environment. This means a greater emphasis on distinguishing between words with similar sounds and different patterns: homophones, homonyms, homographs.
  11. Make smarter use of digital tools to facilitate this kind of practice. While spelling activities built on skill/drill with pre-set wordlists have their place, particularly high-frequency words for younger learners, for older/more proficient students encourage spelling drills built on individually curated wordlists. Unfortunately apps that facilitate this kind of curation are not very common, but at least one that does it very well is Squeebles SP, although you have to ask students to pretend to be a teacher to do so.
  12. Use a word processor to spell check before using a teacher. This could be as simple as a Notes app on a mobile device, enabling students to check spellings without the tedium of using a dictionary. The teacher then reviews the spelling for careless mistakes, or more likely mistakes resulting from misconceptions about phonetics/word structure. Students need to be empowered to build habits of capturing/collecting words that they know but cannot spell in their curated lists. The point is, it is better for the student to attempt to type the word in a text application and have the computer suggest corrections than to search for it in a dictionary. While the latter is still helpful, the former is a better cognitive process for learning the spelling of a word, and is also a more relevant/likely activity or skill set in the 21st century. Very few adults look up words in a dictionary; most rely on the prompt given by the computer in a word-processing environment.
  13. Encourage students to learn how to use the "define" operator in Google, effectively turning any Google search window into a handy dictionary, eg: define: magnificent
  14. Digital technologies are changing which words are traditionally understood to be "tricky" words/spelling demons/sneaky spellings. For example, a word typed with the 'ie' reversed, like 'recieve', will automatically be corrected in a text environment, but the computer will not be able to distinguish between homonyms.
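The two-mark scheme in point 5 can be sketched in a few lines of code. This is a hypothetical helper, not part of any app mentioned above: judging whether an attempt is "phonetically correct" really needs a human marker, so this sketch approximates the one-mark case purely with the first-half check.

```python
def spelling_marks(attempt: str, target: str) -> int:
    """Two-mark spelling scheme:
    2 marks for a perfect spelling;
    1 mark if the first half of the word is correct (enough for
      auto-complete to rescue it);
    0 otherwise. A phonetically plausible attempt would also earn
    1 mark, but that judgement is left to the marker.
    """
    attempt, target = attempt.strip().lower(), target.strip().lower()
    if attempt == target:
        return 2
    half = (len(target) + 1) // 2  # first half of the target word
    if attempt[:half] == target[:half]:
        return 1
    return 0

print(spelling_marks("receive", "receive"))  # 2 - perfect
print(spelling_marks("receve", "receive"))   # 1 - first half "rece" correct
print(spelling_marks("reseev", "receive"))   # 0 - phonetic; marker's call
```

In a classroom context the third case is exactly where the teacher steps in and awards the phonetic mark by hand.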

Squeebles Showcase

Squeebles Spelling - multimodal drill and practice
I'm not usually one to emphasise a particular tool, but from time to time one emerges with affordances that are ridiculous to ignore, and Squeebles Spelling is one of those. Digital tools like Squeebles can transform spelling practice, making traditional equivalents pale in comparison. Consider the following:


Click to see Squeebles in action! 
Kids can 'masquerade' as a parent or teacher to curate their own lists, and careless errors are mitigated by the built-in spell check (obviously this feature is not activated when they are actually practising!). Alternatively, there is a wide range of built-in word lists to choose from, catering to all skill levels.

Multimodality and meaning

It's not enough to spell a word; kids need to know how it sounds and understand its meaning. In Squeebles kids can record the sound of the word, as well as place it in a sentence, eg "Pear. I like the taste of a pear better than an apple. Pear." Better still, make it fun by having the kids make up silly sentences; as long as they show an understanding of the meaning, anything goes! This makes the activity aural and oral: the kids say the word, hear the word, and see the word.

Immediate feedback - differentiated

No need to wait for a teacher to collect in all the spelling tests, then wait a few days to get them all back; even then, actually acting on the spelling errors is a chore, never mind tracking them over time. Squeebles provides immediate feedback, and better still keeps a record of any errors in a collection called 'Tricky Words' that reflects the words the individual is struggling with.


Last and maybe least, Squeebles 'gamifies' success into mini games, so kids feel a tangible sense of reward, over and above the real reward: improved spelling.

05 April 2016

Deliver us from tedious tests and rubrics

via hippoquotes
Assessment drives everything educational. So, not surprisingly, assessment is the biggest factor in planning the use of tech in effective ways. This means it's critical to ensure that we use a varied range of assessment strategies, which is where I find a surprising lack of variety.

Why do so many teachers assume that only rubrics and tests are suitable for assessment? Sure they have their place but only within a suite of assessment strategies...

It feels to me like every educational reference I read or hear about, especially in tech circles, assumes that the only viable option has to be a rubric. I don't mean to denigrate any particular assessment tool. Clearly rubrics and tests can be effective, but when they dominate, they have an unfortunate tendency to diminish the importance and efficacy of all the other tools that are available. It is depressingly common that in virtually any educational context (classroom, conference, online), when the conversation inevitably turns to assessment, the question defaults to 'what rubric or test will we use?' rather than any awareness that there is a plethora of other tools and strategies that could be just as effective, if not more so.

Now I am conscious that I may be overstating my point. I confess I don't hate rubrics; I hate the way they are so often assumed to be the only option worth considering. I loathe the majority I see, which are poorly conceived and poorly written: bloated, verbose attempts at teasing out questionable differences in attainment, many seemingly based on the assumption that just adjusting superlatives is sufficient (well, very well, independently, with assistance...).

Of course I'm not the only one who has a problem with rubrics:

The most famous is probably Alfie Kohn, who speaks to the false sense of objectivity and how rubrics have misled many.

And I really like Joe Bower's take on Rubrics, in 'The Folly of Rubrics and Grades'

"Grades and rubrics are a solution in search of a problem that will further destroy learning for its own sake.  
It’s been five years since I used a rubric. I simply don’t need them, nor do my students.
Rather than spending time asking how can we grade better, we really need to be asking why are we grading. And then we need to stop talking about grading altogether and focus our real efforts on real learning."

Most of the rubrics I've seen could easily be replaced by a continuum; at least then all you would need to do is define the extremes. But, and I guess this is a statement about teaching as a profession, far too many teachers use the term 'rubric' as if it were synonymous with 'assessment tool'.

Rubrics are one of many ways to assess learning, and they are used far too often. Used well, a rubric can be a powerful assessment tool, but in my experience I rarely see them used well, and I often see them used inappropriately. So, yes, they have their place, but only within a suite of assessment strategies...

Here's one way to use a rubric well: make it more student-centred. The teacher defines a central standard (eg a level 3 on a 5-point scale) and then leaves the students to define and justify the level they feel their work sits at in comparison (above, below, or in the middle), with examples.

There are other ways to assess... 

Next time you're assessing, at least consider some alternatives to rubrics. Now, before someone accuses this of being more new-fangled thinking, here's some out of the Ark:

But one of my favourite summaries of assessment strategies and tools, is this grid from the PYP:

Unfortunately the PYP is allergic to the term 'tests' and (somewhat simplistically, in my opinion) assumes that all tests can be summarised as 'checklists', which may explain the otherwise surprising omission of tests from this grid. Still, if more educators made the effort to tick all the elements in the above diagram in one year, everyone would be a winner.

Do less, but do it better.

Now of course it's highly possible that teachers are unaware of the wider range of assessment tools they already use effectively almost every day, from the ad hoc/informal conversations ('conferences' in the jargon) with students, to spirited class debates (not lectures) that utilise skilful Socratic strategies, all of which are in and of themselves valid assessment tools. The problem is that these are seen as somehow inferior to a "proper" test or rubric. All this does is create a lose/lose scenario for the teacher and the student. Rather than focusing on tests and rubrics, wouldn't it be better for everyone if we embraced a much wider toolkit when it comes to assessment, and saw all of these tools as valid and powerful? Maybe that conversation/conference was so effective that adding a rubric or a test is not only unnecessary but possibly even counterproductive?

I think if you asked most teachers why they rely so heavily on rubrics and tests, as opposed to all the other powerful forms of assessment, you would find they point to one sad fact: they feel they need paper with marks on it, something they can attach a grade to and point to as hard evidence of their assessment judgement. While there is clearly a place for this kind of formal (usually summative) judgement, in my experience it is far too frequent and far too common. Teachers could do themselves and their students a favour by focusing on the goal of learning rather than the need to have a hard artefact as evidence of every stage of progress.

What if instead we were to focus on the goal? That is, as long as the assessment tools you use allow you to provide effective individual feedback and enable students to progress in their learning to the point where they are improving compared to their previous level of competence (ipsative assessment), then the goal has been achieved! So why not work a little smarter and use a far more varied range of assessment tools? In so doing you create a classroom environment that is more dynamic, and far more effective for both the teacher and the student.

So what does this have to do with edtech?

From my perspective, a classroom that exploits a wide range of assessment tools is a much richer environment in which to integrate digital tools that can truly enhance and transform the way teachers teach, the way students learn, and the ways students demonstrate the extent to which they have mastered the skills, knowledge and understanding that are truly the point, not just in ways that can be measured quantitatively on another test or rubric. You don't have to look much further than an early childhood classroom to see this in action. Why? One thing these very young students can't do is demonstrate their understanding via tests or rubrics, which opens up a whole range of extremely rich, engaging ways of demonstrating skills, knowledge and understanding that would benefit many considerably older students.

04 April 2016

Kids, Concentration, Boredom, & Tech

Photograph: John Slater/Getty Images

Boredom is not a new problem; it is a condition that has, to a greater or lesser extent, been an aspect of human existence for eons. And yet it seems to me that a pervasive myth is developing: that boredom is the fault of computers, and that students who use computers are students who cannot concentrate. Articles like these are a case in point:

"Technology Changing How Students Learn, Teachers Say"


"Technology Creating a Generation of Distracted Students"

The general gist of the arguments could be summarised thus:

Teachers (from middle and high schools) say today’s digital technologies “do more to distract students than to help them academically.”

"There is a widespread belief among teachers that students’ constant use of digital technology is hampering their attention spans and ability to persevere in the face of challenging tasks, according to two surveys of teachers..."

".. roughly 75 percent of 2,462 teachers surveyed said that the Internet and search engines had a “mostly positive” impact on student research skills. And they said such tools had made students more self-sufficient researchers.

... nearly 90 percent said that digital technologies were creating “an easily distracted generation with short attention spans.”

... of the 685 teachers surveyed in the Common Sense project, 71 percent said they thought technology was hurting attention span “somewhat” or “a lot.”

That said, these same teachers remained somewhat optimistic about digital impact, with 77% saying Internet search tools have had a "mostly positive" impact on their students' work.

Arguments abound, although ones like this strike me as quite strange:

"This could be because search engines and Wikipedia have created an entire generation of students who are used to one-click results and easy-to-Google answers."

Wait. What?

You're saying that if you can get an answer to a question with one click, that is a bad thing? Sure, there will be times when you will have to do a lot more than one click, because you have not been able to get a satisfactory answer to the question. But if I could get a good answer in one click, believe me I would. If anything, access to the treasure trove of information that is the Internet makes it much easier to get a multiplicity of sources, rather than only one, far more easily than I ever could with books. Yes, I said it.

If your students can get the answers to your questions with one click, you're asking the wrong kinds of questions: boring questions. Maybe try asking questions that they can't just google, or that are difficult to google?

So. To the hordes of disgruntled teachers who are so quick to blame technology for short attention spans, I have this to say.

Get better. Get creative.

If your kids are bored, that is because you are boring them; you are allowing them to be bored. Face it, move on, build a bridge, get over it, and use this as impetus to improve. As Dylan Wiliam says, "teaching is the hardest profession because you can always get better at it", and "a complaint is a gift" (although it won't feel like that at the time).

"The cure for boredom is curiosity. There is no cure for curiosity."

(Widely attributed to Dorothy Parker)

"by removing lecture from class time, we can make classrooms more engaging and human." 

"Why Long Lectures Are Ineffective" Salman Khan

It is unfair to blame technology for short attention spans. We (the human race, not just kids) have had short attention spans for many years; it's just that students are now less inclined to put up with it. Certainly the Time magazine article cites research from 1976, well before the advent of digital technology as we know it; I was a (bored) six-year-old.

I know this may come as a huge shock to anyone who knows me, but I have always had a short attention span, and that predated computers by at least a decade... I am not the only one. Chances are many of them are in your class (and are also your students' parents).

In 1996, in a journal called the National Teaching & Learning Forum, two professors from Indiana University — Joan Middendorf and Alan Kalish — described how research on human attention and retention speaks against the value of long lectures. They cited a 1976 study that detailed the ebbs and flows of students’ focus during a typical class period. Breaking the session down minute-by-minute, the study’s authors determined that students needed a three- to five-minute period of settling down, which would be followed by 10 to 18 minutes of optimal focus. Then—no matter how good the teacher or how compelling the subject matter—there would come a lapse. In the vernacular, the students would “lose it.” Attention would eventually return, but in ever briefer pockets, falling “to three- or four-minute [spurts] towards the end of a standard lecture,” according to the report.

Just in case you didn't catch that. Let me just make that a little clearer:

10 to 18 minutes of optimal focus.

That's it.

So, instead of complaining, what we need to do is get creative.

via technorati

Maybe, just maybe, boredom is nature's way of telling you that you need to change.