Thursday 16 July 2020

Read it, note it, (re)do it, teach IT: How to learn effectively

It's no secret that IT is a career in which you need to keep learning. 

It's also a given that many IT professionals aren't allowed the time, space or resources they need to do this at work, so they end up doing it in their "spare time", to the detriment of other things they might otherwise like to be doing. Others do get it at work, but at a low priority, or in a half-hearted way where you still have 8 hours of work you're expected to get done every day - sure, you can spend some time learning, but you still need to deliver those 8 hours of what the business considers "work"! 

So it makes sense that no matter what we do, we ought to maximise the ROI on our "learning time"!

Lifelong learning

I recently talked about how Dunning-Kruger applies to knowledge in IT. Here, we'll be examining how we might most effectively navigate a path from the summit of Mount Stupid, down through the chasm of the Valley of Despair, up the treacherous Slope of Enlightenment, and out across the infinite Plateau of Sustainability - through better learning. 

As "knowledge workers", IT professionals have to get comfortable with the simple fact that they have co-opted a need for lifelong learning and professional development, and a high degree of information literacy. But how can we most effectively employ our limited learning time? 

There is, of course, research in neuroscience around learning - and forgetting. Here, I'm going to focus on circumventing the annoying "forgetting" circuitry that stymies those of us without an eidetic memory. 

Whilst learning styles are a somewhat contentious and mostly disputed model, it is helpful to understand how you prefer to learn (and it may be worth trying a few different styles to see how they work for you and picking the most effective, rather than the most pleasant).

I hate audio and video for most technical topics. People ramble even more when they speak. They do so slowly; speeding the video up distorts the audio in various distracting ways. A lot of people have terrible diction or distracting accents (or terrible taste in background music). It's hard to skip past the irrelevant bits where they can't edit down the waffle (guilty as charged!). The playback speed doesn't intuitively react to my own processing of the words (unlike when I read).

I like written words and (mostly) still pictures - I've spent something approaching 40 years training my brain to use this mode to onboard information (and yes, I had a stage where all I wanted to do was read "fact books", and a stage where I memorised the binomial Latin names of fishes). My wife often comments that I can't possibly learn the way I do - I read a book a couple of times, and that is my "learning technique" (I long for the days of my youth where I only had to read the book once...). I find actual dead-tree books better than screens (Kindle or laptop). 

It's possible there are better ways to learn (or "not forget"), no matter what your current method is. 

Let's examine some. 


The neurological basis of memory

human brain toy
Brain and neuron
By @averey https://unsplash.com/photos/IHfOpAzzjHM

You may already know that memory is encoded in the interconnections of neuronal cells in your brain and the synapses between them, and that laying down a new memory requires novel re-wiring of parts of your brain. What seems to be the case with memory is that the more those particular pathways are re-triggered (you "remember" and use those memories), the stronger they become. Rarely used memories decay. Some experiences never really get the chance to become long-term memories. Really "significant" memories are stronger. For those with a programming background, our memories behave a lot like associative arrays - we link things together with other things, and we most effectively learn and remember when we have several triggers (associations) for a memory, and when new information is linked with something we already know. How often have you said or thought "oh, that reminds me..."? 
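
For the programmers, here's a toy sketch of that idea (pure metaphor, not neuroscience - the cues and the "memory" are invented for illustration): several different triggers all pointing at the same memory, so any one of them is enough to bring the whole thing back.

# A toy metaphor (nothing more): several cues all pointing at the same "memory",
# so that any one of them is enough to retrieve the whole thing.
memory = {"fact": "that useful thing you read somewhere", "context": "why it mattered"}

cues = {
    "the diagram you sketched": memory,
    "the colleague who mentioned it": memory,
    "the problem it solved last time": memory,
}

def recall(cue):
    # Any single cue retrieves the whole memory; no cue, no recall.
    return cues.get(cue, "...nothing comes to mind")

print(recall("the diagram you sketched"))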

As capacious as your brain is for remembering things, there are limits to its capacity, and most people forget most of what they come across. There is also research suggesting that too many memories may be distracting or sub-optimal for quick decision-making. Remember, we're the product of billions of years of snap decisions around whether that thing over there necessitates one of the 4 Fs - Fighting, Fleeing, Feeding and Reproduction. Relatively little fitness has resulted from a lot of sittin' around, cogitatin' for most of evolutionary time, and there is a large body of thought that our abilities in this regard are primarily the equivalent of a peacock's tail - with similar drivers related to sexual selection and the outlandish results that can have. We're wired for what worked for (thousands of generations of) common ancestors, not necessarily for the modern world. Witness phobias, which are almost always around things that exist in the wilderness - there aren't really any people who have a phobia of cars, or of high-voltage electricity - both more dangerous and more frequently encountered than venomous snakes or spiders by most people today. Circling back to the point here - your brain's mechanisms are all based on what worked over thousands of generations, and that was mainly "chuck out most of this coincidental garbage and keep the real gems".

This forgetful filter includes discarding things you might not want to forget, and you have little chance to circumvent that (although my brain seems to prioritise information where I think "hmm, that is interesting" - and where that happens, I will often at a later date recall that thing and that I "read it somewhere"). This garbage collection doesn't pay much attention to "this is work knowledge and therefore inherently important". Sorry! 

Similarly, things we don't regularly use tend to decay. Can you remember all your past childhood home telephone numbers? I can remember some of them, including one from the ages of 6-10, in part because I mentally repeated it so often, in part because I remember the DTMF tones it made, and partly because it is a pattern-y number (464 4550); indeed, when I think of this number, I get the mental sound of both my voice saying the number in a particular cadence and the DTMF tones synchronised with the digits, and even a mental picture of my finger hitting the buttons on the specific handset. How do I know the DTMF tones? Well, I used to dial it from the home phone and listen to the cool sounds... I typed my credit card information into enough web forms that I memorised it. Similarly, I memorised a bank account number by writing it onto a form every month to get paid as a contractor. South Africa's obsession with ID numbers on everything means I remember that, too. Of course, sadly, not everything we need to know is interesting or stimulating, or used super-frequently. 

So there must be ways of hacking this system, right? 

Yes, but there are no short-cuts (much like in uncovering new exploits) - there isn't a script-kiddie-friendly version of Kali for your brain - you're going to have to put in the hours, one way or another! 

It bears stating that all of this is based on developing science and understanding, and on models of learning and memory that may be incomplete, or perhaps even wrong - so see how each of these works for you, and figure out your own personal bag of tricks. And do more research of your own on this, too! 

Repeat, repeat, repeat. Link, link, link. 

I've noticed that repetition, in all its various forms, seems to be pretty key to lasting memory. Research apparently suggests much the same. Things you don't regularly use decay. Things you use regularly are right there to be used. Things somewhere in between are harder to "access". 

Similarly, things that are linked to other topics, feelings, sounds, scents and so on are easier to remember in the first place, and easier to recall or trigger later, along with their associated knowledge. 

So most of the neurological "hacks" are based on subverting the neurological machinery that sorts memory into the "keep" and "discard" piles. At its simplest, this comes down to "things that are used often get remembered" (repetition), and "things that are linked to many other things are probably important" (linking); forming internal models of a concept and contextualising that knowledge (in terms of its rationale and importance to us) also seems to help. 

Be aware that neural plasticity changes with age - kids are literally wired to be learning sponges; older people's brains start to lose this plasticity (the "evolutionary psychology" just-so-story runs along the lines of: Kids must learn! You, old person, survived this long, your current mental tool-set is good enough, change may be worse. Don't change!). 
So yes, kick yourself for not starting this journey younger. And yes, I do find it considerably harder now than I did as a kid (where it was quite literally effortless, so long as I "saw the point of it"), but I'm still quite good at it, I think. Even by university, though, I found learning more effort than I did as a tweenager, and the dumbing-down "lies to children" that we do because "kids can't cope with more than that" strikes me as a missed opportunity. Oh well. 
A sobering, related point: if you're over 40, you are, for a human, extremely old compared to the average lifespan of humans over most of our 100,000 or so year evolutionary history; anything beyond that is a bonus, and, for your genes, somewhat uncharted waters. You've been an adult for a while! 

Repetition

man in black pants under blue metal frame
Repetition!
By @serhatbeyazkaya https://unsplash.com/photos/6OmkdtxJzYE

An absolutely key feature of all effective learning seems to be repetition. Go over it enough times, and information sticks. This means you can expect to spend quite some time to really learn a subject. 

When you read someone else's journey to a certificate, and they say "it took me a year and a half of concerted study" and you think "but I can read the book in a day or two", those are not the same journey at all. The person who's lived with the material for months knows it. They've gone over the material several times. They've played with it. They've probably read several different books on the same subject. The key to professional development (not simply passing exams) is that you need knowledge that sticks. This only comes through time and repetition!
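
If you want to make the repetition deliberate rather than accidental, spaced-repetition schemes (Leitner boxes, flashcard apps and the like) are one well-worn option. Here's a rough Python sketch of the Leitner idea - entirely illustrative, with made-up review intervals:

# Minimal Leitner-style spaced repetition sketch (illustrative only).
# Cards you get right move to a higher box and come around less often;
# cards you get wrong drop back to box 1 and come around again soon.
from datetime import date, timedelta

REVIEW_INTERVALS = {1: 1, 2: 3, 3: 7, 4: 21, 5: 60}  # days per box (arbitrary choices)

class Card:
    def __init__(self, question, answer):
        self.question = question
        self.answer = answer
        self.box = 1
        self.due = date.today()

    def review(self, answered_correctly):
        if answered_correctly:
            self.box = min(self.box + 1, 5)   # promote: reviewed less often
        else:
            self.box = 1                      # demote: back to frequent review
        self.due = date.today() + timedelta(days=REVIEW_INTERVALS[self.box])

cards = [Card("How many layers in the OSI model?", "Seven")]
for card in (c for c in cards if c.due <= date.today()):
    card.review(answered_correctly=True)
    print(card.question, "-> next review on", card.due)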

On making notes

eyeglasses on white notebook
Notes, FTW.
from @grohsfabian https://unsplash.com/photos/GVASc0_Aam0

Good note-taking does several things. Not only does it provide you with a fully customised record of the knowledge in your own voice, but creating that content requires you to engage with the material in a different way - one that involves more senses. It really does seem to help you retain the content. It is, of course, a form of repetition, but a particularly effective one, it seems. 

When I listen to someone, there is a certain element of in one ear, out the other. Take notes while I do that, and the experience is much more transformative. Similarly, receiving a set of someone else's notes, hand-outs or copies of slides, is absolutely not the same thing as making your own notes, and has considerably less value. I'm often tempted to go "yeah, I've got the book though, I can just re-read it". No! Bad learner! Well, even stubborn just-read-the-book-learning me recognises that notes are effective. Similarly, highlighting passages makes the learning more multi-sensory, but actually (hand) writing notes is much more effective (and doesn't deface books). Talking of multi-sensory, remember my example of childhood memories, specifically phone numbers? Not only repetition, but visual (numbers on the keypad), and sound (DTMF tones). 

The least effective approach is if I try to transcribe what someone is saying by typing it out in near real-time - this is a straight ear-to-fingers transfer, and I am absolutely not inwardly digesting the material. For some reason, attempting to hand-write the same material is a bit more effective, but summarised key-point notes are much more effective still. A small study illustrates this quite effectively. A key difference between "traditional" note-taking and the attempted live transcription that seems to happen once you have a keyboard is that there is a greater degree of active processing in taking notes on paper - thinking about the material immediately, highlighting key or important points - and such notes often include the random mental tangents that aren't even part of the original material, but are your reaction to and personal context for it. It is semantically much richer to the individual. All of that makes the subject matter much more "important" and sticky for your memory!

Demonstrate this to yourself. Take a low stakes (i.e. free) course in something where you take notes, and a similar (but not the same, obviously) subject without taking notes. How well do you do in a final test on each subject? How well do you do if you repeat the tests for each six months later? Ideally, you should have some replicate trials, but I suspect one will be enough to demonstrate the point!

Make notes. If you really hate making notes, figure out some other way that forces you to actively transcribe, collate, relate, contextualise and relay the information into different formats yourself - but try to force yourself to take notes anyway. 

How you take notes is entirely your own deal. It doesn't have to be a neat essay - indeed, for most people, a crazy mess of their own "hieroglyphics", arrows, underlining, circles, boxes, relative alignment and other features seems to work better - some notes end up looking a little like mind maps (see the later section). But what works for you is what you should use in the end. Experiment! Long-term legibility may not even be important - it is the active learning process of the note-taking activity itself that seems most important, not that there is a reference to review. Indeed, approaching things you're taking notes on merely as initial guidance or an introductory framework for further research and study from other sources (much like university lectures used to be, and perhaps still are in some courses) is probably a good move if you're trying to achieve mastery.  

Re-certification

Most IT certifications have ceased to be "lifelong" - if they ever really were. Whilst it can be somewhat annoying to have to keep forking over chunks of your hard-earned cash to some soulless corporate behemoth, spending time studying, and subjecting yourself to exam stress, there is an upside. 

Firstly, the curricula evolve, usually for the better, and help you keep up to date with the ceaseless march of technological progress. Secondly, re-doing the same exam helps you to cement the details in your mind. Much like re-watching or re-reading favourite movies or books can bring out new elements, so does re-learning topics; you'll revisit the theoretical with practical experience; you'll make deeper connections; and you'll see new context for how all the bits fit together. I'd certainly encourage people to move up the certification hierarchy, but there is value in re-doing your existing qualification if the higher one isn't for you just yet. And eventually, you can't go higher, so you'll be "stuck" redoing the "expert" level qualification exam. (Tough life, innit?). Well, congratulations on reaching the top, but there's still more to learn, believe it or not... By this stage, you might just reach the joyous circumstance where you feel you still know nothing and are still a bit of a fraud. If you do, congratulations, you've got Impostor's, so harness that energy to get something great done!

Perhaps cynically, part of the reason for the IT learning treadmill is, arguably, vendors' need to introduce new features so they can sell new versions of the product - but you'll still need to refresh the basics, and re-certifying is a useful way of doing that.  


Writing

First of all, I'm not talking about notes. I'm talking about long-form writing in a semi-formal way - making documentation, creating blog posts or even writing book chapters or entire books. This doesn't have to be for public consumption, but you should put effort in as if it were, striving for clarity and insight (and being amazed when you deliver it). 

In the same way I like reading written words, I also like playing with them on paper myself. I find it's a good way to highlight areas that are "grey" in your mind. Setting out to put down a reasonable treatise on a topic tends to show you exactly which bits you're not too certain about, spurring you to discover up-until-then "unknown unknowns" and immediately turn them into "known unknowns" - which of course, being the diligent autodidact you are, will spur you to do the work to turn that topic into a "known known". 

It's not as effective as teaching, because your own sense of what is clear isn't always totally on point (*cough*), but it is a useful technique to help you cement thoughts. 

You can of course share this writing (the internet makes it rather easy to do so). Even if you feel you're "howling into the void", someone may read it, and it may have been your particular angle on the topic that finally made it "click" for them. Outstanding job. 

It also gives you a measurable output or outcome of your learning - one that not only documents your understanding, but contributes meaningfully to it. 

Labs

two brown dogs
Lab every day!
By @wadeaustinellis https://unsplash.com/photos/NdZxzD9QlSY

For practical disciplines, nothing can possibly beat actually doing the discipline. You can theoretically discuss storage subsystems, or hypervisors, or the BGP bestpath selection algorithm until you're blue in the face, but mostly, people care about how that manifests in systems doing what they're supposed to do in practice. So it's pretty intuitively obvious that you're going to want to get down to the "doing" at some point - and that will typically be in some sort of lab environment. Scrounge, borrow or buy spare kit, or virtualised versions of the platform you're trying to master. You can find free examples of some, pay for access to others, or build up your own lab - virtualised on hardware that's become reasonably affordable to many because of the relentless march of Moore's Law. 

We'll cover this in more detail below under Practice, but doing and re-doing lab exercises is inherently "repetition". 

Different treatments of the "same" content

It can be helpful to get subject matter in a variety of forms or from different authors/creators. People have different ways of processing information, so it just might happen that some treatments work better for you. Even in the absence of that, the repeated exposure to the concepts helps you to cement the information and better retain it. There may be higher-value ways of "repeating" knowledge (labs, notes), but this is worth trying. 

Written word not working for you? Try video or audio; I don't find them that helpful, but you might be very different (or, perhaps, suffer from dyslexia, making reading a struggle). 

Here's another way to approach this - do something in a different way. Struggling to learn to program? Get an Arduino and some basic passive electronic components and build some programmable circuits. A lot of people find the concepts click better when they can hold a physical artifact in their hands; it's more "real" than lines of code, somehow. 

When you step back and look across these topics as a whole (and use several of them), you'll see that you're inherently doing different treatments of the same content. This is good for learning, not only because it's repetition and linking, but because the different "modes" of content delivery help you to find meaning in different ways - and that also helps!

Practice

Ever noticed how many successful IT pros are encouraging people to #LabEveryday? There is a reason for that. 

Firstly, learning in production is... unwise. 
Secondly, in a lab you can break things without worrying about the consequences, and see what happens for yourself. 
Thirdly, labbing every day is practice, which is... repetition! 

You're continually reinforcing your knowledge of command-line syntax and commands; what the outputs are like, and, vitally, what "normal" looks like. It's sad that this is mostly expected to happen at home, and assumes a certain level of income so that you can even afford to do so (most of the top platforms are not available for free, although things like vlabs exist, and you can often get a demo version of a software product or cloud service). It obviously makes the most sense to practice on platforms that you are using day to day or aiming to certify in, once you've grasped the basics. For the vendor-neutral basics, of course, anything will do, although you may build up mental models that are strongly influenced by some of the unusual ways some vendors do certain things. You must be amenable to changing and extending those mental models!

For example, Nortel/Avaya and VLANs is... backwards, in that you add ports to VLANs, not VLANs to ports like pretty much everyone else. There is some elegance to this "other" way of doing it, but it stands out as "wrong" because everyone else does it another way - and, well, when I go back to a Nortel/Avaya box (an infrequent event) it has historically taken me a while to remember the "backwards" syntax (confession: I usually ended up googling it). Now that I have the relatively recently discovered mental model "Nortel/Avaya is backwards", it's much easier to remember this fact and way quicker to get back to the right syntax (hey, I suspect if you ever get into this situation yourself, you might even think "wait, Nortel/Avaya VLANing is backwards, I read that somewhere" and get to the answer way quicker - possibly without google). This is also an example of reframing, as well as of linking it to the rest of your mental model of VLANing. You will also find that people who learnt to VLAN on Nortel/Avaya possibly think everyone else is backwards (my first VLANing was on Cisco).

Whether this was an architectural vision ("ports are members of VLANs, so you configure them under VLAN commands" - rather than the alternative "ports are tagged with VLANs, so you configure this under the (switch)port commands"), or simply an attempt to avoid getting sued for "copying" by a vendor with a logo that looks like a famous bridge who everyone knows does it the other way, I'm not sure! I suspect an awful lot of the slight variations in command syntax either come from different conceptual models of how the various IT layers fit together and what belongs to what - or, sadly, probably more often - a keen desire to avoid getting sued for "copying" features. I suspect the trend towards (net)devops, cloud and Infrastructure as Code will lead to further abstractions here quite quickly, and a lot of people will never even realise that different commands are being run on different platforms. Until then (and for advanced users), you're going to have to learn the syntax (or perhaps pay the syn-tax? ;)) for each platform you use; different vendors have different degrees of pain here. 
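
To make the two mental models concrete without quoting anyone's actual CLI (the port names and VLAN numbers below are made up), here's the same tiny network described both ways in a few lines of Python:

# Two mental models of VLAN membership, as plain data structures
# (not any vendor's real CLI - just the concept).

# "VLANs own ports" - you add ports to a VLAN (the Nortel/Avaya-ish view):
vlan_members = {
    10: ["1/1", "1/2"],
    20: ["1/3"],
}

# "Ports carry VLANs" - you tag a port with VLANs (the Cisco-ish view):
port_vlans = {
    "1/1": [10],
    "1/2": [10],
    "1/3": [20],
}

# Both describe exactly the same network; one is simply the inverse of the other.
def invert(vlans_to_ports):
    ports_to_vlans = {}
    for vlan, ports in vlans_to_ports.items():
        for port in ports:
            ports_to_vlans.setdefault(port, []).append(vlan)
    return ports_to_vlans

assert invert(vlan_members) == port_vlans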

If you regularly lab with different vendors (particularly if you use one vendor in your day job, but want to stay current with others), you'll retain that knowledge far better if you can keep "current" across them. It may be helpful to have some clear context switch when you're learning each vendor - something as simple as only doing a single vendor (or platform, where those differ) on any given day, having very different-looking terminal windows or GUIs, or even doing different vendor labs from different positions in the room (or home). Similarly, the more time you spend doing anything, the better you get at it - particularly if you carefully reflect on what you're doing and actively seek improvement. We rarely learn multiple foreign languages in the same classroom or with the same teacher, which helps us to separate out those languages. IT platforms are not much different!

There's another big reason - practical, personal experience is way more significant in cementing learning than abstract knowledge. You remember it far more easily if you type the instructions than if you watch a video of them being typed. The Socratic method of teaching, or variants of it (where you teach by asking questions people have to answer), is quite effective for more advanced groups (my A-level biology teacher taught like this a lot in our tiny 6th form top set of six people, and treated us more like 3rd year undergraduates than high school kids) - you have to engage your brain, and you're not just waiting there like a sack of potatoes for knowledge to be thrown at you to see what sticks. You'll remember tech stuff even better if you keep having to type in the commands without blindly following a "recipe".

Most people expect you to be able to do most of the basic and intermediate commands from memory; the more advanced you are, the more is expected to be "at your fingertips" - which is why you get paid the big bucks. You can achieve more complicated things more quickly, and they will be more robust/anti-fragile. You've figured out what is needed and the best way to provide it, and you already know how to implement it. You'll probably still test it out in some sort of test environment, because that is what professionals do. Similarly, you'll know the system well enough that you can look at it in a troubled state and things jump out at you as "odd", or you know how to get it to spit out the appropriate arcane diagnostics to really dig into the issue.

Labs are great for experiencing the topology diagrams in books, or the actual behaviour (or a close analogue) of systems; for experimenting with new ideas; for doing the "what ifs" without causing P1 outages. All of that "playing" helps you to practice your day-to-day administrative skills, and build a richer personal context for the conceptual learning. It is always worth spending a few days with as close an analogue of the real environment as you can get, testing "what ifs" before rolling out a new architecture. The bigger the change, or the less familiar the gear, the more helpful this is, particularly where there are differences between physical kit and virtualised platforms. Of course, there are many things that are hard to accurately model (full internet scale particularly, and full production loads in many cases). 

Also, there is no substitute for experience - having been there and done it (possibly even having got the t-shirt), you just have a larger mental experience base to apply to problems, so you can suggest better solutions and say things like:  
"yeah, it works in the lab at small scales, but when you scale it up to production environment of X nodes, not so much".  
"Oh yeah, I've seen this before, it usually happens when...".  
"Ooooh, this looks like when X does Y because Z". 
You might also have seen more mixed-vendor interaction behaviours, and hopefully have picked up nuggets of experience across IT, letting you give slightly more holistic answers to bigger problems.

Beware vendor lock-in in your own training (balance this against the possibly huge amounts of effort invested in multiple vendor knowledge). If the only tool you have is a hammer...

Linking

Linking concepts together with others helps you to remember things. It's worth finding ways to surface links you can use to enrich these inter-connections and pathways in your own mind.

Quad at OSU showing paved desire paths. 
from https://alumnigroups.osu.edu/tuscarawas/wp-content/uploads/sites/83/2016/03/oval_aerial_2887.jpg

Wikis

The most immediately obvious interlinked knowledge artifact is a wiki. Wikis work very well for hyperlinking information and related concepts, and I really like them for technical documentation within technical teams, because they're easy to create and expand, and they hint at missing information that needs to be filled in. They're best used as collaborative tools, and you may find value not only in using a wiki, but in extending one your team makes use of.

You'll also find a wiki a useful place to curate tasks that take considerable time to work out the first time, but are done so infrequently that the next time around you'd otherwise have to start from scratch - unless you've documented the process! Writing out clear instructions in there is a good step towards being able to offload tasks to newer or more junior team members, or a good stepping-stone to automation. It's also a low-effort way to build up considerable documentation, and is considerably easier to zip around than a gigantic text document, or worse, a directory full of documents (even if you use search, tables of contents or in-document hyperlinks).

They're less useful for building up study notes, but if that works for you, by all means, use it. Certainly, with their tendency to cluster related topics and make them easy to jump to, they work well with the way my brain tends to explode outwards into related topics - in a way that relies a lot less on footnotes and parenthetical comments, and is probably easier to follow as a result!

Mature teams have good documentation. At-risk teams rely on scattered "institutional knowledge" stored within the crania of particular people. 

Mind maps

In the 1990s, there was a big fad around a chap called Tony Buzan and the concept of "mind maps" (although there were things like this much earlier, and even Buzan started promoting the idea in the early 1970s). One of its big claims is that it's supposed to work a little more like our brains do in sorting, linking and retrieving memory. Not totally outlandish, and you may find the approach useful in unpacking ideas or making graphical notes on topics. It can be a good way of exploring how you link up various concepts. Another tool for you to read about further and try out!

How does <this> relate to <that>? 

A lot of IT is inter-related - it pays to be aware of the broader context and inter-relationship of all the bits involved. Obviously, there is a lot of content to it; the more of it you're aware of, the richer your overall understanding can be. It can be quite useful to note these inter-relationships as a further structure for your thoughts, or to highlight areas you don't know or are fuzzy on. Those inter-linkages will serve you well in two ways: firstly, they help get the knowledge lodged in your brain (linkages help memory!). Secondly, they help when you're troubleshooting, because you have a pre-compiled map of how various things fit together that you can use to more quickly figure out which bit of the infrastructure stack has broken. 
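
As a toy illustration of what such a pre-compiled map might look like (component names entirely invented), here are a few lines of Python that answer "if this breaks, what else breaks?":

# A toy "what depends on what" map (all names invented for illustration).
depends_on = {
    "webapp": ["app-server"],
    "app-server": ["database", "auth-service"],
    "database": ["san-storage"],
    "auth-service": ["database"],
}

def affected_by(component, graph=depends_on):
    # Which things ultimately break if `component` breaks?
    hit = set()
    for item, deps in graph.items():
        if component in deps and item not in hit:
            hit.add(item)
            hit |= affected_by(item, graph)
    return hit

print(affected_by("database"))   # {'app-server', 'auth-service', 'webapp'}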

Also, note how much easier it is to link another instance of an object type to a pre-existing category. My classic example: it's easy to add a new entry to the list of faces called "Dave", but it's very much harder to onboard a new name, perhaps Tarquin. But once you have, subsequent Tarquins become easy to add to the category "Tarquin" (they're symbolic links, in a way!). It may take several meetings before I'll get a new name down (and it's much easier if I see it written down). The stranger the name, the harder it is to remember. See, again, "linking" helps. 

I like drawing diagrams of how things fit together, and annotating them. This might be a habit from doing lots of network diagrams, or doing lots of biology - where annotated diagrams were part of the learning and assessment material. Again, it expresses information in another slightly different format to just words on a page, and does so whilst explicitly making the relationships and linkages clear. 
Protip: Annotated diagrams are phenomenal tools when troubleshooting problems; use a hard-backed A4 size notebook for all your work notes - a habit instilled in my biology lab days that has served me well. Redrawing a diagram is better than photocopying or printing or simply just looking at someone else's diagram. You'll remember the Visio network diagram you drew from scratch way better than the one you printed out from documentation; you'll remember the network you built better than the one you inherited. 

Time spent noticing links and interactions makes seeing them or investigating their potential impacts in future much easier. Wise people have said that if you can't sketch a conceptual diagram of your network on the back of a napkin in a bar, it's too complicated!
 

Mnemonic devices and songs / movements / stories

Have you been to a preschool lately? They are weird spaces, doing learning in weird ways (compared to the staid classrooms of later life). Yet those techniques are really effective on our brains - you probably remember them really well. (Head, shoulders, knees and toes...). 
Recite the alphabet. Does it have a melody attached?  
Counting to 12 inevitably gets accompanied by a 1970s era funk backing track. 
1,2,3,4,5
6,7,8,9,10
11
12. 
Thanks, Sesame Street. 

It might be worth thinking up some ways to harness those modes - repetition, song/melody, specific movements - and attach them to concepts. You might feel like an idiot, but hey, if it works...! And if you come up with an expert-level curriculum topic aid for IT concepts as a preschool-level song and dance, please let me see it. These are excellent examples of how your brain links multi-sensory experiences into rich, durable memories. 

What else do we have in preschool? Story time! Yay! Humans LOVE narrative. In one of the Science of Discworld books (I think number 2), the authors make the point that we should be called "Pan narrans" - the storytelling chimp (rather than the rather grandiose "Homo sapiens" - wise man). I like that, and I think there is some truth to it (even though the rules of taxonomy would make us all Homo, because Homo sapiens was named before any of the Pan chimpanzees, so it is the senior synonym, and thus takes precedence. Why, yes, I did taxonomy and systematics as a former profession). 
They also introduce a fictitious substance, "narrativium" - a mysterious aether that suffuses stories and, where a story needs a thing to be, brings it into being. Stories are much more memorable than simple brute fact. A good story goes a lot further in the human mind than a good fact (witness the spread of urban legends and real "fake news"). A good story has legs, and will run off into the sunset, possibly with your wallet. Stories are how we first built meaning, and how many cultures have conveyed history and knowledge through time immemorial - longer, certainly, than we've been writing things down, let alone formally learning things. Techies love a good tech story. Magic/more magic, anyone? The 500-mile email outage? Turning tech into good stories makes it easy to remember. Good tech stories also make great presentations, and good ways to bring facts together into a memorable whole.
Great teachers and memorable lessons are all too often really good storytellers - and stories. 
A list of facts is instantly forgettable. 
Weave them into a narrative structure, and people might just carry them forever. 

Mnemonic devices are, in a way, tiny stories. Please Do Not Throw Sausage Pizza Away is a useful way to remember the order of the seven OSI network model layers, and hey, I love some pizza, so it's memorable, clearly much more so than the one about All somethingP somethingS somethingT Need Data Processing that lays them out the other way around. I know someone called "Nita", and that makes remembering the four TCP/IP model layers quite easy. NITA. (some people prefer LITA). But it was an odd name and took me a while to remember it!
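
If it helps to see the sausage-pizza mnemonic laid out against the layers explicitly, here's a trivial bit of Python doing exactly that (nothing clever, just the pairing):

# "Please Do Not Throw Sausage Pizza Away" mapped onto OSI layers 1-7.
mnemonic = ["Please", "Do", "Not", "Throw", "Sausage", "Pizza", "Away"]
layers = ["Physical", "Data Link", "Network", "Transport",
          "Session", "Presentation", "Application"]

for number, (word, layer) in enumerate(zip(mnemonic, layers), start=1):
    print(f"Layer {number}: {word:<8} -> {layer}")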

If you don't get presented with useful recall/learning enhancing devices like these, it can be worth making your own up. The process of thinking it up even helps to make it more memorable!

You know what else is weird about preschools? They are free to learn through play. "Kindergarten" is German for "child garden" - it's designed to be a space to learn by playing and experiential learning. Montessori takes a somewhat more structured approach to semi-guided, play-based education, but with a lot of self-determination, and is very much centred around constructivism. 
Quick pedagogical aside: Logo was created as a constructionist (an extension of constructivism) approach to getting children into computer programming (Scratch is a descendant). There's a lot of current work around bringing more of this learning approach into schools; it is very promising, and done in the right way is very effective for inculcating "21st century skills". If you want to learn more about that, check out Invent to Learn.

You know where adults learn IT by playing? Labs! (Maybe in Dev. Never in Prod. OK, fine, you'll learn a LOT when Prod breaks, but goodness, that will involve some sweating...). 

white ethernet switch
Find a lab to play in...
By @thomasjsn https://unsplash.com/photos/qTEj-KMMq_Q

What do you vividly remember about classrooms later in life? The time when something blew up in chemistry lab (which is a lot if your teacher is this outstanding educator), or the time when the physics lab bench caught fire because the teacher used the wrong gauge wire connected up to a car battery and panicked until a lab tech sorted them out with a fire extinguisher... Notice that not only was such an experience multi-sensory and steeped in personal physical experience, but it's often a story, too. Sure, repeated enough times, you'll recall that "Mitochondria are the powerhouse of the cell", but your vivid memories will be actual experiences, not rote learned facts.
Oh, another kind of lab! Hmm, it's almost as if applying theory to practice and doing knowledge things with tangible results in our own hands and in multi-sensory ways is good at helping us learn... See the "Practice" section above if you didn't earlier. 
Rote learning? No thanks - but if you repeat it enough times, it works too (repetition!). 

Chunking

Your mind likes to get information in familiar "bite sized" chunks. Unsurprisingly, this is called "chunking". You will often find that organising things into these particular formats makes them easier to remember, either short or long term. 

A good example is telephone numbers. Many countries chunk ten-digit numbers in patterns like 3-3-4. In those countries, you will find it MUCH easier if you follow this pattern (perhaps assuming you ever lived in the "learn phone numbers" era). For example, 5554644550 is much harder to temporarily store (for instance to go from a directory to typing it in) than 555 464 4550. Francophone countries chunk phone numbers in groups of two digits, but I suspect that's because their number system is madness - 98 is literally "four twenty eighteen". This can cause you significant grief when your grasp of French is tenuous, you forget about the chunking, and neither of you is conversing in your first language. (I had this experience in Madagascar in 2003). Similarly, you probably have a PIN buffer that is 4 digits long. Then you move to a country where PINs are 5 digits long. Then people invent six-digit OTPs, and at first, that is hard, until you develop a six-digit buffer pattern storage area. 

Similarly, IP addresses are chunked and transformed. Remembering 32 binary digits is a pain in the ass. Remembering dotted quads? If you've been in IT for any length of time, it's easy. Now you need to work on your IPv6 buffer. ;) (No, don't, that is what DNS is for, aside from perhaps the prefix that covers your network).  
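
To make the chunking idea concrete, here's a throwaway Python sketch that chunks the ten-digit example from above into 3-3-4, and turns 32 raw bits into the familiar dotted quad (the address itself is just an example):

# Chunking: the same information, repackaged into pieces your buffer can hold.
import ipaddress

raw = "5554644550"                        # ten digits in one lump: hard to hold onto
print(f"{raw[:3]} {raw[3:6]} {raw[6:]}")  # 555 464 4550 - chunked, much easier

raw_bits = 0b11000000101010000000000100101010   # 32 binary digits: good luck
print(ipaddress.IPv4Address(raw_bits))          # 192.168.1.42 - the familiar dotted quad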

Extend these principles to other bits of information you can "chunk"!

Mind Palaces

Some people have success with linking concepts to a mental image of a place or location - Sherlock Holmes' mind palace is a fictional example of the Method of Loci. There was a time before widespread print (or google) when people needed methods to get their brains to store more information than we typically do today, and had fewer options to go back and review information (particularly before the printing press). This may be something you can get some mileage out of for storing information, but beware: it is likely to be one of those avenues that takes a while to pay off. Note that this subverts several bits of your memory apparatus, particularly by linking new knowledge to things you already know.

Beware of humour?

A long time ago, I read somewhere about how humour suppresses the formation of memory - in other words: why is it so hard to remember really hilarious jokes? There is some research around the topic, of course. Most of it notes specifically that it is the unexpected twist in the most effective humour that makes it so hard to remember, because it subverts our mental predictive machinery. Humour is the result of the unexpected, but the rest of the brain machinery goes on the fritz as a side effect. My worry is that by using too much humour in teaching materials, this suppression of learning may go a little further than just making the jokes themselves hard to remember - and extend into the substantive learning content. Also, there is a risk that humour can fall flat or distract from the content if it is overdone or done poorly. Of course, enjoyable writing is always easier to get through. Chris Parker's outstanding NetworkFunTimes aims to blend a love of (stand-up) comedy and network engineering. I've not noticed the odd bit of amusement putting me off my learning game going through his content, so this may be paranoia! Others even note humour can aid memory - but I think that's more the "narrativium" story-as-memory-aid (or meme, in the original Dawkinsian sense) effect than humour being the magic touch. See how humour works for you! 

Strength of stimulus / reward / punishment - high vs low "stakes" learning

If you learn about things like operant conditioning and the effect of various pleasant and unpleasant stimuli on memory, you will find that particularly "memorable" things are paired with particularly "effective" rewards (or punishments). If you consider your own experiences, you can probably remember particularly unpleasant events - as well as some particularly great or pleasant events and experiences - and your memory around these will often be extremely clear and vivid, often undesirably so for unpleasant or disturbing aspects of your past. Middle-of-the-road, mundane "non-events"? Typically pretty fuzzy, or entirely gone. 

You can possibly apply this knowledge to turbo-charging your own learning. I vividly remember lessons that were particularly awesome - but I also learnt fast in classes where the teacher was terrifying (however, I don't remember as much of the content from the terrifying classes as I do from the pleasant ones - but that may have more to do with the subject not being "interesting" to me than a side effect of the teaching style). A quick search around will show you that fear is typically negatively associated with good learning outcomes. Strong stimulus and strong "stakes" prompt strong memory formation - this makes sense from an evolutionary standpoint; if it's high stakes, you're going to want to make sure the resulting (possibly hard-earned) knowledge is going to stick around for later. 

This doesn't mean you need to get yourself a drill sergeant to push you (that is probably counter-productive), or a sports car (or whatever) as a reward after you do well, but it takes only a little reflection to notice "hey, learning in these ways really works for me" and "doing this before or after learning really helps" - and to do more of that. 

A fairly low cost way of having some kind of stake riding on the process is to be accountable to someone else for your learning - perhaps a "study buddy", partner, colleague or friend - with whom you can share your progress and demonstrate how you are (or are not!) progressing along your learning timeline. Accountability and deadlines are closely related - setting a defined date on which you are going to take a particular exam can also help spur you along on your journey - otherwise you may find competing priorities end up in infinite deferral! 

Project/Problem-based learning

In the section on mnemonic devices and related handy memory scaffolding, I briefly mentioned constructivism. Here, I'll focus attention on it again, because through a significant amount of reading I did a number of years ago (when I was a school sysadmin trying to better understand what good education might be, and how tech might fit into that), I came across this paradigm, and it is powerful and extremely amenable to enriching technical content in particular. I will again plug Invent to Learn - I think it is an important book, and that you will find it interesting and applicable to how you engage with young people, mentor or teach, or even discuss learning with teachers. I think you'll also immediately see how you can apply it to how you would build tech skills and understanding yourself. If you're trying to hack your own learning, it helps to understand some theory about teaching, learning and education!

There's at least a whole book's worth of reading for you to do about this, but this style of learning is powerful in part because it's self-directed, and also because you set the target and make your own "meaning". You don't have to do the reading, but trust me, you'll learn a lot better if you set up a technology you're trying to learn and tweak things to see what happens.

One immediately applicable concept is that you're going to learn best by having a concrete project to achieve - not something like "finish reading the textbook", or "get that certification" - but "build something"! 
Learning about networks? Build one! 
Learning about RAID? Build RAID arrays! See the failure modes! Be amazed (if your array supports hot plug...) that you can pull a drive out and it keeps going! (There's a tiny parity sketch after this list if you want a feel for why that works.) 
Learning about hypervisors? Install one! 
Play with the toys. Learn what they're doing. Click all the things. Type all the CLI commands. 
Better still, have a complex project in mind to build. 
Experiment. 
Go wild. 
Have fun, even. 
To the lab with you!
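
Taking the RAID bullet above as an example: the XOR parity trick behind RAID 4/5 fits in a dozen lines of Python. This is a pure toy (made-up "drives", nothing like a real implementation), but "pulling" a drive and rebuilding it yourself makes the idea stick far better than reading about it:

# A toy RAID-ish XOR parity demo (pure illustration, not a real driver).
# XOR parity lets you rebuild any single "drive" you lose.
drive_a = b"hello world, I am"
drive_b = b" some data spread "
drive_c = b"across two drives"

# Pad to equal length, then compute the parity "drive".
size = max(len(drive_a), len(drive_b), len(drive_c))
a, b, c = (d.ljust(size, b"\0") for d in (drive_a, drive_b, drive_c))
parity = bytes(x ^ y ^ z for x, y, z in zip(a, b, c))

# "Pull" drive_b out of the array, then rebuild it from the survivors.
rebuilt_b = bytes(x ^ z ^ p for x, z, p in zip(a, c, parity))
assert rebuilt_b == b
print(rebuilt_b.rstrip(b"\0"))   # b' some data spread '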

This is also why you learn a hell of a lot from solving real world problems with your infrastructure - it's undercover project based learning, and it's probably "high stakes" to really get those neurons firing!

Other people's topologies and lab exercises are always a little less effective than those that have real meaning to you. If you need a prompt, imagine rebuilding some part of your business infrastructure - or a "better replacement" based on the tech you're learning, and design and implement around that. 

Teaching

two men watching on silver MacBook
Teach someone to cement your own knowledge!
By @josealjovin https://unsplash.com/photos/JZMdGltAHMo

I don't think that people who "just get" a topic necessarily make very good teachers. I think those who struggled to get it are ultimately more effective teachers of that topic, both because they can empathise with those grappling with it for the first time, and because they probably have several different ways of mentally approaching it, which can help bring a topic to life in meaningful ways for everyone in the class. Have you ever sat in a classroom with a teacher who looks at you with the "but how can you not just see this thing" expression, exasperated as to how they can possibly explain it any way other than "it just is that way and you must learn it"? (I was sometimes that kid in Maths class, but teacher's pet in others - and I've met way too many Maths teachers in particular who can't dynamically approach a teaching problem!). The problem with maths is it's too absolute and un-fuzzy, so it's hard for people who are good at it to approach it from any other angle.

Similarly, if you've only ever interacted with very intelligent or (over)educated people, it can be a rude awakening to sit in a classroom with "normal" or "below normal" people - which is going to be most of the working world, and customers - so get used to it, and learn how to relate. Teachers love the top-set classes - but the true test of pedagogical mastery is the bottom set! Teaching the top set is fun. Teaching the bottom set is an achievement. You've really got it when you can teach anyone. Learning is good. Teaching is better. 

I really understood subnetting after teaching it to high school kids. 
I learnt a lot about defense in depth when I... taught it to high school kids. 
I cemented my elementary networking knowledge when I literally wrote a book about it... for high school kids. 
I went on to further amplify all of these things mentoring adult team members too, but the crucible of getting this right was with teenagers - the most terrifying of all audiences (outside of a maxsec prison, anyway). 

For these reasons, I think that teaching and mentorship are excellent ways of cementing knowledge. Firstly, if you're doing it right, you're going to scaffold the introduction of technical topics in a way that builds up from a solid foundation, and helps people link topics together in tangible ways. 
Secondly, you're going to make damn sure you know the topic before standing in front of other people. 
Thirdly, implicit in all of this is a stage of repetition, and often, of reframing knowledge, both of which revitalise your mental machinery - stomping down old pathways, and forging new links and ways of seeing things. 
Fourthly, when you come across people not "getting" it, you're going to be forced to approach topics from new angles and unique perspectives that you alone cannot possibly dream up. 
Fifthly, you're going to work out "hooks" to keep it interesting and people engaged - and that works on you just as much as them. 
And yes, doing this well takes a lot of time. 
As an added bonus, you improve other people, which is good and worthy in and of itself. 

What are some effective study methods from this "teaching" perspective? 

  • Well, the obvious route is to "teach" people you lead and mentor. 
  • Presenting tech talks is a thinly veiled teaching exercise!
  • Once you get advanced enough, you may actually want to try more formal teaching of those subjects in appropriate places - either like school, or hopefully, more like university lectures. 
  • You can create study groups in which you take turns "teaching" each other topics - it's pretty easy to find groups of people online at similar stages in their journey to you - or you can mentor people earlier in the journey, but be careful not to overstate your expertise or unintentionally mislead or misinform. This works quite well if you're in a larger team and you get the more junior members to work on this stuff, with a more experienced guide to make sure things don't go too far off the rails.
  • You can extend this study group concept into something similar to an academic "paper club" or "reading group" - take an RFC, presentation, white paper, current topic or some other useful bit of knowledge, present it ahead of time, and open up discussion within your peer group. This is obviously easier where members of the team have met this format before - it's daunting when they haven't. 
  • You can write informative articles on the topic, although the feedback loop here is poor compared with a "live studio audience" of faces staring at you! 
  • I've even heard of people teaching technical topics to their pets - but that is a slightly less useful, if at times rather enthusiastic, audience. 
All of these are like teaching, and all of them are worth adding to your basket of tricks. The more like formal teaching it is, the more valuable I think it is - both as an activity in its own right, and in improving your own mastery. Building up a syllabus or curriculum, delivering units of knowledge and practical experience, and assessing learning in some way (formative and summative) are all useful. 
By all means make use of things like "flipped classroom" approaches, or treat your direct contact sessions more like lectures (introductory guidance meant to stimulate independent further research, thought and discussion outside of class) rather than school lessons (where the content delivered is the content expected to be regurgitated). Or even like Oxbridge tutorials, which tend to be very much about group dialogue, co-discovery and shared construction of meaning through discussion, debate and the meeting of several people's independent research. Doing so may help create a culture of enhanced information literacy and independent, critical thought, and that is a good thing! Many of my teachers liked answering questions with more questions - initially frustrating, but it caused you to switch your brain into further thought and independent research. Thanks, "difficult" teachers! 

Finally, remember that a lot of IT support is actually "just in time" teaching - providing exactly the right prompt to a customer, end user, or colleague to solve a particular problem they are having. Try to resist "doing it for them", unless there is some sound reason to just do it yourself. 

Reflect on your own "learning styles"

bamboo raft
Reflect on your learning!
By @joshuaearle https://unsplash.com/photos/EqztQX9btrE

As I mentioned earlier, there are schools of thought that different people learn better in different ways - in terms of how information is supplied to your brain. This also extends to the ways of enhancing whatever delivery method you choose, as covered above. Try to critically reflect on which methods actually work best for you. 

I find it hard to believe there are those who learn technical theoretical content best from YouTube videos. Sure, it's great for practical skills that have intricate motions (perhaps uncharitably: monkey see, monkey do!) - but most technical IT content is not like that (with the obvious exceptions like terminating and dressing cables). I suspect a lot of people like it because it is more entertaining (to them) rather than educating. "Edutainment", if you like. How much of that added entertainment enriches the content, and how much detracts? Are you getting more education, or more entertainment out of it? I also really dislike podcasts. I am all about books and blog posts - written works; static, labelled pictures you can pore over and consider at your own pace. You might be very different from me and vehemently disagree with my assessment of this - and that is fine! But do spend a little time trying to find some "objective truth" about how well you learn through different methods or media, and stick with the ones that work best for you - if something isn't available in your preferred medium, make it so, and share that content, if appropriate. The measure isn't how easily you can gloss over the content or how "pleasant" the process is, but how well it sticks (months later). I am also cognisant that reading has been my dominant information onboarding method, so maybe I'm biased, but I don't think so - it really is superior.  

I also know I have developed an ability to "cram" a lot of knowledge into a temporary mental buffer space. Some of it sticks, some of it does not - it's been a great way of getting more points in exams throughout my life, but in some ways, it is cheating yourself in the long term! As I know I have this ability, I have a choice to use it - or not. Using it is harmful in the long term unless I then make it more "honest" by cementing that short-term gain in various ways.

What are your learning goals? As a professional, it's not to display that you can pass the certification exam - it's to deploy that level of skill, insight and knowledge day-to-day in the real world. The certification is a validation of your learning, not the target! Don't cram-learn (and definitely don't use "dumps"). Don't seek the minimum required content ("is this going to be on the test?") - you're trying to be a professional, not just get grades with the minimum effort possible. Realise you might (you, in fact, absolutely should) circle back to "old" topics to refresh your understanding, build it to a deeper (or higher?) level, and link it to more topics and greater understanding. I guess this is why the advice on good CVs is to write about achievements, not simply list responsibilities and qualifications - show how you apply this stuff in the real world. The sysadmin code of ethics specifically mentions education: "I will continue to update and enhance my technical knowledge and other work-related skills. I will share my knowledge and experience with others." 


Campaign for Change!

So the treadmill is never going to slow down, and you can't get off. What can you do to improve your quality of life, other than optimising training for yourself in your "free" time? 

Well, the more people in the industry make training and development a "benefit" they seek, or transform regular training into something that makes obvious business sense, the more likely we are to have this stuff mainstreamed into our working lives - freeing up free time for more enjoyable pursuits (hey, no judgement, labbing is fun too, but you need to get out of the house and spend time with loved ones too - wear a mask). 

Imagine: supportive managers who campaign to get us that training, or some free hours on the job, or who make kit available for labbing. Who note the need for paid study leave - and give it to us. A C-suite that recognises training their staff gives them a competitive advantage in both business as usual and recruitment. If you are a technical manager, you have some sway to get at least some of this done, either quietly or through low-grade guerrilla warfare!

Make business cases - how much does it cost to hire an implementation consultant? How much does it cost to train staff to an equivalent level? (There are, of course, times when it's not a question of money but of time - training someone always takes longer than using an already-trained resource, which you will absolutely see as you start training and mentoring people yourself.) Once trained (and retained, and sustained) you keep on having that resource - and may even be able to further monetise it. 
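
To make that concrete with purely illustrative, made-up numbers: an implementation consultant at, say, £700 a day for a 20-day engagement is £14,000, and the knowledge walks out of the door with them at the end of it. A £2,500 course plus a week of an engineer's time will usually come in at a fraction of that - and the skill stays in the building for the next project, and the one after that. 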

Around 15 years ago, I was in an organisation that was thinking about how we could build capacity across various technical (scientific) fields in lower and middle income countries (across the eastern coastline of Africa and its Indian Ocean islands). 

It came down to 4 key things: 

Attract: You need to get people into your organisation and field.  

Train: You need to equip them with the right skills. 

Retain: You need to make sure they don't switch careers (to another field or another organisation) - long is the list of qualified PhDs who end up working for banks or other choice employers in the developing world instead of research or conservation, or who move to other countries to use their skills. Money talks - but so does professional satisfaction, once a certain level of Maslow's hierarchy of needs is met. Think about how this relates to your own feelings about where you work - and, should you lead or mentor others, how it might influence them. 

Sustain: You need to keep them happy, keep challenging work coming, and ensure the resources needed to support that work are available - and develop healthy institutions to keep the ball rolling long term. It is depressing to go overseas for a prestigious PhD programme and return home to dysfunction and nothing to work with - aside from low pay, this lack of career prospects and support was the leading cause of the haemorrhaging of talented, trained people. Science, much like tech, often needs shiny and expensive toys to progress (physicists and astronomers take this to particularly impressive heights). How can you sustain skilled professionals in your field? What do you need to feel "sustained and supported"? What do your team members need? 

This clearly required a highly integrated and holistic (if you'll forgive the phrase) approach. It helped to improve entire education systems, to grant scholarships, and to provide career-long access to research platforms, facilities, funding and opportunities. It was bold. It was visionary. It was very difficult to sell to politicians. However, it will be easier to do this at a rather smaller scale - your own organisation. How can you achieve those 4 key pillars in your own efforts to strengthen your team? 

Learning is Work. 

To some extent, I think our professional attitude and personal thirst for knowledge do us a disservice, in that workplaces can take advantage of it, pushing more and more learning out of work hours and into time we should spend resting, recharging and relating to other people. 

In the same way that "glue" is work, learning is work, so work should support it! There's been a huge trend in modern workplaces to expect staff to arrive with, or gain for themselves, the knowledge, skills and experience they need; I recall my dad being sent off on training courses for all sorts of things, which I can't see much evidence of happening in the modern workplace. I can't recall ever being sent on technical training - the only "training" I've had has been around workplace rules and safety, or basic "how to use this product". So we need to make sure it is understood that time we take outside the working day to do this is a luxury or an added bonus, not an expectation. We need to make sure this message is heard, loud and clear. 

Balanced against that - there are limits to what you can expect, particularly if what you want to learn isn't quite in the company's interest. Mostly, unfortunately, they want you to stay put where you are. If you want to progress, you may have to support your own development. But if they expect you to gain additional skills, or deploy new technology, part of the overall project should be educating the workforce involved in those changes. It always used to be in other industries!

So... get learning recognised as work!
  • Discuss this during performance management processes. 
  • Reframe the conversation: Learning IS Work
  • Learning and practice lead to innovation.
  • Request a library of technical books.
  • Put it in the "suggestions" box. 
  • Bring this up in interviews. 
  • Write about it. 
  • Talk about it. 
  • Demand it. 
It is all too common to have a "professional development" section in performance management which is entirely ignored - this is a short-sighted practice, and we should demand progress and protected time for this activity as stringently as we do for any other KPI. Arguably, this should go to the point that you could excel in every other area of the review and its targets, and yet still not achieve "satisfactory" overall if you didn't meet the learning goals. People achieve what they are rewarded for achieving; they do what is demonstrated to be the expectation - they do what you measure! 

To some extent, learning-as-work will help with workplace diversity and inclusion. If work develops people and allows a healthy work-life balance, it makes careers accessible to people from lower-income groups, and it allows more scope for families. This simple action - treating learning and staff development as important work, and learning outcomes as work product - should help foster diversity and gender parity in the workplace, and keep your tech workforce current in the face of the interminable gale-force winds of change. 

How Long? 

Many people want some measure of how long learning anything is going to take. The answer is "somewhere between 20 and 10,000 hours". After Josh Kaufman's 20 hours, spent the right way, you will be, I suspect, right at the summit of Mount Stupid - but, and here's the important bit, you'll have enough knowledge and skill to really get into practicing and experimenting, and a foundation to launch further investigation from. Twenty hours isn't a lot of time (certainly not when compared with ten thousand!), and you're by no means going to be an expert in anything (that's more 10,000-hours territory) - BUT you'll have enough to really get stuck into a topic, lab things, and start applying them in the real world. 

One key point Kaufman makes is that for practical skills in particular (programming, CLI commands, building things, troubleshooting, etc.), reading about them isn't as helpful as actively doing them, at least once you've grasped the fundamentals. Much of IT consists of skills with a knowledge component: you need to know what a command might be, but you need the skill to use those commands in the right places and at the right times, and to see and experience how the theoretical knowledge behaves in the real world. 

The 4 key points in that video might also form a useful scaffold for your own initial learning and progress, but as always, experiment a little with your learning and do what really works for you. This work is often done for you, particularly step 1: the exam topics, syllabus or curriculum are basically the broken-out key skills and knowledge you need to gain. I'll reiterate that most of IT is actually practical and skill-based, and less academic and cerebral - look at the way most people code: they literally hack it. They throw ideas at a problem and see what sticks, and when it's good enough, they move on. Just as learning the four chords from the "Axis of Awesome" lets you blag your way to seeming vaguely reasonable at music, there are many similar tools that will take you quite far in IT - basic building blocks you cobble together in various ad hoc recipes to solve the challenges you meet. 

Learning more arcane theory (Big O notation, the CAP theorem, or more obscure algorithms and functions, for example) can be useful further down the line (and it is useful to know that stuff exists, even if you don't yet know it). At the end of the day, though, you probably want to go beyond "basic apparent competence" and move some distance towards the "mastery" end of the scale - and that is definitely going to take more time. It is often worth curating a list of things you want to learn more about later - things that would be procrastination to chase now. Put it at the back of your hardback notebook, or in a "stuff to learn" text file. Start with the basics!
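
As a tiny, purely illustrative Python sketch of what that sort of theory eventually buys you - membership tests against a list are O(n), against a set roughly O(1), and the difference is invisible on small data but decisive at scale:

import timeit
# Membership testing: a list is O(n), a set is (roughly) O(1).
n = 1_000_000
as_list = list(range(n))
as_set = set(as_list)
needle = n - 1  # worst case for the list: the element is at the end
list_time = timeit.timeit(lambda: needle in as_list, number=100)
set_time = timeit.timeit(lambda: needle in as_set, number=100)
print(f"list lookup: {list_time:.4f}s for 100 checks")
print(f"set lookup:  {set_time:.6f}s for 100 checks")

You don't need that on day one - but when something that was instant in the lab grinds to a halt on production-sized data, it's exactly the kind of thing you'll be glad you put on the "stuff to learn" list. 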

Commit 20 hours to learning something (not an entire professional field, but some important or intriguing part of it), with most of that time spent playing with the actual tools, and see how far you get - likely, far enough that your pleasure in getting somewhere is a useful ego-boost that will spur you on to do more - and you'll have a usable building-block you can reuse and recycle. And then wend your way down the 10,000+ hour path of expertise/mastery. 

Kaufman's 4 key 20 hour skill-building steps, after you've decided what you want to learn, are: 
  1. Deconstruct the skill
    1. Break apart a career skill-set into components. Each of those is a skill. Each protocol is (at least) a skill. 
    2. Yeah, that might be quite a lot of 20 hour chunks of time for a career. You can eat an elephant - one meal at a time!
  2. Learn enough to self-correct
    1. You can go overboard with acquiring resources to learn from, and this can be a form of procrastination.
    2. Most resources will have practice examples. DO THEM! Do not just read about them or watch the video. 
    3. You need enough understanding to get going - and therefore to notice things that are mistakes or errors. This is particularly easy with something that is a straight physical skill (playing a chord incorrectly gives you instant feedback); in tech, run the command sooner and check you get what you expect - and nothing else (see the practice sketch just below). 
  3. Remove practice barriers
    1. Set up a lab!
    2. Use the lab!
  4. Set aside time (Practice at least 20 hours)
    1. Put set times in your calendar and guard them jealously. 
    2. Swap out (within reason) "empty" activities for enriching yourself. 
      1. Doomscrolling is fun for a while, and you get the occasional nugget, but you'd have learnt more by doing for that hour. 
    3. More concentrated learning periods tend to work better - scattered bits of learning work less well, because if sessions are short and infrequent, you spend a lot of each one "getting back to where you stopped". 20 hours in 5-minute chunks is rather different in effect from 20 hours in 2-hour (or longer) chunks - remembering to take the odd break!
Bear in mind Kaufman isn't looking for "expert" mastery, or to develop a career - just enough competence for personal pleasure. But I think it's important to see that you have a real hope of learning enough to start doing or understanding even very complex things in very attainable stretches of time.
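
As a concrete, entirely hypothetical example of steps 2 and 3 in code - the practice sketch mentioned above: if you were learning Python list slicing, you can build the self-correction into the practice file itself. Write your predictions as assertions, run it, and the interpreter corrects you immediately.

# A minimal self-correcting practice drill: predict, run, get corrected.
data = [0, 1, 2, 3, 4, 5]
assert data[1:3] == [1, 2], "slicing stops *before* the end index"
assert data[::2] == [0, 2, 4], "a step of 2 takes every other element"
assert data[::-1] == [5, 4, 3, 2, 1, 0], "a negative step reverses"
assert data[-2:] == [4, 5], "a negative start counts from the end"
print("All predictions correct - on to the next drill.")

If it runs through to the final print, your mental model was right; an AssertionError points at exactly the prediction that was wrong.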

You'll probably recall, if you've read at all about DevOps or "Lean" manufacturing, the fundamental importance of rapid feedback loops to optimised, efficient processes. The same is true of education: ongoing, quick feedback is great for learning; delayed feedback is not (that essay you get comments on two months after submission? Useless!). Contrast formative and summative assessment. You rarely learn anything from a summative assessment (other than where you lack knowledge - think exams); formative assessment can actually help you learn (end-of-chapter quizzes; learning platforms that give you immediate feedback of various sorts; pop quizzes; rapidly returned work; commands that immediately give you verbose errors; tailing logs). 

IT is quite good at instant feedback, in many instances: things work or they don't - use this to the benefit of your learning. "Book learning" has delayed feedback loops (it only gets battle-tested some time later). Move to knowledge implementation, rather than knowledge acquisition, as soon as you can. We are actually quite bad at doing formative assessment in near-realtime in formal education - this is an area where some startup will eventually make a shedload of money (or, by giving it away, perhaps change the world) by using machine learning or similar to provide adaptive difficulty and near-instantaneous feedback on ongoing assessments - a personal 1:1 trainer is ALWAYS more effective than a lecturer teaching 1,000 students. The tighter the feedback loop, the more powerful it is (I started scrawling notes on this whilst reading DevOps books, because the parallels to education were obvious to me). That's also the point at which e-learning platforms will really take off and live up to the hype. 
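
For a sense of what a tight loop can look like in practice - a rough sketch only; real test runners and file watchers do this far better - here are a few lines of Python that re-run whatever file you're practicing in every time you save it:

import subprocess
import sys
import time
from pathlib import Path
# Re-run a practice file on every save (hypothetical filename as the default).
target = Path(sys.argv[1] if len(sys.argv) > 1 else "practice.py")
last_seen = 0.0
while True:
    mtime = target.stat().st_mtime if target.exists() else 0.0
    if mtime and mtime != last_seen:
        last_seen = mtime
        print(f"--- running {target} ---")
        subprocess.run([sys.executable, str(target)])
    time.sleep(1)  # poll once a second

Save, glance, fix, save again - the same principle as verbose errors and tailed logs, applied deliberately to your own learning. 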

I have consistently found "just in time" learning to be the most effective for individual skills. I have a problem, I need to figure it out, and I've never used this technology before. Read some things, test some ideas, work it out. That knowledge is really "sticky". 

But - a word of warning - too much just-in-time learning may leave you with a lot of gaps. If you've gone quite far without a more formal curriculum (i.e. learning only by "crisis" or "immediate project need"), you may benefit from taking a step back, perhaps even going "back to basics", reviewing the much larger, systemic knowledge pool a more structured learning programme would give you, and filling your patchy knowledge up to a more level whole. But note the obvious parallel: the sooner you apply "book learning" to actual problems and gear (virtual or otherwise), the more quickly you will learn, the more embedded it will be in your mind, and, likely, the more flexibly you'll be able to apply it - and it will often help you understand the technology and the underlying reasoning, too. There is a big difference between being able to recite correct answers and actually understanding what they mean, and why they are correct.

Continued practice, whether at your day job or in a lab of hypothetical scenarios, keeps refreshing that learning. Remember how you "got" trigonometry at school decades ago, and there's something about SOHCAHTOA? Can you actually still do trig, or have you forgotten? If you're a natural maths genius or still do trig regularly, ignore this - but realise that ordinary mortals forget this stuff very quickly without regular repetition and practice. Much knowledge and skill is like this: regular practice and reinforcement is vital. 

I strongly suspect that changing from high-stakes summative assessment at the end of several years of learning to regular summative assessment in short "modules" means people actually know less after a period of time. If you only have to learn a term's worth of information, you can more or less cram it into short-term memory and do quite well in an exam on it; several years' worth of information does not fit in there, so you have to have the larger knowledge area truly mastered. Importantly, short-term memory decays fast - so there is quite a lot that I know I once knew, but no longer really know, either because I crammed it, or because I've never needed or used it again and my brain has discarded it - or because it was never even a candidate for long-term storage in the first place. However, it definitely leaves "echoes" that allow me to reacquire working knowledge, or rapidly find the method, with a lot less effort. I can't remember the full details of pKa, but I know it exists, the rough "shape" of the idea it expresses, and its applications, and I can therefore quickly find what I need to work out how to make a physiological buffer of a particular pH (and, no, I haven't needed to make a buffer in about 20 years). Of course, it could be argued that all I need to be able to do is google "pH 7.3 buffer solution" and the recipe is probably out there. 

Similarly, I often have context-responsive knowledge - I need the visual reminder of the menu navigation options for a program function, or the CLI prompt, before I can walk someone else through a process (this is a sign of intermediate or infrequently refreshed knowledge, I suspect - you know it, but not well enough to do it "in your sleep"). "I know it when I see it" applies to some forms of knowledge as much as it does to its more infamous connotation. 

Gut Feel

Finally, a word on tacit knowledge, which you may have seen experts use all the time. This is the sort of "spooky" knowledge or "gut feel" that mastery seems to produce; you know what the problem or solution is, but you may have a really hard time verbalising why that is the case. The only way you acquire this is practice and deep familiarity with things. You can see it "looks wrong", or it "feels like category problem X", or "the answer is probably Y". Once you find you get these intuitions regularly - and they are mostly correct - well, congratulations, you're probably way down the road to expertise. This is the result of your brain's pattern-matching circuitry working on possibly thousands or millions of examples of things that were "right" and things that were "wrong", and an internally constructed model or simulation of those systems against which observations are run, and providing an answer for you - but you'd struggle to reduce that to a learning programme that was anything other than "spend 20 years doing this stuff, and you'll also get it". 

So go start getting that raw data stuck in your brain - learn by doing, in practice, and build those mental models! Some people who "just get" certain domains of knowledge also find that much of the entire field is effectively "tacit knowledge" - it just is that way - and they will struggle to teach or mentor anyone who is not also like that. I think this is why I've often found maths teachers to be particularly bad (yeah, sorry, I am bringing this up again) - I don't naturally "just get" mathematical concepts; I have to work at them. Whereas some mathematicians, for whom it seems to "just click", end up as teachers - and seem to struggle to verbalise, and therefore to teach, that same content to others. The whole of maths, to them, is perhaps tacit knowledge. Many (but not all) people who display amazing mastery of things are not great teachers or mentors, but they can certainly be inspirational - or, more likely, aspirational - goals! Whether you are looking for someone to help you along, or are considering teaching, coaching or mentoring someone else, realise that teaching, coaching and mentoring are themselves skills that need practice, and be aware that those who "just get" a topic may find it harder to relate to those who don't. It is surprising how little effort we are expected to expend on learning how to master leading, managing, mentoring or teaching people in the workplace.

If you do end up around people who know amazing things but can't really explain why or how, figure out how to leverage that anyway - just watching them work can help you understand things. Figure out how to do that unobtrusively, without them feeling like a bug under a microscope! Sometimes you can coax answers out of them by asking the right questions (at the right time - the middle of a P1 outage is almost certainly not it). You might also just need to write down a whole bunch of new words and concepts and google them later! Their war stories will probably help your brain develop its own spooky pattern-recognition circuitry, too. 

In Summary

So this is, for a blog post, quite long and information-dense. Apparently, it helps to summarise the key points at the end, so...
  • Your brain is amazing at remembering things, but it protects itself by forgetting things, too.
  • You don't get much choice in this.
  • You can influence it primarily in two ways to make your learning more effective: 
    • Contextualisation and linking new knowledge/ideas to other existing memories, emotions and understanding.
    • Repetition is absolutely key.
  • Use different expressions of knowledge:
    • Figure out which modes of learning work best for you; try different sources (content creators) and styles (written, audio, video, etc.), because a different presentation, or a different way of seeing the same topic, may help you build your own meaning.
    • Take handwritten notes.
    • Lab. #LabEveryday
    • Teaching is a form of practice: because teaching often involves explaining things in several ways so people "get it", it enhances your own mastery and detailed understanding of concepts - you also have to contextualise, clarify and distill the concepts in order to teach, which, again, is good for really knowing them.
    • "Practical" learning is best - enact concepts you learn in physical and tangible ways.
      • If you teach, use this!
      • Essentially, this is blended learning - theoretical knowledge plus virtualised practice with immediate formative assessment and feedback.
  • Practice implementing knowledge as soon as you can after gaining it; link theoretical concepts to their practical implementation. If you have a fantastic or stimulating project idea, write it down in the moment and implement it as soon as you can. Don't start a new chapter, section or topic until you've done at least something "real" with the last one. 
    • Shorten and enhance feedback loops in your learning. 
  • Use project-based learning - have a defined target or task in mind, figure out what you need to know to get there, get that knowledge, and practically apply that knowledge; invent to learn. Whilst gaining a cert or a job (or a job title) may be goals, they are not projects - build something as a project. 
  • If you are in any position to influence culture in your organisation or team, try to get learning-as-work normalised and widely practiced. If nothing else, freeing up some space to just think and play often leads to breakthroughs, but it's also a powerful way of ensuring your organisation as a whole really becomes a "learning organisation". 

Further reading

blue wooden door surrounded by book covered wall
Need Moar Books!
By @eugi1492 https://unsplash.com/photos/6ywyo2qtaZ8


This post was mainly inspired by thoughts triggered by: 
