Introduction

In an increasingly technological, scientific, structured and logical world, it’s not uncommon to hear from business managers, CTOs and other team managers that their biggest concerns lie not in technology itself but in people-focused tasks such as managing, motivating, and training their teams. This concern stems from the undeniable fact that team building and training play a decisive role in the efficiency and overall operational performance of any company. Consequently, creating and designing effective, repeatable processes, and getting the best results out of them, remains a challenge of great interest to us all.

At Growin, when we decided to create a Nearshore Technology Center focused primarily on building and deploying Scala Software Engineering teams, we faced a major challenge: Scala is hard… In fact, while it is recognized as one of the most powerful and productive languages on the market, it is also seen as one of the programming languages with the steepest learning curve. So when we set the goal of efficiently training people in a demanding programming language such as Scala, we knew we would have to focus on creating an efficient, repeatable process which would allow (and has since allowed) us to quickly train juniors and/or transition people from other programming languages to Scala. We named this process the “Scala Academy.”

In this article, I’ll guide you through our thought process, as well as the research and development we did to create, improve, and continuously fine-tune the Academy. I’ll also explain some of our conclusions and resulting guidelines that can also be used in the creation of other training programs or other types of processes.


 

Are we “teaching” or are we “training”?

This question is not frequently asked. Even so, it serves as a great starting point for our conversation, and it is certainly interesting to know the difference between these two approaches, both of which serve the purpose of instructing, guiding, and mentoring someone along a learning process. While answering the question, we’ll also define the “group” where our Academy sits and with which our goals align.

To start, what often leads someone to actually ask the question is the fact that the definitions and applicability of the verbs “teach” and “train” are regularly intertwined, sometimes confused, and even often seen as complementary in multiple areas and contexts. When researching the topic, the most common consensus is that “teaching” can be seen as the transfer of extensive knowledge on a certain topic, while “training” can be seen more as a process of instructing or helping someone to acquire a new skill through practice and/or repetition of a certain task. Our Academy is based mainly on guided knowledge acquisition through examples, hands-on practice and, sometimes, trial and error. Thus, the Academy process is clearly aligned with the “training” approach just described.

 

We knew what we wanted, but how did we get there?

When we started our journey of creating a training process to develop our trainees, we had a clear set of ideas and high-level goals. However, much like with every other “idea,” there came a moment when we had to turn it into a usable process, so we decided to invest time and effort into carefully designing it. To do so, we chose to apply a “Circular Design Process,” which by definition encompasses the notion of circular iteration.

The design process has essentially 3 phases:

  1. The “think it” phase, where what we’re trying to build is described in full detail, the multiple goals we’re trying to achieve are defined and drilled down, and the tools or ways of measuring our success are identified.
  2. The “build it” phase, where we actually compose or build the product or process we’re trying to create.
  3. The “try it” phase, where we either test the product or apply the process while monitoring users’ reactions to it and/or its outcomes.

 

Applying this design process has proven to be effective whenever:

– The final output of what is being designed is something that is to be repeated more than once or, at least, could be repeated if the outcome is not satisfactory.

– The project/process has a finite duration, preferably one that can be estimated.

– The project/process has an expected outcome (e.g. acceptance criteria) or a defined validation routine (some kind of test or assessment).

Considering that the Academy ticked all these boxes, it was a fitting use case to apply the process to.

 

Learning isn’t easy… So how do we make it easier?

During our “think it” phase, our primary concern was to make learning Scala and Software Engineering practices easier. To find clues on how to achieve this, we decided to explore the science behind human learning and then take it into consideration when building and improving our training process and program.

In human psychology, the question of “how do we learn and acquire new knowledge?” has been the focus of many studies, theories, and theses, and has resulted in many learning frameworks which we do not intend to explain or mimic in this article. Nevertheless, in summary, the most relevant concepts present in all of them can be condensed into two learning factors:

 

(1) The previous knowledge that students have of the subject, of pre-requisite or related subjects, or of subjects similar to the one they will be learning.

(2) The motivational effect associated with the challenge of learning a certain skill or assimilating a new concept.

 

Factor (1) is easily understood through simple examples, such as “understanding how multiplication works through previous knowledge of sums” or “learning how to ride a scooter after learning how to ride a bicycle.” This “learning through similarity” applies to pretty much anything regarding knowledge absorption and, in fact, human beings unconsciously look for similarities with previous lessons when trying to learn new ones. Whenever we can’t find any, we tend to devise models that help us memorize or assimilate parts of that new knowledge through “induced similarity” to existing but unrelated concepts. Examples of such models are mnemonics and storytelling. From this factor, the main takeaway is that, with regard to what we can control, we should carefully pick the content and the way it is presented, so that concepts which might be harder to understand are introduced only after more foundational ideas.
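To make this idea more concrete in our Scala context, here is a small sketch of “learning through similarity” for a trainee arriving from Java or another imperative language: the familiar loop comes first, and the idiomatic Scala expression is then introduced as “the same idea, written differently.” The snippet is purely illustrative and is not an excerpt from the Academy material.

```scala
// Illustrative only: not Academy material.
// A trainee coming from Java can lean on the familiar imperative loop
// before mapping the same computation onto idiomatic Scala collections.
object SimilarityExample extends App {
  val prices = List(10.0, 25.5, 7.99)

  // Familiar territory: a mutable accumulator and a loop, close to Java.
  var imperativeTotal = 0.0
  for (p <- prices) imperativeTotal += p

  // The new concept, presented as "the same idea, expressed differently":
  // folding over the list replaces the mutable accumulator.
  val idiomaticTotal = prices.foldLeft(0.0)(_ + _)

  println(imperativeTotal) // 43.49
  println(idiomaticTotal)  // 43.49
}
```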

Next, factor (2) focuses on easing learning through motivation and guidance. It derives mainly from a theory described in the early 20th century by Lev Vygotsky, known as the Zone of Proximal Development (ZPD). In this model, “knowledge,” from a learner’s perspective, can be grouped into 3 areas:

 

– What the learner can do/understand completely unaided. The zone where learning is “too easy” and can consequently become “boring…”

– What a learner cannot do/understand even if aided. The zone where learning is “too hard” and can consequently become “frustrating!”

– What the learner can do/understand if aided (ZPD).

 

This last area, the ZPD, is the zone where learners feel challenged because they sense that, with enough effort, they can understand and assimilate the knowledge being passed to them. Consequently, this challenge is, motivationally speaking, “fun” and, if capitalized on, will speed up learning and improve its effectiveness. Finally, the main point to retain here is that trainers should carefully pick content that continuously makes the learner feel challenged but never overwhelmed, anxious, or pressured.
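To ground the ZPD in Scala terms, the sketch below shows how different concepts might fall into the three zones for a trainee coming from an imperative background. The classification is of course subjective and the code is illustrative only, not an excerpt from the Academy curriculum.

```scala
// Illustrative and subjective: not Academy material.
object ZpdIllustration extends App {
  // "Too easy" zone: nothing new for someone who already programs.
  val doubled = List(1, 2, 3).map(_ * 2)

  // ZPD: achievable with guidance, e.g. modelling absence with Option
  // and pattern matching instead of null checks.
  def describe(maybeName: Option[String]): String =
    maybeName match {
      case Some(name) => s"Hello, $name"
      case None       => "Hello, stranger"
    }

  // "Too hard" (for now): a type-class-style abstraction would likely
  // overwhelm a beginner if introduced before the foundations above.
  trait Show[A] { def show(a: A): String }
  def render[A](a: A)(implicit s: Show[A]): String = s.show(a)
  implicit val intShow: Show[Int] = (i: Int) => s"Int($i)"

  println(doubled)                 // List(2, 4, 6)
  println(describe(Some("Maria"))) // Hello, Maria
  println(render(42))              // Int(42)
}
```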


 

About Quality… How do we measure it?

After defining some rules on what to include in our Academy program and how to present it, and before finishing the “think it” phase, we had to define the metrics we would use to grade the effectiveness of our process.

Training processes can be built more efficiently when you think beforehand about what skills you expect your students to have learned by the end of the program. It might come as a surprise, but even this post-hoc validation of quality is often forgotten. We made sure to avoid that by devising tests, exercises, and other checkpoints to continuously monitor and assess our learners throughout the program. In our academies, the pace isn’t enforced by the trainer; students can take as much time as they need to complete each course. Obviously, there is an expected range for the duration, and we use it as a measure of performance to detect students who might be having trouble or to identify ineffective working practices.
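As a small illustration of that last point (and not a depiction of our actual tooling), the sketch below shows how expected duration ranges could be used to flag modules where a student may be struggling; the module names and numbers are hypothetical.

```scala
// Hypothetical sketch: module names, ranges, and results are made up.
object ProgressCheck extends App {
  final case class ExpectedRange(minDays: Int, maxDays: Int)
  final case class ModuleResult(module: String, daysTaken: Int)

  // Expected completion range per module, in days (illustrative numbers).
  val expected = Map(
    "collections"      -> ExpectedRange(minDays = 3, maxDays = 7),
    "pattern-matching" -> ExpectedRange(minDays = 2, maxDays = 5)
  )

  // Durations outside the expected range are a signal for the trainer to
  // follow up, not a failure: the pace itself is never enforced.
  def flagged(results: List[ModuleResult]): List[ModuleResult] =
    results.filter { r =>
      expected.get(r.module).exists(e => r.daysTaken < e.minDays || r.daysTaken > e.maxDays)
    }

  val results = List(ModuleResult("collections", 9), ModuleResult("pattern-matching", 4))
  println(flagged(results)) // List(ModuleResult(collections,9))
}
```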

 

Done with thinking, on to building and documenting!

With the two learning factors previously mentioned in mind, and knowing all the content we were trying to relay and how we would test whether it was understood, we could now carefully build and document the Academy program with all of its modules. Doing this not only produces a script for the defined process but also allows a proper handover and sharing of the plan with everyone involved.

 

All set, now it’s time to give it a go!

Our Academy takes a variable amount of time to complete, as its pace is adjusted to each student. For that reason, its total duration can range from 3 to 6 months and, when necessary, it can be extended by adding content and depth to the internal pet projects, keeping students engaged and learning skills with a specific project in mind.

Throughout the execution phase, it’s also very important to collect results and feedback and to take notes, as these will be useful later for improving the process in the following iterations.

 

Rinse and repeat!

After each iteration, we usually gather massive amounts of feedback and conclusions that give us clues about which content needs to be improved or rebuilt. Essentially, this means that we need to repeat the whole design process from the beginning and use those conclusions to improve the content we’re providing and/or the evaluation we’re performing. This might seem obvious, but it’s not uncommon that, when things go well or even great, people simply repeat the “execution” step without giving value to the other two steps of the design process.

With the Academy focused on fast-evolving technology, it’s common for us to change some of the content after each iteration, either because a new release has rendered our content obsolete, because something new has appeared that completely replaces a tool, library, or framework, or because Growin has decided to adopt new standards or patterns. These changes affect every iteration of the process and require continuous investment to keep it as up to date as possible.

 

Keep on improving!

In this article, I’ve tried to demonstrate through the successful example of our Academies that if you strive for quality and efficiency in training processes, you must embrace the complexity and use as many resources as possible to attain the best outcome.

For us, it became clear over the multiple iterations of our Academies that the “ultimately best outcome” does not exist. Things move too fast for the process to ever become stale. Unfortunately, this comes with the drawback of making the results of different iterations hard (or even impossible) to compare, but that is the price to pay for the sake of quality and for an outcome that continuously fits the shifting market needs and keeps us always improving the tools, processes, methods and, last but not least, ourselves!