I've been providing instructor-led classroom training for software developers, off and on, since about 1996. Most of that time has been me, physically in the classroom, standing in front of a bunch of eager developers, face-to-face.
From my point of view as the instructor, there are some really great advantages to this type of training. The biggest is the ability to monitor, largely subjectively, the "vibe" of the students in relation to the material being presented. Facial expressions, the sounds of typing (or not typing!), etc. provide clues as to the students' level of engagement.
For the students, there is also a bit of a panopticon effect. They are a part of something. They can't just get up and wander off (though I've seen it happen!) without the subtle judgement of those around them. Drifting off and doom-scrolling a social media site instead of paying attention would most likely draw the eyes of those around (or behind) them.
Early on, in the transition to fully remote training at the onset of the pandemic, I experimented with requiring the students to have their webcams on for all or a portion of the course. While it did help me, as the instructor, a bit in terms of some subtle feedback, it was awkward at best.
Having a webcam inches from your face while you look at a Hollywood Squares-style grid of your students' live faces does not replicate the classroom experience.
Some students really pushed back on this. Many had the resources to create a private area in their home for work and training, but others were sharing space with other family members. Artificial backgrounds in Teams or Zoom meetings help with this a bit, but also push the whole thing further into a distracting uncanny valley.
For many, it is just very uncomfortable. A scratch of the nose in the classroom would be largely unnoticed, but when the camera is focused on your face? No hiding.
As the instructor, it would be hypocritical if I didn't also have my camera on. Sometimes I do - if I'm presenting material like PowerPoint slides, etc. When we are coding, though, the presence of my face is really just a distraction from the material being presented (through screen share, etc.). Mostly, though, I just find it distracting to stare at my own face as I teach.
There are some instructors who insist on students having their cameras on for the duration of the course. I won't presume to know their motivations completely, but I'd imagine some of it is a desire not only for feedback, but also a sort of authoritarian move. Let me explain.
As a contract instructor, your job is to fill gaps. Students are sent to you because their job requires them to have a set of skills they are lacking. The dynamic is a bit weird, though. Any programming class I teach has to assume some commonality in the skills and experience of the participants. While these are usually listed in the "prerequisites" section of the course description, those are often either ignored, or, I find, students (and managers) struggle to assess developers' actual skill levels against those prerequisites. My job as an instructor is often very limited by this. If something is a prerequisite for the class, and a student finds during the class that they didn't accurately assess their skills in that area, they have a few choices.
- Go off on their own during the class and start Googling like hell to figure out what it is I'm talking about. The problem, obviously, is that I'm unaware this is happening and just naively keep going, and the student falls further and further behind.
- Shut down completely. Some students reach a critical mass of "the things they just don't understand" and just stop participating in the class.
- Ask me. This is the hardest one for many students, but is usually the best option. Usually they are just missing a small piece, some little thing I can take a moment or two to explain and they'll be right on track.
The problem with the third option is that if I'm with a group of students and answering these questions becomes the main activity of the course, I run out of time to deliver the material I was contracted to deliver (and what the students who astutely assessed their skills against the prerequisites need).
This isn't something that is solved through forced camera usage. However, the second item listed - just shutting down - has been a real problem. I run my classes largely on virtual machines shared with the students, and I can see the number of hours and minutes each student has actually used their VM during the class. More than once I've discovered at the conclusion of a course that one or more students used their VMs for only an hour or two of a 21-hour course. "Fine," I might say. They will not be able to produce the work they should be able to produce after the class is over, and their karma will come due. However, they get to evaluate the course, and therefore, me. They go back to their teams, I imagine, and are asked how the training was, and they say "It sucked. Didn't learn anything." And they can prove it, because they can't do anything.
There will always be a certain percentage of people who fake their way through their jobs to some degree and see training as a few "days off" - and remote training, like remote work, provides plenty of opportunities for distraction - but I think the real issue here is the assumption of a sort of mono-culture of skills and knowledge in group training.
If a company thinks they are providing adequate training for their developers by just offering instructor-led training, they are failing.
It sets up a bad dynamic where students come to expect that "whatever I need to know to do my job will be delivered to me in the form of a class", and project managers believe that, no matter the challenge, a class or two will get their developers up to speed.
Students who work for companies like this often really struggle in class, because the fear becomes a somewhat self-fulfilling prophecy. They feel, "If I don't get this - ALL of this - now, I won't get it anywhere else."
That feeling that "As soon as I'm done with this class I'll be expected to apply this to code for a multi-billion dollar company" is overwhelming. It would completely paralyze me.
There has never been a decent class I've taken where I understood absolutely everything that was presented. A good class often leaves me with way more questions than when I came in. If I had some weird consumerist view that made me think this was a problem, that my needs weren't being met, I would never have gotten anywhere.
A good training course should be followed up with mentoring by trusted associates. It should involve low-stakes application of the concepts from the classroom. It should include an honest assessment of the developer's skill or knowledge gaps discovered during the training, and a plan to help them fill those gaps.
If you come to a three-day class where we spend less than an hour doing things at the command prompt, for example, and you've never done that before, no amount of explanation I can fit into the classroom is going to fill that gap for you without hijacking the entire class. If you are in a mindset that says you have to understand all of that or you are done for, there is something wrong with both the culture of your workplace and your own approach to developing your skills. Fake it (in class). Ask a question or two if you can, but make a note to find some resources on the topic - to learn more. A big part of any education is helping you identify the things you don't know, not always handing you that information on a silver platter.
My courses are a mixture of lecture/demonstration, discussion, "code along with me" work, and practice exercises/labs.