Soft Skills for Work and Employment

Soft skills for work and employment, complementing technical skills, have again been highlighted by a Deloitte Australia media release; the following is a summary.

Soft Skills for Work (Image copyright Pexels)

 

While the future of work is human, Australia faces a major skills crisis – The right response can deliver a $36 billion economic bonus

12 June 2019: With skills increasingly becoming the job currency of the future, a new Deloitte report finds that the future of work has a very human face. Yet Australia is challenged by a worsening skills shortage that requires an urgent response from business leaders and policy makers.

The path to prosperity: Why the future of work is human, the latest report in the firm’s Building the Lucky Country series:

  • Dispels some commonly held myths around the future of work
  • Uncovers some big shifts in the skills that will be needed by the jobs of the future
  • Reveals that many key skills are already in shortage – and the national skills deficit is set to grow to 29 million by 2030
  • Recommends that businesses embrace, and invest in, on-the-job learning and skills enhancement
  • Finds that getting Australia’s approach to the future of work right could deliver a $36 billion national prosperity dividend.

 

Employment myths busted

The report dispels three myths that tend to dominate discussions around the future of work.

Myth 1: Robots will take the jobs. Technology-driven change is accelerating around the world, yet unemployment is close to record lows, including in Australia (where it’s around the lowest since 2011).

Myth 2: People will have lots of jobs over their careers. Despite horror headlines, work is becoming more secure, not less, and Australians are staying in their jobs longer than ever.

Myth 3: People will work anywhere but the office. The office isn’t going away any time soon, and city CBDs will remain a focal point for workers.

 

The big skills shift ahead: from hands…to heads…to hearts

 

“That today’s jobs are increasingly likely to require cognitive skills of the head rather than the manual skills of the hands won’t be a surprise,” said Deloitte Access Economics partner David Rumbens. “But there’s another factor at play. Employment has been growing fastest among less routine jobs, because these are the ones that are hardest to automate.”

More than 80% of the jobs created between now and 2030 will be for knowledge workers, and two-thirds of jobs will be strongly reliant on soft skills.

 

Critical skills and the multi-million gap

 

As work shifts to skills of the heart, Rumbens said the research reveals that Australia already faces skills shortages across a range of key areas critical to the future of work.

“These new trends are happening so fast they’re catching workers, businesses and governments by surprise,” Rumbens said.

At the start of this decade, the typical worker lacked 1.2 of the critical skills needed by employers seeking to fill a given position. Today, the average worker is missing nearly two of the 18 critical skills advertised for a job, equating to 23 million skills shortages across the economy.
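The 23 million figure squares with simple back-of-the-envelope arithmetic: roughly two missing skills per worker across a workforce of about 13 million. A quick illustrative check (the workforce size and per-worker gap below are assumptions read off the figures above, not Deloitte's own model):

```python
# Back-of-the-envelope check of the reported national skills deficit.
# Assumed inputs, inferred from the figures quoted above:
workers = 12.9e6          # approximate Australian workforce (assumption)
missing_per_worker = 1.8  # "nearly two" of the 18 critical skills per job

total_gaps = workers * missing_per_worker
print(f"Estimated skills shortages: {total_gaps / 1e6:.1f} million")
# → Estimated skills shortages: 23.2 million
```

The point is only that the headline number is a per-worker average scaled up, not a count of unfilled jobs.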

 

The business response?

 

Rumbens said that getting ahead of the game will require concerted action.

The report includes a series of checkpoints that business leaders and policy makers can use to inform and drive action. These include:

  • Identify the human value – Identify which tasks can be automated or outsourced to technology such as AI, and which are uniquely human. Use technology to improve efficiency and increase the bounds of what’s possible.
  • Forecast future skills needs – Understand the skills, knowledge, abilities and personal characteristics of your employees.
  • Re-train, re-skill, and re-deploy – People represent competitive advantage. Consider alternatives to redundancy, such as re-training, re-skilling or re-deploying, to support existing workers in reaching for new opportunities.
  • Involve people – The people who do the work are often the best placed to identify the skills they require to succeed. Find ways to involve employees in the design and implementation of learning programs.
  • Talk about technology honestly – Engage in an honest dialogue about the impacts of technology to support staff and generate new ideas for managing change.
  • Manage the robots – Introduce digital governance roles to evaluate the ethics of AI and machine learning, alongside existing frameworks.
  • Use mentoring and apprenticeships – Micro-credentialing holds the key to unlocking the value of emerging job skills, while apprenticeship models are re-emerging as an effective way for business to develop a future-ready workforce.
  • Recruit and develop social and creative skills – Recognise and reward social skills such as empathy, judgement, and collaboration when recruiting and developing workers.

 

For more articles and blogs about soft skills and adult learning click through.

 

University Education – Student Teacher Tutors or Professors?

An interesting article from The Conversation on the quality of university tutorial teaching: are students or academics the better tutors? The glib answer would be that neither form of pedagogy is ideal; in fact, ‘andragogy’ for adult learners suggests that many should be learning together as students, not through teacher-centred direction.

Can students teach as well as professors?

Student Tutorial Teachers or Professors? (Copyright image Pexels)

Research shows students are as good as professors in tutorial teaching

February 19, 2019 5.23pm AEDT

Professors and graduate students are at opposite ends of the university hierarchy in terms of experience, qualifications and pay. But at many universities, both do the same job: they teach tutorials offered in parallel with lectures.

Our research explores whether it makes sense for professors to teach tutorials – and we found it doesn’t. They are no more effective as tutorial instructors than students.

This finding implies that universities can reduce costs or free up professors’ time by asking students to teach more tutorials.

Measuring instructors’ effectiveness

We conducted a survey about tutorial instruction in OECD universities. Our results show that tutorials are used in 63% of OECD universities. At 25% of these institutions, tutorials are taught by students, 29% by professors and 46% by a mixture of the two.

Using professors to teach small groups is expensive, and reducing costs is a central concern given the increases in tuition fees and student debt.

We have studied the costs and benefits of using tutorial instructors with different academic ranks, using data from a Dutch business school that offers four key features. First, tutorials are taught by a wide range of instructors, ranging from bachelor’s students to full professors. Second, the school’s dataset is large enough (we observe more than 12,000 students) to give us enough statistical power to detect even small differences between instructors.

Third, at this business school students are randomly assigned to instructors of different academic ranks, creating a perfect experiment for seeing whether academic rank matters. Finally, we were able to supplement these already excellent data with measures of students’ satisfaction with the course, and students’ earnings and job satisfaction after graduation, for some of these students. This is important since instructors might matter in many ways and we need to cast a wide net to capture a range of student outcomes.

Students just as effective

Overall, our results show that lower-ranked instructors teach tutorials as effectively as higher-ranked ones. The most effective instructors – postdoctoral researchers – increase students’ grades by less than 0.02 points on a 10-point grade scale compared with student instructors. The differences between all other instructor types, from student instructors to full professors, are smaller than that.

Full professors are also no better than student instructors at improving students’ grades in the next related course, or their job satisfaction and earnings after graduation. We do, however, find that higher-ranked instructors achieve somewhat better course evaluations, but these differences are small.

These findings are counter-intuitive. Yet they are consistent with the general finding in primary and secondary education that formal education does a poor job of predicting who teaches well.

Why might all the extra qualifications and experience of professors not translate into better results for their students? The content of tutorials might be adjusted in a way that students can easily teach. Further, lower-ranked instructors may compensate for their lack of experience by relating better to students and being more motivated.

Key implication

The implications of our findings are obvious. Universities can free up resources by not asking their most expensive staff to do a job that students can do equally well. We show that the business school we study could reduce the overall wages it pays to tutorial instructors by 50% if it employed only student instructors.
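As a rough illustration of how such a 50% saving could arise, consider hypothetical pay rates and teaching shares. All figures below are invented for illustration; the study's actual payroll data are not reproduced here.

```python
# Hypothetical illustration of the reported ~50% wage saving.
# Every rate and share here is an assumption chosen for illustration.
hours = 1000            # total tutorial hours to staff
student_rate = 30.0     # assumed hourly rate, student instructor
professor_rate = 90.0   # assumed hourly rate, professor
professor_share = 0.5   # assumed share of hours currently taught by professors

current_cost = hours * (professor_share * professor_rate
                        + (1 - professor_share) * student_rate)
all_student_cost = hours * student_rate
saving = 1 - all_student_cost / current_cost
print(f"Saving from using only student instructors: {saving:.0%}")
# → Saving from using only student instructors: 50%
```

The saving depends entirely on the wage ratio and the current professor share, which is why the paper's figure is specific to the institution it studied.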

There are, of course, reasons why universities might not want to rely exclusively on student instructors. Students might not be able to teach some more technically advanced master’s courses. There might be some research-inactive but tenured professors whose most valuable use of time is tutorial teaching. And, as with other research that relies on data from one institution, future studies need to show whether our results hold at other universities as well.

But even if these studies uncover some benefits to students of being taught by a professor, we would be surprised if these are worth the extra costs.

 

What constitutes quality teaching and learning remains unclear. Higher education and universities place great importance upon narrow, high-level specialised knowledge exemplified by a doctorate, i.e. the content or subject matter expert. Further, the vocational Certificate IV in Training & Assessment (TAE40116) is included on many job descriptions as a desirable teaching qualification, while ‘real world’ experience can be ignored by institutions and/or embellished by the beholder (unlike the ID points system, not all factors are taken into account).

A related issue here is the theory of teaching and learning: pedagogy (for children) is cited, but for adults we should be speaking of andragogy. The andragogy of adult education focuses upon adults’ need for knowledge, motivation, willingness, experience, self-direction and task-based learning.

Good instructional or learning design for adult centred learning:

  • a broad and deep needs analysis based on learners’ knowledge, expertise and real skill gaps
  • learner motivation through input and some control over learning, activities and outcomes
  • participation in learner-centred activities, interaction and social learning
  • opportunities to contribute knowledge and expertise, and to reflect on business practice
  • contribution to and management of learning activities through tasks and problem solving, post-course too.

A more complete qualification is the UK Cambridge RSA CELTA or TEFLA, especially for its behavioural theories fitting ‘andragogy’, its teaching skills, and its preparation for dealing with significant numbers of adult students for whom English is not their first language.

Another issue to emerge has been that of instructional design (ID) on behalf of university teachers, not based upon subject matter or teaching/learning skills (when ID is implicit in any competent teacher’s work).

Finally, explaining in terms of cost (cutting or savings) may seem mercenary when high fees are now the norm for most students.

 

Student Evaluations in Higher Education and Universities

While student evaluations or ‘happy sheets’ have become routine in higher education and universities, some question both the effectiveness and efficiency of using such instruments to assess quality. Further, what is quality in teaching, learning, assessment, technology, administration and student well-being, and how and when should it be assessed?

Student feedback and evaluations in higher education

Student Experience Feedback (Image copyright Pexels)

From the AIM Network Australia:

Mutual Decline: The Failings of Student Evaluation

November 30, 2018 Written by: Dr Binoy Kampmark

That time of the year. Student evaluations are being gathered by the data crunchers. Participation rates are being noted. Attitudes and responses are mapped. The vulnerable, insecure instructor, fearing an execution squad via email, looks apprehensively at comments in the attached folder that will, in all likelihood, devastate rather than reward. “Too much teaching matter”; “Too heavy in content”; “Too many books.” Then come the other comments from those who seem challenged rather than worn down; excited rather than dulled. These are few and far between: the modern student is estranged from instructor and teaching. Not a brave new world, this, but an ignorant, cowardly one.

The student evaluation, ostensibly designed to gather opinions of students about a taught course, is a surprisingly old device. Some specialists in the field of education, rather bravely, identify instances of this in Antioch during the time of Socrates and instances during the medieval period. But it took modern mass education to transform the exercise into a feast of administrative joy.

Student evaluations, the non-teaching bureaucrat’s response to teaching and learning, create a mutually complicit distortion. A false economy of expectations is generated even as they degrade the institution of learning, which should not be confused with the learning institution. (Institutions actually have no interest, as such, in teaching, merely happy customers.) It turns the student into commodity and paying consumer, units of measurement rather than sentient beings interested in learning. The instructor is also given the impression that these matter, adjusting method, approach and content. Decline is assured…

…Education specialists, administrators and those who staff that fairly meaningless body known as Learning and Teaching, cannot leave the instructing process alone. For them, some form of evaluation exercise must exist to placate the gods of funding and quality assurance pen pushers.

What then, to be done? Geoff Schneider, in a study considering the links between student evaluations, grade inflation and teaching, puts it this way, though he does so with a kind of blinkered optimism. “In order to improve the quality of teaching, it is important for universities to develop a system for evaluating teaching that emphasises (and rewards) the degree of challenge and learning that occurs in courses.” Snow balls suffering an unenviable fate in hell comes to mind.

Student feedback or evaluations are an essential part of assessing, maintaining and improving quality in education and training.  However, much research and expertise is required for such instruments to be used optimally for positive outcomes.

For more articles and blogs about higher education teaching, CPD continuing professional development, enrolled student feedback, evaluation, student satisfaction and university teaching skills, click through.

 

ID Instructional Design Models in Education

The following draws on a Cognitivism and Connectivism learning theory page, part of an EdX Instructional Design course.

Cognitivism is student-centred learning that builds upon an existing knowledge base according to learner preferences: how learners organise memory, how information is linked, learning how to learn, and problem-solving, with the student learning journey supported by clear instructions and information (Hanna, 2017).

Further, there is the Three-Stage Information Processing Model: a Sensory Register that assesses inputs, Short-Term Memory where input can be stored briefly (e.g. around 20 seconds), and Long-Term Memory and Storage, retrievable via the linkages that have been developed (Mergel, 1998).

Application of andragogy for adult learning versus pedagogy for school.

Adult Learning Theories in Higher Education (Image copyright Pexels)

Connectivism resembles social learning through others or networks: identifying patterns, basing knowledge around networks, and exemplified in complex learning, e.g. around information and technology (ibid.).

Both can be used for the same course and exemplars, using the two theories together to support instructional design, student-centred activity and learning, and building upon knowledge and experience towards inexact outcomes, as opposed to a behavioural focus.

In the first case, cognitivism, take an example course, ‘Introduction to Digital or e-Marketing for Small Business’, and focus upon one learning outcome: the ‘ability to analyse (digital) marketing and communication’.

Rather than presenting information or content activities which may be new and/or overwhelming, assess the knowledge level before training, then drive instruction and the achievement of learning objectives via learners and learner-centred activity (monitored and assessed closely).

Instructional Design for Adult Learners in ‘Introduction to Digital or e-Marketing for Small Business’:

Preview – use images to elicit key words, channels etc. related to conventional marketing and communication.

Presentation – repeat the preview, now including digital, and elicit the elements.

Practice – learners list both types of elements in an example small business’s marketing and communications, then report back to the class.

Production – learners work in pairs on their own businesses, assist each other, compare notes, then present to each other/the class.

Wrap-up – class discussion and/or a milling activity to compare with other learners’ ‘production’ and give feedback on key points, rules or issues.

Connectivism can be applied to the same course area and learning outcome, not just in the direct learning environment but post-learning, i.e. back in the workplace and business environment. Accordingly, if learners are mostly small business people, already responsible for marketing and communications and sharing a desire to improve the application of digital in their business practice, they should be motivated for connectivism.

Within the formal learning, connectivism would fit the cognitivist approach above, with symmetry in each phase, but especially with increased learner interactivity in production and wrap-up or review. Connectivism can then also be followed up informally, by learners remaining in communication with each other (e.g. a WhatsApp or LinkedIn group), industry sector networking opportunities and/or the local chamber of commerce.

Andrew Smith Melbourne LinkedIn Profile

 

Reference List (Harvard):

 

Hanna, M. (2017) Learning Theory Matrix. Available at: https://pdfs.semanticscholar.org/8d28/2833c35fb8b9ea74bf2c930cea22fb1e0fad.pdf (Accessed on: 16 November 2017).

Mergel, B. (1998) Instructional Design & Learning Theory. Available at: http://etad.usask.ca/802papers/mergel/brenda.htm#Cognitivism (Accessed on: 17 November 2017).

 

TAE40116 Certificate IV Training Assessment Package – ASQA Review Submission

Submission for TAE40116 Training Package Review

 

Written by Andrew Smith; submitted 3 April 2018

Introduction

There has been much discussion amongst training practitioners about the updated TAE Training Package.  One of the main issues has been the perception that it has been designed for quality administration and assessment while neglecting quality of actual training delivery and learning.

ASQA Australian Skills Quality Authority Certificate IV Training and Assessment TAE40116 Review

TAE40116 Certificate IV in Training and Assessment – ASQA Review (Image copyright ASQA)

This has been experienced by the writer, currently upgrading from BSZ to TAE via a registered training organisation (RTO) by distance learning: a PO Box, with ‘assessors’ and ‘trainers’ based offshore. Further, the delivery is based upon a basic pedagogy of content presentation and regurgitation according to instructions, seemingly unable to offer trainees explanations or insight, especially on delivery and learning skills based upon andragogy.

 

What is the TAE Training Package?

 

Description

 

This qualification reflects the roles of individuals delivering training and assessment services in the vocational education and training (VET) sector.

This qualification (or the skill sets derived from units of competency within it) is also suitable preparation for those engaged in the delivery of training and assessment of competence in a workplace context, as a component of a structured VET program.

The volume of learning of a Certificate IV in Training and Assessment is typically six months to two years. (Department of Education & Training 2018).

At face value, the TAE40116 appears to be a relevant and practical qualification for vocational education and training: delivering accredited training packages and assuring quality with a focus upon assessment. However, there have been many criticisms of the package from practitioners, industry and other stakeholders. Why?

Issues with TAE Training Package and Delivery

 

According to the Resources Training Council

 

It has become a qualification for the training industry, not industry that trains. They do not understand workplaces where training and more importantly the outcomes (assessment) must be fit for purpose to achieve what VET is all about.  VET should be about producing safe, proficient (productive) workers and providing an opportunity for learning to be built on as people move along their chosen career path (Munro 2017).

 

From an experienced VET training practitioner

 

To improve assessment practices of RTOs and improve skills and knowledge of trainers and assessors we need to:

  • Update our regulatory framework and move to a real outcome-based regulation, where relevant industry stakeholders have a say in the registration/re-registration process
  • Support the National Regulator in building the required capabilities to assess compliance in a diverse and complex environment
  • Ensure the National Regulator provides an even-playing-field to RTOs
  • Identify the different issues within the assessment system, and consequently identify gaps in current workforce skills (in all AQF levels not only entry level) and update the TAE training package accordingly (Castillo 2016).

 

Research criticisms have included: a one-size-fits-all approach, whether for a novice trainer or an already well experienced and/or qualified trainer or teacher; trainer and assessor expertise or skills; questions over subsequent assessment outcomes; a lack of depth related to training and learning delivery; no clear development pathway; and the skill outcomes the TAE gives practitioners for delivery (Clayton 2009).

The focus of criticism has been directed at the sub-optimal education and training pedagogy (learning theory for children, young adults or novices) of both the TAE and practitioners, for quality delivery, learning and trainee assessment outcomes. These revolved around preparedness of trainers to train, opportunity to learn content knowledge, delivery quality, learning the practice of good teaching or training, learning from experts, and only then planning and assessment (ibid.).

The latter issue of assessment has been raised within sectors, whether as validation between providers or simply better understanding of assessment by practitioners (Halliday-Wynes & Misko 2013).

Further, expert input often hints at what is lacking by focusing upon learning theory or ‘pedagogy’ for children and youth, as opposed to ‘andragogy’ for youth and adults. The latter is exemplified by self-directed learning, responsibility, experience, motivation to learn and a preference for real tasks and problem solving (Educators’ Technology 2013), supported by well-skilled trainers.

 

Training Delivery and Learning Quality

 

However, the delivery of some TAEs owes more to the education and training or ‘pedagogy’ influence of two generations ago, manifested as trainer- or teacher-directed, top-down instruction. This assumes trainees have no relevant knowledge or practical input to offer, and focuses upon the systems, processes and assessment around any given package, but not upon delivery, i.e. developing quality training and learning skills. Additionally, while very content-driven for good reason, content is presented or instructed (not elicited), then regurgitated or replicated to satisfy assessment requirements, and then assumed optimal for the workplace?

The significant size of the VET sector requires standard packages, systems, processes and assessment to be compliant and manageable. However, the risk is that system quality rests upon indirect, top-down, paper-based systems and processes of (quality) compliance that are reactive when issues emerge, if discovered at all, e.g. sub-optimal training and learning, rather than upon proactive measures such as more intrusive evaluation of actual training delivery quality or bottom-up informing.

Quality may be improved by intrusive quality assessment through mystery shopping of any given TAE course, dynamic (publicised) opportunities for feedback from trainees and clients, evaluation of specific programs and trainers, or evidence of dynamic quality evaluation of skills, versus merely possessing a TAE qualification or ‘ticket’.

 

References:

 

Castillo, A 2016, Newly endorsed Certificate IV in Training and Assessment – Same Issues, LinkedIn Pulse, viewed 22 March 2018, < https://www.linkedin.com/pulse/newly-endorsed-certificate-iv-training-assessment-amaro-castillo/ >

 

Clayton, B 2009, Practitioner experiences and expectations with the Certificate IV in Training and Assessment (TAA40104): A discussion of the issues, NCVER Melbourne, viewed 22 March 2018, < https://www.ncver.edu.au/__data/assets/file/0023/4658/nr08504r.pdf >

 

Department of Education & Training 2018, MySkills: Certificate IV in Training and Assessment, viewed 22 March 2018, < https://www.myskills.gov.au/courses/details?Code=TAE40116 >

 

Educators’ Technology 2013, AWESOME CHART ON “PEDAGOGY VS ANDRAGOGY”, viewed 1 April 2018, < http://www.educatorstechnology.com/2013/05/awesome-chart-on-pedagogy-vs-andragogy.html >

 

Halliday-Wynes, S & Misko, J 2013, Assessment issues in VET: minimising the level of risk, NCVER, viewed 22 March 2018, < http://www.cmd.act.gov.au/__data/assets/pdf_file/0015/801600/AssessmentIssuesInVET_MinimisingTheLevelOfRisk.pdf >

 

Munro, J 2017, The TAE debacle – a resources sector view, Resources Training Council, viewed 31 January 2018, < http://www.resourcestraining.org.au/news/the-tae-debacle/ >