Last week, an interesting article from Times Higher Education, titled “Do universities teach critical thinking skills?”, was posted on the Bashaar-IL forum.
The article attracted considerable attention, and scholars debated what universities actually provide their students.
The article discusses a recently published book, which claims that “There’s a discrepancy in that people are qualified – they have the stamp from universities that says they can do certain occupations – but then employers find they don’t have the skills needed for the workplace.”
The book discusses the results of tests conducted by the OECD, published on 30 August under the title ‘Does Higher Education Teach Students to Think Critically?’ The results were disappointing: only 45 percent of the students tested were proficient in critical thinking.
One of the book authors said, “Critical thinking is a skill that I think [many people] just assume is taught… Universities, at least the ones that we have talked to, have said ‘It is not our job; they should have learned these things in high school’…everyone feels like it is somebody else’s responsibility to teach these things.”
The book suggests that some of the world’s “largest employers are losing faith that a good university qualification guarantees a candidate of a certain quality.”
Interestingly, however, none of the scholars mentioned the case of critical theory, the postmodern theory that has permeated the social sciences alongside neo-Marxist scholarship.
The two theories hijacked liberal arts education in favor of a dogmatic view of reality comporting with the left-leaning political agenda of scholar-activists.
Hundreds of Israeli scholar-activists have been recruited over the last three decades to teach critical, neo-Marxist themes, and they have been promoting one another.
For example, when Neve Gordon, then a professor at Ben-Gurion University’s Department of Politics and Government, wrote his book Israel’s Occupation at UC Berkeley in 2005 under the guidance of Nezar AlSayyad, he described the first two decades of the Israeli occupation, beginning in 1967, as follows: in the health field, “practices were introduced to encourage women to give birth at hospitals (a means of decreasing infant mortality rates and monitoring population growth) and to promote vaccinations (in order to decrease the incidence of contagious and noncontagious diseases). Palestinian teachers were sent to seminars in Jerusalem, where they were instructed in methods of ‘correct’ teaching. A series of vocational schools were established to prepare Palestinians who wished to join the Israeli workforce, and model plots were created to train farmers. Many of these controlling devices aimed to increase the economic productivity of the Palestinian inhabitants and to secure the well-being of the population.”
Most unbiased readers would applaud Israel for helping the Palestinians improve their living standards. But for Gordon, all these beneficial measures were instances of “biopower,” a term taken from critical theory denoting mechanisms of governance and control.
As for promoting each other, Professor Yehouda Shenhav, another critical scholar, who was recruited to research the sociology of organizations but shifted to exploring the “Arab Jews,” helped to recruit Yael Berda to the Sociology Department at the Hebrew University. Berda recently published an article about Queen Elizabeth II’s legacy, arguing that “British colonialism here is not a thing of the past.” She writes: “Israel-Palestine is one of the few remaining places in the world where the organizing principles of British colonialism form the basis for present-day bureaucratic, legal, and political mechanisms. One of the central characteristics of British colonialism is the combination of racial hierarchy and extreme violence meted out against non-European subjects.”
According to Berda, Queen Elizabeth II’s legacy “looms large over the Israeli regime’s obsession with separation and segregation of communities and its racial discrimination against native and ‘uncultured’ groups.”
Berda’s article is another example of “critical” theory.
Using the term “critical” scholarship to denote a leftist agenda has robbed students of the opportunity to engage in genuine critical thinking.
Do universities teach critical thinking skills?
OECD researchers offer evidence that students aren’t getting ‘generic skills’ needed for world of work – with potentially big implications
September 6, 2022
Professional services giant PwC’s recent announcement that new recruits will no longer require at least a 2:1 degree was seen by many as the latest sign that some of the world’s largest employers are losing faith that a good university qualification guarantees a candidate of a certain quality.
The firm is by no means the first to look for new ways of determining the talent and potential of recent graduates as employers become increasingly vocal about the supposed failures of even the top universities to ensure that those entering the workforce have obtained the status of being “job-ready”.
In response, governments and policymakers around the world have emphasised the need for more practical, vocational degree courses that are closely tied to real-world experiences. But a new publication from the Organisation for Economic Cooperation and Development (OECD) argues that it is in the teaching of more generic critical thinking skills where universities can make the most difference.
“There’s a discrepancy in that people are qualified – they have the stamp from universities that says they can do certain occupations – but then employers find they don’t have the skills needed for the workplace,” said Dirk Van Damme, who co-edited the new book and recently retired as the OECD’s head of innovation.
“The assessment done by universities doesn’t guarantee that candidates have the problem-solving skills that employers think are important, and so they have to find ways to test this themselves.”
The notion that institutions are lacking in this regard has long been suspected, and the researchers behind the study think they may finally have come up with a way to prove it.
If all this sounds familiar, it is because it is. Many of those involved in the research also worked on the OECD’s aborted Assessment of Higher Education Learning Outcomes (Ahelo) project, which sought to establish a global system for assessing students’ skills at the end of their degrees.
Billed as a university-level equivalent of the highly influential Programme for International Student Assessment (Pisa) tests for school pupils, the scheme faced stiff opposition from elite institutions – some of which were arguably motivated by fears for their positions at the top of the hierarchy if teaching outcomes were to become better known.
More fundamental questions were also raised about whether such skills could accurately be assessed across institutions and borders, and the project fell apart in 2015.
A handful of countries remained committed to the idea, however, and have been testing students’ critical thinking skills ever since using the Collegiate Learning Assessment (CLA+), developed by the Council for Aid to Education (CAE), a US-based non-profit. The assessment includes a performance task and set of questions designed to test a student’s cognitive thinking, rather than their ability to recall knowledge.
“There’s no way that any one specific assessment can measure all of critical thinking,” acknowledged Doris Zahner, CAE’s chief academic officer and the co-editor of the new book.
“What we do really well is measure a specific, well-defined component of critical thinking: namely, analytical reasoning and evaluation and problem-solving,” she said.
“That includes data literacy, understanding quantitative information, being able to gather information from various sources and then making a decision based on this and crafting an answer that supports your argument and refutes the opposite – that’s what the assessment does.”
The results of the tests, published by the OECD on 30 August in the book Does Higher Education Teach Students to Think Critically?, are stark: on average, only 45 per cent of tested university students were proficient in critical thinking, while one in five demonstrated only “emerging” talent in this area.
What’s more, the “learning gain” of students between the start and the end of their courses was found to be small on average, while there were big discrepancies between courses, with those studying fields closely aligned to real-world occupations – such as business, agriculture and health – scoring the worst.
For Dr Van Damme, the results reflect a move away from the teaching of critical thinking in higher education, with less emphasis being placed on engaging with content and with some sectors abandoning exercises such as essay writing.
“Critical thinking is a skill that I think [many people] just assume is taught,” Dr Zahner said. But she pointed out that it has never been reported in university transcripts, so there has traditionally been no way of knowing if a student has developed these skills. “Universities, at least the ones that we have talked to, have said ‘It is not our job; they should have learned these things in high school’…everyone feels like it is somebody else’s responsibility to teach these things,” she said.
The authors recognise the limitations of the research, particularly the self-selecting sample of students, confined mostly to campuses in the US, with only a fraction coming from the other five countries taking part – Chile, Finland, Italy, Mexico and the UK – meaning that data for these countries could not be said to be representative.
But the authors believe they have demonstrated that “an international, cross-cultural, comparative assessment of generic learning outcomes of higher education is feasible”.
While the OECD does not yet seem to have mustered the will for another go at instigating an Ahelo-type project, the study’s repercussions could be major.
“What I personally believe this will do is lay the foundations for placing greater weight on the quality of teaching in higher education,” Andreas Schleicher, the OECD’s director for education and skills, told a launch event for the book in Hamburg.
He said that employers had “seen through” the degree system and that students were becoming more discerning consumers because they were having to shoulder more of the cost of their education.
Therefore, he continued, it was getting harder to “hide poor teaching behind great research”, while the skills that were easiest to test and teach – such as memorising and regurgitating knowledge – were exactly those losing value most quickly.
“Teaching excellence needs to obtain the same status – the same recognition – as academic research, which is still the dominant metric for valuing academic institutions, whether you look at rankings, research assessment frameworks or performance-based funding,” Mr Schleicher said.
To critics, all this sounds suspiciously like groundwork for the creation of a new ranking, something that was never an aim of Ahelo even though many thought its data were likely to eventually feed into institutions’ scores in global league tables.
Dr Van Damme said that while many criticisms of rankings were justified, there should be a recognition that they are not going to go away and, therefore, it would be better to find ways to ensure that they accurately reflect the quality of teaching – something that could change the complexion of league tables completely.
“In an ideal world, where you have as much transparency for teaching and learning as you have for research, there would be a profound impact not only on rankings but the hierarchy and landscape of the system,” he said.
“It is certainly not the case that universities that are excellent in research are also automatically excellent in teaching and learning; and if you placed greater weight on teaching, you would get different results [in rankings].”
As well as an upheaval in institutional reputation, greater focus on the teaching of critical thinking could fundamentally alter the types of courses that are seen as necessary for societies and economies to thrive, according to Dr Van Damme.
Politically influenced drives towards utilitarian approaches to education that produce students who are immediately employable in a certain occupation – which tend to favour the STEM subjects – neglect to consider volatility in the labour market and the need to train young people for their entire lifetime, he said.
“The economy and labour market are in transformation because of digitalisation, and so the job reality in 10 years’ time will be completely different from today. There should be more interest in teaching the generic skills that matter in the long term,” he added.
In this world, it is the much-maligned humanities that truly come into their own, and the CLA+ results showed that those students pursuing these fields displayed much higher levels of critical thinking, according to Dr Van Damme.
He said studies have demonstrated that while vocational training produces better employability results in the short term, these wane after five years and “those with better generic skills have much better employability and earning prospects over a lifetime”.
Dr Zahner said universities would likely come under increased pressure from industry and governments to address these issues, whether they like it or not.
“Hopefully the universities will hear this messaging. It’s great if you can graduate your students, but it is not so great if you graduate all these students and they don’t have success in their careers. We’re hoping being able to increase critical thinking skills will be able to close that gap.”
The Queen is dead, but her colonial legacy lives on in Israel-Palestine
While the British Mandate ended 74 years ago, its legacy of racial hierarchy, divide and rule, and emergency regulations is still visible in Israeli policy.
By Yael Berda September 20, 2022
This article was published in partnership with Local Call.
Of all the countries Queen Elizabeth II visited over the course of her 70-year reign — of which there were over 120 — she never once set foot in Israel. But she needn’t have; the legacy of the British Mandate continues to have a tangible impact on the day-to-day management of the Israeli regime.
Israelis tend to think about the British Mandate as a historical remnant, and the rule of the Monarchy as a brief moment in time that belongs in the past. Israeli Jews who hold liberal views often joke that they hope for the “return of the British Mandate,” as if British rule over Palestine ushered in an era of infrastructure and efficiency, replete with cars, maps, statistics, and electricity. The implication is that ever since the British left, things have only gone downhill.
While they may say these things in jest, British colonialism here is not a thing of the past. In fact, Israel-Palestine is one of the few remaining places in the world where the organizing principles of British colonialism form the basis for present-day bureaucratic, legal, and political mechanisms.
One of the central characteristics of British colonialism is the combination of racial hierarchy and extreme violence meted out against non-European subjects, with a near-obsessive preoccupation with political legitimacy and legal normativity. In other words: a fixation on the rule of law.
Since the days of the East India Company — which was the first to use emergency legislation to establish the death penalty and the practice of deportation — this obsession meant that as long as there was some semblance of procedure, any violence against a given population could be justified under the pretext of warding off “security risks.” As “hostile” natives increasingly resisted the violence of empire, however, the definition of “security risks” had to be muddied further.
In recent years, historians of the British Empire from across the political spectrum have come to understand that colonialism and liberalism — including the importance of the “rule of law” as a supreme value — cannot be separated. But while the world tries to rid itself of this legacy and begins to think about decolonization in the realm of politics, society, and even the economy, it ignores the fact that British colonialism continues to shape the lives of citizens, residents, and subjects between the river and the sea.
The British administrators in the colonies realized fairly quickly that they could not maintain control over native people through force alone. Therefore, they began to adopt advanced population management methods, including the classification of different populations in accordance with their supposed level of security risk. This is the first organizing principle of colonial bureaucracy: the systematic separation of populations, followed by the creation of separate governing practices for each group.
Another key tool used by the British was the restriction of movement. This was done through declaring closed military zones; administrative detentions; preventing passage from one colony or subdistrict to another; and permit regimes, which blocked, limited, and slowed the movement of the population. British surveillance systems turned not only policemen and soldiers into sources of control and intelligence, but also teachers, postal clerks, and medical staff.
One can recognize some of these colonial organizing principles in Israel-Palestine today. They are, of course, most strongly expressed in the privileged treatment of Jewish settlements on both sides of the Green Line — whether the kibbutzim and moshavim inside Israel, or the settlements in the occupied West Bank.
The first principle is racial hierarchy: initially between Europeans and natives, and later within Jewish population groups — Mizrahim, Ethiopians, Jews from former Soviet states — according to how “cultured” they are. The second principle is administrative flexibility, meaning management by officials in the field, due to their proximity to the subject population, rather than by laws passed in parliament.
The third is secrecy. While most bureaucracies work with published laws, colonial bureaucracy uses secret laws, unknown decrees, directives, and internal regulations that even colonial officials don’t know about, cloaked under the guise of threats to security or “the order of the colony.” The fourth is personalization. A person’s identity determines the laws or the practices that will be applied to him — the exact opposite of equality before the law. The fifth is the creation of exceptions. This form of control is actually based on a collection of exceptions, which constantly change in a routine manner, as opposed to long-term planning.
Legacies of racial separation and partition
In Israel, the historical relationship between Zionism and British colonialism is usually considered through two prisms. The first is that of the final two years of the British Mandate, during which the three Jewish underground organizations — Haganah, Etzel, and Lehi — declared armed struggle against British rule to expel the occupiers, which the colonial authorities deemed “terrorism.”
The second prism through which the relationship is viewed is that of one of the main tools of the executive authority in Israel: emergency legislation. We tend to forget that both the construction of relations between Jews and Palestinians as one of racial hierarchy, as well as the 1947 Partition Plan — which sought to split Palestine between Palestinians, who made up the majority of the population at the time, and mostly Jewish settlers — were both born out of an imperial desire to manage the conflict while maintaining its control and influence over the region.
When the British Mandate came to an end, the newly established State of Israel adopted the (Emergency) Defense Regulations, enacted by the British during its rule, which grant extraordinary powers to Israel’s executive authority. In fact, a declared state of emergency has been in effect ever since the founding of the state.
There is no doubt that these regulations are the beating heart of the Israeli regime, nor is there any doubt that their abolition is an essential step on the way to the establishment of a truly democratic regime. In practice, the Defense Regulations shaped how Israel’s first governments treated its Jewish opposition, but most importantly it shaped the military government that ruled over Palestinian citizens of Israel between 1949 and 1966, allowing the government to seize Palestinian land and property under the guise of “military necessity,” and prevent Palestinian refugees and internally displaced persons from returning to the homes that the state expropriated after 1948 as “absentee property.”
The organizational infrastructure of the occupation is also based on the emergency regulations. While preparing military orders in 1963 for a possible future military occupation, four whole years before Israel would come to control the West Bank, Gaza, and East Jerusalem, the army cut out the words “His Majesty” as the ultimate sovereign in the region and replaced them with “the military commander of the area.” Israel’s Supreme Court, far from questioning the existence of these regulations, has continually upheld their legitimacy and the military occupation that derives from them.
Israel’s anti-terrorism law, which was passed by the Knesset in 2016 and which contains broad and vague definitions of terrorism while entrenching many of these emergency regulations into law, turned British colonial legal tools that had been in use for 80 years into legislation. This same logic — which transforms any political risk into a security risk — was the motivation behind the decision by Defense Minister Benny Gantz to declare six Palestinian civil society organizations as terrorist groups and try to have them shut down.
Queen Elizabeth II’s legacy is not only seen in Israel’s emergency regulations. It looms large over the Israeli regime’s obsession with separation and segregation of communities and its racial discrimination against native and “uncultured” groups living between the river and sea under a single government, without fixed borders of sovereignty. In that sense, even in her passing, the empire she represented is still very much with us.
A version of this article first appeared in Hebrew on Local Call.
Yael Berda is an assistant professor of sociology and anthropology at Hebrew University and a visiting researcher at Harvard Kennedy School. She is the author of ‘The Bureaucracy of the Occupation’ and ‘Living Emergency: Israel’s Permit Regime in the Occupied West Bank.’