They are dumbing us down…

Are they dumbing us down?

Think about that question for a moment. My thesis for this post is that there is a concerted movement to dumb us down.

By ‘us’, I mean the middle class… by which you might also be free to infer the white middle class, or the white male heterosexual middle class, or the middle class of whatever would have been considered the ‘normal’ middle class of whatever country or culture you live in. Or any combination of the above.

The quick to respond amongst you will simply say that ‘they’ are dumbing ‘us’ down to protect their own interests: their narrative, their financial interests, the seat of their power. If that is your response too, then I believe you are very likely correct. But in the interests of full and frank scientific discussion, let’s analyse the situation through the lens of two contemporary examples.

Academia

It won’t have escaped many of you that for most of the last decade I have worked almost exclusively in academia. During this time I have watched the gradual scope creep and brain drain that, like a glacier carving its way through a mountain range, has masterfully captured and worn down much of academia and her academicians.

You might be forgiven for wondering where I am going with this because ostensibly, our universities – the collective ‘academia’ – are supposed to be there to make us smarter, not dumber. However, please bear with me. There are various ways in which universities overall are actually, and I believe deliberately, making us dumber.

While the university and university culture are responsible for much of what is happening with respect to the dumbing down of education, it actually starts even before your child gets to our once-hallowed front door. Infant, primary and high schools now focus less on academic skills and merit, and more on DEI/EDI touchy-feely-ness. I get that some teachers will say that they or their school don’t do this but generally, most schools have fallen for these decidedly un-academic distractions.

A common theme I hear (and observed first-hand when my son was attending public school) is that the ostensibly qualified teacher who leads the children’s class is gradually spending less and less time in the classroom. An ever-growing number of teaching aides, assistants and students are actually responsible for the minute-to-minute lesson presentation and classroom culture. The teacher simply sets the overarching lesson plan and context, and these less qualified (and sometimes unqualified) agents are responsible for carrying it out. I am sorry to say that for many schools now there are financial incentives for keeping children ‘dumb’, or at least for creating an environment where learning may not be effective.

Also, the child may be encouraged to express behaviours that would not have been observed two or three decades ago. Whether with or without the parent’s knowledge and consent, these children may then be spuriously labelled as ‘on the spectrum’, or ‘special needs’ and the school can claim additional funding for undertaking what we are told are ‘special measures’ and teaching approaches ostensibly to help our precious little one to at least achieve the median expected for the class – that is, to at least fit in the ‘box’ that ‘is’ that class or year level.

The quantum of that median too has slipped during the last two decades, such that by the time they get to grade 7 or the first year of high school they are often only at the level that was grade 5 when I was in school, which was at the level of grade 4 when my father was in school. This gradual ‘slip’ continues, such that often the first 6 months or year of undergraduate education at university is now spent catching students up with the maths, science or other knowledge needed to undertake the rest of the degree in which they have enrolled.

For this reason, we now see occupations requiring a Masters degree before graduates can undertake employment in their chosen field in the same way prior generations did. For example, take nursing where in my day I was taught to debride and suture wounds and did training placements in cardiac intensive care, in the operating theatre, and with the psychiatric crisis, assessment and treatment team (CATT).

As a first year nurse about 6 months into my training one of the first procedures I was required to undertake solo was removal of an underwater sealed drainage (UWSD) unit and suturing up of the resulting internal and external layers of a patient whose left lower lung lobectomy surgery I had observed first-hand in theatre several days before. Yet now, much of this in some health services is outside the scope of practice (and training) of even a graduate nurse – and a Masters degree may be required in order to undertake many of the more advanced nursing skills (working in surgical or intensive care nursing and so on).

Postgraduate qualifications can be necessary for essentially anything beyond taking and evaluating observations of patients and pushing the drugs trolley around. I believe this, in part, is why the attrition rate in nursing has increased from about 4-5% in the mid-late 1990s when I was doing my training, to as much as 25% today. It is also why so many nurses leave the profession in the first 5 years of their careers.

Not only are they underpaid and undervalued, they spend 3-4 years and many thousands of dollars or pounds to become qualified at something that they mostly can’t do, and watch as ‘cheap’ (Band 2 and 3) staff with as little qualification as a 6-week occupational training certificate spend more time with patients than they do.

Effectively, what we now see is that more people with lower-quality qualifications are being employed in frontline jobs that pay at the minimum-wage baseline – keeping them poor, or at least at the breadline. Meanwhile, those with the ability to go to university are being financially drained out of the middle class at a cost hundreds of percent higher than their parents paid. And when they graduate, they find that the university and public systems demand they go back, accept yet more expense and become even poorer to gain yet more university qualifications before they can do what they thought their first degree would allow them to do – but actually never taught them to do.

The net result is the middle class are now saddled with huge debt for qualifications to get a job that now also pays not much above the breadline. They are becoming the almost bankrupt lower class and often end up worse off when compared to their lower class, less qualified colleagues.

Two or three decades ago, when you got to university you were considered to mostly be an adult. You would begin to be treated as the adult you were expected to be once you graduated and took responsibility for your work in your chosen profession. The academic staff didn’t hold your hand, and most generally didn’t let you make excuses for why you failed to turn in your work or attend your classes, tutorials, exams or placements.

This meant that a key part of being at university was the need to learn skills that were necessary across any profession – time management, punctuality, diligence, attention to detail, domain knowledge and, that most basic of qualities, intestinal fortitude. For those that struggled with these things, you very quickly learned to take a teaspoon of concrete and harden up! You hopefully learned responsibility and maturity. Whether you liked it or not.

Sadly, most of what we see today is students who fail to attend placement time and time again, who are repeatedly late with their coursework, and who are constantly looking for ways to circumvent the entire need to apply themselves to their studies and just cut to the ‘pass’ mark.

Nowadays universities place other ideals above merit and enrol students based on DEI/EDI quotas and whatever flavour of the month ideology is currently doing the rounds. This has seen some students enrolled in university degree courses who do not meet the entrance requirements or academic rankings that were previously fixed for that course.

In some cases these students may not have sufficient maths, science or some other prerequisite, or may have barely passed their high school exams (or failed one or two subjects). Yet they are enrolled anyway – sometimes under a variety of labels that some are now coming to realise operate solely to brush aside their shortcomings – and it is the lecturers for that course who are blamed when these students struggle to complete coursework and pass.

Some lecturers have described how it is harder to fail students than it is to develop the improbable perpetual motion machine – whether the difficulty is self-imposed [1] or results from pressure from the management of the university itself [2]. Even I, in a previous academic post, had been called to account for why I failed a student who was clearly never going to pass no matter how much assistance they were given.

I have also seen students whose work I was able to prove was more than 50% plagiarised get their academic infraction overlooked and awarded a passing grade by university administrators. When the academics are no longer in control of what is or isn’t a passing grade, the students are allowed to get dumber and the entire education system is broken.

Information Technology

I came to academia after a successful 16-18 year career in Information Technology (IT). A career that flew me all over the world, saw me work for a range of household brands, some you would instantly know and others you might not realise you know until I jog your memory by telling you why you know them. A career that in what was my home market at the time saw me in many ways with my face pressed firmly against that invisible barrier known as the glass ceiling.
And no, not that glass ceiling that feminists espoused as keeping women out of the board room… the glass ceiling wherein you reach the top of your game and in the employer or employment market where you work, it seems like there is simply nowhere higher up to go.

In my last IT roles I had reached the position of Infrastructure and Solution Architect, and in some ways was also acting as the Enterprise Architect, devising IT strategy and recommending not just individual solutions but future IT direction. Over my nearly two decades in IT I had watched as IT managers and corporations flip-flopped on a semi-regular basis between outsourcing – spending sometimes six or seven figures to move their IT servers, software, services and data assets outside the company premises to be hosted by a datacentre hosting company or telecommunications provider – and insourcing, where they spent a similar six- or seven-figure amount to re-enliven their on-premises server rooms and move all of those hardware and digital assets back under the corporation’s own roof.

A key set of skills I developed was the ability to size up and design these migrations – to develop the strategy that got them done in the most expedient but safest way possible. A key aspect of my approach, often lacking in the methodologies of the big IT brands of the day in my home market (EDS, HP, Fujitsu, IBM, Datacom, and so on), was that I kept the lights on and the computers running even as my virtual hand was moving digital assets to or from some large datacentre on the other side of the city, country, continent or planet. Sadly, there are many examples both then and now that suggest this art has been, or is being, lost.

I am sure some IBM or HP chap would love to retcon history and, by inserting their own name, tell it differently, but I was one of the first people in the APAC region of the Southern Hemisphere to undertake VMware training – back in the days when VMware was some unknown, eclectic, Silicon Valley-ish West Coast of America start-up. There was no vMotion. No shared storage across multiple GSX servers. And no ESX anything.

I made a name for myself using VMware technology and Cisco networking everywhere from banks and government server rooms, to a small Windows Mobile-based software development house in Auckland, New Zealand.

Without even a whiff of recognition, some of the code and tweaks that I and others in an online VMware GSX internet relay chat (IRC) channel developed – to snapshot both memory and storage state and store delta changes in a temp file, so that we could hot-backup GSX virtual machines onto other servers – somehow became the basis for ideas like host clustering, shared storage and vMotion that VMware techs incorporated into ESX 2.5 and 3.0… And the rest, as they say, is history.
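The core trick can be illustrated in miniature. The sketch below is a deliberately simplified, generic illustration of block-level delta capture – snapshot a baseline, then record only the blocks that have changed since – and is emphatically not VMware’s actual implementation or on-disk format:

```python
BLOCK_SIZE = 4096  # illustrative granularity; real hypervisors choose their own


def delta_blocks(baseline: bytes, current: bytes, block_size: int = BLOCK_SIZE) -> dict:
    """Return {block_index: new_bytes} for each block that differs from the baseline."""
    changed = {}
    length = max(len(baseline), len(current))
    for offset in range(0, length, block_size):
        old = baseline[offset:offset + block_size]
        new = current[offset:offset + block_size]
        if old != new:
            changed[offset // block_size] = new
    return changed


def apply_delta(baseline: bytes, changed: dict, block_size: int = BLOCK_SIZE) -> bytes:
    """Reconstruct the current state from the baseline plus the stored delta."""
    blocks = [baseline[i:i + block_size] for i in range(0, len(baseline), block_size)]
    for index, data in sorted(changed.items()):
        while index >= len(blocks):
            blocks.append(b"")  # the delta may extend past the end of the baseline
        blocks[index] = data
    return b"".join(blocks)
```

Shipping only the output of `delta_blocks()` to a standby server, rather than the whole disk and memory image, is what made hot-backups of running machines practical over the networks of the day.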

Outsourcing was, and I argue remains, a perennial con. The offer of cheap IT services provided somewhere you often can’t go by people you will never meet. It sounds cheap at the start because it shifts more of the company budget from capital expenditure (CapEx) to operational expenditure (OpEx).
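The arithmetic behind the con is easy to sketch. With hypothetical figures (every number below is invented purely for illustration), the point at which cumulative hosting fees overtake buying and running your own kit is a simple break-even calculation:

```python
def breakeven_months(capex: float, onprem_monthly: float, hosted_monthly: float):
    """Months until cumulative hosted OpEx exceeds CapEx plus on-premises running costs.

    Returns None when hosting is genuinely cheaper per month (no break-even point).
    """
    if hosted_monthly <= onprem_monthly:
        return None
    return capex / (hosted_monthly - onprem_monthly)


# Hypothetical example: £120,000 up front for your own servers plus £5,000/month to
# run them, versus £9,000/month for 'cheap' hosting with no upfront spend.
months = breakeven_months(capex=120_000, onprem_monthly=5_000, hosted_monthly=9_000)
print(months)  # → 30.0 — after 2.5 years the 'cheap' option costs more, and keeps costing
```

The pitch only ever quotes the left-hand side of that comparison: no CapEx, low first-year OpEx. The rising right-hand side arrives after the contract is signed.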

Organisations no longer needed their own skilled IT propeller heads (those strange guys with pocket protectors or thick glasses, often stored in the basement or some other dark room), and you could decommission that server room once and for all and turn it into a staff cafeteria, office gym, or file storage for all that paper you were supposedly no longer going to be using. However, those of us not totally brainwashed by, or financially dependent on, ‘big tech’ would wait and watch as the honeymoon period of this marriage of convenience soured and the costs – per hosted computer, per gigaflop, per gigabyte or per some other metric – slowly, or in some cases very rapidly [3], rose.

The attitude in datacentre or hosting companies was that once they had you in their datacentre, you would pay, because it was cheaper to pay up and maintain access to your organisation’s digital assets than to do anything at all about it – something our American brothers and sisters would call racketeering.

You were in effect trapped, especially when the outsourcer often absorbed some of the better experienced or qualified of your staff, and you now only had a skeleton crew of the least qualified helpdesk people whose primary role was simply to run through a script that ascertained whether it was an ‘us’ (meaning an issue with the user, computer, on-premises or ISP network) or ‘them’ (meaning a problem with the servers, storage or network at the hosting centre) problem. Us problems fell into a queue that a pimply-faced youth would come around to solve.

Them problems fell into a much larger queue that often required repeat calls, escalations, change requests and service fees to resolve. It was in outsourcing that the IT brain drain started and, as we will discuss in a moment, that brain drain has finally and inexorably reached its peak.

Another important aspect of that era is that companies and institutions had a range of qualified IT technicians and engineers, and usually a team of helpdesk cadets who were often kinesthetically learning their way into systems administration and engineering roles. The IT department of a university was a fertile breeding ground for IT techs and tech-aspiring people to develop skills. As such, many of today’s technologies, especially those that you take for granted or that are household buzzwords, actually had their genesis somewhere in the mind of a student or staff member at a university IT lab – everything from the internet and email to quantum computers [4] and e-readers [5].

Fast forward to today and consider the landscape before you. The vast majority of companies and institutions have not only been conned into moving everything into the ‘cloud’ – that vacuous buzzword for outsourcing coined in the early 2000s to get around the fact that some old-school IT managers distrusted outsourcing – they have also ensured that organisations end up with no truly skilled IT staff, and thus suffer the severest effects of the brain drain, and that the IT industry outside Microsoft, Google, Apple, Amazon and similar datacentre ‘cloud’ operators is itself killed off. In this way, even if your organisation ‘woke up’ and realised what is happening, it can do nothing about it.

Leaving aside that Microsoft, Google, Amazon, VMware and others are now moving all their software and services to cloud-only offerings – so that in some cases you are no longer able to purchase software and return to on-premises IT – if you can’t get people like me who can build you a robust IT infrastructure and migrate you back to on-premises systems, you are effectively stuck: giving up your organisation’s data and privacy to the insidious and ever-increasing demands of these globalist tech companies and their digital voyeurism and espionage.

You are letting companies like Microsoft, Google and Amazon use your organisation’s data and staff as inputs to feed their capricious and insatiable digital surveillance and AI training appetites. You are giving them your data to trawl through, package and repackage and sell to whomever they choose with no recourse to you. And many of you (or your employer) are paying them to do it. Now, more than ever, the question we should be asking is…

Do you or your staff actually know what you’re giving up?

The juxtaposition between academia and information technology

Several things have been happening almost in lockstep during the last few years.

First, we have seen companies like Microsoft and now VMware move away from their traditional software licensing and services business into being cloud service providers, constantly nudging organisations and individuals to store and do everything in ‘our cloud’.

While it was the traditional model that made Gates, Ballmer et al. their billions, they no longer want you to use on-premises software because they see that model as a one-and-done approach – you buy a licence from them and they don’t see you again until that licence expires or the next major version is released.

It also means that your digital assets – that is, your files and personal information – can remain locked away from them, inaccessible as a source of either knowledge about you or a secondary revenue stream. After all, total ownership of your data and digital self is the ultimate goal.

Second, as companies buy into these cloud services, especially Microsoft’s, we have seen a steady decline in in-house IT systems administration and engineering jobs.

While some get redeployed to undertake the ‘cloud migration’ (i.e. to move the organisation’s systems and digital assets onto the cloud service provider’s platform) and others get absorbed into the cloud provider’s staff, many either take lesser jobs in order to stay in the industry (such as becoming helpdesk managers, or deployment engineers who set up and deliver new computers around the organisation) or move sideways and reskill into areas like software and app development (where, we are continually told, a skills shortage exists).

However, a good many with the type of practical systems engineering and architecture skills necessary to design and build a company’s own on-premises IT infrastructure – skills like those I possess – have no choice but to leave IT altogether.

That means the skills necessary to create and keep a company’s – and its staff’s and customers’ – data out of these cloud environments are being lost, and I believe this is happening by intention. They don’t want us to be able to do anything that might remain unwitnessed by their omniscient gaze. Remember, total ownership of your data and digital self is the ultimate goal.

Third, companies like Microsoft, Google and Amazon have ingratiated and embedded their products into university curriculums. They do this in two ways. First, they offer the university free or greatly reduced cost access to their software offerings.

The idea initially was to get students hooked on their software or services such that when the student completed their degree and went out in the world, the fact that they were already very familiar with that company’s software applications would mean their employer would buy that company’s software. Second, they insinuated themselves onto university stakeholder committees and insisted that rather than teaching generalist IT skills, teaching skills specific to their particular brand of platform or services was what employers wanted.

These issues didn’t change as on-premises and end-user software became cloud offerings – it simply meant that cloud providers now had a potentially far more lucrative new revenue stream afforded by the ability for all-pervasive access to what you were doing, thinking, saying and even where you were when you did it.

But still, this visibility only covered what you did while you were logged in to their cloud, while you used their apps, and while you were using the university or employer’s cloud-connected systems. This meant the cloud service providers like Microsoft still didn’t get to capture everything. Again, total ownership of your data and digital self is the ultimate goal.

Fourth, during this entire period we have seen how insecure and exposed these cloud platforms really are. In many cases they give the appearance of being far easier for black-hat hackers to access than many of the on-premises platforms that preceded them.

Using Microsoft as an example, here we have the organisation that designed the software and set the gold standard for how to use it making IT decisions and mistakes that many of us who designed and built on-premises IT would never make – from (i) using the same security key across their entire hosting environment, enabling complete access to everything from tens of thousands of cloud-hosted organisations’ email and office documents to full administrative rights on their cloud SharePoint, OneDrive and Teams platforms, to (ii) blatantly negligent security practices that allowed Chinese hackers to access and spy on US Government and banking Azure accounts, with the flaw remaining on production systems and Microsoft’s customers remaining unaware of the exposure of their private data long after Microsoft had been informed of the issue. I am sorry to repeat myself again, but total ownership of your data and digital self is the ultimate goal.

Fifth and finally, each time one of these cloud platforms is hacked, the cloud provider in essence blames us, the users, or makes it our fault by making how we access and interact with their systems more complex. A multitude of technologies has been created to make accessing your user account and data more difficult – sorry, more ‘secure’: password complexity and expiration rules, challenge/response cards, smart cards, RSA tokens, and two-factor authentication (2FA) using one or more of the preceding, either alone or in combination with codes sent via email or text message.

Yet, as each subsequently fell victim to the hackers’ abilities, excuses were made that required us – the users – to do something different. Emailed 2FA codes became ‘we need your mobile phone number so that we can prove it is you at your computer by texting the code to the smartphone you always have with you’. When that became insufficient, it became ‘we need you to install an Authenticator App on your smartphone or device’ – primarily, we were told, because your phone might be compromised or you might be forwarding the text message on to someone else to access your user account on your behalf.

The contradiction in ‘your smartphone might be compromised for receiving text messages’ but ‘our Authenticator App installed on the same smartphone will be secure’ should be obvious to everyone – it was a complete deception, yet very few questioned it. And very few looked at what the authenticator apps from Microsoft and Google were really doing. On the surface they were simply yet another way of generating codes, working much as the RSA tokens had operated (and failed).
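For what it’s worth, the code-generation half of an authenticator app is neither secret nor clever – it is the openly published TOTP algorithm (RFC 6238, built on RFC 4226’s HOTP). A minimal Python sketch, checked against the RFC’s own test vector:

```python
import hmac
import struct


def totp(secret: bytes, timestamp: int, step: int = 30, digits: int = 6) -> str:
    """Derive a time-based one-time password (RFC 6238 / RFC 4226)."""
    counter = timestamp // step                      # which 30-second window we are in
    msg = struct.pack(">Q", counter)                 # 8-byte big-endian counter
    digest = hmac.new(secret, msg, "sha1").digest()  # HMAC-SHA1 of the counter
    offset = digest[-1] & 0x0F                       # dynamic truncation (RFC 4226 §5.3)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)


# RFC 6238 test vector: shared secret "12345678901234567890", Unix time 59
print(totp(b"12345678901234567890", 59))  # → 287082
```

The surveillance concern, then, is not the code generation itself – which a phone can do entirely offline, as above – but everything else a vendor’s app may be doing around it.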

However, as I proved during a demonstration at my previous university, what they actually did was allow your employer and Microsoft to surreptitiously collect information, including when and where you accessed the Authenticator App. I demonstrated that when I was at a conference in another city, the University’s Microsoft Azure Dynamics platform recorded my location (even with location services completely disabled on my iPhone) every time I synced my email or opened the Authenticator App. They were spying on us, and even the supposedly privacy-conscious Apple was complicit in allowing it to happen.

But there is an even more subtle deceit going on here that many of the, quite frankly, insipid CEOs, CIOs, IT managers and helpdesk staff of today are either not trained or experienced enough to see, are corruptly paid to ignore, or simply don’t care about in the ever-present push to make IT costs disappear off the bottom of the balance sheet. That is, the progressive encroachment on user privacy.

While some might say that if it’s your organisation or employer’s machine, it’s their call not yours, since 2020 there has been a decidedly Orwellian encroachment into the personal and private lives of students, academics and even employees. Microsoft added the ability to track and monitor users via Microsoft Dynamics (that platform that was tracking our locations before) and tools like Microsoft Viva Insights and work tracking software built into the Microsoft Office suite of applications. Employers and Microsoft themselves can even track and report on everything you do inside Microsoft Teams. This tracking and reporting across the Microsoft cloud-hosted suite of applications even includes Microsoft-calculated ‘compliance scores’ and a rating of your ‘productivity’.

The latest encroachment on our privacy is that, seemingly in response to the run of ‘Microsoft’s cloud has been hacked yet again’ reports in the tech media during the last few months, Microsoft is now telling universities and organisations to force all staff to install Microsoft Intune and other Microsoft ‘tools’ on all systems – because it will make them more secure.

Nobody pauses for a moment to think: but haven’t we heard that too many times before? No, they simply send emails around telling us that, even on our privately owned devices, we will have to install these ‘tools’ if we want to continue accessing the services provided by Microsoft to our organisation, and hence, by our organisation to us.

Does anyone stop to wonder, as we should have with the authenticator apps and other changes, what these ‘tools’ are actually doing? What information they might be surreptitiously collecting and sharing with employers, organisations and Microsoft? Do they care that under the guise of improving security these tools will allow your employer complete control over the security settings, and potentially even which apps you can/cannot install and use, on what is ostensibly your personally owned device?

Do they care that these always-on ‘security’ services, when coupled to apps like Teams and the data already collected, mean that Microsoft has essentially total access to everything you do, say, type or view on your device? That all current Microsoft OS and Office apps intentionally operate as keyloggers, that this keylogging is enabled by default, and that both your employer and Microsoft can see everything you type?

Nope. Like sheep, they simply line up at the slaughterhouse gate and do so entirely without question.

Well, I for one have refused. I do not have an employer-provided device; I own both my laptop and smartphone, and as such I have created an empty virtual machine on my personal laptop in which to connect to my university staff account, Teams and other university-provided systems.

All of my research, web browsing and personal email traffic will occur only outside that virtual machine, in the host operating system – so no matter what, they will get nothing salient from me, because for my own privacy and security 99% of my digital activity will not occur in that virtual machine.

It also means that for the first time in two decades, I have no access to work email on my smartphone because I refuse to allow Microsoft and the university to install and run apps and change the settings on my £1,300 personally owned device. That urgent email about such-and-such when I am away from my office/desk and couldn’t see the otherwise empty ‘work virtual machine’? I never saw it.

And to make matters worse there are, as I indicated above, too few left who could build the IT to take us back out of the cloud. The brain drain has come full circle and we are fast becoming trapped in an Orwellian information dystopia. MiniTrue operates from these cloud providers as much as from your local parliamentary office.

Oh well… I hope that important email wasn’t a meme about how Bill Gates is fiddling while the whole planet burns.

1. https://www.nursingtimes.net/opinion/are-we-failing-to-fail-students-17-03-2023/

2. https://www.abc.net.au/news/2015-04-20/weak-nursing-students-endangering-public-safety/6405996 and https://go8.edu.au/unis-cannot-afford-to-fail-them

3. Sadly, Revera on the North Shore of Auckland during the period from 2005 to at least 2010, I am looking at you. So many clients complained to me that they had moved their IT systems or digital assets to Revera’s datacentre on the promise of cheap and well-managed service – only to hit a literally 90°, straight-up price hike after as little as the first 6 months of their 3-, 5- or more-year service agreement. During my IT career I only ever worked on the migration of one client INTO Revera… but I worked on projects to exfiltrate a double-digit number FROM Revera.
4. University of Southern California’s Lockheed Martin Quantum Computing Centre claimed to have successfully made the first operational quantum computer in 2011.
5. Jacobson and Comiskey developed the electrophoretic display behind the e-reader while working in a university media lab, where Comiskey was an undergraduate maths student.

Source: Substack


PRINCIPIA SCIENTIFIC INTERNATIONAL, legally registered in the UK as a company incorporated for charitable purposes. Head Office: 27 Old Gloucester Street, London WC1N 3AX.


Comments (11)

  • Ken Hughes

    When I read Mechanical Engineering, we had 36 contact hours per week. That’s pretty much a full working week, plus lab report write ups and set “homework”. Considering the nature of lectures, the constant “taking on board” of new information, that’s a mentally pretty tough workload. I now hear of students being awarded “degrees”, “earned” on the basis of one day a week plus an evening class. How the hell can that be anywhere near the same standard? There were thirty six in the final year. Six failed.
    Some degrees today are nothing but untested awards written on rolls of toilet paper.


  • VOWG

    The less you know the less you understand and it makes you dumb. The inability to think critically and rationally is a serious problem.


    • Jerry Krause

      Hi VOWG,

I disagree. The problem is the failure to think practically about what one can see if one tries to look.

      Have a good day

      Ha


  • Lorraine

    It seems there’s an overall lack of common sense in the mix as well as a fear of hard mental and physical work. Was any great civilization in history built from nothing by intellectuals or academics? Civilization must have its roots in a visionary idea but it is built by the working class with or without formal education.


    • Jerry Krause

      Hi VOWG and others,

Lorraine is correct: “it is built by the working class with or without formal education.” To survive, the working class have to be practical.

      Have a good day


      • Jerry Krause

        Which came first, the working class or the intellectual philosophers?


    • Howdy

      There need be no ‘class’, and this is simply a social construct.

      Intellectuals, in the main seem to talk about whatever makes them popular. Society doesn’t need them, unlike those with problem solving capability.


  • Alan

    Education is designed to enable people to do a job and no more. Obeying rules and not thinking are the main objectives. It has always been this way.


  • Jakie

    Weren’t John Dewey & fellow Progressives a century ago intent on lowering school standards with the Goal of educating kids just enough to work in factories??? Mind numbing boring repetitive factory work.

    Eliminating critical thinking being job one.

