Wikiversity's “IT Fundamentals”
After the disappointing “Practical C” published by O'Reilly in 2005 and the realization that the Technical University roughly an hour away from me puts no emphasis on IT at all (no shit, most of their courses target communication, media, and economics students, while the few electrical engineering courses jump straight from mechanical engineering (!) to a tiny bit of Java in practice!), I pinned my last hope on Wikiversity to close my remaining knowledge gaps.
By Zeus, you've got to be kidding me!
Wikiversity offers free access to CompTIA IT Fundamentals, which is part of a certification program of the same name. As a “college-level introductory computer support course”, the stakes obviously aren't particularly high, given that American college education is, on average, roughly equivalent to Germany's “Sekundarstufe II” (yes, our secondary education is – or rather used to be – tougher). Formal education aside, I could already tell I would get bored quickly, being familiar with many of the topics.
Still, just three pages in, I encountered my first issue. The section “Introduction” tasks the reader with considering the impact of information technology in their current “student and work environment”. For an introductory course, I have no idea what kind of fancy answers the authors are expecting, as the questions themselves range from the very basics to... “best practices” in computer security:
- What information systems do you use? What data is processed? What information is provided?
The first question is already a tricky one, as the definition of “information system” depends on context. According to Wikipedia, the term is seldom used outside of business environments, where it usually refers to the technical side of the business itself rather than being synonymous with “computer system”.
The irony: while the introduction implies a typical work environment, a few pages later the course specifies that it uses the latter definition. Sloppy language surely is quite the indicator of a lecture's quality.
- What infrastructure (devices and networks) do you use?
Two workstations, two laptops, one iPad, one iPhone, all connected to a home network via Wi-Fi (one workstation via Ethernet cable because it lacks wireless networking). Next!
- What applications and software do you use? Which operating systems are you familiar with (PC, mobile)?
Already differentiating between applications and software? That's quite bold, considering they are... the same thing. At least PC and mobile operating systems aren't being mixed together, so in that case... Windows (95, 98, 2000, XP, Vista, 7, 8, 8.1, 10), iOS (6 to 14), Linux... Hm, Debian branch, Ubuntu branch, Arch branch, antiX, exotic ones like Puppy–
Okay, I guess I'm already overqualified for this certificate.
- What programming experience do you have?
I repeatedly tried to get a “Hello World” in C running, only for gcc to print a binary. I'm still trying to figure out why this issue only affects Arch-flavored distributions, as it occurs on both EndeavourOS and the rather pathetic Archcraft. I can write a few scripts I never use, though.
- What databases have you used?
None, and I'd rather keep it that way, now that they're all designed to be permanently online. I own a guide to the East German “REDABAS”, which is essentially 80's dBASE under a different name and color scheme. Though I can't test it myself, it's simply much easier to understand because it lacks all the internet stuff.
- How do you apply computer security best practices? Do you have backups for disaster recovery?
Yeah, this clearly is targeting IT professionals, although they should already know all of the covered topics, especially if they're expected to be familiar with “best practices”, which, surprisingly, get covered as well. What even is the point of this course, other than to hand out a dumb certificate?!
Rhetorical question, obviously. Task number 2 asks the reader to complete a tutorial on Owlcation¹, involving conversions between decimal, binary, and hexadecimal values. The course itself doesn't cover the theoretical side of each number system but merely redirects to a random website that provides nothing more than a very short practical guide to each, plus a few exercises.
[¹]: Prior to this course, I had never heard of Owlcation, so I gave it a quick search. While it brands itself as a website by academics about academia, I came across multiple low-quality psychology articles and trivial how-tos, including “How to Write A Killer Movie or Music Review”. Given that not even Wikipedia is aware of Owlcation's existence, it's a highly dubious source.
Just below the tutorial task, IT Fundamentals lists one calculator for each operating system, implying that one of those should be used for the task.
While Windows, macOS, and “mobile” users – iOS and Android apparently are the same smartphone OS, whose users are free to download “any free programmer or developer calculator” – are all recommended GUI calculators, Linux users are advised to use the old-school UNIX command bc. As if there weren't any GUI converters available for Linux at all.
- Windows: Review Adelphi: Windows Calculator (https://home.adelphi.edu/~pe16132/csc170/hardware/binary/Binary2.htm). Use the Windows Calculator to convert between decimal, binary, and hexadecimal.
- macOS: Review OSX Daily: Access the Scientific Calculator & Programmer Calculator in Mac OS X (https://osxdaily.com/2015/06/14/access-scientific-programmer-calculator-mac-os-x/). Use the macOS Calculator to convert between decimal, binary, and hexadecimal.
- Linux: Review Linux Journal: Fancy Tricks for Changing Numeric Base (https://www.linuxjournal.com/content/fancy-tricks-changing-numeric-base). Use bc to convert between decimal, binary, and hexadecimal.
- Mobile: Download a free programmer or developer calculator from the app store and use it to convert between decimal, binary, and hexadecimal.
No reason is given for what the point of this task is, or why it comes second in the course. Just use a calculator so you know how to use a calculator program, even though the computer itself is one large calculator?
Whatever happened to IT education, this isn't even worth the “college level” label. I know American education is poor, but come on, this has “don't ask any questions, just do what businesses demand of you” written all over it!
The third task deals with getting familiar with ASCII. Read a wiki page, then use a converter to convert ASCII to hexadecimal. Why? No idea, just do it.
Unfortunately, I can't, because I'm a Linux user and Rapid Tables just doesn't work on LibreWolf. Checking the site on my iPad, I can see why LibreWolf refuses to let it “convert” values: canvas elements, which are often used for fingerprinting and tracking, are blocked by my browser.
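Not that a tracking-happy web converter is needed in the first place; any POSIX system can do ASCII-to-hex locally. A sketch using od and printf (octal escapes used for portability):

```shell
# ASCII to hex: -A n drops the address column, -t x1 prints
# each byte as a two-digit hex pair ("Hi" is 0x48 0x69).
printf 'Hi' | od -A n -t x1

# Hex back to ASCII: POSIX printf only guarantees octal escapes,
# so 0x48 becomes \110 and 0x69 becomes \151.
printf '\110\151\n'
```

No canvas, no fingerprinting, and it works in any terminal.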
It's fascinating that this course excludes data privacy and downright expects users to accept tracking technology. Who's gonna tell them that this possibly violates European data laws?
Right, okay, topic four. Unicode.
- Linux: Run the gedit application and use the GNOME Character Map utility to select special characters and paste them into gedit.
Yeah, no thanks, I loathe GNOME and so do many Linux users. It's ugly and too heavy for both of my laptops, which run a simple window manager. Besides, I can't be arsed to do more copy-pasting; it's already tiring enough to copy and paste a bunch of useless tasks.
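For what it's worth, no character-map GUI is needed either: in UTF-8 a code point is just a byte sequence, and printf can emit it directly. A sketch for U+263A (WHITE SMILING FACE), assuming a UTF-8 locale; the octal escapes below are its UTF-8 encoding:

```shell
# U+263A encodes to the UTF-8 bytes E2 98 BA; emit them via
# POSIX printf's octal escapes.
printf '\342\230\272\n'

# Round-trip check: dump the bytes back out as hex.
printf '\342\230\272' | od -A n -t x1
```

In bash 4.2 or newer, `printf '\u263A\n'` achieves the same without hand-encoding the bytes.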
- Review World Intellectual Property Organization: What is Intellectual Property? (https://www.wipo.int/publications/en/details.jsp?id=4528). Consider the impact of intellectual property rights in your current student or work environment. What trademarks and patents are you familiar with? What copyrighted resources do you regularly access? Do you access these resources legally?
Sci-Hub and Torrent go brrrrr
- Review Wikipedia: Data-informed decision-making. Consider the impact of data collection, data analysis, and data reporting in your current student or work environment. Would you describe the impact as “data based decision making” or “data-informed decision making”? What is the difference between these two approaches from your perspective?
By now, I consider all of this part of the privacy-hostile mindset that caused education to decline in the first place. Miss me with that capitalist micromanagement bullshit and get on with the summary:
Overview
- CompTIA IT Fundamentals certification covers IT concepts and terminology, infrastructure, applications and software, software development, database fundamentals, and security.[2]
- IT concepts and terminology include data representation, data processing, and information value.[3]
- Infrastructure includes devices, components, networking, and Internet services.[4]
- Applications and software include operating systems, applications, and uses.[5]
- Software development includes programming languages and program structure.[6]
- Database fundamentals include database concepts and use.[7]
- Security includes security concepts and best practices, as well as business continuity through fault tolerance and disaster recovery.[8]
It reads like a guide to joining a cult. Yeah, I'm outta here.
TL;DR
Any computer-centered course that doesn't start with “bits and bytes” is almost always a bad course. Not that it's surprising at this point: bits and bytes receive just one single mention each in “Objectives and Skills”, which also includes a task in which students practice pairing a Bluetooth device with a computer or smartphone. (Honestly, if you're a college student and DON'T know how to use Bluetooth, you're either Amish or plain [REDACTED DUE TO SHEER LENGTH OF THE INSULT].)
While it was shocking to learn that the Technical University in my area offers a course called “electrical engineering and information technology”, yet bombards its students with mechanical (!) engineering and loads of irrelevant formulas in the first semester alone (no surprise that former students report a high dropout rate), Wikiversity is just as chaotic and visibly neglected.
Back in 2012, computer scientist and former Wikiversity contributor Debora Weber-Wulff criticized the lack of standards at Wikipedia's sister project, alleging that “anyone can claim to know and teach anything about computer science” (German source). “IT Fundamentals” is one such course, seemingly meant only to funnel students toward a worthless credential, with the lowest price tier starting at a whopping $134. CompTIA might claim to be a “non-profit trade association”, yet many Trustpilot users highlight the business's shady practices, including false advertising, questionable test ratings, poor communication, and a general “no refunds” policy for vouchers.
I guess I'll stick to my trial-and-error approach and kill my own systems a few more times to figure things out, as most forums... okay, let's not talk about that.