Should how to use computers effectively be taught in mainstream education?
Since computers and computer-like devices are prevalent in most modern societies, shouldn't we be teaching people how to use them effectively and purposefully, rather than saying "oh, they'll pick it up" or "they grew up with it, they'll understand it just fine"? Neither of those is actually the case.
What does Tildes think of a mandatory computing class in the early grades, and/or several years of classes to master the concepts, the way the U.S. does with history, English literature, math, and the sciences?
Should computing be required as an academic subject?
Or is it fine that many people can't do very simple tasks on computers?
Is it fine that they do not understand basic computing concepts? e.g. keyboard shortcuts, searching, folder management
I think the answer is "yes" and very few people, certainly not many who would be drawn to this website, would disagree.
Maybe this should be wrapped in with internet safety and use some fearmongering over elsagate-type shit for a positive outcome. In fact, that seems to be exactly what's happening in a lot of schools which are already making strides in this.
Yeah, with how ubiquitous the Internet is, I would definitely advocate for a computer basics class. Not just how to use common office software and email and stuff, but safety and privacy topics as well.
Computer usage and basic diagnostics/troubleshooting? Yes.
Programming? No.
The former is, and has been, essential in daily life for over 20 years now.
The latter is not essential in most people's lives, and would be quickly forgotten by a majority of those who take it.
Do not make the mistake of conflating the two, like many technology evangelists do. They are not the same thing.
Edit: I also want to say that there are plenty of classes that were cut during the 2008 recession that should come back before making computers mandatory for everyone. These include:
Home Ec / cooking
Personal finance
Automotive classes
Job training courses for high school seniors (such as welding and, yes, computer repair and programming)
A class on how to use Google to solve most of your own IT-related problems would be welcomed by most IT departments, I reckon. Each week, break something different on each computer in the lab (Word table formatting, an Excel #REF! error, a desktop rotated 90 degrees, etc.) and grade the students on how they went about solving it.
That could be trickier at schools with heavy web filtering; at mine, any site with a forum was blocked.
Can't teachers allow access for certain classes? I teach at a Korean high school and my computer lab has granular controls: teachers can allow access to site categories based on need. For example, YouTube, Twitch, etc. I enable once 80% of the students have finished the day's assignments. It's a bit of a learning curve for most of the other teachers, but it's a DIY setup, not a service.
At least from my experience in U.S. schools, there is no on/off switch; the IT person sets the filter and there really isn't much else to be done.
Where I'm at now, though, there's a whole different problem: government filtering. There is a nice little work-around (not a VPN, just blocking connections to filtered websites), but I don't think I've gotten any students to actually use it. I did put it on the computers in the lab, though, so that's a small victory I suppose.
Day one: how to use a proxy to bypass all that bullshit :p
I don't disagree about the programming, but high school students also learn about differential equations. I think both are ridiculous, but maybe a middle ground would be learning about the subject without going into the detail required to actually become a programmer/mathematician.
Maybe not make Java programming a required class, but spending a semester or so using Scratch and JS would be helpful to most, I think, if only so they can learn logic statements.
We spent like a week on BASIC in 5th grade. There's nothing wrong with spending a week (or an elective) on having people make "Would You Rather" games in JS to learn logic and some code basics, and to introduce them to something they may want to learn more about and make a career out of.
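For what it's worth, a minimal sketch of that kind of exercise (in TypeScript/JS; the question text and function name are made up for illustration, not from any actual curriculum) could be as small as this:

```typescript
// A toy "Would You Rather" game: the point is that students write the
// branching logic themselves. The question and answers are placeholders.
type Choice = "A" | "B";

function wouldYouRather(choice: Choice): string {
  if (choice === "A") {
    return "You picked the horse-sized duck. Good luck.";
  }
  return "A hundred duck-sized horses it is.";
}

console.log(wouldYouRather("A"));
console.log(wouldYouRather("B"));
```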
But HOME EC. Man. That would have been useful.
Yes, but not if the classes are like the one I took in 9th grade.
I know that varying levels of skill exist, so to a certain extent teaching to the lowest common denominator must happen, but things need to be more flexible than the class I took. We had a horrible textbook that took us step-by-step through things like creating a Word or Excel document. "Merge these three cells. Type [Header text] inside them. Make the text bold."
As someone who'd been using computers for most of my life (not even programming, just using programs like Word and Photoshop and surfing the web), it was infuriating. And to be honest, it put me off trying the higher-level courses that would have actually introduced me to programming. This mindless work gave me an utter hatred for computer classes, which is probably why I discovered I enjoyed programming way too late to pursue it in college... so I might be a little bitter, haha.
For me, I think it is less important to teach kids (or even adults) how to use a specific program than it is to teach them how to explore new ones. I learned a ton by just opening up Photoshop, clicking around in menus to see what things did, and searching online to see what I could do. I started to understand the patterns of how menus in different programs worked, so I could more easily find expected functions in new programs. It's a kind of literacy; I'm not saying I'm a genius at it, but based on my ability to pick up new programs at work I'm not terrible at it, either.
I absolutely agree with this.
Most of the very basic "how to computer" classes I've seen were of the "Here's how you use MS Word and Excel to do basic office work" variety. Those are good to know, but the students coming out of the classes still often seem to lack the "computer literacy" to, for example, translate their knowledge of Word into the ability to use LibreOffice competently, or translate concepts between Outlook and GMail, let alone actually manage the installed programs on their own computer.
An English Literature class doesn't just aim to have students memorize various facts and interpretations of the classics, but to teach the ability to understand, critique and analyze any text. I'd be very interested to see a course designed around teaching that kind of empowering literacy for computers and software.
Definitely. It'd take some work, especially if it's a required class and you don't want to bore more advanced users to death, but I think it's doable. (I'd have loved to have been able to test out of things in my hated computer class...)
I think your comment about being able to "translate" skills or concepts to other programs is a good way of describing what I was trying to get at. After posting my initial comment, I kept thinking of other skills that would be applicable to a wide range of programs and tasks: navigating file systems, wording searches to get the results you need, finding your way around a new website. Basic stuff, but all things I've seen someone struggle with.
It would make sense for the core concepts of computer usage to be taught starting at a young age. The skills are becoming increasingly vital in the workplace, and I believe that teaching them early would remove some of the intimidation people feel towards working with tech.
In my opinion, a bit of programming and computer knowledge should be brought up the same way you do a bit of calc, trig, chemistry, etc. in high school. Enough so that you're not clueless about the subject, and so that you have an idea of whether or not you want to pursue it later.
My cousin is actually learning logic at 9 years old right now, with one of those graphical "programming" applications, where you connect blocks together to form conditionals, loops, etc.
Basic computer literacy is a requirement for essentially every job in today's world, yet I still see young HS graduates who don't have a clue how computers work and are content to slide by with "I'm just not a computer person".
Speaking as someone who had a class exactly like that as a child: it was fairly useless. I can't say I support it; it'd be much more useful to teach them a new language (natural or programming) or to teach them chemistry.
Counter-point: as someone who had a class exactly like that as a child, it was immensely useful. It got me interested in computing and put me on the path to where I am today. I have found learning French much less useful in my day-to-day life, even being from a country where it is one of our two official languages.
That's exactly the same for me, over twenty years ago in the UK. I've forgotten all my French vocabulary, but the computer skills have just kept on building.
Maybe it's more useful for people in countries where they aren't common-place yet? That'd make sense.
Are you asking if this already happens or if it should happen?
In the former case, it definitely does happen: kids as young as pre-K and kindergarteners have to take those kinds of classes now in some school districts. And I can see why; in some cases the cost of that technology is equal to the cost of textbooks, so it makes sense to go the digital route.
In the latter case, I do think that kids should be introduced to computer literacy from an early age, but I think there's a difference between teaching a kid computer literacy in a controlled classroom environment and distracting a kid with an iPad for hours at a time the way some parents do at home. A lot of those kids may only have been exposed to computers in the form of smartphones and tablets, not desktops or laptops, depending on their family's economic level.
In terms of tradeoffs, a child stands to benefit more from too much computer interaction in a classroom environment than from too little, because I don't think the future will be very kind to the technologically illiterate.
I think schools should teach different levels of classes, building up to being an expert at Microsoft Office. No matter what you do in your life, it will come in handy. Our office currently holds these classes for business professionals.
My first instinct is to say yes, it should be required. If that were to happen, I think it would be useful to include learning basic troubleshooting. I wonder about the implementation of it, though. As Eva brought up above, it could easily end up being a useless class, especially depending on the computer literacy level of the teacher.
It's so frustrating that people don't know how to troubleshoot. 90% of the issues I get calls for can easily be solved by poking around basic settings for 5 minutes.
Definitely. Computers have increasingly become an integral part of society, with chat services such as Google Hangouts, Discord, and Skype (though Skype has been in rapid decline), and email, which students could use to discuss work and projects, for example.
It's also an important building block in most workplaces, so it would make sense to teach computer literacy and programming like the schools in my area are already doing.
I think the math curriculum should be completely rethought to include computers. For example, you could start working with calculus and differential equations about the same time as you do algebra—differential equations can be hard to solve, but once you know the language of algebra, they're not hard to work with using numerical methods on a computer. Programming could be introduced in the context of helping with math problems.
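As a concrete (purely hypothetical) illustration of what "numerical methods on a computer" could look like at that level, here is a sketch in TypeScript/JS that approximates the solution of dy/dt = -k·y with nothing more than a loop, multiplication, and addition; the constants are arbitrary:

```typescript
// Euler's method for dy/dt = -k * y: at each small time step,
// change y by dt times the current rate of change.
const k = 0.5;  // decay rate (arbitrary)
const dt = 0.1; // step size (arbitrary)
let y = 1.0;    // starting value y(0)

for (let step = 0; step < 100; step++) {
  y += dt * (-k * y); // "go forward a little", using only algebra
}

// 100 steps of size 0.1 approximate y at t = 10; the exact answer is e^(-k * 10).
console.log(y.toFixed(4), Math.exp(-k * 10).toFixed(4));
```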
As both a programmer and someone with a better-than-average understanding of math, I feel like this may not be the best option. Some concepts should be introduced in theory (with practical examples) before being thrown into practice. If not, it could become, for many students, rote memorization of which buttons to push to do what, rather than an understanding of why you push each one.
If you did it right, it wouldn't be about "pushing buttons". Look at the LOGO programming language, for example. You can draw something approximating a circle using the program "repeat 120 [forward 5 right 3]". This is essentially a differential equation: at each time-step, you go forward a little, then turn a little, and repeat. We don't teach kids field theory before arithmetic, so why teach them differential equations before this sort of "differential arithmetic"?
Anyway, the differential equations thing was just an example. My point is that computers can give us access to an intuitive understanding of mathematical systems without having to build up a complicated theory beforehand. Computers themselves are also an intrinsically mathematical/symbolic phenomenon; later on you can turn things around and build theories about the operation of the computer itself. I think math classes would be a natural place to put computer education.
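To make the LOGO point explicit (this is just my transcription of the same idea into TypeScript/JS, not anything from an existing curriculum), "repeat 120 [forward 5 right 3]" is a pair of update rules applied over and over, which is exactly the "differential arithmetic" described above:

```typescript
// "repeat 120 [forward 5 right 3]" as a plain loop: each step nudges the
// position a little in the current heading, then nudges the heading a little.
let x = 0;
let y = 0;
let heading = 0; // degrees

for (let i = 0; i < 120; i++) {
  const rad = (heading * Math.PI) / 180;
  x += 5 * Math.cos(rad); // forward 5
  y += 5 * Math.sin(rad);
  heading += 3;           // right 3
}

// 120 turns of 3 degrees add up to 360, so the "turtle" ends up back near
// (0, 0), having traced something very close to a circle.
console.log(x.toFixed(2), y.toFixed(2));
```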
Yes! I encounter so many college students who don't know how to do basic things such as turn on a computer or work with MS Office. These are things you should know by the time you get to college, but a lot of the students I work with tend to be older adults returning to school or traditional-age students who may have come from less than stellar schools.
Given the importance of having a computer and knowing how to operate one, I'd say yes. This would include topics like system maintenance (how to check for updates and perform them), staying private and secure on the Internet, and possibly a tiny bit of shell usage so they can understand how the system works and how to configure it (whereas the common people nowadays think a shell is black magic that you should avoid at all costs).
I would also say that if such a thing is implemented, it should be done using free software. Whether exclusively free software or in combination with proprietary software, I'm unsure what would be most feasible. Personally, as a big free software advocate, I'd say exclusively, but that may not be practical at this moment. At the very least, the operating system used should be free software, so as not to entangle students in a proprietary ecosystem that is increasingly hard to leave without a trace.
That said, I also would not mind seeing different operating systems used in such scenarios, so that students can be trained in how to install an operating system and understand that there is more than one operating system available in the world.
I'm very much a free software advocate myself, and I try to use as much of it as I can every day. Though it seems everyone here (at the school I'm in now and at the other schools I've been to) would rather use pirated Microsoft products than even consider using anything remotely FOSS.
Now that I think about it, though, I would really like to write a tech policy and have everyone move to open standards. I might do that for the upcoming year, though getting them all to actually follow it might be an uphill battle; everyone here likes their Office 2010 with the red bar telling them to purchase a licence. Perhaps it's soothing?
I think it's a combination of a few things that keeps them using pirated proprietary software over the free alternatives:
Good education, I think, can solve the second point and force them to reconsider their stance on the third. If educational institutes supplied machines with free software, it would immediately address the first point. Even better would be if hardware were supplied as-is, without preloaded software, so the consumer could make their own (sometimes educated) decision about what to do with it. This might force them to put at least some effort into researching the possibilities.
Yes, but I think part of the issue is that no one who is really competent with a computer will want to spend their working hours teaching children, or adults for that matter, how to use computers. It is also an issue that the curriculums become outdated very quickly. I agree with others when they say that some computer know-how is vital for practically everyone, but I don't think it will be easy to ensure that such classes actually teach the students anything worthwhile.
Sorry, but could you please elaborate on why you feel this way?
Surely there are competent people out there who also like to teach, it's not like the two things are diametrically opposed to one another.
The way I see it, there are several reasons for this. Firstly, the field of computers is different from other fields in that it is constantly changing. If you are spending your time teaching others about computers, and not actually working in the field yourself, you will fall behind. Secondly, if you are competent, you'll make three times as much doing pretty much any other job within IT as you would as a teacher. I also don't view anyone as competent before they've had some working experience in the field, and I personally doubt that many people would be willing to step down from lucrative IT jobs to become teachers instead. Depending on the country, one might need further education as well in order to become a teacher.
From my limited experience, any so-called informatics teacher at the secondary or high school level usually has less knowledge of computers than their students. It's been a few years since I went to school, mind, so things could be better now. I might also be projecting a little; as someone who has worked in IT support, there is nothing I would want to do less than have to teach people how to use a keyboard.
The way I look at it, if you're teaching about computers, you would know that it's a constantly changing field^1 and take some time to keep learning about the field.
Surely there are other reasons to do things in life besides aiming for a high score.
Personally, I know I could be making 2 or 3 times my current salary if I took a job in the US, but I would prefer a calmer setting where I can do some actual good, rather than flail against the system, as it were.
And, as a counter-point, I believe some people do actually like/prefer teaching to industry, despite the difference in salary.
People can be decent, nay competent, at things despite not being in that particular field; that's why you see people changing industries. I would argue that there is such a thing as practice that people can get^2 despite not being in the field, which should be viewed on a case-by-case basis.
Lucrative, perhaps, but after reading /r/talesfromtechsupport it reminds me why I didn't go into that field.
I must say, this does not match my experience.
I understand the sentiment, and it's perfectly acceptable to not want to be a teacher, but I don't think it's right to paint everyone who likes or works in IT with the same broad brush, or to say that those who have never worked in IT can't be considered competent.
1 I would argue that many fields are changing and educators need to stay up to date, in general
2 hobbies, volunteer work, projects, etc.
edit: How the heck do you do superscript on Tildes? It's not in the documentation, HTML tags (<sup></sup>) don't work, reddit-style markdown (^) doesn't work, what does?
I agree with this, but I also think that most people who like or prefer teaching are not going to get a computer science education first and only then find out that they want to teach. Most will study to become teachers. Maybe the solution is to introduce computer science as a course for people studying didactics?
This is true, but it is hard to get hired for an IT job based on hobby projects when the people hiring you know nothing about IT. They will usually require some sort of formal education or experience.
I agree, I realized as I was typing my previous post that I wasn't being very objective.
I somewhat disagree. Concepts like addition in mathematics or basic grammar in English are not going to suddenly become irrelevant, while in informatics many of the toolsets you learn today could be obsolete for anything but maintenance work five years down the line.
Does it necessarily have to be computer science, though? Isn't that mostly programming and math? (For example, at my university, a CS major would not be taking IT classes) Wouldn't it be better to have them also be learning more about the overall field?
Fair point; it seems I've been outside Western education long enough to forget what is taught. I was generally thinking about disciplines like business or engineering, where, while the math may not change, the case studies definitely will. Or even history, where new facts or interpretations of them can emerge. And education in general seems to be constantly changing.