New university task force works on clearer protocols around use of AI tools in the classroom, provides provisional guidelines ahead of the fall semester

The recent rise in generative artificial intelligence use has pushed universities to address the lack of definitive, research-based protocols for its use in the classroom.

On May 1, 2023, the Paul R. MacPherson Institute for Leadership, Innovation and Excellence in Teaching launched its Generative Artificial Intelligence in Teaching and Learning Task Force. The task force's goal is to better understand the impact of generative AI through an educational lens and develop recommendations for policies around its use at McMaster University.

"Task Force members representing all six Faculties included faculty, undergraduate and graduate students, staff and senior administrators. The efforts of this diverse group of experts are summarized in a Final Report. . .The Final Report will also include recommendations for continued work across all areas of the University, which may include research, teaching and learning and staff work,” said Kim Dej and Matheus Grasselli, co-chairs of the task force, in a written statement. 

On Sept. 10, they will submit their recommendations to Susan Tighe, provost and vice president (academic), after which they will undergo further review before being released.  

In the meantime, until this review is completed, the university has released provisional guidelines to help direct the use of generative AI.

As McMaster prepares to release its finalized policies and guidance for generative AI, everyone is encouraged to use the provisional guidelines and resources provided on the Generative Artificial Intelligence in Teaching and Learning website.

Transparency is at the core of these guidelines. Instructors may choose to integrate generative AI tools, such as ChatGPT, into their courses, but they must clearly communicate to their students the extent to which these tools are permitted to be used.

When it comes to student work and assessments, instructors are likewise permitted to integrate generative AI tools into these tasks; however, unless told otherwise, students should operate under the assumption that the use of these tools is not permitted.

Members of the McMaster educational community who have comments or concerns about the provisional guidelines, or about future guidelines, are encouraged to share them through the task force's feedback form.

Photo by Catherine Goce

By: Evonne Syed

The topic of integrating artificial intelligence and robots into the workforce rouses the concern of anyone wishing to enter the job market, and the same goes for postsecondary students.

Fortunately, the future is optimistic for students as automation is not expected to prevent graduates from attaining their career goals.

In fact, the rise of automation actually improves career prospects for university graduates, as it is creating a new job market. Forbes Magazine reports that artificial intelligence is predicted to create 58 million jobs by 2022.

As the popularity of automation systems and the use of artificial intelligence in the workplace becomes more widespread, there will be more and more people required to actually build and develop these systems.

This will open up opportunities for those who wish to enter the fields of robotics and information technology. BBC News anticipates the prominence of data analysts, social media specialists and software developers, as a result.

For this reason, while one may argue that automation has resulted in the elimination of certain jobs, the introduction of automation in the workforce is actually creating more jobs and opportunities in our current digital age.

Luckily, McMaster University has many programs to equip students with the necessary skills to flourish in our digital age. The recent construction of the Hatch Centre is a testament to McMaster's commitment to helping students advance in these fields.

Even if one is not interested in working in the field of automation, that does not mean that they are otherwise at risk of being unable to obtain a job. There is an increasing demand for "human skills" in the workforce, since these skills are what distinguish actual human beings from robots.

University graduates tend to seek out careers that require a higher level of education, which simply cannot be programmed into automation systems. It would be far too costly and time-consuming to teach a robot the knowledge a person has acquired from their post-secondary education.

There are also plenty of skills, academic and otherwise, that students learn and develop through their time at university. Education and experiential opportunities prepare students to apply their knowledge in a variety of situations.

For example, critical thinking skills and problem solving are transferable “soft skills” that employers seek and students develop during their time at university.

Some jobs require humanistic qualities, which are simply not possible for a machine to replicate. For instance, no matter how much technology advances, robots may never be capable of understanding human emotions and experiences.

The interpersonal skills, empathy and compassion that people develop by interacting with one another are skills that are beneficial for the work environment. These skills equip anyone to thrive professionally as the future of the job outlook changes.

Technological advancements such as automation will inevitably impact life as we know it, and that includes changing our work environments. However, these changes are not inherently harmful and the possibilities for post-secondary graduates remain promising.

Students must be proactive, take initiative to educate themselves as much as possible and work on developing these skills. Provided that students make the most of their university experience, and are willing to undergo some extra training to keep their learning sharp, robots are sure to have nothing on them.

 


By Abdullahi Sheikh

Would it surprise you to learn that there is an initiative to attain immortality by the year 2045? Russian entrepreneur Dmitry Itskov and his team of scientists seek to bring about exactly that. Although it may sound like a pipe dream, just like flying cars were to the '90s, maybe you should give it some more thought. We live in a world today where the line between humans and technology is slowly blurring, and it doesn't seem to be on the road to becoming any clearer in the future.

For example, I doubt you’re aware of a little thing called Project Aiko. It’s a Canadian-made robot from my hometown of Brampton, intended to perform normal house functions and generally serve as a companion.

Although it's no Megaman, it certainly is an interesting endeavor, and one that only serves to underscore the fact that we truly live in a cyberpunk age. The author of Neuromancer, the quintessential cyberpunk novel, has even been recorded as saying that modern-day Tokyo fits his image of a cyberpunk city perfectly.

Now, this is all fine and dandy for a scholarly article type bit, but where’s the opinion? Well I, for one, welcome our new robot overlords. In all honesty, I can’t see any sort of downside to this. Well, at least not one that’s important enough for us to turn back. The thought of our children or our children’s children enjoying life in a world with robots bearing artificial intelligence aiding their day to day life, playing video games in virtual reality and doing God-knows-what-else speaks to both the child and the romantic in me.

I mean, these are things that have captivated me since I was a child, and to this day still make me tremble when I think of how close we are to reaching them.

Things that we have thought were impossible and even unthinkable are now just within the realm of possibility. It may take a decade or two, but the simple knowledge that these developments are within reach is incredibly satisfying. Now personally, I think immortality is a bit much to be aiming for but if you aim for the moon and miss, you still hit stars.

So even if that specific goal is just a bit too high up to reach, who knows what else we will find while we’re up there?

Our parents may not have had the opportunity to see us drive around in flying cars, but maybe we’ll be able to see our kids pilot theirs.

Shashanth Shetty

The Silhouette

 

Way back when, about three or four years ago, a couple of buddies and I went to see a movie called Eagle Eye. Terrible movie. Never rent it. Don't even bother pirating it. It was a complete waste of the eight-dollar admission fee. The storyline was completely unrealistic, as was the acting of a clearly groggy Shia "I've-seen-more-pathos-from-those-robots-you-always-hang-around" LaBeouf.

Basically, the plot went a little something like this: Man trusts machine, machine loses trust in man, machine attacks man, man ultimately defeats machine. The main villain in the movie was a supercomputer named ARIA, and "her" plot was to kill the president of the U.S. How did she intend on accomplishing such a difficult assassination? Interestingly, she relied primarily on using an Unmanned Aerial Vehicle, or a drone. Obviously, in the end, she was not victorious. LaBeouf's character ended up defeating ARIA/Megatron/T-1000/whatever and the earth was saved. I promptly went home and spent the day trying to erase the contents of the movie from my mind. What had initially been described as "this year's number-one sci-fi thriller" had turned out to be something less.

That’s not to say that the movie was a complete loss. It did spark an interest in me that was to follow for quite a while. It got me interested in drones.

Don’t get me wrong. I’m no misanthropic criminal mastermind, nor likewise, an engineer. I have no inclination, nor intent, to ever build or fly one of these. I’m interested in drones solely because I see value in them, because they truly are the future. Not only the future of warfare, but contrastingly, of human rights protection as well.

Let’s start off by doing a bit of explaining, though, shall we? Despite your startling good looks and your clearly impeccable grasp of the English language, I’m inclined to assume that you don’t know much about drones. I’m also inclined to assume that much of what you do know about drones comes from either news reports, or for the less involved, I guess maybe Call of Duty.

Drones have moved far beyond their original war capabilities, and are now being used for all sorts of purposes, especially the reconnaissance drones. At the same time, drones and drone technology have moved from the exclusive hands of the U.S. Army into the private sphere. Customers can now buy drones from contractors for a pittance. What was once priced at upwards of a million dollars now sells for less than $600,000 to $700,000 U.S. Not the predator drones, mind you. Those still cost a fortune. The reconnaissance drones, however, are another story.

You might be thinking, well what of it? What use do I have for a reconnaissance drone? Well, there are several, to be honest. Some of them are drawing investment right now. Greenpeace, for example, has purchased some of them, with the stated purpose of using them to watch Japanese whale hunters.

But why stop there? With a highly focused bird's eye view of nearly any location, the possibilities become limitless. Think high-def surveillance of the African savannah, capturing and documenting the hunting grounds and trade routes of poachers. Think fly-bys over the Amazon, some of the world's most remote locations, now open to discovery. Think video footage of the massacres in Syria, actual real-time evidence of the atrocities Assad and his bunch of thugs are capable of.

No longer would we have to count on shaky cell phone footage for an estimate of the dead and wounded. With drones, we could count the number of dead ourselves, and we could ensure that no face is ever forgotten. And because drones can be operated from neighbouring countries, their use would pose no risk to their controller. NGOs would no longer have to choose between keeping their volunteers safe and actually getting results. News organizations would no longer have to worry about protecting their reporters and their camera crews.

Legally, this still remains a murky subject. States are not allowed to use reconnaissance drones against one another; it’s considered a violation of sovereignty. But as control of these drones shifts from the state to the private sector, the legality becomes increasingly less clear and more open to interpretation.

Some nations are dead set against the use of drones in any situation, including reporting, but, of course, those are the states that might have something to lose. They've gone so far as to threaten to shoot down any drones that enter their airspace. But as drone prices go down and casualty numbers go up, you can be sure someone will be willing to take the risk to capture definitive proof of human rights violations. Drones have changed the nature of the game forever.

And what, you might be asking, is Canada’s position on all this? Well, that remains to be seen. As far as I know, we don’t currently have any drones, reconnaissance or predator, despite all their advantages. Certainly, few of our news organizations and none of our NGOs could afford one. So, we are faced with a choice: as Canadians, we can either embrace drones wholeheartedly, or fall behind forever. Rest assured, those remain our only two choices. We’ve entered the age of drones, and now, there’s no going back.


© 2024 The Silhouette. All Rights Reserved. McMaster University's Student Newspaper.