Artificial intelligence is not a pandemic

There are concerns that ChatGPT could undermine the educational system, and growing demands for rules governing its use. Gerd Kortemeyer argues that anything but the most common-sense regulations could end up being counterproductive.

When COVID-19 first hit, we fell over ourselves generating, proclaiming, and retracting rules and restrictions; we produced a torrent of partially contradictory regulations about masks, testing, travel, and vaccinations. Today, certain measures may seem slightly ridiculous, ineffective, or too far-reaching. At the time, though, imperfection was better than inaction: it was imperative to contain the deadly pandemic.

AI is not a deadly pandemic, yet we are in danger of once again scrambling to issue draconian rules and regulations to contain its spread. Compared to other disruptive technologies, tools like ChatGPT have admittedly burst into the public domain rather abruptly, but already there are calls for a complete freeze on the development of new AI models.