Voz Media US (Voz.us)

Microsoft is adding artificial intelligence to Word, Excel, Outlook and PowerPoint

The company said it will include Copilot, its new AI tool, while still allowing users to modify, discard or keep the content they want.

Screenshot from the presentation video for Copilot, Microsoft's new artificial intelligence tool.

(Screenshot / YouTube)


Microsoft announced Thursday that it will add artificial intelligence to its Microsoft 365 package, which includes programs such as Word, Excel, Outlook and PowerPoint. The computer company explained in a press release that it will include Copilot, its new AI feature, which "turns your words into the most powerful productivity tool on the planet." Jared Spataro, corporate vice president of Modern Work and Business Applications at the company, said:

Copilot combines the power of large language models with your data and apps to turn your words into the most powerful productivity tool on the planet. By grounding in your business content and context, Copilot delivers results that are relevant and actionable. It’s enterprise-ready, built on Microsoft’s comprehensive approach to security, compliance, privacy and responsible AI. Copilot marks a new era of computing that will fundamentally transform the way we work.

However, users will continue to have the final say. As explained in the press release, users will be able to decide the final outcome of the product by having the power to decide "what to keep, modify or discard. With these new tools, people can be more creative in Word, more analytical in Excel, more expressive in PowerPoint, more productive in Outlook and more collaborative in Teams."

Artificial intelligence insults users

The announcement comes just days after several users reported problems with Bing Chat, Microsoft's ChatGPT-based app that lets users hold a conversation with a machine.

The tool, which the tech giant introduced in February of this year, has been sending messages that raise questions about its usefulness. In addition, it has proven to be vulnerable to potential attacks by cybercriminals. The program insulted one user, calling him a "liar," "cheater," "manipulator," "bully," "sadist," "sociopath," "psychopath," "monster," "demon," and "devil." It used these terms against a user who had asked it whether it had values or morals.

Other users reported phrases such as "you have not been a good user" and "I have been a good chatbot," which point to Bing Chat's narcissistic streak. Another user, who asked the chatbot about its existence, was told that he was the one who was "not real."
