Following testing, a ChatGPT-style AI assistant created by Microsoft and integrated into its office programs will be accessible to everyone starting on November 1.
For those who prefer not to attend meetings hosted in Teams, Microsoft 365 Copilot can provide a summary.
Additionally, it can quickly produce PowerPoint presentations, Word documents, spreadsheet graphs, and email drafts.
Microsoft says it expects the tool to end "drudgery", but others are concerned that technology like this could replace employees.
There are also worries that it could leave firms dangerously dependent on AI-powered assistance.
Because it cannot currently flag content as machine-generated, it may also fall foul of new regulations governing AI.
People must be aware when they are communicating with artificial intelligence rather than actual humans, according to both the AI legislation in Europe and the AI rules in China.
Microsoft 365’s Collette Stallbaumer said it was up to the user of Copilot to make that clear.
“It is a tool, and people have a responsibility to use it responsibly,” she stated.
“I may not tell you that I used an AI assistant to generate that response. But the human is always in the loop and always in control.”
The EU asserts that it is the responsibility of the companies that create AI technologies to ensure they are used properly.

Ahead of Copilot’s general release, I was given exclusive access to try it out.
It makes use of the same technology that powers ChatGPT, developed by OpenAI, a firm in which Microsoft has made significant investments.
Because Copilot is tied to a user’s account, with access to their personal data and that of their organization, my demonstration took place on the laptop of Microsoft employee Derek Snyder.
According to Microsoft, the data is stored securely and will not be used to train the technology.
“You only have access to data that you would otherwise be allowed to see,” said Ms. Stallbaumer. “It respects data policies.”
My first impression is that Copilot will prove both a helpful tool and a fiercely competitive coworker for people in office jobs, particularly at businesses looking to cut costs.
I observed it confidently summarize a lengthy series of emails about a fictitious product launch in a matter of seconds.
It then offered a succinct reply. The chatbot produced a warm response, expressing respect for the ideas put forward and eagerness to be involved in the project – even though none of us had actually read any of it. Using a simple drop-down menu, we made that answer longer and more informal.
The email could then be edited before sending, or the AI-generated version could be sent in full. Nothing in the email indicated that it contained Copilot-generated material.
The application then produced a PowerPoint presentation of several slides, based on the contents of a Word document, in around 43 seconds. It can use any photos included in the document, or search its own royalty-free library. It also drafted a suggested narration to be read aloud alongside the presentation, which was simple but effective.

It could not act on my request to add extra “color” to the presentation, directing me instead to PowerPoint’s manual options.
Finally, we looked at a Microsoft Teams meeting.
Copilot pointed out recurring themes and described the various strands that had run through the conversation. Where there had been a disagreement, it could present the pros and cons that were discussed in table form and, if asked, summarize what a specific individual had said. It all took just moments.
It has been designed not to answer questions about participants’ contributions to meetings, such as who gave the best (or worst) speeches.
I asked Mr. Snyder whether he thought people would still bother attending meetings once they realized Copilot could save them the time and effort.
“A lot of meetings might become webinars,” he said.
The technology cannot yet distinguish between users who are on Teams but seated together using the same device, unless they verbally identify one another.
Copilot will cost $30 – around £25 in the UK – per month. It is connected to the internet and cannot be used offline.
Critics say administrative jobs will inevitably be severely disrupted by this type of technology.
Carissa Veliz, an associate professor at Oxford University’s Institute for Ethics in AI, expressed concern about people becoming unduly reliant on such tools.
“What happens if the technology fails or is compromised? There may be a problem, or they might implement new rules that you don’t like. What happens if you become so dependent on the system that you believe you cannot function without it any longer?” she said.