New Copilot and agentic experiences make powerful AI easy on Windows 11
Today, we’re taking an exciting step forward with a new wave of updates that make every Windows 11 PC an AI PC – with Copilot at the center of it all.
No I’m not, I addressed them. LLMs not being able to do maths/spelling is a known shortcoming; anyone using them for that is literally using them wrong. I know the studies you’re talking about, and they were ridiculous. Of course people who don’t learn something won’t know how to do it - but the fact that they can do it with AI is a positive. Obviously getting AI to write an essay means the person will feel less “proud” of their work, as one of the studies said - but that’s not a “bad” thing. Just as people no longer needing to hunt and gather isn’t a bad thing - the world as it is, and as it always will be from here on out, means we don’t need to know that unless we want to.
Again - AI is a tool, and idiots being able to use it to great effect doesn’t mean the tool is bad. If anything, that’s a sign of how good the tool is.
Those studies aren’t about them feeling less proud, they’re about the degradation of critical thinking skills.
I have repeatedly said that isn’t worth anything largely because it doesn’t do anything I can’t do with relative ease. Why do you think it’s so great? What do you honestly use it for?
As one example, I built an MCP server that lets LLMs access a reporting database, then made a Copilot Agent and integrated it into Teams. Now the entire business can ask a chatbot questions about business data in Teams, using natural language. It can run reports for them on demand, pulling in new columns/tables, and it can flag when something might be wrong, since it also reads from our logs.
These people don’t know databases. They don’t know how to read debug/error logs.
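I can’t paste the real server, but the core idea is small enough to sketch. Below is a stdlib-only sketch of the kind of read-only “run report” tool such an MCP server could expose to the LLM: `sqlite3` stands in for the real reporting database, and the `orders` table, its columns, and the `run_report`/`make_demo_db` names are all hypothetical, not the actual schema.

```python
# Sketch of a read-only reporting tool an MCP server might expose.
# sqlite3 is a stand-in for the real reporting database; the table,
# columns, and function names here are hypothetical.
import sqlite3


def make_demo_db() -> sqlite3.Connection:
    """Build an in-memory stand-in for the reporting database."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (region TEXT, total REAL)")
    conn.executemany(
        "INSERT INTO orders VALUES (?, ?)",
        [("EMEA", 120.0), ("EMEA", 80.0), ("APAC", 50.0)],
    )
    return conn


def run_report(conn: sqlite3.Connection, sql: str) -> list[tuple]:
    """Tool handler: execute a query the LLM generated.

    Allowing only a single SELECT keeps the agent from mutating
    business data, which matters when non-technical users drive it.
    """
    cleaned = sql.strip().rstrip(";")
    if not cleaned.lower().startswith("select") or ";" in cleaned:
        raise ValueError("only single SELECT statements are allowed")
    return conn.execute(cleaned).fetchall()


conn = make_demo_db()
rows = run_report(
    conn,
    "SELECT region, SUM(total) FROM orders GROUP BY region ORDER BY region",
)
print(rows)  # [('APAC', 50.0), ('EMEA', 200.0)]
```

In the real setup the LLM writes the SQL from a natural-language question in Teams, and the MCP server is just the validated gateway between the model and the database.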
I also use GitHub Copilot.
But sure, it can’t be of any help to anyone ever lol