Microsoft Edge Canary has been updated with an interesting feature called Copilot Vision, but it's still in testing.
The current implementation of Copilot in Microsoft Edge is helpful, letting you quickly send content to the Copilot sidebar, but it has clear limitations. For example, it can't see what you're doing on a webpage or what you're looking at inside the browser.
Announced on October 1, Copilot Vision lets Copilot understand the webpage you're viewing and help answer questions or suggest next steps, all by using natural language. It's completely optional and only works when you choose to use it.
As spotted by Leo on X, Microsoft has already added the feature in Edge Canary, but it's not ready yet.
In our tests, BleepingComputer observed that Copilot Vision appears at the bottom of the screen, and it can be invoked when you hover over the Copilot Vision flyout. Unfortunately, it doesn't work correctly at the moment.
If you want to try it in your Edge browser, use these steps:
- Open Microsoft Edge Canary.
- In the address bar, type edge://flags and press Enter.
- In the search box at the top of the page, type Copilot Vision.
- Find the flag called Copilot Vision (Enables the Copilot Vision experience – Mac, Windows, Linux).
- In the dropdown menu next to it, select Enabled Voice + TXT + IMAGE2.
- Click Restart to apply the changes.
If you follow the steps correctly, you'll see the Copilot flyout at the bottom of the browser.
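Because edge://flags entries map to standard Chromium feature flags, the same experiment can in principle be run from the command line using Chromium's generic --enable-features switch, which Edge inherits. This is only a sketch: the internal feature name EdgeCopilotVision below is an assumption for illustration, and the real name may differ, so the flags page above remains the reliable route.

```shell
# Launch Edge Canary with a feature flag enabled from the command line.
# NOTE: "EdgeCopilotVision" is an assumed feature name, not confirmed by
# Microsoft -- check the entry on the edge://flags page for the real value.

# Windows (default Edge Canary install path, "Edge SxS"):
"%LOCALAPPDATA%\Microsoft\Edge SxS\Application\msedge.exe" --enable-features=EdgeCopilotVision

# macOS:
open -a "Microsoft Edge Canary" --args --enable-features=EdgeCopilotVision
```

Close all running Edge Canary windows first; Chromium-based browsers only pick up command-line feature flags when a fresh browser process starts.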
In a blog post, Microsoft previously confirmed that Copilot Vision is supposed to work on a select number of pre-approved websites and won’t work on paywalled or sensitive content.
When you use it, the content is not stored or used for training, and as soon as you close the feature, everything is deleted.
Although Copilot Vision is still in testing and not yet available to everyone, it's an interesting idea that could make it easier to understand webpage content and even interact with it.
Comments
powerspork - 1 month ago
"Unfortunately, it doesn't work correctly at the moment." So we should expect to see this rolled out live in a few days?
This looks like it would be a great feature for those with disabilities if they can get it to work right.